Saturday, April 20, 2013

Is Scala a Functional Language?

A few years ago, a guy called Robert Fischer wrote a blog post arguing that Scala is not a functional language.  What I found particularly interesting about it is that he claims to have worked a fair amount with Scala, but he doesn't exactly show a deep familiarity with the language.  Indeed, I've been programming with Scala for less than a year now, and I can say pretty confidently that I know it more thoroughly than he does.

So, is Scala a functional language?  I would say no.  Scala is a multi-paradigm language designed to blend functional programming with object-oriented programming.  I wonder who Robert thinks claims that it is a functional language, as most users would probably give the answer I just gave.  For example:

“Scala goes further than all other well-known languages in fusing object-oriented and functional programming.”
-- Martin Odersky

The language is designed to optimize the combination of the two paradigms, rather than the use of either one separately.  However, I will also make the following claim: by Robert's own definition, under which a functional language is one that makes functional programming easy, Scala actually is a functional language.  Let me demonstrate this using his own examples.

Let's start with the first example: Robert's claim is that the following is the easiest way to define a function:

object SomeHolderObject {
  val f(x:int):int = { x + 1 }
}
 
Bzzt, wrong.  I have no idea what compiler Robert's using, but the following runs just fine on the official compiler provided by Typesafe (if you save it in a .scala file):

def f(x:Int) = x + 1

What this does, actually, is treat the program as a script rather than an object.  This is one of the things that first attracted me to Scala.  Before getting into Scala, I was a Python programmer (I still am, but I'm beginning to like Scala more and more).  The combination of type inference and the ability to run programs as scripts made Scala feel much like a scripting language; I felt right at home.
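To make the script point concrete, here's a minimal sketch (the file name and contents are mine, just for illustration):

// inc.scala -- run it with the script runner:  scala inc.scala
def f(x: Int) = x + 1
println(f(41))  // prints 42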

Now, in a last-ditch attempt to save this example, one might argue that Scala requires you to specify the types of a function's parameters.  This is true, but it has nothing to do with making functional programming easy; it's a consequence of how Scala's local type inference works: return types can usually be inferred, but parameter types must be written out.
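Here's a quick illustration (the names are mine, purely for demonstration):

def double(x: Int) = x * 2   // fine: the return type Int is inferred
// def broken(x)  = x * 2    // won't compile: the parameter type must be annotated

On to Robert's second example, currying: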


def x(a:Int, b:Int) = a + b 
def y = Function.curried(x _)(1)
y(2)  // 3


I'm not normally one to assume my interlocutor is being dishonest, but I can't draw any other conclusion if I accept that Robert has worked "a fair amount" with the language.  Currying is actually much easier than that:

def x(a:Int)(b:Int) = a + b 
def y = x(1) _
y(2)  // 3

There's another way to do it without the syntactic sugar, which is less elegant but places more emphasis on the fact that functions are first-class values:

def x(a:Int) = {b:Int => a+b} 
def y = x(1)
y(2)  // 3

Actually, I have some reservations about the way OCaml does it: with everything curried by default, it's possible (easy?) to accidentally pass the wrong number of arguments to a function.  Instead of catching this as an error, the compiler just hands you back a partially applied function.  Incidentally, SML is not curried by default either.
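To illustrate what I mean (this sketch is mine, not one of Robert's examples): with an uncurried Scala definition, leaving out an argument is a compile-time error, and partial application has to be asked for explicitly:

def add(a: Int, b: Int) = a + b
// add(1)                    // does not compile: not enough arguments
val inc = add(1, _: Int)     // explicit partial application
inc(2)  // 3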

Robert's final example involves algebraic data types.  Again, Scala's syntax is optimized to blend object-oriented and functional programming rather than to use either one separately, and its implementation of case classes and pattern matching is a case in point.  Actually, one can get a syntax fairly similar to OCaml's by using type aliases, but I'll just give this one to him.  Scala still wins 2 to 1, using his own examples.  Facepalm.
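For reference, here's roughly what an algebraic data type and a match on it look like in idiomatic Scala; this is my own sketch, not one of Robert's examples:

sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rect(width: Double, height: Double) extends Shape

def area(s: Shape): Double = s match {
  case Circle(r)  => math.Pi * r * r
  case Rect(w, h) => w * h
}

area(Circle(1.0))  // ~3.14159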

Overall, it seems that the author is either learning just barely enough Scala to write some truly horrendous code, or else purposefully writing the worst possible code in Scala and the cleanest possible code in OCaml just to give the latter the edge.

Actually, in a Stack Overflow topic on the subject, one of the commenters described Church numerals as a "litmus test" for a functional language.  Church numerals encode the natural numbers as the number of times a function is applied.  Well:

def thrice[T](f: T => T) = f andThen f andThen f
thrice[Int](1+_)(0)            // 3
thrice(thrice[Int](1+_))(0)    // 9
thrice(thrice[Int] _)(1+_)(0)  // 27

exactly as the commenter described it.  Another clear win for Scala, I'd say.

~Ian

Saturday, January 5, 2013

Dynamical Systems & the Technological Singularity (Part 0)

Being a more pragmatically-minded grinder, I try not to get hung up on predicting the future, or writing theoretical articles about how great the singularity is going to be.  In this entry (which will probably end up being a series), I'm going to partially break that rule, and analyze the singularity from the perspective of dynamical systems theory; in particular, I'll attempt to show what the mathematics say and, more importantly, what they don't say.  This is vitally important, as transhumanists and singularitarians frequently draw conclusions about the singularity that are unjustified by the actual mathematics.  For those who don't know what the technological singularity is, you can read this Wikipedia entry for a basic understanding.

So, dynamical systems theory is, not surprisingly, the study of dynamical systems.  The term "dynamical system" itself has a few equivalent definitions, but the one I shall use is the following:


Definition:  Suppose we have a map F: U ⊆ ℝ → ℝ, a time interval I ⊆ ℝ, and an initial condition (t0, x0) ∈ I × U.  Then we have a (1-D, homogeneous, continuous) dynamical system given by the solutions x to the initial-value problem

x'(t) = F(x(t))
x(t0) = x0

Usually, for convenience, we set t0 = 0.
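For those who prefer code to symbols, here is a deliberately crude forward-Euler sketch in Scala of what 'the solutions x' means in practice; the function, step size, and names are my own choices for illustration only:

// Numerically approximate the solution of x'(t) = F(x), x(0) = x0, up to time tMax
def euler(f: Double => Double, x0: Double, tMax: Double, dt: Double = 0.001): Double = {
  var x = x0
  var t = 0.0
  while (t < tMax) {
    x += f(x) * dt   // advance the state by x'(t) * dt
    t += dt
  }
  x
}

euler(x => -0.5 * x, 100.0, 1.0)  // ≈ 100 * exp(-0.5) ≈ 60.7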

Why is this important?  Because the Law of Accelerating Returns (LAR) is itself a dynamical system, and can be analyzed according to the rules of dynamical systems theory.  It is given by the equation

x'(t) = ax
x(0) = x0

where a < 0 is some constant representing the rate of decay, x(t) is the cost of information technology at time t, and x0 is the current cost.  In ordinary language, it states that the cost of information technology falls at a rate proportional to the cost itself.

This is very important to note:  this is all the LAR says!  Kurzweil himself will say this.  The LAR therefore cannot be used to derive any additional conclusions about the so-called march of technology, at least not without plugging in additional facts or assumptions about society, the economy, etc.
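Just to make that concrete, here is the standard separation-of-variables solution of the equation above (nothing here goes beyond first-year calculus):

\[
  x'(t) = a\,x(t)
  \;\Longrightarrow\;
  \int \frac{dx}{x} = \int a\,dt
  \;\Longrightarrow\;
  \ln x(t) = a\,t + C
  \;\Longrightarrow\;
  x(t) = x_0\,e^{a t}
\]

With a < 0, this is nothing more than exponential decay of the cost toward zero; any stronger claim about the march of technology or society has to be imported from outside the equation.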

EDIT:  It looks like I'm going to have to make this a series after all.  The next entry in the series is going to contain examples of how to solve the LAR under certain societal conditions.

~Ian

Monday, December 31, 2012

On the Genealogy of Rationality

Note:  This is a paper I wrote for university in early 2011.  The prompt was to take one concept from Friedrich Nietzsche's Über Wahrheit und Lüge im außermoralischen Sinn (On Truth and Lies in a Non-(Extra-)moral Sense), and to deconstruct it.  I figured that using a genealogical deconstruction was appropriate, as this was how I imagined Nietzsche would have done it himself.  The concept I picked was his "rational man," and throughout the essay, I map the genealogical change in this concept, and relate each step to Nietzsche's own (mature) epistemology.

Yes, I realize that this paper is somewhat simplistic, and has its problems.  It was written over a year and a half ago, and it would probably look very different if I were to write it today.  Plenty of sections need rewording or even overhauling.  This is also only the second draft, as I could not find the final draft.  I've made a few revisions to the parts of this draft that would have been just embarrassing to post as they were.  Perhaps at some point I'll write a post critiquing this paper, and maybe even outlining a new genealogical deconstruction.



     Many words have more than one meaning, and these meanings can change over time. One such word is “rational” or, equivalently, “reason.” Nietzsche uses this word to characterize one possible archetype of person, as opposed to the “intuitive man.” In order to understand what Nietzsche means when he puts forward this dichotomy, an understanding of precisely what the “rational man” is, as opposed to the “intuitive man,” is necessary.
     Why does this matter? Who cares what Nietzsche thought of the concept? Well, we often take advice, at least to some extent, from other people on how to live our lives. Nietzsche is another person, one who arguably anticipated Sigmund Freud and psychoanalysis (Sorgner, Metaphysics... 84-85), and so, it could be argued, has some valuable contributions to make on the subject. Nietzsche counterposed the psychological state of his “rational man” to that of the “intuitive man,” which he preferred. So, if we are to take his advice, it is advisable to know where precisely the distinction lies.
     The best method, or at least the best start of a method, to this end is to have an idea of what the word has meant throughout history. Nietzsche himself was well aware of the ever-shifting nature of language (Nietzsche, On Truth... 2). The shift in meaning of this particular word has been not in its denotation, but in its connotation. While the word itself has Latin origins (“rational, n.1”), the concept itself can be traced to ancient Greece (in both the sense of the word ratio and the sense of the word reason, but I will only look at the second case here), when Socrates claimed that the “unexamined life is not worth living,” and proposed a method of dialogue in which ethical propositions were examined and their basic assumptions scrutinized.
     Some may object to this examination of the philosophical usage of the term, rather than its popular usage. I, however, think that the philosophical usage is quite appropriate. The main reason for this is that Nietzsche, obviously, was a philosopher, and one very literate in his predecessors. Nietzsche often used terms that seem downright bizarre in the context, because he was using the traditional philosophical definitions of his time (which also differs from the definitions of today, but this is irrelevant to this paper). One such case is Nietzsche's rejection of “the Truth”; however, understood in its philosophical sense, Nietzsche was exclusively using this term to refer to the traditional correspondence version of truth, which many had already recognized by Nietzsche's time as outdated (Sorgner, Metaphysics... 80).
     Plato, speaking through his character of Socrates, decided to apply this method to other areas as well, such as ontology (the study of being), and claimed that one could thereby gain knowledge of the world as it is, as in his “Allegory of the Cave,” in which he compares the rational mind to a person who escapes from a cave and perceives the source of the shadows cast inside it. Plato was also among the first to refer to “philosophy” as the love of wisdom.
     Nietzsche would have agreed wholeheartedly with the idea of ethics as the foundation of philosophy, although not, as we shall see below, the idea that ethics can be based on this pure form of examination, independently of values. However, Plato's idea of an outside of the cave in his Allegory was rejected by Nietzsche. This, for him, would be as much “an absurdity and a nonsense” as “an eye turned in no particular direction” (quot. Sorgner, Metaphysics...).
     Aristotle was the next to change the standards of rationality, by requiring that things be based on experience, and came up with his own physical theory on this basis. He also repudiated the application of the Socratic method to ethics, and claimed that ethics was simply the application of reason to achieve happiness. Finally, Aristotle came up with the first formal logical syntax, which we now call syllogistic or classical logic.
     Nietzsche agreed that ethics were ultimately based on a person's values, though not on happiness (“Man does not strive for pleasure; only the Englishman does” being the most well-known quote on the subject). For Nietzsche, ethics were based simply on the philosopher's prejudices (Sorgner, Metaphysics... 27-28), whatever they may be, and the philosopher devised a metaphysical system in an attempt to justify these ethics ad hoc, often in the form of positing a God who holds the same position. Reason, then, for Nietzsche, does not apply to ethics in the way it did for Aristotle.
     After the fall of Ancient Greece to the Roman Empire, there were no major revisions to the connotations of the word until the late Renaissance, when René Descartes began the era of modern thought by asking what he could know with absolute certainty. He concluded that, while he may be dreaming, or deceived by a malicious demon, one thing that he could be certain of was his own existence. If he were deceived thus, he would still have to exist to be deceived. Thus, Descartes returned to the Socratic method of examining basic assumptions, but applied this to all knowledge, rather than simply ethics or ontology.
     Descartes effectively founded a school of philosophical thought called Rationalism, which was prevalent during the Age of Enlightenment. Its main exponents were Benedict de Spinoza and Gottfried Leibniz; its tenet was that all thinking beings, humans included, begin with innate knowledge, which they know intuitively. These ideas were an inherent property of reason and of humans, just as having four sides is an inherent property of a quadrilateral. Using this knowledge, such as the Law of Non-Contradiction (a proposition and its negation cannot simultaneously be true) and the Principle of Sufficient Reason (everything that happens had a reason why it happened one way rather than another), they could build an entire system of knowledge that transcended experience, and perceived the world as it truly was, thus returning to the Platonist ideal of knowledge through “pure reason,” independent of experience (Schopenhauer 7).
     Nietzsche, as is well-known, rejected the ideas of the rationalists, and of pure reason: “There are many kinds of eyes...consequently there are many kinds of 'truths', and consequently there is no truth...Henceforth, my dear philosophers...let us guard against such contradictory concepts as 'pure reason'” (quot. Sorgner, Metaphysics... 88). Nietzsche was also opposed to many other conclusions of the Continental rationalists; particularly, the immaterial, unified soul. This was the rationalists' pet solution to the mind-body problem (the problem of the relation of the mind to the brain); the possession of an immaterial logos by humans, which separated them categorically from the rest of nature (Sorgner, Beyond Humanism... 14).
     Nietzsche rejected this immaterial soul, and posited that the human (and überhuman) is “entirely body” (Nietzsche, Thus Spoke... 32), rather than a combination of body and soul. Given the context, Nietzsche may be accused (and has been) of denying reason itself by denying the soul. However, this is contradicted by the fact that according to Nietzsche himself, reason is to be highly valued (Sorgner, Beyond... 14). The utility of reason for Nietzsche was not in any sort of inherent property of the concept, as it was for the Rationalists, but in its utility for our survival and attainment of power, as it helps us map out the world and “impose Being on Becoming” which for Nietzsche is the highest form of power (Sorgner, Metaphysics... 56).
     Directly opposed to this school was the “Empiricist” school, the main proponents of which were the British philosophers John Locke, George Berkeley, and David Hume. This school held, contra the Rationalists, that all knowledge comes to us through experience, rejecting the concept of “innate ideas.” Locke started the trend by stating that we only had knowledge of our “ideas,” which in the Empiricist tradition meant our thoughts and perceptions. Berkeley took this further by claiming that we have no empirical evidence of the material world, and thus it does not exist. Hume went even further than Berkeley by claiming that not only the material world was without empirical support, but so were inductive reasoning, causality, and even the self as unified subject. It was this extreme scepticism that aroused another philosopher, Immanuel Kant, from “dogmatic slumbers.”
     As was hinted earlier, Nietzsche disputed the unified subject, which before the empiricists had been so taken for granted. For Hume, the subject was a “bundle of perceptions” that were loosely held together by their similarities, which creates the illusion of unity (Schopenhauer xii). As it was for Hume, so it was for Nietzsche. Nietzsche broke up a person's psyche into their conscious, rational side and their unconscious, emotional side, thus anticipating Freud's famous threefold division of the subject (Sorgner, Metaphysics... 85). However, the most important similarity with the empiricists was their emphasis on “this world” as opposed to some other, and thus their (barring Hume) emphasis on science as the most reliable method of gaining knowledge: “Whereas the man of action binds his life to reason...the scientific investigator builds his hut right next to the tower of science” (Nietzsche, On Truth... 7). Reason, then, was to be derived from experience.
     Kant was the first major philosopher to attempt a reconciliation between the two schools of rationalism and empiricism. His own school of epistemology, called “Transcendental Idealism,” reasoned thus: neither experience nor reason alone can provide knowledge; while certain “innate ideas,” such as the rules of logic, may exist, they cannot be applied outside of experience without leading to “dialectical antinomies.” Kant thus justified causality: not as an outer feature of the world, but as an innate idea, which people use to map out the world and make it intelligible. We cannot, however, transcend this map to gain knowledge of the “thing in itself” by pure reason. Kant therefore seems to return to the Aristotelian view of epistemology, by synthesizing reason and experience (Schopenhauer 13).
     Nietzsche concurs significantly with this. The word “perspectivism” is often used to describe Nietzsche's epistemological stance, that truth is inseparable from the perspective or language used to describe the world. As noted earlier, Nietzsche compared a perspective to an eye turned in a specific direction, and used this analogy as a way of dismissing the idea of pure reason. This resonates significantly with Kant's idea that we cannot gain knowledge of the thing in itself, although Nietzsche criticized Kant for positing such a thing in the first place. Even so, Nietzsche regarded Kant as an important step in the evolution of perspectivism in his “History of an Error” (quot. Sorgner, Metaphysics... 90).
     After Kant came the Romantic period, which was varied in its epistemology. On the one hand, we have philosophers such as Johann Fichte and Georg Hegel, who took Kant's “dialectical antinomies” not as a sign of the failure of pure reason, but as the path that reason must take; to gain knowledge of the thing in itself, we must only apply a new form of logic, often called “dialectical logic.” Others, such as Arthur Schopenhauer, kept true to the necessity of experience, and claimed that inner experience, of one's own mind, was the key to the thing-in-itself (Schopenhauer xiii).
     Nietzsche was, of course, heavily influenced by Schopenhauer, but he also came to reject many of Schopenhauer's views. Schopenhauer's dictum that “The world is my idea” (Schopenhauer 1), in other words, the only source of knowledge about the world lies in our representation of it, anticipated Nietzsche's perspectivism by nearly half a century. Nietzsche spoke of his “sense for hard facts, his good will for clarity and reason” (quot. David Berman, Schopenhauer xxxvi). However, Nietzsche also heavily criticized what he referred to as Schopenhauer's “mystical embarrassments” (Sorgner, Metaphysics... 102); most notably, the idea that one can gain knowledge of the thing in itself, even through inner experience.
     This is the context in which Nietzsche was writing. As we have seen, there are significant similarities and significant breaks with many traditional philosophies. Perhaps the most significant break, however, was not mentioned. All these versions of reason rely on a correspondence theory of truth; that is, a definition of the word “truth” to mean that an idea corresponds with reality. Nietzsche, however, rejects this, claiming that there is no way to actually know what corresponds with reality, and that truth is inseparable from a person's perspective on the world and from the language he uses to describe it (Sorgner, Metaphysics... 80). So, in reality, Nietzsche's epistemology cannot be reduced to any of its antecedents.
     That said, can we assess the similarities? Well, Nietzsche's perspectivism is much closer to Kant's than to anyone else's mentioned here. Although Kant, unnecessarily in Nietzsche's opinion, posited a “true world” that existed but was unknowable, he at least acknowledged that it was indeed unknowable. Indeed, without Kant, the philosophies of Schopenhauer, and thus of Nietzsche, would have been nearly unthinkable (Schopenhauer 16). So, it seems that Nietzsche's use of the word “rational”, while different in many ways, most represents the acceptance of reason as useful for mapping out the structure of the world, but not for gaining knowledge of any sort of true world. This is what we should keep in mind when we assess Nietzsche's opposition of the rational man to the intuitive one.

Works Cited

"rational, n.1". OED Online. March 2011. Oxford University Press. Retrieved 23 April 2011 <http://www.oed.com/viewdictionaryentry/Entry/270902>.
Nietzsche, Friedrich. On Truth and Lies in a Nonmoral Sense. 1873. eBook.
Nietzsche, Friedrich. Thus Spoke Zarathustra: A Book for All and None. New York, NY: Barnes & Noble Books, 2005. Print.
Schopenhauer, Arthur. The World as Will and Idea. London, UK: Orion Publishing, 1995. Print.
Sorgner, Stefan. "Beyond Humanism: Reflections on Trans- and Posthumanism." Journal of Evolution and Technology 21.2 (2010): 1-19. Web. 20 Apr 2011.
Sorgner, Stefan. Metaphysics Without Truth: On the Importance of Consistency Within Nietzsche's Philosophy. Rev. 2nd ed. Milwaukee, WI: Marquette University Press, 2007. Print.

Saturday, December 29, 2012

The Obligatory Post on Gun Control

Why did it take me so long to write this post?  A number of reasons, really.  Most importantly, I don't think that politicizing people's deaths is particularly ethical, so I certainly wasn't going to post this until I thought I'd given it a fair amount of time.  Indeed, I wasn't going to write this post at all, originally, but a nagging voice in my head keeps telling me to.

Okay, so what is the argument in favour of gun control?  For the four of you who haven't heard it: briefly, it states that if we ban (or tightly regulate) gun sales to civilians, then shootings such as the recent ones are less likely to happen, since there would be no civilian-owned guns to use.

My first problem with this argument is that I don't particularly fancy handing the power to regulate weapons to the same government responsible for the NDAA, the Patriot Act, and plenty of militaristic behavior besides.  I think they've proven well enough that they shouldn't be trusted in that regard.

The thing to remember here is that even if we completely got rid of civilian gun sales, the government would still have unrestricted access to guns.  As my friend Tim Cannon put it, while giving guns to a bunch of monkeys is dangerous, giving guns to only a few monkeys is even worse.

My second problem actually has to do with the gun culture, the existence of which liberals are quite right in pointing out.  Our gun culture in the United States is very much a problem, and it's very much a factor in causing these shootings.  It's precisely because of this gun culture, however, that gun control isn't going to do jack shit in this country.  A country with a strong gun culture will experience mass shootings whether guns are regulated or not.  The problem isn't the guns, it's the culture.

Finally, and most importantly, I can't find a shred of evidence backing up the claim that gun control laws would reduce the number of deaths by firearms.  Sure, people point to specific countries like the United States or Japan, but anecdotes aren't evidence.  I recently came across a table, compiled by The Guardian, of the rates of gun ownership and firearm-related deaths in different countries.  Below is the result of plotting this information.  The first graph plots the percentage of deaths caused by firearms against the percentage of gun owners, and the second plots the number of deaths caused by firearms against the percentage of gun owners.

[Graphs: percentage of deaths caused by firearms vs. percentage of gun owners, and number of deaths caused by firearms vs. percentage of gun owners.]

Now, note that there is clearly no correlation in either case.  Indeed, what struck me is that the first graph appears to be completely random in terms of the rate of gun use in murders.  But what seems particularly fascinating to me is the near-bell curve in the second graph, with the most homicides occurring in countries with an intermediate number of guns.  My (probably really shitty) attempt at an explanation is as follows: in countries with few guns, there are few gun-caused homicides because there are so few guns in the first place.  In countries with many guns, there are few such homicides because of the deterrence factor involved in having so many people with guns.  Only in countries with an intermediate number of guns is neither factor strong enough to stop these shootings.

Now, I imagine that this data will be criticized as not necessarily coming from the best sources, and I'm somewhat inclined to accept this criticism.  You can see the sources for each individual country in the table, but if you have a better table from a better source, link me to it and I'll plot that data as well.

Of course, even if there had been a correlation, it wouldn't demonstrate causation.  For example, the lack of gun control in the United States is probably in no small part due to the gun culture.  Since a strong gun culture also produces more shootings, it's possible that gun control and the number of shootings are related not by direct causation, but rather through a common hidden variable, namely the gun culture.

~Ian

EDIT:  God damn it, those pictures turned out shitty.  Will try to fix them later.