Tagged as Books, Programming Languages

Written on 2007-06-08 21:06:00

On Tuesday I was quite relieved to be off work. It had been a stressful day at the office but I had exciting dinner plans with friends (trivia night at Benchwarmers with Justin and Bria, for the curious). I had finished Cradle to Cradle by McDonough and Braungart the day before and needed to start on my next summer reading book. Nothing on my list really suited me though. Most of the books are rather academic, and while I find social production fascinating, academic writing wasn't really something I was ready for. With that in mind I stopped in two bookstores while running afternoon errands and walked away with a copy of Paul Graham's Hackers and Painters. I'm about 50 pages in at the moment and have really enjoyed reading it for the past few days. I'm sure some bits will make it into my next quotables. However, it has raised a number of questions/issues/thoughts in my mind, for which I consulted the collective intelligence of "teh interwebs/tubes" this morning and stumbled upon someone who had made both a parody of Paul Graham and what is (in my mind) a more serious/grounded critique. Perhaps the most nerd-controversial thing Graham talks about is programming languages. There are few things more hotly contested in the programming world. In the critique of Graham though, what I found most interesting were the analogies drawn to other internet authors like Eric Raymond. I finally stumbled on this quote on a different site: "Every 5 minutes you spend writing code in a new language is more useful than 5 hours reading blog posts about how great the language is."

And it was at about that moment that I realized that (formal and/or programming) languages are completely incidental. Graham notes early in the book that Computer Science has a real mix of participants. The computer science building at a university might contain honest-to-goodness mathematicians who get their work filed as computer science for funding purposes, a middle ground of those studying computers without making things with them, and programmers trying to make software. For the programmer, language is incidental. The language is only a tool for creating software...and arguing about programming languages is in some ways similar to the creation-evolution debate in suburban America. If you're a scientist, it's possible that such a debate matters. It can affect how you approach your work and/or your starting assumptions. Similarly, if you're a computer scientist at a university, it affects your research. If you're a programmer though, or a suburban American, then either side could be right. It's effectively incidental to your daily life. If you live in suburban America and are neither a pastor nor a scientist, then it matters not whether God created the universe or whether it began out of nothingness one day, and it matters not whether it took ages or attoseconds. You will still get out of bed, dress, tend to whatever dependents you have (dogs, children, spouses, etc.) and go to work. That is what you will do three hundred and fifty days a year, regardless. Similarly, the programmer can be using the world's best language or its worst, but they will still roll out of bed and try to produce software with it. The only point at which language can conceivably matter to the programmer is if it can ease software development, and this is a very personal, very individual thing. The same might be said of SCMs (source control management systems).

The extent to which people fanatically advocate languages, architectures, etc. beautifully exhibits how intrinsically subject to network effects technology, and specifically information technology (that is, computers), is. It's not an issue of simple memetics. Why do Mac users advocate Mac? Why do Linux users advocate Linux? Why do Python users advocate Python, or Perl users Perl? Why do Nvidia owners advocate Nvidia, or AMD users AMD? Why do PS3 owners advocate the PS3, or Wii owners the Wii? Simply because the ability to manipulate all the information you use computers to manipulate in your life (and there is lots of it) whenever you want and wherever you want is good. The information becomes more useful as its accessibility increases. You would never bother typing journal entries into a computer if they couldn't be posted to the internet. There would just be no damn point. You'd keep a journal. At some point Graham tries to draw a distinction between the popularity contest of high school, which he argues is pretty arbitrary, and the popularity contest of the real world, which supposedly is less so. Here's a thought: neither of them is arbitrary, and both of them are examples of human beings making decisions based on incentives, not that economics has any concrete idea how to explain this human behavior beyond such a simple premise. Clearly though, the implications are profound. If a less mainstream technology has become an integral part of life for an individual, they will fervently advocate it so that others adopt it, in an effort to keep around the engineers hired to maintain and/or improve the technology as long as necessary. The support industry itself advocates the technology in an effort to maintain its jobs.

Everybody hopes to know or be involved in the next big thing because it's tied to their employability. If you're looking for work, you'll try to learn Java before Algol. This may be dumb, because Algol would certainly distinguish you more from the mobbish competition. Moreover, whatever companies do need Algol coders would have little in the way of alternatives and would probably have to pay decently. You'd potentially have a bit more bargaining power. Of course, so would they: take the job or be unemployed. Really though, the whole thing is just ridiculous. The fact that all we want computers for is to do useful and interesting things complicates the issue, as does the fact that languages differ in how easily they solve certain programming problems and in how fast they run. And I'm convinced that just ordaining a standard language, platform, architecture, etc. would result in a net productivity/innovation loss. Besides, the majority of programmers get employed in spite of all this, and many innovations make it to prominence. I still don't buy the quote from Almost Famous about pop music being good because so many people like it, but there's something much more complex at work here that I just feel is slipping off the tip of my tongue and hiding in the back of my head. I guess it sort of reminds me of what McDonough said about nature producing good waste (which in some sense means no waste). The extra blossoms produced by a cherry tree litter the ground, but in a good sense. It's all so that one new cherry tree will emerge, but it manages to fertilize the "littered" area anyway. Almost like it's all about making goods with positive network effects. And certainly languages like ripping off each other's features. As do markets. So maybe this good waste is what we're really after. Markets definitely don't make good waste when they produce things. It's clear that our way of making stuff is pretty borked.
But there's definitely something to be said for letting all our ideas fly around and crash into each other. But...what does it mean if there are no experts left? Were we wrong about the concept of the expert to begin with?

Unless otherwise credited all material Creative Commons License by Brit Butler