
Ye Olde Personal Update

posted on 2010-02-17 05:32:19

2010 has, to date, been an odd year. I'm not entirely sure what I expected but I had something more idyllic in mind. For the last two weeks, I've been struggling mightily with some personal issues and have come to recognize some character flaws in the process. That's never fun. In particular, I have some communication problems and there are scenarios in which I simply shut down. Without warning or even conscious recognition, I withdraw and disengage. There are some rough similarities in these scenarios but not enough for me to figure out something conclusive about causality. The flaws caused trouble in both my personal and academic life in late January and early February.

My actual school courses have been going well so far but I've been screwing up my internship and am in the process right now of getting caught back up in that department. It's quite embarrassing, as this is something I had looked forward to and it involves people that I look up to. Murphy's Law applies to the timing of our character flaws and communication problems too, I suppose. Valentine's Day was nice at least. I enjoyed preparing some really lovely steaks with a cognac-peppercorn sauce. Clearly, my life is not *too* hard.

To try and keep that positive note, I'll end with a few things that have made me happy lately:
- Music by Local Natives, Band of Horses, Ametsub, Massive Attack (Heligoland is pretty solid),  Fleet Foxes, Jon Hopkins, The XX and Miles Davis' Flamenco Sketches.
- Writing wrappers for web services/APIs is (so far) reasonably straightforward and fun. It's nice to know that there's a ton of good data out there waiting for neat uses to be made.
- I have ideas and a desire to contribute to way more programming projects than I have time for. Some are others' projects. Some I'd start myself. I maintain that this is a good thing as long as I stay focused on what's on my plate and finish one thing at a time.
- I have ideas for future blog posts and github uploads. Still, I'm sticking to my "no pressure blogging" schedule for 2010. There are more important things than...well, this self-aggrandizing whatever it is. That said, I've always been surprised that I find my blog so useful for remembering where I was, what I was thinking and what I was struggling with years later. Some days it's the only way I can convince myself I'm moving forward.
- Factor is a nice language and I've spent a few hours playing with it again. There are trivial and non-trivial things I like. The biggest thing is the exceptional interactive nature of the language and how well integrated it all is. From a design standpoint, I just appreciate it. It seems to get a lot of things, compromises... right. More on that another time. Factor 0.92 was released today! It's been two years since the last release; I look forward to helping test a few things before 1.0 and will keep hoping for native threading before 2.0. :)
PS: The Factor logo is a raptor! How has Randall Munroe *not* written an XKCD comic about this?
- I've finally found a few people (3) at school that are legitimately interested in programming and care about it. It's taken over a year. That's far too damn long.

On Minimalism

posted on 2009-11-15 00:09:07


Today, I want to talk about something I've been meaning to get around to for a while. Specifically, I want to mention some realizations I've had about restrictions-as-strengths as it relates to programming languages. This blog post was so long in coming in part because I had conflated that issue with a desire to also discuss the source of brittleness (or non-modularity) in programs and how this all tied back to the foundations of programming and CS. Obviously, that's too much for one post. There are three questions here and each is important to me. Later, I'll blog about the other issues and hopefully write a summary to tie it all together.


Programming is really hard. We've known this for a while. As Hal Abelson said, "Anything with Science in the name, isn't." Software engineering is no better off. Certainly no one would argue that we know how to build software as well as structural engineers know how to build bridges. Difficulty in software doesn't stem from a single source, but as programmers we need to localize it as much as possible. Historically, one way we have done this is through tools: our programming languages with their compilers, profilers and debuggers, our operating systems, and other bodies of code to reuse (libraries).

The Meat

Software has many demands placed upon it. First and foremost it needs to be functional, by which I mean correct and relatively stable. Beyond that it needs to be reasonably fast, and there are often other concerns, from user friendliness to security. All of these concerns introduce complexity to our software; that complexity needs to be managed, and I think a central question in managing it is how to partition and quarantine it. I did a rather poor and embarrassing job of at least raising that question a while back. I was lucky enough to find an outstanding attempt at an answer on Daniel Lyons' blog. It was completely coincidental though; he hadn't read my article.

Daniel framed this problem as one of seeking minimalism. I see the same answer from a slightly different angle. To me, there seems to be a pattern of trying to handle complexity by restricting the actions of the programmer. For example, in Peter Seibel's Coders at Work there are numerous mentions of how different subsets of C++ are chosen by different teams of programmers to reduce the complexity of overlapping or interrelated features. People will entirely abandon templates or operator overloading. Douglas Crockford mentions making sure to never use the continue statement in his code. These are examples of programmers simplifying their own mental model to make problem solving more tractable.

Languages do this too of course, the most prominent examples from my very limited experience being Haskell forcing you to think in terms of its type system or Factor forcing you to think in terms of its stack. Adapting to the constraints may be awkward or difficult at first but they do provide real benefits. The Glasgow Haskell Compiler is capable of remarkable optimizations because it can assume that you've adhered to the restrictions of immutability and laziness. Judicious use of the type system can eliminate entire classes of possible bugs and restricting the use of mutable state simplifies parallel programming and concurrency. Through use of the stack, Factor strongly encourages thinking about the dataflow of your program. I've heard this sort of thing expressed before as a language being opinionated about how to solve a problem and there are plenty of diatribes on the failure of one paradigm (opinion) or another, especially OO since it's been dominant for so long. But let's not confuse this issue as being about particular paradigms or programming languages.
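As a toy illustration of that kind of eliminated bug class (a sketch of my own, not an example from any of the discussions above): a Maybe-typed lookup makes handling the missing case a compile-time obligation, where a null-returning lookup would make it a runtime surprise.

```haskell
-- A lookup that admits failure in its type: the result is wrapped in
-- Maybe, so the caller cannot forget to handle the missing case.
lookupAge :: String -> [(String, Int)] -> Maybe Int
lookupAge = lookup

-- Pattern matching on the Maybe forces both branches to be written.
describe :: String -> [(String, Int)] -> String
describe name table =
  case lookupAge name table of
    Just age -> name ++ " is " ++ show age
    Nothing  -> name ++ " is not in the table"

main :: IO ()
main = do
  let table = [("ada", 36), ("alan", 41)]
  putStrLn (describe "ada" table)   -- ada is 36
  putStrLn (describe "grace" table) -- grace is not in the table
```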

There are languages on the other end of the scale.  I tend to think of Common Lisp as the primary one. (Disclaimer: I've written far more code in CL than anything else. My coding experience in every other language is positively trivial by comparison.) Common Lisp has been described by many others as being agnostic or happy to let you express a problem any way you can think how. Then again, Common Lisp requires you to think in Abstract Syntax Trees. It's the rift between opinionated and unopinionated languages that I'm curious about. Of course, Haskell and Lisp are (generally speaking) solving different problems as lispm (Rainer Joswig) notes on hackernews.

Vladimir Sedach suggests that the rift is about metaprogramming. More specifically, he states that Lisp and Smalltalk are self-describing computational systems. Lisp is written in Lisp. Smalltalk is written in Smalltalk. It's that old metacircularity/homoiconicity chestnut. Furthermore, he mentions that Haskell can't be self-describing because it's built on two computational systems: the type system is one and the language built atop it is another. Besides, as Vladimir says, "If type systems could be expressed in a metacircular way by the systems they were trying to type, they wouldn't be able to express anything about types (restrictions on computational power) of that system." Factor's FAQ even mentions that they avoided purity and static types to enable the benefits of metaprogramming and interactive development. Personally, I know what I miss most in Haskell is an environment like SLIME. Interactive development of that style has an entirely different feel to me.

These observations about types and metaprogramming were a revelation to me, and clearly it's a limitation on the expressive power of something like Haskell. The question seems to remain open, however, as to whether such restrictions are enough of a strength to offset their cost. It seems to me that it must be situational but I'm interested in a more in-depth examination of the problem. Unfortunately, I can't offer such an examination myself today. Googling around a bit, I found that a discussion about Lisp started on hackernews while I wrote this. In the discussion, jerf writes largely about this issue. Jerf suggests that the other end of the restricted/unrestricted scale is Java, which sounds about right to me. He also suggests that the problem with maximum-empowerment languages is that they don't scale to large team sizes and (correspondingly) large programs.

Of course, there are counterexamples like ITA Software but macros were thoroughly debated even among friends at ILC 09 as harmful or helpful to team programming. Vladimir Sedach again has a good grasp on their utility. In my opinion, neither Factor nor Lisp has resolved this question yet but more companies are getting involved with metaprogramming whether it be in Ruby or something else. I hope methods of containing the destabilizing facets of metaprogramming will emerge. Where Common Lisp is concerned, I think Fare's XCVB is an interesting opportunity in this direction and I'm watching it intently.

The Takeaway

Enabling single programmer productivity is important but so is enabling teams. Java as an extreme example of restriction has caused at least as many problems as it has solved. Languages with powerful metaprogramming features often need to be restrained in some fashion when used by large teams. There must be a middle ground somewhere. I say this because eventually our systems (codebases) become huge and unwieldy and we need our languages to support the difficult task of controlling that complexity and of keeping them modular and malleable. That problem of verbosity and inflexibility is precisely what metaprogramming tries to solve. I'll write about the problems in achieving modularity more in my next post.

RedLinux Revamped

posted on 2009-05-12 20:00:25

So the last time I really blogged about RedLinux was back in September '08 when I made the first release. I kept tweaking things now and again but at this point I've got something that I'm really not messing with very often. I've christened it "RedLinux v.20" and I'm planning on making releases every six months or so. They'll mostly consist of package updates, with any other changes listed in the changelog. I'll be trying to keep the latest versions of all my dotfiles and a complete Arch Linux package list in the RedLinux docs portion of my website. With the dotfiles and the package list, you could basically build the thing yourself anyway, and it makes my life easier besides. The only real long-term plans other than that are to improve documentation and make it friendlier for other people to tinker with.

I've made a Live CD image with an installer and uploaded the ISO to my Amazon S3 account. You can grab the ISO here. The old install guide still applies.

Playing with Haskell

posted on 2009-05-03 04:47:39

Before I start running my mouth without thinking, some disclaimers:
1) This has no business showing up on reddit. It's a journal entry from a dumb kid who played for a few hours. I'm acutely aware of it. Also, there would be no post (no code) at all if cgibbard (Cale) in #haskell hadn't been so damn helpful. Thanks also go out to olsner, ski, yitz and Twey.
2) For readers of my blog who aren't interested in programming: this post is entirely programming related.

I've been increasingly enamored with Haskell recently. Back in March I picked up Real World Haskell and read through three chapters, stopping short of finishing the last three exercises in Chapter 3 one day over spring break. School and real life caught up with me, unfortunately, and I've just now had time to sit down with Haskell again. I've been following the language pretty heavily since February and have read a number of Dons' blog posts on things like writing simple unix tools. That's proof enough of disclaimer #1.

Looking back over some of the RWH stuff, I stumbled upon mention of a book by Italo Calvino called Cosmicomics. Cosmicomics is just a cool word. I decided to see how many words like it I could generate. That is, how many words could I generate that are anagrams smushed together by a shared letter. Naturally, I decided it'd be fun to try to do it in Haskell.

The resulting code is 39 lines, 35 removing comments. It is quite readable as far as I'm concerned and as I mentioned, I'm an idiot. The code compiles to a nice, native linux binary that takes up 684k on my machine (ArchLinux x86_64, GHC 6.10.2, etc). It reads in a 6750 word dictionary file with a word on each line and certainly does more computation than is necessary to get the desired result: a printed list of words like cosmicomics. It executes in just over 17 seconds. Fast enough for me. Here's the code:

-- how to find words like "cosmicomics":
-- words which can be split into anagrams by the middle letter
-- PRESENTLY ASSUME THAT ALL WORDS ARE LOWERCASE! I should probably unit test or type for this.

import Data.List (sort)
import System.Environment (getArgs)

main = do
  [f] <- getArgs
  inFile <- readFile f
  let dict = filter (not . null) (lines inFile)
  printFunnies (filter notSameWord (findFunnies dict))

isAnagram :: String -> String -> Bool
isAnagram word1 word2 = sort word1 == sort word2

hasJoiner :: String -> String -> Bool
hasJoiner word1 word2 = last word1 == head word2

isFunnyWord :: String -> String -> Bool
isFunnyWord word1 word2
  | isAnagram word1 word2 && hasJoiner word1 word2 = True
  | otherwise                                      = False

notSameWord :: (String, String) -> Bool
notSameWord (word1, word2) = word1 /= word2

sameLength :: String -> [String] -> [String]
sameLength word xs = filter (\x -> length x == length word) xs

printFunnies :: [(String, String)] -> IO [()]
printFunnies xs = mapM (\(x, y) -> putStrLn (x ++ tail y)) xs

partialFind :: [String] -> String -> [(String, String)]
partialFind dict word = [(word, w) | w <- sameLength word dict, isFunnyWord word w]

findFunnies :: [String] -> [(String, String)]
findFunnies xs = concatMap (partialFind xs) xs

Note from the future: I accidentally navigated away from this page and Wordpress lost the last 600 words / 40 minutes of work. Future posts will be written in Emacs.

I found the tricky parts of the problem to be findFunnies, partialFind and printFunnies. With findFunnies I was actually on the right track; I was just using map instead of concatMap. I'd bumped into concatMap earlier while looking at a different problem but it slipped my mind until Cale said, "Hey, you actually want concatMap here". Remember when I said I was dumb? partialFind may have been the trickiest and there are two reasons for that. One, I'd completely forgotten about list comprehensions. Clever, right? Two, even if I'd remembered them, I had no idea you could test predicates while you built the list. Again, thanks Cale. Then it was working but I wanted to format the results differently and introduced two non-exhaustive pattern matches trying to do it, so it was back to #haskell. I was soon talking with Cale again. Finally, I realized you had to get the type system to accept the IO being done in printFunnies but I was using map as I didn't know something like mapM existed. Even had I known, I doubt I would've been able to throw together the lambda function it's being passed. Cale gave me a working version with forM and I found my way from there.
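In case that's opaque, here's a tiny self-contained illustration (my own toy example, not the code from the session above) of both tricks: concatMap flattening where map would nest, and a list comprehension filtering with a predicate as it builds.

```haskell
-- map leaves the per-element result lists nested...
pairsWithMap :: [[(Int, Int)]]
pairsWithMap = map (\x -> [(x, x * 10)]) [1, 2, 3]

-- ...while concatMap flattens them into one list.
pairsWithConcatMap :: [(Int, Int)]
pairsWithConcatMap = concatMap (\x -> [(x, x * 10)]) [1, 2, 3]

-- A list comprehension can test a predicate while building the list.
evenSquares :: [Int]
evenSquares = [x * x | x <- [1 .. 10], even x]

main :: IO ()
main = do
  print pairsWithMap        -- [[(1,10)],[(2,20)],[(3,30)]]
  print pairsWithConcatMap  -- [(1,10),(2,20),(3,30)]
  print evenSquares         -- [4,16,36,64,100]
```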

I came to Haskell from SBCL (Common Lisp) which I had come to from Scheme (PLT/MIT/Chicken, take your pick). I feel like a lot of my thoughts on Haskell are deeply colored by that trajectory so take the following with a few nice big grains of salt. Now then, for three random thoughts on Haskell from one of those fools who plays way too much with languages instead of getting better by actually getting things done.

It's really nice that Haskell actually has a dominant implementation, GHC, as well as a good package manager, Cabal. Consequently, all the libraries I've encountered on Hackage are easily installable and compatible with GHC. I can't say whether any require GHC extensions or are compatible with Hugs, though. That said, everything seems pretty up to date and I haven't even had to think about installing something from version control. Having easily installable bindings to a major cross-platform toolkit like GTK or Qt, and to curses, is pretty essential in my opinion. Don't even get me started on the awesomeness that is Hoogle and Hayoo.

Haskell is the first language that's made me see types can actually be useful. To be fair, before this I'd only programmed in Lisp, Scheme, Python and C (a trivial amount of each, at that), so you can see why I wasn't thrilled by static typing. Admittedly, I did get a good number of compiler errors over typing problems in my code, but they were always helpful. I'm not decidedly on either side of the fence with regard to typing, and my gut says that static and dynamic typing each lend themselves better to certain problems (see the ease of multiple-arity functions in Lisp and Scheme, for example), but at least now I can see the issue from both sides.

Haskell is also the first language with non-lispy syntax that I actually like. Dons posted an article called On Syntax a while back proclaiming the superiority of Haskell's syntax, but did it a bit of a disservice by using a trivial example. Haskell's syntax gets a fair bit more complicated once you do anything non-trivial, though the code stays concise as a consequence. That said, it's far from unreadable and I hope it'll become more natural as I spend more time writing code. While I found Python's syntax pretty tolerable, Haskell's use of pattern matching and guards really puts it over the top. Lisp has been said to have no syntax, which certainly makes it easier (read: feasible) to use macros, not that macros are merely a matter of syntax. I don't even want to think about Template Haskell. I remember reading something along the lines of "Lisp's no-syntax as syntax is certainly flexible but it doesn't lend itself to expressing anything in a very elegant way so much as expressing nothing in a very inelegant way". At least now I can vaguely understand why people could think Perl's syntax was a good idea.
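To show what I mean by pattern matching and guards (a toy example of my own, nothing profound): the patterns pick the list apart by shape, and the guards branch on values within a shape.

```haskell
-- Patterns match on the structure of the list; guards refine by value.
describeList :: [Int] -> String
describeList []      = "empty"
describeList [x]
  | x < 0     = "one negative element"
  | otherwise = "one element"
describeList (x:_:_)
  | x == 0    = "starts with zero"
  | otherwise = "two or more elements"

main :: IO ()
main = do
  putStrLn (describeList [])        -- empty
  putStrLn (describeList [-3])      -- one negative element
  putStrLn (describeList [0, 1, 2]) -- starts with zero
```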

Ironically, an article got posted to the Haskell subreddit earlier regarding Haskell's syntax and the difficulty of reading it. I thought Nostrademons' comment was pretty thoughtful and it largely sums up my issues, particularly points 2 and 3. Maybe if I'm lucky I'll encounter point 1 as I code more in the coming months. Most importantly, none of these issues (syntax, typing or anything else) are dealbreakers. If you want to learn Lisp or Haskell you absolutely can, and the main reason for that is the exceedingly helpful communities that surround them. When you can jump into a chatroom, mention a problem you're having and instantly get advice from some much smarter and more experienced folks, there's no reason not to dive in and play around. More and more, I'm convinced that languages make it or break it based in large part on their communities, and I promise Haskell and Lisp both have damn good ones.

If you're interested in learning Haskell, these are some of the best resources I've found on the web so far (excluding #haskell, of course):
Real World Haskell - A book available in paperback and readable free online.
Learn You a Haskell - A freely readable online book.
Haskell Wikibook - Community edited online Haskell book.
The GHC Standard Library and User Manual
Any package's docs on Hackage. Just click any of the links under "Modules".
The Haskell Wiki is phenomenal and has stacks of articles on everything from style and idioms to tutorials.
Search Engines: Search the libraries for a function by either name or type!

Finally, here's the output of time funnyWords english-words.txt behind a cut:

Apprehensive About April

posted on 2009-04-07 03:31:00

I sort of forgot that schoolwork all bunches together for a nice crunch towards the end of the semester. Consequently, I'm juggling a fair bit at the moment. Not least because I'm also sorting out Financial Aid and Housing for the next year. On the bright side, I've got some fun projects in the works and I'm not too worried about my classes. I can say I'll be very glad to be done with the semester, even though my summer classes start May 18th. Thankfully, they all are in the afternoon Monday through Thursday.

Here are a few other things that have been going on:

President Obama gave the Queen of England an iPod loaded with 40 show tunes. And we don't know if that's legal. Think our Intellectual Property laws are messed up yet?

Apparently there's a pretty nice concrete skatepark near me. With any luck Burke and I will be regulars over the summer while he's here. :)

Unladen Swallow seems to be moving along nicely. I jumped in to check on their progress. Speaking of Python, Mark Pilgrim is working on Dive Into Python 3 and it's online. There are also a few fun articles on Functional Programming in Python. Oh, and Named Tuples kind of rock. You'll need Python 2.6 to use them though.

Arch Linux is planning an April release and they've got a fair amount slated to get done. I'm rather excited about it.

Some devs implemented a MapReduce framework in Bash over the weekend. I think it's awesome.

John Cowan has an endearing list of Essentialist Explanations about languages that's fun to peruse.

Last but not least, I've been on a real John Mayer streak lately but here are two other songs that I've really enjoyed. E.Z. L.A. by The Folk Implosion is simply awesome. That whole album is, so get that. No City by Aesop Rock is also quite excellent. I still prefer Labor Days though. In other news, Mayer and the Gorillaz have albums slated to come out in the near future. You're thrilled, right?

Quick Thoughts on Languages

posted on 2009-03-26 16:37:21

John Wiegley posted an article that I thought was fun a week or so back called "Hello Haskell, Goodbye Lisp". Naturally, it made it on Hackernews and Reddit and there were comments of all stripes. Whenever somebody posts an article about a language they're learning the initial reaction from some folks is "What an idiot. Implement something big and hard already!" and I think that is a bit obnoxious as well as missing the point. John Wiegley doesn't think he's a genius because he cracked Haskell's code. I think he knows that Haskell is a deep language and he's playing in the shallow end. What he (and many other people) are trying to figure out is which Programming Languages are equipped to deal with the future requirements of programming well and where the languages stand in terms of personal appeal.

Someone mentioned that Clojure would've been a fairer comparison for Haskell than Common Lisp and there's some truth to that. You'd think that Wiegley, being a Lisp fan, would lean towards Clojure, but he really doesn't like the JVM for scripting. Why? He thinks its startup times are unacceptable. I don't mind that, but I'd be more concerned about the difficulty of producing standalone binaries and the reliance on having a JRE installed at all. It's just not my thing, but that's what we're talking about here: preferences.

Different languages are suited to different domains. It's so bad I'm not even sure I believe in the phrase "general purpose languages" anymore. You've got the "server/enterprise" languages (Java, C#, etc), "scripting" languages (Python/Perl/Ruby generally, PHP/JS for web), "embedded" languages (Forth, C/C++, Lua), and then C and C++ again for operating systems, compilers, "low level stuff" and real-time applications/games. Not that games don't often contain twice as much in the way of Lua/Python scripts as C++ core. *sigh*

So, let's be clear. You have to know several languages and the question is: Which languages do I use to be the most happy and efficient? I want my programming future to be fun! My thoughts are presently as follows. I'm not sure how many languages you can really hold in your head at one time. For regular, proficient (expert?) use I'd say I think 2-3 is about right and a half dozen is possible. I have a hard time imagining simultaneous mastery and regular use of 12 languages. I'm going to try to keep my list to 5 languages.

1. You have to be able to deal with C. You just do. It doesn't hurt to know assembly (ARM, 6502, x86, or whatever) but you have to be able to deal with C. If only to create bindings from other languages to C code. So deal. At least we now have a Low-Level and Embedded language covered that will give us the ability to work with lots of Open Source Software and older code. Also, I didn't choose C++ because I think C is simpler and cleaner. Obviously, there's a lot of C++ code around but trying to really know C++ has to be equivalent to really knowing 2 or 3 other languages. I'm skeptical. Again, preferences.

2. Python has lots of libraries, lots of existing code, and is reasonably readable and self-consistent. I think it serves as a good scripting language and can be used for web development fairly well (see: reddit). Plus it's actually employable and higher level than C.

3. I like Lisps. Don't ask me why, it's just a preference and I think they're fun. My priorities with a Lisp are being able to work with C code well, produce fast native code (preferably with some plan for multicore: futures, STM, a threading solution, shared-nothing message passing, etc.), generate standalone executables and have a decent collection of libraries. That leaves me torn between SBCL (30MB hello-world?! I guess I could use CLISP but what about multicore performance?), Chicken Scheme (multicore performance again? they do have futures...) and Clojure (I need the JVM, you don't do standalone binaries, and there's no asdf/rubygems/cabal/chicken-setup/easy_install/anything?). *sigh* Oh, Lisp.

4. Haskell does have some advantages. As far as I know it's the most "parallel-ready" programming language presently available. This is largely because it has solutions for more granularities of parallel/concurrent code than most. And yes, I know parallelism and concurrency aren't the same thing... even though Wikipedia confuses them. Erlang, Scala and Clojure all have benefits, but for various reasons Haskell is my pick of the lot. I'm open to that changing, but it wouldn't hurt to know a lazy, bondage-and-discipline-typed functional language regardless. It's an interesting way of thinking. Hackage is reasonably impressive considering the youth of the language in terms of mainstream use, and Hayoo and Hoogle are quite unusual and cool methods for code search. The quality of the toolset is also impressive (GHC, Haddock, QuickCheck, Cabal, etc).

5. Factor. I'm not trying to be weird here, I'm just interested in knowing a concatenative language. The library support is pretty good. The code looks pretty clean. It produces standalone binaries which are both pretty small and pretty fast. I feel like a lot of smart design decisions have been made. I'm really interested in seeing where Factor goes. There are a considerable number of libraries given the number of people working on Factor and it's age. Hopefully, it'll only keep getting better.
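To put a little code behind the "parallel-ready" claim in point 4, here's a minimal sketch of just one of those granularities: plain forked threads communicating through MVars, straight out of base (finer-grained approaches like par sparks and STM live in the parallel and stm packages). This is my own toy example, not anything from the articles above.

```haskell
import Control.Concurrent (forkIO, newEmptyMVar, putMVar, takeMVar)

-- Fork two workers and collect their results through MVars.
-- takeMVar blocks until the corresponding worker has put its answer.
main :: IO ()
main = do
  a <- newEmptyMVar
  b <- newEmptyMVar
  _ <- forkIO (putMVar a (sum [1 .. 1000 :: Int]))     -- 500500
  _ <- forkIO (putMVar b (product [1 .. 10 :: Int]))   -- 3628800
  x <- takeMVar a
  y <- takeMVar b
  print (x + y)  -- 4129300
```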

So, what are my preferences stated more explicitly?

  • I prefer the late-binding/dynamic/whatever-the-hell-you-want-to-call-them languages like Lisp, Scheme, and Factor. They just make the development process more fun than anything else I've worked with. Clojure tempts me but the lack of standalone binaries, package management, and the requirement on the JVM makes me a little nauseous. I don't think Emacs is too heavyweight for text editing but I think that's too heavyweight. Sadly, these languages appear to me to be in the worst library position, save Clojure which is saved via Java. There's a lot of division of effort among the others and there always has been.

  • Haskell in emacs with ghci open in a shell isn't perfect but it isn't bad either and you do get all that type safety/power/relatively-painless-multicore-speedups. Plus the libraries and hayoo/hoogle, etc are pretty great.

  • As for C and Python, they're on the list so I can possibly be employed at some point as well as play in the codebases of my favorite Open Source Software.

I like High Level Languages and I like a lean towards functional programming. Haskell has been fun and I'm now interested in type systems that are actually nice to use, but I would still run to a Lisp that made fast parallel code easy, even if it meant having some special typing constructs just for stateful behavior. Object Orientation was the hammer used to hit everything that looked like a nail for the last 20 years and while it unquestionably has its place, I prefer the functional style where possible in my personal experience. That may well change when I start writing projects > 1000 LOC. Other than that, I just want a nice development environment and good libraries.

Before I forget, I should note that gaze made a very good comment worth keeping in mind on a recent reddit thread about programming languages which I'll try to sum up in a sentence or two: "The important part of a programming language is using it. You'll learn theory as you go, now write code." Anyway, I'm going to run away from social news sites for a bit and try to get some real work done.

Trying to Stay Ahead

posted on 2009-03-04 16:31:57

There are a bunch of links I want to dump out of my browser and I can't think of a coherent way to do so. Here we go:

The Berrics has finally wrapped up. Unbelievably, I predicted Benny Fairfax's defeat of PJ Ladd. What a long shot. That match was awesome. Similarly, the PJ Ladd vs Billy Marks match was damn fine. I feel a little robbed about Marc Johnson still, but I made decent predictions.

The Bush Administration's legal counsel was the devil. I don't really have any good words for this. I told you so maybe? *sigh*

This is a minimal social compact for the 21st century. I've only read bits and pieces of it and need to go through it again but it's hella cool.

Copyright is really going to struggle to figure out the 21st century and the consumer products it's bringing. Just check out this Ars Technica piece revolving around the kindle. New media is continuing to eat old media's face. (See: newspapers folding)

Miru Kim takes pictures of herself in abandoned industrial complexes and urban ruins. It's rather interesting and at the very least there are some good photos. She feels compelled to take them nude though so it's NSFW. She also gave a lecture about it. The Bennett School is interesting and the Catacombs and St Jacques are just crazy. I don't know what I think.

I believe I posted a link to this before but just in case I still find a bunch of futurists bantering on about the direction of things to be interesting most of the time. So here is The State of the World in 2009 with the Well and Bruce Sterling.

While we're on futurism I might as well mention that genetics is exploding and it won't be long before we're all seeing it. For the record, I do think we should be careful and concerned but if we take the right precautions I promise that synthetic biology will be flipping awesome. Also maybe solar power will close the gap at last. Go figure that the simple, elegant solutions are the right ones.

If you think Computer Architecture isn't in the middle of drastic upheaval for the first time in two decades, read this and then punch yourself in the face. Okay, so maybe that's not as indicative of the upheaval as it could be but I don't have a solid link explaining the shift to multicore architectures (probably) featuring simpler individual cores. Anyway, it's fairly interesting that the industry heavyweight (i.e. Monopoly) may have to completely reinvent itself depending on what the market does at this crucial juncture.

I know you readers care about multicore support in Programming Languages. Neat GHC mutterings in a new paper. Can we expect GHC 6.12 in the next few months? While we're on Haskell, I thought that this library to automatically generate typesafe bindings to C code was pretty f-in awesome. If we have to deal with the legacy of C code, this should help tons. Hackage gets more useful by the day. Other papers? You got it. Lambda the Ultimate helped me stumble on this interesting approach to Compiler Optimization from some smart doctorate seeker named Ross Tate.
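Speaking of typesafe bindings to C: the Haskell library mentioned automates the process, but the underlying idea is easy to show by hand. Here's my own illustrative sketch in Python's ctypes, not the Haskell tool; declaring `argtypes`/`restype` is exactly the type information a binding generator would write for you (the `"libm.so.6"` fallback is a Linux assumption):

```python
import ctypes
import ctypes.util

# Load the C math library; name resolution is platform-dependent.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Without these declarations ctypes assumes C ints, silently
# mangling double arguments -- this is the "typesafe" part a
# binding generator produces for you.
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # → 1.4142135623730951
```

The appeal of automatic generation is that the declarations above are mechanically derivable from the C header, so a tool can keep them in sync with the C code for you.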

A while back an article called Hard Work and Practice in Programming got posted to both hackernews and the programming reddit and generated a lot of comments. It was both discouraging and motivating at the same time. I'm afraid of all the work I'd have to do to know all I want to know but I'm slowly moving in the right direction. That's actually what's been distracting me lately and what the title of this post is about. I want to be ahead of the curve and able to deal with parallelism. There are few programming languages in which that's even possible and that'll only become more pronounced and important in the coming years. Moreover, I know that whatever I end up doing with programming the language I write code in makes a difference to me in how much I enjoy it.

Some languages I like more than others, some paradigms feel more productive or sensible to me than others. Unfortunately, my tastes lean towards the esoteric, which means that working professionally in the languages I like might require a PhD unless the market starts adopting them pretty quickly. Or at least a Masters. There's a lot of mathematics involved and I could stand to brush up on my Calculus as it is. Ah, well. I really just need to shoot a sheepish e-mail to Don Stewart or Bryan O'Sullivan asking what I can do beyond getting my BS and writing Haskell and Lisp to be prepared to get jobs at a place like Galois.

Then again, I think part of my concern is that while getting a job writing High Assurance software would at least be fun, challenging and use Haskell, I'm not sure my passion is in somebody else's software. On the other hand, I haven't figured out what I want software to do for me yet. Hopefully working through the BS will point to some ideas. I still have the idea of a social network for self-schooling or autodidacts lingering in the back of my head. School's fine but I've always wanted a bit more flexibility than I've been able to get at the institutions I've attended.

Speaking of the math involved, I've been harboring an interest in Category Theory lately due to Haskell and I really enjoyed this man's story. I may be a theory learner myself. I definitely appreciate working through the abstract in SICP a lot more than working through some arbitrary example in CS class. Anyway, there's more to gather from this and maybe with a bit of work I'll actually be able to tackle Herstein's Topics in Algebra. It's been calling my name for a little while. Well, more later. I've got a midterm to finish.

Redlinernotes Reborn

posted on 2009-02-27 23:29:33

I can't believe it's only been 8 days since I posted last.  Life has been moving at an insane speed but I've been really happy.  A lot of things are just sort of coming together lately and I have to say it's a pretty pleasant change of pace. Dad's on the mend. We don't have the cancer beat but it's certainly at bay for a little while. He'll likely never be out of the woods completely. Schoolwork is going pretty well. I haven't been working absolutely as hard as I can but I have low A's or high B's in all my courses. I planned out my course schedule for the next few semesters and figured out that I'll graduate in December of 2010. That's with 4 course summer sessions. It's longer than I'd like but nothing from Oglethorpe carries anywhere. Plus CS is my second time switching majors so I was practically starting over. *sigh* Still, I have a plan and that's pretty nice. I also did taxes this week and found out I'll get a $1500 rebate.

You may also notice that the site is going significantly faster of late. I finally manned up and began paying for "real hosting". There are a number of benefits, not least of which is freeing up my home connection from requiring a Static IP. Moreover, the upload speed and latency are much, much better at the hosting facility. It's a Virtual Private Server running ArchLinux which I purchased through Linode. It's $20 a month and so far I couldn't be happier with it. I may do web development on it in Haskell, Scheme or Lisp at some point but that's down the road a bit.

Not everything is roses though. I got hosed on my berrics predictions. Of course, I blame Steve Berra. Marc and Steve were supposed to have a nice game of skate but then Steve caught something awful that looks like chicken pox. Instead of putting the round off further, Steve MC'd and pitted Marc against Johnny Layton who failed to make his first round appearance. Marc was definitely having an off day. He missed like 4 tricks before beginning to hit his stride and it was too little, too late.  If I recall Marc missed a regular 360 flip and a nollie flip. It was painful to watch. Anyway, my whole bracket is F-ed.

I'm having a Street Fighter IV tournament tonight. I've been spending a lot of time working on my game this week. For some reason I get really competitive about fighting games but only fighting games. I don't think SF4 has the mass appeal or the elegance of Smash Bros though. I'll probably try to write more on that later and I should acknowledge I have a strong bias that I'm trying to compensate for from years of Smash Bros play. So far I've settled on Fei Long as my main character and I'm planning to spend some time getting decent with Gouken as my secondary. The tournament should be fun, at any rate.

Other than that, I'm having trouble thinking of what else has been going on. The one bug in Linux that's been bugging me is fixed upstream so the next ALSA release will make me pretty damn happy. I'm increasingly enamored with Haskell. I'm slowly beginning to work my way through Real World Haskell and plan to spend a good bit more time on it over spring break (March 8th-14th if you were wondering). It's the only language I've seen that seems like it can handle issues of parallelism and concurrency more or less today. I'm definitely keeping a close eye on it.

The Generic Quick Post

posted on 2009-02-19 21:20:53

I didn't even realize I hadn't posted in over a week. I just got through the first big "crunch period" in school and did pretty well. This weekend will be relaxation and unwinding to a pretty large extent. So what's happened besides the school stuff?

- I got Street Fighter IV. I am planning on throwing a tournament...details forthcoming. I already think I prefer my Smash Bros tournaments.
- I'm enjoying Fleet Foxes and also The Stills and Vampire Weekend at the moment. Mmm, mmm, music.
- ArchLinux finally put out a new release for the first time in a while. They're also going to try to cut a release with each kernel release from now on, which would be pretty damn cool. A distro that releases 4 times a year? Watch out. Not that most of us Archers don't just install and roll along...
- This is my jam and it beautifully and entertainingly explains what I'm trying to say about parallel programming and the future. It also advocates Haskell a bit, which is nice.
- This one just generally talks smack about for loops, which is not a bad thing. I'm so sick of for loops. I'm not going to get into my snobbery right here. Just know that the fact that I ought to learn C for the future so I can deal with the past is a little frustrating at times.
- Finally, I'm 3 for 3 on my Berrics predictions and with any luck I'll be 4 for 4 this weekend when Marc Johnson finally fights Steve Berra.
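On the for-loop smack-talking: the complaint, as I read it, is that index-driven loops bury intent in bookkeeping. A toy contrast in Python (my own example, not the linked article's):

```python
nums = [3, 1, 4, 1, 5, 9]

# C-style: manual indexing, a mutable accumulator, and room
# for off-by-one errors.
total = 0
for i in range(len(nums)):
    total += nums[i] * nums[i]

# Higher-order style: say *what* is computed, not how to walk
# the array. Same result, no indices at all.
total2 = sum(n * n for n in nums)

print(total, total2)  # → 133 133
```

The second form also composes: swap `sum` for `max` or the squaring for any other function and the shape of the code doesn't change.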

Fumbling through February

posted on 2009-02-05 02:18:44

Some days (or nights) you just feel like an idiot. There's no rhyme or reason and there's just no stopping it. I suspect that it's a result of wanting to do so many things and being unable to cover it all. The last week or so has flown by. Everything seems to be moving very quickly. I've done alright on some of my new year's resolutions but others have fallen behind. Clearly, I'm blogging enough and I've been exercising and/or skateboarding fairly frequently. I haven't extracted samples from my music library since around the second week of school though and I haven't made any progress on HTDP or really any code.

Moreover, I'd like to participate in the Summer of Code if at all possible and really want to work my way all the way through Real World Haskell and The C Programming Language in the near future. The latter things will have to wait for academics and essentials. I just hate not being able to do it all. I think I have the time, I'm just not dedicated enough. I'm not sure.

That said, things have been going pretty well in 2009. Even better than 2008 which was mostly good to me aside from some work troubles and restlessness towards the end. Things with Teresa are outstanding, I'm at least partly enjoying school, I'm dealing with financial aid and I've got friends and hobbies on the weekend. I'm doing well in my classes and need to get back to that for now. There's work left to do.

Aside from work there are two interesting things in the next two days. Tomorrow, a Killzone 2 demo is launching on the European PSN. It should prove interesting as a technology showcase. I remembered reading a cool paper about it but couldn't remember where I found it. Aha. Google found it for me, but be warned, it's a PDF. So here's a quick primer on the Deferred Rendering techniques they used if you're interested.

Alien Workshop is releasing Mind Field on Friday and I'm quite looking forward to seeing that. They've got a hell of a team these days and Heath Kirchart has the ender. I always liked Kirchart. While we're on a skateboarding note I should mention "The Berrics". The Berrics is a combination of the first names of Steve Berra and Eric Koston, close friends and prominent pro skaters who co-own this private skatepark. They've decided to hold a rather far-ranging Game of Skate and I might as well get in on the action and post my foolish bracket before things get further along:

Marc Johnson will beat Berra, sorry Steve. I saw footy that showed Erik Ellington beat Jimmy Cao which wraps up the second round. For the quarterfinals, Benny Fairfax will take out Erik Ellington, Marc Johnson will take out Billy Marks, Mike Mo Capaldi will take out Mike Carroll and in an upset PJ Ladd will take out Eric Koston. For the semis, Marc Johnson will take out Mike Mo (this is actually probably an upset) and Benny Fairfax will take out PJ Ladd (this is definitely an upset). Finally Marc Johnson will take the crown.

Okay. Enough of this whiny nonsense. Back to work. *sigh*

Normal Thoughts, Nerd Thoughts

posted on 2009-02-01 00:18:46

I'm going to try to keep this short. Top 5 things that have been stuck in my head the last 2 days.

1: Actually, 1 hasn't been stuck in my head; it's a few interesting news bits from this morning. One being an interesting interview and look at Middle East policy with Obama, the other being Jessica Alba calling out Bill O'Reilly on WWII neutrality. I normally wouldn't post the latter sort of junk but I found it pretty funny for one reason or another.

2: Amon Tobin is awesome. Literally, awesome. My favorite two albums of his are Supermodified and Permutation but I can't choose between those two. Seriously though, just listen to Nightlife off of Permutation or Slowly off of Supermodified. Listen to those for me. Please. Tell me they're not masterfully composed or beautiful. It's all sample-based. He's staggering.

3: I'm going to be moving by the end of May. I need to save up for a down payment on an apartment (with Teresa) somewhere nearby and public-transit accessible. I may also change internet service providers. If I do that, does it make sense to buy hosting from someone (I'd definitely choose a Linode 360 in Atlanta at $20/month)? I'd still keep a server at home for, uh..."file transfer operations", media serving and SSH access or some such. I just don't want to have to run off of it for bandwidth and downtime reasons.

**Computer Nerd warning: I think what follows may be the most concise explanation of what really interests me in Computer Science that I've written, namely item 5. Item 4 is prerequisites, sort of. If you want to understand some of the reasons I'm into computing and the questions that interest me you could do worse than read what follows. Note that I think Computer Science is generally one of the most interesting fields that exists because it lets you study anything: Games, AI (Psychology/Philosophy/Ontology/Nature of intelligence), Theory of Computation (Mathematical Foundations of Logic), User Interface Design/HCI (Psychology/Aesthetics/Usability), Programming Languages (Linguistics), etc. But what follows are my personal reasons, not general ones.**

4: I posted this reddit thread yesterday but the topics debated are so interesting to me I'm going to post it again. Consequently, I've spent this morning peeking at things like this, and a lot of the work Jonathan Shapiro has been doing over the last few years, particularly BitC and Coyotos. I came into programming last year excited about my understanding that to support the trend towards parallelism we had to rework something significant on at least one of the following levels {Computer Architecture, Operating Systems, Programming Languages}.

I also understood that the field had a lot of lovely innovations which (debatably) never conquered the mainstream such as Lisp, Unix on one side, Plan 9 on the other, RISC architectures, etc. One always has to struggle with Worse is Better. Note that I did say debatably: Lisp/Scheme increasingly influence recent languages, Unix is slowly working towards the consumer market through OS X and Linux and has always been strong in industry, Plan 9...well...[pdf warning]Rob Pike has some interesting words[/pdf], and Intel's x86 chips apparently hardware-translate the CISC ISA down to some sort of RISC-like micro-ops.

The point is that the solutions which were elegant or "technologically superior" did not tend to be the ones favored by the market, for various reasons. Note that I am not saying we all should be using Lisp Machines. These technologies were beaten in the market for good reasons but that doesn't mean they were a direction we shouldn't pursue. Consequently, I am beginning to understand that, because the foundations of this industry that has taken over the world since 1970 are in many ways unsound, we should harbor a desire to eventually rework those foundations. Namely, by inserting abstractions that aid the modern programmer with parallelism and with secure, reliable code.

5: I guess the question that really gets me is, "Where should all this abstraction be?". That is, what are the right layers in which to have the abstractions we settle upon? I think a number of things suggest that the Computer Architecture and Operating System layers are not the correct ones and that the abstractions should wind up in our Programming Languages. Backwards compatibility and price-performance competition with existing industry are the principal obstacles at the Architecture and Operating System layers. Of course, once you've figured out where the abstraction should be you have to move on to "How do we create these abstractions and put them in their place?", the question of implementation, which is for all intents and purposes much harder. This has taken way too long to write and I'm pretty spent at the moment. Hopefully, I'll get a chance to flesh out these ideas later.
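As a concrete (if toy) illustration of an abstraction living in the language/library layer rather than in the architecture or OS: Python's concurrent.futures lets a program state a data-parallel computation while the runtime decides scheduling. This is my own sketch of the general idea, not a claim about any particular system above:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    # Stand-in for an expensive, independent computation.
    return sum(i * i for i in range(n))

inputs = [10_000, 20_000, 30_000, 40_000]

# The parallelism abstraction is in the language layer: the code
# says *what* to compute over the collection, and the executor
# handles scheduling -- no explicit threads, locks, or shared
# mutable state in sight.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(work, inputs))

assert results == [work(n) for n in inputs]
```

The point isn't that this is fast (Python's GIL limits it for CPU-bound work); it's that the program's meaning is independent of how the runtime maps it onto cores, which is exactly the property you can't retrofit at the hardware or OS layer alone.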

**/end nerdery**

PS: It's begun. Mike Miller was right. I'm doomed to be a Computer Historian...

Counting Down

posted on 2009-01-13 00:45:25

It's been a pretty eventful holiday season. I wrecked my Maxima on Dad's birthday (the 23rd), in particular. I didn't want to mention something until I had a concrete opinion to express and now I do. It was for the best. Seriously. I had spent considerable amounts of money trying to keep the car in good repair this year, it wasn't paid off yet and my parents and I had long since agreed it was a lemon but had no way to get rid of it.

Luckily, the insurance has paid off the car, I'm fine and this enables me to cancel my insurance and have a bit more financial leeway for the coming school year. I had to figure out a method of public transit from Brookhaven to Marietta but that didn't turn out to be too tricky. It ultimately just means I'll spend about 3 hours twice a week (Tuesdays and Thursdays) on CCT and MARTA. Today I went for the first time as a test run. The ride was enjoyable and afforded me a bit of time to get some sampling done. I took care of a few on-campus errands, bumped into several disparate acquaintances such as John Valentine and some Fayetteville folks and scoped out food options.

In short, I'm ready for school to start even though they're going to make me learn C#. I hope I can use Mono most of the way.

I keep reading about the economy, earmarks and entitlements. Is no one else thinking that the real issue here is sustainability, limited resources and population control? Minsky gave a talk on this but he's not the first person to raise the issue. When are we going to admit the planet can't indefinitely support 6 billion people or at least do the math/research to prove that it can? That's the question I'd really like us to be thinking about. If we want to be honest and accountable, the big picture is the only place to talk. The futurists and sustainability freaks seem to be pretty much the only people doing that. I'll rant about that more later but I wanted to at least note that it's been occupying increasing amounts of headspace for the past 6 months.

Speaking of mindspace, I still really love ogling the stack languages. I played with Forth a little bit but didn't get too far. It was just a fun diversion from Lisp at the time. I still really want to check out Factor.  Frankly, after seeing how fast the factor guys grow the ecosystem and libraries around the language I believe their productivity gain claims. Go Planet Factor, Go Slava. I'm sure I'll get around to playing with it sooner or later. I have a nightly installed and FUEL setup...which actually popped up on Reddit today ironically enough.

The only consumer-y thing I can think of that I'm excited about for the foreseeable future is the upcoming release of a PS3 game called Skate 2. Skate 2 is really just a patched-up and glorified Skate 1 to me. I'm still excited and I don't mean to speak ill of EA Blackbox but I couldn't care less about much of the new stuff. I just wanted custom soundtracks, a tripod camera and good PS3 framerates. It comes out on January 21st and I'll disappear for a week in all likelihood exploring all its corners.

I still marvel at and love my new X200 and btdubs, btrfs is in for 2.6.29. For the record, I've had a newsgroups subscription with Astraweb for about a week now. I've poked around for a few Oscar screeners but haven't observed anything I couldn't find on isohunt or thepiratebay. Sure, the download speed is a boon but I'm looking for content that isn't readily available on other networks. I've checked out nzbmatrix. What am I missing?

The RIAA has said they're giving up lawsuits and trying something else. I'll be keeping an eye out and looking for services that don't discriminate against Static IP users with their own blogs or pander to the RIAA/MPAA/etc. In IP-related news, Lawrence Lessig appeared on the Colbert Report. He has much more interesting things to say beyond what was covered so I'd recommend picking up some of his books, reading them free online or at least reading the Wikipedia articles on his first book and Free Content.

I've also been catching up on my music obsession over the cold season and particularly enjoyed White Winter Hymnal by Fleet Foxes and Grounds for Divorce by Elbow the last few days. Also, I love Battles at least a little bit for writing Tonto, Atlas and Leyendecker. I'm also enjoying Amon Tobin all over again because he's a damn genius.

I haven't been keeping up tremendously well with my New Years Blogging resolution, as you may have noticed, but I think that will change now that I'm busier. I've been doing much better with the music sampling and skateboarding. On to the coding and schooling.

On Languages and Libraries

posted on 2008-11-04 03:41:19

At this point there are a few languages I'm set on learning, a few I'm eyeing as potential future candidates, a few languages I'll dabble in and a few I'm trying to avoid for one reason or another. I thought I'd post about them and link to their main package indexes where appropriate. I'll have a more intelligent article on languages and libraries down the line.

Languages I'm set on learning:
Scheme: Scheme has several implementations and libraries depend on said implementation. I'm a Chicken Scheme and PLT Scheme fan so I'll be using Planet and Eggs for libraries. I realize I vouched for Gambit Scheme a while back due to its concurrency options with Termite. I hereby publicly retract that view and promise to update the On Schemes article or follow it up in the near future.
Common Lisp: I'm an SBCL fan and will happily use cliki in my library questing. ASDF-Install does the rest.
Python: Where I shall be aided through the powerful PyPi.

Languages I'm keeping an eye on:
Factor: And its lovely indexed vocabulary.
Haskell: The magnificent HackageDB will aid me in that quest.

Languages I'll dabble in:
Lua: Bindings galore.
Perl: For use with the almighty CPAN.
Ruby: RubyGems will be used in conjunction with RailsLodge's directory for great joy.

I refuse to mention the languages I'd like to avoid. I'd likely start a flamewar for heaven's sake.

Belated Blogging

posted on 2008-11-03 22:50:51

A lot has been happening lately and I guess I've been too wrapped up in it to write anything down here. I've been readmitted to SPSU for the Spring 09 semester, have filed the FAFSA and am currently looking into financial aid options. I'll have more on that soon but I am planning on going. Better to be there and learning than out of school doing Help Desk work and not learning enough about programming. That said, I'm totally out of funds about now and a part-time Help Desk position would be wonderful for the foreseeable future (i.e. post starting at SPSU). Or a contract position until school starts.

Due to the aforementioned brokenness I won't be grabbing LittleBigPlanet, which a few people have asked me about. I am impressed with some of the things people have churned out with it though, including a working 1,600-part calculator and a recreation of Gradius. Cute.

Will also got back in touch with me which I was quite happy about and I made some changes at his suggestion to my little hangman program. It's down to 115 lines of code and is pretty polished at this point. The only way to go forward would be to add new features but I'll put that aside until I've finished PCL. I also may have a quick weekend project to write a BASH script for RedLinux in the near future thanks to some of the great resources at the Linux Documentation Project. I've got some ideas for a future RedLinux release but I'll likely put that off until December or so.

What else has been going on lately? Well, OOPSLA and Lisp50 happened fairly recently and I couldn't make it but I've enjoyed reading about it thanks to articles on Lispy's blog and some words from Luke Gorrie. I'm still pretty jealous of Luke Gorrie as he always seems to be playing with neat ideas and technologies and generally hangs out with the "cool kids" a lot. He was at OOPSLA and Lisp50 and then managed to be hanging out with Alan Kay, Ian Piumarta and co at VPRI when Slava Pestov came through to talk about Factor. What a jerk! (jk lukego) There's a great video of Slava's Factor talk which he delivered at Google as well. It would be neat if some of the Lisp50 talks made it online but somehow I don't expect to see that happen. I've also been keeping an eye on the btrfs and xorg mailing lists but that's not too relevant really. BTRFS for 2.6.29!

I've been doing a little bit of reading on Lisp Machines of late and hope to run one in a VM when/if I get an X200. I'd also love to run a copy of Linux 0.01 in QEMU or VirtualBox and maybe ReactOS as well. Nothing like a small, well-understood system right? A nice external keyboard wouldn't hurt either as mine has gotten a bit beaten down over the years and is a PS/2 keyboard so it won't play with the X200. Reddit has some suggestions and I'm rather leaning towards a Das Keyboard but one of the mechanical-switching Cherry units would be fine too. Paul Stamatiou has some interesting suggestions about back to school stuff but I'll mostly stick to his thoughts on study habits and motivation. I think I've got the rest sorted out. His thoughts on living the cloud life and using newsgroups should be useful though.

That's all for now. I'm off to skateboard and shower while there's still some good sunshine out before hunkering down with more lisp. Did I mention a new version of SBCL came out? Don't forget to vote tomorrow. Keeping an eye on things with the help of Peter Seibel and Randall Munroe won't hurt either. ;-)

Just for Fun

posted on 2008-09-10 20:05:53

There's so much I've been meaning to post about lately and so much that's been going on. It's very hard to keep up with it all. This will consequently seem a bit scattered but it's largely divided into Gifts, Linux Stuff (which continues to bring me perpetual joy), programming language stuff and hardware stuff.

Redlinux: I've been working on my own ArchLinux derivative over the past few months and mentioned it a bit here. I'm hoping to get an ISO for an installable LiveCD of it online by the end of September with a sort of beginner's guide and homepage for it set up here. There won't be a forum or anything initially. Just e-mail me for feedback/help. We'll see how that goes. I'm calling it Redlinux. Also, I put all the default *rc files and other important config files (including new user documentation and the changelog) in a new folder on the site. Redlinux is currently at version v.07. The initial online release will incur an automatic version bump to the nearest .x0, rounding up.

Logos: I'm looking to get a sort of logo for the site. I'm not sure where to go with this. I also need a separate logo for Redlinux. Any ideas are welcome. I have one for a site logo. It's a Unix Shebang combined with a lowercase Lambda. Like so: #!λ. I think it's pretty cool but it'd take some work to make it prettier. The Inconsolata font would be a good start. I don't think they have a lowercase lambda symbol though. :-(, Sad Panda. I'm thinking we call it the *Lambdabang*. Eh, eh?

Gifts: I've been thinking about money and my actual needs and wants a good deal lately. Part of that comes from having to constantly figure out finances due to being young and broke in a struggling economy. The other part is me thinking about the few material things I enjoy and which I'd like to prioritize. Good ideas for gifts for me that I hadn't previously considered are Internet Hosting (and you know I'll want pretty serious control over the box. Maybe linode or lylix.), a subscription to LWN (Linux Weekly News) which I've been enjoying a lot lately (the back issues are free) and various books from the amazon wishlist, as always. Cooking supplies might also be good but I'm probably best off picking those myself. Homemade good food. It's expensive, but fun!

Hardware: I've been thinking about buying a new computer for about a year to a year and a half now. I recently moved into the "strongly considering it/planning it" phase and started saving. This box would probably end up replacing my aging homemade beast of a "main desktop" which would in all likelihood become my server box. I decided fairly early on I wanted the new system to be a laptop because I'd really like to be able to go portable at any time and not be at a loss for processing power. Plus, that'll make it easy for me to move around lifestyle and home-wise which seems reasonable at the moment. To be honest, my needs are essentially met by my current equipment and the extra processing power wouldn't get much use as I don't game anymore. The Thinkpad A31 (present laptop) hates secure wireless networks for some reason and I wasn't able to wrestle it into submission. A larger concern would be hardware dying in the desktop. It's still going strong but we're passing the 4-year mark and you can never be too sure. Besides, I'd love to catch some of the newly emerging tech like multicore processors, new wireless standards (Wi-Max and draft-n, I'm looking at you), and Solid State Drives! I'd also love to get something based on AMD's upcoming Fusion processors but that's still a year out and I'm not sure I'll wait that long. I like the direction they've gone with the Athlon series and feel that they're more motivated than Intel to innovate. Always have. They're still not as fantastic about supporting Open Source as Intel though and that's beginning to become a deal breaker for me. Especially considering that their Shrike mobile platform may use Broadcom wifi or something equally messy where Linux is concerned. I know I want something 12 or 13", preferably 13", with a minimum of 4 hours of battery life, a dual-core processor and a 60GB SSD.
Ideally, it'll be Shrike-based (that's waiting a year), have HDMI or DisplayPort out with good Linux support and draft-n or Wi-Max. Vendorwise, I'm torn between IBM/Lenovo and Dell. I've had good experiences with Thinkpads (IBM, now Lenovo) and like them but they're not the best Open Source company. Dell has been making a real push in that direction of late and has some very competitive looking offerings which I could even buy with Ubuntu pre-installed. My final three are presently the Lenovo X300, the Dell XPS M1330 and the Dell Latitude E4300. I'll be coming back to re-evaluate when I've got about $1,500 stashed away. :-)

Languages: In the near future, I'd like to get a post up revising some of my former opinions on Programming Languages. Particularly of the Scheme family. Some of my earlier ramblings now seem quite misguided. Plus I've been playing around with Common Lisp more and though I'm not quite a fan of the funcall syntax I'm starting to grok some of the reasoning for multiple namespaces. My experiences with PLaneT vs. ASDF-Install bear that out. *shivers* Collisions are ugly.

Linux Tip: Ever been frustrated trying to transfer directories with spaces in them via scp? I have. There are one or two things that seem like they should work but don't. The catch is that the remote path is interpreted by a shell twice, once locally and once on the remote end, so it needs two levels of escaping: the quotes survive the local shell and the backslash survives the remote one. I've been too lazy to look up how to do it until today. Here's how:
scp -r "user@host:/path/to/directory\ withspace/" .
Simple, right? Duh.
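If you're building that command from a script rather than typing it, Python's shlex.quote can generate the remote-shell escaping for you (user@host and the path here are placeholders from the example above):

```python
import shlex

remote_dir = "/path/to/directory withspace/"

# scp hands the remote path to a shell on the far end, so it needs
# one level of quoting even when we bypass the local shell entirely
# by passing an argv list (e.g. to subprocess.run) instead of a
# command string.
cmd = ["scp", "-r", "user@host:" + shlex.quote(remote_dir), "."]
print(cmd[2])  # → user@host:'/path/to/directory withspace/'
```

Running it via `subprocess.run(cmd)` means the local shell never sees the string at all, which is why only the single level of quoting is needed here versus the two in the interactive version.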

I was going to mention how Linux Kernel Hackers make me happy and throw a few quotes from the mailing lists on here but I think this is more than enough for now. Later, folks.

Language Adoption and Lisps

posted on 2008-04-03 01:12:53

Long-Winded Preface
This is my (marginally informed but mostly inexperienced) personal opinion. I have a habit of thinking about things prematurely and overanalyzing them but I indulge in it. It seems most lispers at some point or another either do a roundup of the available lisps to decide on an implementation, try to figure out why lisp isn't more popular, or try to figure out why other languages seem to be growing towards it. I have an opinion on these issues after obsessing over them for a month or two and in the interest of believing some useful knowledge came from that obsession I'll document my opinions here. I should note that I don't yet consider myself a "lisper" for two reasons. One, I haven't yet encountered a formal definition of that term or a sufficiently common form of usage. Two, I simply haven't written enough lisp yet though I am sold on it to date.

Factors in Language Adoption
There are a few things that seem to drive adoption of programming languages generally but I'm interested in a small subset of programming languages so I'll be covering an appropriate subset of influencing factors. Specifically, I'm interested in languages that are not owned or pushed by a corporation but still have achieved some prominence among language elitists and/or some mainstream success. (4/7/08 Edit: This mostly serves as a disclaimer that I won't cover C, C++, C#, or Java here.)

As far as I can tell, these languages all have some of these critical features:
1. A single or dominant implementation.
2. A module system that works across implementations.
3. A killer app or great libraries.

Case(in_point) ->
The languages I'm thinking of are the following: Python, Perl, PHP, Lua, Ruby, Haskell, Erlang, and OCaml. Smalltalk is perhaps a crucial omission from this list, but I count both Smalltalk and Lisp on a different language plane because of the sheer history surrounding them. I simply feel more factors are at play.

That said, all of these languages have achieved some preeminence for one reason or another, though Haskell, Erlang, and OCaml are certainly not in the mainstream. Erlang sneaks in because of its recent hype explosion and the limited use it's seen in industry, OCaml sneaks in on the virtue of its infamous use at Jane Street alone, and Haskell I'm letting in both for its proselytizing FP userbase and its interesting programs (Darcs, Xmonad).

We can see quite clearly that several of these languages have one standard implementation (excluding ports to the JVM or CLR): Perl, PHP, Lua, Erlang, Python, and Ruby. As I already mentioned, Haskell has interesting programs at work and OCaml has some commercial usage. Lua sees heavy use for scripting in the video game industry, among other places. Erlang is used at Amazon, Ericsson, Nortel, and T-Mobile. Python, Perl, and PHP are the languages that built the web, and Ruby has taken off since all that Rails business.

Common Objections
Many people complain about the communities being elitist or the implementations being insufficient or the syntax being odd. I'd say those are the three complaints I see the most. Certainly, the syntax does scare a few folks. Certainly, there are cases where a lisper doesn't handle a noob with the proper tact. Certainly, there are cases where it makes more sense to use another language on a project in lieu of Lisp. I do not believe that parentheses will deter those actually interested in stretching their abilities and learning a new language. I do not believe that a few flames are indicative of the general lisp community. I do not believe that lisp not being ideal in a given scenario is necessarily a problem with the available implementations.

The Punchline
What I've failed to talk about is point 2 (a standard module system), which I think is presently the most serious drawback to the Lisps. Lisp will never have a single de facto implementation: there are plenty of implementations already on the scene, many of good quality, and some are designed around a spec (RnRS). I grant that this entire problem could be solved if the Schemes standardized on R6RS, or at least the R6RS module system, but apparently that isn't happening.

As I've said, we'll never have a standard implementation and, from what I can tell, the absence of both a standard implementation and a standard module system simply slaughters the code portability that would foster a healthy community of code sharing. This precludes the libraries and interesting programs that lead languages out of the elitist ghetto. Most programmers won't touch a language, regardless of its feature set, if they can't be productive with it fairly rapidly and without writing the sort of libraries they've come to expect. If there is something wrong with the lisps, this is it. Very few programmers will seriously try a new language without easily available libraries that give it a batteries-included feel.

Now, I realize that standardizing all Schemes on R6RS is impractical but we don't need even that. If we could manage to just get the three largest Scheme implementations to unite on one module system we would have made tremendous progress. I'd say the three largest Schemes in userbase with a module system are: PLT Scheme, Gambit-C, and Chicken Scheme or Bigloo Scheme. If we could get even these 3 or 4 implementations to standardize, I think it would be enough to really get things rolling. Even 2 of them standardizing on modules might be enough to pull others in. I can't personally think of a bigger win.

The Inflammatory Bit
I don't mention Common Lisp here because, again I stress personally, I hope we stop encouraging it as a starting point for newcomers. I don't think it's a good way to encourage people to use Lisp. I consider it more fragmented and thorny than Scheme for the beginner, though I acknowledge it at least has a dominant implementation, from what I can tell, in SBCL. I realize that it has contributed a great deal and I don't mean to discredit its achievements. It simply seems something better left to the experienced.

Personally, my disagreement with CL probably stems from two things. One, I dislike the design decision of separate namespaces for functions and variables. Two, I feel that if most of CL's added functionality could be bolted onto a Scheme implementation through libraries, why not have a Scheme implementation with said libraries usurp the role played by prior CL implementations? I do grant that would be a lot of work to redo without good cause. At any rate, these are my inexperienced preliminary observations and my opinion may change drastically over the next few years as I have time to give it a fair chance and read through PAIP and On Lisp.

The greatest advantages of Common Lisp over Scheme I would expect to hear as arguments are its use of CLOS and namespaces, its libraries, and its quality implementations. I believe all of those are solvable issues in Scheme provided a standard module system enters the picture. I also have some opinions about why the Schemes should focus on distribution and concurrency after modules, and further opinions on compilation to C versus native code. I leave that for another time.

In short, the central problem lisp must overcome for mainstream success (which may not even be desirable) is a standardized module system. The lack of a module system prevents a culture of code sharing which is preventing the creation of a software ecosystem around lisp. This is, at root, a sociological problem emerging from a technical problem. The lack of a standard module system and shared code produces the social effect of Lisp's continued obscurity. Lisp's obscurity does not stem from some deep difficulty inherent to the language, noob-averse communities, shoddy implementations, or the Lots of Spurious Irritating Parentheses.

An Escher Videogame and other nerdities

posted on 2008-01-22 17:22:47

One of the games I'm most excited about coming out this year is called echochrome. It's coming out for Playstation 3 and the PSP. No US release date has been announced but it will land in Japan on March 19th. While I normally don't bother with writing about games, this one's special. It's one of the most novel concepts for a game I've seen in years. In short: you rotate a scene featuring an Impossible Object such that an automated walking man can navigate it. It's a perspective-based puzzle game. Here look:

Also, everybody is writing about CS Education lately which is awesome considering I've been thinking about it so much. Just look at all this mess:

The Enfranchised Mind article (which may be the best of the bunch) and associated reddit comments

Raganwald's No Disrespect article and associated reddit comments

and Mark Guzdial's take and associated reddit comments

It may sound like a cop-out but I think Abelson and Sussman had this right all along. We're so hopelessly early in the existence of Computer Science as a discipline that we don't have a clue what it really is yet. And when you don't know what something is, it's pretty hard to know how to present it. Or steer its course. That's all for now.

Finally, a paper got thrown on LTU about a dependently typed version of Scheme. Very intriguing.

Holiday Greetings

posted on 2007-12-23 05:59:21

Hello there, everyone. I've only dropped off the face of the earth. I'm not dead. Neither is Dad. So far we're all holding together well. He's had his brain radiated a few times and had bone infusions. Chemo begins shortly after Christmas. There is a Doctor who has talked about remission and on the whole I think we're optimistic. Or fighters anyway. Now, on to all the other business.

I'll be in Montana from the 2nd to the 11th, so I've got that coming up. I'll be in Bozeman if you're wondering. It'll be nice to get away for a bit...even if it is for a family reunion with people I haven't seen in a good while, my biological Dad's side of the family. For those who don't know, Mom divorced and the all too awesome feller suffering from Lung Cancer is (technically) my Step-Dad.

I need to draft up a schedule for the new year to figure out how I'm getting my studies done. And whose lectures and course materials I'll be following as I have a choice in some cases. More on that soon. There's also been a ton of great nerd discussion floating around the blogosphere of late, some of which I'll try to comment on in the next couple of days. In the meantime, here's a trivial nugget of thought.

I watched Lecture 1A of the classic MIT Structure and Interpretation of Computer Programs series tonight and something struck me, mostly because Sussman brought the idea to the forefront with clarity at some point. He said something fairly fundamental that borders on self-evident when Computer Science is viewed introspectively but I hadn't formerly considered. In essence, Computer Science is about "how-to" knowledge and process rather than declarative knowledge or fact. Thus, a programming language's job is to serve as a description of process and to provide tools toward that end.
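That lecture's own running example makes the distinction concrete: the declarative fact is "the square root of x is the non-negative y with y*y = x," which tells you nothing about how to find one. The how-to knowledge is a process, like successive averaging. A quick sketch in Python (the function name and tolerance are mine, not from the lecture):

```python
# Declarative knowledge: sqrt(x) is the y >= 0 such that y*y == x.
# Procedural ("how-to") knowledge: a process for finding it, e.g.
# repeatedly averaging a guess with x/guess until it's good enough.
def my_sqrt(x, tolerance=1e-10):
    guess = 1.0
    while abs(guess * guess - x) > tolerance:
        guess = (guess + x / guess) / 2  # average guess with x/guess
    return guess

print(my_sqrt(2))  # approximately 1.4142135...
```

The definition describes what a square root is; only the procedure describes how to compute one, and that gap is the point Sussman was making.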

The part of this that I hadn't formerly considered is that this is why we bother with, or even focus on, learning new programming languages and methods of abstraction rather than focusing on writing specific programs. Sure, many schools recommend a course in Compilers, Operating Systems, or Programming Language Design, and there are plenty of blog posts detailing such undertakings in an effort to enhance skill and knowledge in the field, but nothing is so popular or so emphasized as learning new languages. Regularly, and of different paradigms and abstractions, if possible. There's something to think on in greater depth here about why that is that I haven't seen eloquently written about by Yegge, Graham, Braithwaite, Atwood, or anyone else. Perhaps if I can capture what it is, I'll write about it. In the meantime, it's just a thought.


posted on 2007-09-12 18:34:12

I was watching an interesting video this morning and had some thoughts I jotted down before class. I am reproducing them in their full, unsubstantiated, and provocative/controversial nature here. Later perhaps they will contribute to something a bit more formal. Note: The video features Simon Peyton-Jones talking about programming language evolution somewhat generally. I'd much rather he speak more about that stuff than the Haskell/STM talks he gave at OSCON. Hmm.

I promise I'll post something less disjointed and more intelligible/coherent/formal than this in the near future.

Referencing Upcoming Radical Visions Essay, Trend A: Moving Away From X86
Facet 1) Hardware.

I'm sick of people pooh-poohing the "concurrency crisis". To be fair, concurrency is the straw breaking the camel's back. To maintain sustainable growth in the computer industry we're having to do, as always, radical things on the hardware side. Not as always, however, is the fact that they're forcing lots of change in software/CS at the same time.

Is Erlang brilliant? Maybe. Is Erlang fortuitous? Certainly. It is probably the best option for the concurrency problem at hand. I'm not convinced that Scala or F# compare. Or Haskell for that matter. Haskell is a different sort of win.

We need concurrency more than controlled effects through a type system at present. The need for Haskell is still further out. Still less urgent.

^That's it. We're having two different conversations trying to discuss what the more urgent issue is. It's not Erlang vs. Haskell. It's concurrency vs. limited effects.

But what of (insert professor/coder name here)? What of those that are uninformed? Hell, what about (insert coder name here). What will they do when their code is sitting around not scaling to available resources?

Tertiary snippets:
Syntax is not semantics. Can we mistake it for such? Of course, but what does that look like? What is it to mistake syntax for semantics?

"That boy, does he already suspect that beauty is always elsewhere and always delusive?" - Czeslaw Milosz, New and Collected Poems, p. 284

The State of State

posted on 2007-08-29 13:20:44

So, I've been having and alluding to discussions with Tim Sweeney of late. I sent him an e-mail a little over two weeks back and I received word back from him about a week ago. It's taken a while for me to digest it a little and ask his permission to post it here but he has been kind enough to grant said permission. So, without further ado, the transcript:



My name is Brit Butler. I'm a college student in Atlanta, GA and an admirer of your work. I was very taken with your POPL talk on The Concurrency Problem but curious as to why you mentioned both the message passing model and referentially transparent functions but then went on to mostly talk about the latter with Haskell. I'm certain that you've used and read about Erlang and other message-passing systems and was wondering if you could explain your position on them to me, vis-a-vis Transactional Memory or some other method. I'm assuming you wouldn't be in support of STM because it's ultimately still about sharing state. Thanks so much for your time.

Brit Butler


Lots of applications and reasonable programming styles rely on large amounts of mutable state. A game is a great example – there are 1000’s of objects moving around and interacting in the world, changing each frame at 60 frames per second. We need to be able to scale code involving that kind of mutable state to threads, without introducing significant new programming burdens. Transactional memory is the least invasive solution, as it allows writing pieces of code which are guaranteed to execute atomically (thus being guaranteed of not seeing inconsistencies in state), using a fairly traditional style, and it scales well to lots of threads in the case where transactions seldom overlap and touch the same state.

Message-passing concurrency isn’t a very good paradigm for problems like this, because we often need to update a group of objects simultaneously, preserving atomicity. For example, within one transaction, the player object might decide to shoot, issue a command to his weapon, check an ammunition object, remove ammunition from it, and spawn a new bullet that’s now flying through the world. And the sets of objects which may need to atomically interact isn’t statically known – any objects that come into contact, or communicate, or are visible to each other, may potentially interact.

When implementing that kind of code on top of a message-passing concurrency layer, you tend to get bogged down writing numerous message interchanges which really just turn out to be ad-hoc transaction protocols. That’s quite error-prone. Better to just use transactions in that case.

This argument for transactional memory is limited in scope:

For algorithms which can be made free of side effects, pure functional programming (or “nearly pure functional programming” as in Haskell+ST) is cleaner and allows more automatic scaling to lots of threads, without committing to a particular granularity as with message-passing.

For algorithms that need to scale to multiple PCs, run across the Internet, etc, transactions seem unlikely to be practical. For in-memory transactions on a single CPU, the overhead of transactions can be brought down to <2X in reasonable cases. Across the network, where latencies are 1,000,000 times higher, message passing seem like the only plausible approach. This constrains algorithms a lot more than transactions, but, hey, those kinds of latencies are way too high to “abstract away”.

The existing Unreal engine actually uses a network-based message passing concurrency model for coordinating objects between clients and servers in multiplayer gameplay. It even has a nifty data-replication model on top, to keep objects in sync on both sides, with distributed control over their actions. It’s quite cool, but it’s inherently tricky and would add an awful lot of complexity if we used that for coordinating multiple threads on a single PC.


So, there you have it. More on all this later. I've got to go play around with Erlang a bit more. I'm hoping Andre Pang and Patrick Logan will help me dig into this a bit deeper in days to come. Who else wants in on the conversation? Comment below.
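Sweeney's shoot-and-spawn-a-bullet scenario is easy to sketch. Below is a toy Python model of that atomic multi-object update, using a single coarse lock as a crude stand-in for a real transaction (real STM is optimistic rather than lock-based, and all the names here are illustrative, not from any actual engine):

```python
import threading

# Shared mutable game state: a bullet list plus weapons with ammo.
world_lock = threading.Lock()
bullets = []

class Weapon:
    def __init__(self, ammo):
        self.ammo = ammo

def fire(weapon):
    # The whole check-decrement-spawn sequence happens "atomically":
    # no other thread can observe ammo decremented without the bullet
    # existing, or the reverse. This is the invariant a transaction
    # would guarantee without requiring an explicit global lock.
    with world_lock:
        if weapon.ammo <= 0:
            return False
        weapon.ammo -= 1
        bullets.append("bullet")
        return True

w = Weapon(ammo=2)
print(fire(w), fire(w), fire(w))  # two shots succeed, the third fails
```

The message-passing version of this would need the player, weapon, ammo, and world processes to exchange messages that amount to a hand-rolled commit protocol, which is exactly the error-prone bookkeeping Sweeney is objecting to.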

Nerd Out

posted on 2007-08-17 12:27:58

So, I apparently asked a good question on Patrick Logan's blog. And he gave me a good answer:

"I can see someone making the argument in some domain I don't usually work in that shared memory threads are better than shared nothing message passing for performance reasons. Some hard-real time scenario, some huge number crunching scenario, etc. where every byte and every cycle has to count in the extreme. But the irony then is that STM seems far more suited to a language like Haskell, which is also unlikely to be suited for these performance scenarios.

My only fear is that for the masses including myself, we need *simple* mechanisms, and the fewer of those, the better. Shared nothing messages seem to scale up, out, and down. STM seems more complicated, and an incomplete solution, and only a solution for shared memory."

Indeed. Aside from Andre Pang's Erlang and Concurrency talk, I've watched Adam Welc's Google Tech Talk on Transactional Memory and am planning on listening to Simon Peyton-Jones' OSCON talk on STM later today. Now, whether I'll end up listening to the other two Google Tech Talks on Transactional Memory, I don't know. I have, as mentioned, read Sweeney's POPL slides.

So far, I end up thinking two things hearing these talks.
1) This is an ugly hack.
2) Overhead, overhead, overhead.

At any rate, it's nice to be involved in a high-level nerd conversation with very smart people. That's always fun.

Buffer Overflow; Core Dump

posted on 2007-08-15 13:16:08

Wow. I have been thinking about The Concurrency Problem waaaayyy too much the last 48 hours or so. I'm coming around about STM and am really hoping I get an e-mail back from Tim Sweeney answering some questions I had. Yes, that Tim Sweeney.

All the same, I'd rather Erlang or some other message-passing functional+concurrent programming model get adopted than a non-message-passing model such as Haskell's. Erlang just seems cleaner to me. I just like it a bit better. Perhaps that will all change as I actually start trying to write code with them (Erlang, Haskell). If anyone feels like coming along and writing something better that has the advantages of both without the disadvantages of either, feel free. What I'm trying to say is, feel free to invent NBL.

Okay, here's what I've been reading:
And of course, the thing that started it all:

Twelfth Friday Refined Rant

posted on 2007-08-10 04:46:06

Okay, so my earlier post was a bit "La la la la. Look at all this crazy stuff I read today! Isn't it awesome?" and to be honest that's just not a critical enough take on everything. I mean, all that stuff I posted made me smile but that doesn't necessarily mean I should wave it in the air. With that in mind, I'd like to take a more serious look at the trends in those links and try to really address the issues that surround them. That's fairly challenging because these issues involve governance and law, technology and society, and plenty more. So I've decided since I'm not doing Linux Lessons I might as well do Steve Yegge-style rants. Just with much poorer writing, significantly less insight, and half as much alcohol. I'm by no means hammered. These will normally be on Fridays and this one was going to be on Friday but it's running a little late.

The Links
So, I posted about a lot of stuff. Web 2.0 and some of the associated blather about how open source addresses that through either Online Desktop or Open Services. Peer Production being supported by institutions/firms, with examples of Wikipedia and Google Maps. The future of IT generally. Infrastructure challenges for the 21st century with regard to the internet and its effects on business. Emerging worlds and the diminishing line between the virtual and the real. And some of the worldchanging issues: poverty, sustainability, climate change, political upheaval, etc. What can we really say about any of this? More importantly, what can we really take away from all those little blog snippets and links?

The Issues
There are a few things that I really take away. First of all there are a bunch of "World Issues". Things like Climate Change/Global Warming and the Energy Crisis, Developing Economies, Squatter Cities, and plenty else besides. I'm trying to get more informed on these fronts but right now the issues I can speak to tend to be technology issues rather than "World Issues". It's important to note that technology issues are a "World Issue" or are at least intertwined with "World Issues" when taken collectively. Technology has become too central a part of the modern world for technology's issues to not have vast repercussions. With that in mind I'd like to speak a bit more about what I can speak to, namely technology's issues. Specifically computing.

The Technology Issues
There are a few central issues or trends that jump out at me from what I posted the other day. One is that we are undoubtedly moving away from the desktop. However slowly, however gradually, it's happening. I don't know if the desktop will ever go away completely but it's diminishing in significance. Between PDAs and Smartphones, Laptops, Home Theater and Small Form Factor PCs, Amazon's Hardware as a Service efforts, and others, the desktop market will continue to gradually erode. Second, we have a programming problem, specifically concurrency. This is emerging because everything is going massively parallel on both the local and network level. Between the rise of multicore processors and the web services and datacenters popping up all over the place, I'm convinced we need a better way to program for these sorts of systems. Third, Peer Production is making a big difference and this goes beyond software in many respects. There's a lot more to cover there so that's all I'll say for now. Fourth, the Law, Intellectual Property Law in particular, has a long way to go before it supports peer production models well. Our traditional notions of ownership and control are insufficient in the face of these new methods and, as Steven Weber so elegantly described towards the close of The Success of Open Source, we're going to have to find a way to bridge the gap there. Finally, I think it's important to note that there is a certain infrastructure that's critical today for technology to continue to operate in its present fashion. We need energy and the electric grid, the telecommunications network, and the hardware and software stacks that make modern computing possible. For today, I'm mostly interested in covering the concurrency problem but next week I expect to write a bit about the infrastructure/stack and the gradual erosion of the desktop's significance.

The Concurrency Problem
Dijkstra was wrong, at least about concurrency, not that I blame him. I mean, didn't he do concurrency research thirty years ago? For that matter, Simon Peyton-Jones is wrong too. Software Transactional Memory may in fact be the right solution but it's the solution to the wrong problem. Trying to figure out how to share state this way is a disaster. The only concurrency model I've seen that I think is at all valid is Erlang's message passing model. We need concurrency that is implicit. The most you should have to do is change a few "map"s to "pmap"s. That's it. I don't think Erlang is NBL. The syntax isn't too crazy but it's not C/Java-like enough either. The language generally is too much of a departure, and not enough of an industry figure, to really make it big besides. I'm not saying it doesn't have other problems; I just feel like those are the ones that are going to hold it back. Could Haskell be more successful than Erlang? Well, first I'd have to come up with a good definition of success for programming languages. But excepting that, I think the answer is yes. Hell, maybe Haskell is NBL and we'll struggle with shared state for another 30 years. I'm hoping for something a bit more drastic though. And Erlang isn't good enough. But it's the only thing I've seen that can solve the concurrency problem, and that does appear to be the most prominent problem in programming from where I'm sitting. Does that mean we should have an OS written in Erlang and future applications should be written in Erlang? Do we need compositing window managers written in Erlang? Not necessarily. I suspect the desktop will be largely composed of C, C++, Python, etc. for a good few years to come and probably longer, assuming the desktop sticks around another decade. That's a topic for another rant though.
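The map-to-pmap idea is concrete enough to sketch even without Erlang. Here's a minimal Python version (the function and data are purely illustrative): the call shape stays identical while the work gets spread across a pool of worker processes.

```python
from multiprocessing import Pool

# An embarrassingly parallel, side-effect-free function: exactly the
# kind of work where map -> pmap should be the only change required.
def square(n):
    return n * n

if __name__ == "__main__":
    data = list(range(8))

    # Sequential version:
    seq = list(map(square, data))

    # "pmap": same call shape, but the work is distributed across
    # a pool of worker processes.
    with Pool(processes=4) as pool:
        par = pool.map(square, data)

    print(seq == par)  # the results line up either way
```

The point is that the programmer's code barely changes; the runtime decides how to spread the work, which is the "implicit concurrency" being argued for.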

The Punchline
What I'd really like to express is that I think that The Concurrency Problem is first and foremost a language problem. We need a language that makes concurrency implicit and easy. Concurrency needs to be assumed. Erlang is the only language I've seen that does that well. Whether or not something equivalent could be done with a non-message passing model such as Software Transactional Memory I don't know. Haskell may already scale from a single core system to a massively distributed or concurrent system with little or no changes to the code using an STM model. I'm not well read enough yet to know for sure. I don't think that we need to start over and re-write the OS, or desktop applications, or anything else for that matter. We need to be able to use the tools we already have. That doesn't mean that we might not benefit from parallelizing what we can but it's not our first priority. I don't believe that concurrency will be solved by getting the OS to handle threads or finding a framework or method to easily add concurrency to applications using current languages. Erlang isn't perfect either, and ultimately not everything needs to be parallelized, so Erlang has by no means invalidated C, Python, Ruby, Perl, Java, etc. It's just the best answer I see for one of the biggest problems programming has got.

Thinking is awesome.

posted on 2007-08-09 19:04:22

So excited right now. So many awesome things happening. I can hardly wait to write the news in the next Monday update. Hell with it, I'll post some things I've looked at, read, or thought about in the last 48 hours.
A couple of things.
Kristian Hoegsberg is amazing.
I'm starting to think that given time Ubuntu/Linux can out-Mac Mac. More explanation necessary. I'll get to you. Note that this is not the same as saying they can out-Apple Apple.
Web 2.0 is...auhweiruhaudsf. Free data is...oiajdsofiewaofm. People are crazy. Tim O'Reilly finds the words for the stuff I've been thinking. Freedom is complicated. Delicious, and prescient too! But what about open spectrum...
Certain companies actions do make it a legitimate concern...
Carmack is a genius and anything he says is gold. Need to find out what his kool aid is and drink some.
Been thinking about some security with regard to wireless cookies and WEP Cracking.
Still waiting on news of Banshee trunk improvements.
Sun is serious about Open Source. Maybe more so than anybody else. And yet they still act funky with Java. I'm still trying to figure out how I feel about this.
This looks really useful for next time I encounter data loss. It does happen.
Between academics lining up to help and the German government, I feel like it's going to be pretty hard to make ridiculous generalizations about Wikipedia's quality "real soon now". It's not just Wikipedia, though. Everyone is getting in on the peer production action. Peer production will only become more visible.
There are lots of books that should be written about software. These are some. This is interesting as a look at where things are\might be headed.
I continue to be torn up about the language wars. Are they in some ways just plain silly? Yes. All the same, furthering our tools matters. A lot. Competitors still include Erlang, Haskell, etc.
I tend to think of the web server market as being kind of stagnant. Or at least I did until this summer. Of course, I basically just heard about Apache and IIS until this summer. I'd never actually run/setup/worked with web servers until this summer. I kind of feel like that market is in the midst/outset of a shake up though. Observe.
Amazon's hardware as a service stuff just gets more and more interesting as the days go by. We're going to wake up one morning and this will have changed the world.
There are some real shifts happening. There are different work styles emerging. We'll see what comes of it.
I'm really excited that there is video of Steve Yegge talking online. I can't believe I haven't looked for some before. He's so damn smart I'll listen to anything he says. It links to all the other OSCON 2007 content too which is great because I've been wondering why GUADEC, OSCON, and Ubuntu Live content is all over t3h int4rwebz. Conferences are good because of mindshare but please share your geniuses keynotes with me. Imitate TED.
Keep working at those Online Desktop chestnuts. Even if it doesn't turn out to be the right problem, it sure will help our platform stand out.
I'm really glad this exists. It seems like it could be much more elegant than a reverse proxy or other load balancing solution.
It's always good to know what other people are reading and if anyone is exploring a critical literature then it's Worldchanging. So I'm assuming I'll find something lifechanging on this list.
We really can do just about anything these days. Between this and some 3D printing reports from Siggraph 2007 I have high hopes for what will be possible by 2020.
Lessig is awesome. So is proof of how awesome he is.
If you think the web isn't almost an OS layer itself, you're wrong. Now let's do performance analysis on it!
Social media really does matter. Open Source is naturally on the leading edge of that too. Video and Audio included.
We really are moving away from the desktop. Whether it's the web(Online Desktop), mobile (iPhone, OpenMoko), other embedded or home theater, or some strange new device (OLPC XO, Zonbu, zareason, minis and micro-atx, etc) there are strong currents in this sea.
Gnome and Linux really are doing good things. I'm really excited about watching us surge ahead on so many fronts.
Emerging worlds are cool and it's only going to happen more and more in games and serious apps. Mash up the virtual and the physical. It's all code. What distinction?
Knowing job projections is useful.

Kernels are interesting: you've got Linux, BSDs, Solaris, Darwin\XNU, whatever powers XP and Vista. But they're really just parts of the stack. All the same, they're really important parts of the stack. Infrastructure will always matter. It's just not the focus now. What we're building with it is the focus. The OS is irrelevant insomuch as it's just an enabler. This sounds obvious and stupid. I need to think more on what I'm trying to say.

Maybe the processor industry going massively multicore is the only way to force software developers to take advantage of the power that's already there. By forcing them to adopt new programming conventions, it forces out 30 years of crufty, bug-prone code and development methodology. Goodbye imperative, hello functional.
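The imperative-vs-functional point can be made concrete: a pure function carries no shared mutable state, so splitting its work across cores is mechanical, while the loop-and-accumulator version has state threaded through every step. This is just an illustrative Python sketch of the idea, with names of my own invention:

```python
from multiprocessing import Pool

def square(n):
    # Pure: the result depends only on the input, no shared state.
    return n * n

def squares_imperative(ns):
    # Imperative style: a mutable accumulator updated step by step,
    # which is awkward to split across cores as written.
    out = []
    for n in ns:
        out.append(n * n)
    return out

def squares_parallel(ns, workers=2):
    # Because square is pure, a process pool can farm out chunks freely.
    with Pool(workers) as pool:
        return pool.map(square, ns)
```

Both produce the same answer; the difference is that the pure version parallelizes without any change to the function itself.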

Okay, that's it for now. Sorry for the linkflood\social braindump.


posted on 2007-06-08 21:06:00

On Tuesday I was quite relieved to be off work. It had been a stressful day at the office but I had exciting dinner plans with friends (trivia night at Benchwarmers with Justin and Bria, for the curious). I had finished Cradle to Cradle by McDonough and Braungart the day before and needed to start on my next summer reading book. Nothing on my list really suited though. Most of the books are rather academic and, while I find social production fascinating, academic writing wasn't really something I was ready for. With that in mind, I stopped in two bookstores while I was running afternoon errands and walked away with a copy of Paul Graham's Hackers and Painters. I'm about 50 pages in at the moment and have really enjoyed reading it for the past few days. I'm sure some bits will make it into my next quotables. However, it has raised a number of questions\issues\thoughts in my mind, for which I consulted the collective intelligence of "teh interwebs\tubes" this morning and stumbled upon someone who had made both a parody of Paul Graham and what is (in my mind) a more serious\grounded critique. Perhaps the most nerd-controversial thing Graham talks about is programming languages. There are few things more hotly contested in the programming world. In the critique of Graham though, what I found most interesting were the analogies drawn to other internet authors like Eric Raymond. I finally stumbled on this quote on a different site: "Every 5 minutes you spend writing code in a new language is more useful than 5 hours reading blog posts about how great the language is."

And it was at about that moment that I realized that (formal and\or programming) languages are completely incidental. Graham notes early in the book that Computer Science has a real mix of participants. The computer science building at a university might contain honest-to-goodness mathematicians who get their work filed as computer science for funding purposes, a middle ground of those studying computers without making things with them, and programmers trying to make software. For the programmer, language is incidental. The language is only a tool towards creating software...and arguing about programming languages is in some way similar to the creation-evolution debate in suburban America. If you're a scientist it's possible that such a debate matters. It can affect how you approach your work and\or your starting assumptions. Similarly, if you're a computer scientist at a university it affects your research. If you're a programmer though, or a suburban American, then any of the groups can be right. It's effectively incidental to your daily life. If you live in suburban America and are neither a pastor nor a scientist, then it matters not whether God created the universe or whether it began out of nothingness one day, and it matters not whether it took ages or attoseconds. You will still get out of bed, dress, tend to whatever dependents you have (dogs, children, spouses, etc.) and go to work. That is what you will do three hundred and fifty days a year, regardless. Similarly, the programmer can be using the world's best language or its worst, but they will still roll out of bed and try to produce software with it. The only point at which language can conceivably matter to the programmer is if it can ease software development, and this is a very personal, very individual thing. The same might be said of SCMs (source control management systems).

The extent to which people fanatically advocate languages, architectures, etc. beautifully exhibits how intrinsically subject to network effects technology and specifically information technology (that is, computers) are. It's not an issue of simple memetics. Why do mac users advocate mac? Why do linux users advocate linux? Why do python users advocate python, or perl users perl? Why do nvidia owners advocate nvidia, or amd users amd? Why do PS3 owners advocate PS3, or Wii owners Wii? Simply because the ability to manipulate all that information that you use computers to manipulate in your life (and there is lots of it) whenever you want and wherever you want is good. The information becomes more useful as its accessibility increases. You would never bother typing journal entries into a computer if they couldn't be posted to the internet. There would just be no damn point. You'd keep a journal. At some point Graham tries to draw a distinction between the popularity contest of high school, which he argues is pretty arbitrary, and the popularity contest of the real world, which supposedly is less so. Here's a thought: neither of them is arbitrary, and both of them are examples of human beings making decisions based on incentives, not that economics has any concrete idea how to explain any of this human behavior beyond such a simple premise. Clearly though, the implications are profound. If there is a less mainstream technology which has become an integral part of life for an individual, they will fervently advocate the technology so that others adopt it, in an effort to keep the engineers hired to maintain and\or improve the technology around as long as necessary. The support industry itself advocates the technology in an effort to maintain their jobs.

Everybody hopes to know or be involved in the next big thing because it's tied to their employability. If you're looking for work, you'll try to learn Java before Algol. This may be dumb because Algol would certainly distinguish you more from the mobbish competition. Moreover, whatever companies do need Algol coders would have little in the way of alternatives and would probably have to pay decently. You'd potentially have a bit more bargaining power. Of course, so would they. Take the job or be unemployed. Really though, the whole thing is just ridiculous. The fact that all we want computers for is to do useful and interesting things complicates the issue, as does the fact that differences between languages cause differences in the ease of solving certain programming problems, and the languages also perform differently where speed is concerned. And I'm convinced that just ordaining a standard language, platform, architecture, etc. would result in a net productivity\innovation loss. Besides, the majority of programmers get employed in spite of all this and many innovations make it to prominence. I still don't buy the quote from Almost Famous about pop music being good because so many people like it, but there's something much more complex at work here that I just feel is slipping off the tip of my tongue and hiding in the back of my head. I guess it sort of reminds me of what McDonough said about nature having good waste (which in some sense means no waste). The extra blossoms produced by a cherry tree litter the ground, but in a good sense. It's all so that one new cherry tree will emerge, but it manages to fertilize the "littered" area anyway. Almost like it's all about making goods with positive network effects. And certainly languages like ripping off each other's features. As do markets. So maybe this good waste is what we're really after. Markets definitely don't make good waste when they produce things. It's clear that our way of making stuff is pretty borked.
But there's definitely something to be said for letting all our ideas fly around and crash into each other. But...what does it mean if there are no experts left? Were we wrong about the concept of the expert to begin with?

Everything is Code

posted on 2007-05-21 09:41:00

So, we seem to be gradually acquiring a philosophy of code. Definitely not every member of the species is, but there's this sort of growing awareness in certain groups that things really are ultimately pretty simple...and also code driven. It's not that information matters, it's that information is matter. The mathematicians might've been the first. It's hard to say, but they definitely had some sort of head start towards this thought process\philosophy. Later the rest of the hard sciences started getting involved. Things really kicked off with the advent of digital circuits and Computer Science. 0s and 1s could be used to describe or simulate pretty much anything...given enough memory and time. If it's computable, the Universal Turing Machine can do it. Then something happened again: in 1976 Walter Fiers and his team deciphered the complete genome of the bacteriophage MS2, the first complete genome ever sequenced. Genetic sequencing began to take off. Somewhere in this process, when we really began uncovering the power of the genome and the expressiveness of the genetic code, our efforts naturally shifted from reading the code to writing it. Increasingly, we are discovering that not only the virtual worlds of the computer but our actual reality is programmable. We can cause chickens to have more wings, we can make E. coli produce plastic for stitches stronger than those available, we have caused cows to birth gaurs, and Australia is looking into making Tasmanian Tigers walk the earth for the first time in 70 years. And we have created entirely new genomes. Born that which did not exist. Hopefully, we will soon discover for the inorganic universe what we have discovered for the organic and virtual universes. Maybe one day we will even discover a code which governs the fundamental forces (gravity, electromagnetic, strong, weak).
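The claim that 0s and 1s can simulate anything computable is easy to make tangible: a Turing machine is nothing but a tape, a head, and a transition table. Here's a minimal Python sketch (the machine, its names, and the bit-flipping example are my own, purely for illustration):

```python
def run_tm(tape, rules, state="start"):
    """Run a one-tape Turing machine until it halts or runs off the tape."""
    tape = list(tape)
    pos = 0
    while state != "halt" and 0 <= pos < len(tape):
        symbol = tape[pos]
        # Each rule maps (state, read symbol) -> (next state, write, move).
        state, write, move = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape)

# A toy machine that flips every bit as it sweeps right.
FLIP = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
}
```

Running `run_tm("0110", FLIP)` sweeps the head rightward and yields `"1001"`. Anything computable reduces, in principle, to a (much larger) table of exactly this shape.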

For now, I'm curious about the genetic code. Certainly, there is an analogue between the machine code of 0s and 1s and the genetic code of As, Ts, Gs, and Cs. We could even define A as 0, T as one third, G as two thirds, and C as 1 to draw this analogue out a little further. I would argue that the growth of computer science was fueled by a number of things, but lowering the barriers to entry for programming was certainly one of them. That is, nobody codes in binary; even long ago, programmers coded in assembler. Now, I'm not entirely (or even remotely) comfortable advocating that the emerging industry of genetic engineering try to emulate computer science. There are way too many bugs in our programs. However, linguistically speaking, I'm curious whether there is an analogue to the higher level languages of computer programming, such as assembler, C, Python, Basic, PHP, etc., and if there aren't such languages, how one might go about creating them. Anyone have any thoughts on this?

Unless otherwise credited all material Creative Commons License by Brit Butler