Content tagged Linux

A Common Lisp Web Development Primer, Part 1

posted on 2010-09-19 19:57:54

Disclaimer Pt.1: There are many people smarter and more qualified than me when it comes to CL web development. Thankfully, this article is covering basics and my knowledge should be sufficient. Correct me where wrong, of course.
Disclaimer Pt.2: This article will deal more with config files, programming environment setup and the CL web dev landscape with a follow up article to introduce a specific framework, examples, etc.
Edit of Nov 12, 2010: This article has been updated to reflect modern Common Lisp best practices (i.e. Quicklisp).

The Hardware

The first thing to talk about is where our application will be served. This was recently discussed on lisp.reddit. Unlike languages like PHP, Perl, Ruby or Python, shared hosting is not prevalent for Lisp, though there are notable exceptions. In general, a VPS or similar setup will be required. I've heard good things from people using Slicehost, Linode and ThrustVPS. Personally, I use Linode's smallest instance for $20 a month and have been quite happy with it. I've run Hunchentoot, lighttpd, PostgreSQL and MySQL on it simultaneously without issue, though that wasn't under significant load. I'm also aware of at least one startup using Lisp on top of Amazon's EC2. Heck, you may have a server you'd like to run out of your home. For our purposes, I will assume you have a reliable internet-facing Linux box and root access.

The Linux Distro and Programs

Any Linux distribution should be suitable for Lisp web server duty. Personally, I would lean towards Arch Linux, as their default install is quite lean and they package very recent versions of SBCL (1.0.42), CMUCL (20a) and others. There's even a CCL AUR package (see the Archwiki AUR article) maintained by Leslie Polzer. Whatever distribution you wind up using to follow along with this series, I recommend you also install screen, emacs, sbcl and lighttpd with your package manager. You should also grab the VCS pentafecta: darcs, git, mercurial, subversion and cvs.

Setting up Emacs and SLIME

Note that there are many other, probably better, Emacs+SLIME tutorials out there. Since the original writing of this article, Quicklisp has become the dominant method for acquiring Common Lisp libraries. Instructions for its use are here, and the clbuild instructions are maintained below for posterity. First grab quicklisp.lisp with curl -O, then load and install it by running sbcl --load quicklisp.lisp followed by evaluating (quicklisp-quickstart:install), (ql:add-to-init-file) and (ql:quickload "quicklisp-slime-helper"). Finally, add (setq inferior-lisp-program "sbcl") and (load (expand-file-name "~/quicklisp/slime-helper.el")) to your ~/.emacs.
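Collected in one place, those steps look something like this. This is a sketch of my own setup, not gospel: the beta.quicklisp.org URL is Quicklisp's usual download location and the paths are its defaults, so adjust to taste.

```shell
# Fetch Quicklisp and run the install forms non-interactively.
curl -O http://beta.quicklisp.org/quicklisp.lisp
sbcl --load quicklisp.lisp \
     --eval '(quicklisp-quickstart:install)' \
     --eval '(ql:add-to-init-file)' \
     --eval '(ql:quickload "quicklisp-slime-helper")' \
     --eval '(quit)'

# Wire SLIME into Emacs.
cat >> ~/.emacs <<'EOF'
(setq inferior-lisp-program "sbcl")
(load (expand-file-name "~/quicklisp/slime-helper.el"))
EOF
```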

Alternate clbuild instructions
The first thing to do is grab clbuild. At least until Quicklisp is released, clbuild will remain the easiest way to get all the necessary Lisp libraries to get cranking on web development in Linux. I like to keep clbuild in ~/builds, but place it where you like. Download it with darcs get, then cd into the clbuild directory and make it executable by running chmod +x clbuild. I'd also add the directory to your path in .bashrc, or add an alias like alias clbuild='/home/redline/builds/clbuild/clbuild'.

With that done, it's time to start grabbing libraries. First, get SLIME by running clbuild update slime. Then you'll want to run clbuild slime-configuration and stick that in your ~/.emacs file, taking care to change the (setq inferior-lisp-program "/home/.../.../clbuild/clbuild lisp") to (setq inferior-lisp-program "sbcl") or "/usr/bin/sbcl" or whatever is appropriate.
End clbuild-specifics

At this point you should be able to ssh into your development server, run emacs -nw (for no-window-system/terminal mode) and then type M-x (alt-x) slime and enter to get to a lisp prompt.

Getting the Lisp Libraries

After talking with Leslie a bit, I'll be using weblocks-dev over weblocks-stable. Use of weblocks-dev is encouraged over stable at this time. Quicklisp already uses weblocks-dev and makes this insanely easy: just evaluate (ql:quickload 'weblocks). Done.

clbuild specifics
If you'd like to use weblocks-dev, open /path/to/your/clbuild/wnpp-projects in your favorite text editor and change the following:
1) Find elephant and change its darcs repo from blah/blah/blah/elephant to blah/blah/blah/elephant-1.0
2) Find weblocks and change its hg repo from to
3) Find cl-prevalence and change its repo to cl-prevalence get_hg
Then run clbuild update weblocks and, if prompted about whether or not to download dependencies, enter 'y'. Let it work its most excellent magic.
End clbuild-specifics

The Framework Selection

There are a multitude of ways to do web development in Common Lisp. There are web servers such as Araneida and Portable Allegroserve (both effectively unmaintained), Hunchentoot (which is dominant in the way *Ediware* often is), Apache with mod_lisp, relatively obscure or undocumented combination server/frameworks like Antiweb, Symbolicweb/SW-HTTP and Teepeedee2 and frameworks like Weblocks, UCW and RESTAS.

I wanted something relatively commonly used and well-documented, but I wanted a framework as opposed to just using libraries like CL-WHO, Parenscript and Postmodern on top of Hunchentoot. Since UCW already has a blog series and I've worked with Leslie on Paktahn for a while, Weblocks was a natural choice for me.

The App Setup

I already run a wordpress blog and some other stuff on a lighttpd server on my Linode. Consequently, it made sense to just map a subdomain to my lisp experiments and leave the rest alone. To do this with lighttpd, add the following to /etc/lighttpd/lighttpd.conf:
$HTTP["host"] =~ "" {
proxy.server = ( "/" => ( ( "host" => "",
"port" => 4242 ) ) )
}
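For clarity, here's the whole stanza with hypothetical values filled in: lisp.example.com and 127.0.0.1 are placeholders for your subdomain and backend host, and mod_proxy must already be in your server.modules list.

```conf
# /etc/lighttpd/lighttpd.conf -- proxy one subdomain to the Lisp backend
$HTTP["host"] =~ "lisp\.example\.com" {
  proxy.server = ( "/" => ( ( "host" => "127.0.0.1",
                              "port" => 4242 ) ) )
}
```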

Now you wouldn't want your webapp to not restart if you had to reboot the server would you? Of course you wouldn't. I've taken a cue from Xach and daemonized it via screen as follows:
Open /etc/rc.local in your favorite text editor and insert a line similar to su redline -c 'screen -d -m -S screenslime -c /home/redline/webapps/screenrc'. This will ensure that, when the system boots, the redline user starts a screen instance in detached mode named "screenslime" using /home/redline/webapps/screenrc as its configuration. If you don't know what any of that means, don't worry. It means screen is cool and you want it.

Now, you should add something like the following to whatever file you listed as your screenrc:
chdir /home/redline/projects

screen emacs -nw
screen sbcl --userinit /home/redline/.sbclrc --load /home/redline/webapps/init.lisp

This will ensure screen defaults to the /home/redline/projects directory, starts an emacs instance in no-window-system mode in screen window 0 and starts sbcl in screen window 1, loading your .sbclrc and a lisp init script for your webapp(s).

Next, we need to actually write the init file for your webapp. For now, it will be quite simple as I'm just playing with weblocks. In the next article, we'll build something (probably un) interesting. In your init.lisp file (or whatever you called it) insert something like:
(ql:quickload '(weblocks swank))

(setf swank-loader::*contribs* '(swank-c-p-c swank-arglists
                                 swank-fuzzy swank-fancy-inspector))

(swank:create-server :dont-close t
                     :port 4010
                     :coding-system "utf-8-unix")

This will ensure weblocks loads and swank serves on port 4010 so that we can use SLIME to connect and work on the running system. Note that you could've also put (load "/path/to/my/.sbclrc") or inlined the following as the first line(s) in your init file and avoided the --userinit portion of the sbcl invocation in screenrc. My .sbclrc simply points sbcl to the clbuild libraries like so:
(require 'asdf)
(setf asdf:*central-registry* '("/home/redline/clbuild/systems" *default-pathname-defaults*))

Unless you're using clbuild, you won't need this in your .sbclrc, but if you are, it's important that you do this so that sbcl can find the lisp libraries we downloaded with clbuild. If you don't, it'll be looking in ~/.sbcl/systems.
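With swank serving on 4010, reattaching for live hacking from the Emacs in screen window 0 is just a matter of (the host and port here match the create-server form earlier; adjust if you changed them):

```text
M-x slime-connect RET 127.0.0.1 RET 4010 RET
```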

Finally, I would also add a bash alias to your ~/.bashrc to get you right back to where you were with screen+SLIME. Mine is alias webslime='screen -dR'. I also added stty -ixon to my .bashrc as detailed in my last post because screen was capturing keystrokes I wanted sent to emacs. Xach pointed out that this could be toggled in screen with C-a C-f but I preferred having it as a default.

See, now that was mostly painless, wasn't it? Next time I'll cover the basics of weblocks and develop a simple starter application. Or if I'm feeling particularly lazy, maybe I'll just walk us through the simple-blog example-app. Cheers.

Paktahn 0.9 is out!

posted on 2010-05-18 18:16:40

Well, it's been a long spell since the last paktahn release. There are reasons for it but I'm just glad we got the release out.

The "big feature" of the release is AUR updates which I am happy to have implemented. I wouldn't have been able to get it done if versioning support hadn't been kicked off by Wei Hu (wh5a) a while back though. At any rate, AUR Updates are in as is support for the .tar.xz format which Arch has adopted for packages going forward.

Beyond that we have a new contributor to the project, Justin Caratzas, that I've enjoyed working with and hope to work with more in the future. Justin fixed a bug in how AUR packages were installed regarding whether or not they were installed as dependencies.

I should have more time to hack on paktahn this semester than last semester so hopefully there won't be a commit gap for 2 months like there was before. I'm already looking at features for 1.0 and my biggest priority is reworking the command-line option handling using astine's unix-options and then extending paktahn to support pacman's -Syu, -Sy and -Su. Then I wouldn't ever need to call down to "regular old" pacman. Other than that it might be nice to get support for Paktahn on CCL or ECL and a test suite written. CCL support is complete save catching Ctrl+C and offering restarts as appropriate. There isn't exactly a clear path to implementing said support...

Anyway, if you're a user and you find a bug or want a feature, head for the issues page and let us know about it!

Silly School

posted on 2009-09-15 16:01:47

This post is likely to be a bit scattered. Partially because I need to get back to doing homework in a minute but also because my brain has been in a lot of places lately. So here's a linkpost with thoughts on IP Law, Linux and other stuff. I also have a bunch of Lisp links but I'll dump those separately later...after I get some homework done.

I keep hearing about stupid moves by Microsoft lately. It's very confusing because in many ways my opinion of them has improved over the last few years. Not that I'd ever want to use a Windows-based OS again. I'm just too happy in Linux land. My point though, is that the company clearly has an old culture of anti-competitive wonkiness and a newer culture that seems more focused on creating good products and less on market manipulation. Maybe it's all just weird management stuff though. Hopefully that will change sooner rather than later. On the other hand, Sony seems to be getting their console act together between dropping prices and actually putting out effective advertising for perhaps the first time in history. I'm also quite pleased with Google taking a (more official) stance on Data Portability. It's something I feel pretty strongly about though I won't speak more about it today.

My hatred of AT&T seems to be perpetually growing. The FCC is trying to come up with a more formal definition of broadband and the carriers are, in my view, trying to make that definition demand as little of them as possible. Generally, I've gotten to a point where I hate telecoms. So, I have a message for them: Give me fiber, or whatever wireless connectivity you're pimping this week, and shut the hell up. In other news, IP Law is still completely ridiculous and I can't begin to summarize or explain that here. I can offer an example or two though. The first is a list of seven felonies with less severe penalties than music piracy. It's meant to be humorous. It's sadly surreal. I'll actually let that be enough of an example for today and link to a discussion of what fair use might look like in the 21st century and a curious idea of making digital property "stealable". Last but not least, I'm at least glad that good arguments against software patents are being made to the Supreme Court. Crossing my fingers on that one!

Peter Seibel's Coders at Work has finally come out. I was looking forward to the book for a good while and have been enjoying reading the interviews. LtU recently posted about it also. I've got 7 of 15 knocked out. I've been surprised that the two interviews I think I've enjoyed the most were with Simon Peyton-Jones and Brendan Eich. I was expecting the Lispers or Smalltalkers to be more to my liking. *shrug* I'll likely write a review or at least talk more about it when I'm finished.

I've been following a few pieces of software (as usual). It's nice to see the competitiveness in the browser market of late. The Chrome Linux team was disbanded recently and I take it that work is now part of mainline so hopefully there will be an official Chrome release for Linux "Real Soon Now". I should also note the emerging standard for 3D Graphics on the web. Something good will come of this. Additionally, GHC 6.12 is coming along nicely. Lots of bugfixes the last few days. Looking forward to GHC 6.12.1 RC getting out there even though I won't be using it. Rock on, Haskellers. Pitivi also made another release. Now if only Arch would get an updated pitivi package, I'd be a very happy man. Oh, and there hasn't really been any more news on the N900. I'm keeping my ear to the ground.

Finally, this is the month of Linux conferences between the Atlanta Linux Conference this weekend, the almighty Linux Plumbers Conference next week and the X Developers' Conference after that. Speaking of X, anholt reports continuing progress on the Intel front and I feel warm and fuzzy inside. All for now folks, later!

RedLinux Revamped

posted on 2009-05-12 20:00:25

So the last time I really blogged about RedLinux was back in September 08 when I made the first release. I kept tweaking things now and again but at this point I've got something that I'm really not messing with very often. I've christened it "Redlinux v.20" and I'm planning on trying to make releases every six months or so. They'll mostly consist of package updates with any other changes listed in the changelog. I'll be trying to keep the latest versions of all my dotfiles and a complete archlinux package list in the redlinux docs portion of my website. With the dotfiles and the package list, you could basically build the thing yourself anyway and it keeps my life easier besides. The only real long term plans other than that are to improve documentation and make it more friendly for other people to tinker with.

I've made a Live CD image with an installer and uploaded the ISO to my Amazon S3 account. You can grab the ISO here. The old install guide still applies.

Endless Blather

posted on 2009-03-26 02:55:34

This post has everything.

It seems like a lot has happened in the past day or two. I'm all wrapped up preparing for a test tomorrow but there are other interesting things afoot. Teresa turned 20 today and there's going to be a party in her honor on Sunday. Kernel 2.6.29 has been released, it turns out cpufrequtils was never really doing anything and Skate 2 finally got a patch enabling custom soundtracks. EA Blackbox, even though you're two months late I'll take back some of those mean things I said. Speaking of games, someone finally wrote a Fei Long Guide for SFIV. It should hold some good lessons but I think I've got a lot of it down by this point.

I've got the webserver setup to play around with weblocks, leftparen and happstack. Hopefully one day I'll actually spend some time on that. It would be nice if weblocks were asdf-installable. I don't know. Maybe I'll just prototype GUIs in Chicken Scheme, Common Lisp and Python. Qt seems to be the cross-platform GUI toolkit of choice. It's the only one with recent bindings for all three languages.

Oh, before I forget, if you're interested in the best general write-up on SSDs I've yet seen you should read this article from Anandtech. Generally I prefer the stuff at Arstechnica but I've yet to see anyone with an article this thorough and excellent on SSDs. Well done, guys. Speaking of which, OCZ Vertex 120GB are under $400. OCZ, you've earned my faith by this one. I'll choose you guys when I have cash to blow via pricegrabber.

There are endless good recipes on the Pioneer Woman's website. I had an abundance of chicken, so I checked under Entrees->Chicken and found Braised Chicken and Parmesan Crusted Chicken. I've now tried both. The Parmesan Crusted Chicken was pretty fantastic. The Braised Chicken was tasty but I didn't like it as much.

The arguments about concurrent and parallel programming are ongoing. GHC is planning a new release for Autumn. I really hope the Haskell Platform is off the ground by then. Also, if you use Xmonad there's a good guide to Urgency Hooks here. Open Source development is still being thoughtfully explored. See, The Free as in Beer Economy and Freesouls.

The International Lisp Conference '09 has been going on and different people have said different things about it. Andy Wingo seems to have some decent writeups. Sadly, some of the things he says make me think of what Paul Snively said in his Road To Lisp survey (which I realize is likely quite dated): "My own thinking is that Lisp is the cockroach of programming languages: it'll be the only one left after the apocalypse. Not bad for a dead language." Maybe in a few decades I can hope I don't suffer the bias of echo chambers. Maybe not.

Last but not least I'll just note that I'm really enjoying Elbow tonight while doing math. Really enjoying it.

Elbow - Weather To Fly

Redlinernotes Reborn

posted on 2009-02-27 23:29:33

I can't believe it's only been 8 days since I posted last.  Life has been moving at an insane speed but I've been really happy.  A lot of things are just sort of coming together lately and I have to say it's a pretty pleasant change of pace. Dad's on the mend. We don't have the cancer beat but it's certainly at bay for a little while. He'll likely never be out of the woods completely. Schoolwork is going pretty well. I haven't been working absolutely as hard as I can but I have low A's or high B's in all my courses. I planned out my course schedule for the next few semesters and figured out that I'll graduate in December of 2010. That's with 4 course summer sessions. It's longer than I'd like but nothing from Oglethorpe carries anywhere. Plus CS is my second time switching majors so I was practically starting over. *sigh* Still, I have a plan and that's pretty nice. I also did taxes this week and found out I'll get a $1500 rebate.

You also may notice that the site is going significantly faster of late. I finally manned up and began paying for "real hosting". There are a number of benefits, not least of which is freeing up my home connection from requiring a Static IP. Moreover, the upload speed and latency are much, much better at the hosting facility. It's a Virtual Private Server running ArchLinux which I purchased through Linode. It's $20 a month and so far I couldn't be happier with it. I may do web development on it in Haskell, Scheme or Lisp at some point but that's down the road a bit.

Not everything is roses though. I got hosed on my berrics predictions. Of course, I blame Steve Berra. Marc and Steve were supposed to have a nice game of skate but then Steve caught something awful that looks like chicken pox. Instead of putting the round off further, Steve MC'd and pitted Marc against Johnny Layton who failed to make his first round appearance. Marc was definitely having an off day. He missed like 4 tricks before beginning to hit his stride and it was too little, too late.  If I recall Marc missed a regular 360 flip and a nollie flip. It was painful to watch. Anyway, my whole bracket is F-ed.

I'm having a Street Fighter IV tournament tonight. I've been spending a lot of time working on my game this week. For some reason I get really competitive about fighting games but only fighting games. I don't think SF4 has the mass appeal or the elegance of Smash Bros though. I'll probably try to write more on that later and I should acknowledge I have a strong bias that I'm trying to compensate for from years of Smash Bros play. So far I've settled on Fei Long as my main character and I'm planning to spend some time getting decent with Gouken as my secondary. The tournament should be fun, at any rate.

Other than that, I'm having trouble thinking of what else has been going on. The one bug in Linux that's been bugging me is fixed upstream so the next ALSA release will make me pretty damn happy. I'm increasingly enamored with Haskell. I'm slowly beginning to work my way through Real World Haskell and plan to spend a good bit more time on it over spring break (March 8th-14th if you were wondering). It's the only language I've seen that seems like it can handle issues of parallelism and concurrency more or less today. I'm definitely keeping a close eye on it.

The Generic Quick Post

posted on 2009-02-19 21:20:53

I didn't even realize I hadn't posted in over a week. I just got through the first big "crunch period" in school and did pretty well. This weekend will be relaxation and unwinding to a pretty large extent. So what's happened besides the school stuff?

I got Street Fighter IV. I am planning on throwing a tournament...details forthcoming. I already think I prefer my Smash Bros tournaments.
I'm enjoying Fleet Foxes and also The Stills and Vampire Weekend at the moment. Mmm, mmm, music.
ArchLinux finally put out a new release for the first time in a while. They're also going to try to drop releases with each Kernel release from now on which would be pretty damn cool. A distro that releases 4 times a year? Watch out. Not that most of us Archers don't just install and roll along...
This is my jam and beautifully and entertainingly explains what I'm trying to say about parallel programming and the future. It also advocates haskell a bit which is nice.
This just generally talks smack about for loops which is not a bad thing. I'm so sick of for loops. I'm not going to get into my snobbery right here. Just know that the fact that I ought to learn C for the future so I can deal with the past is a little frustrating at times.
Finally, I'm 3 for 3 on my berrics predictions and with any luck I'll be 4 for 4 this weekend when Marc Johnson finally fights Steve Berra.

My .hgignore

posted on 2009-01-14 23:25:29

Before I forget it or fail to version it somewhere I want to post the contents of my .hgignore file. You may or may not find it useful if you're a mercurial user but it's really for me. :)

syntax: glob
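For anyone unfamiliar with the format, a glob-syntax .hgignore looks something like this. These particular patterns are illustrative examples, not necessarily the ones I use.

```text
syntax: glob
*~
*.orig
*.rej
*.fasl
.DS_Store
```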

Counting Down

posted on 2009-01-13 00:45:25

It's been a pretty eventful holiday season. I wrecked my Maxima on Dad's birthday (the 23rd), in particular. I didn't want to mention something until I had a concrete opinion to express and now I do. It was for the best. Seriously. I had spent considerable amounts of money trying to keep the car in good repair this year, it wasn't paid off yet and my parents and I had long since agreed it was a lemon but had no way to get rid of it.

Luckily, the insurance has paid off the car, I'm fine and this enables me to cancel my insurance and have a bit more financial leeway for the coming school year. I had to figure out a method of public transit from Brookhaven to Marietta but that didn't turn out to be too tricky. It ultimately just means I'll spend about 3 hours twice a week (Tuesdays and Thursdays) on CCT and Marta. Today I went for the first time as a test run. The ride was enjoyable and afforded me a bit of time to get some sampling done. I took care of a few on-campus errands, bumped into several disparate acquaintances such as John Valentine and some Fayetteville folks and scoped out food options.

In short, I'm ready for school to start even though they're going to make me learn C#. I hope I can use Mono most of the way.

I keep reading about the economy, earmarks and entitlements. Is no one else thinking that the real issue here is sustainability, limited resources and population control? Minsky gave a talk on this but he's not the first person to raise the issue. When are we going to admit the planet can't indefinitely support 6 billion people or at least do the math/research to prove that it can? That's the question I'd really like us to be thinking about. If we want to be honest and accountable, the big picture is the only place to talk. The futurists and sustainability freaks seem to be pretty much the only people doing that. I'll rant about that more later but I wanted to at least note that it's been occupying increasing amounts of headspace for the past 6 months.

Speaking of mindspace, I still really love ogling the stack languages. I played with Forth a little bit but didn't get too far. It was just a fun diversion from Lisp at the time. I still really want to check out Factor.  Frankly, after seeing how fast the factor guys grow the ecosystem and libraries around the language I believe their productivity gain claims. Go Planet Factor, Go Slava. I'm sure I'll get around to playing with it sooner or later. I have a nightly installed and FUEL setup...which actually popped up on Reddit today ironically enough.

The only consumer-y thing I can think of that I'm excited about for the foreseeable future is the upcoming release of a PS3 game called Skate 2. Skate 2 is really just a patched-up and glorified Skate 1 to me. I'm still excited and I don't mean to speak ill of EA Blackbox but I couldn't care less about much of the new stuff. I just wanted custom soundtracks, a tripod camera and good PS3 framerates. It comes out on January 21st and I'll disappear for a week in all likelihood exploring all its corners.

I still marvel at and love my new X200 and btdubs, btrfs is in for 2.6.29. For the record, I've had a newsgroups subscription with Astraweb for about a week now. I've poked around for a few Oscar screeners but haven't observed anything I couldn't find on isohunt or thepiratebay. Sure, the download speed is a boon but I'm looking for content that isn't readily available on other networks. I've checked out nzbmatrix and a few similar indexes. What am I missing?

The RIAA has said they're giving up lawsuits and trying something else. I'll be keeping an eye out and looking for services that don't discriminate against Static IP users with their own blogs or pander to the RIAA/MPAA/etc. In IP related news, Lawrence Lessig appeared on the Colbert Report. He has much more interesting things to say beyond what was covered so I'd recommend picking up some of his books, reading them free online or at least reading the Wikipedia articles on his first book and Free Content.

I've also been catching up on my music obsession over the cold season and particularly enjoyed White Winter Hymnal by Fleet Foxes and Grounds for Divorce by Elbow the last few days. Also, I love Battles at least a little bit for writing Tonto, Atlas and Leyendecker. I'm also enjoying Amon Tobin all over again because he's a damn genius.

I haven't been keeping up tremendously well with my New Years Blogging resolution, as you may have noticed, but I think that will change now that I'm busier. I've been doing much better with the music sampling and skateboarding. On to the coding and schooling.

ArchLinux64 on the X200: A Field Report

posted on 2008-12-10 19:39:06

As I mentioned before, I have been hankering for a nice, shiny new ultraportable for almost a year and Christmas came early in 2008. I settled on a Lenovo Thinkpad X200 (formerly IBM) for a number of reasons.

1) It has a dock. I really wanted something with a dock since I'll be commuting with this puppy everyday for a while.

2) It weighs under 4 lbs, under 5 in the laptop bag with a power cord, and gets over 6 hours of battery life under most use scenarios, sometimes as much as 9 hours.

3) It's relatively affordable compared to similar ultraportables and I'm a big Thinkpad fan. This model also avoids touchpads which I dislike and ships without an optical drive which is a plus. After all, how often do you really use your optical drive? Save on weight. Save on power. There's a DVD/RW in the dock for when you're home anyway.

I've had it since Monday and the results are in. It's a lovely, lovely little machine and any complaints I have would be directed at software rather than hardware. That's the way it should be. I had it shipped with Windows Vista Basic because there was no option to have it shipped without an OS or with Linux and having it shipped with XP meant adding the cost of XP to the purchase price. You effectively purchase Vista and XP. Ridiculous. Therefore, the first order of business was to get Vista scrubbed off and XP SP2 installed. Now, it may seem ridiculous to worry about running XP instead of Vista but I'll cut it short and just call it a personal choice rather than defend the decision. I believe there are performance and compatibility reasons to do so as well. Google around and decide for yourself.

I have an XP Pro disc but if you try to install from it the installer blue screens. There were two separate problems in my case which both needed addressing. The first being that my disc was XP SP0 and XP SP2 is required to recognize some of the hardware. The second being that even XP SP2 needs a driver added to recognize the Intel Storage Controller unless you want to fiddle with BIOS settings for the controller before and after the install. I decided the best path was to Slipstream Service Pack 2 and the requisite drivers onto my XP install disc. Slipstreaming is inserting new updates, service packs or fixes into an old installer for a given piece of software. The program I used to do it was n-lite. Microsoft kindly offers a free download of SP2 and Lenovo, of course, offers a download for the Intel Storage Controller Driver. I should note that you still need an original XP disc for n-lite to copy down to the hard drive and modify before you burn yourself the updated (slipstreamed) version. When slipstreaming the Intel Drivers make sure to select textmode drivers instead of PNP and select the Intel ICH9M-E/M SATA AHCI Controller as the device to support. That about does it for Windows. Afterwards it's just the normal grab all the drivers from Lenovo runaround.

Linux has proven to be a bit more interesting. I used gparted on a LiveCD to shrink Windows' NTFS partition to 30GB of the 160GB drive in the laptop. Then I fired up the latest ArchLinux64 CD to install that. Afterwards I found that neither the ethernet card nor the wireless card in the laptop was recognized by the old kernel version. My solution was to grab my trusty USB Flash Drive and another PC, head to my Arch repo mirror of choice and follow the links down to the most recent x86_64 kernel package. I downloaded the package onto the Flash Drive and installed it with pacman -U. Then I hooked the laptop to a wire, grabbed the iwlwifi-5000-ucode package and added iwlagn to the modules array in /etc/rc.conf to enable wireless. Goodbye, networking problems.
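The offline-install dance, sketched out. The kernel package filename below is hypothetical; substitute whatever your mirror currently carries.

```shell
# On the laptop, install the kernel package copied onto the flash drive,
# then, once wired ethernet works, pull the wireless firmware.
pacman -U /mnt/usb/kernel26-2.6.27-1-x86_64.pkg.tar.gz  # hypothetical filename
pacman -S iwlwifi-5000-ucode

# /etc/rc.conf -- append the Intel wireless driver to the modules array:
# MODULES=(... iwlagn)
```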

I had stated that I'd be making a dual RedLinux v.014 release in the near future on i686 and x86_64. Apparently I didn't do my homework though, as I found that the flashplugin and virtualbox-ose packages are only available on i686. Granted, flashplugin for x86_64 is in testing, as Adobe finally released a 64-bit alpha for Linux recently. They've been dragging their feet on 64-bit Linux support, so tentatively: go Adobe. That said, it could be a while before that moves from testing to extra. Apparently there might be legal issues. While perusing the mailing lists to learn more I stumbled upon some big transitions in the ArchLinux camp. There are lots of reasons this will likely end up an exciting and good thing but I'll be keeping a close eye on it. As for virtualbox-ose, I just traded it for the virtualbox_bin AUR package. In the process I lost a little software freedom but I gained some features. I then asked pacman to go download and install about 70 or 80 packages (2GB worth), set up a few config files and called it a day.

The next morning, however, I found a few things amiss. The most serious issue for me at the moment is that when docked on the Ultrabase, Linux fails to pump audio out through the dock's headphone port, making the use of external speakers a pain. There's a bug filed in Launchpad, but I can't find anything in ALSA's bugtracker because you need an account to browse it. That could surely do with a bit more openness. At any rate, I'm not sure how soon to expect a fix as it's surely not affecting many people, so I'm keeping my eyes peeled for workarounds. There is nothing on the ALSA mailing lists either.

After that, my main concern was getting suspend working. Yes, I can hear your groans, but it's not that bad. I promise. The issue was some sort of concurrency-related nastiness in the current xf86-video-intel driver that was causing problems on resume. A nice workaround script was posted here to get it to work properly by disabling one of the cores during the suspend operation. A new X server with improved XRandR (display hotplugging) and an updated xf86-video-intel driver will release alongside kernel 2.6.28 in January, which will also conveniently fix the issue of dealing with the external monitor connected to the Ultrabase. It presently only detects the monitor when booted up on the base, and the resolution options are less than perfect. The new drivers will indeed be nice. I also got to learn a bit about pm-utils and cpufrequtils throughout all this, both of which you should install to get the most out of your battery life. Oh, and powertop. Glorious powertop. Hell, check out all of while you're at it.
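For the curious, the core-disabling trick works by writing to the core's sysfs "online" flag around suspend. A sketch of the idea (hedged: the hook structure is illustrative, not the actual posted script; real pm-utils hooks live under /etc/pm/sleep.d/ and target /sys/devices/system/cpu/cpu1/online, but this demo writes to a temp file so it runs unprivileged):

```shell
# Toggle a CPU core's "online" flag around suspend/resume.
# $1: path to the flag file (really /sys/devices/system/cpu/cpu1/online)
# $2: pm action (suspend/hibernate going down, resume/thaw coming back up)
toggle_core() {
  case "$2" in
    suspend|hibernate) echo 0 > "$1" ;;  # take the core offline pre-suspend
    resume|thaw)       echo 1 > "$1" ;;  # bring it back after wake
  esac
}

flag=/tmp/cpu1_online   # stand-in for the sysfs file
toggle_core "$flag" suspend; cat "$flag"   # -> 0
toggle_core "$flag" resume;  cat "$flag"   # -> 1
```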

That's most of what I've been up to for tuning at this point. Next is probably just a little xmodmap and /etc/sudoers magic to get a few of these function keys doing what I want. Then I'll probably just wait on alsa, kernel and xorg/xf86-video-intel updates. I'll also probably end up using lxrandr to play with the XRandR settings when the new release happens. The other fun thing I learned in all of this is just how well loved Thinkpads are among Linux users, through things like the Ubuntu X200 Owner's Thread and the linux-thinkpad mailing list with its discussion on maximizing battery life.

For now, I'm very happy with the new machine and probably will continue to tweak settings for a while. Turning knobs is fun after all. I also got a 1TB external drive for backup with the laptop and a Scorpius M10 mechanical keyswitch keyboard for use with the dock. The keyboard is lovely except for some spacebar sticking so I'm filling out a form to get that replaced. Ah, the joys of computing.

Wandering Flame

posted on 2008-12-07 22:41:28

It's official. I'm not dead. I mean, 4 weeks without posting? I'm pretty sure that's a personal record. A variety of things have been going on, mostly positive. I spent Thanksgiving meeting the parents of my girlfriend (you know, Teresa?) in Virginia. We had volunteered to cook while we were up, which went smoothly, to my relief. One thing we cooked was particularly excellent: the Pioneer Woman's Cinnamon Rolls. They really are as good as she says. I also have finally found a fried chicken recipe that I'm happy with, which is a plus as I'm hosting 30 Rock viewing parties on Thursdays and they make a good, simple meal.

I've been wrestling with financial aid a lot lately. I'm excited about going back to school in the Spring, but figuring out money always seems to be a bear. That seems to be true out of school too, though. Most of the financial aid paperwork is done: I've accepted some federal aid and I'm waiting on some private loans. Oh, the debtor's joy. Also, I've got my schedule worked out and it's Tuesday-Thursday only, so I'll be able to spend the rest of my time learning and hopefully pick up a part-time job! Observe:

Technical Writing - TCOM 2010
TR; 10am - 11:15am; Atrium Building J-213; Jonathan Arnett
Prog and Problem Solving II - CSE 1302
TR; 1pm - 2:15pm; Atrium Building J-217; Jon Preston
Same Class - Lab
T; 2:30pm - 4:15pm; Atrium Building J-201L; John Vande Ven
Global Issues - POLS 2401
TR; 4:30pm - 5:45pm; Atrium Building J-101; Jason Seitz
Discrete Mathematics - MATH 2345
TR; 6pm - 7:15pm; D-Classroom 235; Jennifer Vandenbussche

Of course, I have to look and recall all the lost credits from Oglethorpe. Why am I taking Global Issues again? Oh, right. Transfer fail. Moving on.

Speaking of...a Lenovo X200 is on its way to me and should arrive tomorrow. Damn you, online package tracking! My Thinkpad A31 just wouldn't cut it for commuting to SPSU. I tried that in Fall 2007; 45 minutes of battery life and temperamental wireless doesn't make for a good student laptop. I've also been working on RedLinux a bit lately and should be making a dual i686 and x86_64 release of RedLinux version 0.14 sometime next week. I'm also hoping to read some Lessig/Benkler or maybe one of the Open Sources books soon. I'd be wise to work through some of Spivak's Calculus or Head First Java before starting back at SPSU. We'll see. That's all for now.

This just in: Hotmail is evil and hates Linux users.

posted on 2008-11-08 21:07:53

This shouldn't particularly surprise anyone, and I should be more surprised that it's taken me so long to ditch my old Hotmail account. Embarrassed, even. It is my own fault, to be sure, that I'm using crappy lock-in-focused webmail.

The short version: Hotmail (now Live Mail) won't allow users on Linux platforms to type in the body fields of e-mails, rendering the service useless. The quick fix (on my machine at least) is to disguise yourself as a non-Linux user. Type "about:config" in the address bar and acknowledge any warnings you may get. Then type "useragent" in the filter and change the vendor value (if it's there) and the general.useragent.extra.firefox value to "Firefox 3.0.3".
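If you'd rather not click through about:config on every profile, the same override can be dropped into user.js, which Firefox reads at startup. A sketch (the profile path is a placeholder for your real ~/.mozilla/firefox/<gibberish>.default directory):

```shell
# Placeholder profile directory; substitute your actual Firefox profile path.
profile=/tmp/firefox-profile
mkdir -p "$profile"

# Firefox applies user_pref() lines from user.js at startup.
cat >> "$profile/user.js" <<'EOF'
user_pref("general.useragent.extra.firefox", "Firefox 3.0.3");
EOF

cat "$profile/user.js"
```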

Here's a general note to anyone working on a web site or platform: if you're doing HTTP header checks to see which platform and browser the page recipient is using, and warning them if they're not using the browser/platform your site is designed for, you're doing it wrong. I don't ever want to see another warning or "sorry, this page won't work for you." That's fundamentally not what the internet is about. That's not the way the web works. It's called openness, bitches. Get used to it.

I'll dump my messages out of hotmail, switch a few services to use something else and move on with my life. This account has been around way too long. Not that I couldn't just keep fooling them by modifying my useragent strings.

Another Emacs/Slime Cheatsheet

posted on 2008-10-12 05:37:04

I've been picking up more and more Emacs and SLIME while working my way through Practical Common Lisp over the last week or two. I'm really happy with it as a work environment at this point, but have tons left to learn. I haven't even written any elisp code to script it. Of course, I haven't had a need yet. I'll get there, and I'll update this as I learn new things. I'll just note that I'm also quite attached to ArchLinux as my distro and, increasingly, Xmonad as my window manager. It's the first time I've felt really settled on an operating system/environment since moving to Linux. Maybe ever. I'm pretty happy about it and, aside from using RedLinux as a way to see what I like, I've posted all the config files here. Before the cheatsheet, here's a quick Linux tip on killing processes I found: try passing -1 or -9 to kill along with the PID. Try -1 first, then -9 if all else fails. On to the cheatsheet.
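To make the kill tip concrete, here's a throwaway demo (sleep stands in for the stuck process; -1 is SIGHUP, the polite nudge, and -9 is SIGKILL, the sledgehammer):

```shell
# Spawn a disposable background process to practice on.
sleep 60 &
pid=$!

# Try the gentle signal first; escalate only if the process survives.
kill -1 "$pid"
sleep 1
kill -0 "$pid" 2>/dev/null && kill -9 "$pid"

# Exit status is 128 + signal number: 129 means HUP did it, 137 means KILL.
wait "$pid" 2>/dev/null
echo "exit status: $?"
```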

EDIT: Yes, it's ugly. Piss off. I miss monospaced fonts already, I'm grumpy, I'm tired, it's 1:40 am and I haven't been up this late in forever. I'll fix it later. ;-P

;;Emacs Cheatsheet:

; C-7              Undo.
; C-8              Backspace.
; C-s              I-search forward.
; C-v              Page down.
; M-v              Page up.
; M-<              Beginning of document/file.
; M->              End of document/file.
; C-l              Center screen on cursor.
; C-n              Next line / down arrow.
; C-p              Previous line / up arrow.
; M-f              Forward a word.
; M-b              Backward a word.
; M-bksp           Delete previous word.
; C-k              Send a line to the kill ring (cut).
; C-y              Place a line from the kill ring (paste).
; C-x C-f          Find (or create) a file and open it in the buffer.
; C-x C-s          Save the file in the buffer.
; C-x b            Switch to a buffer. Type a specific buffer name or hit enter for the default (last buffer).
; C-x o            Move the cursor between windows.
; C-x 0            Close the current window if other windows exist. (Kill this window.)
; C-x 1            Make the current window the only window. (Kill all other windows.)
; C-h t            Start the Emacs tutorial.
; C-h k            Prompts for a keystroke and tells what command it invokes.
; C-h w            Prompts for a command and describes the keystroke it's bound to.
; C-h b            Displays a list of bindings to various commands.
; C-u num command  Repeats the given command num times.

;; Slime Cheatsheet:

; M-p                         The up arrow for the SLIME REPL (previous input).
; C-c C-c                     Sends an s-expression to SLIME.
; C-c C-k                     Compile and load the file represented by the current buffer.
; C-c C-l                     Load a file in SLIME; defaults to the file in the current buffer.
; C-c C-z                     Pulls up the REPL in a frame and moves the cursor there.
; C-c RET                     Runs macroexpansion.
; , quit                      Kills the running inferior-lisp and closes all the SLIME buffers.
; q                           Leave the debugger and return to the REPL.
; M-x slime-inspect           Run the inspector.
; M-x slime-profile-package   Run the profiler.
; M-x slime-profile-report    Check the profiler results.
; M-x slime-profile-reset     Reset the profiler.

Everything in the World

posted on 2008-10-12 05:08:07

I can't believe it's already October. It seems like only yesterday that I decided to take a break from school. For that matter, it seems like only yesterday that I became unemployed...but this was week 3 and a pleasant week it was. I'm continuing to try and buckle down and be more productive in various ways in spite of the fact that I don't really need money for another month and a half or so. So, what's been going on of late?

The Employment World: My interview with King and Spalding went pretty well. It was very straightforward and none of the technical questions were remotely difficult. By the sound of it, it will also pay more than my last job. That's a good and bad thing. It's good because a decent wage would be nice and my last job wasn't one, in my opinion. It's bad because it may be more remedial than my last job. It's a little retroactively upsetting to realize that I'd be paid more here for what sounds like substantially less difficult technical work. We'll see. I also know it'll take a week or two before they let me know whether or not I'm on the list for an in-person interview. Thanks to everyone who asked about it and/or wished me well. Devon and Don, I'm looking at y'all.

The Education World: My friend Will keeps sending me awesome links to research, papers, sites and articles. I also had a fascinating conversation on schools and education with Oglethorpian Chris Latshaw and was reminded why I love Oglethorpe in the process. Conversations like those made the school worth it. I should get around to writing more about all that next week. Also, (to Will) I'm half-way through Practical Common Lisp and hung up on an element of the chat program. I'm being a sissy about e-mailing you questions. I'll write this one off soon, I promise but I'm just trying to wrap a sane Chat UI around the Spread library. I'll send more details soon. Finally, I've downloaded about 100GB of video lectures about coding and math this week. I spent an afternoon queuing them up and left it running a few days. Remember me complaining about everything being in Real Format? Well, I still won in the end. It wrapped up this afternoon. My apologies to the Internet Archive's Ars Digita mirror. They must feel violated.

The Linux World: The Linux Kernel version 2.6.27 was released Friday. Development will start on 2.6.28 now. I'm excited about 2.6.28 because I'm hoping btrfs gets pushed into mainline. That could take a little while but it's still fun. Also, this is the first time that release season has come around and I'm really not interested in Ubuntu or Fedora. Arch/RedLinux has me pretty satisfied.

The Code World: There are some really cool lectures at the S3 conference. I posted about it before because Dan Ingalls presented the Lively Kernel, but at this point I'm also really interested in the STEPS project and Ian Piumarta's work. Partially because I'm really jealous of Luke Gorrie, again. And I hope that OLPC XOs really do become more reflective and Lisp Machine-like. Beyond that, I stumbled on two web framework tutorials lately, neither of which I really have the time to work through. Sad. One is in Factor and the other is in PLT Scheme. Sexy!!!

The Friends World: Don Gerz has written a number of things that caught my eye lately. Particularly a piece about Kierkegaard. Lex has also written some provocative questions about Banksy. I hope she'll post her paper when it's done. She's also looking to try Ubuntu in the near future. Go lex! Kris Osterhage simply hasn't been posting enough. ;-)
Chris Blair wants this election to be over. I'm rather with him on that one.

That's most of it. I need to write up a cheatsheet for the emacs and slime commands I'm using and then 2 or 3 articles on the stuff about Common Lisp I've been learning. Maybe at the end of it all I'll go back and revise my positions from the Language Adoption and Lisp article. Other than that, I'm trying to get through Season 2 of 30 Rock before Season 3 kicks off at the end of this month and really enjoying the break from employment that I have. Now somebody hire me already! More soon, everyone.

Funcalls and Fun w/Code

posted on 2008-09-29 17:49:22

Life: I have a part-time job interview tomorrow and I've gotten by so far through contract work. I'm also really enjoying not having a car. I've picked up some new tunes and am in guitar fingerpicking mode. I should learn how to fingerpick myself, really. For now, though, I'm just listening to Kaki King and Andy McKee. Oh, and Calexico too. They're awesome. Moving on...

Techie stuff: I've pretty much completely switched to Xmonad. It's great, and I've polished up my key layout and config for it. There will be some changes in that sense in my next RedLinux release (fast Amazon download mirror and install guide here). For example, there won't be a Caps Lock key in my Linux. It will just be another Control key. It's not like you use it anyway, right? I'm also starting to finally get comfortable with Emacs and SLIME. And Practical Common Lisp is a really fun and great book for picking up lisp. More on all that later. Here are some fun code snippets:

(dotimes (x 30)
  (dotimes (y 30)
    (format t "~3d" (* (1+ x) (1+ y))))
  (format t "~%"))

(do ((n 0 (1+ n))
     (cur 0 next)
     (next 1 (+ cur next)))
    ((= 10 n) cur))

Pop quiz: What do these two Common Lisp snippets do?
(reverse '(The first prints out a multiplication table up to 30x30. The second computes the 11th fibonacci number.))

And the first macro:

(defmacro do-primes ((binder lbound ubound) &body expr)
  `(do ((,binder (next-prime ,lbound) (next-prime (1+ ,binder))))
       ((> ,binder ,ubound))
     ,@expr))

Sure, it's useless, but it makes sense and points the way to some great possibilities. Additionally, destructuring parameter lists like that are grand. That's enough lisp to bug you folks with for one day. Deuces.

Let That Which Does Not Matter Truly Slide

posted on 2008-09-25 22:23:34

Being unemployed is starting to get scary, if only because I don't know how I'm going to make it to the next thing. There's a next thing I'd really like to see happen, though, and I'm prepared to risk a fair amount to get it. I may need an interim thing to pay bills until the next thing becomes reality, though. That's tricky business. If you know of anybody that would be interested in a contract I.T./Linux/computer nerd, please put me in touch with them.

My buddy Chris Blair started his blog in the last few days with a review of an old Business Sim game but he's already got up entries on everything from Football to SpaceX's Falcon1. Good stuff.

Kris Osterhage also recently dove into the blogging world first with blogspot but then quickly migrated to livejournal. He's blogged on gaming and hardware mostly but he recently put up some politics entries and...well, I'm not ready to dive into that yet. Call me preoccupied. Those conversations can get long and bitter and for the moment, my energies are best spent pretending this isn't an election year.

Now, on to RedLinux. At long last, I've got a really fast mirror for RedLinux, so if you'd like to try it out, downloading the ISO from my Amazon S3 mirror is the way to go. Just click here. Burn the ISO to a CD (if you're not familiar with this, Google can help) and reboot your computer with the CD in the drive. Then follow my simple installation guide:


Wait for the CD to boot to the login screen. Type guest as the username, hit enter and then type guest as the password and hit enter. Tada. You're now using RedLinux. To get to my analog for a "start menu" just right-click on the desktop. You'll see something like this.

Open a Terminal by clicking terminal, then simply type "sudo /opt/larchin/run/", entering guest for the password when prompted. After a moment, the installer will pop up like so. Personally, with Linux installations I like getting the hard part out of the way first, so once Larchin is up click on "Edit Partitions Manually" and then select the Gparted option like this. It will pop up Gparted, which should look something like this if you're on a Windows-only or Mac-only machine. If your partition layout is more complicated than that (i.e. your drive doesn't end in one large partition) then you may want to ask a friend (or me) for help. I'll be more than happy to give you a hand over the phone or, if possible, in person.

It would also be a good idea at this point to pull up another terminal as before and type "killall thunar" to make sure that the file manager doesn't try to automatically open any of the partitions as you're working on them. If the terminal complains that there is no thunar process when you tell it to kill them all, then you're in good shape and can get back to Gparted. Just type exit to close the terminal.

Once back in Gparted, click on the partition and select the resize button in the top left to get this screen, then click the ending right arrow and drag it to the left to shrink the partition like so. Once installed, RedLinux will take up about 2.5GB (~2500MB) so I'd give it at least 3000MB but you can go as high as Gparted will let you really. Click "Resize/move" once you're satisfied and you should end up hereabouts. Select the unallocated space and click the new partition button to get this dialog. Set the filesystem to be ext3 and then reduce the size until a little over 1000MB of free space will remain afterwards, then click add. Select the unallocated space and click the new partition button once again, this time setting the file-system type to linux-swap and then clicking add and you should end up with something like this. Before moving forward, note the /dev/sda* numbers of your partitions. Know which one is swap and which is ext3.

You've now reached the scary/boring step. Click the apply button near the top center and click the apply confirmation to have Gparted save your changes to the disk. This could take anywhere from 10-20 minutes depending on the size of your hard disk but shouldn't take much longer than that. Once it's done, exit Gparted and select "Use existing partitions/finished editing partitions" in the Larchin installer. It will ask you about the swap partition at the next screen, and the box next to your swap partition number should be checked. Click okay. It will now ask you to select the install partition and should have defaulted to your ext3 partition. Hit the checkbox under format and click the dropdown under Mount Point to select "/". It will request one more confirmation that it has your information down correctly. Hit okay to proceed. The installation will proceed accordingly.

Almost done. All you need to do now is give it a root password (this is not the same as your regular logon/user password; pick something else that you'll remember). Write it down somewhere if it makes you feel more comfortable. You really shouldn't ever need it. Last but not least, tell it to install grub to your MBR and you're done! Click okay to exit the installer, then right-click on the desktop and click reboot. When your computer reboots you should have the option of booting into your brand new RedLinux install. The username and password will still be guest, so read the new_user_guide online or by logging in, right-clicking to open the start menu, opening a text editor and then opening "new_user_guide.txt". Happy hacking!

RedLinux released!

posted on 2008-09-23 17:13:53

At long last, and at least 3 days later than expected, I've released RedLinux into the wild. Keep in mind this is really just an ArchLinux derivative that I've had lots of fun working on. Most to all thanks should go to the awesome folks behind ArchLinux and the larch software for creating Live CDs. That said, you can now download the ISO image for RedLinux from my high-speed Amazon S3 mirror, and there is an install walkthrough now available. Burn the ISO to CD and reboot with it in your CD or DVD drive to try it out.

There's a new user guide at and the config files also reside in that directory. I'll be creating an individual page with an installation tutorial and a few other things in the coming weeks. There are a few other things I'm focused on at the moment but that is coming. You can of course always contact me via blog comments, IM, or e-mail if you're trying it and having any problems however.

Four quick pointers (not in hexadecimal):
1). The default username and password are guest. The new_user_guide.txt file on my website or in the home directory of the install tells you how to change it and/or create new users, etc.
2). The "start menu" so to speak is accessed by right-clicking on an empty part of the desktop. You should be able to get to anything you need from there (file manager, text editor, web browser, music player, reboot/shutdown, etc).
3). There's a run command available by pressing Alt+F2. Typing wicd-client gets you the wireless/network browser, firefox gives you the browser, etc. Wicd-client is supposed to load on launch but does not; I accidentally messed up a start script and am hopefully getting it fixed and uploaded this afternoon.
4). You can install RedLinux by opening a terminal (Right-click, then Terminal) and typing "sudo /opt/larchin/run/" and hitting enter. Enter the guest password at the prompt and off you go.

I'll be updating this post and the upcoming RedLinux portion of the site as more content is available.

Odds and Events

posted on 2008-09-17 14:36:11

Good morning. Plenty has happened since yesterday. Let's recap.

Nick Ali has written up some details about Atlanta Linux Fest in his blog. It's this Saturday on Northside Pkwy from 11am-6pm, there's pizza for $5 and I'll have my laptop and my ArchLinux derivative in case you want to see it along with download links and (if I can get some blank CDs) hard copies. You should come! I'll make it fun. I promise. Don't you all wonder why I'm so crazy about this shit sometimes? You'll know.

I was wondering where I'd host the ISOs for my ArchLinux derivative. It looks like that problem is solved. I give you Badongo. They have a 700MB upload limit and files stay up until they're inactive for 90 days if you're a free member. That's pretty excellent. Hopefully I'll put out releases every three months or so, though they'll mostly be package updates in all likelihood. At some point in the future (circa me getting a new laptop) I plan to do an Arch64-based RedLinux build and get images for it online. Now if only I can get the RedLinux portion of my site up by Saturday...

In bad tech news, I may have to get a computer with iTunes going just for iTunes U. I don't know if that content is DRM'd and I suspect I could strip it out anyway. I guess I'm still evaluating my options for stealing an education.

In neat tech news, I'm generally more excited about the stuff Amazon is doing with AWS than the stuff Google is doing these days, technically. Let it never be said that I don't think the Floating Data Center idea is pretty kick ass though.

Final tech note, Wordpress optimization seems to be about two things. Installing WP-Cache or something similar and database tuning. I mean, really, it's about reducing the number of times that PHP calls or database accesses need to occur but I should learn more about databases. Later on that is, when I'm thinking about building real sites.

Note to Benchwarmers Clairmont: I know you have a 21 and over age limit set. I'm 22. My girlfriend and my buddy Kris aren't. We won your trivia night last week and things were cool with us then. We showed up this week and someone ID'd the whole table and asked us to leave. Here's a hint: If we're not planning on ordering alcohol and just want to play trivia and eat food you're losing business by asking us to leave. It's not like we haven't been there dozens of times before. Just a thought.

I'm looking at various options for housing next year. Our lease expires in May and I'm not sure what I'll be up to or where I'll be working but I suspect I'll want to live in roughly the same area I'm in now. I don't know that I'd want to be in the same house. The rate is good, the location is good, the home itself is really pretty decent. That said, managing 5 people in a house is a little...bleh. Teresa and I are both fond of the idea of Post Oglethorpe as a lot of our friends are there and it's close to school for her. I'm not sure how I feel about the prices though. I'd certainly want 3 people in a 2 bedroom since there are cheaper options than Post Oglethorpe available. Post Oglethorpe was sold just this August though. Maybe new management will bring changes. Time will tell.

Eat Food, Sleep, Have Visions

posted on 2008-09-12 16:19:27

Wow. What a day. I've been managing to keep fairly busy lately but also enjoying myself a good deal. So what's in store today? A word about music, a bit about games, some programming thoughts (on Factor and Common Lisp) and a mention of RedLinux.

Music: Four Tet has been making me really happy for the past 24-48 hours. I've known and liked Four Tet for a few years now but I think just how good he is at what he's doing only hit me recently. Two tracks did the trick for me and both are off his album Everything is Ecstatic. One is titled 'Smile Around The Face' and the other is titled 'And Then Patterns'. I'm posting a streaming link to 'Smile Around The Face' because it's awesome. There's a pretty interesting list of his 9 most influential records here and some interviews here and here that I may read later. As a sidenote, I'm jealous of all those NCF kids and their Walls. I want to throw a wall. If I did though I'd probably be silly/lame and try to sneak this track in...

Four Tet - Smile Around The Face

Games: I've been saying that one of my favorite things about the new game consoles is the downloadable games. Xbox 360 and PS3 seem particularly strong in this category to me, though the Wii has old mainstays from Nintendo lore to prop it up. I've already mentioned echochrome and everyday shooter here in the past and they're both quite good, but I don't think I've mentioned Super Stardust HD. It has been and continues to be simply delightful. Some video tips on the game were recently released as a free download on the PSN and convinced me to pick up the $4.99 single-player expansion pack. They also released a free patch for the game so that you could play music off the PS3 hard drive once that functionality was possible through firmware updates. If anyone who has worked on the game is reading this: excellent, excellent work, guys. Really. This is how to build a title and continually improve it, create community, etc. I look forward to your future releases.

Languages: I'm going to separate my blathering here into sub-ramblings based upon the language concerned.

First up, Common Lisp. I've been having some fun working with a friend to get a simple Gmail scraper/wrapper API developed in Common Lisp that would allow me to connect to accounts, grab and compose messages, etc. We were relying on a CL library named mel-base to achieve this. I've been doing development locally and in the process gotten a bit more familiar with SBCL, SLIME and ASDF-Install. I've definitely come around to the idea that there is a place for both Common Lisp and Scheme, which I, in misguided form, derided some time back. There are certainly pros and cons to each. At any rate, the combination of ASDF-Install, SBCL and SLIME is pretty great. That said, I realized after a bit of tinkering that mel-base lacks SSL support, even here in 2008. That means it won't work with most (if not all) of today's web-based e-mail services, which require SSL to encrypt the connection to the server (you know, so people can't steal your password and e-mails). I'm quite surprised it isn't there by now but assume the maintainer has been busy. Luckily, there is a CL library for SSL called, appropriately, CL+SSL. I'm very tempted to find a way to patch SSL support (with a dependency on CL+SSL, of course) into the POP3, IMAP and SMTP folders in mel-base and contribute the patch upstream for the next release. I have no idea what I'd be doing really, and I'm fairly intimidated, but it seems like a good start and a reasonable place to help and try my hand. There are some other people who have pursued this, though, that I should get in touch with first to make sure no patches are already in circulation for SSL.

Next up, Factor. I've been interested in stack-based languages since I first learned of them and still am quite intent on learning Forth in the near future, possibly as my first non-lisp language. I stumbled into some blog entries by Phil Dawes on why he likes Factor and has enjoyed learning it. He also has an excellent post digging down into the versatility and usefulness of the compiler. Speaking of which, the Planet Factor blog offers some of the clearest insight into the development and internals of a programming language I've ever read, particularly one as young as Factor. Keep an eye on this one. You've got one more year, Slava. I want my 1.0.

RedLinux and Logos: I'm solidifying "plans" for the v.08 update to RedLinux. You can catch a glimpse at the changelog. I also have some logo designs (Thanks, Neil!) for the lambdabang, I just need to decide on size and color. Once I get a logo for RedLinux, I'll start working on the web page for it and get the ISOs up. End of September? We just might be able to do that.

Just for Fun

posted on 2008-09-10 20:05:53

There's so much I've been meaning to post about lately and so much that's been going on. It's very hard to keep up with it all. This will consequently seem a bit scattered but it's largely divided into Gifts, Linux Stuff (which continues to bring me perpetual joy), programming language stuff and hardware stuff.

Redlinux: I've been working on my own ArchLinux Derivative over the past few months and mentioned it a bit here. I'm hoping to get an ISO for an installable LiveCD of it online by the end of September with a sort of beginner's guide and homepage for it set up here. There won't be a forum or anything initially. Just e-mail me for feedback/help. We'll see how that goes. I'm calling it Redlinux. Also, I put all the default *rc files and other important config files (including new user documentation and the changelog) in a new folder on the site. Redlinux is currently at version v.07. The initial online release will incur an automatic version bump to the nearest .x0 rounding up.

Logos: I'm looking to get a sort of logo for the site. I'm not sure where to go with this. I also need a separate logo for Redlinux. Any ideas are welcome. I have one for a site logo. It's a Unix Shebang combined with a lowercase Lambda. Like so: #!λ. I think it's pretty cool but it'd take some work to make it prettier. The Inconsolata font would be a good start. I don't think they have a lowercase lambda symbol though. :-(, Sad Panda. I'm thinking we call it the *Lambdabang*. Eh, eh?

Gifts: I've been thinking about money and my actual needs and wants a good deal lately. Part of that comes from having to constantly figure out finances due to being young and broke in a struggling economy. The other part is me thinking about the few material things I enjoy and which I'd like to prioritize. Good ideas for gifts for me that I hadn't previously considered are Internet Hosting (and you know I'll want pretty serious control over the box. Maybe linode or lylix.), a subscription to LWN (Linux Weekly News) which I've been enjoying a lot lately (the back issues are free) and various books from the amazon wishlist, as always. Cooking supplies might also be good but I'm probably best off picking those myself. Homemade good food. It's expensive, but fun!

Hardware: I've been thinking about buying a new computer for about a year to a year and a half now. I recently moved into the "strongly considering it/planning it" phase and started saving. This box would probably end up replacing my aging homemade beast of a "main desktop" which would in all likelihood become my server box. I decided fairly early on I wanted the new system to be a laptop because I'd really like to be able to go portable at any time and not be at a loss for processing power. Plus, that'll make it easy for me to move around lifestyle and home-wise which seems reasonable at the moment. To be honest, my needs are essentially met by my current equipment and the extra processing power wouldn't go to use too much as I don't game anymore. The Thinkpad A31 (present laptop) hates secure wireless networks for some reason and I wasn't able to wrestle it into submission. A larger concern would be hardware dying in the Desktop. It's still going strong but we're passing the 4-year mark and you can never be too sure. Besides, I'd love to catch some of the new emerging tech like multicore processors, new wireless standards (Wi-Max and draft-n, I'm looking at you), and Solid State Drives! I'd also love to be able to get something based on AMD's upcoming Fusion processors but that's still a year out and I'm not sure I'll wait that long. I like the direction they've gone with the Athlon series and feel that they're more motivated than Intel to innovate. Always have. They're still not as fantastic about supporting Open Source as Intel though and that's beginning to become a deal breaker for me. Especially considering that their Shrike mobile platform may use Broadcom wifi or something equally messy where Linux is concerned. I know I want something 12 or 13", preferably 13", with a minimum of 4 hours of battery life, a dual-core processor and a 60GB SSD.
Ideally, it'll be Shrike-based (that's waiting a year), have HDMI or Displayport out with good Linux support and draft-n or Wi-Max. Vendorwise, I'm torn between IBM/Lenovo and Dell. I've had good experiences with Thinkpads (IBM, now Lenovo) and like them but they're not the best Open Source company. Dell has been making a real push in that direction of late and have some very competitive looking offerings which I could even buy with Ubuntu pre-installed. My final three is presently a tie between the Lenovo X300, the Dell XPS M1330 and the Dell Latitude E4300. I'll be coming back to re-evaluate when I've got about $1,500 stashed away. :-)

Languages: In the near future, I'd like to get a post up revising some of my former opinions on Programming Languages. Particularly of the Scheme family. Some of my earlier ramblings now seem quite misguided. Plus I've been playing around with Common Lisp more and though I'm not quite a fan of the funcall syntax I'm starting to grok some of the reasoning for multiple namespaces. My experiences with PLaneT vs. ASDF-Install bear that out. *shivers* Collisions are ugly.

Linux Tip: Ever been frustrated trying to transfer directories with spaces in them via scp? I have. There are one or two things that seem like they should work but don't. I've been too lazy to look up how to do it until today. Here's how:
scp -r "user@host:/path/to/directory\ withspace/" .
Simple, right? Duh.
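Why the extra backslash works: scp hands the remote path to a shell on the server, which expands it a second time. A minimal local re-enactment, with `eval` playing the part of the remote shell (the /tmp path is purely for demonstration):

```shell
# Make a directory whose name contains a space:
mkdir -p "/tmp/scpdemo/dir withspace"
# The backslash survives the first (local) expansion because it sits
# inside the quotes; the second expansion then rejoins the two words
# into a single path, just as the remote shell does for scp:
remote_path="/tmp/scpdemo/dir\ withspace"
eval ls -d $remote_path
```

Drop the backslash and the second expansion splits the path into two words, which is exactly the failure you see with a naive scp invocation.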

I was going to mention how Linux Kernel Hackers make me happy and throw a few quotes from the mailing lists on here but I think this is more than enough for now. Later, folks.

An Update

posted on 2008-09-05 20:07:29

Wow. So I started a post on Friday (I think, maybe Tuesday) but I must not have saved or finished it. It's not sitting in my drafts folder. Anyway, I've been a bit up and down lately but I'm really glad Fall is starting. It may be my favorite season. I dig the cooler weather. I got my wisdom teeth out yesterday morning and that went smoother than I expected. I miss chewing food but other than that have no real complaints. I haven't needed much of the hydrocodone (basically vicodin) they gave me and haven't been sleepy or incoherent much either. In fact, I've been up since 8am today and really enjoyed catching up on e-mails and little things like that. Ah, vacations.

Newsflash: Cooking is really fun. I've cooked a little ever since moving into the house and meant to move on to more advanced dishes but really stuck to basics (pasta, pizzas, burgers + dogs, sauteed chicken) for the first month or two. Recently though I found some food blogs and have been trying to cook real meals. I started this Monday with made-from-scratch blueberry muffins, and filet mignon with mashed potatoes for dinner. It was wicked good. In the upcoming weeks I'm hoping to pick up fried chicken and empanadas. Tonight I may just try to figure out french onion soup. :-)

So, the next two months are huge. September there are three major Linux Conferences (X Developers Summit, Linux Plumbers Conference and the Kernel Development Summit) going on and a fourth in Atlanta. The one in Atlanta (Atlanta Linux Fest 2008) is on Saturday, September the 20th and I couldn't be more excited. It's from 11am-6pm over on Northside Pkwy and I signed up as soon as I heard about it. Plus I just got my wisdom teeth out. October there are lots of Linux Distribution releases (particularly Ubuntu and Fedora) and a number of awesome games coming out for PS3 including LittleBigPlanet, Fallout 3 and Bioshock. I'd really like to get the custom ArchLinux derivative I've developed over the summer out by October, too. Even if that just means putting the ISO on megauploads and creating a page for it on my site.

Beyond that, I'm just trying to get back into programming. I haven't moved as quickly as I'd like but I am having fun. I'm really tempted to try to learn about Factor, a concatenative language (like Forth) developed by Slava Pestov. It looks really cool but I can't quite afford to get sidetracked at the moment. If you're interested in what it's like trying to write a modern programming language though they've got a great blog and Slava has made some great posts on the Compiler Architecture lately. My big programming focus at the moment though is trying to do some hacking for a startup in NYC. I got contacted by one of their developers and think I can learn a lot from them though how much I'll be able to help is still up for debate. At the moment, I'm mostly writing glue code for Common Lisp libraries but I'm really enjoying it. It's also made me realize just how crucial libraries and the way they're handled by a language are. I can't believe they managed to leave module systems out of the RnRS for so long! I can also see why Common Lisp opted for multiple namespaces but I'm still not sure I like it. And I definitely just don't like the syntax for function calls after being used to Scheme. Ah, well. There is no perfect language.

Feel free to swing by the house if you want to help me learn to cook or see me looking like a chipmunk.

One Week Worth of Tricks

posted on 2008-08-03 17:18:28

In the course of migrating my webserver to a new box this week I learned two useful tricks. They may or may not prove useful for anybody else but I think they're fun so here we go.

One, SSH Tunnels. SSH Tunnels are useful if you ever need to surf the web securely and you're on public or untrusted wireless (say at Starbucks) or when a website is blocked by a firewall and you need to access it.

SSH Tunnels are actually quite easy. Assuming you've got ssh set up on the remote server and an account at that server, all you have to do is run "ssh -D 8080 user@yourserver" (substituting your own login and host). Once you're logged in, open the Preferences for your web browser. This example will use Firefox 3. Go to Edit->Preferences, then the Advanced section, the Network tab and click Settings. Click the "Manual Proxy Configuration" radio button and under SOCKS Host put "localhost" and set the port to 8080. That's it! Surf away.
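For reference, here is the shape of the tunnel command (user@yourserver is a placeholder for your own account). The `-G` flag makes ssh print the client configuration it would use without actually connecting, which is a handy way to confirm what `-D` sets up:

```shell
# The real tunnel command (needs a reachable server):
#   ssh -D 8080 user@yourserver
# -G resolves and prints the client config without connecting, so we
# can inspect the SOCKS ("dynamic") forward that -D requests:
ssh -G -D 8080 user@yourserver | grep -i dynamicforward
```

The grep should show a `dynamicforward 8080` line, confirming the local SOCKS proxy port your browser will point at.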

Two, resetting your wordpress admin account password. This is useful if you're such an idiot that you forget to change the random password that's initially on the account after you install wordpress. It assumes you have an ssh account to the server hosting the blog as well as access to the database tables for the blog. SSH into the server and run "echo -n your-new-password | md5sum". Copy that down and hang on to it. Then run "mysql -u user-with-access -p". Then run the following commands:

USE wordpress-database-name;
UPDATE wp_users SET user_pass = 'md5-you-wrote-down' WHERE user_login = 'admin';

Check to make sure it went through with "SELECT user_pass FROM wp_users;" then type "exit".
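To be explicit about the hashing step: the value WordPress of this era stores in `user_pass` is the plain, unsalted MD5 of the password. A small sketch of the pipeline (the password here is obviously a placeholder):

```shell
# md5sum prints "<hash>  -" for stdin input; cut keeps just the hash,
# which is the 32-hex-character value to paste into the UPDATE:
hash=$(echo -n 'your-new-password' | md5sum | cut -d' ' -f1)
echo "$hash"
# Alternatively, skip the copy/paste entirely and let MySQL hash it
# for you (same placeholder names as above):
#   UPDATE wp_users SET user_pass = MD5('your-new-password')
#   WHERE user_login = 'admin';
```

The `MD5()` variant avoids transcription mistakes; note that `echo -n` matters, since a trailing newline would change the hash.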

Yep. It's good stuff. That's all I've got for now. I'm hoping to post up some mostly finished SICP sections in the next few days. If I'm lucky I'll even write something intelligent about education.

Damn Servers...

posted on 2008-08-01 12:59:51

It's been a good while since I posted last. After meaning to do it for months, I've finally migrated to a new web server. I'm self-hosting so I have no one to yell at about everything taking so long but myself. At any rate, I've migrated my Desktop, Server and Laptop from Ubuntu Linux to the custom Arch Linux build I've been working on lately. Some old links in old posts are broken at the moment but I'm hoping to repair them in the coming weeks.

For now, I'm just glad this thing is back up and functioning after the week of chaos. I'm still committing code as regularly as I can and for the first time in a long time Redline Radio is live again as well. Let me know if you're interested in getting access. We will now return to the regular posting schedule. :-)

From Distro-Hopping to Good Easy

posted on 2008-07-07 17:47:20

Linux is a complicated beast. Unlike Windows and Mac there are literally hundreds of different competing versions or distributions vying for attention and often catering to a specific niche. Beginning Linux users are often waved towards the two or three most popular and general-purpose distributions, and with good reason. While three distributions in particular (Ubuntu, Fedora, Suse) seem to dominate and are good places to start, I have often experienced a desire to see what else exists within Linux since there is so much in the way of choice. Additionally, Linux distributions tend to have one release or more a year while Windows and Mac tend to see a new version only every few years. The three distributions I mentioned earlier all strive to issue a new release every six months and they all do so at roughly the same time, often with no more than a month separating them.

While this may at first seem undesirable, there is no pressure to upgrade, and there is also no cost to upgrade (remember, they're free!). Some upgrades have a few more bugs or new features than others but upgrades tend to be relatively safe and easy. Moreover, because of the regular releases large changes happen gradually and there is little to no learning curve. It's also worth noting that upgrading does not require you to reinstall the operating system. It's usually just an hour or two of downloading and a reboot.

So, "Distro Release Season" comes twice a year if you use Ubuntu, Fedora, or Suse. That's even better than Christmas! However, this release cycle rather disappointed me. Ubuntu's Hardy Heron was a bit buggier than I'd like. Fedora 9 seems better and better every time but they still lack a few software packages I want. To be honest, I've never been interested in Suse much. I'd also been meaning to move to a more stripped-down version of Linux for a long time. Ubuntu and Fedora come with a lot of bells and whistles that I may not necessarily need and that slow my system down.

It was time to try something new and, this season, I decided to go with Arch Linux. I won't go too deeply into my decision to use Arch. There are a lot of very good things about it and though it's not easy the way Ubuntu is, it's simple and worth the effort you put into it. You can make it into whatever you want it to be and that's precisely what I've done. I've spent about a week setting it up to perform as I'd like and with the programs I'd like. I've documented the entire process and will post that here as my personal "Good Easy". A good easy, for those who haven't heard of one, is a detailed description of someone's computer configuration. One reason I'd like to do one is that it's handy in case I have to duplicate it at some point in the future. It might be nice to do a Good Easy for my server at some point as well. I detail a bit at the end how to turn your installed system into a Live CD. I plan to do a little bit more work and remove personal data to turn that Live CD into something I could distribute at some point though mostly just to a few nerd friends. I wouldn't expect, or want, to take users from the wonderful Arch Linux after all.

My Good Easy...

Official Fix for MIT-Scheme in Hardy

posted on 2008-05-05 15:34:39

Bug reports work! Finlay McWalter commented on the bug report saying that the package maintainer, Chris Hanson, found a fix for the issue. His fix is far better than my workaround. Apparently, AppArmor is preventing applications (such as MIT-Scheme) from accessing lower memory. The fix is to edit /etc/sysctl.conf and change the vm.mmap_min_addr value from 65536 to 0. Afterwards, MIT-Scheme works just fine.

In other news, this has been a really disastrous Monday.

Setting up a Mercurial VCS on Ubuntu

posted on 2008-04-22 17:47:37

I've been meaning to set up a Version Control System for a long time now. VCS as a concept has been around for a while but recently a new paradigm in VCS called DVCS (for distributed version control system) has emerged. This new paradigm has cured or at least mitigated the warts of the old "centralized" version control systems and brought new benefits. The thing is there hadn't been a dominant VCS up until now and though some would argue to the contrary I would claim there still isn't. This article strives not to be concerned with the choice of a DVCS. The truth is, if you choose any of the four relatively high-profile DVCS systems that exist today (darcs, bazaar, mercurial, or git) you'll be able to migrate between any of them later without too much difficulty if you change your mind. I was initially leaning towards bazaar or git but eventually settled on (and am thus far quite happy with) mercurial. If you must know some of my reasoning I find that Dave Dribin has captured something close to my opinions.

So, the three things I'd like to cover in this article are mostly covered in two or so places on the mercurial site. I thought I'd group it all together here for simplicity and future reference. The three things I'll be covering are the initial setup including installation and repository creation, making that repository accessible via web browser through a cgi script, and then setting up some authentication to allow you to push changes over HTTP. Let's get started!

Initial Setup
Installation of the software is really straightforward on Ubuntu (see the mercurial site for other installs). Just run
sudo apt-get install mercurial and you're done. You'll have to set one thing in your configuration file before you can make your repository. Insert the following in ~/.hgrc with nano:

[ui]
username = FirstName LastName

Your next step of course will be to create a repository. For this, I assume you have some code you'd like stored. I make no claims about what will happen if you try creating an empty repository. It is also worth noting that mercurial tracks files, not directories, so it won't carry empty directories. You can work around this on Linux by dropping a hidden file in the directory like so: touch dirname/.hidden.
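A quick local illustration of the placeholder trick (scratch path for demonstration only):

```shell
# hg add records files only, so an otherwise-empty directory needs at
# least one file before mercurial will carry it; a hidden placeholder
# does the job without cluttering listings:
mkdir -p /tmp/hgdemo/emptydir
touch /tmp/hgdemo/emptydir/.hidden
ls -A /tmp/hgdemo/emptydir
```

The `ls -A` shows the `.hidden` file that plain `ls` would skip, which is all mercurial needs to track the directory.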
Once you've done that navigate to the directory you'd like the repo to be in and run the following commands:

hg init
hg add
hg commit

This should bring up nano for you to enter a commit message into describing what changes you're making. Type in a message, save and exit nano. Now you can check and see if this was successful with hg log. If there are no changes listed or exiting nano generated an error you should check your file permissions and see if entering the [ui] information in another config file (perhaps /etc/mercurial/hgrc) fixes the problem.

Setting up the Web Server
I'm assuming that you already have a working repo (naturally) and an install of apache at this point. If you don't have the apache install try running sudo apt-get install apache2. There should be a copy of the cgi script mercurial uses in /usr/share/doc/mercurial/examples/. Copy that to /var/www/cgi-hg/ and rename it index.cgi, then open it up with nano. Around the third to last line (under def make_web_app) you should see something like:

return hgweb("/path/to/your/repo", "Your Repo Description Here")

Fill in those values with your information but preserving the quotation marks. Also make sure to remove the comments (#s) before import os and import sys. Finally, be sure to run sudo chmod a+x index.cgi on the file so that Apache can execute it. You'll need to edit your apache config file so fire up nano again and open /etc/apache2/apache2.conf to insert the following:

Alias /code /var/www/cgi-hg

<Directory /var/www/cgi-hg>
    DirectoryIndex index.cgi
    AddHandler cgi-script .cgi
    Options ExecCGI
    Order allow,deny
    Allow from all
</Directory>

Finally, restart the server with sudo /etc/init.d/apache2 restart and try navigating your web browser to http://yourserver/code (the Alias we just set up).

Adding Push Support
Use nano to open your_repo/.hg/hgrc (not ~/.hgrc) again. This time we're putting in the server configuration and it should look something like this:

[web]
allow_push = your_username
push_ssl = false

Now you'll have to set up a file to hold the passwords for the users you want to allow to upload changes. To create it and add the first user and password run sudo htpasswd -c /etc/apache2/hg.pass your_username but make sure to omit the -c argument when adding new users in the future or the file will be overwritten. Then you'll need to set up apache to check this information. Open /etc/apache2/conf.d/hg.conf in nano and add the following:

<Location /code>
    AuthUserFile /etc/apache2/hg.pass
    AuthGroupFile /dev/null
    AuthName "Your Repository Name"
    AuthType Basic
    Require valid-user
</Location>

Save and exit nano. You'll need to change the permissions on the repo tree to allow apache to write to it so run sudo chown -R www-data:www-data /path/to/your/repo and then restart the apache server again. That's it! Time to test it...

To test your setup, go to another computer and use apt-get to install mercurial and set your username in the ~/.hgrc. Then run hg clone http://yourserver/code and you should see mercurial make a clone off of your server. Open one of the cloned files and make a small edit or add some new files, whatever you like. When you're done run hg add and hg commit, remembering to add commit messages. If you need to delete a file do it through hg remove filename. Finally, try pushing your changes back to the server by running hg push (with no argument it pushes back to the clone source). You should be asked for a username and password and if you give valid login information your changes should push through. Now navigate to http://yourserver/code in the browser and see if the changes show up. If they do, you're off to the races.

Topping off the day

posted on 2008-04-22 04:59:45

Okay, so today has been pretty excellent. I've really gotten a good bit done. I fixed my problem with MIT-Scheme on Hardy by just compiling a new version from source, I figured out how to get Emacs to behave like edwin when evaluating s-expressions (at least with MIT-Scheme) and I tidied up the first two SICP entries with 1.3 left for tomorrow. Finally, I installed a revision control server and moved all my code into it.

I'm really pleased about it because I've been meaning to do it forever. You may remember me babbling on about "git". That was what I initially intended to use as my RCS but I settled on mercurial and I'm quite taken with it. At any rate, to see the fruits of my labor just navigate to the new code section of my website. You can see the two different changesets I've uploaded so far and browse through the files by clicking the "manifest" button at the top, then going to books, then sicp, then chapter01. I'll do a more thorough writeup tomorrow. Hope your days were as joyful and productive.

Setting up a MoinMoin Wiki on Ubuntu

posted on 2008-04-15 17:55:23

I had a bit of trouble with one or two things setting this up at work last week so here's a report from my side of the IT world for anyone who is looking to do this.

1) Don't apt-get moinmoin.
2) You will create a specific instance for your wiki separate from the base install. This is explained below.

Okay. So, first things first. Check to see if python is installed by running this on the command line:
python -V

then wget the tar from moinmoin:

extract it:
tar zxvf moin-1.6.2.tar.gz

and then run:
sudo python setup.py install --prefix='/usr/local' --record=install.log

This will get a basic install up and running. Now you need to create a wiki instance. You can create one wherever you like but I'd put it in /var/www/YOURWIKI or whatever you'd like to call it. The moinmoin folks have even created a handy script to take most of the work out of your hands and make sure the permissions are right. Grab and read that, decide where you'd like the files to go and then:

sudo ./ /var/www/YOURWIKI
sudo mkdir /var/www/YOURWIKI/cgi-bin
sudo cp /usr/local/share/moin/server/moin.cgi /var/www/YOURWIKI/cgi-bin/.
cd /var/www
sudo chown -R www-data:www-data YOURWIKI/cgi-bin
sudo chmod -R ug+rx YOURWIKI/cgi-bin
sudo chmod -R o-rwx YOURWIKI/cgi-bin
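The two chmod lines can be combined into one pass, and it's worth knowing what mode they produce. A scratch-directory sketch (the /tmp path is illustrative, not part of the real install):

```shell
mkdir -p /tmp/wikidemo/cgi-bin
touch /tmp/wikidemo/cgi-bin/moin.cgi
chmod 644 /tmp/wikidemo/cgi-bin/moin.cgi      # a typical mode after cp
# ug+rx,o-rwx in one pass: owner and group get read+execute, everyone
# else gets nothing, which is what the www-data-owned cgi-bin needs:
chmod -R ug+rx,o-rwx /tmp/wikidemo/cgi-bin
stat -c '%a' /tmp/wikidemo/cgi-bin/moin.cgi   # prints 750
```

So after the chown to www-data, Apache can read and execute the script while other local users cannot touch it.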

Your instance is installed, now you just have to set the config files up to point to it. Add the following two lines to /etc/apache2/httpd.conf where the Alias is /moin_staticYOURVERSIONNUMBER:

Alias /moin_static162/ /usr/local/share/moin/htdocs/
ScriptAlias /YOURWIKI /var/www/YOURWIKI/cgi-bin/moin.cgi

Set the path to your wiki to /var/www/YOURWIKI in /var/www/YOURWIKI/cgi-bin/moin.cgi
Set the data dir and underlay dir to their (absolute, i.e. /var/www/YOURWIKI/WHATEVER) paths in /var/www/YOURWIKI/wikiconfig.py
Restart apache with
sudo /etc/init.d/apache2 restart

and you're done!
Of course, the next thing to look into will be setting up groups and ACLs and theming it. Happy Hacking!

MIT-Scheme is Broken in Hardy

posted on 2008-04-15 15:57:01

UPDATE: There is a fix for this posted on my blog here. I figured everyone would find that via google but all the traffic seems to be coming here instead. Hit the link.
Now, I realize both that I can use another Scheme to work on SICP and that I can use Launchpad to submit a [patch/bug report/etc] but this is still kind of frustrating. At any rate, going to the source didn't work so I can't just use Debian's unstable packages. Going back in time and using a Gutsy or Debian Etch build might do the trick though. I filed a bug report. We'll see what happens. *sigh* Maybe they're all just trying to tell me to listen to Andy Wingo. I've been meaning to play around with F9 anyway. I always do. Distro release season is so fun and it comes twice a year! More on that later.

For the curious, MIT-Scheme when run from the command line will produce the following error:
Largest address does not fit in datum field of object.
Allocate less space or re-configure without HEAP_IN_LOW_MEMORY.

Encounters with an XO

posted on 2008-04-15 03:28:56

So, I recently received my OLPC XO. After playing with it a bit I'm pleased with it but I don't think that has terribly much to do with the device itself. I didn't really buy it to support One Laptop Per Child though I think the idea of a small, comprehensible system would go a long way towards engendering a new generation of hackers the way something like the Commodore 64 or Amiga did. OLPC: Way more hardcore than your middle school's Laptop Program. It is a goal I can identify with and support but I did this because I think it's a neat piece of hardware produced by passionate people. It may not be the next Lisp Machine but it's pretty cool nonetheless.

I was ecstatic when I got the thing. Naturally, I fiddled with the initial setup a bit but quickly wanted to move on to other things, namely emacs and lisp since I'm working through SICP at the moment. It was trivial to use yum to install emacs-nox and also quite straightforward to set up quack. What surprised me was how easy it was to compile Gambit on the XO as seen on Bill Clementson's blog.

Once that was done two things really started to eat at me. 1) I wanted to try getting a Debian-based install running on the XO. 2) I wanted a different Window Manager. I just am not comfortable with Sugar for some reason and I definitely wanted a browser with tabs. Looking into getting Debian going on the XO made me realize that getting a developer key was my first priority and I'd advise anyone with an XO to do it. Then you can play with the Forth prompts at boot, etc.

As for Window Managers, I've always had a bit of a fetish for them. Of late, I've been meaning to try out a tiling window manager and I got my choices down to ratpoison (which has the most bad ass supported hardware page ever), dwm, and Xmonad. For a variety of reasons, I'm itching to try Xmonad on one of my boxes soon but that will have to get in line behind setting git up on my blog server. At any rate, Xmonad is pretty awesome and it runs on the XO. I'm not sure how much of it is Haskell Voodoo and how much of it would be beginner-friendly but I'm sure that a full-featured 1200-line Window Manager has something to teach. I'll be keeping my eye on the upcoming book. More on all that later.

I read somewhere that they'll rebase a later build on FC9. I hope it's started before F10 hits and I hope that by F10 the 'Good Haskell Support' ticket gets completed. Long story short, I ran olpc-update debian-big on the XO and found that it's not really what I'm looking for. I'll probably later get Xubuntu Hardy on a Flash Drive and then replace the Window Manager with Xmonad but until then Sugar will be fine.

So, aside from my inane banter, is the XO any good? Well, good for what? The stock configuration is good for a limited set of uses but I imagine it'd be great for kids or if, like Luke Gorrie, you're hacking Forth.

An oft overlooked ability of the XO is its SD expansion slot. If I were looking to do serious programming on it I'd slap the biggest SD card I could in there and hit the road. As long as what you're doing doesn't eat processor and RAM like crazy and you can port your tools over, it's a great travel box. Your hands will get used to the keyboard...eventually.

Disabling SCIM in Hardy

posted on 2008-03-11 17:25:24

If anyone else has been struggling/annoyed/ready-to-kill due to the SCIM program that runs by default in Ubuntu Hardy I found, thanks to this gentleman, that you can disable the application altogether by doing the following:

sudo update-alternatives --set xinput-all_ALL /etc/X11/xinit/xinput.d/none

Restart your X server and that should be it!

EDIT: I received an anonymous comment from a British locale user that said the original solution switched him back to English. A cursory google didn't turn up anything tremendously helpful/enlightening, just this, and I'm lazy tonight so I hypothesize that /etc/X11/xinit/xinput.d/default sets your locale to the setup default and /etc/X11/xinit/xinput.d/none disables SCIM altogether. Maybe I'm right. YMMV.

On Schemes…

posted on 2008-03-05 21:28:56

One thing I've been thinking about a lot lately (even if it's premature) is what scheme implementation I want to settle on. Presumably at some point I'll be developing real applications. Or at least applications that I'll want to be able to pass along to one or two friends. At that point I'll need a way to pass said applications on without asking the friends to download and install the scheme environment themselves. See, Scheme is a LISP and LISPs are interpreted languages. There are native code compilers but they're not guaranteed.

So, my wishlist for a scheme implementation was something that was fairly fast, had a good FFI because we live in an inexorably polyglot programming age, could compile native binaries to be passed on to whomever without requiring a scheme install on their part, and decent library/module support. Other interesting features would be support for concurrency, documentation and community size, the corresponding development activity, and R6RS compliance (or plans of compliance).

This already cut my options down pretty quickly. The standout option was PLT Scheme/MzScheme. Bill Clementson proclaimed it the best open source LISP and in the general case I might agree with him. Now, I am another person that disagrees with DrScheme as an editor but I wouldn't let that stop me from using it. There's no reason one can't incorporate it into emacs, after all.

However, I couldn't find a way to force PLT Scheme to produce a standalone executable instead of a launcher and a cursory google does not lend me to believe that there is any way. That's more or less a deal breaker for me. PLT Scheme has incredible momentum and fantastic module support but what's a guy to do? There may be a way in here but I'm not really looking to embed MzScheme in whatever standalone I want to produce.

As I said, the options were already limited. Ikarus is of course the closest thing to R6RS but it's still sitting at 0.0.3. It's receiving heavy attention but I wouldn't use it yet. It's more something to keep an eye on. Scheme48 and Scsh would be great for Unix scripting but I'm interested in something with a larger community and more cross-platform nature. Guile has limitations to its garbage collector that I question, it was designed as an extension language, and the community seems fairly small.

Bigloo, Gambit, and Chicken were the remaining options and of the three Gambit swayed me over. It's hard to say what factors exactly did the trick. Bigloo, Gambit, and Chicken all have FFIs to C and will generate native code. All three have active communities and decent module systems. I think what really won me over in the end were three things: Gambit had some impressive benchmarks (even though they were on the Gambit homepage), Snow looked promising as a package system, and Termite is implemented on top of Gambit with such lightweight threads that it was hard to resist. After all, I do think concurrency via lightweight threads and message passing is going to matter a lot down the road, so that was a pretty enticing bonus. As a final bonus, it runs on the OLPC.

So, now I had to figure out how to set it up. I won't keep you waiting but I should first note that I'll be installing the terminal (no-X) version of emacs because GTK emacs annoys me (aesthetically speaking).

sudo apt-get install emacs-snapshot-nox gambc

Create a desktop launcher thusly:

locate emacs-snapshot.desktop
sudo nano /your/path/to/emacs-snapshot.desktop

Change Name to Emacs Snapshot (nox)
Change Terminal to true
Change Exec to /usr/bin/emacs-snapshot-nox
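For reference, after those three edits the launcher entry should end up looking something like this (exact paths and any extra keys will vary by system):

```
[Desktop Entry]
Type=Application
Name=Emacs Snapshot (nox)
Exec=/usr/bin/emacs-snapshot-nox
Terminal=true
```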

Then install quack:

cd /usr/share/emacs/site-lisp
sudo wget

And to start out you'll want to edit your .emacs (no sudo needed; it's your own file) using

nano ~/.emacs

to look something like this:

(require 'quack)

(custom-set-variables
 ;; custom-set-variables was added by Custom.
 ;; If you edit it by hand, you could mess it up, so be careful.
 ;; Your init file should contain only one such instance.
 ;; If there is more than one, they won't work right.
 '(quack-default-program "gsi")
 '(quack-pretty-lambda-p t))
(custom-set-faces
 ;; custom-set-faces was added by Custom.
 ;; If you edit it by hand, you could mess it up, so be careful.
 ;; Your init file should contain only one such instance.
 ;; If there is more than one, they won't work right.
 )

And last but not least, a few choice commands:
C-x 1 kills all other windows
C-x C-s saves the buffer
C-x C-c exits Emacs
M-x run-scheme RET drops you into the REPL
C-d (at the REPL prompt) backs you out of nested REPL levels
C-/ is undo
M-` opens the menu bar

Snow will come next...

E-mails on Wordpress

posted on 2008-03-05 04:54:12

For those who don't know, Wordpress is the blogging software I run on my server. Whenever a comment is left, if it passes through my spam filter it goes to a moderation queue to be approved by me. This is practical for two reasons: 1) In spite of my spam filter (Akismet) being competent, it's not perfect and stuff still slips through. 2) I have low enough traffic that I can keep up with the moderation queue pretty closely.

However, until today there was a drawback to this. Wordpress is supposed to send out e-mails for various reasons, but it never worked because the program it needs to send mail (called sendmail, cleverly enough) wasn't installed by default with Ubuntu 6.06 Server (now upgraded to 8.04).

Now, sendmail is not a mail client like Outlook, Thunderbird, or Apple Mail. It's an MTA (Mail Transfer Agent) and frankly I have no desire to run an MTA on my server. That's overkill and it's just one more thing that can get hacked. So today, I did two things. 1) Set up ssmtp as an alternative that pushes everything through my gmail account. 2) Installed a plugin so I get notifications whenever a comment is in the moderation queue and the user gets an e-mail letting them know too.

Here's how I did it:

For the debian-based kids, try running:

sudo apt-get install ssmtp

After that completes, you're going to want to:

sudo nano /etc/ssmtp/ssmtp.conf

There are six values you'll want to change in here and they should look like the following:
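For a Gmail relay, the relevant ssmtp.conf values typically look something like this (these are standard ssmtp option names; substitute your own address and machine name):

```
root=yourname@gmail.com
mailhub=smtp.gmail.com:587
rewriteDomain=gmail.com
hostname=yourmachine
UseSTARTTLS=YES
FromLineOverride=YES
```

The account credentials themselves go on the sendmail_path command line below rather than in this file.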

From there, hit Ctrl-O (then Enter) to save and Ctrl-X to exit. After that you'll need to tell your PHP configuration to use ssmtp instead of the default (sendmail). So type:

sudo nano /etc/php5/apache2/php.ini

and hit Ctrl-W, then type sendmail and hit enter to search for sendmail in the document. You should find something called sendmail_path, modify it to look like:

sendmail_path = /usr/sbin/ssmtp -t -i -au username -ap password -am LOGIN

Restart your Apache server to take advantage of the changes by typing:

sudo /etc/init.d/apache2 restart

Finally, download this to your server and unzip it into /wordpress_root/wp-content/plugins and activate it in the plugins menu. That's it!


SSMTP Gmail Guide

SSMTP Guide for PHP on Debian/Ubuntu

Wordpress Plugin

Spontaneous Monday Linkpost

posted on 2008-03-03 17:52:17

Bookshelf Jealousy

I need to try OpenBox and build a trim install from the ground up again. Maybe with Gentoo this time? Or should I stick to Arch or Foresight?

I continue to hear good things about Barack Obama. Staggering amounts of good things. It's not just that Marc Andreessen is saying this; it's that everyone who's had contact with the guy is saying this. Also, he's big on civil liberties, maybe from lecturing on Constitutional Law at the University of Chicago. Hopefully that means he'll handle these fiascos a little better than the current administration. It wouldn't be too hard.

I really want to hear a good comparison of bzr and git and I'm not convinced I've heard one yet. It seems to be very "Linus made Git!" vs "Yeah, but mere mortals can use Bzr!". Please, guys, can we elevate the sophistication of this debate?

Luis Villa comes up with some great ideas and this is one of them. Also, I may finally have to try greasemonkey because adding pictures to my posts continually sounds like a better and better idea. Well, at least some of my posts. While we're on Luis though, I take RSS feed reading seriously but I don't get near 800 feeds a day. I'd be interested in hearing what he settles on.

I'm wondering if I should start contributing to Ubuntu's Weekly Newsletter. It'd be a chance to do some volunteer work for a community I do care about and I have been thinking that down the road I might like to do some freelance writing so it wouldn't be a bad way to get a feel. What can I say? Ben inspired me.

I'm glad people are thinking about the future. This article from worldchanging appears particularly promising. Anyone have any formal responses to this? I'm going to work on mine along with an update of the Secondhand Standards essay.

Also, I'm not personally a Nine Inch Nails fan but it is pretty cool that they've released their latest album as CC'd work and I kind of hope Radiohead does that with their next album...

Personally, I share sogrady's taste in laptops and while I'm not in the market right now I am wildly optimistic about grabbing one of these in a year or so off craigslist or something.

I'm trying to really get into emacs. I want to settle on an editor and really learn it. Since I'm learning Scheme for the next year or so Emacs seems like an insanely reasonable place to start. Making it pretty seems like a good idea though.

Finally, this guy is totally awesome and I hope I can come up with a project as cool as this after my self-education.

Tutorials I’ve been meaning to do…

posted on 2008-02-26 21:50:16

as part of a getting things done streak. You know? Like, learn x y per z. Anyway, aside from reading SICP and writing code, getting it posted on here, and getting this essay up, these are the other things I've been lagging on:
An Emacs Tutorial
Git Tutorial Part 1
Git Tutorial Part 2
A Much more focused collection of *nix & associated utilities sheets
A Massive Index of Cheat Sheets

Also, I'm not sure I buy it but there was some pretty optimistic news about Concentrated Solar Power today. I'd love to see more detailed plans and a price/time-to-completion estimate.

Finally, if anyone has any insights about why I'm getting a bad EIP value and a kernel panic whenever I try to transfer large files (or dozens of songs) with my server, feel free to let me know. I will buy you a (coffee/beer/etc). It seems related to this issue from an openSuse user. It could also be related to me using the 8139cp module instead of 8139too for my ethernet card. Whatever, I doubt I'll get anywhere but I'll be looking into it.

Now to grab dinner and finish that essay...

One More Reason

posted on 2008-02-15 18:44:31

I knew this day would come sooner or later. I finally ran up against a task that was perfect to use Unix Pipes for. I needed to rename 2335 MP3s and I did it in 1 minute and 40.686 seconds.

Here's the backstory:
I recently got a new MP3 player because my old one died. When I got my last MP3 player I had selected the 2000 songs I actually liked and regularly listened to out of my 17000 or 18000 and copied them to the MP3 player and my PS3. Those songs were sort of a pain to get off my PS3 onto the new player so I figured I might as well keep a separate copy of them somewhere on my desktop. Eventually I'm planning on moving them to my server as well at which point Redline Radio will get up and running again!

Anyway, I copied all the files off of my new MP3 player to my desktop to create said separate copy and found that the filename on each song had been changed to SONG_NAME.b-mtp-XXXX where the X's were the song numbers up to 2335. I'll be damned if I was going to rename all those by hand. So, I thought I'd use Unix Pipes which allow you to take the output from one command and feed it into another command. And just for fun, I timed it.

To time a command you just put the word time in front of it. So to time the change directory command you'd type, "time cd DIRECTORY". To rename all those files I would use the rename command, but it would only rename one file at a time or at best one directory. I had a few hundred directories. So, I needed a command that would find each file by searching through the directories and find is the perfect command for the job. I can call find and tell it to search for files with that weird extension and then each time it does, run rename on the file with a regular expression to change that extension to mp3. And here's how that looks:

time find ./ -type f -exec rename 's/b-mtp-[0-9]*/mp3/' {} \;

That's it. It says time the find command searching from this directory down for files (ignoring directories, etc) and when you find one rename it based on this regular expression. Done. 2000 files renamed in 2 minutes. And people wonder why I use Linux. Yes, Apple Automator does do this. But do you know why the robot icon for Automator is holding a pipe? That's right. It's an homage to Unix Pipes. You could of course do this in Apple's Terminal. That would be pretty cool. You hear me Dobbs?

Anyway, I've been thinking a bit about Alan Perlis' Programming Epigram: "A language that doesn't affect the way you think about programming, is not worth knowing." And I think there's an analogue for Operating Systems as well in all likelihood. Text editors, too (here comes the Vi vs. Emacs crowd). So, I may be using a blub operating system but I haven't soaked up all I can learn from it yet. I just need to remember to try other things too.

Just for fun, here's one more. This one outputs the 50 largest directories on my hard drive and their sizes to a text file on my desktop. du is Disk Usage; the -S option tells it not to include subdirectories' sizes in each directory's total (so I only get leaves and not branches of the filesystem tree in my results). | passes output from one command as input to the next, so | sort -nr passes the list of directories and their sizes to sort, which thanks to the -nr switch sorts them in reverse numerical order. Greatest first, right? Then | head -n50 takes that output and cuts off everything after the first 50 lines. Finally, > dumps the final output into a text file at the location given. Ta da.

du -S / | sort -nr | head -n50 > /home/redline/Desktop/50large.txt

Eleventh Friday Linux Lesson

posted on 2007-08-03 13:14:05

Hell with it. How's this for a Linux Lesson?

Linux Commands

Eighth Friday Linux Lesson

posted on 2007-07-07 01:42:00

Okay. So I was having a really hard time thinking of what to cover this week and ran this command: "history | awk '{print $2}' | awk 'BEGIN {FS="|"} {print $1}' | sort | uniq -c | sort -n | tail | sort -nr" on the command line to see what commands I use the most. Long story short, that gave me an idea for what to do this week.

This week's concept is that of process management. Process management is what happens when you hit control-alt-delete in Windows to end task on that stupid program that just won't close. The concept is analogous in Linux. Occasionally something gets so out of line that you just have to beat it over the head and tell it to go away. It's also nice to be able to tell what is really eating your system resources.

So, there are two commands for this week. "ps" and "kill". "ps" lists all active processes when run with the -ax arguments like so: "ps -ax". This gives you a nice list of all processes running and their associated pid (process id numbers). Those id numbers can be passed to "kill" to kill the associated process like so: "kill your_idnumber". It's that easy. The process should cough and wheeze and go down pretty quick and that tends to be a good feeling when it's pissed you off enough to kill it in the first place. So, there you are. That's really all I've got for this week. I'm pretty out of it and uninspired of late. If anyone has any ideas or suggestions feel free to pass them along.
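Here's the whole cycle in miniature, using sleep as a harmless stand-in for a misbehaving program (a sketch; on a real system you'd get the pid from "ps -ax" instead):

```shell
# Start a throwaway background process so we have something safe to kill.
sleep 300 &
pid=$!

# ps -p lists a single pid; the process shows up, so it's alive.
ps -p "$pid"

# Beat it over the head and tell it to go away.
kill "$pid"
wait "$pid" 2>/dev/null

# Now ps -p finds nothing.
ps -p "$pid" || echo "process $pid is gone"
```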

Seventh Friday Linux Lesson

posted on 2007-06-29 19:06:00

Okay, folks. I have a confession. My Linux Lessons are so under par. Seriously. I was looking at a site just the other day that did a much better job of presenting concepts in a sensible order and getting users familiar with the command prompt. Oh, well. There's still plenty to be done here so on I go. Today we're going to be looking at a few simple things that might go missed at the command prompt but are insanely great. If there is really a concept today it's on shortcuts and things like tab completion.

So, let's start with a few basics. If you're at a command prompt and press the Up Arrow you'll cycle through the commands you entered last. That's history. If you're midway through typing a command or a directory and you hit the Tab key, the system will try to autocomplete it for you. That is, if you're typing "cd /home/user/Desktop" and once you've typed "cd /home/user/D" you hit tab, as long as there is no other directory that starts with a capital D in "/home/user" it will finish typing "esktop" for you. This ends up being useful for all kinds of things, especially when something is several directories deep but you don't feel like typing or remembering whatever you typed earlier. Additionally, one should always remember that "." is equivalent to the directory you're currently in, as in "cd .", and ".." is equivalent to the directory one above the one you're in, such that "cd .." while in "/home/user/Desktop" would move you to "/home/user". Finally, how about a command? Typing "pwd" will print the present working directory to let you know where you are in the system. This, in conjunction with tools like "ls" and "cd", will help you navigate the filesystem in a sensible manner.
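A quick sandbox run of those navigation commands, using a throwaway directory under /tmp so nothing important gets touched:

```shell
# Build a nested directory to wander around in.
mkdir -p /tmp/lesson/inner
cd /tmp/lesson/inner

pwd      # prints /tmp/lesson/inner

cd ..    # ".." takes you up one level
pwd      # prints /tmp/lesson

cd .     # "." is the directory you're already in, so nothing moves
pwd      # still prints /tmp/lesson
```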

Finally, I'm going to talk about "|" today. That's shift-backslash. The wonderful thing about "|" is that it feeds the output of one command into a second command. For example, "ls /home/user/Desktop | wc -l" would use "wc", the word count program, to count the number of lines in the listing of "/home/user/Desktop". You end up putting two commands together to get a count of the number of files in that directory. You could also use something like "grep" to find a file, like so: "ls /home/user/Desktop | grep '\.doc'", and list all your documents (quote the pattern, or the shell will expand *.doc itself before grep ever sees it). This simple combination allows for some fantastically complex things which I'll be exploring in the coming weeks. For now, I hope you enjoyed the Friday Linux Lesson and (if I'm lucky) the next Linux Lesson will be found here and also on "". Have a good one folks.
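Here are both pipelines run in a sandbox directory, with the grep pattern quoted so the shell passes it through untouched:

```shell
# Make a directory with three files: two documents and a photo.
mkdir -p /tmp/pipes-demo
cd /tmp/pipes-demo
touch report.doc notes.doc photo.jpg

# When piped, ls prints one name per line, so wc -l counts the files.
ls | wc -l            # prints 3

# grep keeps only the lines matching the pattern.
ls | grep '\.doc$'    # prints notes.doc and report.doc
```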

Sixth Friday Linux Lesson

posted on 2007-06-23 02:36:00

So, picking up from last week we need to know how to look up more detailed information about various commands and how to change permissions on things.

Today's concept is self-help. I won't be recommending books like The Power Of Now or trying to sell you anything. I will explain the linux man command and why self-help is the most important aspect of linux or even general computer usage in my opinion. Self-help means knowing how to go to google and search for a problem to look for a solution. Self-help means being able to think about the computer as something that does whatever you tell it to, including breaking.

The command of the day is the "man" command. Man stands for manual, and when you type "man command" where command is another Linux command (e.g. "man sudo") you pull up the "manpages" for that respective command. So, if we look at last week's problem: we typed ls -l to get permissions information on the contents of a directory but couldn't decipher what all of the output meant. If we type "man ls" we learn that -l is a long list option, but it doesn't tell us much about what that really means. It does note at the bottom that "info ls" will generate more information. At this point, you type "q" to exit the manpage and then run "info ls" at the command prompt. In this case, the info page is identical to the manpage, but now you're done. Besides, you didn't care much what all the obscure things meant anyway. And that's how easy self-help is. Note that not all problems really need to be solved. The next logical step with this though would have been what? Google, of course. And next week we'll dive into some general terminal tricks and the "|" operator. Get ready to have fun.

Random Midweek Thoughts

posted on 2007-06-20 19:47:00

I wish I had included John Mayer - New Deep and Guster - Amsterdam in the songs of summer. They'll probably be in next week's. I also am really getting back into Amon Tobin but his jazz/samba/drum'n'bass/insanity isn't really any particular season.

Lessig made a big announcement this week. I'll cover it in the monday update. He's smart and awesome.

In response to Justin last night: Linux has as many problems as you want it to. Get your hands dirty.

Speaking of, this week's linux lesson is looking like it should be fantastic. Max, get ready. Expect to learn about <tab> at the command prompt, more permissions, man pages, and the "|" key (that's shift backslash). It'll let you do fun things like "ls -l | wc -l" and "ps -ax | grep "yourmom"" and such.

Currently setting up subversion box and building jetty\terracotta cluster. More on that in t3h future. This week is turning out pretty nice.

Also, I was at the command prompt and ran "locate yourmom" and got nothing? So, where is she?


I got yer jetty server right here!

posted on 2007-06-19 15:50:00

Jetty Install Process:
Install JDK 6u1 from Sun:
Grab the bin file and run:
sudo chmod +x *.bin
sudo sh ./jdk*.bin
sudo mv jdk1.6.0_01 Java6u1
sudo mv Java6u1 /usr
sudo update-alternatives --install /usr/bin/java java /usr/Java6u1/bin/java 300
sudo update-alternatives --config java
Select whichever number corresponds to /usr/Java6u1/bin/java

Grab latest jetty from website and run:
sudo mkdir /opt/jetty
sudo chown $USER /opt/jetty
Unzip to /opt/jetty
Throw timekeeper in /opt/jetty/webapps via sudo cp -R timekeeper /opt/jetty/webapps
sudo chown -R jetty /opt/jetty
sudo chmod -R ugo+rw /opt/jetty
sudo cp /opt/jetty/bin/ /etc/init.d/jetty
sudo touch /etc/init.d/jetty
set JETTY_HOME=/opt/jetty and JAVA_HOME=/usr/Java6u1 in /opt/jetty/bin/
set Log location in /opt/jetty/etc/jetty.xml to /opt/jetty/logs

Get Jetty to accept connections on port 80:
sudo /sbin/iptables -t nat -I PREROUTING -p tcp --dport 80 -j REDIRECT --to-port 8080

Get Jetty to run on bootup:
sudo ln -s /etc/init.d/jetty /etc/rc2.d/S86jetty

Load Balancing Modifications:
Research ongoing.

Fifth Linux Lesson; First Saturday, hopefully only

posted on 2007-06-16 20:39:00

Good afternoon netizens. Fear not, in spite of great adversity and personal difficulty I am here to bring you this week's Linux Lesson. It occurred to me that I was so foolish in our inaugural Linux Lesson that I taught you only the cd command and not the ls command. This is incredibly silly. These two should almost always be known in tandem. So with that in mind here we go.

Today's concept is finding files and getting file information via the terminal. Obviously, we've discussed things like permissions and file management on here before. However, rather than guessing where a file is or what permissions it has wouldn't you like to know? That's what we hope to cover today.

So, command number one: "ls", which stands for list. Clever, right? Maybe not, but this handy little devil will solve most of your file searching and permission checking problems. For example, say you're looking for the "xorg.conf" file we mentioned a few weeks ago. If you think it's in /etc/X11 then you can do "cd /etc/X11" and then type "ls", or you could just type "ls /etc/X11". Additionally, if you want to be shown hidden files you can append -a like so: "ls -a /etc/X11". To be shown permissions for files you can append -l like so: "ls -l /etc/X11", or both in any order: "ls -la /etc/X11". There are other options you can use but those are the two I find useful. When you check permissions with -l the output might look a little crazy to you. Probably something like this:
-rw-r--r-- 1 redline redline 1751 2007-02-04 15:21 aptrepository.asc
drwxr-xr-x 8 redline redline 4096 2007-06-14 19:01 Desktop
drwxr-xr-x 5 redline redline 4096 2007-04-17 22:48 doom3
lrwxrwxrwx 1 redline redline 36 2007-04-17 22:48 doom3-dedicated -> /home/redline/doom3//doom3-dedicated
-rw-r--r-- 1 redline redline 21145838 2007-04-17 22:41
lrwxrwxrwx 1 redline redline 26 2007-02-04 11:38 Examples -> /usr/share/example-content
-rw-r--r-- 1 redline redline 1300 2007-03-17 02:04 file.txt
-rw-r--r-- 1 redline redline 1209 2007-03-17 02:03 file.txt~
-rw-r--r-- 1 redline redline 20017 2007-02-22 20:17 hs_err_pid5956.log
-rw-r--r-- 1 redline redline 0 2007-05-16 17:31 logfile.txt
-rw-r--r-- 1 redline redline 151552 2007-06-03 22:58 nautilus-debug-log.txt
drwxr-xr-x 3 redline redline 4096 2007-02-21 19:18 Photos
-rw-r--r-- 1 redline redline 44628 2007-02-27 08:48 python-gtkglext1_1.1.0-2feisty_i386.deb
drwxr-xr-x 2 redline redline 4096 2007-04-04 19:35 scripts
lrwxrwxrwx 1 redline redline 36 2007-04-12 09:11 TransGaming_Drive -> /home/redline/.cedega/Doom 3/c_drive
-rw-r--r-- 1 root root 684455936 2007-06-13 00:27 ubuntu-custom-live.iso
-rw-r--r-- 1 redline redline 3004403712 2007-05-31 20:21 windows.img
drwxr-xr-x 3 redline redline 4096 2007-04-22 00:57 workspace
-rw-r--r-- 1 redline redline 5368709120 2007-06-08 17:53 xubuntu-test.img
The permissions are found on the far left, followed by the link count, the owner and group, the file size, the date last modified, and the file name. Some of the files you'll notice have a -> and then another file listed. This is because they are something called a symlink, which is effectively the same thing as a Windows shortcut. As for the permissions on the far left, I admit they look a little weird. Let me explain it this way: permissions are doled out separately on Unix-based systems to the owner, the group, and everyone else. Someone owns the file and, generally speaking, the owner has the right to choose who has access to it and whether they have read-only or read-write access, as well as whether or not the file is executable. Groups are collections of users which share a given set of rights. Generally, you don't have to worry about groups. On your own computer you'll either own something or not. If you don't own it that means root (the superuser account) probably owns it, in which case sudo comes in handy. For our purposes, and most beginning users' purposes, the only thing that matters is that you own the file and have read or write access to it. That can be seen in the permissions on the far left. The r stands for read, w for write, and x for executable. The first three are for the owner, the second three are for the group, and the third three are for everyone else. We'll pick up here next week with some more permissions commands and a command to resolve the questions I've opened about permissions towards the end of this lesson. See you then.
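To make those triplets concrete, here's a tiny experiment with chmod, a command we'll meet properly in a later lesson. The digits 640 mean rw- for the owner, r-- for the group, and --- for everyone else:

```shell
cd /tmp
touch perms-demo.txt

# 6 = read+write, 4 = read, 0 = nothing: owner, group, others in that order.
chmod 640 perms-demo.txt

# The first column of ls -l shows the three triplets: -rw-r-----
ls -l perms-demo.txt
```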

Fourth Friday Linux Lesson

posted on 2007-06-08 22:38:00

So, I've been thinking about how I could improve these lessons, and if anyone has ideas they should let me know (max, that's you :-). Anyway, I was reading Full Circle Magazine earlier (it's an online rag all about Ubuntu) and they had a section on the directory structure in Linux which would definitely be useful for someone who isn't used to Linux. Thinking about this, I realized that requiring one application or command to go with one concept is an artificial and possibly bad constraint to place on these articles. I mean, file management involves (at least at the command line level) a minimum of two or three utilities and as many as you like, really. One, cd to navigate the file system, and two, rm to delete stuff. Potentially there are also commands like locate\find for searching, nano\vim for editing, apt-get\yum for installing\removing software, mkdir and rmdir to create and delete directories, and more still. Now, even though that list may appear intimidating it's really not, and it's far from comprehensive as well. It is, however, really nice to just use whatever makes your life easier and ignore the rest. Anyway, let me know what you would be interested in hearing about next week cause I'd really like to know.

The concept for this week is file management and\or the Linux directory structure. The base of the Linux directory structure, the mother ship from which all other things spring, is "/". Under that you have an assortment of "/bin", "/boot", "/dev", "/etc", "/home", "/mnt", "/media", "/opt", "/root", "/tmp", "/usr", and "/var".
"/bin" is where a lot of your basic command line utilities go, so when you type a command in the terminal it starts by looking here. It's sort of like Program Files but it basically just holds executables or "binaries", hence bin.
"/boot" holds the information the bootloader needs to get the system up and the configuration files for said bootloader.
"/dev" holds all your devices. Literally. Your cdrom drives, hard drives, usb drives, audio and video cards, everything, has a "/dev" entry. The operating system talks to your stuff through this directory; it maps out where everything is for your system.
"/etc" holds a ton of configuration files, which is nice, though not everyone stores all their config data there. I guess that's a good thing. There are a few central documents of importance there you may find yourself playing with. More on that later.
"/home" is just what it sounds like. That's where your user directories, desktop, and most of your other stuff is found.
"/mnt" is a place where drives get "mounted" so you can read and\or write to them. Some distributions use "/media" instead of "/mnt" but they're essentially interchangeable. They are where you (not the operating system) go to check out CDs and USB drives and iPods and such. They symlink (which is sort of like a precursor to hyperlinks on webpages) to the "/dev" entries for whatever you want to play with.
"/opt" contains optional stuff you might install, but not everything you do install. It's a grab bag. Mine's pretty much empty. It kind of has what you put in it.
"/root" is the root user's version of "/home". It's normally pretty sparse, and remember root is god, so you don't want to play around in there. Root might get pissed at you.
"/tmp" is just what it sounds like: temporary files and stuff.
"/usr" is the real Program Files. This is where a ton of your shit ends up going, and you'll notice it has subdirectories like "bin", "docs", etc.
Finally, there's "/var", which I mainly find myself going into to peek around the "log" subdirectory. That's a pretty decent overview of the Linux filesystem.

The utilities to keep in mind for file management, for me, have been "mkdir", "rmdir", "rm", "cp" and "mv". That's really all I use. mkdir and rmdir are two sides of the same coin: mkdir makes directories and rmdir removes them but, interestingly enough, only if they're empty. The syntax looks like this: "mkdir mp3s" and "rmdir mp3s". Since you'll want to put your mp3s there, you can use cp to copy them over. It'd be easier to use mv but run with me on this. Say all your mp3s are in "/home/your_username/downloads/music". You could use "cp /home/your_username/downloads/music/*.mp3 mp3s" and it would copy all your files with the .mp3 extension (*.mp3) to the mp3s directory. Then you could do "cd /home/your_username/downloads/music" and "rm *.mp3". That's not very efficient though, is it? It'd be easier to just blow away the directory. Well, you can, with rm -R. "rm -R /home/your_username/downloads/music" removes files recursively starting at /home/your_username/downloads/music. That is, it deletes everything in the directory as well as the directory itself. Interestingly, you can rename things with the mv command, and that might be the easiest way to do this. For example, "mv /home/your_username/downloads/music /home/your_username/downloads/mp3s" then "mv /home/your_username/downloads/mp3s /wherever/you/want/mp3s". Generally I do file management from the file browser. I still like a GUI for it, but if you're running into permissions errors it helps to come down and "sudo rm -R" or cp or mv something. Of course, if you don't know what file is giving you permissions errors you might not want to do that. Anyway, this hasn't been explained beautifully, so let me know if this makes sense and\or if you have questions.
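Put together, the whole dance from the paragraph above looks like this in a sandbox under /tmp (the paths and filenames are stand-ins for your real music directory):

```shell
# Recreate the example layout: a music folder with mp3s and one stray jpg.
mkdir -p /tmp/fm-demo/downloads/music
cd /tmp/fm-demo
touch downloads/music/one.mp3 downloads/music/two.mp3 downloads/music/cover.jpg

# Copy just the files matching *.mp3 into a fresh directory...
mkdir mp3s
cp downloads/music/*.mp3 mp3s/

# ...then blow away the old tree, contents and all.
rm -R downloads/music

ls mp3s    # prints one.mp3 and two.mp3
```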

Jetty, Upstart, and Hell

posted on 2007-06-07 20:15:00

Hi lazyweb, I've been trying to get an install of jetty on ubuntu that will load during boot up and I've run update-rc.d jetty defaults but it's not working. What am I doing wrong?
Jetty Install Process:
Install JDK 6u1 from Sun:
Grab the bin file and run:
sudo chmod +x *.bin
sudo sh ./jdk*.bin
sudo mv jdk1.6.0_01 Java6u1
sudo mv Java6u1 /usr/lib
sudo update-alternatives --install /usr/bin/java java /usr/lib/Java6u1/bin/java 300
sudo update-alternatives --config java
Select whichever number corresponds to /usr/lib/Java6u1/bin/java

Grab latest jetty from website and run:
sudo mkdir /opt/jetty
sudo chown $USER /opt/jetty
Unzip to /opt/jetty
Throw timekeeper in /opt/jetty/webapps via sudo cp -R timekeeper /opt/jetty/webapps
sudo chown -R jetty /opt/jetty
sudo chmod -R ugo+rw /opt/jetty
sudo cp /opt/jetty/bin/ /etc/init.d/jetty
sudo touch /etc/init.d/jetty
set JETTY_HOME=/opt/jetty and JAVA_HOME=/usr/lib/Java6u1 in /opt/jetty/bin/
set Log location in /opt/jetty/etc/jetty.xml to /opt/jetty/logs
sudo ln -s /opt/jetty/bin/ /etc/init.d/jetty
sudo update-rc.d jetty defaults



Guides to Hacks

posted on 2007-06-05 16:15:00

Setting up VNC with GDM Login with steps to edit gdm.conf from here
Installing Tomcat 6 on Feisty
Configuring Tomcat 6 on Feisty

And a conky config for good measure:
background no
update_interval 1.0
double_buffer yes

use_xft yes
xftfont Purisa:size=07.5
xftalpha 1.0

own_window no
own_window_transparent yes
own_window_type override
own_window_hints undecorated,below,sticky,skip_taskbar,skip_pager
#on_bottom yes
#on_top yes

minimum_size 300 50
draw_shades no
draw_outline no
draw_borders yes
draw_graph_borders no
stippled_borders 0
border_margin 3
border_width 0

default_color white
default_shade_color black
default_outline_color black

alignment top_right
gap_x 57
#gap_y 34
gap_y 10

no_buffers yes

${color black}$nodename - $sysname $kernel on $machine Uptime:${color blue} $uptime ${color black}- CPU Usage:${color red} $cpu% ${color black}RAM Usage:${color blue}$mem/$memmax - $memperc% ${color black}Down:${color blue}${downspeed eth0} k ${color black}Up:${color blue} ${upspeed eth0} k ${color black}Swap Usage:${color blue} $swap/$swapmax - $swapperc% ${color black} Disk Usage: ${color blue} ${fs_used /bin/bash}/${fs_size /bin/bash} - ${fs_free_perc /bin/bash}% Free

Third Friday Linux Lesson

posted on 2007-06-02 07:23:00

Happy Friday my fellow netizens! I've been meaning to write a piece about Linux on the Desktop and perhaps it just may surface this weekend. If anything though it has become more of a Linux and Mac perspective piece for me. I don't want to say anything just yet but I will say that it is more about what each has to offer than who's going to win on some higher level. That said, we've got some Linux Learning to do.

This time I'll be focusing on the concept of configuration files. Configuration files are more useful than you might think. It may seem that any configuration worth doing should be done through nice pretty graphical toolkits, but text file configuration can be incredibly helpful, especially when the system won't boot and there's one little edit that would fix it. Windows and Mac can get you to a command prompt to fix things too, it's just a bit more of a pain in the ass compared to Linux. Then again, maybe this situation comes up more in Linux. Okay, I promise to save the rest for the Mac/Linux article. So, now that we've covered how to move around the filesystem with cd and how to act as the superuser with sudo, we need a practical task to use those commands on. I've got just the thing. Say you install some nice new graphics drivers...but then your system won't boot into the GUI after the command prompt. Odds are good that those new drivers aren't working right. Wouldn't it be nice if you could just use the old ones? Well, you can. That's why text configuration files are useful. So, let's see how to solve this state of affairs with this week's application: nano.

The configuration file that tells the computer what graphics driver to use is stored on your system at "/etc/X11/xorg.conf". Open a terminal (as covered in our first lesson) and cd to the directory that holds the configuration file, "cd /etc/X11". Now, to edit it with our text editor (nano) type "nano xorg.conf". It will probably spit back some error about you not having adequate permissions to touch that file and go to hell. That's why we have sudo. Type "sudo nano xorg.conf" to tell the machine to open xorg.conf, with nano, as the superuser. It'll ask for your password and if you enter it correctly nano will start right up. Alternatively, you could have skipped the whole cd step and just given the full path to the file at the start, "sudo nano /etc/X11/xorg.conf", but I figured I'd employ as much of what we've covered as possible. Now you'll see the file in the editor, which won't remind you much of Microsoft Word by the way. Look at the bottom of the screen where it says "^W Where Is". This is a shortcut; the ^ represents the Control key. Hit Ctrl-W and nano will ask what you're looking for. Type (Section "Device") (without the parentheses) and hit enter. Notice how the list of shortcuts at the bottom changed. Under that there should be a line that looks something like (Driver "some_driver"). Erase the some_driver part between the quotation marks and type vesa to load a basic graphics driver and get your system running again. It should look like this, (Driver "vesa"), again without the parentheses. Look at the shortcuts again. The one for saving (here referred to as "writing out") is ^O, or Ctrl-O. Hit Ctrl-O and then enter when it asks for the filename, to save it under the same name. Now reboot and you're all done but...you don't know how to reboot, do you? Ah well, I guess it'll have to wait till next week, won't it?
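For reference, the whole driver swap can also be done non-interactively from the terminal. This is just a sketch of the same edit using sed instead of nano; the backup filename is my own choice, and it assumes your config lives at the standard path:

```shell
# Keep a backup of the working config before touching it.
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.bak

# Replace whatever driver is currently listed on the Driver line
# with the basic "vesa" driver.
sudo sed -i 's/^\(\s*Driver\s*\)"[^"]*"/\1"vesa"/' /etc/X11/xorg.conf

# Confirm the change took effect.
grep -i 'Driver' /etc/X11/xorg.conf
```

Either way, nano or sed, you end up with the same one-line change.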

Second Friday Linux Lesson

posted on 2007-05-25 19:59:00

So, I'm running a little short on time here and cutting things a bit close but I've been thinking it out and we've got some great stuff to cover. Additionally, there's more great news on the Linux front that I'll be covering in depth before Monday but that's for later.

For my second Linux Lesson we're covering what I think is one of the more important concept-application distinctions between *nix systems (Mac included) and Windows. The concept of the hour: permissions, also known as user access control and numerous other things. I should also note that permissions are somewhat tied to the concept of a superuser, so I'll brush over that as well.

Permissions are, in effect, the operating system restricting the user to only the things they need to do. The idea here is to keep from handing the common computer user (who may not be a nerd) a Desert Eagle or shotgun with which they might blow off their own foot. So, the user has permission to mess with things like their /home/username directory (which in *nix systems is analogous to the My Documents, My this, My that of Windows). Pre-Vista, Windows effectively lacks a permissions structure by default. I'll probably write more on this later as a concept called Sensible Defaults, but for now just remember that Linux wants to help you not hurt yourself by keeping your ability to do bad things to a minimum.

Now for the command: sudo. Occasionally, you may find that you need to do something that you don't have permission to do. This doesn't in itself mean that what you want to do is a bad idea. Just use common sense. If you're deleting files you don't recognize, maybe it's a bad idea. If it's installing an application you want, it's probably okay. It just depends. If you open the terminal, covered in our last lesson, and type "sudo other_command_you_want_to_do" then the operating system will prompt you for your password and let you run that one command as the superuser (which is simply a user with no permission restrictions, i.e. the ability to do anything). If you successfully enter your password (which shouldn't be too tricky as long as you remember it) it will let you run the one command you didn't have permissions for, one time, and that's it. That single command, in conjunction with all the other commands you'll end up using it with, will make your life easier over and over again. That's it for this week.
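You can watch permissions at work right in the terminal. This little sketch (the file name is made up) creates a file that only its owner can touch, then shows the permission string that ls reports:

```shell
# Create a file and restrict it so only the owner can read or write it.
touch secret.txt
chmod 600 secret.txt

# The leading "-rw-------" in the listing means: owner can read/write,
# group and everyone else can do nothing.
ls -l secret.txt

# A command that needs superuser rights just gets the sudo prefix,
# for example (don't run this, it's only the general form):
# sudo some_command_you_lack_permission_for
```

Other users on the machine would need sudo to read secret.txt; you, as the owner, never do.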

First Friday Linux Lesson

posted on 2007-05-19 02:27:00

This series is going to be meatier than the other four and also more diverse. Since people who use computers on a regular basis generally know how to do the things they need to do, I'm going to be teaching things that most people may or may not use but which should often be new to them. At least initially, I'll be starting with a concept, then moving on to a useful command, application, hotkey/shortcut, hack/tip, code snippet/language feature, etc. Also, these "lessons" will be Linux-centric because Linux qualifies as new knowledge for many people and because I love it so damn much. I don't think I could interest myself in writing Windows tutorials every week. This series will generally assume, where applicable, that the reader is using a GNOME-based distribution (I like Ubuntu, but Fedora and Arch are nice too).

For the inaugural post we're starting off with the only sensible thing:
(Concept) The Terminal, also known as the command prompt or the ominous opaque cause of digital doom.

The Terminal, more familiar to many as the command prompt, can be found in Linux by hitting Alt-F2 and typing "gnome-terminal" then pressing enter or by going to the Applications Menu, Accessories (or System Tools if you're using Fedora), then Terminal. The terminal need not be dangerous or painful to use. In fact, sometimes I find that it's vastly preferable to something with a GUI.

We're also starting off with the only sensible command: "cd".
cd is the change directory command. It allows you to (unsurprisingly) change the directory you're in, which in turn lets you navigate the file system to accomplish various tasks. cd can be used in one of two ways: to enter a folder inside the folder you're in, or to enter a specific folder anywhere on the system. Let's see how that looks. If you're in a folder, say your user's Home folder, and there is a subfolder in that folder called Desktop, you would navigate to it like this, "cd Desktop", or like this, "cd /home/USERNAME/Desktop". The only thing to be mindful of is that paths are case sensitive. That's all you need to get around the Linux filesystem.
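Here's a concrete run-through of both forms. It uses /tmp so it works for any user; the "demo" and "Desktop" folder names are just examples:

```shell
# Make a folder with a subfolder so we have somewhere to go.
mkdir -p /tmp/demo/Desktop

# Absolute path: starts with / and works no matter where you are.
cd /tmp/demo

# Relative path: names a subfolder of wherever you currently are.
cd Desktop

# pwd ("print working directory") shows where you ended up:
pwd    # prints /tmp/demo/Desktop
```

The pwd command at the end is a handy companion to cd whenever you lose track of where you are.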

Of Revolutions, or sometimes I think I should just act as a news feed/aggregator

posted on 2007-05-04 00:17:00

May 1st was a watershed day. It was a big day for digital revolutions for two reasons. One, Dell announced they would be selling machines with Linux pre-installed. Two, HD-DVD encryption was broken and when media companies tried to censor this fact the web denizens responded in a massive virtual riot. The tenuous connection between those two things is that they both demonstrate the growing power of networks over hierarchies where structures of organization or authority are concerned. I realize this may seem a ridiculous or unsubstantiated claim and if you want I'll argue it with you personally or in the comments. To start though I just thought I'd post a few links.
First, a link to the official dell announcement: Ubuntu on Dell. Yay.
Second, a nice visualization of the extent of the HD-DVD rioting: 900 thousand google results when I searched regarding the riot.
Third, news stories about the HD-DVD rioting: from Forbes, and the New York Times, twice. I'm sure there are others.
Finally, a few different views and examples of the protest: Youtube, IPv6 addresses, an Image Puzzle, a Song, a flickr search and our new celebrity, of course, has its own website. Two, in fact.

While I feel these produce a pretty good pastiche of May 1st's two events and their significance, it may not be fully evident. If that's the case, let me know in the comments or contact me and I'll try to explain it in a short but thorough post later. I may just do it anyway...

Update: Added the flickr search link. Very cool. Also wrote that second piece but it didn't end up being so short.

The Big Day

posted on 2007-05-01 12:57:00

It finally happened. The first big vendor flipped over. OEM Linux here we come.
Dell will be selling computers with Ubuntu Linux preloaded, and supporting them fully, as of May 2007.
Quoth Linus:

Unless otherwise credited all material Creative Commons License by Brit Butler