from internet import *
Three posts that caught my eye today.
Ruby School
Gregory Brown over on the O'Reilly Network has an article about using Ruby in Computer Science courses, at least in the later algorithms classes. It's not a bad argument, but I think it'd be more convincing if the Ruby example were a little cleaner and easier to read than the pseudo-code.
Let's see... The last time I had to care about this issue was about eight years ago when my grad institution was going through a somewhat controversial revamp of the CS curriculum. The fight, as always, is between the theorists and the pragmatists. The theorists want to teach a lot of "pure" CS up front -- Turing machines, big "O" analysis, computational theory, that kind of thing. The pragmatists want the students to be able to get jobs.
You should know that I spent the better part of three years in a group teaching an object-oriented programming course in Squeak Smalltalk. Lovely language, to be sure, but we had to spend part of every course explaining to some nervous students why we weren't teaching them C++ or Java...
At the time, the initial CS classes were moving to Java, with some relief. That was because a) nobody wanted to inflict C or C++ on unsuspecting new CS majors, and b) the previous most common language, Pascal, was woefully obsolete. Java is reasonably straightforward to teach and is actually used in real programs, both points in its favor.
Personally, I think you can make a pretty nice case for a scripting language like Python or Ruby in the initial CS class. They're both pretty easy to get started with, and the syntax is clean enough that algorithms are easy to visualize (which was Brown's original point). In Python you can do it without introducing objects (which most CS1 classes didn't do eight years ago; I don't know if that's changed), and in Ruby it's easy to teach meta-programming.
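To make the readability point concrete, here's the sort of thing I mean: a first-semester algorithm in plain Python, no objects anywhere, reading about as close to textbook pseudo-code as you could want (a toy example of my own, not one from Brown's article).

# Binary search in plain Python: no classes, no objects, and it reads
# almost line-for-line like the pseudo-code in an algorithms textbook.
def binary_search(items, target):
    """Return the index of target in the sorted list items, or None."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # prints 3
print(binary_search([2, 3, 5, 7, 11, 13], 4))   # prints None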
Cha-Ching
Paul Julius of ThoughtWorks writes about how CruiseControl can save you $12,535 per broken test. The money comes from the difference between the cost of fixing a bug immediately and the cost of not catching it until integration testing.
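The post doesn't spell out its inputs, but the shape of the arithmetic is presumably something like this (the numbers below are placeholder guesses of mine, not Julius's figures):

# Back-of-the-envelope shape of the savings-per-bug claim. All three
# inputs are hypothetical placeholders, not the data behind $12,535.
hours_to_fix_immediately = 1      # CI flags the bug right after the commit
hours_to_fix_at_integration = 40  # same bug found weeks later, plus re-testing
loaded_hourly_cost = 100          # fully loaded cost of a developer-hour

savings_per_bug = (hours_to_fix_at_integration - hours_to_fix_immediately) * loaded_hourly_cost
print(f"Claimed savings per bug: ${savings_per_bug}")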
I dunno. I love continuous integration, and would shout about it from the rooftops if they'd let me on the roof, but that number still sounds a bit more like "look what I can do with numbers" than "look what I can do with Continuous Integration". But then, I'm skeptical of nearly every numerical analysis of programming productivity.
Plus, Marshmallows
Over at Roughly Drafted, Daniel Eran goes on about the smooth, harmonious relationship between Apple and Sun. Naturally, I want to talk about one of his sidebars...
Apple's Mac OS X frameworks were even named Cocoa in part to associate them with Sun's Java. The other reason was that Apple already owned the Cocoa trademark, having used it earlier for a children's programming environment.
You know, I've always wondered about that. The original Cocoa was a project being worked on in Apple's Advanced Technology Group the summer I interned there, and it had some buzz in Educational Technology circles for a while. Internally, it was called KidSim, but the name was changed to Cocoa when it was being prepared for release. Java was programming for grown-ups, so Cocoa was programming for kids. It seems like Apple isn't really using that connotation of the name anymore.
The project (now called Stagecast Creator) is a graphical rule-based programming language, something like a cellular automaton. The user specifies an initial arrangement of sprites on the screen, then specifies how that arrangement should change in the next time slice. Complex programs could be created with almost no typing (although, like all such programs, you still had to use drawing tools to create your own sprites -- that part was still hard). Stagecast still seems to be around, although it's been ages since I tried the software. It was pretty cool, though.
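If you've never seen it, the flavor is easy to sketch: a program is just a pile of before/after rules, and every tick the system rewrites whatever part of the world matches a rule. Here's a rough imitation of the idea in a few lines of Python -- my own toy sketch, not anything like Stagecast's actual engine:

# A rough sketch of a KidSim/Stagecast-style rule system: a rule is a
# "before" pattern and an "after" pattern, applied wherever it matches
# each tick. The one rule here: a bug with an empty cell to its right
# moves one cell right; anything else (like a rock) blocks it.
EMPTY, ROCK, BUG = ".", "#", "b"

def step(row):
    new_row = list(row)
    i = 0
    while i < len(row) - 1:
        if row[i] == BUG and row[i + 1] == EMPTY:
            new_row[i], new_row[i + 1] = EMPTY, BUG
            i += 2          # a sprite that just moved doesn't move again this tick
        else:
            i += 1
    return new_row

world = list("b..#.b..")
for tick in range(4):
    print("".join(world))
    world = step(world)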