

“Every once in a while I run across a situation that just amazes me. While this topic is not strictly about software development, it is about the business of software.”

… there are many interesting parts here:
- The HTML5 Canvas getImageData API is used to get at the pixel data from the CAPTCHA image. Canvas gives you the ability to draw an image onto a canvas, from which you can later read the pixel data back out.
- The script includes an implementation of a neural network, written in pure JavaScript.
- The pixel data, extracted from the image using Canvas, is fed into the neural network in an attempt to divine the exact characters being used, a sort of crude form of Optical Character Recognition (OCR); a sketch of the whole pipeline follows this list.
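
A minimal sketch of that pipeline in JavaScript might look like the following. The `captcha` element id, the `network` object, and its `forward()` method are hypothetical stand-ins; the actual script’s neural network isn’t reproduced here:

```javascript
// Draw the CAPTCHA image onto a canvas so its pixels become readable.
// (Assumes an <img id="captcha"> on the page; getImageData only works
// on same-origin images, since cross-origin images taint the canvas.)
const img = document.getElementById('captcha');
const canvas = document.createElement('canvas');
canvas.width = img.width;
canvas.height = img.height;
const ctx = canvas.getContext('2d');
ctx.drawImage(img, 0, 0);

// getImageData returns RGBA bytes, four per pixel.
const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);

// Collapse RGBA to grayscale values in [0, 1] as network input.
const pixels = [];
for (let i = 0; i < data.length; i += 4) {
  const gray = (data[i] + data[i + 1] + data[i + 2]) / 3;
  pixels.push(gray / 255);
}

// Hypothetical neural network: forward() maps pixel values to a
// guess at the characters in the image.
const guess = network.forward(pixels);
```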

… are the tags that people create really an effective way of describing information so that it can be found and managed, folded and put in the right drawer?

One of the things that I’ve always loved about the Mac is its cohesiveness. Everything just flows. It’s the experience that careful attention to design has created.

Imagine you’re writing your own programming language, and you want to let the user use WebView to display a browser in a window. All the IBAction methods on a web view have the same form: -(void)methodName:(id)sender, where the sender is generally ignored. So you can take the strings your programming language gets from the user and use NSSelectorFromString() to get a SEL (the data type that @selector() returns), and then you can use [myWebView performSelector:theSel withObject:nil] or whatever to actually make the call. That way, you only need to write code for each kind of command, not for each and every command individually.
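The same dispatch-by-name idea can be sketched in JavaScript (keeping to one language for this digest’s examples), where looking a property up by string plays the role of NSSelectorFromString(); the webView object and its command names below are hypothetical:

```javascript
// Hypothetical object standing in for the web view; every command
// is a method of the same shape, ignoring its sender argument.
const webView = {
  goBack(sender) { /* navigate back */ },
  goForward(sender) { /* navigate forward */ },
  reload(sender) { /* reload the page */ },
};

// Dispatch a user-supplied command string by looking the method up
// by name, much as NSSelectorFromString() + performSelector: does.
function runCommand(name) {
  const method = webView[name];
  if (typeof method !== 'function') {
    throw new Error(`Unknown command: ${name}`);
  }
  method.call(webView, null); // the sender is ignored, so pass null
}

runCommand('reload');
```

The payoff is the same as in the Cocoa version: one dispatch path handles every command, so supporting a new command means adding a method, not more plumbing.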

After working in Scheme, Python, or Ruby, all of which (more or less) support function objects and the map(), filter(), and reduce() functions, languages that don’t can feel somewhat cumbersome. Cocoa manages to implement these paradigms almost correctly.
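
For comparison, here is what that style looks like in JavaScript, which has map(), filter(), and reduce() built in as array methods; the numbers are arbitrary:

```javascript
// Square each number, keep the even squares, then sum them.
const numbers = [1, 2, 3, 4, 5];

const sumOfEvenSquares = numbers
  .map(n => n * n)                     // [1, 4, 9, 16, 25]
  .filter(n => n % 2 === 0)            // [4, 16]
  .reduce((total, n) => total + n, 0); // 20

console.log(sumOfEvenSquares); // 20
```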