Thursday, March 30, 2006

Execution in the Kingdom of Nouns

 They've a temper, some of them—particularly verbs: they're the proudest—adjectives you can do anything with, but not verbs—however, I can manage the whole lot of them! Impenetrability! That's what I say!
— Humpty Dumpty

Hello, world! Today we're going to hear the story of Evil King Java and his quest for worldwide verb stamp-outage.[1]

Caution: This story does not have a happy ending. It is neither a story for the faint of heart nor for the critical of mouth. If you're easily offended, or prone to being a disagreeable knave in blog comments, please stop reading now.

Before we begin the story, let's get some conceptual gunk out of the way.

The Garbage Overfloweth

All Java people love "use cases", so let's begin with a use case: namely, taking out the garbage. As in, "Johnny, take out that garbage! It's overflowing!"

If you're a normal, everyday, garden-variety, English-speaking person, and you're asked to describe the act of taking out the garbage, you probably think about it roughly along these lines:

get the garbage bag from under the sink
carry it out to the garage
dump it in the garbage can
walk back inside
wash your hands
plop back down on the couch
resume playing your video game (or whatever you were doing)

Even if you don't think in English, you probably still thought of a similar set of actions, except in your favorite language. Regardless of the language you chose, or the exact steps you took, taking out the garbage is a series of actions that terminates in the garbage being outside, and you being back inside, because of the actions you took.

Our thoughts are filled with brave, fierce, passionate actions: we live, we breathe, we walk, we talk, we laugh, we cry, we hope, we fear, we eat, we drink, we stop, we go, we take out the garbage. Above all else, we are free to do and to act. If we were all just rocks sitting in the sun, life might still be OK, but we wouldn't be free. Our freedom comes precisely from our ability to do things.

Of course our thoughts are also filled with nouns. We eat nouns, and buy nouns from the store, and we sit on nouns, and sleep on them. Nouns can fall on your head, creating a big noun on your noun. Nouns are things, and where would we be without things? But they're just things, that's all: the means to an end, or the ends themselves, or precious possessions, or names for the objects we observe around us. There's a building. Here's a rock. Any child can point out the nouns. It's the changes happening to those nouns that make them interesting.

Change requires action. Action is what gives life its spice. Action even gives spices their spice! After all, they're not spicy until you eat them. Nouns may be everywhere, but life's constant change, and constant interest, is all in the verbs.

And of course in addition to verbs and nouns, we also have our adjectives, our prepositions, our pronouns, our articles, the inevitable conjunctions, the yummy expletives, and all the other lovely parts of speech that let us think and say interesting things. I think we can all agree that the parts of speech each play a role, and all of them are important. It would be a shame to lose any of them.

Wouldn't it be strange if we suddenly decided that we could no longer use verbs?

Let me tell you a story about a place that did exactly that...

The Kingdom of Nouns

In the Kingdom of Javaland, where King Java rules with a silicon fist, people aren't allowed to think the way you and I do. In Javaland, you see, nouns are very important, by order of the King himself. Nouns are the most important citizens in the Kingdom. They parade around looking distinguished in their showy finery, which is provided by the Adjectives, who are quite relieved at their lot in life. The Adjectives are nowhere near as high-class as the Nouns, but they consider themselves quite lucky that they weren't born Verbs.

Because the Verb citizens in this Kingdom have it very, very bad.

In Javaland, by King Java's royal decree, Verbs are owned by Nouns. But they're not mere pets; no, Verbs in Javaland perform all the chores and manual labor in the entire kingdom. They are, in effect, the kingdom's slaves, or at very least the serfs and indentured servants. The residents of Javaland are quite content with this situation, and are indeed scarcely aware that things could be any different.

Verbs in Javaland are responsible for all the work, but as they are held in contempt by all, no Verb is ever permitted to wander about freely. If a Verb is to be seen in public at all, it must be escorted at all times by a Noun.

Of course "escort", being a Verb itself, is hardly allowed to run around naked; one must procure a VerbEscorter to facilitate the escorting. But what about "procure" and "facilitate?" As it happens, Facilitators and Procurers are both rather important Nouns whose job is is the chaperonement of the lowly Verbs "facilitate" and "procure", via Facilitation and Procurement, respectively.

The King, consulting with the Sun God on the matter, has at times threatened to banish all Verbs entirely from the Kingdom of Java. If this should ever come to pass, the inhabitants would surely need at least one Verb to do all the chores, and the King, who possesses a rather cruel sense of humor, has indicated that his choice would most assuredly be "execute".

The Verb "execute", and its synonymous cousins "run", "start", "go", "justDoIt", "makeItSo", and the like, can perform the work of any other Verb by replacing it with an appropriate Executioner and a call to execute(). Need to wait? Waiter.execute(). Brush your teeth? ToothBrusher(myTeeth).go(). Take out the garbage? TrashDisposalPlanExecutor.doIt(). No Verb is safe; all can be replaced by a Noun on the run.

In the more patriotic corners of Javaland, the Nouns have entirely ousted the Verbs. It may appear to casual inspection that there are still Verbs here and there, tilling the fields and emptying the chamber pots. But if one looks more closely, the secret is soon revealed: Nouns can rename their execute() Verb after themselves without changing its character in the slightest. When you observe the FieldTiller till(), the ChamberPotEmptier empty(), or the RegistrationManager register(), what you're really seeing is one of the evil King's army of executioners, masked in the clothes of its owner Noun.
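
For the morbidly curious, here's a minimal sketch of how the garbage use case might be rendered in Javaland. Every class and method name below is invented for this illustration (a few are borrowed from elsewhere in this story); it's a caricature of the pattern, not anyone's actual API.

// GarbageDay.java -- a hypothetical Javaland rendering of "take out the garbage".
// Every verb is escorted by an owner Noun, and the only verbs allowed out
// in public are execute(), run(), and their renamed cousins.
public class GarbageDay {

    // The Nouns. They mostly hold a little state and wait to be escorted around.
    static class GarbageBag { boolean outside = false; }
    static class Garage { void receive(GarbageBag bag) { bag.outside = true; } }
    static class Couch { void plopDownOn() { System.out.println("back on the couch"); } }

    // Each humble verb gets a chaperoning Noun of its very own.
    static class GarbageBagRetriever { GarbageBag retrieve() { return new GarbageBag(); } }
    static class GarbageDisposalDestinationLocator { Garage locate() { return new Garage(); } }
    static class HandWashingFacilitator { void execute() { System.out.println("hands washed"); } }
    static class PostGarbageActionCallback { void run(Couch couch) { couch.plopDownOn(); } }

    // The master Noun whose execute() marches all the other Nouns' verbs along in order.
    static class TrashDisposalPlanExecutor {
        void execute() {
            GarbageBag bag = new GarbageBagRetriever().retrieve();
            new GarbageDisposalDestinationLocator().locate().receive(bag);
            new HandWashingFacilitator().execute();
            new PostGarbageActionCallback().run(new Couch());
        }
    }

    public static void main(String[] args) {
        new TrashDisposalPlanExecutor().execute();
    }
}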

Verbs in Neighboring Kingdoms

In the neighboring programming-language kingdoms, taking out the trash is a straightforward affair, very similar to the way we described it in English up above. As is the case in Java, data objects are nouns, and functions are verbs.[2] But unlike in Javaland, citizens of other kingdoms may mix and match nouns and verbs however they please, in whatever way makes sense for conducting their business.

For instance, in the neighboring realms of C-land, JavaScript-land, Perl-land and Ruby-land, someone might model taking out the garbage as a series of actions — that is to say, verbs, or functions. Then if they apply the actions to the appropriate objects, in the appropriate order (get the trash, carry it outside, dump it in the can, etc.), the garbage-disposal task will complete successfully, with no superfluous escorts or chaperones required for any of the steps.

There's rarely any need in these kingdoms to create wrapper nouns to swaddle the verbs. They don't have GarbageDisposalStrategy nouns, nor GarbageDisposalDestinationLocator nouns for finding your way to the garage, nor PostGarbageActionCallback nouns for putting you back on your couch. They just write the verbs to operate on the nouns lying around, and then have a master verb, take_out_garbage(), that springs the subtasks to action in just the right order.

These neighboring kingdoms generally provide mechanisms for creating important nouns, when the need arises. If the diligent inventors in these kingdoms create an entirely new, useful concept that didn't exist before, such as a house, or a cart, or a machine for tilling fields faster than a person can, then they can give the concept a Class, which provides it with a name, a description, some state, and operating instructions.

The difference is that when Verbs are allowed to exist independently, you don't need to invent new Noun concepts to hold them.

Javalanders look upon their neighbors with disdain; this is the way of things in the Kingdoms of Programming.

If You Dig a Hole Deep Enough...

On the other side of the world is a sparsely inhabited region in whose kingdoms Verbs are the citizens of eminence. These are the Functional Kingdoms, including Haskellia, Ocamlica, Schemeria, and several others. Their citizens rarely cross paths with the kingdoms near Javaland. Because there are few other kingdoms nearby, the Functional Kingdoms must look with disdain upon each other, and make mutual war when they have nothing better to do.

In the Functional Kingdoms, Nouns and Verbs are generally considered equal-caste citizens. However, the Nouns, being, well, nouns, mostly sit around doing nothing at all. They don't see much point in running or executing anything, because the Verbs are quite active and see to all that for them. There are no strange laws mandating the creation of helper Nouns to escort each Verb, so there are only exactly as many Nouns as there are Things in each kingdom.

As a result of all this, the Verbs have the run of the place, if you'll pardon the expression. As an outsider, you could easily form the impression that Verbs (i.e., the functions) are the most important citizens by far. That, incidentally, is why they're called the Functional Kingdoms and not the Thingy Kingdoms.

In the remotest regions, beyond the Functional Kingdoms, lies a fabled realm called Lambda the Ultimate. In this place it is said that there are no nouns at all, only verbs! There are "things" there, but all things are created from verbs, even the very integers for counting lambs, which are the most popular form of trading currency there, if the rumors speak truth. The number zero is simply lambda(), and 1 is lambda(lambda()), 2 is lambda(lambda(lambda())), and so on. Every single Thing in this legendary region, be it noun, verb or otherwise, is constructed from the primal verb "lambda".[3]

To be quite honest, most Javalanders are blissfully unaware of the existence of the other side of the world. Can you imagine their culture shock? They would find it so disorienting that they might have to invent some new nouns (such as "Xenophobia") to express their new feelings.

Are Javalanders Happy?

You might think daily life in Javaland would be at best a little strange, and at worst grossly inefficient. But you can tell how happy a society is through their nursery rhymes, and Javaland's are whimsically poetic. For instance, Javaland children oft recite the famous cautionary tale:

For the lack of a nail,
throw new HorseshoeNailNotFoundException("no nails!");

For the lack of a horseshoe,
EquestrianDoctor.getLocalInstance().getHorseDispatcher().shoot();

For the lack of a horse,
RidersGuild.getRiderNotificationSubscriberList().getBroadcaster().run(
    new BroadcastMessage(StableFactory.getNullHorseInstance()));

For the lack of a rider,
MessageDeliverySubsystem.getLogger().logDeliveryFailure(
    MessageFactory.getAbstractMessageInstance(
        new MessageMedium(MessageType.VERBAL),
        new MessageTransport(MessageTransportType.MOUNTED_RIDER),
        new MessageSessionDestination(BattleManager.getRoutingInfo(
            BattleLocation.NEAREST))),
    MessageFailureReasonCode.UNKNOWN_RIDER_FAILURE);

For the lack of a message,
((BattleNotificationSender)
    BattleResourceMediator.getMediatorInstance().getResource(
        BattleParticipant.PROXY_PARTICIPANT,
        BattleResource.BATTLE_NOTIFICATION_SENDER)).sendNotification(
    ((BattleNotificationBuilder)
        (BattleResourceMediator.getMediatorInstance().getResource(
            BattleOrganizer.getBattleParticipant(Battle.Participant.GOOD_GUYS),
            BattleResource.BATTLE_NOTIFICATION_BUILDER))).buildNotification(
        BattleOrganizer.getBattleState(BattleResult.BATTLE_LOST),
        BattleManager.getChainOfCommand().getCommandChainNotifier()));

For the lack of a battle,
try {
    synchronized(BattleInformationRouterLock.getLockInstance()) {
        BattleInformationRouterLock.getLockInstance().wait();
    }
} catch (InterruptedException ix) {
    if (BattleSessionManager.getBattleStatus(
            BattleResource.getLocalizedBattleResource(Locale.getDefault()),
            BattleContext.createContext(
                Kingdom.getMasterBattleCoordinatorInstance(
                    new TweedleBeetlePuddlePaddleBattle()).populate(
                        RegionManager.getArmpitProvince(Armpit.LEFTMOST)))) ==
            BattleStatus.LOST) {
        if (LOGGER.isLoggable(Level.TOTALLY_SCREWED)) {
            LOGGER.logScrewage(BattleLogger.createBattleLogMessage(
                BattleStatusFormatter.format(BattleStatus.LOST_WAR,
                    Locale.getDefault())));
        }
    }
}

For the lack of a war,
new ServiceExecutionJoinPoint(
    DistributedQueryAnalyzer.forwardQueryResult(
        NotificationSchemaManager.getAbstractSchemaMapper(
            new PublishSubscribeNotificationSchema()).getSchemaProxy().
            executePublishSubscribeQueryPlan(
                NotificationSchema.ALERT,
                new NotificationSchemaPriority(SchemaPriority.MAX_PRIORITY),
                new PublisherMessage(MessageFactory.getAbstractMessage(
                    MessageType.WRITTEN,
                    new MessageTransport(MessageTransportType.WOUNDED_SURVIVOR),
                    new MessageSessionDestination(
                        DestinationManager.getNullDestinationForQueryPlan()))),
                DistributedWarMachine.getPartyRoleManager().getRegisteredParties(
                    PartyRoleManager.PARTY_KING ||
                    PartyRoleManager.PARTY_GENERAL ||
                    PartyRoleManager.PARTY_AMBASSADOR)).getQueryResult(),
        PriorityMessageDispatcher.getPriorityDispatchInstance())).
    waitForService();

All for the lack of a horseshoe nail.

It remains wonderful advice, even to this very day.

Although the telling of the tale in Javaland differs in some ways from Ben Franklin's original, Javalanders feel their rendition has a distinct charm all its own.

The main charm is that the architecture is there for all to see. Architecture is held in exceptionally high esteem by King Java, because architecture consists entirely of nouns. As we know, nouns are things, and things are prized beyond all actions in the Kingdom of Java. Architecture is made of things you can see and touch, things that tower over you imposingly, things that emit a satisfying clunk when you whack them with a stick. King Java dearly loves clunking noises; he draws immense satisfaction from kicking the wheels when he's trying out a new horse-drawn coach. Whatever its flaws may be, the tale above does not want for things.

One of our first instincts as human beings is to find shelter from the elements; the stronger the shelter, the safer we feel. In Javaland, there are many strong things to make the citizens feel safe. They marvel at the massive architectural creations and think "this must be a strong design". This feeling is reinforced when they try to make any changes to the structure; the architectural strength then becomes daunting enough that they feel nobody could bring this structure down.

In addition to the benefits of a strong architecture, everything in Javaland is nicely organized: you'll find every noun in its proper place. And the stories all take a definite shape: object construction is the dominant type of expression, with a manager for each abstraction and a run() method for each manager. With a little experience at this kind of conceptual modeling, Java citizens realize they can express any story in this style. There's a kind of "noun calculus" backing it that permits the expression of any abstraction, any computation you like. All one needs are sufficient nouns, constructors for those nouns, accessor methods for traversing the noun-graph, and the all-important execute() to carry out one's plans.

The residents of the Kingdom of Java aren't merely happy — they're bursting with pride!

StateManager.getConsiderationSetter("Noun Oriented Thinking", State.HARMFUL).run()

Or, as it is said outside the Kingdom of Java, "Noun Oriented Thinking Considered Harmful".

Object Oriented Programming puts the Nouns first and foremost. Why would you go to such lengths to put one part of speech on a pedestal? Why should one kind of concept take precedence over another? It's not as if OOP has suddenly made verbs less important in the way we actually think. It's a strangely skewed perspective. As my friend Jacob Gabrielson once put it, advocating Object-Oriented Programming is like advocating Pants-Oriented Clothing.

Java's static type system, like any other, has its share of problems. But the extreme emphasis on noun-oriented thought processes (and consequently, modeling processes) is more than a bit disturbing. Any type system will require you to re-shape your thoughts somewhat to fit the system, but eliminating standalone verbs seems a step beyond all rationale or reason.

C++ doesn't exhibit the problem, because C++, being a superset of C, allows you to define standalone functions. Moreover, C++ provides a distinct namespace abstraction; Java overloads the idea of a Class to represent namespaces, user-defined types, syntactic delegation mechanisms, some visibility and scoping mechanisms, and more besides.

Don't get me wrong; I'm not claiming C++ is "good". But I do find myself appreciating the flexibility of its type system, at least compared with Java's. C++ has its own problems, ones that cause reasonable-looking sentences to make listeners snap and try to kill you (i.e., unexpected segfaults and other pitfalls for the unwary), and it can be extremely difficult to find the exact incantation for expressing a particular thought in C++. But the range of succinctly expressible thoughts far exceeds Java's, because C++ gives you verbs, and who'd want to speak in a language that doesn't?

Classes are really the only modeling tool Java provides you. So whenever a new idea occurs to you, you have to sculpt it or wrap it or smash at it until it becomes a thing, even if it began life as an action, a process, or any other non-"thing" concept.

I've really come around to what Perl folks were telling me 8 or 9 years ago: "Dude, not everything is an object."

It's odd, though, that Java[4] appears to be the only mainstream object-oriented language that exhibits radically noun-centric behavior. You'll almost never find an AbstractProxyMediator, a NotificationStrategyFactory, or any of their ilk in Python or Ruby. Why do you find them everywhere in Java? It's a sure bet that the difference is in the verbs. Python, Ruby, JavaScript, Perl, and of course all Functional languages allow you to declare and pass around functions as distinct entities without wrapping them in a class.

It's certainly easier to do this in dynamically typed languages; you just pass a reference to the function, obtained from its name, and it's up to the caller to invoke the function with the proper arguments and use its return value correctly.

But many statically-typed languages have first-class functions as well. This includes verbosely-typed languages like C and C++, and also type-inferring languages like Haskell and ML. The languages just need to provide a syntax for creating, passing and invoking function literals with an appropriate type signature.

There's no reason Java couldn't simply add first-class functions and finally enter the grown-up, non-skewed world that allows people to use verbs as part of their thought processes. In fact there's a JVM language called Nice that sports a very Java-like syntax, but also includes expressive facilities for using verbs: standalone functions, which Java forces you to wrap in Callbacks or Runnables or other anonymous interface implementation classes just to be able to refer to them.
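
Just so the wrapping tax is concrete, here's a small sketch of handing a "verb" around in today's Java, using anonymous Runnable classes. The chore names are mine; the ceremony is the language's.

import java.util.ArrayList;
import java.util.List;

public class VerbSmuggling {

    // To pass a verb to this method, you must first dress it up as a Noun.
    static void runAll(List<Runnable> chores) {
        for (Runnable chore : chores) {
            chore.run();  // the universal verb, once again
        }
    }

    public static void main(String[] args) {
        List<Runnable> chores = new ArrayList<Runnable>();

        // Each action becomes an anonymous class implementing an interface,
        // just so we can refer to it later.
        chores.add(new Runnable() {
            public void run() { System.out.println("take out the garbage"); }
        });
        chores.add(new Runnable() {
            public void run() { System.out.println("wash hands"); }
        });

        runAll(chores);
    }
}

In a language with standalone functions, that list would simply hold the functions themselves, and the two wrapper classes would evaporate.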

Sun wouldn't even have to break their convention of requiring all functions to be "owned" by classes. Every anonymous function could carry an implicit "this" pointer to the class in which it was defined; problem solved.

I don't know why Sun insists on keeping Java squarely planted in the Kingdom of Nouns. I doubt it's a matter of underestimating their constituency; they added generics, which are a far more complex concept, so they clearly no longer care deeply about keeping the language simple. And that's not a bad thing, necessarily, because Java's established now: it makes more sense to start giving Java programmers tools that let them program the way they think.

I sure hope they fix this, so I can take the trash out and get back to my video game. Or whatever I was doing.



Notes

[1] Beginning with the verb "to stamp out", which is being replaced by a call to VerbEliminatorFactory.createVerbEliminator(currentContext).operate(). But that's getting waaaaay ahead of ourselves...

[2] And variable names are proper nouns, attributes are adjectives, operators often serve as conjunctions, varargs are the pronoun "y'all", and so on. But this is all beside the point of our story.

[3] The meaning of the verb "lambda" is allegedly "to lambda".

[4] And arguably C#, due to its similar roots.

Friday, March 24, 2006

Moore's Law is Crap

Sometimes people ask me how I find time to go learn new stuff. Here's the answer: you make time.

Nobody ever likes that kind of answer, but we all know it's the real one.

My brother Dave put on some pretty serious weight after he graduated from high school. He'd gone from playing varsity football to working two jobs and going to college full time. It didn't help that one of his jobs was pizza delivery, and the other was waiting tables. Soon he was a fat, fat kid. Went from a buck-eighty to a deuce and a half, at least, maybe a deuce sixty.

One day he saw a truck with a bumper sticker on it that said: "Lose weight now, ask me how!" So he pulled up next to the truck at the next stoplight, and said to the two cowboy-types in it: "How do I lose weight?" They yelled back: "Just lose weight, ya fat pig! Haw haw haw HAW HAW HAW HAW!" and then drove off.

Dave was sad about this advice for a brief while, but eventually he brightened up, because he did know how to lose weight. Cripes, he'd been a champion varsity football player two years prior. It's not like there's any magic to it. He went and bought a mountain bike, started riding the crap out of it, joined a gym, watched what he ate, and lost about 85 pounds over the next year.

I was 145 pounds 2 years ago today, after a similar 2-year diet and exercise kick that took me down from a deuce and a half myself. Then I fell off the wagon — it happens — and put on 50 pounds over the next 2 years. It sucks, but there's no magic. Two months ago I finally started going to the gym every day, 7 days a week, and my legs are sore every day. My weight hasn't improved at all yet. But it will. You just have to work at these things consistently.

But you knew that, didn't you?

This isn't a self-help blog, by the way. I'm not in that business. I'm not here to help, since I don't actually know the answers. I'm just here to rant, and pose occasional questions. It's what I do, when I'm not doing other stuff.

I don't know why I blog. I'm just compelled; it just happens whether I like it or not. Don't read too much into my blogs. My opinions change from day to day. The only things I've learned, the only universal constants, are that I don't know very much, and that public whale explosions are just about the funniest thing human beings can experience during our stay on Earth. I don't know why that is, either.

Today's blog is truly a rant; I just need to get this particular one off my chest, so my gym partner Todd can listen to me rant about something different next week.

The Big Choice

We all have to choose how to spend the time we're given.

If you don't choose, it just slips right by you. I know. On a trip to Vegas not too long ago, I made a pit stop in a casino restroom, and as I was washing my hands, there was this older guy there, also washing his hands. On a whim, I asked, "Hey man, how old are you?"

His reply? "Seventy-two! I have a son: I remember the day he was born like it was yesterday! I was holding him just like so. Well, guess what, he turned 40 years old just last week! It goes by in a flash! Before long, you'll be lookin' at THIS!" He pointed at his wrinkled mug, and concluded his monologue with: "Haw, haw, haw! HAW HAW HAW *cough* *cough* HAW *cough* *hack* HAW HAW HAW HAW HAW!" and walked out. I think I made his day, although I can't exactly say he made mine.

When you graduate from college (or high school, for that matter), you have a simple choice facing you. You can either keep learning, or you can stop.

There is an almost unbelievably easy heuristic for knowing whether you're learning. It goes like this: no pain, no gain. Learning is hard. If it's easy, then you're coasting; you're not making yourself better at something fundamentally new that you couldn't do before.

In other words, it's just like working out. You've gotta mix it up. If you're sore every day, then you're getting good workouts (assuming you can tell the difference between "good" soreness and "injury" soreness; if you're not sure, go ask a professional.)

When you do study, there's a lot of appeal to studying what you already know, because it's less painful. And of course to become an expert at any field, you have to focus on it pretty hard for a long time. But cross-training is well established in sports; you don't typically become a world-class baseball player by just playing baseball all the time. You have to do other kinds of workouts and exercises to maximize your strength, agility and endurance gains.

Cross-training improves you every bit as rapidly in other disciplines. That includes programming. If you're cranking out code as easily as breathing, then if you're getting better at all, it's so gradual that you'd never notice it happening. You won't have great insights until you get new perspectives from working hard, even if only occasionally, at stuff other than what you already know.

Being in school full-time is an amazing luxury, one that's hard to appreciate when you're actually there, because learning is painful. But trust me on this one: it's even more painful when all you have is scraps of time here and there.

While you're in school, assuming you make a reasonable effort at applying yourself once in a while, you learn a fantastic amount, and you learn it at a fantastic rate. Later you'll learn at a slower rate; it's pretty much guaranteed. Non-educational activities will inevitably intrude and consume the majority of your time.

Hence the choice. After you graduate, you can either learn a little, or not at all.

If you're in the "not at all" camp, well you've made your choice, and I respect it. You'll probably be happier than I am. I'm tormented by how slowly I have to move as a programmer. I now believe programming languages have failed. All of them. The programming world is a giant body shop, and we're building huge pyramids with millions of years of hard labor. I believe that stuff should be easier than it is, and it pisses me off that most people are so content with the state of the art, because it means they're not helping make it better.

To me, mainstream languages have failed because they're still fundamentally serial, fundamentally early von Neumann, fundamentally single-box languages. They're all vying for the position of "least crappy", and the current winner (whether it's Python, Ruby, Lisp, name your favorite) is just that: the least crappy. Because they're all focused on finding more elegant ways to express mostly-serial computations for crap computers. That, or faking parallelism poorly with threads, if the language supports them at all.

Sure, there have been some interesting attempts at parallel languages. Erlang is one of the better-known ones, and it's actually quite cool. But Erlang has failed too, because you haven't heard of it.

Programming's Biggest Problem

Our industry is in a seriously ugly position, right now, as we speak. Most of the hardware designers are focused on keeping Moore's Law going, because that's where the money is. They want that doubling every 18 months. Today it's probably quite within our reach to get 10x every 18 months, if we'd agree to focus on parallelism (in the massively distributed computing sense.)

But programmers like XML, to the point of focusing on it to an unhealthy extent. Same with C++. And Java. They like these things because they work, and because they like to minimize the amount of crap they have to learn. Because learning is painful. Remember? You might think I've gone way off track, off the deep end, even, but this is the same thread.

Let's face it: a parallel language will have to be radically different if it's to break free of the von Neumann Turing-machine rat race we're in. If we move to cellular automata, or in fact any other parallel computational model that's resilient to node failures, then we'll need a new language, because the current serial languages will perform badly, or be horribly hard to manage, or both.

Cell or grid (or whatever) parallel computing will have a radically different internal economy. It'll need new data structures, new algorithms, new instruction sets, new everything. You do realize that John von Neumann was an economist, right? Among (many) other things, he was an economist, and it influenced the design of his first computer, that one right there on your desk.

The computing machine JvN built was created in an environment very similar to the one in the movie Apollo 13, where the folks at Houston had to build a carbon-dioxide remover out of exactly the free junk available on the spacecraft, and then explain it to the crew so they could build their own copy of it.

Johnny went out and collected a bunch of engineers: materials engineers, electrical engineers, mechanical engineers, everyone who had some spare junk. They came up with a design for a computation machine that just barely sufficed, one that could be built out of the crap they had available at the time: vacuum tubes, magnetic drums, wire, duct tape.

As he was creating this thing, Johnny was focusing on what he was calling the "internal economy" of the resulting machine. Secondary storage accesses were painfully slow. Memory accesses were faster. Register accesses were very fast. And so on. So he designed representations, data structures, and algorithms that were appropriate for that particular machine, the one he was building from spare parts.

If he'd made the machine out of Brazilian rainforest army ants, or mechanical gears, or falling dominoes with marionettes to pick them up again, his data structures and algorithms would have been very, very different. There are some commonalities, sure — you need numbers, and arithmetic, and functions, and data storage, and sorting, and so on. But what's most efficient for army ant computers isn't going to be most efficient for vacuum tube computers.

You do realize you can make a computing machine out of just about anything, right? And they don't all have to work like Turing machines do. Turing was one of the greatest geniuses of the century, but he'd have been the first person to tell you that there are infinitely many machine designs capable of the same computations. His was just one model, and his professor's (which led to Lisp) was just one other model. But who's to say they're the best models?

Some computing machines are more efficient at certain computations than others. Some are more practical to build than others. Some are faster than others. Some are more robust, or more inherently parallel.

You do realize that your brain is such a machine, right? And that it's 100,000 times faster than today's computers at pattern-matching tasks, because while JvN's machine operates serially, your neurons can all fire independently.

Let me give you a hint: your brain's operating system isn't written in C++.

Is our industry ever going to get out of this amazing backwater of a gridlock, this evolutionary dead-end we're in, where we programmers act like army ants, churning out loops and SOAP calls and UML diagrams as if we're weaving cloth with the very fabric of the computational universe?

If it ever happens, and by God I hope to witness it in my lifetime, then the computers and languages and data structures and algorithms are all going to have to change simultaneously. Because a language optimized for a serial computer will perform like crap along some important dimension of performance, probably more than one. But we can't switch wholesale to parallel languages either, because they'll perform like crap on today's computers: again, for some value of "perform" that's not worth discussing here, but it'll be some form of either computer-performance or people-performance.

And programmers are nothing if not fanatically obsessed with performance. Kinda ironic, huh?

Half the irony stems from knowing that there are far more productive languages out there than the ones most of us are using. But most of them perform poorly on our hardware, because these languages are targeting meta-virtual machines, typically "defined" (informally) by the capabilities of the language itself. And if you're not targeting exactly the hardware you're on, the impedance mismatch will slow the language down.

That's the problem with most JVM languages other than Java: they need hardware (think ants! anything can be hardware!) to support operations like long-jumps and tail-call optimization, but the JVM doesn't export those facilities as part of its abstract machine definition.
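
To make the tail-call point concrete, here's a tiny, hedged illustration in plain Java: the recursive call below is in tail position in the source, but the JVM gives compilers no way to insist that the stack frame be reused, so the stack grows with every call anyway. The numbers are arbitrary; the exact breaking point depends on the thread's stack size.

public class TailCallDemo {

    // Tail-recursive in form, but each call still gets a fresh JVM stack frame.
    static long countDown(long n, long acc) {
        if (n == 0) {
            return acc;
        }
        return countDown(n - 1, acc + 1);  // a tail call the JVM won't eliminate
    }

    public static void main(String[] args) {
        System.out.println(countDown(1000L, 0));       // fine
        System.out.println(countDown(100000000L, 0));  // StackOverflowError on a default stack
    }
}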

Same goes for Lisp. It can't get the performance break it deserves because the hardware available today isn't a Lisp machine. They've built them, and I can assure you that C++ would be the loser slug of a language on a Lisp machine. But, alas, performance isn't the only thing programmers care about. They also care about not having to learn anything new.

That's the other half of the irony. Programmers are obsessed with performance, and they'll go to almost any length to fiddle with their algorithms and data representations in order to eke every last cycle and byte from their programs. Any length, that is, except for learning a new language on new hardware. Even if it would get them a thousand-X performance improvement for the same productivity. Or a 1000x productivity improvement for the same performance.

Because they don't want to learn anything hard. No gain, no pain, problem solved.

And that's where we're at. Moore's Law is crap. If we ever want to be 10x as productive and computationally efficient, let alone 1000x, then our whole computing model will have to change at once. Nothing else will do. The incremental approaches have all failed. It's got to be a revolutionary change.

If everything changes all at once, that's going to pose a bit of a problem for the folks on the Zero Learning curve, wouldn't you say? Don't freak out and mail me about this, either, because I'm a pessimist now, at least about this particular topic, and I doubt we'll ever get out of our rut. We're ignoring the First Law of Holes.

You do realize that John von Neumann spent the last 10 years of his life singlehandedly developing a theory of computing based on cellular automata? The computer you're reading this blog rant on was his frigging prototype! He was going to throw it out and make a better one! And then he died of cancer, just like my brother Dave did, just like so many people with so much more to give and so much more life to live. And we're not making headway on cancer, either, because our computers and languages are such miserable crap.

You have no idea the pain I feel when I sit down to program. I'm walking on razor blades and broken glass. You have no idea the contempt I feel for C++, for J2EE, for your favorite XML parser, for the pathetic junk we're using to perform computations today. There are a few diamonds in the rough, a few glimmers of beauty here and there, but most of what I feel is simply indescribable nausea.

Are you beginning to see why I prefer to work with programmers who stay on the Upward Curve after they get out of school? Because even while we're grubbing around in the dirt — just a sorry bunch of illiterate, innumerate cavemen, here in the very heart of the Stone Age of programming — at least these upward-curve programmers give me some hope. Hope that if something better comes along, they'll give it a try, a serious try, the old college try. Or hope, even, that they'll build that "something better" with me.

Fat chance. But hope can keep ya going for a good long while.

Baby Steps

It's all still fun, though. Broken glass and razor blades aren't so bad, when I think about how much worse my lot in life could be, had I been born in a different time or place. I've got it pretty good, and so have you, in all probability.

At my current job[1] they feed us and massage us like Kobe cows, and I'm surrounded by unbelievably brilliant people, all way smarter than me, and we're doing great stuff together. Make no mistake: my blog whining is all relative to a totally imaginary future, one which in all likelihood, should it ever come to pass, will be filled with even more whining about totally imagined new futures. It's just in our nature to whine. But really, I have no complaints.

I put a lot of stock in fun. And family. And trying to live my life in such a way that I won't have any major regrets when the game's over. So there's the first part of my schedule: having fun.

If you want to be on an upward curve, just make some time for it, and make it a habit. That's all there is to it. It doesn't matter if you're trying to get better at programming, or math, or fitness, or flying kites, or even humanity's Number One Fear, even worse than the fear of Death: public speaking. You just work your way up, a little at a time.

I can't promise you any satisfaction from the upward curve. You'll get better at a lot of things, and you'll have plenty of interesting insights. You may even get a better job, or build some software that makes you famous, or just have more fun at what you do. But you won't have much time for television. Something will have to give. We all have to choose how to play our time, and it's a zero-sum game.

If, like me, you're dissatisfied with the current state of affairs, well, believe you me, you can find a lot of consolation in a book on math, or machine learning, or compiler construction, or on just about anything that promises to help in some small way.

You do have to learn to put up with the pain of learning, though, and make it a habit to push yourself a little each day.

As far as the actuals go, well, you'll just have to find an approach that works for you personally. You might only be able to devote one quiet hour a week to studying, but like unit testing, or working out, or brushing your teeth, it's better than not doing it at all.

Just try to have fun, and hopefully the rest will fall into place.


Notes

[1] Permit me to assure you that I do not ever speak in any way for my current employer. Like, don't you even think it for one second. They're them, I'm me, and let's leave it at that. In fact, I don't even work there. My friend does. And neither me nor my friend speaks for them. OK? OK.

Saturday, March 18, 2006

The Truth About Interviewing

Warning: the title of this blog entry is very slightly misleading. It really ought to be called "The Partial (At Best) and In Any Case Utterly Biased So-Called 'Truth' About Certain Restricted Kinds of Technical Interviewing By a Complete Bigoted Snobby Jerk Who Doesn't Know What He's Talking About Becuz PHP r00lZ And You Don't Need To Know Anything Else!!!!"

However, that was a wee bit too long, so I shortened it. But consider yourself warned. (And yes, I like the longer title better too!)

Anyhoo...

An anonymous coward commented on one of my blogs:

In regards to the interview chronicles regarding how someone was aweful or saved themselves during job interviews, I'd have to say you're a bit of snob aren't you? Granted the people you plan to hire will be paid substantial amounts of money to do good solid work, I think you're the kind of interviewer who tends to mold the people who work under him into little clones of him. "You must know this for me to consider you a 'true engineer' otherwise, I will boot you out the door." I think it's narrow minded and it's a wonder why more people don't go into the technical fields with the people who are currently populating it. Egoists and snobs even if it is deserved.

<satire>

Well said! Except for the last sentence, which is missing a verb, not to mention any semblance of intelligibility. But other than that, I think we should give this commenter a position as the Editor-In-Chief of the New York Times.

As it happens, I'm in the same boat as the commenter. Really.

See, I've always thought it would be cool to be an airline pilot. But I went to interview for the position, and those damn snobs wanted me to be a miniature clone of their totally biased idea of an "ideal airline pilot." They expected me to know all this crapola about meteorology and electrical engineering and other stuff that obviously has nothing to do with flying a plane. Plus I had to have all these flight hours logged, I mean you just wouldn't believe the requirements. This was obviously just an "old boys" network trying to keep honest people like me out. I'm surprised anyone gets into that field at all.

So then I thought, maybe I should go be a brain surgeon. But they wouldn't even let me apply! They snobbily told me that I had to know a whole bunch more about brains, and surgery, and stuff. I've had high school biology, during which I dissected a frog, plus I'm generally a careful guy. I'm smart, and my hands don't shake much, and I can figure stuff out when I need to. And I was really motivated to do brain surgery. But they didn't consider me "true" brain surgeon material. Of all the nerve! I think those surgical interviewers are just cold-blooded snobs. They don't appreciate my true talents. They just want pathetic little clones of the existing brain surgeons out there.

Those hospital and airline directors aren't the only snobs out there, either, I can tell you from personal experience. See, I can juggle five balls pretty well (here's a short video of me doing it), and yet every time I apply for a position in Cirque du Soleil, I don't even get a reply from them! Talk about snobbery. They don't even write back. What's wrong with my juggling? Nothing! They just want a pretty little clone of Viktor Kee. Granted, they're going to pay their jugglers a lot of money to do good solid work, but I think they're being fairly narrow-minded by not considering me.

Then I wrote a bunch of letters to Hollywood movie studios, offering to be an animal trainer, since I have a Shih Tzu named Cino, and I can get him to sit sometimes. I mean, that took a lot of work! That little dog hardly listens to a word I say. So I feel pretty qualified to be an animal trainer. Sure, I've never worked directly with a bear or a tiger before, but it can't be all that different. You just shout SIT! and then throw them a biscuit when they finally do it, right? Like, a bear-biscuit or a tiger-biscuit instead of a dog biscuit, but other than that it's gonna be the same, I just know it. But those animal-trainer egoists just booted me right out the door.

So I pondered it, and I thought: you know, I'm not going to go into a field full of egotistical airline pilots and brain surgeons and circus performers and animal trainers. They're just trying to sculpt little perfectly-moulded clones of themselves. Screw them!

I made sure to give them an earful by commenting on all their blogs, anonymously. That'll show 'em! I feel much better for telling them what a bunch of brain animal flying juggling surgery training JERKS they all are. I also left out some verbs, just to throw 'em for a loop. Deserved ego they it even wankers.

So yeah, Mr. Commenter. You've got it pegged. Snob city!

</satire>

A Minor (But Popular) Misconception

Before we get started with the real blog entry, let's get this cleared up: managers aren't the only people who interview you. Most interview loops are conducted by engineers who will be working on the same team as you. Nobody works for me, but my company still makes me interview people.

So we don't want little clones working for us; we want them working with us. </really-end-satire-now>

The Harsh Reality

If you want a job at a company like Microsoft, Yahoo!, Apple, or Amazon.com, they're going to have high standards. It doesn't matter if you "know how to program". They're going to test you on algorithmic complexity analysis, advanced data structures, algorithm design, searching and sorting, internationalization techniques, network protocols, OS-level memory management, parsing and semantic analysis, recursion and mathematical induction, graph theory, combinatorics, programming language theory, machine architecture, discrete math and logic, graphics and window systems, fonts and typesetting, color spaces and representations, databases and query languages, filesystems and storage, embedded systems, device drivers, mobile and wireless protocols, and internet standards and technologies.

If you're lucky, that is.

If you're unlucky, they'll ask you to derive the outline of their Ph.D. thesis on fault-tolerant massively parallel machine-learning systems. Or to solve a grand-unification style computation problem involving telephone switches, grid networks, and third-degree differential equations. Or, God forbid, they'll ask you about the darkest corners of C++ syntax.

And you want to know why they'll ask you about that stuff? Because they're using it every day. They've tried hiring people who don't know this stuff. Believe me, they try all the time. They want to hire more programmers, and they're out there on the constant lookout for new meat. But when they lower their standards, they get burned. The 747 crashes, the patient dies, the juggler drops a bowling pin on someone's head, the tiger rips someone's throat out. In the software world: the service goes down for days, losing them millions; the project gets delivered late or even not at all, losing them contracts and customers; they lose the business battle to competitors who hired better engineers.

Putting together a pretty Ruby on Rails site is no small feat. Learning to program is no small feat. Many people try and fail to get even that far. But it's 3- to 5-ball juggling, and it just doesn't cut it for the Cirque du Soleils and private hospitals of the software industry. I'm sure you want to be a racecar driver, a hang-gliding instructor, a corporate lawyer, a movie sound editor, a rocket scientist. But you know you don't have the requisite training or experience. Why do you think knowing a little (or even a lot) about programming automatically qualifies you to get hired at Microsoft?

Software companies have excruciatingly high standards, just like any other profession. Those that don't get eaten up by those that do.

Interviewers can't probe on all the important material in an interview — at least, not the way interviews are conducted in our industry today. So they use sampling. They ask a few breadth questions, to get a feel for how well you know the space we're in, and they ask a few depth questions, to see how well you can think on your feet, apply what you know, and come up with creative solutions to problems similar to the ones they encounter daily.

Or so the theory goes. Interviewing is necessarily fairly artificial. I'm not a big fan of the interviewing approach everyone uses, but there's nothing I personally can do about it. So I do my best to evaluate the software equivalents of brain surgeons and rocket scientists by chatting with them for 45 minutes. I have no other choice, nor does any other interviewer out there.

Well, that's not entirely true; there is one thing I can do about it: I can talk about it. I'm as open about the process as anyone I know, and I've blogged about it from time to time. I assure you that my interviewing style is not substantively different from the styles of thousands of interviewers at hundreds of high-tech companies.

If anything, I'm nicer, because I count motivation (that is, the desire to improve oneself) higher than any other factor in an interview. It's how I've managed to get hired at most of my jobs: my enthusiasm showed through, and was sufficient to cover my inevitable mistakes. So I tend to go a lot easier on candidates who show me they're on an upward curve, as opposed to just coasting.

But not all interviewers are nice, and I know lots and lots of interviewers, at many companies, who've decided that they can fully evaluate you based on whether you can solve some particular convex optimization problem (or graph-search problem, or logic problem, or whatever their pet Elephant Question is), and they ask every candidate this question regardless of their background or experience. In fact, I'd estimate that some 10% of all technical interviewers ask the same questions, year after year, and they couldn't care less about your experience.

You should consider yourself lucky that I talk about this process at all, because most companies treat their interviewing strategy as some sort of proprietary secret, and it's hard to get information about it, unless you want to go interview at a bunch of places.

Hint to companies: you're all doing it the same way. Quit being so frigging secretive about it.

The reason I write about it is that I'm very interested in the whole process:
  • I want to become a better interviewer myself.
  • I want to improve the process overall, as it's far from perfect.
  • I want to help candidates prepare better for their interviews.

However, talking about it openly gets people amazingly upset. Many people seem to suffer from selective hearing in this space.

For one thing, people want jobs, and so any hint that someone out there, somewhere, might be asking questions to which they don't know the answers sends them into a spittle-emitting rage. Oddly, they don't seem to mind that they can't get hired as a brain surgeon or an animal trainer. But they think knowing a little programming means they should be a shoo-in for any tech company in the world, and anyone who asks something they don't know is a big mean jerk! (I actually discovered this lovely sentiment after posting recently to the O'Reilly Ruby Blog, so our Anonymous Coward is far from alone in his opinion.)

A friendly note to the Spittle People: Not all software development is the same. Think about that for a sec. You may be really good at web development, but that almost certainly doesn't qualify you to work on an OS kernel. Or you may be really good at OS kernels, but that almost certainly doesn't qualify you to be good at web development.

Our industry has fragmented, just the way the medical industry has, and there are different job roles now. Unless you understand this, you're going to be upset whenever a technical interviewer talks about something you're unfamiliar with. If so, take a deep breath and relax, because they're not talking about your skill set. They're talking about jugglers or hang gliding or something that sounds like what you do, but it's different. You can always go find an interviewer who will probe you on what you care about.

Another group of people who gets upset is other interviewers, because it's very much a stylistic thing. There are no proven techniques in interviewing, and personally I think it's something of a crapshoot. But most interviewers with a reasonable amount of experience seem to think they "have it down", and they think that when I talk about my style, I'm advocating that they change their style. And then they emit plenty of spittle too.

Well, wake up, angry people! There are hundreds of companies conducting technical interviews out there. They're doing it for different job roles, including DBAs, sysadmins, web programmers, unix programmers, windows programmers, java programmers, mobile-device programmers, and many other roles besides. For any given job role, they're all doing it about the same. It's not just me. In fact, those interviewers are YOU, once you get the job.

So if you think you can't get hired into a job you want, be it a software engineer at Boeing or an airline pilot for American, don't tell me about it. Whining at me isn't going to change the realities of our industry. Just go study whatever it is they want you to know.

Finding a Better Job

With all that said, you might wonder whether you want to work at a big software company in the first place. You need to think carefully about your motivations.

If you just want to make money, you can almost certainly find a job that's less technically demanding (or that's technically demanding in different ways) than working at companies like Microsoft. To cite just one of many, many possibilities: you can pick up enough PHP and MySQL to design websites, and go find a desperate hospital or insurance company or school system in need of a web presence, and no doubt it'll pay the bills just as well as Microsoft does.

If money's all you're after, you certainly don't need a computer science degree. Our industry desperately needs more programmers to crank out application code in C#, Java, PHP, Perl, C++, and other popular languages. If you go learn one or two of those languages, you're probably already a shoo-in for many companies.

You should be aware that working at most large tech companies isn't very glamorous, at least not as much as it seems from the outside. For instance, I know a lot of people who love video games, so they assume that game programming must be the most fun kind of programming out there. Then they go get jobs at a huge, mainstream game publisher, only to find that it's actually a bunch of tedious, schedule-driven work, just like the work at all other software companies.

The reality is, delivering good software just keeps getting harder, as customer expectations go up. That's why companies have to be so selective when they're interviewing. Big companies have serious challenges that aren't encountered by smaller companies, including the combinatorial communication (and dependencies) explosion, the need to maintain large legacy code bases, the problems of culture change as the company grows, and the need to find entirely new revenue streams in the face of pressure from Wall Street.

So most big companies are looking for a combination of skills: they need a mix of junior-ish people to help keep the business alive, and senior-ish people to try to help hoist them out of the holes they've dug for themselves internally by growing too fast.

Neither of those sounds very glamorous, does it? In practice, a startup is much more glamorous. You move faster; you're working on a leading-edge idea; you're burning as fast as you can to keep ahead of your competitors and your investor-imposed schedules. You can get brilliant things working, and worry about scaling them and maturing them later. All the fun part is in the startup days.

The trade-off is that you take on a great deal more risk. With a big company, you've got stability, and there's something to be said for that. A few failed startups can make just about anyone pretty risk-averse, at least for a while.

So you have to decide what you want. But like it or not, if you want to work as a software engineer for a big, top-tier, famous tech company, they're going to interview the living hell out of you. Even if you're really good, there's a chance you won't make it in the door.

You do have a certain amount of control over the situation, and this control is NOT something you exercise by commenting angrily on my blogs.

How do you participate in change? Well, first do some research. It's pretty easy to figure out what a company is interviewing for, and what the interviewers are going to be like. Just read the job descriptions on their website. They're usually pretty up-front, saying things like "B.S. in computer science, or equivalent", or they'll tell you exactly which tools and technologies they expect you to have mastered. If you're considering a particular company, then before you interview you should ask your recruiting contact there what kinds of questions to expect from the interview process.

If a company's going to ask you questions that you don't think matter, then don't work there! If you're right, then they'll go out of business, eventually. Or at least you can go throw your weight behind a company that you think has priorities aligned with your own. Vote with your feet!

Once you've actually started working somewhere, you'll eventually have to do interviews of your own. If you don't like their interviewing process, make a stink about it, and maybe they'll listen. Companies don't want false positives or false negatives any more than you do. If you have a great idea for improving the process, let them know. There are all kinds of minor variations you can try. Or maybe you can all agree to experiment with an entirely new process, and see how the resulting hires measure up.

I want to nitpick anyway!

Well, go ahead and comment; that's why I've opened this blog up for anonymous comments. So far, anyway. But blog comments don't actually carry much weight.

If you really want to make a difference, and start changing peoples' minds, go set up your own blog, and express your well-considered opinions there. It's free, and it's easy, and we ALL have access to self-publishing now. If you want to set up an alternate school of interviewing thought, well, I'm quite happy to read what you have to say. Seems like almost anything would be better than the mess we have with interviewing today.

The truth about interviewing is that everyone does it their own way, and it always comes down to about the same thing: people interview for what they know. Nobody's very good at interviewing for skills they don't have, so the tendency to produce clones is actually quite widespread — the commenter had at least that much right. The best most of us can do is try to hire high-quality, generalist clones who pose the least risk of tiger throat-ripping software outages.

But please: if you don't know something, don't go berserk on us. Just go read about it on Wikipedia. It's not as hard as you think!

Friday, March 17, 2006

Math For Programmers

I've been working for the past 15 months on repairing my rusty math skills, ever since I read a biography of Johnny von Neumann. I've read a huge stack of math books, and I have an even bigger stack of unread math books. And it's starting to come together.

Let me tell you about it.

Conventional Wisdom Doesn't Add Up

First: programmers don't think they need to know math. I hear that so often; I hardly know anyone who disagrees. Even programmers who were math majors tell me they don't really use math all that much! They say it's better to know about design patterns, object-oriented methodologies, software tools, interface design, stuff like that.

And you know what? They're absolutely right. You can be a good, solid, professional programmer without knowing much math.

But hey, you don't really need to know how to program, either. Let's face it: there are a lot of professional programmers out there who realize they're not very good at it, and they still find ways to contribute.

If you're suddenly feeling out of your depth, and everyone appears to be running circles around you, what are your options? Well, you might discover you're good at project management, or people management, or UI design, or technical writing, or system administration, or any number of other important things that "programmers" aren't necessarily any good at. You'll start filling those niches (because there's always more work to do), and as soon as you find something you're good at, you'll probably migrate towards doing it full-time.

In fact, I don't think you need to know anything, as long as you can stay alive somehow.

So they're right: you don't need to know math, and you can get by for your entire life just fine without it.

But a few things I've learned recently might surprise you:

  1. Math is a lot easier to pick up after you know how to program. In fact, if you're a halfway decent programmer, you'll find it's almost a snap.

  2. They teach math all wrong in school. Way, WAY wrong. If you teach yourself math the right way, you'll learn faster, remember it longer, and it'll be much more valuable to you as a programmer.

  3. Knowing even a little of the right kinds of math can enable you to write some pretty interesting programs that would otherwise be too hard. In other words, math is something you can pick up a little at a time, whenever you have free time.

  4. Nobody knows all of math, not even the best mathematicians. The field is constantly expanding, as people invent new formalisms to solve their own problems. And with any given math problem, just like in programming, there's more than one way to do it. You can pick the one you like best.

  5. Math is... ummm, please don't tell anyone I said this; I'll never get invited to another party as long as I live. But math, well... I'd better whisper this, so listen up: (it's actually kinda fun.)


The Math You Learned (And Forgot)

Here's the math I learned in school, as far as I can remember:

Grade School: Numbers, Counting, Arithmetic, Pre-Algebra ("story problems")

High School: Algebra, Geometry, Advanced Algebra, Trigonometry, Pre-Calculus (conics and limits)

College: Differential and Integral Calculus, Differential Equations, Linear Algebra, Probability and Statistics, Discrete Math

How'd they come up with that particular list for high school, anyway? It's more or less the same courses in most U.S. high schools. I think it's very similar in other countries, too, except that their students have finished the list by the time they're nine years old. (Americans really kick butt at monster-truck competitions, though, so it's not a total loss.)

Algebra? Sure. No question. You need that. And a basic understanding of Cartesian geometry, too. Those are useful, and you can learn everything you need to know in a few months, give or take. But the rest of them? I think an introduction to the basics might be useful, but spending a whole semester or year on them seems ridiculous.

I'm guessing the list was designed to prepare students for science and engineering professions. The math courses they teach in high school don't do much to ready you for a career in programming, and the simple fact is that programming jobs are growing faster than jobs in every other branch of engineering.

And even if you're planning on being a scientist or an engineer, I've found it's much easier to learn and appreciate geometry and trig after you understand what exactly math is — where it came from, where it's going, what it's for. No need to dive right into memorizing geometric proofs and trigonometric identities. But that's exactly what high schools have you do.

So the list's no good anymore. Schools are teaching us the wrong math, and they're teaching it the wrong way. It's no wonder programmers think they don't need any math: most of the math we learned isn't helping us.

The Math They Didn't Teach You

The math computer scientists use regularly, in real life, has very little overlap with the list above. For one thing, most of the math you learn in grade school and high school is continuous: that is, math on the real numbers. For computer scientists, 95% or more of the interesting math is discrete: i.e., math on the integers.

I'm going to talk in a future blog about some key differences between computer science, software engineering, programming, hacking, and other oft-confused disciplines. I got the basic framework for these (upcoming) insights in no small part from Richard Gabriel's Patterns Of Software, so if you absolutely can't wait, go read that. It's a good book.

For now, though, don't let the term "computer scientist" worry you. It sounds intimidating, but math isn't the exclusive purview of computer scientists; you can learn it all by yourself as a closet hacker, and be just as good (or better) at it than they are. Your background as a programmer will help keep you focused on the practical side of things.

The math we use for modeling computational problems is, by and large, math on discrete integers. This is a generalization. If you're with me on today's blog, you'll be studying a little more math from now on than you were planning to before today, and you'll discover places where the generalization isn't true. But by then, a short time from now, you'll be confident enough to ignore all this and teach yourself math the way you want to learn it.

For programmers, the most useful branch of discrete math is probability theory. What's probability theory, you ask? Why, it's counting. How many ways are there to make a Full House in poker? Or a Royal Flush? Whenever you think of a question that starts with "how many ways..." or "what are the odds...", it's a probability question. And as it happens (what are the odds?), it all just turns out to be "simple" counting. It starts with flipping a coin and goes from there. It's definitely the first thing they should teach you in grade school, right after you learn Basic Calculator Usage.
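Since I keep insisting it's just counting, here's a quick sketch of both hands in Python; the only library call is math.comb, which is plain old n-choose-k. The layout of the calculation is mine, but the numbers are standard poker trivia.

  # Counting poker hands with nothing fancier than n-choose-k.
  from math import comb   # Python 3.8+

  total_hands = comb(52, 5)   # all possible 5-card hands: 2,598,960

  # Full House: pick a rank for the triple and 3 of its 4 suits,
  # then a different rank for the pair and 2 of its 4 suits.
  full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)   # 3,744 ways

  # Royal Flush: exactly one per suit.
  royal_flushes = 4

  print("Full House : 1 in", total_hands // full_houses)    # about 1 in 694
  print("Royal Flush: 1 in", total_hands // royal_flushes)  # 1 in 649,740

That's the whole branch of math in miniature: figure out how many ways something can happen, divide by how many ways anything can happen, done.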

I still have my discrete math textbook from college. It's a bit heavyweight for a third-grader (maybe), but it does cover a lot of the math we use in "everyday" computer science and computer engineering.

Oddly enough, my professor didn't tell me what it was for. Or I didn't hear. Or something. So I didn't pay very close attention: just enough to pass the course and forget this hateful topic forever, because I didn't think it had anything to do with programming. That happened in quite a few of my comp sci courses in college, maybe as many as 25% of them. Poor me! I had to figure out what was important on my own, later, the hard way.

I think it would be nice if every math course spent a full week just introducing you to the subject, in the most fun way possible, so you know why the heck you're learning it. Heck, that's probably true for every course.

Aside from probability and discrete math, there are a few other branches of mathematics that are potentially quite useful to programmers, and they usually don't teach them in school, unless you're a math minor. This list includes:

  • Statistics, some of which is covered in my discrete math book, but it's really a discipline of its own. A pretty important one, too, but hopefully it needs no introduction.

  • Algebra and Linear Algebra (i.e., matrices). They should teach Linear Algebra immediately after algebra. It's pretty easy, and it's amazingly useful in all sorts of domains, including machine learning.

  • Mathematical Logic. I have a really cool totally unreadable book on the subject by Stephen Kleene, the inventor of the Kleene closure and, as far as I know, Kleenex. Don't read that one. I swear I've tried 20 times, and never made it past chapter 2. If anyone has a recommendation for a better introduction to this field, please post a comment. It's obviously important stuff, though.

  • Information Theory and Kolmogorov Complexity. Weird, eh? I bet none of your high schools taught either of those. They're both pretty new. Information theory is (veeery roughly) about data compression, and Kolmogorov Complexity is (also roughly) about algorithmic complexity. I.e., how small can you make it, how long will it take, how elegant can the program or data structure be, things like that. They're both fun, interesting, and useful.

There are others, of course, and some of the fields overlap. But it just goes to show: the math that you'll find useful is pretty different from the math your school thought would be useful.

What about calculus? Everyone teaches it, so it must be important, right?

Well, calculus is actually pretty easy. Before I learned it, it sounded like one of the hardest things in the universe, right up there with quantum mechanics. Quantum mechanics is still beyond me, but calculus is nothing. After I realized programmers can learn math quickly, I picked up my Calculus textbook and got through the entire thing in about a month, reading for an hour an evening.

Calculus is all about continuums — rates of change, areas under curves, volumes of solids. Useful stuff, but the exact details involve a lot of memorization and a lot of tedium that you don't normally need as a programmer. It's better to know the overall concepts and techniques, and go look up the details when you need them.

Geometry, trigonometry, differentiation, integration, conic sections, differential equations, and their multidimensional and multivariate versions — these all have important applications. It's just that you don't need to know them right this second. So it probably wasn't a great idea to make you spend years and years doing proofs and exercises with them, was it? If you're going to spend that much time studying math, it ought to be on topics that will remain relevant to you for life.

The Right Way To Learn Math

The right way to learn math is breadth-first, not depth-first. You need to survey the space, learn the names of things, figure out what's what.

To put this in perspective, think about long division. Raise your hand if you can do long division on paper, right now. Hands? Anyone? I didn't think so.

I went back and looked at the long-division algorithm they teach in grade school, and damn if it isn't annoyingly complicated. It's deterministic, sure, but you never have to do it by hand, because it's easier to find a calculator, even if you're stuck on a desert island without electricity. You'll still have a calculator in your watch, or your dental filling, or something.

Why do they even teach it to you? Why do we feel vaguely guilty if we can't remember how to do it? It's not as if we need to know it anymore. And besides, if your life were on the line, you know you could perform long division on arbitrarily large numbers. Imagine you're imprisoned in some slimy 3rd-world dungeon, and the dictator there won't let you out until you've computed 219308862/103503391. How would you do it? Well, easy. You'd start subtracting the denominator from the numerator, keeping a counter, until you couldn't subtract it anymore, and that'd be the remainder. If pressed, you could figure out a way to continue using repeated subtraction to estimate the remainder as a decimal (in this case, 0.118856782, or so my Emacs M-x calc tells me. Close enough!)

You could figure it out because you know that division is just repeated subtraction. The intuitive notion of division is deeply ingrained now.
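In case the dungeon scenario ever comes up, here's the repeated-subtraction method as a little Python sketch. The function is my own toy version, not anything out of a textbook, and it's gloriously inefficient, which is exactly the point: it only uses subtraction and a counter.

  def divide(numerator, denominator, decimal_places=10):
      # Long division as repeated subtraction: count how many times the
      # denominator fits, then keep going one decimal digit at a time.
      quotient = 0
      while numerator >= denominator:
          numerator -= denominator
          quotient += 1

      digits = []
      for _ in range(decimal_places):
          numerator *= 10          # shift to the next decimal place
          digit = 0
          while numerator >= denominator:
              numerator -= denominator
              digit += 1
          digits.append(str(digit))

      return str(quotient) + "." + "".join(digits)

  print(divide(219308862, 103503391))   # prints 2.1188567821

No memorized algorithm required; just the intuition that division is repeated subtraction.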

The right way to learn math is to ignore the actual algorithms and proofs, for the most part, and to start by learning a little bit about all the techniques: their names, what they're useful for, approximately how they're computed, how long they've been around, (sometimes) who invented them, what their limitations are, and what they're related to. Think of it as a Liberal Arts degree in mathematics.

Why? Because the first step to applying mathematics is problem identification. If you have a problem to solve, and you have no idea where to start, it could take you a long time to figure it out. But if you know it's a differentiation problem, or a convex optimization problem, or a boolean logic problem, then you at least know where to start looking for the solution.

There are lots and lots of mathematical techniques and entire sub-disciplines out there now. If you don't know what combinatorics is, not even the first clue, then you're not very likely to be able to recognize problems for which the solution is found in combinatorics, are you?

But that's actually great news, because it's easier to read about the field and learn the names of everything than it is to learn the actual algorithms and methods for modeling and computing the results. In school they teach you the Chain Rule, and you can memorize the formula and apply it on exams, but how many students really know what it "means"? So they won't know to apply the formula when they run across a chain-rule problem in the wild. Ironically, it's easier to know what it is than to memorize and apply the formula. The chain rule is just how to take the derivative of "chained" functions — meaning, function x() calls function g(), and you want the derivative of x(g()). Well, programmers know all about functions; we use them every day, so it's much easier to imagine the problem now than it was back in school.
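If it helps, here's the chain rule as code, with a finite-difference check standing in for running the tests. The particular functions (I'll call the outer one f and the inner one g) are my own picks, chosen only because their derivatives are easy to write down.

  import math

  def g(x):        return x * x        # inner function
  def g_prime(x):  return 2 * x
  def f(u):        return math.sin(u)  # outer function
  def f_prime(u):  return math.cos(u)

  def chain_rule(x):
      # d/dx f(g(x)) = f'(g(x)) * g'(x)
      return f_prime(g(x)) * g_prime(x)

  def numeric_derivative(fn, x, h=1e-6):
      # Plain finite-difference estimate; no calculus required.
      return (fn(x + h) - fn(x - h)) / (2 * h)

  x = 1.7
  print(chain_rule(x))                              # the formula's answer
  print(numeric_derivative(lambda t: f(g(t)), x))   # should agree to ~6 decimal places

Once you see it as "the derivative of a function call that wraps another function call", the formula is a lot harder to forget.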

Which is why I think they're teaching math wrong. They're doing it wrong in several ways. They're focusing on specializations that aren't proving empirically to be useful to most high-school graduates, and they're teaching those specializations backwards. You should learn how to count, and how to program, before you learn how to take derivatives and perform integration.

I think the best way to start learning math is to spend 15 to 30 minutes a day surfing in Wikipedia. It's filled with articles about thousands of little branches of mathematics. You start with pretty much any article that seems interesting (e.g. String theory, say, or the Fourier transform, or Tensors, anything that strikes your fancy.) Start reading. If there's something you don't understand, click the link and read about it. Do this recursively until you get bored or tired.

Doing this will give you amazing perspective on mathematics, after a few months. You'll start seeing patterns — for instance, it seems that just about every branch of mathematics that involves a single variable has a more complicated multivariate version, and the multivariate version is almost always expressed with matrices and systems of linear equations. At least for applied math. So Linear Algebra will gradually bump its way up your list, until you feel compelled to learn how it actually works, and you'll download a PDF or buy a book, and you'll figure out enough to make you happy for a while.
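Here's the smallest illustration of that jump I can think of, using NumPy (assuming you have it installed). The one-variable equation 3x = 6 becomes a matrix equation Ax = b, and "dividing" becomes solving the system; the specific numbers are just made up for the example.

  import numpy as np

  # One variable: 3x = 6.
  print(6 / 3)                      # 2.0

  # Several variables at once: the same idea becomes A x = b.
  #   3x + 1y = 9
  #   1x + 2y = 8
  A = np.array([[3.0, 1.0],
                [1.0, 2.0]])
  b = np.array([9.0, 8.0])
  x = np.linalg.solve(A, b)         # the multivariate analogue of dividing by 3
  print(x)                          # [2. 3.]

That pattern, one equation per row and one variable per column, is what keeps turning up no matter which applied branch you wander into.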

With the Wikipedia approach, you'll also quickly find your way to the Foundations of Mathematics, the Rome to which all math roads lead. Math is almost always about formalizing our "common sense" about some domain, so that we can deduce and/or prove new things about that domain. Metamathematics is the fascinating study of what the limits are on math itself: the intrinsic capabilities of our formal models, proofs, axiomatic systems, and representations of rules, information, and computation.

One great thing is that the notation soon stops being a barrier. Mathematical notation is the biggest turn-off to outsiders. Even if you're familiar with summations, integrals, polynomials, exponents, etc., if you see a thick nest of them your inclination is probably to skip right over that sucker as one atomic operation.

However, by surveying math, trying to figure out what problems people have been trying to solve (and which of these might actually prove useful to you someday), you'll start seeing patterns in the notation, and it'll stop being so alien-looking. For instance, a summation sign (capital-sigma) or product sign (capital-pi) will look scary at first, even if you know the basics. But if you're a programmer, you'll soon realize it's just a loop: one that sums values, one that multiplies them. Integration is just a summation over a continuous section of a curve, so that won't stay scary for very long, either.
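Here's that translation spelled out, with my own toy examples: sigma is a loop that adds, pi is a loop that multiplies, and an integral is the same adding loop run over thin slices of a curve.

  # Capital sigma: a loop that sums.
  total = sum(k * k for k in range(1, 11))     # sum of k^2 for k = 1..10 -> 385

  # Capital pi: a loop that multiplies.
  product = 1
  for k in range(1, 6):
      product *= k                              # 5! = 120

  # An integral: the summing loop again, over thin slices of a curve.
  def integrate(f, a, b, steps=100_000):
      dx = (b - a) / steps
      return sum(f(a + (i + 0.5) * dx) * dx for i in range(steps))

  area = integrate(lambda x: x * x, 0.0, 1.0)   # should be close to 1/3
  print(total, product, round(area, 6))         # 385 120 0.333333

The notation is denser than the code, but it's describing the same loops.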

Once you're comfortable with the many branches of math, and the many different forms of notation, you're well on your way to knowing a lot of useful math. Because it won't be scary anymore, and next time you see a math problem, it'll jump right out at you. "Hey," you'll think, "I recognize that. That's a multiplication sign!"

And then you should pull out the calculator. It might be a very fancy calculator such as R, Matlab, Mathematica, or even a C library for support vector machines. But almost all useful math is heavily automatable, so you might as well get some automated servants to help you with it.

When Are Exercises Useful?

After a year of doing part-time hobbyist catch-up math, you're going to be able to do a lot more math in your head, even if you never touch pencil to paper. For instance, you'll see polynomials all the time, so eventually you'll pick up on the arithmetic of polynomials by osmosis. Same with logarithms, roots, transcendentals, and other fundamental mathematical representations that appear nearly everywhere.

I'm still getting a feel for how many exercises I want to work through by hand. I'm finding that I like to be able to follow explanations (proofs) using a kind of "plausibility test" — for instance, if I see someone dividing two polynomials, I kinda know what form the result should take, and if their result looks more or less right, then I'll take their word for it. But if I see the explanation doing something that I've never heard of, or that seems wrong or impossible, then I'll dig in some more.

That's a lot like reading programming-language source code, isn't it? You don't need to hand-simulate the entire program state as you read someone's code; if you know what approximate shape the computation will take, you can simply check that their result makes sense. E.g. if the result should be a list, and they're returning a scalar, maybe you should dig in a little more. But normally you can scan source code almost at the speed you'd read English text (sometimes just as fast), and you'll feel confident that you understand the overall shape and that you'll probably spot any truly egregious errors.

I think that's how mathematically-inclined people (mathematicians and hobbyists) read math papers, or any old papers containing a lot of math. They do the same sort of sanity checks you'd do when reading code, but no more, unless they're intent on shooting the author down.

With that said, I still occasionally do math exercises. If something comes up again and again (like algebra and linear algebra), then I'll start doing some exercises to make sure I really understand it.

But I'd stress this: don't let exercises put you off the math. If an exercise (or even a particular article or chapter) is starting to bore you, move on. Jump around as much as you need to. Let your intuition guide you. You'll learn much, much faster doing it that way, and your confidence will grow almost every day.

How Will This Help Me?

Well, it might not — not right away. Certainly it will improve your logical reasoning ability; it's a bit like doing exercise at the gym, and your overall mental fitness will get better if you're pushing yourself a little every day.

I've noticed that a few domains I've always been interested in (including artificial intelligence, machine learning, natural language processing, and pattern recognition) use a lot of math. And as I've dug in more deeply, I've found that the math they use is no more difficult than the sum total of the math I learned in high school; it's just different math, for the most part. It's not harder. And learning it is enabling me to code (or use in my own code) neural networks, genetic algorithms, Bayesian classifiers, clustering algorithms, image matching, and other nifty things that will result in cool applications I can show off to my friends.

And I've gradually gotten to the point where I no longer break out in a cold sweat when someone presents me with an article containing math notation: n-choose-k, differentials, matrices, determinants, infinite series, etc. The notation is actually there to make it easier, but (like programming-language syntax) notation is always a bit tricky and daunting on first contact. Nowadays I can follow it better, and it no longer makes me feel like a plebeian when I don't know it. Because I know I can figure it out.

And that's a good thing.

And I'll keep getting better at this. I have lots of years left, and lots of books, and articles. Sometimes I'll spend a whole weekend reading a math book, and sometimes I'll go for weeks without thinking about it even once. But like any hobby, if you simply trust that it will be interesting, and that it'll get easier with time, you can apply it as often or as little as you like and still get value out of it.

Math every day. What a great idea that turned out to be!

Wednesday, March 15, 2006

Blog Or Get Off The Pot

I hate blogs.

It's not that I hate them, really; I just don't like the "diary" format. It doesn't suit my needs very well.

However, I have this day job, you know, just like you probably do. And that takes up most of my time. If there's any time left, I'd like to spend some amount of it actually blogging, as opposed to dicking around endlessly with software and configuration just to enable me to blog the way I want to. If I'm going to dick around with coding stuff, I'd rather do something other than that.

But I hate blogs. The format's not right. Chronological ordering of my posts just plain sucks. It forces you to dig around through monthly archives, wondering if there's anything good in all the crap I spew out.

What I want is closer to my own personal Reddit, or Digg, or something like that (but not quite like them, either). I don't want you to come to my site and see what I've posted most recently. RSS can take care of that. I want you to see what other people have voted as my best entries. On my site. You shouldn't have to go to Reddit to get a decent directory of my blog. Anyway, you can't; anything that makes it onto Digg or Reddit or del.icio.us is mixed in with everything else, and gets bumped down into oblivion by stuff that's newer and/or better.

Blog entries should be organized by popularity, not time. Or ideally, you can pick either one. Why the hell don't blogs do that?

But it's got to be even better than that. Having a self-organizing browse interface like Digg's on my blog page would be nice, but I want more.

For instance, I want inline comments. Putting everyone's comments at the end is pretty lame. Even worse, most blog packages these days don't even seem to have Slashdot-style threaded commenting. Instead, the comments are ordered chronologically, just like the entries. So the comment threads are invisible, and commenters have to say stuff like "um, actually I was replying to Dave, not Peter", and quote each other heavily -- a whole subculture of commenter hacks, just to fake threading.

Why the hell don't blogs have threaded comments? Sure, sure, some do. But that's still not enough. Not by a long shot. They need threaded comments and inline comments.

What do I mean by inline comments? I want people to point at a particular sentence I wrote, get all excited or pissed off about it, and say "I want to comment on THIS! This point right here!" So when you're reading, you should get to that part, and see a little icon or link that takes you off to a comment thread on that particularly interesting or disputed section.

In other words, the commentary on a blog entry should grow outward, not downward.

And I want versions. I want to make changes to my entries sometimes -- heck, frequently. But that's culturally weird, and feels dishonest to me, because I've sort-of permanently overwritten the old version. I think people have a right to see how my ideas changed over time, after they yelled at me or made brilliant observations or whatever. So people should be able to see the revision history for individual posts.

It's starting to sound slightly Wiki-like, isn't it? Yeah, a little. But what I want doesn't exist today. It's not a blog, and it's not a wiki, but it's similar to both of them. I want an essay-publishing system, basically.

All that revision-history stuff complicates the commenting, of course; each comment has to keep metadata about which blog revision it was talking about. It's even more complicated if you have versioned comments, so users can go back and fix their typos or change their minds. But it's not like that stuff's impossible, is it? Aren't there companies whose full-time efforts are going into making cool blog software?
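For what it's worth, here's roughly the data model I have in mind, sketched in Python. Every name in it is hypothetical; this isn't how any existing blog package works, it's just the shape of the thing I'm describing.

  from dataclasses import dataclass, field
  from datetime import datetime
  from typing import List, Optional

  @dataclass
  class Revision:
      number: int                    # 1, 2, 3, ... per post
      body: str
      created_at: datetime

  @dataclass
  class Comment:
      author: str
      body: str
      post_revision: int             # which revision of the post it's about
      anchor: Optional[str] = None   # id of the sentence/section it points at (inline comments)
      parent: Optional["Comment"] = None   # reply-to, for real threading
      votes: int = 0

  @dataclass
  class Post:
      title: str
      revisions: List[Revision] = field(default_factory=list)
      comments: List[Comment] = field(default_factory=list)
      votes: int = 0                 # lets the front page sort by popularity instead of date

      def current(self) -> Revision:
          return self.revisions[-1]

      def edit(self, new_body: str) -> Revision:
          rev = Revision(len(self.revisions) + 1, new_body, datetime.now())
          self.revisions.append(rev)   # old versions stay around, visibly
          return rev

Because every comment records the post revision and an anchor, inline comments, threading, and honest edit history all fall out of the same structure, and the popularity-sorted front page I was asking for earlier is just a sort on votes.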

Blogs have evolved into the dominant form of self-publishing, and yet nobody's doing it right. To me, that can only mean one of two things, both depressing. Either nobody's been clever enough to figure out an interface that actually works for people who aren't just posting their daily cat picture, or web programming is so insanely hard that nobody's been able to get features out fast enough to keep pace with the ideas.

It's probably a little of both. But my God, if they're hurting for ideas, all they need to do is ask. They're bursting from me; I can't keep them in anymore.

Want some? OK. Here's one: I'm sick of global configuration options. Global config options are so 1970s. When I go into Firefox, I want to be able to override every single configuration option on a per-page basis, or even better, with url pattern-matching rules. Doesn't that seem just patently obvious? And yet there aren't any browsers that let you do that. (Or are there? You tell me. Wouldn't it be nice if you could put an inline comment right here letting me and all the readers know?)
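To be concrete about what I'm asking for, here's a sketch of how url pattern-matched overrides could resolve. The option names and patterns are invented for illustration, and as far as I know no browser actually works this way.

  from fnmatch import fnmatch

  # Global defaults, then per-URL-pattern overrides; later matching rules win.
  DEFAULTS = {"javascript": True, "images": True, "font_size": 16}
  OVERRIDES = [
      ("*://*.example-news-site.com/*", {"javascript": False}),
      ("http://intranet/*",             {"font_size": 14, "images": False}),
  ]

  def config_for(url):
      # Resolve the effective settings for a single page.
      settings = dict(DEFAULTS)
      for pattern, overrides in OVERRIDES:
          if fnmatch(url, pattern):
              settings.update(overrides)
      return settings

  print(config_for("http://intranet/wiki/Welcome"))
  # -> {'javascript': True, 'images': False, 'font_size': 14}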

Same goes for my blog software: I want per-post configuration. It seems like I should be able to specify different stylesheet templates for each entry if I want — at least for different categories. Technical posts should have a different stylesheet from the posts about my last vacation.

And I should be able to change the settings for how I'm notified about comments on a per-post basis, because I'll care more about some of them than others.

Isn't all this stuff obvious? How can people not think of this stuff?

Oh, and not to put too fine a point on it, but how about a decent content-editing tool? Blog software packages can't seem to get this simple thing right. You pretty much get simple "convert line breaks" behavior, or you can embed HTML tags and screw with them for hours until you figure out how to make the blog HTML renderer behave the way the exact same tags work in every other page in your browser. If you're lucky. Sometimes it's just impossible, and you have to live with their screwed-up interpretation of the spacing before/after an ordered list or a heading element or whatever.

Can it really be that hard to get this stuff right? O'Reilly's group blogs don't even put a frigging blank line after a heading element, so your first sentence is smooshed right up under the heading.

I'm not asking for WYSIWYG here; I realize that's almost impossible given the amazingly crappy mix of browser technology standards we have to work with today. All I'm asking for is something halfway decent, like you get from any Wiki worth its salt. RedCloth would be my personal preference, but gosh, just about any wiki-style markup language would be preferable to the current choices ("convert line breaks" or "embed HTML tags that don't work properly") that most blog systems give you.

So I've been meaning to set up a public blog for nearly a year, and I haven't done it because all the blog-hosting options are just so wrong. I've been struggling with this whole issue -- not having the right self-publishing software -- and wondering whether to try writing it myself, or to just bite the bullet and live with the crap that's out there today.

Giving it the Ole College Try

How would I approach writing it myself? Well, I'd probably use Ruby on Rails. I don't think it's the end of the evolutionary road for web programming, not even close, but damn if it isn't so much farging better than the alternatives. I plan to blog about my vision for web programming of the future at some point. I think you'll like it. But for now, we've got Rails.

With Ruby on Rails, you can actually think about writing your own blog software. That level of grandiose ambition (and it is grandiose if you have as little free time as I do) enters the realm of plausibility. Until recently, it just wasn't a feasible part-time effort, because, let's face it, Web Programming is a Big Crap Sandwich, and we all have to take a bite of it. But Rails makes the idea at least conceivable.

So the other day, I poked at it a little. I set up a Rails server with Apache and SCGI, and then immediately thought -- HMMMMmmmmm, if Rails is so cool, then maybe someone's already written a blog software package in Rails. I could set that up, and since it's Rails, and I'm familiar with it, I could start evolving the software in the directions I want it to go.

Well, it's true. There is one. It's called Typo. But I have no idea if it's cool or not, because you can't install it. I mean, you can, but not in the amount of time I'm willing to allot to the effort. There's almost no documentation, and what docs exist are all completely outdated, because apparently most software developers still don't understand that nobody will use their shyte if they don't document it. The Typo folks are far from the only ones guilty of this.

And as if the Typo doc situation weren't bad enough, Rails doesn't have a model yet for hosting multiple applications in the same Rails codebase. You have to manage them as separate code trees, and handle all the routing in your webserver. Which is pretty lame, if you think about it, because Rails has a really cool routing-rules minilanguage, so why the hell can't you use it for your routing? So I have to keep Typo and any other Rails apps I want to run in their own little jail cells, with Apache as the illiterate prison guard who can't understand the complex instructions I want to give it.

Dammit, what's a person to do?

I wonder how software engineers set their priorities, sometimes. Are the Rails folks even thinking about the multiple-app installation problem?

And look at MovableType, which still evidently doesn't have threaded comments, but they support group blogs. But what good is a group blog? They're just weird. I haven't seen one that works well. Maybe there are a few out there. But what percentage of all blogs do they really constitute? It doesn't seem like it was the most important feature to add.

I'm just speculating, though, since I haven't looked at MT in a long time. I'm not a big fan of Perl anymore, and I'd rather throw my weight behind software written in languages I at least like a little bit.

And yeah, I'm sure you'll all want to tell me to please try your blogging software cuz its k00l. But I can't frigging install them all, only to find they suck as bad as Typo did. I've installed MovableType before, and it took at least half a day of tedious labor. I'm sure YOURS will also be a day of work. Forget it; there's no point in wasting precious days evaluating your probably-broken software and digging around your probably-missing documentation. There are plenty of hosts out there, and after looking at Typo, I decided to use one.

Supporting languages I like is partly why I picked Blogger; it's got a mix of Java and Python folks working on it, or so I've heard, and I think Java and Python are a darn sight better than Perl. Python, especially. It's a good language. I also picked Blogger because I have some friends who work there. And I hear it's going to get better at some point in the future. And "Blogger" is a cool name. So I'm willing to live with its inadequacies for now.

So I'm going to start blogging. It was blog, or get off the pot. Posting static HTML pages was ridiculous. (But hey, I honestly didn't think anyone was going to read them. It didn't really become a problem worth thinking about until late December.)

What will I blog about?

I have a lot of stuff to say. For one thing, I've realized that I have more ideas than time. Way more. It's not even close. I've been hoarding ideas for several years now, hoping I'd find time to get around to implementing (or documenting, or publishing, or whatevering) them someday. But it's not going to happen. Software and documentation both take time to write. Lots of time. If I come up with ten ideas that take 5 years each, I'm pretty screwed.

I did a 5-year idea once. I had an idea for a game in 1993. I knew exactly what game I wanted to play. I could envision it and even describe it in great detail. It was a fusion of all my favorite games to date: Nethack, Darker Realms LP Mud, Ultima IV, and Crossfire. A multi-player, tile-based, graphical MUD. I knew it would be a wonderful game, and yet nobody was writing it, or even discussing it. After a year or so, I realized nobody was thinking about it like I was. But I wanted to play the thing, so I decided to buckle down and write it myself. I figured it'd take me about 18 months. I mean, it's a tiles game; how hard could it really be? So I started working on a design in 1996, and started coding in 1997.

18 months, what a joke of an estimate. The game was pretty much "done" the way I'd originally envisioned it after about 5 years. I spent another 2 years pushing it to new levels; after actually having it online, new ideas started really coming in, both from me and from the players. After a total of 7 years, I threw in the towel, because my productivity had slowed to a crawl, all because of Java. God, I wish I'd picked Python. Or Common Lisp, or something that supports dynamic development. But in 1996, Java had great marketing, so that's what I picked. Thanks, Sun! You screwed me!

Half the game's written in Jython now; it simply isn't possible to do an extensible game like that in Java. You either write your own minilanguage or you use a JVM language; those are your only choices. People are adding code to the game right now, as we speak, and they're changing it on the fly. They can't go rebooting the f***ing server with a hundred players online every time they want to make a change. Java just doesn't work for that kind of application. Nor does C++. So all online games use dynamic extension languages. No exceptions.

Five years, that's how long that little idea took. Plus 2 years of adding bells and whistles. And that, folks, is why I'm giving in and using a blog host. Because I have lots of ideas, lots of things to say and to explore, but only a few dozen years left at best. I can't afford to go chasing every 5-year idea that springs to mind. After due consideration of the crap web frameworks and crap blog packages out there, I decided that I don't want to spend my next 5 years implementing my vision for a decent self-publishing system. Someone else can do it.

But those static HTML pages had to go. I had to blog or cut bait.

So here we go. I'm blogging now. I hope you'll like it!