Object-oriented programming is good. Heck, it's great. The code reuse that inheritance brings is a phenomenal paradigm shift for someone coming from strictly procedural languages. Once you realize just how inane the "Cats are mammals, Jane is a mammal, Jane and Cat both share the same fur attribute" introductions are, you've finally got object-oriented programming, and you're ready to write some real code that's laid out "right".

Problem is, this fancy new programming style, even once you "get" it, leads you into a brand new land of infinitely more architectural choices. Depending on the application's size, there can be a genuine need for a dedicated code architect who may never actually type a line of code him- or herself. I mean, this stuff gets messy even with an awfully small application.

The problem comes at you from two directions. You could, of course, cheese out and write what's essentially a procedural app using OO code. You can very easily pull repeated code out into objects that amount to no more than modules. You can use globals and high-level objects to keep writing the same sort of top-down apps you've always written. You can even stub out interfaces to discourage people from rolling their own objects for simple tasks. But when you do this, you're probably also missing out on much of what makes OO-style architecture great.
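To show what I mean by "objects that amount to no more than modules", here's a minimal sketch. The names (Globals, ReportUtils, ProceduralMain) are hypothetical, invented for illustration; the point is that everything is static, the state lives in one global bucket, and the control flow is the same top-down script you'd write in a procedural language:

```java
import java.util.ArrayList;
import java.util.List;

final class Globals {                 // a grab-bag of application-wide state
    static List<String> records = new ArrayList<>();
    static String outputPath = "report.txt";
}

final class ReportUtils {             // an "object" that is really just a module
    static void loadRecords() { Globals.records.add("row 1"); }
    static void printReport() {
        System.out.println("Writing " + Globals.records.size()
                + " records to " + Globals.outputPath);
    }
}

public class ProceduralMain {
    public static void main(String[] args) {
        ReportUtils.loadRecords();    // step 1
        ReportUtils.printReport();    // step 2: same old top-down flow
    }
}
```

It compiles, it runs, it works. It just isn't object-oriented in any way that buys you anything.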

The problem seems to be that most programmers, myself included, don't use OO quite right even when they try. Do centralized objects register listeners on every object (particularly GUIs) that might operate on them? Do GUIs have internal listeners that hook back into those same centralized objects through some interface prescribed by the GUI object's constructors? Do you run a simple middleman object to pass parameters back and forth, at the risk of that middle-object code looking like the worst assembly-language spaghetti you've ever run across? Or do you split your code into tons of manageable chunks, but risk losing good interobject communication?
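Here's a rough sketch of the first option, a centralized object that notifies listeners, with a GUI-ish class hooking back in through an interface. AccountModel, BalanceListener, and BalancePanel are names I'm making up for illustration, not the one true way to wire this:

```java
import java.util.ArrayList;
import java.util.List;

interface BalanceListener {
    void balanceChanged(int newBalance);
}

class AccountModel {                           // the centralized object
    private final List<BalanceListener> listeners = new ArrayList<>();
    private int balance;

    void addListener(BalanceListener l) { listeners.add(l); }

    void deposit(int amount) {
        balance += amount;
        for (BalanceListener l : listeners) {  // push the change to whoever cares
            l.balanceChanged(balance);
        }
    }
}

class BalancePanel implements BalanceListener { // stands in for a GUI widget
    @Override
    public void balanceChanged(int newBalance) {
        System.out.println("Panel repaints with balance: " + newBalance);
    }
}

public class ListenerDemo {
    public static void main(String[] args) {
        AccountModel model = new AccountModel();
        model.addListener(new BalancePanel()); // the GUI hooks into the model
        model.deposit(100);
    }
}
```

Clean enough at this size, but multiply it by a few dozen models and panels and you start to see where the middleman-object spaghetti temptation comes from.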

Trying to code OO right is truly a mess. All I do know is that I'm finding it much tougher to write a good, "normalized" program in Java than it ever was in Visual Basic. It's ultimately just as easy to write something that works in either, if you ignore the IDEs' advantages and disadvantages, but it's much, much harder to defend why you put which object in which package, or why a return value is a primitive type instead of a specialized object, or vice versa. It looks like I'm going to need to research conventional patterns (like those seen in this book) before I can mount an awfully good defense of my application's architecture.
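The primitive-versus-specialized-object question is the kind of thing I mean. Here's the same lookup done both ways; Account and Money are made-up names, and which side you land on is exactly the judgment call I'm struggling to defend:

```java
class Money {
    final long cents;
    final String currency;
    Money(long cents, String currency) { this.cents = cents; this.currency = currency; }
    @Override public String toString() { return cents / 100.0 + " " + currency; }
}

class Account {
    // Option 1: primitive return. Simple, but the caller has to remember
    // what unit and currency the number is in.
    double balanceAsDouble() { return 12.34; }

    // Option 2: specialized object. More ceremony, but the meaning
    // travels with the value.
    Money balanceAsMoney() { return new Money(1234, "USD"); }
}

public class ReturnTypeDemo {
    public static void main(String[] args) {
        Account a = new Account();
        System.out.println(a.balanceAsDouble());
        System.out.println(a.balanceAsMoney());
    }
}
```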

At the same time it's a much more enjoyable challenge, even if my head hurts more often than it did before. OO done right is a thing of beauty indeed. But never before have I more starkly seen the lesson from The Mythical Man-Month: "Write one to throw away." I like to think of it as, "Write one that works now that you'll heavily refactor tomorrow," but the pain of having something less than perfect is much greater in OO-land than in VB-land, where that kind of pain was both less frequent and less severe.