Monthly Archives: December 2008

Season's mournful greetings

Happy holidays, peace and prosperity to all, but most of all health for all the children of this world!

May Alex’s soul rest in peace and may his murder be a beacon to all of us for the liberty we have lost.


You can't bribe an algorithm

Recently I read an interesting article on /. called "programmers are expensive". The author does have a valid point, especially where he comments on the "silly optimization syndrome". This syndrome is a programmer affliction whereby the subject drills deeply down to the core of certain parts of the code while missing the trees, the avalanche about to engulf him and his idyllic brook, and the mountain falling on his head just behind the aforementioned avalanche.

Still, there is absolutely no way one can beat an algorithm, or as us old timers call them, "clever hacks". Let us have a thinking experiment. Suppose you write code that uses SQL databases for storage, and you use join statements often. Also suppose that the application suddenly needs near real-time performance. How do you optimize your queries if one of the tables is small enough to fit in memory? If we follow the "throw money at the problem" approach we will definitely see some performance gain: better disks, better CPUs, more RAM. With each and every upgrade we will get a performance boost, up to the limits of the hardware but no more. We still want better performance. Time to hack!

Step 1) A naive join is, in the worst case, an N*M cartesian product performed on disk.

Step 2) Get rid of the join and perform two SQL queries instead; store the smaller table (M rows) in a large enough hash table in RAM, so that a lookup in it is O(1) on average, with some reservations about worst-case performance.

Step 3) For each of the N rows returned by the other query, look up the relevant data in the hash table.

Result) The problem is now O(N + M): build the hash table in O(M), then do N constant-time lookups, all performed in RAM! That is linear performance, and it can be tuned further. Congratulations: you have reduced the cost by orders of magnitude, increased your hackness quotient, and saved enough money on faster disk arrays that you can afford a vacation in Mykonos next summer!
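The steps above can be sketched in a few lines of Python, using sqlite3 for the database and a plain dict as the in-RAM hash table (average O(1) lookup). The table and column names here are made up purely for illustration:

```python
import sqlite3

# Toy schema: `customers` is the small table (M rows), `orders` the big one (N rows).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (10, 1, 9.5), (11, 2, 3.0), (12, 1, 7.25);
""")

# Step 2: one query pulls the small table into a hash table -- O(M).
lookup = {cid: name for cid, name in conn.execute("SELECT id, name FROM customers")}

# Step 3: stream the big table and resolve each row in RAM -- O(1) per row on average,
# instead of asking the engine to join on disk.
joined = [(lookup[cid], total)
          for _, cid, total in conn.execute("SELECT id, customer_id, total FROM orders")]

print(joined)  # [('alice', 9.5), ('bob', 3.0), ('alice', 7.25)]
```

This is essentially a hand-rolled hash join; it pays off when the engine would otherwise fall back to repeated disk seeks for the lookup side.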

Moral of the story: money spent on an algorithms textbook gives much more value to a programmer than the latest spiffy, glitzy silicon/metal fusion.

Controlled Imbalance as a Method for Innovation Assimilation

It is extremely difficult to communicate innovative approaches, services, and methods from the think tanks to the lower echelons. The ability to assimilate change and new functionality is not the same at all levels of an enterprise. There is enough inertia in things getting done a certain – even dysfunctional – way that innovation has to fight it. Only _AFTER_ one proves that the new approach is better, faster _AND_ easier do people start to apply themselves to it. Still, the assimilation inertia can be a good thing: for example, one does not generally want a staunch, dependable and predictable employee to start goofing off with, let's say, the accounting books, or the delivery routes.

Controlled imbalance seems to be the key factor in innovation management. Put people a bit out of balance in their daily work, and they will eagerly assimilate a new, better way of doing things. For example, turn off that damned mail server that works half of the time, and they will find the time to be trained on the new server's settings. I could go on forever, but it is really easy if you think about it. Stability is what everyone craves, even a false sense of it. Innovation, especially the disruptive kind, affects operational stability. So if one removes that illusion, one can direct people towards the new paradigm.