Recently I read an interesting article on ./ called "Programmers Are Expensive". The author has a valid point, especially where he comments on the "silly optimization syndrome". This syndrome is a programmer affliction wherein the subject drills deep down into the core of certain parts of the code while missing the trees, the avalanche about to engulf him and his idyllic brook, and the mountain falling on his head just behind the aforementioned avalanche.
Still, there is absolutely no way to beat an algorithm, or, as we old timers call them, "clever hacks". Let us run a thought experiment. Suppose you write code that uses a SQL database for storage, and you use JOIN statements often. Also suppose the application suddenly needs near real-time performance. How do you optimize your queries if one of the tables is small enough to fit in memory? If we follow the "throw money at the problem" approach we will certainly see some performance gain: better disks, better CPUs, more RAM. Each upgrade buys a boost, but only up to the limits of the hardware and no further. We still want better performance. Time to hack!
antecedent 1) Each JOIN statement is, at worst, an N*M cartesian product performed on disk.
antecedent 2) Get rid of the JOIN and perform two SQL queries instead. Store the result of the smaller one in a large enough hash table in RAM, so that a lookup costs expected O(1) (or O(log M) for a tree-based map), with the usual reservations about worst-case performance.
antecedent 3) For each row of the first query, look up the relevant data in the hash table.
Result ) The problem is now N lookups, each expected O(1) and performed in RAM: effectively linear performance, and it can be tuned further still. Congratulations, you have cut the cost of the problem by orders of magnitude, increased your hackness quotient, and saved enough money on faster disk arrays to afford a vacation in Mykonos next summer!
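The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not the original article's code: the `customers` and `orders` tables, their columns, and the sample rows are all made up for the example, and an in-memory SQLite database stands in for whatever SQL server you actually run.

```python
import sqlite3

# Hypothetical schema: "customers" is the small table that fits in RAM,
# "orders" is the large table we stream through row by row.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (10, 1, 9.99), (11, 2, 5.00), (12, 1, 3.50);
""")

# antecedent 2) Query the small table once and keep it in a hash table
# (a Python dict): lookups are expected O(1), with the usual worst-case caveats.
customers = {cid: name
             for cid, name in conn.execute("SELECT id, name FROM customers")}

# antecedent 3) Stream the large table and resolve the join in RAM,
# instead of asking the database for a JOIN across both tables.
joined = [(order_id, customers[cust_id], total)
          for order_id, cust_id, total
          in conn.execute("SELECT id, customer_id, total FROM orders")]

print(joined)
# N rows scanned once, each matched with one hash lookup: linear work in RAM.
```

The design point is the same one the steps make: the database no longer has to match N rows against M rows on disk; it just hands us each table once, and the dict does the matching.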
Moral of the story: money spent on an algorithms textbook gives a programmer far more value than the latest spiffy, glitzy silicon/metal fusion.