From: Paul Koning <pkoning AT chipcom DOT com>
To: djgpp mailing list
Subject: RE: deadly optimization
Date: Mon, 16 Jan 95 14:14:00 PST

>With 2.6.x there is no visible difference in speed between gcc and Watcom,
>which I think is the best optimizing compiler widely available for 386+.
>With such a beast any code handcrafting is a waste of time. Note how much
>time you spend writing inline code, and count how much time it can save
>during execution...

That's a generalization that may often be true, but it is not even close to
true in some cases.  High-speed real-time work may very well benefit
substantially from hand coding.

About a year ago I worked on a device that did packet switching in 68040
software; I got somewhere between 1.5x and 2x the performance of the C code
(compiled with GCC) by careful hand-coding.  Note "careful".  If you don't
understand EVERY detail of the instruction timing and cache behavior of your
particular platform, you don't know enough to do this.  However, if you do,
and you have the skill, and you design the data structures right as well,
then you stand to gain a lot.

The other question is whether that work is worth it.  Sometimes it is; often
it will not be.  But if you think it is, do the study; don't take the opinion
of people who think that compilers are as good as the best programmers.  They
aren't, not by a long shot.

(Note that the 68040 is a simpler architecture than an x86 by a large margin,
but that doesn't really change the point.  For that matter, it's valid for
RISC machines too, popular "wisdom" notwithstanding.)

>Then I also hope we will not be stuck with CrazyGlue to x86 CPUs for the
>rest of our lives, so investing the precious time to study the architecture
>of the species going to its decline is not very useful.

Amen.

        paul koning
        pkoning AT chipcom DOT com
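
As a concrete illustration of the point about data structure design above,
here is a minimal C sketch.  The struct, the field names, and the 32-byte
line size (a 68040 cache line) are illustrative assumptions, not details of
the device described in the post: the idea is simply to keep the fields the
per-packet loop touches together in one cache line, and to pad descriptors
so two of them never share a line.

    /* Hypothetical sketch: cache-line-conscious layout of a packet
       descriptor.  Names and the 32-byte line size are assumptions. */
    #include <stdio.h>
    #include <stdint.h>

    #define CACHE_LINE 32   /* assumed cache line size in bytes */

    struct pkt_desc {
        /* Hot fields, touched for every packet: kept adjacent so the
           per-packet inner loop sees at most one miss per descriptor. */
        uint8_t  *data;       /* packet buffer                  */
        uint16_t  len;        /* packet length in bytes         */
        uint16_t  out_port;   /* forwarding decision            */
        uint32_t  flags;      /* per-packet status bits         */

        /* Cold fields, touched only on errors or statistics dumps:
           placed after the hot fields, out of the fast path's line. */
        uint32_t  rx_errors;
        uint32_t  tx_errors;
    };

    /* Pad each descriptor to a whole number of cache lines so two
       descriptors never share a line (no false sharing between the
       receive and transmit sides). */
    union pkt_slot {
        struct pkt_desc d;
        char pad[((sizeof(struct pkt_desc) + CACHE_LINE - 1)
                  / CACHE_LINE) * CACHE_LINE];
    };

    int main(void)
    {
        printf("descriptor: %zu bytes, padded slot: %zu bytes\n",
               sizeof(struct pkt_desc), sizeof(union pkt_slot));
        return 0;
    }

The same layout decisions matter whether the inner loop ends up written in C
or in assembly; the hand-coding only pays off once the data already sits
where the processor wants it.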