Comparing Computer Languages
A cultural haze appears to surround languages. The choice of a computer language for a software project seems to be, first and foremost, a cultural issue.
Ajaxy Web 2.0 apps might be written in Ruby on Rails, enterprise applications and SOA architectures in Java, Windows desktop apps in C#, and Linux automation in Bash. Other Unices use ksh, and system daemons are written in C?
Is this decision made consciously? It would be irritating if it were not, given that computers, the things being programmed, are devices of mathematical precision.
Many programmers seem to learn one language and to remain within their original culture. The same goes for project managers.
Which means that many projects get written in the only language the people at hand know, or maybe the (only) language they know best.
Which also means that where there are better and worse implementation languages to choose from, many projects will possibly not use the best tools.
I assert that being able to program well in some language also means knowing it well. This knowledge comes mostly from usage, and a lot of usage will give you a lot of experience with the language. Accumulating that experience takes a lot of time.
It is said that it takes five years of programming to learn C++ - suggesting that C++ is hard. However, I think that five years is the lower bound for just about any contemporary language.
It's not only knowing the basic syntax; it's also knowing how the language behaves once you have a lot of code, how fast it executes under different circumstances, what its quirks are, what the traps and problem points are, what surrounding infrastructure exists for the language, and all the know-how surrounding those ...
I have been programming Bash in vim for some 15 years now and am absolutely amazed that the learning curve has not flattened; I feel that I am learning more about these two tools now than I ever did before.
So is the apprenticeship time for a good programmer maybe 20 years rather than five?
If it takes very long to learn to use some tool, then staying with it seems justified, since moving over to a different one means slowly losing your fluency with the first.
So is the above-mentioned inertia against switching languages maybe even a fortunate happenstance?
Or does it mean that you are using the same ole hammer for any and every computer task at hand?
My experience with different languages is that some are much, much more effective at solving certain types of problem: possibly ten times less code to write, to maintain and to debug. Which, as is commonly claimed, seems to mean ten times less effort too.
Does a massive increase in effectiveness beat the advantage of fluency and mastery of an established language?
How is it possible that apparently more effective languages are not replacing the more cumbersome ones in the competition of the marketplace? Programming is extremely expensive so how come it's not the best ones that propagate? Or are they?
My opinion on the issue is somewhat set: I think that there are better and worse languages. Some only for specific tasks, some in niche problem areas, but some generally.
The question, however, is how to find out when to use which one. Also of interest is which features of a specific language actually make it more powerful.
So I am setting out to find out and to cover some of that ground.
The first object of study is the little awql tool, which I originally wrote in Bash. Awql lets you do simple SQL queries on structured text files.
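To illustrate the idea behind such a tool (this is a sketch of the concept, not awql's actual interface, and the file name and query are made up for the example), a query in the spirit of "SELECT name WHERE uid >= 1000" over a colon-delimited file can be expressed in a line of awk:

```shell
#!/bin/sh
# Hypothetical sample data in /etc/passwd style: name:uid
printf 'root:0\ndaemon:1\nalice:1000\nbob:1001\n' > /tmp/users.txt

# The SQL-ish query "SELECT name WHERE uid >= 1000", done with awk:
# split fields on ':', filter on field 2, print field 1.
awk -F: '$2 >= 1000 { print $1 }' /tmp/users.txt
```

This prints the names alice and bob. A tool like awql essentially wraps this kind of filtering behind a friendlier SQL-like syntax.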
I'm going to try to translate it to various languages and see what happens.
Tomáš Pospíšek, 2009-09-24