The blogs "Dynamic Language Weenies Victorious After All (http://diagrammes-modernes.blogspot.com/2007/07/dynamic-language-weenies-in-default.html)" and "Invasion of the dynamic language weenies? (http://blogs.adobe.com/shebanation/2007/03/invasion_of_the_dynamic_langua.html)" are responses to "Invasion Of The Dynamic Language Weenies (http://www.hacknot.info/hacknot/action/showEntry?eid=93)". Unfortunately, http://www.hacknot.info/hacknot/action/showEntry?eid=93 is no longer easily accessible, but I managed to retrieve the article from the Internet Archive (http://www.archive.org/).
Invasion Of The Dynamic Language Weenies
26 Mar 2007
Weenie:
Noun
1. a frankfurter or similar sausage
2. informal: one who displays a degree of enthusiasm for some subject or activity disproportionate to the import that most would afford it.
I thought I had become inured to it. And yet it still takes me by surprise whenever I come across further evidence of the near total absence of critical
thinking in the software development arena. The notion of "evidence" seems to be considered naive and outdated by most programmers. It seems
you can make just about any claim that you want and the programming public will eat it up and then later regurgitate it as if it were a self-evident
truth. There appears to be no difference between truth and marketing. Research, reason and rationality have given way to bluster, bravado and bull.
Such were my thoughts upon reading a piece in the February 2007 edition of IEEE Computer entitled "Programmers Shift to Dynamic
Languages" 1 by one Linda Dailey Paulson. The article's basic premise is that dynamic languages (DLs) are experiencing something of a
renaissance at the moment. Even though DLs such as Perl and PHP have been around for many years, with the recent advent of Ruby, DLs have
become the hot new thing. All the kids are using them. The article has a few select quotes from some of these kids, regarding the near supernatural
power of dynamic languages. There's a token skeptic, professor Les Hatton of Kingston University, who supplies what little reasoned thought this
puff piece can lay claim to, and then the brochure — I mean article — ends. This was also the point at which I reached for a bucket.
I've watched with increasing amusement and irritation as the Cult of Ruby has gathered adherents over the last year or so, smiling as the
predictably effervescent enthusiasms of the fan boys frothed and bubbled their way around the urinal of developer discourse. The behavior of the
Rubyists is more than a little reminiscent of what we've seen previously from the Extreme Programming crowd. In fact, a number of the most vocal
Ruby weenies are former XP weenies. But then, they do say that survivors of one cult are quite likely to later become involved with another. I
suppose XP isn't cool now that the buzz has subsided.
But for those missing that "born again" spirit that so pervaded the XP movement, despair not. For it is now available in plentiful supply from the
Ruby community. To illustrate their fervor, here are a few testimonials from members of the congregation, moved by the spirit of Ruby:
I can't put my finger on exactly what I love so much about Ruby, but it makes coding seem cool again.2
Ruby is a fully object-oriented language with a nice, clean syntax, making both development and maintenance faster and easier
than in other languages. Ruby brings back the fun into programming! 3
Vitamin R. Goes straight to the head. Ruby will teach you to express your ideas through a computer. You will be writing stories for
a machine. Creative skills, people. Deduction. Reason. Nodding intelligently. The language will become a tool for you to better
connect your mind to the world. I've noticed that many experienced users of Ruby seem to be clear thinkers and objective. (In
contrast to: heavily biased and coarse.)4
Ruby doesn't need to be more safe. It certainly doesn't need static type checking. What we need is to kick the idiots out, and
educate those with less understanding.5
Finally, languages like Ruby (with all its domain specific flavors, such as Rake and Ruby on Rails) are like red wine. Red wine is
that special gamut of products that demand incredibly high level of devotion and finesse, thus creating its own breed of
aficionados[sic]6
I never thought I'd be able to describe anything related to programming as elegant or beautiful, but sure enough, Ruby on Rails is
all of that and more.7
...the finest and most useful language available today8
If Ruby fever continues to grow unabated, the language will soon be touted as an antidote to poverty, disease and social injustice. Indeed, you can
already clothe the needy with Ruby t-shirts9 and lingerie10.
One expects such immature tripe amongst the Internet free-for-all. But I expected better, or at least hoped for it, from an official publication such as
IEEE Computer. I was therefore disappointed to see unjustified (and in some cases, not even potentially justifiable) claims made by DL
enthusiasts being repeated without challenge or comment.
What really piqued me was the following quote from Stephen Deibel, chairman of the Python Software Foundation11 and CEO of Wingware12, a
company whose principal offering is Wing IDE — an IDE for Python development. In support of the claim that DLs make for more compact code,
the author offers the following quote from Deibel:
Python would use 10 to 20 percent of the amount of code that Java or C++ would use to write the same application.
Upon reading this, I burst out laughing — in response to which my cat, the only other occupant of the room at the time, offered me a very strange
look indeed, as if to say "Did I miss something?" The quote is given without qualification, which suggests that it is meant to apply in any
development context. On the face of it, Deibel seems to be claiming that for any given Java or C++ application, one could realize a functionally
equivalent version in Python in one fifth the volume of code or less. I couldn't believe anyone would have the audacity to make so bold a claim.
Actually, I had trouble imagining how it might be true even within limited domains. Though I've principally used Java for the last 10 years or so, and
C/C++ for five years preceding that, I have a basic familiarity with Python, having written a few utilities here and there with it. Reflecting on those
experiences I could see no basis for such a startling claim to code brevity.
But it's not only this claim made for Python that I find impossible to reconcile with my own experience and reason. I find many of the claims made for
other DLs to be equally dubious. I have a basic knowledge of DLs such as Perl and Ruby, and have worked with PHP quite a bit. Reflecting on my
experiences with these, I cannot find any support for the grandiose claims made by proponents of some of these languages either.
But before considering Python as an exemplar of a dynamic language, let's first consider what it means in general for a language to be dynamic.
What Is A Dynamic Language?
There is no canonical definition of a Dynamic Language, but some generalizations are possible. A dynamic language generally possesses one or
more of the following features:
1. Dynamic typing - Variables need not be declared before use. Their type is not determined until runtime.
2. Runtime code modification - Class hierarchies and other logical aspects of the code's structure can be modified on the fly e.g. methods may
be added to classes / objects.
3. Interpretation - Source code is read at runtime, translated into machine code as it is read, then executed.
Dynamic languages include JavaScript, PHP, Tcl, Ruby and Python. They are often referred to as "scripting" languages.
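The first two of these features can be sketched in a few lines of Python (a minimal illustration; the class and method names here are invented for the example):

```python
class Greeter:
    def __init__(self, name):
        self.name = name  # dynamic typing: 'name' has no declared type

g = Greeter("world")

# Runtime code modification: attach a new method to the class on the fly.
def greet(self):
    return "Hello, " + self.name

Greeter.greet = greet
print(g.greet())  # existing instances see the new method immediately
```

Nothing about `Greeter` needed to be declared in advance; the class is simply another mutable object at runtime.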
By contrast, static languages have the following general features:
1. Static typing - Variable and parameter types must be explicitly assigned so that they are known at compile time.
2. Fixed code structure - Classes and other code structures are immutable at runtime.
3. Compilation - Code is compiled into some intermediate form, often machine code, which is later executed.
Static languages include C, C++ and Java.
Python exhibits all three of the DL features listed above, but only the first two exercise any influence over code volume (NB: To be precise, Python
interprets compiled byte-code). So if Deibel's claim were generally true, it would have to be a result of dynamic typing, runtime code modification or some as
yet unidentified Python feature. Which feature, or combination of features, might yield these terrific code savings?
Can dynamic typing result in a reduction in code volume of 80 to 90 percent? It seems highly improbable to me. Suppose I had a piece of Java
code and went through it removing all the type qualifications. Would that reduce the code volume by 80 to 90 percent? No, I don't think so. Maybe
there are other omissions that dynamic typing would make possible, such as obviating adapter-style classes and some interfaces. But even
including such omissions, claiming an 80 to 90 percent code saving seems a bit rich.
As an aside, let me point out that the removal of explicit typing is not without its disadvantages, even though it may reduce code volume to some
extent. Consider that when you remove type qualification from the source code, you're also removing some documentation of your intent. Given the
paucity of documentation in many code bases, it seems unwise to start removing type-related documentation as well. Consider also — if it's good
form to give your identifiers meaningful names to aid in comprehension, isn't it equally good form to explicitly assign them types, for exactly the
same reason?
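The documentation point can be made concrete by comparing two versions of the same hypothetical function. (Python's optional type annotations, standardized in PEP 484, postdate this 2007 article, but they illustrate exactly the intent-documentation that explicit typing provides.)

```python
# Without annotations, the reader must infer intent from names alone:
def total(items, rate):
    return sum(items) * (1 + rate)

# With annotations (optional in Python; added to the language years after
# this article was written), the signature documents the same intent that
# a static type declaration would -- at a cost of very few characters:
def total_annotated(items: list[float], rate: float) -> float:
    return sum(items) * (1 + rate)
```

The "saving" from omitting the types is a handful of characters; the cost is that the reader must reconstruct what `items` and `rate` are allowed to be.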
What about Python's other dynamic feature, runtime code modification? Can it result in a reduction in code volume of 80 to 90 percent? I find this
impossible to assess, as I have insufficient familiarity with self-modifying code. But I might mention that this lack of experience is quite intentional.
Even when I find myself using a language that permits runtime code modification, it is a feature I am loath to use, for it seems to me to hold great
potential for producing a code base that is hard to understand, and therefore difficult to debug and maintain. Additionally, there is significant
cognitive complexity that goes along with dynamically modifying code, not unlike the intrinsic difficulty that accompanies multi-threaded code. In my
view, this level of complexity is to be avoided if possible.
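A minimal sketch (with invented class names) of the kind of maintenance trap at issue: a runtime patch, applied far from the class definition, silently changes behaviour that a reader of the original code cannot see.

```python
class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        self.balance -= amount

# Imagine this patch living in a distant module: it silently changes the
# behaviour of every Account in the program.
def withdraw_with_fee(self, amount):
    self.balance -= amount + 1  # a fee no reader of Account expects

Account.withdraw = withdraw_with_fee

a = Account(100)
a.withdraw(10)
print(a.balance)  # 89, not the 90 the class definition suggests
```

To debug this, one must know not just what the class says, but everything the running program may have done to it.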
On the whole, no matter how hard I tried, I simply could not convince myself that dynamic typing and runtime code modification, either individually or
in concert, could produce a code base that was "10 to 20 percent" the size of its static equivalent. Either the quote was inaccurate, had been taken
out of context, or Deibel was engaging in some fairly outrageous marketing hyperbole.
I figured the only way to find out which was the case was to go directly to the source, so I wrote to Deibel enquiring after the authenticity of the quote
and whether it had been taken out of context. Had he really meant to make such a broad claim? If so, what evidence — anecdotal or (preferably)
empirical — did he have to back up his claim? After all, if this claim of an across-the-board reduction in code volume were true, we Java
developers had some serious thinking to do! Not only that, but Fred Brooks' prescience might be in doubt, for he once wrote:
Not only are there no silver bullets now in view, the very nature of software makes it unlikely that there will be any - no inventions that will
do for software productivity, reliability, and simplicity what electronics, transistors, and large-scale integration did for computer
hardware. We cannot expect ever to see twofold gains every two years.13
Deibel responded in short order, confirming that indeed the quote had been taken out of context and wasn't intended to "apply to everything, such
as implementing the same sorting or image processing algorithm in Python, C++, and Java." 14
Whew - I knew it must've been a mistake.
"However," he continued, "it is true for many kinds of development when you consider the application as a whole (rather than a particular
algorithm)".
Oh dear — so the scope of the claim wasn't all kinds of development, it was just "many kinds of development". That's not quite the amount of
reservation I had hoped for.
Deibel offered two primary reasons for the code savings he claims exist:
1. The large number of third-party libraries and modules that are available for Python. He claims that these "make many operations that require
a lot of boiler plate code in C++ or Java into a 1-2 line task".
2. The absence of "static type declaration and syntactic sugar" in Python. "For example," he wrote, "'duck typing' is often used with no
declared interfaces".
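Deibel's second point, "duck typing" with no declared interfaces, can be illustrated with a short sketch (names invented for the example). Where Java would require an explicit interface declaration, Python simply calls the method and relies on it being present:

```python
class Duck:
    def quack(self):
        return "quack"

class Person:
    def quack(self):
        return "I'm quacking"

def make_it_quack(thing):
    # No declared interface: we simply call the method and rely on the
    # object having it. The check happens at runtime, not compile time.
    return thing.quack()

print(make_it_quack(Duck()))
print(make_it_quack(Person()))
```

The interface declaration is indeed gone; so, of course, is any compile-time guarantee that `thing` actually has a `quack` method.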
Let's consider these supporting claims further.
The present IEEE Computer article was discussing the relative merits of static and dynamic languages and Deibel's "10 to 20 percent" quote was
offered in support of the ostensible benefits of DLs. But any saving in code that occurs from the use of third-party libraries has nothing to do with
the dynamic nature of Python. In fact, a static language benefits from such libraries just as much as a dynamic language would. So we can't claim
any such savings as an inherent benefit of Python in particular, or DLs in general.
That leaves us with the absence of "static type declaration and syntactic sugar" as the only pillar supporting the "10 to 20 percent" claim.
Before considering that further, let me point out that Deibel is not the only one enamored of dynamic typing. If the assertions of other DL advocates
are to be believed, the savings it offers are quite staggering. Amongst such advocates is David Hansson, creator of Ruby on Rails, who is quoted
in the IEEE article as saying:
... the time it takes for static typing to enforce all rules throughout the process can make it unreasonably slow to get projects to fruition.
This may be an even bolder claim than Deibel's! While Deibel was content to claim that dynamic typing leads to a reduction in code volume, albeit
a large one, Hansson is claiming that dynamic typing makes a difference at the project level, between a reasonable and "unreasonabl[e]"
completion time! To point to typing as an issue dominating project scheduling is a pretty magnificent claim. Once again, I was incredulous and felt
compelled to question Hansson directly as to the basis for his statements. As Carl Sagan once said, "Extraordinary claims require extraordinary
evidence", and Hansson was certainly making an extraordinary claim.
Like Deibel, Hansson responded promptly, but in less detail. With regard to empirically derived data, he indicated that he "[was] not aware of any
such hard evidence", but that his statement was based on "personal experience and the experience of others retold"15. But he added that these
conclusions were "very rarely [obtained] through direct comparison", but came from "the feeling of 'I get more done faster on this project using
X than I did on the other project using Y.'" So we're not even talking about a comparison of static vs. dynamic languages here, but a comparison
of entire projects where use of a DL was one of the differentiators. Who knows how many other confounding factors might influence one's
perception that work on one project was more efficient than on another? And who knows what factors might've influenced this "feeling" of faster
progress? We're not even taking measurements or performing a retrospective analysis of code bases ... we're just talking about a "feeling".
Oh dear, I thought. I'm getting a bit of a feeling myself — which is that Hansson's feeling that dynamic typing makes a difference at the project level
is no more significant than my feeling that it doesn't! In other words, neither of us really knows. Somehow I'd imagined that such categorical
statements as Hansson's would not be made without some very solid evidence supporting them. I wonder how many other readers of the IEEE
article made that same inference?
The need for some corroborative evidence must have occurred to the article's author, I thought. Surely no one could document such bold (and in the
case of Deibel, precise) claims and not wonder "How do you know that?" So while I was on a roll, I shot off an e-mail to Linda Paulson to see if
there hadn't been some editing of her article that had resulted in the omission of some qualifications and questions regarding these quotes. I got
no response. Hmmm.
But at least Deibel was able to offer some sort of reply to my request for empirical evidence. He pointed me to a paper by Lutz Prechelt entitled
"Are Scripting Languages Any Good?"16 This paper documents an experiment in which 80 implementations of the same set of requirements
were compared with respect to such qualities as run-time performance, memory consumption, code volume, correctness and development speed.
The results provide a basis for comparison between individual languages and groups of languages. Unfortunately, like much "research" that goes
on in our field, although well intended it is methodologically flawed to such an extent that its conclusions are meaningless. If you read the paper
carefully, you will discover such methodological errors as:
A self-selecting participant group, which introduces bias to an unknown extent.
Data set trimming, i.e. exclusion of some participants' entries because "they did not work properly."
Participants self-reporting their work times, often by making retrospective guesses of time spent amidst other activities such as watching TV
and doing housework.
Some participants chose to read the requirements days prior to beginning implementation, while others only read them immediately
beforehand.
Different instructions given to script and non-script programmers. Script programmers were told the broader criteria that their entries might
be judged against, but non-script programmers were told to focus on correctness and not to over-optimize their solutions, but to deliver the
first one that had 'reasonable' performance. Highly inefficient non-script entries were sent back to the author for optimization.
Script and non-script programmers who misinterpreted the requirements and delivered solutions which failed automated acceptance tests
were treated differently. Script programmers were given hints and additional instructions, but non-script programmers were just pointed at a
particular sentence in the requirements and told to read it extremely carefully.
Script programmers were given the test input and correct output files which would ultimately be used for acceptance testing. Non-script
programmers were only given a few such samples, and had acceptance tests run against randomly generated test input files.
Although the author tries to excuse these errors in various ways, he does so only by making unwarranted assumptions about the extent of their
influence over the end result.
Now let's examine some of the specific claims made by dynamic language enthusiasts such as Deibel and Hansson ...
Claim: Dynamic Typing Increases Development Speed
You'll recall that most DLs are dynamically typed. According to DL proponents, this de-emphasis upon typing has some pretty amazing benefits.
You'll recall from the above that both of Deibel's claims for DLs rest upon it. But what evidence do we have to support that claim and others such as
the following, again from the IEEE article:
Dynamic languages let developers get results more quickly, said Blackfriars' Howe. Their code is more compact because, for example,
there is no type-declaration overhead, Deibel explained
I don't think we can say that it's intuitively obvious that the omission of "type-declaration overhead" results in a reduction in code volume that is
significant. As mentioned previously, it certainly isn't obvious to me. I accept that it may well lead to some reduction in code volume, but it doesn't
follow that reduced code volume leads to increased development speed (see below, Claim: Reduced Code Volume Increases Development
Speed). There are many interrelated factors influencing overall development speed. It's not valid to consider one factor in isolation.
To reiterate, the abbreviation that brings ostensible benefit to the original author of the code may be an impediment to others' comprehension of
the code. Further, how do we know that the presence of compile-time type checking is not a nasty inconvenience, but an efficient eliminator of even
nastier runtime errors caused by unanticipated type coercions? Without a detailed assessment of the negative effects of dynamic typing, how can
we possibly determine its net effect on development speed? It seems to me that we can't. But it also seems that many DL enthusiasts are willing to
ignore such inconvenient issues, even when directly confronted by them.
For instance, I notice that in "Programming Ruby", at the top of Dave Thomas' list of techniques for debugging Ruby problems17 is "Run your
scripts with warnings enabled" — in other words, examine the sort of information that a compiler would already have given you, were you using a
static language! This tendency to exaggerate the advantages of some language feature while deliberately ignoring the disadvantages of that same
feature is characteristic of much DL advocacy.
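The "nastier runtime errors caused by unanticipated type coercions" mentioned above can be sketched in Python (a hypothetical example): a mismatch that a static compiler would reject at build time surfaces only when, and if, the offending line actually executes.

```python
def add_tax(price):
    return price * 1.1

value = "100"  # e.g. read from a config file and never converted to a number

try:
    add_tax(value)  # fails only when this code path runs
except TypeError as err:
    print("caught at runtime:", err)
```

In a statically typed language this mismatch would be reported before the program ever ran; here it lurks until the right (or wrong) input arrives.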
For my part, I've long recognized that the weakest link in the development chain is not the tools I'm using, but myself. Compared to the cost of my
own mistakes, the cost of the tools I use must be trivial, given that I spend most of my average working day catching problems of my own creation.
How did those bugs get there? Sadly, I put them there. I'd be surprised if the experience is much different for other developers, if they're honest with
themselves. This knowledge guides my choice of tools. I want tools that are going to be most effective at finding the mistakes I make. Therefore I
prefer a statically typed language, because I've found that compiler warnings about type mismatches are generally a sign that I've made some
programmatic or conceptual error. I haven't written what I intended to.
Indeed, the gap between what I wrote and what I intended is where so many bugs lie, that I have come to appreciate the value of being as precise
as possible in specifying my intent. Therefore I use Design By Contract, because each contract is one more opportunity for me to spell out to
myself, the compiler, and future maintainers, exactly what I intend and what assumptions I'm making. I like to use static analysis tools like PMD 18,
that scour my code looking for likely points of failure — a task that is computationally more difficult to perform on a dynamic language. I prefer to
use a language with automatic memory reclamation, because I know from my C/C++ days that I tend to make mistakes with memory allocation and
deallocation, and the resulting bugs can be incredibly difficult to find.
An awareness of my own limitations is the most significant factor in determining my approach to software development as a whole. That's why the
dynamic language features that a weenie would call "flexible" I would call "an opportunity to err".
Claim: Interpretation Increases Development Speed
Most DLs are interpreted rather than compiled. This means that your test/debug cycle no longer includes an explicit "compile" or "link" step which,
DL weenies claim, makes for a "development cycle [that] is dramatically shorter than that of traditional tools."19 Maybe so, but how significant
an impact does that have on overall development speed? You don't have to wait for the compile/link to occur, but you still have to wait for the thing to
execute so you can test it, and interpreted languages are often slower to execute than compiled ones. So how much do you really gain overall? And
there are other confounding factors. Perhaps the compile-time type checking of a static language serves to catch bugs that a DL won't pick up until
runtime, in which case some of your test/debug cycles for a DL will be spent catching type-related problems that a static language would've
detected before execution began. I say "perhaps" because I just don't know — I'm only speculating. I don't have the data and so I can only guess as
to what the net effect is on development speed. The problem is, the dynamic language weenies don't appear to have the data either; but somehow
they've attained surety in its absence. Again we see the advantages of a feature (interpretation) being extolled but its disadvantages being ignored.
There also seems to be a recurring straw man used regarding the slowness of the compile/link for static languages. The DL weenies like to tell
horror stories of how expensive this operation is so as to make interpretation look even better by comparison. For example:
But I've worked in C++ environments where programmers joked about having to "go to lunch" whenever they recompiled their systems.
Except they weren't really joking.20
I recall doing maintenance and extension work on a system written in C containing approximately one million lines of code. To build that entire
system from scratch took several hours. But that didn't mean that every test/debug cycle we performed was punctuated by that same delay. There
were large parts of the code base that we were not involved in changing, and they were built into libraries once at the beginning of the project, and
thereafter we just linked against them. The net result was that in each test/debug cycle we were compiling only a small fraction of the code base,
then linking against the pre-built libraries. This took seconds, not hours. It's part of basic build management when working with a static language
that you only recompile what you have to. The DL weenies would like you to believe that a burdensome compile/link time is an unavoidable
downside of using static languages when in fact, it is not.
Claim: Reduced Code Volume Increases Development Speed
Many language comparisons performed by DL enthusiasts focus on concision21 as if it were the sole measure of programmatic worth. However
there are many aspects of a code base that contribute to an overall assessment of its condition, volume being only one of them. Structure,
commenting, consistency, naming, performance and maintainability all influence the efficiency with which programmers can work on a body of
code. I would say the primary influencers are the qualities of the programmers themselves.
One of the least significant factors influencing development speed is the amount of typing you have to do. We have IDEs with numerous keystroke-
saving mechanisms built in, such as smart editors, keyword expansion, templates, macros and so on. If code volume were significant as an
influencer of overall productivity, then we would expect touch typists to be at a significant advantage over the hunt-and-peck typists who are so
common in our occupation. (I can't imagine "the next big thing" being touch typing, resulting in IEEE articles like "Developers Move to Touch
Typing", peppered with quotes from Mavis Beacon.) So from the perspective of one writing code, code volume is not as big an issue as you might
think.
But for one reading the code, particularly if they are not the original author, the amount to be read and understood is probably more important — but
to what extent we can only guess. Further confusing the issue is that there are numerous factors affecting total code volume other than the
terseness of the language's syntax. So how can we justify the claim that a particular syntax will necessarily have a particular effect on code volume,
or that code volume in turn will have a particular effect on development speed? We can't.
Yet there is no shortage of syntax-obsessed language comparisons22 of the form:
Here's a toy task implemented in SSL (some static language). Observe that it takes 20 lines of code to implement. Oh, the humanity!
Now here's the same task implemented in SDL (some dynamic language). It only takes 10 lines. Therefore you're twice as productive in
SDL as in SSL.
Even worse, the examples chosen in such comparisons are often selected so as to deliberately highlight the syntactic strengths of the DL in
question. Language design, as with much of design in general, is a tradeoff between opposing design forces. Any particular compromise of these
forces results in a language with syntactic strengths in one area, but corresponding weakness in those areas requiring opposing characteristics.
So for any DL/SL comparison, you can easily arrive at the apparent "result" that your pre-existing bias dictates by carefully choosing the examples
upon which your comparison is based.
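The cherry-picking is easy to demonstrate with a hypothetical pair of tasks: one chosen to flatter Python's syntax, and one where the logic dominates and the dramatic line-count advantage largely disappears.

```python
# Task chosen to flatter the DL's syntax: one line of Python.
squares = [n * n for n in range(10) if n % 2 == 0]

# Task where the logic dominates: the line count now depends on the
# algorithm, not the language, and the dramatic saving evaporates.
def moving_average(values, window):
    if window <= 0 or window > len(values):
        raise ValueError("bad window size")
    result = []
    for i in range(len(values) - window + 1):
        result.append(sum(values[i:i + window]) / window)
    return result
```

A comparison built entirely from examples like the first will "prove" whatever its author set out to prove.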
Claim: Support From Major Companies Legitimizes DLs
Some advocates, in an effort to bolster the perceived credibility of their favorite language, look for any sign of support that it may have received
from a major corporation as proof that it is not just a passing fad, but is here to stay. Otherwise, they reason, why would this company be investing
in it?
Let's not be so naive. Corporations are out to make money. They are not interested in advancing the software development field, championing an
ideology or improving your productivity, except in so far as those activities will yield some financial return. If a company appears to be hopping upon
your favorite DL's bandwagon, they are simply acknowledging "There is money to be made here." Commercial exploitation is not a hallmark of
engineering worth. It will be in many corporations' best interests to actively fuel the hype surrounding a DL in order to generate sales of their own
related products and services. Hence the advent of the execrable job title "Technology Evangelist".
When the dynamic language hysteria dries up — and it surely will — these same companies will eagerly pursue the very next trend to surface, just
as soon as they can find something to sell to the niche market it has created. We've seen this same pattern so many times throughout the history of
this industry, its recurrence is utterly predictable. Recall that when OO first came to the fore, companies everywhere were suddenly remarketing
their existing fare as being object oriented. All you had to do was change the terminology employed in your products (and conveniently, just about
anything can be labeled an "object") and suddenly your marketing strategy had a new impetus. When XP and Agile Methods came along, it was
fashionable for a company to promote itself as "agile"; it made them appear contemporary, as if they were intelligently monitoring advances in the
state of the art and applying them quickly to great advantage; although in reality many were just running mindlessly after the latest buzzwords. And
now that there's some DL brouhaha, it's inevitable that companies are going to be clambering to get on board and portray themselves as
"dynamic".
Claim: As the Problems Change, People Use New Languages
The present IEEE article concluded with a statement from Carl Howe, principal analyst for Blackfriars Communications, a market research firm. He
is also apparently a "longtime Unix coder". He claims:
Languages are tools for programmers. As the problems change, people use new languages.
I suggest that of all the factors driving the current interest in DLs, a changing problem set is about the least important. Consider the newest arrival in
the DL family, Ruby. What sorts of initial efforts are we seeing from that camp, but the reimplementation of solutions to problems already well
solved by other languages - some of them dynamic! In particular, there is a rush to build libraries so that Ruby can claim comparable library support
to Python, and a scramble to provide execution environments so that Ruby might also claim comparable performance to Python.23 Is all this frenetic
development activity being driven by domain considerations? Hardly. These developers are playing "catch up" with other languages they perceive
as threatening to their beloved.
Even in the wider development community, the need to solve new problems is rarely to be found as a force driving adoption. Howe seems to be
suggesting that adoption of a new language occurs when a developer finds themselves facing a problem that is not addressed well by existing
languages, forcing them to search further afield for something more suitable. I suggest that the exact opposite is generally true. Enthusiasts start
with the desired solution in mind — the new toy they want to play with — and go looking for a problem to apply it to.
I saw this very effect in action on a recent gig. A particular component of a Java-based system needed to perform some basic data extraction
and translation tasks, shuffling data between one set of tables and another in a relational database. These transformations were necessary to
satisfy some input constraints of a downstream tool. The implementation vehicle of choice? Ruby. Now why on Earth would you choose an
interpreted language, both generally and specifically notorious for poor performance, to perform a processing-intensive task such as this? And why
introduce a new language into a project that future maintainers will have to know?
In this instance, there was no great secret as to the real reason Ruby was selected. The weenie in charge of such things had read one too many
blog entries about it and decided he had to "get some of that". Although it introduced a maintenance burden for the company, adding another
language into the mix also guaranteed some measure of job security for him — given that said weenie would be the only one on the premises that
knew Ruby. Fascination with new toys and a quest for job security - these are the forces I see driving the current wave of dynamic language
adoption. The problems being faced have nothing to do with it.
Claim: You Can Assess Productivity By Feel
One of the features distinguishing fact from delusion is the presence of measurable, reproducible evidence that permits independent and skeptical
scrutiny. That's how it is in most domains. But in software land, you don't need data and you don't need measurement. All you need is emotion.
Martin Fowler's description of how he assessed Ruby's effectiveness is as good an indictment of our field's maturity as you will ever find. He asked
some people nearby how they felt, and decided that his subjective assessment of their subjective assessments constituted a business case:
When I ask the question "Do you think you're significantly more productive in Ruby rather than Java/C#", each time I've got a strong
"yes". This is enough for me to start saying that for a suitable project, you should give Ruby a spin.24
Of course, as discussed many times previously on this site, there is no end to the ways that people can fool themselves into interpreting their
experiences in a manner that confirms their preconceptions. Trying to assess a complex factor such as productivity (however you might define it) on
the basis of gut instinct is so naive as to beggar belief. If these were the rules of evidence in court, we'd dispense with the formalities and simply
ask the jury "How do you feel about the defendant?"
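If productivity is to be discussed at all, even a crude, repeatable timing of a concrete task would be a step up from polling feelings. As a minimal sketch of what taking a measurement might look like, Ruby's standard Benchmark module can time a workload (the workload below is invented purely for illustration):

```ruby
require 'benchmark'

# A crude but repeatable measurement; the workload is invented
# purely for illustration and stands in for a real task.
data = (1..50_000).to_a

elapsed = Benchmark.realtime do
  data.map { |n| n * n }.select(&:even?).sum
end

puts format('workload took %.4f s', elapsed)
```

Timing one workload proves nothing about "productivity", of course; the point is only that a number, however rough, can be checked and reproduced in a way that a reported feeling cannot.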
Claim: Syntax Can Be Natural
One of the more desperate claims weenies are prone to making is that their DL of choice has a syntax that is somehow more natural, expressive
or intuitive than those of other languages. Sometimes they'll go even further, claiming that the DL syntax is so intuitive that it enables some sort of
direct mind-CPU connection. For an example of such metaphysical drivel, consider the following passage from Why's (Poignant) Guide to Ruby:
But what do you call the language when your brain begins to think in that language? When you start to use the language's own words
and colloquialisms to express yourself. Say, the computer can't do that. How can it be the computer's language? It is ours, we speak it
natively! We can no longer truthfully call it a computer language. It is coderspeak. It is the language of our thoughts.
Lest you think only Ruby weenies are capable of such twaddle, here's something similar from the Python camp:
Power, elegance, simplicity, equality, liberty, fraternity: This is heady stuff, and it explains the evangelical tone of some Python
programmers. I promote Python because doing so is the right thing.25
When a syntax is described as "natural", what is generally implied is that the syntax is intuitive. If someone unfamiliar with the language were to
read a piece of code written in that language, they would probably be able to work out what the code did. Similarly, if they predicted what sort of
syntax would be used to achieve a particular function, they would probably predict correctly.
Let's put both these claims to the test with a few examples. If you're unfamiliar with Ruby, see how natural the following code seems to you:
[1,3,5,7].inject(0) { |x,y| x + y }
Can you tell what it does? It's an expression whose value is the sum of all the elements in the array [1,3,5,7], which is 16. Let's try this exercise the
other way around. If you had not seen the above, what syntax would you think Ruby uses to sum the elements of an array? Here's a guess:
for every x in [1,3,5,7] { total += x }
Why did I guess this particular construct and not any of the infinite number of alternatives? Because this is what might seem "natural" to someone
with some programming background.
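The same point can be made from inside Ruby itself: the language offers several equally standard spellings of "sum the array", and none of them is guessable from first principles (a sketch in plain Ruby, no libraries assumed):

```ruby
nums = [1, 3, 5, 7]

# Three standard spellings of the same sum; each had to be learnt,
# none could be intuited by a newcomer:
a = nums.inject(0) { |acc, x| acc + x } # fold with an explicit seed
b = nums.reduce(:+)                     # fold via a symbol argument
c = nums.sum                            # a dedicated method (Ruby 2.4+)

puts [a, b, c].inspect # [16, 16, 16]
```

That three forms coexist, each idiomatic to a different cohort of Rubyists, rather undercuts the idea that any one of them is "natural".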
I'm using these rather well-chosen examples to illustrate what should be self-evident — all programming languages are arbitrary, artificial constructs
which must be learnt. They cannot be intuited. Further, what constitutes a "natural" syntax is infinite in scope and varies dramatically between
individuals depending on what languages they have been exposed to in the past, amongst other factors. The language designer chose a syntactic
option that was most natural to him at the time. Everyone else will simply have to learn by rote the difference between the designer's conception of
"natural" and their own. But if the syntax has to be learnt by rote, wherein lies the magic? Language syntax is no more natural than any other
abstract notation, such as mathematical formulae, circuit diagrams or UML. Any claim that a DL has a natural syntax is just an attempt to dignify
subjective preference. Any claim that a DL has a more natural syntax than another language is just an attempt to elevate one personal preference
over another.
Claim: A Strength Of My Language Is Its Community
I've never been particularly comfortable with the word "community" when applied in a technical context, even though I use it that way myself, for want
of a suitable synonym. Watching the behavior of a group of techies does not put me in mind of a "community" in the sense that most would use the
term.
I've heard both Python and Ruby weenies lauding the nature of the community surrounding their respective DLs, but can find little evidence to
support these claims. Watching the various online forums dedicated to each, I see only the same uninspiring character that I've come to expect from
such venues. A few people are helpful, a few people are rude, most are engaging in some tacit game of technical one-upmanship with everyone
else, and outbursts of condescension and name calling are frequent.
A particularly egregious example of the Ruby community (or some elements thereof) at work appeared recently in an online magazine called
"Bitwise".26 Bitwise had the audacity to publish an article that was somewhat critical of Ruby, written by one Matthew Huntbach, a lecturer in
Computer Science at the University of London. Huntbach's article, "What's Wrong With Ruby?",27 was a well-written piece, giving an academic's
perspective on the current surge of interest in DLs. It raised some interesting issues, such as the discrepancy between pedagogic value and
popular appeal that computer science educators struggle with. Huntbach describes his impressions of Ruby as he learnt to use it. He found it to
have qualities very much at odds with the claims made by the weenies. The hype says that Ruby is elegant and intuitive, but Huntbach found it to be
ad hoc, with a syntax that was not particularly obvious at all. He also made mention of "Why's (Poignant) Guide to Ruby" (already mentioned
here), describing it as patronizing, condescending and "horrid". I thought his comments were quite reasonable and to-the-point.
The response of the Ruby community to this article, as evidenced in the comments that follow it, was both revolting and enlightening. I suggest you
read them all to see the weenie mentality in full bloom. The specific criticisms that Huntbach made go largely unaddressed. Instead, there is an
instant resort to personal insult and emotional invective. "It would have been nice if this article had some real content", exclaims Jason Watkins.
Apparently literacy is not Watkins's forte, for the article had a good deal of content, all of which must surely qualify as "real". "Dear ghod [sic], I pity
your students", opines the no doubt eminently qualified Piers Cawley. "you fascist! It's pricks like you who delegitimize the field of programming
and suck the joy out of everything!" says one anonymous respondent with a tendency towards over-generalization. Jake, demonstrating a sound
grasp of the self-evident, proclaims "You ask 'how could I teach this to a bunch of first year undergraduates?' The answer is that you explain
what is going on." Another anonymous dilettante replies with "This could almost be a poster child for the adage those that do, do and those that
can't teach", apparently oblivious to his embarrassing misquote. "biz", with a perspicacity that cuts right to the heart of the matter, poses the
psychological quandary "Never mind what's wrong with Ruby, what's wrong with you?" Finally Bob puts it all in perspective for us: "The author
seems amazingly ignorant of the fact programming languages have a culture around them. Ruby's culture encourages creativeness and
innovative test-driven techniques. To truly understand a language, you have to see beyond your initial impressions of its overall features." I
wonder if Bob read any of these messages of encouragement from within this culture?
There's more — but you get the drift. Some Ruby weenies would say that these reactions are not representative of the Ruby community as a whole.
I'd say the comment thread as a whole is pretty much a spot-on snapshot of Ruby culture. The beautiful irony here is that the fan boys' passionate
attempts to defend their object of adoration are actually doing it a huge disservice. I think most people recognize that an emotional response is
often a cover for a lack of substantive argument. Were Huntbach's criticisms unfounded, a dispassionate statement of the facts would be enough to
counter them. By flaming away at the author, the weenies are effectively declaring "We've got nothing!" This is just an example of one language
community in action, but the same observations hold true for any other language community you might name. The beneficent throng, much alluded
to in the language hype, is never to be found when you go looking for it.
Claim: No Harm, No Foul
Perhaps the most disingenuous justification offered for unsubstantiated language hype is the common retort "Where's the harm? At worst, [insert
favorite DL here] is just one more tool in your toolbox", the idea being that any amount of hyperbole, evangelical marketing and straight-out
fabrication is excusable, as the worst that can happen is that developers will be aware of one more option when they next come to select an
implementation vehicle ... and more options have to be good, right?
This is the same milksop argument used by advocates of "Intelligent Design" (the latest marketing term for Creationism) when they try to justify the
teaching of ID alongside evolution in school science curricula. "Offer students both alternatives and let them decide for themselves on the weight
of the evidence", they say. When such statements are met with predictable (and justifiable) outrage by the scientific community, they
mischaracterise this indignation as scientific controversy and propose that science teachers should "Teach the controversy". But in truth, there is
no controversy.
Except for a few crackpots, the scientific community recognizes evolution as one of the most successful and widely supported theories in scientific
history. To put the substance-free ID on a par with evolution and the mass of evidence supporting it is a travesty. ID proponents are not harmless
bystanders with a minority opinion. They are generally religious zealots determined to use ID as a means of promoting their own religious agenda.
Bent on manipulating public opinion, they incur an enormous societal and financial cost by distracting science educators from their real business of
teaching science, forcing them to defend their curricula against church infiltration. Then, just as they are caught with their hands in the filing cabinet,
they feign indignation and try to portray themselves as weaklings who just want to be heard. I hope you can recognize the similarities between ID
advocates and the technical advocates that pollute our occupation. They too are out to push their own agendas (selling books, courses, services,
products or ideology) by exploiting the fascination with novelty that preoccupies so many. Sure ... push Ruby onto your next client, because it feels
like the right thing to do. Let them bear the consequences of your transient fascination. After all, it's just one more option, right? It's not like there are
any ramifications of choosing the wrong tool ... at least, not for you.
Conclusion: For Rationalists
Language advocacy seems to be mainly an exercise in selective observation. You focus only on the positive effects of the design choices behind
your favorite language's construction and deliberately ignore the inevitable negative effects of those same choices. It may sound silly and trite, but it
is also extremely common. DL advocacy is just the latest example of this ongoing phenomenon. DL advocates extol the rapid test/debug cycle that
comes from interpretation, but ignore the slower execution time which comes along with it. They praise dynamic typing for its convenience to the
original programmer, but ignore the loss of documentation for the maintainer. They belabor the benefits of a DL's brevity, and make claims for the
commensurate effect on project duration, ignoring the fact that these claims are unsubstantiated by anything but flimsy and unreliable anecdotal
evidence, influenced to an unknown extent by the personal biases of the observer.
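The documentation point is easy to make concrete. In a dynamically typed method nothing in the signature records what is accepted or returned; the hypothetical `translate` method below (its name and behaviour are invented for illustration) shows what a maintainer is left to reconstruct by reading the body and its call sites:

```ruby
# Hypothetical method: the signature says nothing about types.
# Only the body reveals that `record` must respond to #map,
# yielding key/value pairs; a maintainer must infer the rest.
def translate(record)
  record.map { |key, value| [key.to_s, value] }.to_h
end

p translate({ id: 1, name: 'x' }) # a string-keyed copy of the hash
```

In a statically typed language this information lives, machine-checked, in the signature; in a DL it must live in comments or tests, or nowhere at all.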
Then along comes a journalist, looking for material for a new article, who adds on their own bias through out-of-context quotation and further
selective observation, and suddenly the breathless enthusiasm of fan boys has become a hot new trend in the pages of IEEE Computer. Sad, isn't
it? There appears to be almost no demarcation between opinion, marketing, reporting and fact. It all blends together into a meaningless melange
of group-think.
The most meaningful contribution of the article in question comes from Les Hatton, in relating the fashion-driven nature of software development:
"They [dynamic languages] will appear, hang around for a while, and then disappear," he explained. "This is what happens when fashion
dictates progress rather than engineering concepts such as measurement, validation, root-cause analysis, and defect prevention."
Conclusion: For Weenies
At eight thousand words, this article requires far longer to read than the attention span of the average dynamic language weenie. I know most of
them will have skipped directly to the conclusion to save themselves valuable time; time that could be spent in devoted worship of their favorite DL.
In weenie land, knowledge of a criticism is not a prerequisite for self-righteous indignation. I also know that it doesn't matter how heavily I qualify my
statements and conclusions, weenies will exaggerate them all into straw man absolutes for the purposes of counterargument.
As a convenience to such folk, what follows is an expurgated version of the preceding piece, expressed in weenie-speak (which is characterized
by snide, sweeping conclusions and ferocious responses to slights manufactured in the mind of the reader):
Claim: Dynamic Typing, Interpretation and Reduced Code Volume Increase Development Speed
Reality: No they don't, either individually or together. Development speed is influenced by so many technical, psychological and organizational
factors that the effect of a set of programming language features on overall progress can be, at most, minimal. Further, your arguments in favor of
these language features deliberately ignore the manifest disadvantages of those same features. Software development is much more than just
programming, and programming is much more than coding. Get back in your high chair, from where you might get some perspective.
Claim: Support From Major Companies Legitimizes DLs
Reality: No it doesn't. Companies know that fan boys like you are easy marks - an enthusiastic and indiscriminate market segment ripe for
exploitation. They also know that you might spread your naive enthusiasms into your workplaces, opening up a corporate market for supporting
tools.
Claim: As the Problems Change, People Use New Languages
Reality: As languages change, people remain the same. Software development is now, and always has been, driven by an obsession with novelty,
and that is what drives language adoption. If there is a new problem to solve, that will simply make for a convenient excuse. Your misplaced
enthusiasm simply perpetuates a cycle of self-defeating behaviour that prevents software development maturing into a true profession.
Claim: You Can Assess Productivity By Feel
Reality: No you can't. You're just trying to justify personal preference by hiding it behind a legitimate but definitionally complex term. If you've never
taken measurements, you have approximately no idea what your productivity is like either with or without your favorite dynamic language. You
certainly can't assess the difference between the two.
Claim: Syntax Can Be Natural
Reality: All programming languages are arcane and cryptic, in different ways and to varying degrees. What is perceived as "natural" varies
tremendously between individuals, depending upon their experience and background. Your mischaracterisation of a syntax as "natural" is just an
attempt to retro-fit a philosophy to your personal preferences.
Claim: A Strength Of My Language Is Its Community
Reality: If it is, then you are in deep trouble, for your community appears to be dominated by juveniles who only take time out from self-gratification
long enough to wipe the byproducts off their keyboard, then mindlessly flame anyone who does not share their adolescent enthusiasms.
Claim: No Harm, No Foul
Reality: No Brain, No Pain.
References:
1. Programmers Shift to Dynamic Languages – Linda Dailey Paulson, IEEE Computer, Feb 2007
2. Like Rails, Love Ruby – http://www.markmcb.com/markmcb/blog/Like_Rails_Love_Ruby
3. Dynamic Ruby Gets Ready for the Enterprise – http://www.sdtimes.com/article/column-20060701-01.html
4. Why's (Poignant) Guide To Ruby – http://poignantguide.net/ruby/chapter-2.html
5. Java People Must Be Stupid – http://evang.eli.st/blog/2007/1/22/java-people-must-be-stupid
6. Ruby On Rails Is Like Red Wine – http://lesscode.org/2006/01/08/ruby-on-rails-is-like-red-wine/
7. Why I Love Ruby on Rails – http://v1.garrettdimon.com/archives/why-i-love-ruby-on-rails
8. Programming Ruby – Dave Thomas et al., Pragmatic Bookshelf, 2006
9. Stuff For Ruby Programmers – http://www.cafepress.com/mathomhouse/6101
10. "I Love Ruby" Classic Thong – http://www.cafepress.com/iloveruby.27352825
11. Python Software Foundation – http://www.python.org/psf/
12. Wingware – http://www.wingware.com
13. No Silver Bullet: Essence and Accidents of Software Engineering – Frederick P. Brooks, Jr.
14. E-mail to the author – 3 March 2007, sdeibel@wingware.com
15. E-mail to the author – 3 March 2007, david@loudthinking.com
16. Are Scripting Languages Any Good? – L. Prechelt, Universitat Karlsruhe, Germany, 18 Aug 2002
17. Programming Ruby – Dave Thomas et al., Pragmatic Bookshelf, 2006, p. 167
18. PMD – http://pmd.sourceforge.net/
19. Programming Python – Mark Lutz, O'Reilly & Associates, 1996 p. 702
20. Ibid – p. 699
21. Python & Java: a Side-by-Side Comparison – http://www.ferg.org/projects/python_java_side-by-side.html
22. A subjective analysis of two high-level, object-oriented languages – http://twistedmatrix.com/~glyph/rant/python-vs-java.html
23. The Impending Ruby Fracture – http://blog.lostlake.org/index.php?/archives/11-The-Impending-Ruby-Fracture.html
24. Evaluating Ruby – http://www.martinfowler.com/bliki/EvaluatingRuby.html
25. Why I Promote Python – http://www.prescod.net/python/why.html
26. Bitwise Magazine – http://www.bitwisemag.com/
27. What's Wrong With Ruby – Matthew Huntbach, 16 Mar 2007, http://www.bitwisemag.com/2/What-s-Wrong-With-Ruby