
Monday, December 17, 2007

Programming: The anachronism that is APL

Originally posted to Slashdot in response to a comment suggesting the programming language APL might be of interest for parallel programming on the emerging multicore processor machines.


The programming language APL was totally brilliant for its time. APL is probably the only serious pictographic programming language ever devised, a fascinating characteristic that unfortunately even Kenneth Iverson (the originator of APL) gave up on when he resorted to ASCII digraphs in his later language J. I often complain that programming on our current keyboard, adapted as it was from existing non-computing devices, forced us to settle for the symbols * and / for multiply and divide because proper symbols were not available-- we're still using essentially that 50-plus-year-old keyboard standard purely for reasons of ancient practicality, not modern elegance. The original APL included a valiant but failed attempt to change that.

One big problem with APL as a pictographic language is that extending it implies a need for new pictograms over time, and a typewriter keyboard, even one redesigned for the language, is a poor candidate for such expansion. The earliest forms of APL kept that problem somewhat in check through overstrikes: rather than a separate key for each symbol, you learned a relatively manageable set of basic symbols that could be combined to produce the entire symbol set-- the circle overstruck with a backslash yields the transpose symbol ⍉, for example. Overstrikes were out of place on video terminals, however, and so that innovation gave way to video terminal keyboards with a vast array of stick-on symbols that one had to learn in order to write programs.
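The mechanics were simple enough to sketch: the terminal printed one character, backspaced, and struck a second character over it. A minimal illustration in Python (the character pairings are genuine APL overstrikes; note that most modern terminal emulators will simply show the second character, whereas a hardcopy terminal physically combined the two on paper):

    # Emit a character, backspace (\b), then a second character -- the overstrike idiom.
    # On a printing terminal the glyphs combine on the page; a modern emulator
    # typically just overwrites, so the composed Unicode glyph is shown alongside.
    overstrikes = {
        ("\u25cb", "\\"): "\u2349",  # circle + backslash -> transpose
        ("\u25cb", "|"):  "\u233d",  # circle + stile     -> reverse
        ("\u2206", "|"):  "\u234b",  # delta + stile      -> grade up
    }
    for (first, second), composed in overstrikes.items():
        print(f"{first}\b{second}   composed glyph: {composed}")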

APL is also about the most terse programming language ever devised, a crucially important characteristic at a time when dial-up baud rates were often 110-300 and memory systems were large in size but small in capacity. When it takes that long for characters to transmit, one-liner programs of a hundred or so pictographic symbols work pretty darn well.
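To make the terseness concrete: the classic APL idiom for the average of a vector, (+/X)÷⍴X, sums, counts, and divides in eight characters. A rough Python equivalent (my own illustration, not from the original post) spells out the same three steps:

    # The classic APL average idiom (+/X)÷⍴X, spelled out in Python:
    # +/X sums the vector, ⍴X takes its length, ÷ divides the two.
    def average(x):
        return sum(x) / len(x)

    print(average([1, 2, 3, 4]))  # 2.5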

And despite APL's inherent array operations, automatically parallelizing APL is a relatively crude means of taking advantage of multi-core processor systems: the setup and teardown overhead would require additional logic to determine whether the size of the arrays and/or the complexity of the operation warrants it-- logic that must run even when it concludes that parallelism is not appropriate for the operation. While programmer-guided parallelism would be more sensible and certainly feasible, even that would be insufficient to resurrect the language from its current status as largely an anachronism of the past, mistaken by some to be ahead of its time. I'm afraid the peak of its "time," though, was the late 1960s and early 1970s, roughly coincident with the IBM System/360 and System/370 mainframes. Once terminal output went to glass, APL began its divisions and its decline.
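A minimal sketch of that dispatch logic, assuming a hypothetical runtime with a hand-picked size cutoff (the function names, the thread-pool choice, and the threshold value are all my own, not drawn from any actual APL implementation):

    # Before parallelizing an array operation, check whether the work is large
    # enough to repay the setup/teardown overhead. Note that the threshold test
    # itself runs on every call -- the logic that must execute even when it
    # concludes that parallelism is not worthwhile.
    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    PARALLEL_THRESHOLD = 1_000_000  # hypothetical cutoff; below it, overhead dominates

    def adaptive_sum(x: np.ndarray, workers: int = 4) -> float:
        if x.size < PARALLEL_THRESHOLD:
            return float(x.sum())                    # serial path: no setup cost
        chunks = np.array_split(x, workers)          # setup: partition the array
        with ThreadPoolExecutor(workers) as pool:    # setup: spin up the pool
            partials = pool.map(np.sum, chunks)
        return float(sum(partials))                  # teardown: combine results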

APL's real failure has been the immense difficulty of standardization-- no two people (let alone APL vendors) ever thought about it alike, the various extensions and workspace formats differ significantly, and implementing a full APL is a complex undertaking as language implementations go. In fact, the APL community still seems to have a hard time deciding whether to maintain the pictographic character set or to resort to ASCII in the mistaken belief that ASCII will make the language more "acceptable." In a world where "write once, run anywhere" is an important goal, APL fares rather poorly in the transportability department.

While I'm sure there are some APL die-hards just waiting for the advent of many-core desktops as the chance for APL to "shine again," I, on the other hand, fully expect it to disappear as anything more than a historical curiosity, and fairly soon, as the original die-hards have been entering retirement age for some time now. And many of those not quite ready to retire, like Iverson, have chosen to move to the J language or some other ASCII-based array language, giving up one of the most important things that I think made APL special, if not uniquely valuable-- its pictographic character.

For me, APL will always hold a soft spot in my heart, as the first computer language I was ever exposed to and the one on which I learned to program (on an IBM 2741 Selectric terminal). Long ago I wrote my own APL interpreter, and I have recently given it somewhat of a facelift for Windows (mine still supports overstrikes, as I don't need the stickers-- I still remember where all the original symbols are). I use it as a super-calculator and little else (it remains really good for that). While I sometimes get wistful about its now-misplaced potential, and periodically try to think of ways it might mutate into something important in the modern world, I harbor no real illusions of a resurgence...