The Hundred-Year Language 730
dtolton writes "Paul Graham has a new article called "The Hundred-Year Language" posted. The article is about the programming languages of the future and what form they may take. He makes some interesting predictions about the rate of change we might expect in programming languages over the next 100 years. He also makes some persuasive points about the possible design and construction of those languages. The article is definitely worth a read for those interested in programming languages."
AI (Score:2, Interesting)
Re:how long (Score:5, Interesting)
\ Word definitions
: convicted-of 0 ;  \ To convict someone
: murder 25 + ;
: arson 10 + ;
: robbery 2 + ;
: music-copying 40 + ;
: sentenced-to . ." years of prison" ;
And to use it:
convicted-of music-copying robbery sentenced-to
Output: 42 years of prison
This looks quite like English. Of course, you can do that in many languages, but it feels more natural in Forth, I think.
Evolutionary dead-end? (Score:3, Interesting)
Er... I don't think that Cobol is an evolutionary dead-end; in the best world, it would be extinct, but it isn't. What makes a language widely used is something that we can't predict right now - we have to watch it evolve over time, and as it grows and matures look at different aspects.
Take architecture for example - new buildings are loved for the first five years because of their freshly introduced ideas. After that, all the problems start to appear - mildew problems, asbestos in the walls, and so on. Over the next ten years, the teething problems are fixed. It is only a HUNDRED YEARS after the new building (or in our case, the new programming language) appears that it can be properly evaluated. By then the language/building has either been replaced, or it has survived.
So - the only proper way to measure the success of a programming language is to measure its survivability. Sure, we can make guesstimates along the way:
During introduction: Does the language have a good development environment? Is the language backed/introduced by a market leader?
Somewhere during the "middle years" (after about ten years): Does the language have a large user base? Does the language have a large code base?
After twenty/thirty years: ask the programmers if it really is maintainable...
Well - you get the picture! Predicting the survivability of something more than five years into the future is impossible, I'd say.
VB Problems (Score:2, Interesting)
The only way VB will retain any large number of its current userbase is by being completely committed to the .NET infrastructure.
Meanwhile, languages like Java, Python, Perl and PHP will continue to grow and gain more and more users among tech-savvy individuals.
History and Future (Score:5, Interesting)
Re:dead-end? (Score:1, Interesting)
Types (Score:2, Interesting)
This bold statement is not only wrong (cf. Peyton Jones' latest work on macros in Haskell), but also misleading. Let's start off with some opinion: In my opinion, no language without static typing is worth using. The reason is simple: Because I am human. I make mistakes. And I don't want to spend the rest of my life writing test suites to check for errors which even trivial type systems can detect.
I agree with one thing: Languages will become simpler on a mathematical level. Anyone who has used ML or Haskell will have noticed how much easier these are to understand in comparison to any imperative language out there (and, by the way, in Haskell, Strings are lists of characters). But, at the same time, I truly hope that mechanisms for proving properties about programs will become not only more powerful, but also more widespread. I would like to have static verifications of my pre- and postconditions. I would like to verify that the result of my 'sort' function returns a permutation of its input for which each element is less than or equal to its successor. These are the things I'm looking forward to seeing in the future.
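A minimal run-time sketch of that 'sort' contract (the helper names are mine; the poster's hope is that a future type system would prove these properties statically rather than check them on every call):

```python
def is_permutation_of(ys, xs):
    """True if ys contains exactly the same elements as xs."""
    return sorted(ys) == sorted(xs)

def is_ordered(ys):
    """True if each element is less than or equal to its successor."""
    return all(a <= b for a, b in zip(ys, ys[1:]))

def checked_sort(xs):
    """A sort whose postconditions are asserted at run time."""
    ys = sorted(xs)  # stand-in for "my 'sort' function"
    assert is_permutation_of(ys, xs), "result must be a permutation of the input"
    assert is_ordered(ys), "each element must be <= its successor"
    return ys
```

The assertions state exactly the two properties named above: permutation of the input, and elementwise ordering.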
What about ASM? (Score:3, Interesting)
My prediction. (Score:4, Interesting)
We're already to the point where it's absurd for a single person to understand the whole of a software project. Things are only going to get worse from here, and the only way out is to let the computers manage the complexity for us. As computers become faster, they'll be able to test out an ungodly number of permutations of a program to see which ones perform the fastest, or give the best results.
Just a speculation. I don't wholeheartedly believe what I just said, but I think it's a bit silly to simply assume that programming languages will be around forever.
Re:how long (Score:4, Interesting)
For unsupervised commands humans tend to create something not all that different from code. A fixed set of grammar and vocabulary come into play (i.e. little slang, and very normalized style). For example:
Employees will update their status on the In/Out board in the lobby when they will be gone for more than 15 minutes.
which is roughly:
(if (> (expected-completion-time task) 15)
(update-status out))
So the need and utility isn't there.
Abandon complex structures? Never! (Score:3, Interesting)
Strings aren't lists, they're structures.
Most string use in programs is a holdover from teletype-style programming, where all you could display was a short (ahem) string of characters. Today a string is a label for a data item, a menu item on a menu, a data object in a domain.
XML -- as clunky as it can seem -- and XUL in particular, are ways of describing user interface to a system as a tree of objects.
So I don't want lists of characters, I want associative structures of objects which can be of many different types, used in the manner required by the program (it's a string, it's a number, it's a floor wax, it's a dessert topping).
I'm trying really hard to avoid saying "object-oriented," but objects will become more complex and more abstract. Computers of the future may not have to worry about pixels in an image, but rather know the object itself, where a bitmap is just an attribute of the thing.
Perhaps driver- and compiler-writers will still need stripped-down languages for efficient access to hardware, but as an app programmer and end user, I want the computer to handle statements like,
BUY FLOWERS FOR ANNIVERSARY
Currently, that would be something like
event("Anniversary").celebrate.prepare.purc
That's not nearly abstract enough.
Methinks he was wrong on one point (Score:4, Interesting)
Ummm... how about lichen? Our mitochondria? What about the parasitic relationships that become mutually beneficial, such as the numerous bacteria in our gut and on our skin, and then eventually become necessary for life?
Merging actually does happen -- it just doesn't happen in the way he was thinking, that DNA become identical and cross-species fertility occurs. Rather, the two organisms live closer and closer, until they merge.
Come to think of it, although it isn't on the species level, the concept of merging species isn't too different from sexual reproduction.
Gaping Hole - Design Languages (Score:3, Interesting)
Some of us working in the telecommunications industry are already familiar with SDL (Specification and Description Language) [sdl-forum.org] as a tool for designing and auto-coding software. Yes, auto-coding. The SDL design software lets us design a system graphically, breaking it up into sub-components, specifying message flows between those components, and defining state machines for handling these messages.
Developing software in such a manner usually requires very little coding, as the design tool will turn the design into code. Coding may be required for interfacing with the OS or other entities, though that's improving also.
I'm starting to think as such tools mature, they're going to be the next step up, like the way programming languages were the step up from coding in assembly. They are less efficient, just as BASIC or C is less efficient than pure assembler, but they allow greater focus on a solid and robust design and less requirement to focus on repetitive details.
Imagine being able to take out the step of having to go from a design to code - focus on the design, and you're done.
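As a toy sketch of the idea (nothing here is real SDL tooling; the states and messages are made up): the "design" is just a transition table as data, and one generic loop turns it into running code.

```python
# The design: states, messages, and transitions as plain data.
TRANSITIONS = {
    # (current state, incoming message): next state
    ("idle",      "connect"):    "connected",
    ("connected", "data"):       "connected",
    ("connected", "disconnect"): "idle",
}

def run(messages, state="idle"):
    """Generic machinery: drive the state machine over a stream of messages."""
    for msg in messages:
        try:
            state = TRANSITIONS[(state, msg)]
        except KeyError:
            raise ValueError(f"unexpected message {msg!r} in state {state!r}")
    return state
```

Change the table and the behavior changes; nobody edits the loop. That is the "focus on the design, and you're done" point in miniature.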
Re:What about ASM? (Score:3, Interesting)
The few remaining areas for ASM programming - embedded, SSE-like optimisations - are being eroded gradually as processors and compilers get better.
Re:how long (Score:3, Interesting)
Structurally, spoken languages and computer languages are very similar:
Phonetics: sounds
Phonology: sounds in relation to one another
Morphology: words
Syntax: structure (words in relation to one another)
Semantics: meaning
Pragmatics: meaning in context.
Morphology, Syntax and Semantics are shared by human and computer languages. Arguments could be made about phonology, too, but not by me. Some computer languages might even have pragmatics. (Example of pragmatics: when one says "it's hot in here" one means 1) it's hot in here and 2) somebody get off their ass and open the damned window.) I'm not familiar enough with computing languages to say if a command means one thing in one instance and something else in another instance, or has two meanings simultaneously.
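For what it's worth, mainstream languages do have a mild version of this: in Python the same `+` token means different things depending on its operands, which is about as close to context-dependent meaning as most languages get (examples of my own choosing):

```python
# The same operator, three context-dependent meanings:
assert 1 + 2 == 3                      # numeric addition
assert "it's " + "hot" == "it's hot"   # string concatenation
assert [1] + [2, 3] == [1, 2, 3]       # list concatenation
```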
Human language is full of redundancies. Some computing languages have some redundancies. Perl springs to mind (no wonder... Larry Wall was a linguist) with its "there's more than one way to do it" creed.
I don't think computing languages will reach the full complexity and redundancy of human language. One main reason is that human language is an extension of the human thought process. Now, if you want to read the previous posting about the Turing Test, please, feel free....
Re:Awareness... (Score:2, Interesting)
Can we not do this now? All planes have systems that direct them away from other planes when they get too close. If you mean that this will be done without a connection between said objects, then there must be some set of rules that are followed - very similar to those that we "intelligent" beings use when we drive. For instance:
-I signal to change lanes
-Person in lane over decides whether to let me in
-If I see them back off a bit then I proceed, if they speed up then I get behind them
Everything is logic-based. In addition, today we do deal with logic in meaningful ways. What's worth remembering is that all logic is based on AND, OR, NOT. You can derive any logical expression from these operators. For instance:
R - S = R AND (NOT S).
Essentially "Give email from mom a higher priority" would boil down to the same logic as "if subject ~=
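The parent's reduction can be spelled out directly (a throwaway sketch; the function names are mine):

```python
def minus(r, s):
    """Set-style difference on booleans: R - S == R AND (NOT S)."""
    return r and (not s)

def xor(r, s):
    """Exclusive-or, derived from AND, OR, NOT alone."""
    return (r or s) and not (r and s)
```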
Reductionism, you kidding? (Score:5, Interesting)
Anyone who has studied theoretical computer science and/or programming languages knows that such reductionism is a fallacy. "...the fewer, the better..."
It turns out that it's better to strike a balance: make the formal mathematical system (which is what a programming language is, after all) as simple as possible, up to the point where making it simpler makes it more complicated. In other words, simplifying further would cloud the mathematical structures that you are describing.
Here are some examples of reductionism gone too far: the Sheffer stroke, X = \z.zKSK, one-instruction assemblers, etc...
The only logical connective you need is the Sheffer stroke... but that's of no use to us, as it is easier to use more connectives such as conjunction, disjunction, implication, and negation.
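That universality is easy to demonstrate (a quick sketch, modelling the Sheffer stroke as NAND on Python booleans):

```python
def nand(p, q):
    """The Sheffer stroke: NOT (p AND q)."""
    return not (p and q)

# Every other connective, built from nand alone:
def not_(p):
    return nand(p, p)

def and_(p, q):
    return nand(nand(p, q), nand(p, q))

def or_(p, q):
    return nand(nand(p, p), nand(q, q))
```

Sufficient, yes - but the nested nand calls make the parent's point: nobody wants to read logic written this way.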
The only combinator you need is X, and you can compute anything... but making use of other combinators - or, better yet, the lambda-calculus - is more useful.
The point is that we need more powerful tools that we can actually use, and there is no simple description of what makes one tool better than another. Applying reductionism for its own sake gets you nothing special.
The true places to look for what the future brings with regards to programming languages are the following:
1. Mobile-Calculi: pi-calculus, etc...
2. Substructural Logics: linear-logic, etc...
3. Category Theory: It is big on structure, which is useful to computer scientists.
strings (Score:3, Interesting)
i think strings mainly exist because of usability considerations - from the developer's point of view. they provide a compact notation for "list of characters". furthermore, most languages come with string routines/classes/operators that are a lot more powerful and flexible than their list equivalents.
efficiency definitely is a consideration, but not the main one.
LISP in 100 years (Score:4, Interesting)
No need to mention they will agree with operators: (defop + a b (+ a b))
That was a joke, and you can do a similar thing even today. Seriously, I very much agree with these three quotes:
So, if there is a commercial effort to push LISP to the market again as an underlying metalanguage then, if not in 100 years then in 2 or so, we may see all programming languages being "LISP-derived". Add to that that LISP syntax is semantically much better than XML, yet just as uniform to parse. The only problem with LISP today is that it's not as "distributed" as Erlang. Fix that and you'll get the language of the near future.
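The "just as uniform to parse" point is easy to make concrete: a complete reader for Lisp-style s-expressions fits in about a dozen lines, which is hard to say of XML (a sketch in Python, since the claim is about the syntax rather than any particular implementation):

```python
def parse(src):
    """Read one s-expression from a string into nested Python lists."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def read(tokens):
        tok = tokens.pop(0)
        if tok == "(":
            out = []
            while tokens[0] != ")":
                out.append(read(tokens))
            tokens.pop(0)  # drop the closing ")"
            return out
        return tok

    return read(tokens)
```

One grammar rule - atoms and parenthesized lists - covers the whole language, macros and all.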
---
I don't know the future. I barely remember the past. I see the present very blur. Time doesn't exist. The reason is irrational. The space is irrelevant. There is no me.
Re:Awareness... (Score:5, Interesting)
Travis
Totally wrong (Score:3, Interesting)
If you ever listen to the types of commands they give to their computers in Star Trek, they are subjective and ambiguous. Any computer capable of understanding such commands would have no need for the crew (as it would quickly realize).
An alternate prediction, assuming that AI "does not compute", is that we will always need people who know how to use computers, and we will always need people who know how to think.
Future languages may free you from peccadilloes, give you greater code reuse and portability, improve the mapping down to machine language, reduce the amount of time/space it takes to express algorithms, and possibly allow a larger degree of algorithmic analysis. What they will not do is free you from the need for programmers.
I seriously doubt that idiots with powerful computers can accomplish anything.
Being able to use a computer is the equivalent of using a weapon such as a sword or spear. It's a weapon for your mind.
Weapons are called force multipliers by the military for good reason: a totally out of shape clumsy slob with a sword is less dangerous than a fit and well trained warrior with a dagger. The same goes for computers, they are force multipliers, but not forces themselves.
Eventually, all our warriors (thinkers) will also be programmers. Not all at the same level or using the same languages and tools, but some sort of programmers for sure.
Java bad? (Score:3, Interesting)
The points he makes about what the good languages are seem to show that Java is indeed a good language. Specifically it has an additional layer that allows for abstraction from the hardware/operating system for portability. It takes care of mundane details for the programmer (garbage collection, no need to worry about dealing with memory directly, etc).
Basically the article seemed to repeat itself a lot and show that Java does indeed have a lot of the good qualities he thinks will be in future languages. He also dismisses object-oriented programming as the cause of "spaghetti code" without giving any justification for that statement. Finally, he slips in a nice ad hominem attack by saying any "reasonably competent" programmer knows that object-oriented code sucks.
I think the author's own biases hurt his argument greatly.
I've said it before, and I'll say it again... (Score:2, Interesting)
Lisp damages the brain.
Having spent a year in University temporarily attached to the cult of Lisp (I'm better now, thanks), I can now spot the tell-tale signs of Lisp-induced brain damage in this "article". Among its more tell-tale signs:
I'm thinking we need to establish a de-programming group for Lisp cult members. Maybe a de-bracketing instead of de-programming?
Re:English and Grammar... (Score:3, Interesting)
Latin:
The colloquial translation:
And, although Latin is fully inflected and English only partially so, all of the words are identical!
So HAH.
This sort of poetry is common in obfuscated C contests, although the visual lack of distinction between 0 and O and I and 1 is also commonly needed.
-Billy
Java and the Future (Score:2, Interesting)
Java is essentially a re-implementation of C; it's a very C-like language in syntax and semantics. The goal was to do an OO language based on a familiar paradigm. In addition to eliminating some of C's less endearing traits, it brings two things to the table, both of which will shape future languages:
1. Re-usability. Sun offers you a crapload of very useful re-usable objects in their JDKs, and people like the Apache project offer you even more. The ability to do insanely complex projects with tiny amounts of effort is one reason why Java rules the corporate enterprise. Future languages will start looking more and more like a big box of Legos: find the parts you need and plug them together.
2. A universal computing environment: you can't write to the metal in Java, but it's not as slow as interpreted languages like Python and TCL for gigantic computing tasks. Any project, no matter how monolithic and task-optimized, is as portable as the VM is. Anyone who's had to manage a platform migration for key business applications, from VAX to Solaris, say, or worse, from S/360 to Windows, knows the pain of re-implementation. That pain is gone when you use a VM-based language like Java.
Projects like Squeak are looking more and more like Java these days, in terms of re-usability and VM-based platform independence. The only missing piece of the puzzle is popular VM environments not tied to any one vendor: Java needs to cut its dependency on Sun JDKs, or it will be supplanted by another language that is independent and standards-based.
This isn't to say that other languages aren't going to evolve, too, or are useless because they're not like Java. Anyone who programs in the new interpreted scripting languages - PHP, Python, Perl, Ruby, TCL, Scheme, etc. - can attest to the power of that approach to modern computing.
On the other hand, I really don't see any new compiled-to-the-metal languages emerging. Fortran is used for high-performance computing, Forth is used for tiny computers, and C/C++ is used for system programming. It will very likely be the same way in another 20 years, or another 50. The difference is that applications will slowly drift to either VM languages or interpreted languages from binaries compiled from source.
Functional languages will be the survivors (Score:2, Interesting)
Monkees in 100 years (Score:3, Interesting)
You know what, on a second thought I realize that 100 years is not enough for the humankind to move away from being monkees. Thus, Java forever!
There will be THREE new languages (Score:3, Interesting)
1. the target machine architecture
2. the range of expression required by the programmer and/or workgroup
Java is "successful" but it really looks a lot like Algol and Pascal,
as does C++. The range of expression is greater in the newer languages
(object-orientation in Java and C++) but the forte is still that of
expressing algorithms in written form to be used on a stored-program
digital computer.
WILL WE STILL BE PROGRAMMING?
Take one example -- genetic programming. If you had a programming system
where the basic algorithm could learn, and all you had to do is set up
the learning environment, then you'd be teaching rather than programming.
In fact I believe THIS is what most "programmers" will be doing in 100 years. The challenge
will be defining the problem domain, the inputs, the desired outputs; the
algorithm and the architecture won't change, or won't change much, and the
vast majority of people won't fiddle with it.
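A toy of what "setting up the learning environment" might look like (purely illustrative; real genetic programming evolves program trees, not a single integer): the teacher supplies only the fitness function, and a generic mutate-and-select loop does the rest.

```python
import random

def evolve(fitness, seed=0, generations=200, rng=None):
    """Hill-climb by random mutation, keeping any child at least as fit."""
    rng = rng or random.Random(42)  # fixed seed so the run is repeatable
    best = seed
    for _ in range(generations):
        child = best + rng.choice([-3, -1, 1, 3])  # mutate
        if fitness(child) >= fitness(best):        # select
            best = child
    return best

# "Teaching": state the goal, not the algorithm.
closest_to_ten = evolve(lambda x: -abs(x - 10))
```

The learner's answer is only ever as good as the environment the teacher defines, which is the parent's point: the hard work moves into defining inputs and desired outputs.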
But if HAL doesn't appear and we aren't all retrained as Dr. Chandra,
I believe we'll still be handling a lot of text on flat screens.
I don't think we'll be using sound, and I don't think we'll be using pictures.
(see below)
So predicting what languages will be like in 100 years is predicated
on knowing what computers and peripherals will be like. I think progress
will be slow, for the most part -- that is, I don't think it will be all
that much different from how it is now.
HOW WILL OUR RANGE OF EXPRESSION CHANGE?
If we relied primarily on voice input, languages would be a lot more
like spoken natural languages; there would be far less ambiguity than
in most natural languages (so they'd be more like Russian than like English,
for example) but there wouldn't be nearly as much punctuation as there
is in Java and C++.
If we rely primarily on thought-transfer, they'll be something else
entirely. But I don't think this will come in 100 years.
How is a 24x80 IDE window different from punched cards and printers?
Much more efficient but remarkably similar, really. It would not surprise
me if we still use a lot of TEXT in the future. Speech is slow --
a program body stored as audio would be hard to scan through quickly.
Eyes are faster than ears so the program body will always be stored as
either text or pictures.
Pictures - well, pictorial languages assume too much of the work has
already been done underneath. "Programming isn't hard because of all
the typing; programming is hard because of all the thinking." (Who
wrote that in Byte a couple of decades ago?). I don't think we'll be
using pictures. When we get to the point that we can use hand-waving
to describe to the computer what we want it to do, again we'll be
teaching, not programming.
HOW WILL THE ARCHITECTURE CHANGE?
If the target architecture isn't Von Neumann, but something else,
then we may not be describing "algorithms" as we know them today.
Not being up to speed on quantum computing, I can't speak to that
example... but there are lots of other variations. Analog computers?
Decimal instead of Binary digital machines? Hardware-implemented
neural networks? Again, I haven't seen much progress away from the binary
digital stored-program machine in the last 40 years, and I think (barring
a magical breakthrough) this may continue to be the cheapest, most
available hardware for the next 50-100 years.
SO WHAT DO I THINK?
I think IDE's and runtime libraries will evolve tremendously, but
I don't think basic language design will change much. As long as
we continue to use physical devices at all, I think the low-level
programming languages will be very similar to present day ones:
Based on lines of text with regular grammars and punctuation,
describing algorithms. I predict COBOL will be gone, FORTRAN will
still be a dinosaur, and Java and C/C++ will also be dinosaurs.
But compilers for all 4 wi
Re:Quantum Language (Score:3, Interesting)
But it's what we've got. Human language is, alas, imprecise. But we have more than 50 years of experience with that and we know nothing better is on the horizon. I think you'll be lucky if, between now and a hundred years from now, you can teach 10% of the world's population the meaning of the word algorithm, much less the use of an algorithm.
But take heart -- while the computer has been called a relentless judge of incompleteness, the fact is that some of that incompleteness is just due to their bad schooling. Lack of common sense. Lack of context. If we can add that stuff in, maybe the kinds of problems computers give us won't sound like the whinings of a small child, ill-informed about the things in the world that we collectively agree should be 'obvious'. That won't fix everything, but it will fix some things.
For example, most non-computer people are able to take showers in finite time even with "Lather, Rinse, Repeat" written on shampoo bottles. They don't loop infinitely. Maybe in a hundred years, computers won't either because someone will have filled them in on the joke.
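In code form (my own gloss on the joke):

```python
def shower_with_common_sense(repeats=1):
    """'Lather, Rinse, Repeat' as a human reads it: repeat means once more."""
    steps = []
    for _ in range(1 + repeats):  # the original pass, plus the repeat
        steps += ["lather", "rinse"]
    return steps

# The literal reading - `while True: lather(); rinse()` - never
# terminates; nobody has filled the computer in on the joke yet.
```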
And legalese is not inherently required to be expressed as badly as it commonly is, that's just a fashion. Like doctors having bad handwriting. Social pressure would fix that if people were willing to tell their lawyers to go back and rewrite a text in prettier form. (Some probably are too cheap to pay by the hour to have that happen. Then again, if they did, they could perhaps read the result. The lawyer is probably just as happy you can't, just like a high paid computer consultant is often just as happy his clients can't understand the script he's written them, so they'll have to call him for upgrades. Again, not a technical problem, but a social one.)
Article Interesting but not Insightful (Score:4, Interesting)
Languages are built on top of many changes in technology: connectivity, speed, machine type, concurrency, adoption.
Plus, a language is just one codification of a problem solution. The solution can be pushed towards any one of several goals: security, speed, size, reuse, readability, etc.
Different languages have sprung up for just these 2 statements above. What metric is this guy using to measure a language's popularity? LOC still running (COBOL?), steadfastness of code (C?), or CTO business choices (a zoo)...? There are so many ways to look at this; picking a single point of view is misguided reductionism.
We will continue to have a multitude of tools available for getting work done. Cross-breeding of concepts for languages is great, and does happen, but unless you trace decisions of specific prime movers you really can't say where a language comes from.
Anyone can put together a new language, and even get it adopted by some audience. But what gets mindshare for usage are languages that satisfy goals that are popular for the moment. Exploring what those goals will be is impossible, in my mind. What will be popular?
Speech recognition? Image recognition? Concurrency? Interoperability? Auto-programming? Compactness? Mindshare?
These questions are based on our human senses, our environment, etc. Any sci-fi reader could tell you of the concept of a "race based on musical communication", for example, that would base programming on a completely different set of goals. And so on.
mug
None at all is the logical choice (Score:3, Interesting)
Languages as you understand them will be as dead as the steam-powered loom in 50 years. We will have tools that aren't typed in letters and symbols, in much the same way the DVD has 'replaced' live theater as the only way to 'reproduce' entertainment.
Re:how long (Score:2, Interesting)
The closest thing I've seen to that are the sigils in Perl (where %foo can
become $foo when you access an individual value and @foo when you
access a slice), and even that is going away in Perl6. Besides,
that's not really even morphology so much as inflection. Real
morphology would be if spellings of words mutated not based
on meaning but on the adjacent words, or if attaching an affix could
cause changes in the spelling of the rest of the word. This happens
quite a bit in natural language, but I don't know of a single case
of it in any computer language.
Actually, I'm now trying to imagine what that would be like...
and I think I'm getting the willies.
Re:On natural language... (Score:2, Interesting)
People know what they want out of their machines, for the most part.
Dang, what dream-world do you live in? I work with people all the time who have NO idea what their computer is capable of doing for them, and feel the need to conform to what they think it expects because they have no idea what *they* can expect. In the future this may change somewhat, but even besides that, most of them don't really know what solution they need because they really don't understand the problem to be solved (just that there is a problem to be solved) and must *discover* more information about the problem.
Let's explore the implications of this a bit. Suppose we have a scenario:
1. Human states problem: "I need an accounting system for new company XYZ that is starting up. It is a widget manufacturing business that should be able to grow to the size of, say-- IBM with sales, parts supply and support offices all over the world."
2. Computer says "OK, done."
3. Human either spends months discovering what the computer's assumptions are about how the company is to operate and then adjusting the company's operations to conform (thus making it just like every other equivalent business in the world), OR correcting the program's misconceptions so that it conforms with the company's operation (thus preserving some value-added that the company's founders have in the form of new ideas about how to run the business), OR some combination thereof.
Frankly, I don't see any version of step #3 as being particularly effective. Of course, how effective it is may be less the point than how easy it is to use, but it doesn't strike me as being particularly easy to use either. And this is just a scenario involving a relatively common and (presumably) understood problem. What happens when the nature of the problem is not such that there are preconceived assumptions regarding it that can be leveraged as in this case? I can tell you what happens: the human spends all kinds of time discovering more about the nature of the problem and possible solutions and their implications and side-effects, pretty much like things are today. Sure, new tools will make the process easier and may include spoken input, but all this talk about a "human language" interface is mostly utopian ignorance - you'd have to fix the problems in human language first, in which case you're still "making" the human conform to the computer. Humans can't even talk to *each other* with much understanding; with the misunderstanding a computer is capable of in the mix, it is not going to be a very useful way to solve problems.
Some people may prefer an English interface, but those will be the novice users for the most part. I can type faster than I can speak, and my typing isn't affected by the level of background noise and interruptions I may receive. As much as some have tried to invent alternatives to the keyboard (such as the mouse), virtually none have eliminated the need for it. I have 10 fingers, not just *one*, and know how to use them, and alternative inputs must compete with that. The novice who can't type may prefer using a mouse or some other alternative to do everything. "Ease of use" is desirable for novices, and will always be there, but it's not equivalent to the "most efficient interface" between a computer and a knowledgeable and/or trained user. And the thing about novices is, they don't all stay novices forever - though some, apparently, may prefer to.
Re:Somebody mod this back up (Score:4, Interesting)