Programming IT Technology

The Hundred-Year Language 730

dtolton writes "Paul Graham has a new article called "The Hundred-Year Language" posted. The article is about the programming languages of the future and what form they may take. He makes some interesting predictions about the rate of change we might expect in programming languages over the next 100 years. He also makes some persuasive points about the possible design and construction of those languages. The article is definitely worth a read for those interested in programming languages."
This discussion has been archived. No new comments can be posted.

The Hundred-Year Language

  • how long (Score:3, Insightful)

    by xao gypsie ( 641755 ) on Friday April 11, 2003 @10:35AM (#5710336)
    .... until programming languages begin to resemble spoken languages very closely? well, at least those languages with power, not BASIC and its friends. or, is it even possible to conceive, at this point, that there will be languages with the power of C but the syntax of English, Spanish...etc....

    xao
  • Convergence (Score:2, Insightful)

    by archetypeone ( 599370 ) on Friday April 11, 2003 @10:39AM (#5710364) Homepage
    The evolution of languages differs from the evolution of species because branches can converge...

    For species branches can converge too - it's just kind of weird...
  • by chrisseaton ( 573490 ) on Friday April 11, 2003 @10:39AM (#5710365) Homepage
    "Will we even be writing programs in a hundred years? Won't we just tell computers what we want them to do?"

    What the fuck? That's what a programming language is, and that's exactly what we do with them TODAY.
  • dead-end? (Score:2, Insightful)

    by xv4n ( 639231 ) on Friday April 11, 2003 @10:40AM (#5710369)
    Java will turn out to be an evolutionary dead-end, like Cobol.

    dead-end? Java has already spawned javascript and C#.
  • by SeanTobin ( 138474 ) <byrdhuntr@hotmailRASP.com minus berry> on Friday April 11, 2003 @10:43AM (#5710394)
    Who will design the languages of the future? One of the most exciting trends in the last ten years has been the rise of open-source languages like Perl, Python, and Ruby. Language design is being taken over by hackers. The results so far are messy, but encouraging. There are some stunningly novel ideas in Perl, for example. Many are stunningly bad, but that's always true of ambitious efforts. At its current rate of mutation, God knows what Perl might evolve into in a hundred years.
  • Why Change? (Score:3, Insightful)

    by jetkust ( 596906 ) on Friday April 11, 2003 @10:43AM (#5710398)
    Languages will change when computers change. Languages are driven by machine instructions which are mathematical operations done in sequence. If this doesn't change in 100 years, why would we not use C in 100 years?
  • Awareness... (Score:5, Insightful)

    by dmorin ( 25609 ) <{moc.liamg} {ta} {niromd}> on Friday April 11, 2003 @10:46AM (#5710422) Homepage Journal
    I know that's a scary word because it sounds like "self-aware". But I expect that in 100 years one of the inherent aspects of any computer language will be in detecting and working with other devices in a robust manner. In other words, being aware of what is around the programmed device. Not requiring a mandatory connection of type X. Instead I'm thinking about a device that can run just fine by itself, and then if another device of the same sort happens to come within 10 feet, then maybe they automatically attempt some sort of handshake (with encryption up the wazoo, of course) and then have the option of communicating. This would be useful for automatic transmittal of business cards, appointment schedules, and so on. Or it could be more of a client/server thing, where devices that do not have the power to get a certain job done will just naturally plug into "the grid" and request more power. The device won't have to deal with where the computing power comes from or how it is distributed.

    Imagine cars that, before changing lanes, signal to the surrounding cars' navigation systems and they work out for themselves how to let the car into the lane. A computer can be told to slow down, rather than speed up, when someone wants to change lanes. Or detectors in the dotted yellow lines that sense when you changed lanes without signalling, and alert the traffic authority to bump your points (ala Fifth Element).

    I always liked the idea of my PDA phonebook being more of a recently-used cache of numbers instead of a local store. I just punch up a number. If it's one of my commonly used ones, it comes right up (and dials, of course). But if it's not, then my PDA connects to the phone company, gets the information (and probably pays the phone company a micropayment for the service) and now I have that number locally on my PDA until it gets scrolled off if it's not used much.

    Also I expect lots of pseudo-intelligent content filtering software. You'll get 1000 emails a day and your spam filter will not only remove 99% of them, but it will also identify and prioritize the remaining ones. In order for this to be useful there need to be languages that deal with expression of rules and logic in a meaningful way (far more than just and/or/not). No one 100 years from now will say "if subject ~= /*mom*/" (or however the hell you say it), they will expect to say "Give email from mom a higher priority", or something very close.
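
    A minimal sketch, in Python, of the kind of rule layer this comment imagines; the Message type, rule phrasing, and addresses are hypothetical, not taken from any real mail client:

    from dataclasses import dataclass

    @dataclass
    class Message:
        sender: str
        subject: str
        priority: int = 0

    # Hypothetical "people-level" rules: each pairs a plain-English intent
    # with a predicate over the message, so the user never writes a regex.
    RULES = [
        ("Give email from mom a higher priority",
         lambda m: "mom@example.com" in m.sender, +10),
        ("Push obvious spam to the bottom",
         lambda m: "winner" in m.subject.lower(), -100),
    ]

    def prioritize(msg):
        for description, predicate, delta in RULES:
            if predicate(msg):
                msg.priority += delta
        return msg

    print(prioritize(Message("mom@example.com", "dinner sunday")).priority)  # 10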

  • Waste of Time (Score:3, Insightful)

    by tundog ( 445786 ) on Friday April 11, 2003 @10:50AM (#5710449) Homepage
    The author starts by describing the effect of Moore's law on computing power (i.e. computers will be wicked fast) and then starts ranting about how today's constructs are so inefficient, then admits that inefficiency won't really matter because computers will be wicked fast (and it takes him half the article to impart this wisdom).

    huh!?!?

    This is the kind of mental constipation that is better left for blog sites.

    Somewhere there is a parallel between the logic in this article and the dot.bomb business model.

  • Lisp... (Score:1, Insightful)

    by Anonymous Coward on Friday April 11, 2003 @10:51AM (#5710453)
    ... has been around for about 50 years already, in one form or another ...

    Something to think about. What is it about first order functional languages based on a clean predicate calculus?
  • by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Friday April 11, 2003 @10:51AM (#5710454) Homepage Journal
    I def. think that a new language based on quantum computing will be at the forefront.

    If after generations and generations of computers, we are still teaching people to talk in computer terms and not yet teaching computers how to talk in people terms, we'll have gone the wrong direction.

    It doesn't matter if quantum technology is used or not, for the same reason as it doesn't matter whether a brain is a parallel or single threaded machine, whether it's made of carbon-based or silicon-based technology, etc. What matters is that it can talk to you, can understand you, and can improve life.

    If you want to know what computer languages should and hopefully will look like in the future, you have only to watch Star Trek. I'm not kidding. The desire to pack computer use into a short TV program has led the authors of that show and shows like it to pare out all but the absolute essentials of describing what you want the computer to do. That is what computer programming should be like, since that's what people programming is like. People don't put up with excess verbiage, and neither should computers.
  • I think that it would be better to call this article "Where Programming is headed" rather than "The Hundred-Year Language". He tries to justify how he can predict the language 100 years into the future...

    It may seem presumptuous to think anyone can predict what any technology will look like in a hundred years...Looking forward a hundred years is a graspable idea when we consider how slowly languages have evolved in the past fifty.

    Hmm...funny, fifty years ago, if I remember my history (since I wasn't alive back then), those relay computers needed rolls and rolls of ticker-taped punch holes to compute math. The language was so low-level...even x86 Assembly would have been a godsend to them. And he considers something like Object-Oriented Programming a slow evolution?

    All he's doing in the article is predicting what languages will be dead in the future, and which languages won't be. For example, he says Java will be dead...

    Cobol, for all its sometime popularity, does not seem to have any intellectual descendants. It is an evolutionary dead-end-- a Neanderthal language...I predict a similar fate for Java.

    I'll not go there, because predicting the demise of Java is opening another can of worms. But let's just say that he really doesn't support his argument with anything other than anecdotal opinion.

    I say read his article in jest, but don't look too deep into it.
  • Re:Aliens and XML (Score:2, Insightful)

    by damas ( 469487 ) on Friday April 11, 2003 @11:00AM (#5710524)
    well, I believe XML was invented by aliens... and not very sane ones

    what's wrong with

    alien.language = "ptuh"
    ptuh.language.family = {"ptuh", "XML"}
  • by dmorin ( 25609 ) <{moc.liamg} {ta} {niromd}> on Friday April 11, 2003 @11:00AM (#5710530) Homepage Journal
    I think that the question of whether natural language is the "way to go" misses an important distinction. There will always be users of technology, and creators of new technology, and they must speak different languages. I do not need the same skills to drive a car as I do to build an engine. Being able to type does not make me a novelist. There are two different cultures at work.

    Having said that, I expect that the user language should certainly be natural language -- the "computers should understand people talk, not the other way around" argument. People know what they want out of their machines, for the most part. Whether it is "change my background to blue and put up a new picture of the baby" or "Find me a combination of variables that will result in the company not failing with a probability of greater than 90%", people want to do lots of things. They just need a way to say it. Pretty much every Star Trek reference you'll ever see that involves somebody talking to the computer is an input/output problem, NOT the creation of a new technology.

    It's when you build something entirely new that you need a new, efficient way to say it. Anybody remember APL? Fascinating language, particularly in that it used symbols rather than words to get its ideas across (those ideas primarily being focused on matrix manipulation, if I recall). Very hard for people to communicate about APL because you can't speak it. But the fact is that for what it did, it was a very good language. And I think that will always hold true. In order to make a computer work at its best, speak to it in a language it understands. When you are building a new device, very frequently you should go ahead and create a new language.
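
    For readers who never met APL: its single-symbol operators compress whole loops into a glyph or two, mostly over arrays and matrices. A rough modern analogue (not APL itself, just an illustration of that economy) is numpy-style array notation in Python:

    import numpy as np

    m = np.arange(12).reshape(3, 4)   # a 3x4 matrix

    # Each of these is a symbol or two in APL; in C each would be a loop.
    row_sums = m.sum(axis=1)          # reduce along each row
    transposed = m.T                  # transpose
    masked = (m > 5) * m              # elementwise test and multiply

    print(row_sums, transposed.shape, masked.max())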

  • Re:Why Change? (Score:2, Insightful)

    by GnuVince ( 623231 ) on Friday April 11, 2003 @11:02AM (#5710541)
    The world is getting faster and faster and faster (and so are computers) and people want things faster. So the speed of C will not be as important as the speed of development of software:

    Boss: I want this software written in 2 hours!
    C programmer: Hum... 2 hours on Pluto?
    Blub programmer: It'll be done in one hour!

    Also, we will want safer languages, because more and more things will rely on software and we don't want crackers to mess things up, do we?

  • Re:Lisp... (Score:4, Insightful)

    by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Friday April 11, 2003 @11:03AM (#5710557) Homepage Journal
    I don't think it's the first order functional nature of Lisp that has allowed it to survive, but rather the "late binding" nature of it.

    Static, strongly-typed languages make the assumption that everything that needs to be known about the world is knowable at compile time. Such programs need to be recompiled (at least) and rewritten (often) because the world changes and either the source program itself or its compiled form needs to accommodate that change.

    Lisp, because it delays many decisions until runtime, and because its runtime tagging accommodates datatypes that are not among the set that was declared at compile time, naturally accommodates changes in the environment around it, and naturally survives well during transitions between old and new ways to do things.

    Static languages often breed static ways of thinking, and often need new static specifications at regular intervals to accommodate the mismatch with how the world really is. Dynamic languages breed dynamic thinking, which (I claim) is more robust over time.
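
    A small illustration of the late-binding point, sketched in Python only because it is dynamically typed in the same spirit (the class names are made up): code written before a type existed keeps working once the new type shows up at runtime.

    def describe(thing):
        # No compile-time knowledge of thing's type is needed; dispatch
        # happens at runtime on whatever object actually arrives.
        return thing.describe()

    class Circle:
        def describe(self):
            return "a circle"

    print(describe(Circle()))

    # Later -- perhaps loaded long after describe() shipped -- a brand-new
    # type participates without recompiling or editing anything above.
    class Hexagon:
        def describe(self):
            return "a hexagon"

    print(describe(Hexagon()))
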
  • OOP (Score:3, Insightful)

    by samael ( 12612 ) <Andrew@Ducker.org.uk> on Friday April 11, 2003 @11:05AM (#5710567) Homepage
    I don't predict the demise of object-oriented programming, by the way. Though I don't think it has much to offer good programmers, except in certain specialized domains, it is irresistible to large organizations.


    Where OOP comes into its own, in my experience, is with GUIs. The ability to say:

    If ThisScreen.Checkbox.IsTicked
    ThisScreen.OkButton.Disabled = True
    Endif

    is immensely useful. Similarly, the ability to change the definition of your master screen template and have all of the other screens take on its new properties is something that OOP is designed to allow you to do.

    Similarly, anything where you tend to access things that act like objects in the first place suits it. Being able to say

    CurrentDocument.Paragraph(1).Bold= True

    or

    Errval=MyDatabase.SQL("Select * from mytable where name='Andrew'")
    Print MyDatabase.RecordCount

    has made my life easier on numerous occasions. There are certainly non OO methods of doing the same thing, but I've never found them as flexible.

    People who insist on making _everything_ an object, on the other hand, are idealists and should be carefully weeded from production environments and placed somewhere they'll be happier, like research.
  • Notation (Score:4, Insightful)

    by hey! ( 33014 ) on Friday April 11, 2003 @11:06AM (#5710577) Homepage Journal
    Lisp was a very early, successful language, because it was close to a mathematical notation and easy to implement on primitive computers. I think the author expects Lisp to remain a vital evolutionary branch because of its mathematical roots.

    I'm not too sure though.

    A programming language is a notation in which we express our ideas through a user interface to a computer, which then interprets it/transforms it according to certain rules. I expect that a lot will depend upon the nature of the interfaces we use to communicate to a computer.

    For example, so far as I know people never programmed in Lisp on punch cards; it doesn't fit that interface well. It was used on printing terminals (for you young'uns, these were essentially printers with keyboards). Lisp fit this interface well; Fortran could be programmed either way.

    If you look at language development as an evolutionary tree, Python's use of whitespace is an important innovation. However it presupposes having sophisticated syntax-aware editors on glass terminals. It would not have been convenient on printing terminals. Perhaps in 2103 we will have "digital paper" interfaces that understand a combination of symbols and gestures. In that case white space sensitivity would be a great liability.

    In my mind the biggest question for the future of languages is not how powerful computers will be in one hundred years, but what the mechanics of our interaction with them will be. Most of our languages presume entry through a keyboard, but what if this is not true?

  • Re:Not long... (Score:4, Insightful)

    by bmj ( 230572 ) on Friday April 11, 2003 @11:16AM (#5710666) Homepage

    In fact never. Because while it's okay, human languages have a few problems

    Well, yeah, but doesn't a computer language suffer from the same pitfalls? If that isn't the case, why do languages tend to "evolve" over time? Why are new languages that borrow elements from other languages so prevalent?

    1) Redundancy, far too many ways to say or do one thing

    Isn't one of the driving principles of Perl "There's more than one way to do it"? Some say this is one of Perl's best features, others say it sucks.

    I won't argue with the point of ambiguity. You can remove ambiguity from a "spoken" language by applying rules to it. I do think we're quite far away from being able to "speak" a program, but that's because we as a culture have moved away from a _grammar_ of English. Check the courses in a university and see what first year English and Linguistics students are taking. It's not Grammar, it's Grammars. Standard written English is a thing of the past. So we won't base a language on how we actually use our language, but we could base a language on certain grammars of the language. And, isn't that something else that languages like Perl and Python try to do? They try to create more "readable" programs?

  • by Anonymous Coward on Friday April 11, 2003 @11:22AM (#5710726)
    Lightweight and networked. With so many consumer items becoming net-aware now, who knows how many will be so in 100 years? I'd wager a lot of ordinary people will end up doing simple scripting for readily-automated tasks, and they won't want to use C or Perl. Somebody will come out with very natural language-like programming languages to meet this need.
  • LISP (Score:2, Insightful)

    by sohp ( 22984 ) <snewtonNO@SPAMio.com> on Friday April 11, 2003 @11:22AM (#5710731) Homepage
    The whole paper can be essentially summed up so: "The language of the future will be pure LISP".
  • Re:dead-end? (Score:5, Insightful)

    by shemnon ( 77367 ) on Friday April 11, 2003 @11:26AM (#5710753) Journal
    Sorry, Wrong and Wrong.

    Comparing JavaScript and Java is like comparing a shark to a dolphin: quite different actually, even though both animals live in the sea, and both languages use the letters J, A, and V. Both have cardiovascular systems and both use variables and control structures. But that is basically where the similarities end.

    JavaScript actually started life inside of Netscape as LiveScript, and during the Netscape 2.0 time frame was renamed to JavaScript to ride the Java bandwagon, but there is no relationship at all beyond that. Compile-time type safety? Java yes, JavaScript no. Prototypes? JavaScript yes, Java no. eval() of new programming code? One but not the other. Interface inheritance? Again. First-class methods? Yep, not both. Bones? Sharks no, dolphins yes (teeth don't count).

    Now C# and Java, they are at best siblings, but Java did not beget C#. The namespace structure is straight from ANSI C++, and the primitive types include C-isms like signed and unsigned varieties. You don't shed a tail and then grow it back further down the trail. The comparison here is alligators and crocodiles: very similar, but one did not beget the other; they share a closer common ancestor than the sharks and dolphins do.
  • by cthlptlk ( 210435 ) on Friday April 11, 2003 @11:27AM (#5710762)
    [OP]When I say Java won't turn out to be a successful language, I mean something more specific: that Java will turn out to be an evolutionary dead-end, like Cobol.

    Er... I don't think that Cobol is an evolutionary dead-end; in the best world, it would be extinct, but it isn't. What makes a language widely used is something that we can't predict right now - we have to watch it evolve over time, and as it grows and matures look at different aspects.

    "Extinct" & "evolutionary dead-end" aren't quite the same; the dinosaurs are extinct, but there are species today evolved from dinosaurs, like birds.

    Maybe nobody uses algol now, but algol has living descendants. Cobol doesn't, and I think it's pretty safe to say that there won't come a time when someone says "hey, let's use the cobol paradigm!"
  • by Chinthe ( 665303 ) on Friday April 11, 2003 @11:28AM (#5710772)
    No!

    FORTRAN programmers can ONLY write FORTRAN in any language.

    Real programmers write real programs in languages that fit the task.

    You know it's the truth!

  • by Anonymous Coward on Friday April 11, 2003 @11:51AM (#5710956)
    no, you don't remember your history very well at all.

    using round numbers, he is talking about the fifties, although really he probably wants to include the sixties.

    So what did we have? Among others: Fortran, which is still around and has influenced many designs. Algol, which begat c, java, c++, c#. Lisp, which introduced FP and most (certainly not all) of the interesting ideas that somewhat mainstream languages like python, ruby, perl are starting to pick up on 30+ years later.

    I know you weren't paying attention, but OO came in the 60's, and was developed *far* beyond anything seen today in mainstream production languages by the early 80's. (smalltalk, New Flavours, CLOS)

    Most of what mainstream programmers think of as the history of language ideas is complete drek, because they make the mistake of thinking that the first time they see a company hyping an idea has any relationship to when the idea was arrived at.

    If you had actually read the quoted sentence for comprehension, you would understand that he didn't say that Java would be dead, he said that it was an evolutionary dead end.

    Not the same thing. Java is a fairly direct evolutionary descendant of Algol. Cobol, a contemporary of Algol, has no evolutionary descendants.

    What he said is that the languages of 100 years from now will not *descend* from Java, any more than the languages of today descend from Cobol. I wouldn't be surprised if there were Java programs around in 100 years, but that is the nature of legacy systems, not an interesting insight.
  • by billtom ( 126004 ) on Friday April 11, 2003 @11:56AM (#5710998)
    Hey, I actually read the article and there's a key point that Graham makes that I don't agree with.

    He makes the point to separate the details of a language into "fundamental operators" and "all the rest" then goes on to say that languages which last and have influence on future languages are the ones that minimize the number of fundamental operators. And then gives examples of things that are fundamental operators in many languages that he feels we don't need (e.g. strings, arrays, maybe numbers).

    He doesn't have much to say about "all the rest". Presumably he would move strings into "all the rest" since we would still want our languages to have functions to manipulate strings (if you think that I'm ever going to write a string tokenizer function again, you've got another thing coming).

    But, I think that the basic concept of splitting up a language into these two parts is fundamentally flawed. The line between the core of the language and all the accompanying libraries of code has broken down completely. It was already falling apart in C (does anyone program C without assuming that the standard I/O library is available?). But with Java and C# the distinction is almost completely gone. Programming languages have become complete environments where you can assume that tons of libraries are naturally going to be available. And separating out a language's "fundamental operators" and its "all the rest" is an artificial division that doesn't really work.
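
    A toy sketch, in Python, of the split being debated: strings pushed out of the "core" and rebuilt as a library over lists of characters, written in the language itself. The function names are purely illustrative.

    # Pretend the "core" gives us only lists and characters; "all the rest"
    # is then ordinary library code written in the language.
    def str_to_list(s):
        return list(s)

    def concat(a, b):
        return a + b              # list concatenation is the core operator

    def upper(chars):
        return [c.upper() for c in chars]

    def tokenize(chars, sep=" "):
        words, word = [], []
        for c in chars:
            if c == sep:
                if word:
                    words.append(word)
                word = []
            else:
                word.append(c)
        if word:
            words.append(word)
        return words

    print(tokenize(str_to_list("the hundred year language")))
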
  • by cosmosis ( 221542 ) on Friday April 11, 2003 @11:58AM (#5711011) Homepage
    Well, nothing like what we have now. Assuming we survive the coming nanotech era, by 2100 computers and human brains will have totally merged. Thought itself will be the computer language of the future. Of course these 'thoughts' will be as far beyond both our current consciousness and computer languages, as we are beyond an insect's.

    Planet P Blog [planetp.cc]
  • by Anonymous Coward on Friday April 11, 2003 @11:59AM (#5711020)
    Strings aren't lists, they're structures.

    That can be represented as ordered lists of characters.
  • Re:how long (Score:5, Insightful)

    by grumpygrodyguy ( 603716 ) on Friday April 11, 2003 @12:07PM (#5711082)
    Spoken language is far too full of grammatical bodges and fixes to become a structure logical enough for a programming language

    This is a false and limited conception of the original poster's intent. Imagine having an A.I. on a PDA-type device that you carry with you from the age of 4. The PDA has a 100-terabyte HD, and records/monitors your spoken words, actions, etc. After 20 or 30 years of this, your PDA probably knows you better than anyone. So if you tell your PDA "make a cool program that looks like this, and does this" there's a very good chance it understands what you mean.

    Think about police sketch artists. They take vague, half remembered information...and turn it into a very accurate rendering of the original image. You have a vague idea in your mind of what you are describing, and you can't see what he/she is drawing. So you describe the person...and 5 minutes later the artist shows you a rather remarkable portrait of what you described. Which in many cases later turns out to very closely resemble the suspect. The missing link here is context. The context of shared culture and language.

    If you can sit at a table and describe the basic functionality of a program, and describe its interface using words, then your magic PDA will do the rest. It will even give you demos and visual feedback on the fly as you describe the program. It would serve as a layer between the absolutely massive context of your personal history, and the "structured" programming language required to build said program.

    Please don't limit the future, it's bigger than you are.
  • by Anonymous Coward on Friday April 11, 2003 @12:14PM (#5711139)
    Representing arrays as lists while claiming that such a thing would be natural. Gimme a break!

    A possible natural candidate would be (mathematical) functions: an array is really just a (partial) mapping between a range of integers and something. But lists? Good grief!

    Functions happen to work for hash tables too; those are really just partial mappings from strings to something.

    Existing languages like Standard ML and, for that matter, Perl can do that today. (But Perl's closure implementation would make it unusable since it limits the call depth.)
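
    A sketch of that idea in Python rather than ML or Perl, with closures standing in for the data structures; everything here is illustrative, not a proposal for a real language:

    def make_array(values):
        # An "array" is a partial function from a range of integers to values.
        def lookup(i):
            if 0 <= i < len(values):
                return values[i]
            raise KeyError(i)     # outside the domain of the partial mapping
        return lookup

    def make_hash(pairs):
        # A "hash table" is a partial function from strings to values.
        table = dict(pairs)
        def lookup(key):
            return table[key]
        return lookup

    arr = make_array([10, 20, 30])
    years = make_hash({"cobol": 1959, "lisp": 1958})
    print(arr(1), years("lisp"))  # 20 1958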

  • by Anonymous Coward on Friday April 11, 2003 @12:28PM (#5711241)
    OO is a good way of forcing bad programmers to organize their code a little better, and having someone else dictate enough of the design (interface) that they can't screw that up too badly.

    His point was, I believe, that OO is better suited for development by large groups where individual members cannot be trusted to organize their code properly, and suitable for the process that results in spaghetti code - avoiding rewrites and allowing software to evolve for a long time. This can happen to OO projects, too, but they remain more manageable despite the spaghetti-infusions.

    But his observation, which I agree with, is that OO doesn't make programs written by good programmers any better, except for certain domains. As an example (my example, not his), OO is good for implementing UI elements, but not for implementing language compilers.

    As a side-note, I've also seen lots of OO programs that have been unextendable because things were implemented in many parallel classes that weren't sufficiently generic and tied together in ways that threw away any of the potential usefulness of the abstractions...
  • by Anonymous Coward on Friday April 11, 2003 @12:31PM (#5711276)
    I think he may have put the cart before the horse here:
    "It's especially useful for language designers to think about where the evolution of programming languages is likely to lead, because they can steer accordingly. In that case, "stay on a main branch" becomes more than a way to choose a good language. It becomes a heuristic for making the right decisions about language design."
    One shouldn't make decisions to try to "stay on the main branch". Rather, language designers should make languages that are useful to programmers and which lead to better software for users. If they do that, then the main branch will choose them.
  • by dtabraha ( 557054 ) on Friday April 11, 2003 @12:34PM (#5711289) Homepage Journal
    People have been predicting the end of MS BASIC since the days of
    10 PRINT "HELLO"

    But you know what?
    While Pascal, C, C++, Perl, LISP, Java and all the other languages have been sipping tea in high OOP geek society chatting about their superiority over it, BASIC has undergone the most evolution of any language.
    Not evolution of IDEs and libraries, but actual evolution of the syntax, operation, compilation, OOP methodologies, interoperabilities, inheritance, polymorphism, threading, (insert your favorite programming buzzword here).

    BASIC gave way to QuickBasic, which gave way to VB1,2,3,4,5,6,7(.NET) with simple changes in some versions, and extreme changes in others.

    I've programmed in plenty of languages: Assembly (SPARC-RISC, INTEL-CISC), BASIC, C, C++, COLDFUSION (If you can call it programming), FORTRAN, J++, Java, JavaScript, Pascal, PHP, PLC, MSSQL (Stored Procs, etc), VB3, 4, 5, 6, 7.NET, VBScript (Yuck), not to mention far too many proprietary languages that thankfully died along the way.
    And I can say with confidence that the most improved, from inception to present is Visual Basic.
    Even if you were to start VB off at QuickBasic or VB3, they still have made the most improvements to the language itself.

    Now I'm not going to get up on any high horse and say that VB will be the language of the future that handles all of the flying cars or whatnot.
    But I will say that the precedent in the computing world is: Evolve or die.

    Texas Instruments once had a powerful computing platform (TI99/4A) and then chose not to continue developing any more personal computers. The DEC VAX, with its VMS operating system, was hailed as a wonderful platform. It's now been purchased by HPaq and discarded. Dead languages litter the floor of every university: FORTRAN, COBOL, Pascal, Java [google.com]^H^H^H^H

    VB.NET still has some problems, but every version of BASIC has fewer problems and more functionality than the last.
    Microsoft may win in the end, simply because they're not afraid to change the language they own, and they don't have to argue with anyone else whenever they want to change it.
  • by Sanity ( 1431 ) on Friday April 11, 2003 @12:39PM (#5711332) Homepage Journal
    His claim that Java will be an evolutionary dead-end is already wrong - since Microsoft's C# is clearly heavily inspired by Java.
  • by Anonymous Coward on Friday April 11, 2003 @12:40PM (#5711337)
    With more power and clear (presumably very high level) languages, won't everyone be able to do their own programming? It always seemed in Star Trek that anyone with access could just shout out demands to the computer and voila. Would "programmers" then mean those working exclusively at the compiler level in which case one would think that unless the hardware paradigm changes (e.g. quantum computers) then it might really not be much different from programming compilers now.
  • Re:My prediction. (Score:3, Insightful)

    by zCyl ( 14362 ) on Friday April 11, 2003 @12:42PM (#5711355)
    In 100 years, computer languages won't exist, or at least won't be used for anything but toy programs. Programs will be created, tested, and debugged through genetic algorithms.

    I think this is sort of like saying 50 years ago that programming won't exist in the year 2000 because almost no one will use machine code anymore.

    I fully expect that in 100 years, computers will be able to do much of what we currently consider programming faster than humans can. The act of programming, and thus programming languages in general, might evolve into a higher level way of describing the specifications and behaviors of a program, without directly specifying what are now considered implementation details. In this way, a 10 year old could write a bug-free web browser in an afternoon, simply by specifying what the features should be and what type of layout it should have.
  • by lpontiac ( 173839 ) on Friday April 11, 2003 @12:51PM (#5711414)
    (does anyone program C without assuming that the standard I/O library is available?)

    Off the top of my head: Palm applications, Win32 applications, operating system kernels, plugins for various programs, libraries with no business doing I/O..

  • by MrBandersnatch ( 544818 ) on Friday April 11, 2003 @12:55PM (#5711461)
    However, it's interesting waffle and does have some good points. I have to disagree that if we had the languages of 100 years hence that programmers would be able to use them. I still remember just how hard C++ was for those used to C, and I am amazed at just how hard peeps find XSLT to be due to its lack of modifiable variables. Recursion just doesn't seem to come naturally to many.

    Basically the problem isn't going to be with the languages - the problem will be with the concepts that created those language features.

  • by sapped ( 208174 ) <mlangenhoven@ya h o o . c om> on Friday April 11, 2003 @01:00PM (#5711518)
    The point he was trying to make was that the building blocks used to build up the language must be as simple as possible.

    Thus, in the core of the language you don't need to build in the ability to do multiplication if you have built in the ability to do addition. Multiplication is just repeated addition.

    However, you then add another layer to this simple core. In that layer you provide functionality for multiplication, subtraction etc.

    The key here being that the layer will have been written in the language itself. There is no need to go "outside" the language syntax to some kind of machine language.

    If this was adhered to in a strict enough fashion then it would probably make porting of the compilers to different platforms a lot easier as we would only really need to port the core. The rest of the language simply consists of layers that will just need to be recompiled within the language again.
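
    A minimal sketch of that layering in Python: only addition is treated as "core", and a second layer defines multiplication in the language itself (non-negative integers only, to keep the sketch small):

    # Layer 0 ("core"): pretend addition is the only arithmetic primitive.
    def add(a, b):
        return a + b

    # Layer 1: written entirely in terms of the core, never dropping below it.
    def multiply(a, b):
        total = 0
        for _ in range(b):        # assumes b is a non-negative integer
            total = add(total, a)
        return total

    def square(a):
        return multiply(a, a)

    print(multiply(6, 7), square(9))   # 42 81
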
  • Re:Types (Score:3, Insightful)

    by William Tanksley ( 1752 ) on Friday April 11, 2003 @01:01PM (#5711527)
    I'm not sure whether I agree with you or not. On the one hand, I really like strong static typing; it's worked well for me. On the other hand, my experience and the experience of others I've seen is that dynamically typed languages are not only more fun to program in, they produce good results faster.

    My theory is that this is just bad implementation on the part of the static languages (this is obvious with C++, less obvious with ML, subtle with Haskell); but I don't know this, it's only a suspicion.

    I don't want to spend the rest of my life writing test suites to check for errors which even trivial type systems can detect.

    Put that the other way around -- do you want to spend the rest of your life working against strong typing systems which only detect errors that even the simplest test suite would automatically find?

    You ARE writing those test suites anyways, right? Because there are errors that NO static type system can ever find but a simple test can.

    -Billy
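
    An example of the kind of error being argued over, sketched in Python with type hints; the function is invented for illustration. It type-checks cleanly, yet only a test catches the logic bug:

    def average(xs: list) -> float:
        # Type-correct, but logically wrong: divides by one too many.
        return sum(xs) / (len(xs) + 1)

    def test_average():
        assert average([2.0, 4.0]) == 3.0   # fails: average() returns 2.0

    test_average()   # raises AssertionError; no static type checker objects
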
  • Re:how long (Score:4, Insightful)

    by Simonetta ( 207550 ) on Friday April 11, 2003 @01:04PM (#5711554)
    I believe that programs should read like novels; there should be long paragraphs of text that describe what and how the code is working, followed by short bursts of 'dialog' that are the actual source instructions to the computer.
    The actual source code (i.e. the instructions to the processor) should be surrounded by quote marks or other delimiters, and the comments (i.e. the extended code description and documentation) should be the part of the source surrounded by white space characters (space, tab, cr/lf).
    I never cease to be amazed at how little programming has changed since the 1960's. It really seems that the only innovation in compiler user-interface design has been that (some) compilers will actually allow you to put your keywords and comments in color! (duh!)
    If we are ever going to increase the productivity of programmers to even remotely match the vast increases in price/performance of the hardware then we must be willing to spend large amounts of time, energy and money to develop new and better approaches to writing software code.
    We must abandon our kilobyte mentality to gigabyte technology!
    As an example of a different approach, has anyone considered using Chinese characters arranged in a three-dimensional grid as a method of doing synchronous parallel programming? Have each character represent a complete function and have their placement in the 3-D grid space represent the point in the algorithmic process where the function should be complete. The compiler would either create the machine language or suggest other arrangements of the parallel process by rearranging the Chinese characters in the 3-D user interface.
    (The fact that it sounds weird is not important. What is important is that any new idea that can help improve the productivity of programmers should be considered, regardless of how strange it may sound at the present time)

    Thank you,
  • by WolfWithoutAClause ( 162946 ) on Friday April 11, 2003 @01:05PM (#5711561) Homepage
    If the computer generates the runtime code automatically (I don't necessarily agree with using GAs; in fact, it seems there's lots of search algorithms out there that usually outperform GAs, but GAs do seem to work), then the question becomes one of testing it. Therefore, the programming becomes not writing the code, but specifying what the code has to do in some way, and then the computer writes the code to match.

    So say you want a chess program. You feed in the rules of the game in a special language, and it generates a program to play by searching for the program that successfully implements that.

    That sounds fantastic! You'll never have to write another line of code ever again!

    Hold on, don't get excited, it's not that simple.

    First, just because it plays a game of chess, doesn't mean it plays a good game of chess, so you might still have to tell it in quite high detail what 'good' means in terms of your program specification.

    Secondly, it's very easy to tell it the specification incorrectly. Specification errors are very common in computer software already, and having the highly precise specification language that you'll need doesn't make it easier, although the machine is less likely to screw up the implementation, so you're still better off than coding by hand.

    Still, in some cases, where the problem is easy to state, you should have a solution program in just a few minutes; whereas now it could take hours or days.

    Anyway that's where I see it go. There are some existing implementations of this kind of thing out there already, but they tend to be small. For example somebody wrote a program to find the shortest program to perform certain operations for gcc backends. You pretty much just point the compiler at a new processor and optimum code pops out, it's kind of neat. But currently this is limited to maybe 15 instructions long. I think that this will grow enormously, particularly if you throw away the constraint of having to be completely optimal, and allow 'only slightly suboptimal' programs; and faster and faster processors are coming down the pipe; so 'searching for code' is becoming more practical.

    Oh yeah, and the idea probably works for parallel and quantum computers too. Parallel searching for optimum parallel code, and so forth.
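
    A toy sketch, in Python, of "searching for code": brute-force enumeration of short programs over a tiny made-up instruction set, keeping the first one whose behavior matches a specification given as test cases. Real superoptimizers (like the gcc work mentioned above) are far more sophisticated; this only illustrates the shape of the idea.

    from itertools import product

    # A tiny instruction set: each op transforms an accumulator.
    OPS = {
        "double": lambda x: x * 2,
        "inc":    lambda x: x + 1,
        "square": lambda x: x * x,
    }

    def run(program, x):
        for op in program:
            x = OPS[op](x)
        return x

    def search(spec, max_len=4):
        # Return the shortest program consistent with all (input, output) pairs.
        for length in range(1, max_len + 1):
            for program in product(OPS, repeat=length):
                if all(run(program, x) == y for x, y in spec):
                    return program
        return None

    # "Specification": map 3 -> 49 and 5 -> 121, i.e. square(2x + 1).
    print(search([(3, 49), (5, 121)]))   # ('double', 'inc', 'square')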

  • by OGmofo ( 189475 ) on Friday April 11, 2003 @01:07PM (#5711586)

    A friend once asked me, "When will computers be so easy to program that a six year old child can do it?"

    I replied, "When six year old children can specify a task with zero ambiguity."

    It's perfectly fine to have some English-like programming, it simply must be as unambiguous as possible:

    Computer, load memory location 40343 into register 6.
    Computer, add register 6 to register 8.
    Computer, if overflow, jump to address 52895.

    I'll stick with C++.
  • Article Summary (Score:3, Insightful)

    by Mannerism ( 188292 ) <keith-slashdot@nOspAm.spotsoftware.com> on Friday April 11, 2003 @01:23PM (#5711725)
    Here's what I got out of it:

    Nobody really has a clue what programming languages will be like in a hundred years, but if all the Perl and Python weenies would learn LISP then maybe we could get somewhere within the next decade.
  • Re:Types (Score:2, Insightful)

    by jameson ( 54982 ) on Friday April 11, 2003 @01:34PM (#5711797) Homepage
    On the other hand, my experience and the experience of others I've seen is that dynamicly typed languages are not only more fun to program in, they produce good results faster.

    Hmm, my experience is slightly different. The 'fun' part is, of course, rather subjective-- I find dynamically typed languages more frustrating, which is why I personally wouldn't agree with that, but, of course, YMMV-- but I definitely disagree about the part with "good results being produced faster".
    They do produce results faster, often even reasonable results. But good results (i.e. stable code) take much longer to achieve, in my experience.

    Put that the other way around -- do you want to spend the rest of your life working against strong typing systems which only detect errors that even the simplest test suite would automatically find?

    I've rarely had to "work against" strong type systems in the past. Sufficiently expressive type systems are important for this to make sense, I'll admit that; it requires a certain kind of thinking to be able to make use of strong typing effectively. This restriction on "free thought" may be what people find most offensive, but I'm not sure.

    Regarding test suites: It is very hard to construct (for a non-trivial program) a test suite that achieves complete code coverage (recall that you'll want to check all of your error and special case handlers). Right now, test suites are the most reasonable approach to handling properties beyond the capabilities of type systems in languages we can come up with ("Does this function terminate? Does the sequence of file system updates it performs correspond to what I want it to do?"), and they are certainly a good complement to static checkers. But they rarely test all of the relevant properties of a program; as such, the necessity for implementing a test suite can be considered as a sign of weakness of static checking. (That doesn't mean they should be omitted if static checking fails-- that'd be dishonest.)

    There are errors which no static type system or static checker will be able to find (thanks to Goedel ;-) but that doesn't mean that we should give up on static checking-- simply because no test suite writer will ever be able to test for all the things that fall out of a static checker for free.

    In conclusion, I believe that we need static typechecking, static checking in general, and test suites for everything we cannot formalise (for whichever reason) if we want to ever resolve the "software crisis". Without any of these, we'll just keep inventing more crappy code.
  • by AxelTorvalds ( 544851 ) on Friday April 11, 2003 @01:38PM (#5711818)
    These "language discussions" are always flawed. There are zealots who like one hammer for every nail and then there are these other zealots that are so far from the actual problems that their ideas are kind of hoaky.

    Hierarchy will continue to exist. It's the only concept the human brain has to deal with complexity; call it what you will, but you classify and associate things into hierarchy whether you're aware of it or not. I see no reason to believe right now that processors will have more advanced instructions than they currently do now; they may be very different (like optimistic registers that know values before they have been calculated or something) but they will be on the same order of complexity. The atomic operations will probably remain at the same order of complexity in biological processors, quantum, or Si/GaAs/whatever based transistor processors. I don't see how sorting a list will be done without some sort of operations to look at elements in it, compare them, and then change their ordering. Even with quantum computers you have to set up those operations to happen and cause results. That being said there will always be an assembly language.

    On top of that there will always be a C like language, if it's not C, that will be a portable assembly language. Then there will be "application" languages built at a higher level still. That won't change, for good reasons, it's just too complex to push the protection and error checking and everything down a level. I'll give examples if you want them. The easiest one that comes to mind is something like Java garbage collection and how programmers assume that it has mystical powers and are shocked when they fire up a profiler and see leftovers sitting around, it's a very complex piece of software and you expect it to go down to a lower level? The lower levels have their own problems keeping up with Dr. Moore.

    I think the other biggest area is that reliability needs to go up by several orders. Linux, BSD, Win2000 and WinXP are pretty reliable but they aren't amazing. I've seen all of them crash at one point or another; I may have had a hand in making it happen and so might have hardware; either way it did. To really start to solve the issues and problems of humanity better we need to have more trust for our computers, that requires more reliable computers and that requires different methods of engineering. The biggest thing going on in programming languages now to deal with that is Functional Programming. In 50 years I could see some kind of concept like an algorithm broker that has the 1700+ "core algorithms" (Knuth suspects that there are about 1700 core algorithms in CS) implemented in an ML or Haskell like language, proven for correctness, in a proven runtime environment, being used in conjunction with some kind of easy to use scripting glue. And critical low-level programming will be proven automatically at compile time; they are already making automatic provers for ML.

  • by matvei ( 568098 ) on Friday April 11, 2003 @01:40PM (#5711835)
    Learning the syntax of a language is trivial compared to problem solving itself. Joe User will make the same logical fallacies (= bugs) using English that he would if he was using Python/C++/COBOL.

    Besides, the syntax of the English language is a lot more complex than that of most common programming languages. Because of that it would be easier for non-native English speakers to learn some simple scripting language than to learn English well enough to avoid syntax errors on line XX.

  • Re:Why Change? (Score:3, Insightful)

    by Arandir ( 19206 ) on Friday April 11, 2003 @01:58PM (#5711942) Homepage Journal
    If we look at the last 100^H^H50 years of programming, we can predict what the next 100 years will bring:

    Safer languages. Forget typesafe languages, we'll have typeless languages. And then the algorithms will be intensely abstracted as well. We'll have functional composition with a usable syntax. We'll create GUIs by overloading the + operator to handle components. And of course, automatic runtime code reuse will be an assumed feature of the language.

    And of course, the past fifty years teaches us that it won't be free. The personal computer of 2100 will have 100 Zigs of RAM, with 1024 parallel SMP processors running at 100.33 Zigahertz, and a display with 128Zps framerate. But the languages of the future will suck up those resources faster than you can upgrade your box. And .NET2100 applications will still run like a dog.

    And the college kids of 2100 will wonder how they ever managed to fit a Linux distro in a 5 DVD set, or how people ever managed to communicate over an 8Gbps connection.
  • by prestonmarkstone ( 265596 ) on Friday April 11, 2003 @02:02PM (#5711978) Homepage
    English actually doesn't really have a written Grammar BTW
    Um, not really. Read any first-year linguistics textbook - all languages are grammatical. Without grammar, you don't have a language.
    The idea of a prescriptive grammar was retro-fitted by the Victorians, yes, but English had grammatical rules of its own long before nineteenth-century grammarians attempted to enforce Latinate rules upon it. wrt the topic at hand, one way to think of human language is to call it ambiguous; another way to think of it is to call it flexible or even extensible. There's the denotative layer of a lexicon, in which words are more-or-less set in their definitions, but there's also a dynamic connotative layer in which words can shift meaning within a certain variable range (e.g., the word 'house' can take on connotative meanings that may or may not correspond with definitions of 'home'; a "house of ill repute," for example, is never referred to as a "home of ill repute").
    The advantage of the flexibility of human language is pretty significant. We get to do things like pun, allude, infer and imply, and most importantly, we get to extend the language on several levels, by creating new words ("meme," "slashdotted") and by creating new definitions for a word ("mouse," "root").
    The downside of all this flexibility, of course, is the enormous and inherent possibility for ambiguity and poorly-formed statements. This is the case with any language, including classical languages in which grammar and syntax have been comprehensively analyzed and documented. But it's a trade-off; reduce a human language to zero ambiguity, and you compromise some of its core features.
  • Re:Notation (Score:4, Insightful)

    by rabidcow ( 209019 ) on Friday April 11, 2003 @02:30PM (#5712184) Homepage
    I think the author expects Lisp to remain a vital evolutionary branch because of its mathematical roots.

    I think he expects it to remain a vital branch because recent languages have been more and more like lisp. If lisp doesn't directly beget new, highly popular languages, lisp's features will be (and have been) absorbed into whatever does become popular.
  • Re:Why Change? (Score:3, Insightful)

    by __past__ ( 542467 ) on Friday April 11, 2003 @03:20PM (#5712582)
    Still trying to get the same old program to run without coredumps, probably.
  • Re:Well said! (Score:3, Insightful)

    by pHDNgell ( 410691 ) on Friday April 11, 2003 @03:25PM (#5712613)
    I'm a professional java programmer. I also do objective C (before Cocoa), Smalltalk, python, C, scheme, Haskell and a few other languages regularly. I've managed large perl based projects, written high-performance data processors in Ocaml, written a billing system in C++, a monitoring system mostly in tcl, games in assembler, data processing systems in eiffel, and hell, I even did a CueCat decoder in javascript a while back.

    While I've still got quite a bit to learn, I can say you're missing the point if you don't understand what he's saying about java being a dead end on the evolution of languages.

    What does java have to offer to evolution? I can't think of much that java as a language has brought us that's new.

    Being VM based? It's more acceptable now, but it's not new. OO? It barely does that well at all. Hmm... Libraries. JDBC is ODBC with a touch of sanity. The collections look very much like the collections of Smalltalk finally (which also appear in objective C).

    Overall, without doing something revolutionary, I don't see how it can contribute that much to evolution.

    Revolutionary languages are languages like lisp (which is a sad story in that it's still capable of doing everything any other language can do, often with less effort), smalltalk (actual OO is a lot more simple than partial OO), Haskell (smaller change, but you can be terse without being cryptic...almost a common language for design and specification), etc...

    Many languages have made some contributions to programming in general and will change the way people think about development. Python may not hold the influence of lisp or smalltalk, but it does do some things that future language designers will consider (for example, yield for generators vs. call/cc is an excellent improvement in helping express what you mean).
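
    For readers who haven't met the yield point: a Python generator states a lazy sequence directly, where a call/cc formulation would drag explicit continuations into view. A small illustration:

    def fibonacci():
        # An infinite, lazily produced sequence; the control flow reads like
        # the definition, with suspension handled by yield.
        a, b = 0, 1
        while True:
            yield a
            a, b = b, a + b

    fib = fibonacci()
    print([next(fib) for _ in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
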
  • by MobyTurbo ( 537363 ) on Friday April 11, 2003 @03:40PM (#5712732)
    It may seem presumptuous to think anyone can predict what any technology will look like in a hundred years...Looking forward a hundred years is a graspable idea when we consider how slowly languages have evolved in the past fifty.
    Hmm...funny, fifty years ago, if I remember my history (since I wasn't alive back then), those relay computers
    Actually, relay computers were 1930s. They were using vacuum tubes in the late 40s, and were less than a decade away from transistors fifty years ago; not that they weren't just as primitive.
    needed rolls and rolls of ticker-taped punch holes to compute math.
    Punch cards were a limit IBM placed on the technology because IBM thought compatibility with their previous non-computer automated machines that used punch cards would be a big plus in selling them to existing clients. Actually IBM's competition, using magtape, had a better form of input/output; IBM set back the computer industry years in doing this.
    The language was so-low-level...even x86 Assembly
    One thing you've got to understand about Paul Graham is he is, for better or for worse, a big fan of LISP; a language that began in 1958 and is still used in Artificial Intelligence and other things (like Orbitz and Paul Graham's own Yahoo! Store) today. Since LISP has a lot of ability to use abstraction, and object oriented programming is a narrower level of abstraction, it does seem that OO isn't so revolutionary. (Even if you go by OO history alone specifically without recourse to comparing it with LISP, it is over 30 years old - Simula was written in the late 60s.)
  • my grade (Score:4, Insightful)

    by bob dobalina ( 40544 ) on Friday April 11, 2003 @03:45PM (#5712766)
    The signal-to-noise ratio in this piece is low. There's lots of metaphors and similes to explain his otherwise very facile points.

    He also seems to be contradicting himself. " Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency. ", he says at one point, then a few paragraphs later says "What's gross is a language that makes programmers do needless work. Wasting programmer time is the true inefficiency, not wasting machine time.". The efficiency in implementing strings in programming languages is for the programmer, who doesn't have to use said "compiler advice" and carefully separate his strings from his other, non-string list instances and keep the two distinct in his programming model. Apparently it's "lame" to simplify text manipulation for programmers, but at the same time the efforts of programming language design should be towards making the programmer's life easier. Which is it? I know strings and string libraries have made my life a whole lot easier.

    Nevertheless, I'm willing to accept the notion that eliminating strings and other complex, native datatypes and structures serves to make a programmer's use of time more efficient. But how does it do it? Graham doesn't say, he just waxes nostalgic about lisp and simpler times and languages.

    I don't think the slashdot crowd needs it explained why data manipulation by computer needn't be simplified; it already is, as machine code is binary in the common paradigm. What ought to be simplified is data manipulation by humans, and on this point Graham nominally agrees (I think). This has been the thrust of the evolution of programming from machine code to assembler to high level language. Simplifying high level languages into more and more basic statements -- getting closer to the "axioms" that Graham calls tokens and grammars -- simply reverses that evolution. It makes it easier and more elegant to compile programs, but it does absolutely zero to make the programmer's life more efficient, or easy. The whole reason high level languages were developed was precisely to get away from this enormously simple, yet completely tedious way of programming.

    The overarching fallacy in this article is Graham's reliance on what is known about computation theory now to determine what programming languages would (and should) look like then. And while it's interesting to prognosticate on what the future will be like 100 years from now based on what we have today, it's not a reliable guide. Like Metropolis, A Trip to the Moon, and other sci-fi stories from the distant past, such predictions are entertaining and no doubt seem prescient to the people of the time, but when we reach the date in question they are largely off the mark. It's somewhat laughable now to see imagined futures where, despite the flying cars and soaring skyscrapers, steam engines power the cities and robots have eyes and mouths. Likewise, I don't think an honest, intelligent forecast of (high-level) programming languages 100 years hence can be made without a firm idea of what assembly code will look like then. That, in turn, relies on a firm idea of what computer architecture will look like. Who knows whether five (or fifty) years from now a coprocessor will be designed that makes string functionality as easy to implement as arithmetic? Such an advance would completely invalidate Graham's point about strings and advanced datatypes, and possibly stand modern lexical analysis on its head. Or an entirely new model of computation might come to the fore. Even Graham himself admits that foresight is foreshortened: "Languages today assume infrastructure that didn't exist in 1960." But he doesn't let that stop him from making pronouncements on the future of computing.

    Graham seems to be spending too much time optimizing his lisp code and not enough on his writing. This piece of code could have been optimized had he used a simile-reductor and strict idea explanations. But it's definitely a thesis worth considering, if for no other reason than mild entertainment. C-
  • by s88 ( 255181 ) on Friday April 11, 2003 @04:30PM (#5713061) Homepage
    The objective is to create a syntax which is the simplest way to convey the semantics of the program.

    Do you see mathematicians "evolving" into using natural language?
    "Four plus four times two divided by three"

    I highly doubt it. Natural language is not the best way to describe math, nor is it the best way to describe computer programs.
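    (A small Python sketch, my own illustration rather than the poster's: the spoken phrase above admits several readings, and only explicit notation pins down which one you mean.)

        # Three plausible readings of "four plus four times two divided by three".
        readings = {
            "4 + ((4 * 2) / 3)": 4 + ((4 * 2) / 3),   # conventional operator precedence
            "((4 + 4) * 2) / 3": ((4 + 4) * 2) / 3,   # strict left-to-right reading
            "(4 + (4 * 2)) / 3": (4 + (4 * 2)) / 3,   # "divided by three" binds to everything
        }
        for expr, value in readings.items():
            print(f"{expr} = {value:.4f}")             # three different answers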

    While I am positive we will have excellent natural language user interfaces for programs in the future, this is only because the end user doesn't want to learn a new, more expressive language, he wants to use what he knows (English).

    This is the very nature of abstraction. Abstraction inherently causes loss of expressiveness.

    I don't want to "speak" my program to my computer; I want to describe the algorithm in the most expressive and elegant way.

    Scott
  • by Raffaello ( 230287 ) on Friday April 11, 2003 @06:56PM (#5713938)
    "Projects like Squeak are looking more and more like Java these days, in terms of re-usability and VM-based platform independence."

    Wow, do you know anything about the history of programming languages?

    Java's OO is based on Smalltalk, quite intentionally, not the other way around.

    Smalltalk ran on cross-platform virtual machines long before Sun decided to foist its failure of a set-top box language (i.e., Java) on the internet in the mid '90s.

    Squeak is a deliberate attempt to recreate the original Smalltalk from ca. 1970.

    So, yes, Java is an evolutionary dead end because it badly implements only some of the features that Smalltalk had 30 years ago, and contributes nothing new, unless, of course, you consider the exalted levels of marketing hype.
  • by pHDNgell ( 410691 ) on Saturday April 12, 2003 @12:25AM (#5714953)
    The popularity of a language has little to do with its contributions to the evolution of future languages. Sure, popularity means that more people get to see it and what-not, but the people who write languages tend to know more languages than just the one they wrote.

    To call languages like lisp, smalltalk, objc, etc... ``barely usable'' is to make it clear you've barely used them.

    While you may not believe that C is much different from fortran or pascal, it's not derived from either. C's father is a language called B. Algol 60 and 68 also influenced the development of C. I can't speak too much on the history of C++, but it's misleading at best to say that C inspired Objective C. Smalltalk inspired Objective C. C was the default system programming language for quite a while (i.e. people were having to use it anyway), and Brad Cox wanted to add the features of smalltalk to it to make application development easier. Not to say that C didn't play a major part in this development, that'd be stupid, C is clearly the father. Objective C, however, is the mother who stayed at home and took care of it while the father was out working on bigger and more bloated things.

    Javascript is in *no way* related to Java. Despite the name, which was pure marketing, the two share only a few items of syntax.

    So, you never said what Java brought other than popularity. If you design a language that has the collections API of Java, you've just modeled a language after Smalltalk. If it's got RMI, then I hope you at least get it close to as good as Objective C did (which was doing it before Java). If you plan on using Java as a reference for a reflection API, then please, use it as a bad one, because it was done much better in... I don't know, just to pick something, let's say Python. Cross-platform UI? I was writing cross-platform Perl apps using Tk before I did any UI work in Java (and it was easier, though I find it easier still in Tcl, which is the only language in which I've actually deployed a cross-platform UI)... etc...
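    (For what it's worth, a minimal Python sketch of the kind of reflection the parent comment is pointing at; the Greeter class and method names here are made up for illustration.)

        import inspect

        class Greeter:
            def greet(self, name: str) -> str:
                return f"Hello, {name}"

        obj = Greeter()
        method = getattr(obj, "greet")            # look a method up by name at runtime
        print(method("world"))                    # call it like any other function
        print([m for m, _ in inspect.getmembers(obj, inspect.ismethod)])
        print(inspect.signature(method))          # (name: str) -> str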
  • 25-year language? (Score:3, Insightful)

    by karlm ( 158591 ) on Saturday April 12, 2003 @04:06AM (#5715423) Homepage
    We really have no idea what the computing environments will be like in 100 years, nor what the core uses of computers will be. I think hoping to use languages "along that evolutionary path" is putting the cart before the horse. I think 25 years is about as far ahead as you can reasonably hope to predict. It's like a 1-week forecast: nobody asks the weather channel for a 1-month forecast, because the margin of error becomes astronomical.

    One poster claimed quantum computing will make current languages useless. This is false. Any reasonably flexible language has room for new data types and operators. You would have to be careful not to prematurely branch based on a quantum value, as this may force you to observe its value, destroying the superposition. However, I don't doubt Fortran will be one of the first languages used in quantum computing once they get past the assembly-language stage.

    What will languages be like in 25 years (and maybe 50 years)?

    Well, there will be a Fortran and a Lisp and a C (maybe a C++). Lisp has always had automatic garbage collection. The Fortran and the C will have optional garbage collectors. Fortran, Lisp, and C are all decent attempts at languages that are pretty easy to grasp and have huge legacy backing.

    Hopefully all of the main languages will be less machine-dependent. Fixnums, ints, longs, floats, and doubles will be the same size across platforms, wasting a few cycles if this doesn't fit the underlying hardware.

    In terms of new, novel languages, I see languages simultaneously going three ways. I foresee languages that resemble formal proofs and/or formal specifications, for use where reliability is critical. I foresee languages specialized for scientific/engineering disciplines. (Maybe Fortran, Excel, and Matlab cover all of the bases well enough, but I hope there is enough room left for improvement to drive innovation and adoption. Having a CS background, I didn't appreciate LabView's "G" language until I had an opportunity to see the ugly, ugly code scientists and engineers tend to write in Fortran.) (I can also imagine efforts to use syntaxes that better express parallelism and other features for optimizers/compilers, so we finally have a widely used scientific/engineering language that is faster than Fortran 77. I can also see more languages like Bell Labs' Aleph, designed for parallel/cluster environments.) The third direction I see languages going is scripting/prototyping-like languages that look more like natural language. (It's too bad there isn't a cross-platform, open-source AppleScript-like language yet.)

    What do I think languages should have? Languages should have garbage collection and bounds checking enabled by default (of course, optionally turned off if you really must have the performance).

    Languages should have very clean and consistent APIs. Having few, orthogonal types helps make a language clean. Languages should merge character arrays and strings (arguably the Algol languages have had this for a while). If a language wants to be able to have immutable strings, it should provide a way to declare an immutable variant of each fundamental type. (This is actually very useful in writing less buggy code.) Languages should strictly define the size of fundamental numeric types. (I really like Python, but it seems a huge mistake that an integer is "at least 32-bits". Allowing variation in the size of fundamental numeric types adds cross-platform bugs. If I wrote a language, the types would look like "int32", "int64", "float32", "float64", "complex32" and "complex64". We got rid of 9- and 12-bit bytes; we should get rid of these headaches too.) Having worked with lots of engineers and scientists, I would love to see complex numbers as basic numeric types that all of the normal operators work on. Wrapping two doubles in an object adds a function call overhead for each numeric operation. Performance with complex numbers (and numbers in general) is one big reason a lot of the code I see is written in Fortran.
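    (A short sketch of those two wishes; this is my own illustration, not the poster's code, and it assumes NumPy is available: fixed-width numeric types that are the same size on every platform, and complex numbers as ordinary values that the usual operators accept.)

        import numpy as np

        x = np.int32(7)
        print(x.dtype, x.nbytes, "bytes")   # int32, 4 bytes, regardless of host word size

        z = (1 + 2j) * (3 - 1j)             # complex is a built-in numeric type in Python
        print(z, abs(z))                    # (5+5j) 7.0710678...

        zf = np.complex64(z)                # fixed-size complex: two float32 components
        print(zf.dtype, zf.nbytes, "bytes") # complex64, 8 bytes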

  • by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Sunday April 13, 2003 @11:45AM (#5721442) Homepage Journal
    I hold the best computer languages for actually programming the computer will still be analytic, logical languages.

    I guess this is the fundamental point on which we just have to agree to disagree. I think that analysis and logic are critical operations, but I hope to find that the computer languages of the future will cease to be pedantic about the specific mode of expression, perhaps building in a sense of redundancy of expression so that no matter what language you express the idea in, it ends up with effectively the same internal representation.

    One of the biggest differences right now about how computers do things and how people do things is that computers do not "degrade gracefully" when you go outside the ordinary way they expect to receive things. They tend to "fail catastrophically" on the least little deviation from the expected. In a hundred years, I hope they learn to be more laid back about what doesn't really matter (the manner of expression) and to focus more on what really does matter (the goal of the expression).

    It might be that they will fail at the goal for the first few revs. But if we don't at least deploy them with the intent of trying, we won't get there.

    Sometimes the path toward the future is a crooked one. For a further illustration of this phenomenon, search for (and read) 'A Personal Footnote' at the end of my 2001 paper on error handling in Lisp [nhplace.com].
