The Hundred-Year Language

dtolton writes "Paul Graham has a new article called "The Hundred-Year Language" posted. The article is about the programming languages of the future and what form they may take. He makes some interesting predictions about the rate of change we might expect in programming languages over the next 100 years. He also makes some persuasive points about the possible design and construction of those languages. The article is definitely worth a read for those interested in programming languages."
  • AI (Score:2, Interesting)

    by Anonymous Coward on Friday April 11, 2003 @10:42AM (#5710390)
    In 100 years, I would expect computers to be writing their own code, and rewriting it again to evolve.
  • Re:how long (Score:5, Interesting)

    by GnuVince ( 623231 ) on Friday April 11, 2003 @10:43AM (#5710396)
    Forth can be used a little bit like that (example taken from "Starting Forth", by Leo Brodie):

    \ Word definitions
    : convicted-of  0 ;
    \ To convict someone
    : murder        25 + ;
    : arson         10 + ;
    : robbery        2 + ;
    : music-copying 40 + ;
    : sentenced-to  . ." years of prison" ;

    And to use it:

    convicted-of music-copying robbery sentenced-to

    Output: 42 years of prison

    This looks quite like English. Of course, you can do that in many languages, but it feels more natural in Forth, I think.

  • by Mxyzptlk ( 138505 ) on Friday April 11, 2003 @10:48AM (#5710430) Homepage
    When I say Java won't turn out to be a successful language, I mean something more specific: that Java will turn out to be an evolutionary dead-end, like Cobol.

    Er... I don't think that Cobol is an evolutionary dead-end; in a perfect world it would be extinct, but it isn't. What makes a language widely used is something we can't predict right now - we have to watch it evolve over time, and look at different aspects as it grows and matures.

    Take architecture, for example - new buildings are loved for the first five years because of their freshly introduced ideas. After that, all the problems start to appear - mildew, asbestos in the walls, and so on. During the next ten years, the teething problems are fixed. It is only a HUNDRED YEARS after the new building (or, in our case, the new programming language) appears that it can be properly evaluated. By then the language/building has either been replaced, or it has survived.

    So - the only proper way to measure the success of a programming language is to measure its survivability. Sure, we can make guesstimates along the way:

    During introduction: Does the language have a good development environment? Is the language backed/introduced by a market leader?

    Somewhere during the "middle years" (after about ten years): Does the language have a large user base? Does the language have a large code base?

    After twenty/thirty years: ask the programmers if it really is maintainable...

    Well - you get the picture! Predicting the survivability of something more than five years into the future is impossible, I'd say.
  • VB Problems (Score:2, Interesting)

    by nigel.selke ( 665251 ) on Friday April 11, 2003 @10:50AM (#5710445) Homepage
    Smarter compilers and more powerful hardware will, to some extent, negate the need for the strongly typed and down-to-the-metal languages we've seen in the past, but VB has several limitations that will prevent it from taking over other languages:
    Lack of portability: This will become increasingly important as companies and individuals move away from Microsoft as Microsoft pushes its luck further and further by strangling the market.
    Basic syntax, hacked-on OO: The use of Basic syntax can cripple larger projects; add to this the lack of proper OO in VB, and you have a problem.
    Too many power-user add-ons: VB has become a language for people who just want to buy third-party add-ons and plug them in. While this is fine in theory, it makes the program segment's modules difficult to integrate with the rest of the project, and it encourages lazy practices or even a lack of knowledge in the programmer.

    The only way VB will retain any large number of its current userbase is by being completely committed to the .NET infrastructure.

    Meanwhile, languages like Java, Python, Perl and PHP will continue to grow and gain more and more users among tech-savvy individuals.
  • History and Future (Score:5, Interesting)

    by AbdullahHaydar ( 147260 ) on Friday April 11, 2003 @10:51AM (#5710458) Homepage
    This is a really interesting paper [unc.edu] on the history and future of programming languages. (Check out the history chart in the middle....)
  • Re:dead-end? (Score:1, Interesting)

    by Anonymous Coward on Friday April 11, 2003 @11:01AM (#5710536)
    Not really. C# and Java are sister languages. Both were "spawned" by C++ and Smalltalk.
  • Types (Score:2, Interesting)

    by jameson ( 54982 ) on Friday April 11, 2003 @11:03AM (#5710551) Homepage
    "For example, types seem to be an inexhaustible source of research papers, despite the fact that static typing seems to preclude true macros-- without which, in my opinion, no language is worth using."

    This bold statement is not only wrong (cf. Peyton Jones' latest work on macros in Haskell), but also misleading. Let's start off with some opinion: In my opinion, no language without static typing is worth using. The reason is simple: Because I am human. I make mistakes. And I don't want to spend the rest of my life writing test suites to check for errors which even trivial type systems can detect.

    I agree with one thing: Languages will become simpler on a mathematical level. Anyone who has used ML or Haskell will have noticed how much easier these are to understand in comparison to any imperative language out there (and, by the way, in Haskell, Strings are lists of characters). But, at the same time, I truly hope that mechanisms for proving properties about programs will become not only more powerful, but also more widespread. I would like to have static verification of my pre- and postconditions. I would like to verify that my 'sort' function returns a permutation of its input in which each element is less than or equal to its successor. These are the things I'm looking forward to seeing in the future.
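
    For concreteness, here is a minimal sketch of that sort property. It is written in Python, so it only checks the property at runtime rather than proving it statically, and the helper name is made up for illustration:

        from collections import Counter

        def is_valid_sort(inp, out):
            # True iff 'out' is a sorted permutation of 'inp'.
            same_elements = Counter(inp) == Counter(out)         # permutation check
            ordered = all(a <= b for a, b in zip(out, out[1:]))  # each element <= its successor
            return same_elements and ordered

        assert is_valid_sort([3, 1, 2], sorted([3, 1, 2]))
        assert not is_valid_sort([3, 1, 2], [1, 2])              # dropping an element fails the check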
  • What about ASM? (Score:3, Interesting)

    by siliconwafer ( 446697 ) on Friday April 11, 2003 @11:08AM (#5710596)
    High-level languages will change. But what about assembly? What about programming for embedded systems?
  • My prediction. (Score:4, Interesting)

    by An Onerous Coward ( 222037 ) on Friday April 11, 2003 @11:10AM (#5710619) Homepage
    Try this on for size: In 100 years, computer languages won't exist, or at least won't be used for anything but toy programs. Programs will be created, tested, and debugged through genetic algorithms. Nobody programmed them, nobody is exactly sure how they do what they do, and it works so well that nobody really cares to find out.

    We're already at the point where it's absurd for a single person to understand the whole of a software project. Things are only going to get worse from here, and the only way out is to let the computers manage the complexity for us. As computers become faster, they'll be able to test an ungodly number of permutations of a program to see which ones perform the fastest, or give the best results.

    Just a speculation. I don't wholeheartedly believe what I just said, but I think it's a bit silly to simply assume that programming languages will be around forever.
  • Re:how long (Score:4, Interesting)

    by yasth ( 203461 ) on Friday April 11, 2003 @11:10AM (#5710620) Homepage Journal
    As anyone who has worked on Natural Language Processing can tell you, natural language is a bugger. It is very context-driven and, to top it all off, has a good deal of redundant syntax (a, the, subject-verb agreement, etc.). Human language is a very nice protocol for transferring ideas (it is in many ways a system designed to transmit through noisy environments by many users, all of whom differ in their individual implementation of the standard). Natural spoken language is less good at commands, and is particularly bad for unsupervised commands.

    For unsupervised commands, humans tend to create something not all that different from code. A fixed grammar and vocabulary come into play (i.e. little slang, and a very normalized style). For example:

    Employees will update their status on the In/Out board in the lobby when they will be gone for more than 15 minutes.

    which is roughly:
    (if (> (expected-completiontime task) 15)
        (update-status out))

    So the need and utility isn't there.
  • by unfortunateson ( 527551 ) on Friday April 11, 2003 @11:14AM (#5710644) Journal
    The article seems a bit naive about data structures and their evolution into objects.

    Strings aren't lists, they're structures.

    Most string use in programs is a holdover from teletype-style programming, where all you could display was a short (ahem) string of characters. Today's string use is a label for a data item, a menu item on a menu, a data object in a domain.

    XML -- as clunky as it can seem -- and XUL in particular, are ways of describing user interface to a system as a tree of objects.

    So I don't want lists of characters, I want associative structures of objects which can be of many different types, used in the manner required by the program (it's a string, it's a number, it's a floor wax, it's a dessert topping).

    I'm trying really hard to avoid saying "object-oriented," but objects will become more complex and more abstract. Computers of the future may not have to worry about pixels in an image, but rather know the object itself, where a bitmap is just an attribute of the thing.

    Perhaps driver- and compiler-writers will still need stripped-down languages for efficient access to hardware, but as an app programmer and end user, I want the computer to handle statements like,

    BUY FLOWERS FOR ANNIVERSARY

    Currently, that would be something like
    event("Anniversary").celebrate.prepare.purchase($flowers)

    That's not nearly abstract enough.

  • by MickLinux ( 579158 ) on Friday April 11, 2003 @11:19AM (#5710696) Journal
    The evolution of languages differs from the evolution of species because branches can converge. The Fortran branch, for example, seems to be merging with the descendants of Algol. In theory this is possible for species too, but it's so unlikely that it has probably never happened.

    Ummm... how about lichen? Our mitochondria? What about the parasitic relationships that become mutually beneficial, such as the numerous bacteria in our gut and on our skin, and eventually become necessary for life?

    Merging actually does happen -- it just doesn't happen in the way he was thinking, that DNA become identical and cross-species fertility occurs. Rather, the two organisms live closer and closer, until they merge.

    Come to think of it, although it isn't on the species level, the concept of merging species isn't too different from sexual reproduction.

  • by Saige ( 53303 ) <evil.angelaNO@SPAMgmail.com> on Friday April 11, 2003 @11:22AM (#5710730) Journal
    Interesting article, but I think it has a serious flaw: it assumes that the programming languages of the future will just extend the current model even further.

    Some of us working in the telecommunications industry are already familiar with SDL (Specification and Description Language) [sdl-forum.org] as a tool for designing and auto-coding software. Yes, auto-coding. The SDL design software lets us design a system graphically, breaking it up into sub-components, specifying message flows between those components, and defining state machines for handling these messages.

    Developing software in such a manner usually requires very little coding, as the design tool turns the design into code. Coding may be required for interfacing with the OS or other entities, though that's improving too.

    I'm starting to think as such tools mature, they're going to be the next step up, like the way programming languages were the step up from coding in assembly. They are less efficient, just as BASIC or C is less efficient than pure assembler, but they allow greater focus on a solid and robust design and less requirement to focus on repetitive details.

    Imagine being able to take out the step of having to go from a design to code - focus on the design, and you're done.
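
    To make the state-machine idea concrete, here is a tiny, hypothetical sketch in Python of the kind of message-handling state machine an SDL-style tool would generate from a graphical design (the Phone example and its states are invented for illustration):

        class Phone:
            def __init__(self):
                self.state = "idle"

            def handle(self, message):
                # Each (state, message) pair maps to the next state;
                # unknown messages leave the state unchanged.
                transitions = {
                    ("idle", "incoming_call"): "ringing",
                    ("ringing", "answer"): "connected",
                    ("ringing", "reject"): "idle",
                    ("connected", "hang_up"): "idle",
                }
                self.state = transitions.get((self.state, message), self.state)
                return self.state

        p = Phone()
        assert p.handle("incoming_call") == "ringing"
        assert p.handle("answer") == "connected"
        assert p.handle("hang_up") == "idle"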
  • Re:What about ASM? (Score:3, Interesting)

    by BenjyD ( 316700 ) on Friday April 11, 2003 @11:26AM (#5710754)
    In a hundred years' time, the cost of processing power will likely be *much* lower. The price difference between putting a 1 MIPS or a 10 GigaMIPS processor in a toaster will be on the order of a fraction of a penny.

    The few remaining areas for ASM programming - embedded, SSE-like optimisations - are being eroded gradually as processors and compilers get better.
  • Re:how long (Score:3, Interesting)

    by Anonymous Coward on Friday April 11, 2003 @11:32AM (#5710797)
    I thought Chomsky had a lot to say about this.

    Structurally, spoken languages and computer languages are very similar:

    Phonetics: sounds
    Phonology: sounds in relation to one another
    Morphology: words
    Syntax: structure (words in relation to one another)
    Semantics: meaning
    Pragmatics: meaning in context.

    Morphology, Syntax and Semantics are shared by human and computer languages. Arguments could be made about phonology, too, but not by me. Some computer languages might even have pragmatics. (Example of pragmatics: when one says "it's hot in here" one means, 1) it's hot in here and 2) somebody get off their ass and open the damned window.) I'm not familiar enough with computing languages to say if a command means one thing in one instance and means something else in another instance, or has two meanings simultaneously.
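
    One small illustration of a command meaning different things in different contexts, sketched here in Python (this is closer to plain context-dependence than true pragmatics, but it is the nearest everyday analogue):

        print(2 + 3)        # 5      -> '+' means numeric addition
        print("2" + "3")    # '23'   -> '+' means string concatenation
        print([2] + [3])    # [2, 3] -> '+' means list concatenation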

    Human language is full of redundancies. Some computing languages have some redundancies. Perl springs to mind (no wonder... Larry Wall was a linguist) with its "there's more than one way to do it" creed.

    I don't think computing languages will reach the full complexity and redundancy of human language. One main reason is that human language is an extension of the human thought process. Now, if you want to read the previous posting about the Turing Test, please, feel free....
  • Re:Awareness... (Score:2, Interesting)

    by ekephart ( 256467 ) on Friday April 11, 2003 @11:35AM (#5710833) Homepage
    Imagine cars that, before changing lanes, signal to the surrounding cars' navigation systems and they work out for themselves how to let the car into the lane. A computer can be told to slow down, rather than speed up, when someone wants to change lanes.

    Can we not do this now? All planes have systems that direct them away from other planes when they get too close. If you mean that this will be done without a connection between said objects, then there must be some set of rules that are followed. Very similar to those that we "intelligent" beings use when we drive. For instance:
    -I signal to change lanes
    -Person in lane over decides whether to let me in
    -If I see them back off a bit then I proceed; if they speed up then I get behind them

    Everything is logic based. In addition, today we do deal with logic in meaningful ways. What's worth remembering is that all logic is based on AND OR NOT. You can derive any logical expression from these operators. For instance:

    R - S = R AND (NOT S).

    Essentially "Give email from mom a higher priority" would boil down to the same logic as "if subject ~= /*mom*/" in the same way that printf("Hello World\n"); is just a bunch of instructions and addresses is just a bunch of 0s and 1s.
  • by Jagasian ( 129329 ) on Friday April 11, 2003 @11:39AM (#5710865)
    I think it's important not just that the axioms be well chosen, but that there be few of them. Mathematicians have always felt this way about axioms-- the fewer, the better-- and I think they're onto something.


    Anyone who has studied theoretical computer science and/or programming languages knows that such reductionism is a fallacy. "...the fewer, the better..."

    It turns out that it's better to strike a balance, where you make the formal mathematical system (which is what a programming language is, after all) as simple as possible, until you get to the point where making it simpler makes it more complicated. Or, in other words, making it simpler would cloud the mathematical structures that you are describing.

    Here are some examples of reductionism gone too far: Sheffer stroke, X = \z.zKSK, one instruction assembler, etc...

    The only logical connective you need is the Sheffer stroke... but that's of no use to us, as it is easier to use more connectives, such as conjunction, disjunction, implication, and negation.
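
    As a sketch of what that reduction looks like in practice (Python here, just to spell it out; the names are for illustration), every other connective can be built from the stroke alone, but nobody would want to read code written this way:

        def nand(a, b):                 # the Sheffer stroke
            return not (a and b)

        def NOT(a):        return nand(a, a)
        def AND(a, b):     return nand(nand(a, b), nand(a, b))
        def OR(a, b):      return nand(nand(a, a), nand(b, b))
        def IMPLIES(a, b): return nand(a, nand(b, b))

        assert OR(False, True) is True
        assert IMPLIES(True, False) is False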

    The only combinator you need is X, and you can compute anything... but making use of other combinators - or, better yet, the lambda calculus - is more useful.

    The point is that we need powerful tools that we can actually use, and there is no simple description of what makes one tool better than another. Applying reductionism can result in nothing special.

    The true places to look for what the future brings with regards to programming languages are the following:

    1. Mobile-Calculi: pi-calculus, etc...
    2. Substructural Logics: linear-logic, etc...
    3. Category Theory: It is big on structure, which is useful to computer scientists.

  • strings (Score:3, Interesting)

    by roskakori ( 447739 ) on Friday April 11, 2003 @11:44AM (#5710911)
    from the article:
    Semantically, strings are more or less a subset of lists in which the elements are characters. So why do you need a separate data type? You don't, really. Strings only exist for efficiency.

    I think strings mainly exist because of usability considerations - from the developer's point of view. They provide a compact notation for "list of characters". Furthermore, most languages come with string routines/classes/operators that are a lot more powerful and flexible than their list equivalents.

    Efficiency definitely is a consideration, but not the main one.
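
    A small Python illustration of that usability point (the example strings are arbitrary): a string behaves like a list of characters, but the dedicated type carries much more convenient routines:

        as_list = list("hello world")      # ['h', 'e', 'l', 'l', 'o', ' ', ...]
        as_str = "hello world"

        print(as_list[0], as_str[0])       # both index like a sequence: h h
        print(as_str.upper())              # HELLO WORLD -- a string routine with no built-in list counterpart
        print(as_str.split())              # ['hello', 'world']
        print("".join(as_list) == as_str)  # True, but getting back to a string takes extra notation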

  • LISP in 100 years (Score:4, Interesting)

    by axxackall ( 579006 ) on Friday April 11, 2003 @11:55AM (#5710991) Homepage Journal
    In 100 years, LISPers will finally agree on different shapes of brackets. In fact, they will accept that something like (defbracket {} metafunctor ...) will make something like (abc {x y z}) possible.

    Needless to say, they will also agree on operators: (defop + a b (+ a b))

    That was a joke, and you can do a similar thing even today. Seriously, I strongly agree with these three quotes:

    • "Lisp is a programmable programming language." - John Foderaro, CACM, September 1991;
    • "Lisp isn't a language, it's a building material." - Alan Kay;
    • "Greenspun's Tenth Rule of Programming: any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp." - Phil Greenspun;
    Thus, I think that if the underlying language for most OS components were something like LISP, then the whole concept of programming would be different. It could not happen before, limited as it was by available hardware performance and the quality of LISP implementations. But the same was once said about Java.

    So, if there is a commercial effort to push LISP to the market again as an underlying metalanguage then, if not in 100 years then in 2 or so, we may see all programming languages become "LISP-derived". Add to this that LISP syntax is semantically much better than XML, yet just as uniformly parseable. The only problem with LISP today is that it's not as "distributed" as Erlang. Fix that and you'll get the language of the near future.

    ---

    I don't know the future. I barely remember the past. I see the present as a blur. Time doesn't exist. The reason is irrational. The space is irrelevant. There is no me.

  • Re:Awareness... (Score:5, Interesting)

    by Kallahar ( 227430 ) <kallahar@quickwired.com> on Friday April 11, 2003 @12:03PM (#5711053) Homepage
    If you're going to have the cars sort themselves out, why bother with signals at all? If everything is guided by GPS, why have headlights? If it's not the human doing the driving, why have traffic laws that are there to punish human errors?

    Travis
  • Totally wrong (Score:3, Interesting)

    by Srin Tuar ( 147269 ) <zeroday26@yahoo.com> on Friday April 11, 2003 @12:23PM (#5711209)

    If you ever listen to the types of commands they give to their computers in Star Trek, they are subjective and ambiguous. Any computer capable of understanding such commands would have no need for the crew (as it would quickly realize).

    My alternate prediction, assuming that AI does not compute, is that we will always need people who know how to use computers, and we will always need people who know how to think.

    Future languages may free you from peccadilloes, give you greater code reuse and portability, improve the mapping down to machine language, reduce the amount of time/space it takes to express algorithms, and possibly allow a larger degree of algorithmic analysis. What they will not do is free you from the need for programmers.

    I seriously doubt that idiots with powerful computers can accomplish anything.

    Being able to use a computer is the equivalent of using a weapon such as a sword or spear. It's a weapon for your mind.

    Weapons are called force multipliers by the military for good reason: a totally out of shape clumsy slob with a sword is less dangerous than a fit and well trained warrior with a dagger. The same goes for computers, they are force multipliers, but not forces themselves.

    Eventually, all our warriors (thinkers) will also be programmers. Not all at the same level or using the same languages and tools, but some sort of programmers for sure.

  • Java bad? (Score:3, Interesting)

    by MrBlue VT ( 245806 ) on Friday April 11, 2003 @12:38PM (#5711319) Homepage
    I found it interesting that right at the outset he dismissed Java as an "evolutionary dead-end" with no explanation of that comment in the whole article.

    The points he makes about what the good languages are seem to show that Java is indeed a good language. Specifically it has an additional layer that allows for abstraction from the hardware/operating system for portability. It takes care of mundane details for the programmer (garbage collection, no need to worry about dealing with memory directly, etc).

    Basically the article seemed to repeat itself a lot and show that Java does indeed have a lot of the good qualities he thinks will be in future languages. He also dismisses object-oriented programming as the cause of "spaghetti code" without giving any justification for that statement. Finally, he slips in a nice ad hominem attack by saying any "reasonably competent" programmer knows that object-oriented code sucks.

    I think the author's own biases hurt his argument greatly.

  • by Anonymous Coward on Friday April 11, 2003 @12:38PM (#5711323)

    Lisp damages the brain.

    Having spent a year in University temporarily attached to the cult of Lisp (I'm better now, thanks), I can now spot the tell-tale signs of Lisp-induced brain damage in this "article". Among the more obvious ones:

    • Obsession with mathematical axioms
    • Dissing object-oriented programming as having "no future" or "usefulness". This is only true if you program in Lisp. Otherwise, we've all, in one way or another, been using an OO approach to programming even when it was just Abstract Data Types (ADT) with a library.
    • A pathological, unhealthy obsession with language orthogonality. Having a couple of odd-ball cases in your language is OK as long as it improves expressiveness in the language. A couple, not the hundreds in Perl. :-)
    • "There is no logical need for a programming language to use numbers or strings!" It drives Lispers nuts that Lisp still has special number and string behavior. They're like an artist who is upset that there's a fleck of black on their "Polar Bears Wrestling in a Snow Storm" magnum opus.
    • Computing power makes programming language inefficiency irrelevant. That's how we got the Java Runtime!

    I'm thinking we need to establish a de-programming group for Lisp cult members. Maybe a de-bracketing instead of de-programming?

  • by William Tanksley ( 1752 ) on Friday April 11, 2003 @12:52PM (#5711423)
    What other languages can use one word to make an entire sentence?

    Latin:

    Malo
    malo
    malo
    malo


    The colloquial translation:

    "Oh I would rather be

    In an apple tree
    Than a naughty boy
    In adversity."


    And, although Latin is fully inflected and English is only partially inflected, all of the words are identical!

    So HAH. ;-)

    This sort of poetry is common in obfuscated C contests, although the visual lack of distinction between 0 and O and I and 1 is also commonly needed.

    -Billy
  • Java and the Future (Score:2, Interesting)

    by SoupIsGood Food ( 1179 ) on Friday April 11, 2003 @01:08PM (#5711590)
    The author's claim that Java is a dead-end language like COBOL is patently ludicrous, and by dismissing Java out of hand, he undermines the rest of his article.

    Java is essentially a re-implementation of C; it's a very C-like language in syntax and semantics. The goal was to do an OO language based on a familiar paradigm. In addition to eliminating some of C's less endearing traits, it brings two things to the table, both of which will shape future languages:

    1. Re-usability. Sun offers you a crapload of very useful re-usable objects in their JDKs, and people like the Apache project offer you even more. The ability to do insanely complex projects with tiny amounts of effort is one reason why Java rules the corporate enterprise. Future languages will start looking more and more like a big box of Legos: find the parts you need and plug them together.

    2. A universal computing environment. You can't write to the metal in Java, but it's not as slow as interpreted languages like Python and TCL for gigantic computing tasks. Any project, no matter how monolithic and task-optimized, is as portable as the VM is. Anyone who's had to manage a platform migration for key business applications, from VAX to Solaris, say, or worse, from S/360 to Windows, knows the pain of re-implementation. That pain is gone when you use a VM-based language like Java.

    Projects like Squeak are looking more and more like Java these days, in terms of re-usability and VM-based platform independence. The only missing piece of the puzzle is a popular VM environment not tied to any one vendor: Java needs to cut its dependency on Sun JDKs, or it will be supplanted by another language that is independent and standards-based.

    This isn't to say that other languages aren't going to evolve, too, or are useless because they're not like Java. Anyone who programs in the new interpreted scripting languages - PHP, Python, Perl, Ruby, TCL, Scheme, etc. - can attest to the power of that approach to modern computing.

    On the other hand, I really don't see any new compiled-to-the-metal languages emerging. Fortran is used for high-performance computing, Forth is used for tiny computers, and C/C++ is used for system programming. It will very likely be the same way in another 20 years, or another 50. The difference is that applications will slowly drift from binaries compiled from source to either VM languages or interpreted languages.
  • by Anonymous Coward on Friday April 11, 2003 @01:12PM (#5711636)
    because they can take advantage of parallelism by virtue of the fact that each function call does not produce any side effects. So a grossly inefficient Lisp program on a uniprocessor can be made to be blindingly fast on a million-CPU machine. Erlang would also benefit, since it uses this same model. Non-functional languages like C and Java don't have a snowball's chance in hell of scaling to a million processors.
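
    A rough sketch of that claim in Python (standing in here for the functional languages under discussion; the pool size of 4 and the function name are arbitrary): a side-effect-free function can be mapped over data in parallel without any coordination between calls.

        from multiprocessing import Pool

        def square(n):    # pure: the result depends only on the argument
            return n * n

        if __name__ == "__main__":
            with Pool(4) as pool:
                # Same result as the sequential map, but spread across worker processes.
                print(pool.map(square, range(10)))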
  • Monkees in 100 years (Score:3, Interesting)

    by axxackall ( 579006 ) on Friday April 11, 2003 @01:42PM (#5711838) Homepage Journal
    That's because most managers are "smart" enough to give the job to many monkees, thinking that quantity is more important than quality. Such a management style is the main consumer of procedural (imperative) languages. Non-procedural languages are good for describing the problem, but who needs them if no one knows the problem? Just try the procedure and see if it works.

    You know what, on second thought I realize that 100 years is not enough for humankind to stop being monkees. Thus, Java forever!

  • by Clod9 ( 665325 ) on Friday April 11, 2003 @02:28PM (#5712165) Journal
    Language structure is determined by two things:
    1. the target machine architecture
    2. the range of expression required by the programmer and/or workgroup

    Java is "successful" but it really looks a lot like Algol and Pascal,
    as does C++. The range of expression is greater in the newer languages
    (object-orientation in Java and C++) but the forte is still that of
    expressing algorithms in written form to be used on a stored-program
    digital computer.

    WILL WE STILL BE PROGRAMMING?
    Take one example -- genetic programming. If you had a programming system where the basic algorithm could learn, and all you had to do was set up the learning environment, then you'd be teaching rather than programming. In fact, I believe THIS is what most "programmers" will be doing in 100 years. The challenge will be defining the problem domain, the inputs, the desired outputs; the algorithm and the architecture won't change, or won't change much, and the vast majority of people won't fiddle with them.
    But if HAL doesn't appear and we aren't all retrained as Dr. Chandra, I believe we'll still be handling a lot of text on flat screens. I don't think we'll be using sound, and I don't think we'll be using pictures. (See below.)

    So predicting what languages will be like in 100 years is predicated on knowing what computers and peripherals will be like. I think progress will be slow, for the most part -- that is, I don't think it will be all that much different from how it is now.

    HOW WILL OUR RANGE OF EXPRESSION CHANGE?
    If we relied primarily on voice input, languages would be a lot more like spoken natural languages; there would be far less ambiguity than in most natural languages (so they'd be more like Russian than like English, for example) but there wouldn't be nearly as much punctuation as there is in Java and C++.
    If we rely primarily on thought-transfer, they'll be something else entirely. But I don't think this will come in 100 years.

    How is a 24x80 IDE window different from punched cards and printers? Much more efficient but remarkably similar, really. It would not surprise me if we still use a lot of TEXT in the future. Speech is slow -- a program body stored as audio would be hard to scan through quickly. Eyes are faster than ears, so the program body will always be stored as either text or pictures.
    Pictures - well, pictorial languages assume too much of the work has already been done underneath. "Programming isn't hard because of all the typing; programming is hard because of all the thinking." (Who wrote that in Byte a couple of decades ago?) I don't think we'll be using pictures. When we get to the point that we can use hand-waving to describe to the computer what we want it to do, again we'll be teaching, not programming.

    HOW WILL THE ARCHITECTURE CHANGE?
    If the target architecture isn't Von Neumann, but something else, then we may not be describing "algorithms" as we know them today. Not being up to speed on quantum computing, I can't speak to that example... but there are lots of other variations. Analog computers? Decimal instead of binary digital machines? Hardware-implemented neural networks? Again, I haven't seen much progress away from the binary digital stored-program machine in 40 years, and I think (barring a magical breakthrough) it may continue to be the cheapest, most available hardware for the next 50-100 years.

    SO WHAT DO I THINK?
    I think IDEs and runtime libraries will evolve tremendously, but I don't think basic language design will change much. As long as we continue to use physical devices at all, I think the low-level programming languages will be very similar to present-day ones: based on lines of text with regular grammars and punctuation, describing algorithms. I predict COBOL will be gone, FORTRAN will still be a dinosaur, and Java and C/C++ will also be dinosaurs. But compilers for all 4 wi
  • Re:Quantum Language (Score:3, Interesting)

    by NetSettler ( 460623 ) <kent-slashdot@nhplace.com> on Friday April 11, 2003 @03:30PM (#5712651) Homepage Journal
    The problem is that English is *not* very precise... When lawyers attempt to make English be more precise, look at the messes they make ... I don't know about you, but I don't want that.

    But it's what we've got. Human language is, alas, imprecise. But we have more than 50 years of experience with that, and we know nothing better is on the horizon. I think you'll be lucky if, between now and a hundred years from now, you can teach 10% of the world's population the meaning of the word algorithm, much less the use of an algorithm.

    But take heart -- while the computer has been called a relentless judge of incompleteness, the fact is that some of that incompleteness is just due to their bad schooling. Lack of common sense. Lack of context. If we can add that stuff in, maybe the kinds of problems computers give us won't sound like the whinings of a small child, ill-informed about the things in the world that we collectively agree should be 'obvious'. That won't fix everything, but it will fix some things.

    For example, most non-computer people are able to take showers in finite time even with "Lather, Rinse, Repeat" written on shampoo bottles. They don't loop infinitely. Maybe in a hundred years, computers won't either because someone will have filled them in on the joke.

    And legalese is not inherently required to be expressed as badly as it commonly is, that's just a fashion. Like doctors having bad handwriting. Social pressure would fix that if people were willing to tell their lawyers to go back and rewrite a text in prettier form. (Some probably are too cheap to pay by the hour to have that happen. Then again, if they did, they could perhaps read the result. The lawyer is probably just as happy you can't, just like a high paid computer consultant is often just as happy his clients can't understand the script he's written them, so they'll have to call him for upgrades. Again, not a technical problem, but a social one.)
  • by mugnyte ( 203225 ) on Friday April 11, 2003 @03:44PM (#5712763) Journal
    The tongue-in-cheek references to XML, aliens, and other topics are merely amusing; there's no content there. The metaphor to evolution is lost.

    Languages are built on top of many changes in technology: connectivity, speed, machine type, concurrency, adoption.

    Plus, a language is just one codification of a problem solution. The solution can be pushed towards any one of several goals: security, speed, size, reuse, readability, etc.

    Different languages have sprung up for just these two statements above. What metric is this guy using to measure a language's popularity? LOC still running (COBOL?), steadfastness of code (C?), or CTO business choices (a zoo)...? There are so many ways to look at this; just picking a single point of view is misguided reductionism.

    We will continue to have a multitude of tools available for getting work done. Cross-breeding of concepts for languages is great, and does happen, but unless you trace decisions of specific prime movers you really can't say where a language comes from.

    Anyone can put together a new language, and even get it adopted by some audience. But what gets mindshare for usage are languages that satisfy goals that are popular for the moment. Exploring what those goals will be is impossible, in my mind. What will be popular?

    Speech recognition? Image recognition? Concurrency? Interoperability? Auto-programming? Compactness? Mindshare?

    These questions are based on our human senses, our environment, etc. Any sci-fi reader could tell you of the concept of a "race based on musical communication", for example, that would base programming on a completely different set of goals. And so on.

    mug
  • by gelfling ( 6534 ) on Friday April 11, 2003 @04:01PM (#5712868) Homepage Journal
    Languages are tools that are used to organize syntactic rules that are then converted into machine-usable representations of the same general purpose.

    Languages as you understand them will be as dead as the steam-powered loom in 50 years. We will have non-letter/symbol typed tools to do that, inasmuch as the DVD has 'replaced' live theater as the only way to 'reproduce' entertainment.
  • Re:how long (Score:2, Interesting)

    by jonadab ( 583620 ) on Friday April 11, 2003 @04:19PM (#5712968) Homepage Journal
    Morphology? In computer languages? Name an instance. The closest thing I've seen to that are the sigils in Perl (where %foo can become $foo when you access an individual value and @foo when you access a slice), and even that is going away in Perl 6. Besides, that's not really even morphology so much as inflection. Real morphology would be if the spellings of words mutated not based on meaning but on the adjacent words, or if attaching an affix could cause changes in the spelling of the rest of the word. This happens quite a bit in natural language, but I don't know of a single case of it in any computer language.

    Actually, I'm now trying to imagine what that would be like... and I think I'm getting the willies.
  • by Kazoo the Clown ( 644526 ) on Friday April 11, 2003 @05:40PM (#5713521)

    People know what they want out of their machines, for the most part.

    Dang, what dream-world do you live in? I work with people all the time that have NO idea what their computer is capable of doing for them, and feel the need to conform to what they think it expects because they have no idea what *they* can expect. In the future this may change somewhat, but even besides that, most of them don't really know what solution they need because they really don't understand the problem to be solved (just that there is a problem to be solved) and must *discover* more information about the problem.

    Let's explore the implications of this a bit. Suppose we have a scenario:

    1. Human states problem: "I need an accounting system for new company XYZ that is starting up. It is a widget manufacturing business that should be able to grow to the size of, say-- IBM with sales, parts supply and support offices all over the world."

    2. Computer says "OK, done."

    3. The human either spends months discovering what the computer's assumptions are about how the company is to operate and then adjusts the company's operations to conform (thus making it just like every other equivalent business in the world), OR corrects the program's misconceptions so that it conforms to the company's operation (thus preserving some of the value added that the company's founders bring with new ideas about how to run the business), OR some combination thereof.

    Frankly, I don't see any version of step #3 as particularly effective. Of course, how effective it is may be less of the point than how easy it is to use, but it doesn't strike me as being particularly easy to use either. And this is just a scenario involving a relatively common and (presumably) understood problem. What happens when the nature of the problem is not such that there are preconceived assumptions regarding it that can be leveraged, as in this case? I can tell you what happens: the human spends all kinds of time discovering more about the nature of the problem and possible solutions and their implications and side-effects, pretty much like things are today. Sure, new tools will make the process easier and may include spoken input, but all this talk about a "human language" interface is mostly utopian ignorance -- you'll have to fix the problems in human language first, in which case you're still "making" the human conform to the computer. Humans can't even talk to *each other* with much understanding; when the misunderstanding a computer is capable of is in the mix, it is not going to be a very useful way to solve problems.

    Some people may prefer an English interface, but those will be the novice users for the most part. I can type faster than I can speak, and my typing isn't affected by the level of background noise and interruptions I may receive. As much as some have tried to invent alternatives to the keyboard (such as the mouse), virtually none have eliminated the need for it. I have 10 fingers, not just *one*, and know how to use them, and alternative inputs must compete with that. The novice who can't type may prefer using a mouse or some other alternative to do everything. "Ease of use" is desirable for novices, and will always be there, but it's not equivalent to the "most efficient interface" between a computer and a knowledgeable and/or trained user. And the thing about novices is, they don't all stay novices forever -- though some, apparently, may prefer to.

  • by voodoo1man ( 594237 ) on Monday April 14, 2003 @02:31AM (#5726023)
    If you want to build complex systems fast, nobody is going to turn to LISP for a solution. There isn't one. LISP is a beautiful language which I think any programmer would benefit from learning, but it's not a language to get things done with.
    Yeah [lava.net], nobody [naughtydog.com] writes [dp.com] large [izware.com] systems [earthlink.net] software [portusgroup.com] in [nasa.gov] Lisp [itasoftware.com].
    LISP is a powerful and interesting language and as a language has its merits. I don't mean to pick on LISP.
    Stop contradicting yourself. Also, nowadays the preferred spelling is "Lisp."
    What irks me is not that Paul Graham is saying this, but that he might get listened to
    So because Graham is promoting Lisp, it's not OK for him to spout off BS? Gosling says some pretty dumb things when evangelizing Java, but I don't see anyone complaining (and a lot of people sure seem to listen to him). At least Graham has the decency to admit it's BS.
