Programming

Stephen Wolfram Developing New Programming Language

Nerval's Lobster writes "Stephen Wolfram, the chief designer of the Mathematica software platform and the Wolfram Alpha 'computational knowledge engine,' has another massive project in the works—although he's remaining somewhat vague about details for the time being. In simplest terms, the project is a new programming language—which he's dubbing the 'Wolfram Language'—which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The Language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog posting. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in. At this year's SXSW, Wolfram alluded to his decades of work coming together in 'a very nice way,' and this is clearly what he meant. And while it's tempting to dismiss anyone who makes sweeping statements about radically changing the existing paradigm, he does have a record of launching very big projects (Wolfram Alpha contains more than 10 trillion pieces of data cultivated from primary sources, along with tens of thousands of algorithms and equations) that function reliably. At many points over the past few years, he's also expressed a belief that simple equations and programming can converge to create and support enormously complicated systems. Combine all those factors together, and it's clear that Wolfram's pronouncements—no matter how grandiose—can't simply be dismissed. But it remains to be seen how much of an impact he actually has on programming as an art and science."
  • Well... (Score:5, Interesting)

    by Adam Colley ( 3026155 ) <mog@ k u p o .be> on Friday November 15, 2013 @08:06AM (#45431867)

    Hrm, another programming language...

    Attempts have been made in the past to automate programming, it's never worked very well (or at all in some cases)

    Still, look forward to seeing it, perhaps I'll be pleasantly surprised.

    • Re:Well... (Score:5, Informative)

      by Nerdfest ( 867930 ) on Friday November 15, 2013 @08:10AM (#45431887)

      Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference is one thing, but assumptions are another thing entirely. It's like the dangers in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.
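
      As a quick C sketch of the trap described above (illustrative only, not from the original post): automatic storage is not guaranteed to be zeroed, so code that silently relies on it may appear to work until one day it doesn't, whereas spelling the initialization out costs a single line.

      #include <stdio.h>
      #include <string.h>

      int main(void) {
          char buf[64];               /* indeterminate contents: only *looks* zeroed sometimes */
          /* printf("%s\n", buf);        reading it before writing would be undefined behaviour */

          char safe[64] = {0};        /* explicitly zero-initialized, guaranteed by the language */
          memset(buf, 0, sizeof buf); /* or state the assumption yourself */
          printf("%zu %zu\n", strlen(safe), strlen(buf));   /* prints "0 0" */
          return 0;
      }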

      • Re:Well... (Score:5, Insightful)

        by plover ( 150551 ) on Friday November 15, 2013 @09:25AM (#45432443) Homepage Journal

        People seem to think that the problems with programming come from the languages. They're too weakly typed, too strongly typed, they use funny symbols, they don't have enough parentheses, they use significant white space.

        The biggest problems aren't coming from the languages. The problems come from managing the dependencies.

        Everything needs to change state to do useful work. But each state has all these dependencies on prior states, and is itself often setting up to perform yet another task. Non-programmers even have a cute phrase for it: "getting your ducks in a row" is an expression meaning that if you get everything taken care of in advance, your task will be successful.

        Ever notice that on a poorly done task it's so much easier to throw away the prior work and start over? That's because you've solved the hard part: you learned through experience what things need to be placed in which order, which was the root of the hard problem in the first place. When you redo it, you naturally organize the dependencies in their proper order, and the task becomes easy.

        What a good language has to do is encapsulate and manage these relationships between dependencies. It might be something like a cross between a PERT chart, a sequence diagram, a state chart, and a timeline. Better, the environment should understand the dependencies of every component to the maximum degree possible, and prevent you from assembling them in an unsuccessful order.

        Get the language to that level, and we won't even need the awkward syntax of "computer, tea, Earl Grey, hot."
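
        As a rough sketch of that "let the environment order the dependencies" idea (the task names and graph are invented; this is not anything Wolfram has described): a tiny topological sort in C that refuses to emit a task before its prerequisites and reports when no valid order exists.

        #include <stdio.h>

        #define N 4  /* tasks: 0=design, 1=build, 2=test, 3=ship (made-up example) */

        int dep[N][N] = {        /* dep[a][b] = 1 means task a must run before task b */
            {0,1,1,0},           /* design before build and test */
            {0,0,1,0},           /* build before test */
            {0,0,0,1},           /* test before ship */
            {0,0,0,0},
        };
        const char *name[N] = { "design", "build", "test", "ship" };

        int main(void) {
            int indeg[N] = {0}, done[N] = {0}, emitted = 0;
            for (int a = 0; a < N; a++)
                for (int b = 0; b < N; b++)
                    indeg[b] += dep[a][b];

            while (emitted < N) {
                int progress = 0;
                for (int t = 0; t < N; t++) {
                    if (!done[t] && indeg[t] == 0) {      /* all prerequisites satisfied */
                        printf("run %s\n", name[t]);
                        done[t] = 1; emitted++; progress = 1;
                        for (int b = 0; b < N; b++)
                            if (dep[t][b]) indeg[b]--;    /* release dependent tasks */
                    }
                }
                if (!progress) { puts("cycle: no valid order"); return 1; }
            }
            return 0;
        }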

        • Re:Well... (Score:5, Insightful)

          by Greyfox ( 87712 ) on Friday November 15, 2013 @09:55AM (#45432761) Homepage Journal
          It's very rare that I see a dev team throw something away, unless it's an entire project. Once something's written, people seem to treat it as carved in stone, and they never look at it again. I was looking at some code a while back that output a file full of numbers in a particular format. Over the years the format had changed a few times. Their solution for addressing that was to write a new piece of code that took the original output file, and reformatted it to the new output version. The next time a format change came along, they did the same thing again using the file their reformatter had output! There was a lot of other nonsense in there that was apparently written so that they'd never have to go back and change anything that had already been written. And that kind of mentality seems to be pervasive in the industry (though usually not to THAT extreme.)

          So people bitch about that or business process and I tell them "Well if it's not working for you, FIX it! It doesn't HAVE to be this way, we could just do things differently!" And they look at me as if I'd just suggested the Earth is flat.

          • Re:Well... (Score:4, Interesting)

            by LongearedBat ( 1665481 ) on Friday November 15, 2013 @10:43AM (#45433315)
            I do, frequently. And my code is better 'cos of it. In my experience, when people are too afraid to start a module afresh, it's because they're afraid that they don't/can't/won't understand the problem well enough to a) write a solution that works, or b) understand the insufficiencies/faults of the existing code to do a better job next time around.
            • Re: (Score:2, Insightful)

              by Anonymous Coward

              Or they don't rewrite a module from scratch because it is too intermingled with a hugely complicated system and you cannot guarantee you will not miss something. On my own personal Android projects I have no problem rewriting parts of them, no matter how involved or integrated within the whole program they are, because I understand how all the parts work together. At work the project is too big to understand how everything works together. There are too many other parts that are assuming that things work the way they currently do.

              • I would say this is often true in the real world, but it shouldn't be true if things are really written using best practices. A true, well-written object-oriented design composed of small, isolated pieces of encapsulated logic, ideally paired with comprehensive unit tests, should prevent the kind of subtle problems you describe. The unfortunate reality, however, is that in many professional settings such careful practices are often ignored or only partially followed, undermining the benefits they are supposed to provide.
          • I'm not a coder, but I've seen similar effects in system configurations like firewall policies. In those cases it was due to change control: it is easier to get approval for, and to roll back, something that is added onto the existing structure without changing it than it is to rework the whole thing. I wondered if that was a factor in your experiences in the developer world.
      • Re:Well... (Score:5, Insightful)

        by fuzzyfuzzyfungus ( 1223518 ) on Friday November 15, 2013 @10:03AM (#45432875) Journal

        Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference is one thing, but assumptions are another thing entirely. It's like the dangers in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.

        This is Stephen "A New Kind of Science" Wolfram. The guy who cataloged some cellular automata (and had his uncredited research peons catalog a bunch more) and then decided that he'd answered all the profound questions of metaphysics. I'm not sure that banal matters of 'software engineering' are his problem anymore.

        He's a very sharp guy. However, like many sharp guys, he seems to have entered his obsessive pseudoscience and grandiosity phase. (Same basic trajectory as Kurzweil, whose achievements are not to be underestimated, but who now basically evangelizes for nerd cyber-jesus full time.)

        • like many sharp guys, he seems to have entered his obsessive pseudoscience and grandiosity phase

          Which is exactly why this may be a fascinating language. Even if it's completely absurd and impractical, whatever ideas he's cooking up may at least be entertaining and/or thought-provoking.

          • Which is exactly why this may be a fascinating language. Even if it's completely absurd and impractical, whatever ideas he's cooking up may at least be entertaining and/or thought-provoking.

            So a bit like Lisp?

      • Re:Well... (Score:5, Interesting)

        by physicsphairy ( 720718 ) on Friday November 15, 2013 @10:05AM (#45432895)

        Being explicit is precisely what makes programming laborious and tedious. It is entirely true that without such tediousness, you do not enjoy a full range of functionality. But the vast majority of the time you do not need a full range of functionality.

        Speaking as someone in a scientific major, Wolfram|Alpha has quickly become the go-to resource for everyone looking to do quick, more-than-arithmetical calculation. It does a fantastic job of anticipating what you need and providing the appropriate options. If I need a differential equation integrated or the root of some expression I *can* ask for it explicitly, but usually I just type in the expression and everything I need will be generated by Wolfram automatically. For involved projects I do set up my problems in Python, but 99% of the time Wolfram|Alpha does just what I need for a hundredth of the effort. The fact my peers are using it the same way is notable because, while before Wolfram I might use Python or Maple or Mathematica, most everyone else would do these things by hand -- learning to use the available tools was something they considered too intimidating or not worth the effort.

        If Stephen Wolfram can do something vaguely like Wolfram|Alpha with more ability to customize and automate what is happening, it's going to transform academia, maybe even down to the high school level. Imagine being able to easily develop a science fair project which requires solving some complicated ODEs, without having to take 3 years of college math first.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Lisp worked well. So much so that most of the languages since C basically go "Here's our idea: we're going to be like C in our basic thinking, but with extras. What extras? Well, we're going to add Lisp-like features, usually in a half-baked manner." The only really major variation is languages targeting multicore operation, but they tend to be functional, like Lisp.

      The problem with C is that it's a high-level assembly language. Great for computers as they were in the 1970s and 1980s.

      The problem back then was that Lisp was too heavy.

      • "Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp."

        -Greenspun's Tenth Rule.
      • ... which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The Language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog posting. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in.

        Well, that's pretty much a description of Common Lisp in the hands of a capable lisper. ;-)

      • by qbzzt ( 11136 )

        C also has a couple of features that appear to have come from Lisp:

        1. Pointers to functions, which allow functions to be passed, etc.
        2. A macro expansion phase that runs before compilation proper.
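
        A small, self-contained C illustration of those two features (just a sketch, not from the parent post): a comparison function handed to qsort by address, and a macro expanded by the preprocessor before compilation proper.

        #include <stdio.h>
        #include <stdlib.h>

        /* Macro: expanded textually by the preprocessor, before compilation proper. */
        #define SQUARE(x) ((x) * (x))

        /* A comparison function whose address is passed to qsort. */
        static int cmp_int(const void *a, const void *b) {
            int x = *(const int *)a, y = *(const int *)b;
            return (x > y) - (x < y);
        }

        int main(void) {
            int v[] = { 3, 1, 2 };
            qsort(v, 3, sizeof v[0], cmp_int);   /* the function is passed as a value */
            printf("%d %d %d  SQUARE(4)=%d\n", v[0], v[1], v[2], SQUARE(4));
            return 0;                            /* prints "1 2 3  SQUARE(4)=16" */
        }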

    • Re:Well... (Score:4, Insightful)

      by rudy_wayne ( 414635 ) on Friday November 15, 2013 @08:26AM (#45431953)

      Even really smart people come up with stupid ideas.

      Anything that is capable of doing complex things is complex itself. It's unavoidable. Even if every function by itself is extremely simple -- just press the green button -- what happens when there are a thousand buttons, any one of which can interact with any other button?

    • Re:Well... (Score:5, Funny)

      by rudy_wayne ( 414635 ) on Friday November 15, 2013 @08:33AM (#45431995)

      Hrm, another programming language...

      Attempts have been made in the past to automate programming, it's never worked very well (or at all in some cases)

      Too many people think that programming is "just a lot of typing". Which leads people to believe that they should create a "new programming language" where you can just type "Make a new better version of Facebook" and be done with it.

      Which leads to a lot of crap with "Visual" in its name. Hey look, you don't have to type. Just drag this widget from here to here. And we've seen how well that turned out.

      • by jythie ( 914043 )
        I think part of the problem is that making new languages is fun and sexy, so people keep doing it rather than building frameworks, libraries, or editors on top of existing ones. So we end up with dozens of half-baked languages that do not work together and are missing a great deal of functionality... with more on the way to try to fix the problem with the same solution that got us into the mess in the first place.
      • by HiThere ( 15173 )

        You don't understand the real problem with visual programming systems. They can be extremely powerful, but they tend to use so much screen space to do anything that you can't see very much of what's going on. So you end up having to hold too much in your head.

        I remember Prograf on the Mac (before it died trying to transition to MSWind95). It was an excellent dataflow language, and if it were alive now, it would be a natural for multi-CPU platforms. But it was too visual, which made it nearly impossible to keep a program of any real size in view at once.

    • by skids ( 119237 )

      Attempts have been made in the past to automate programming, it's never worked very well (or at all in some cases)

      The places where it does work, you don't notice. Compilers/optimizers/JIT engines are automated programming. You tell the system what you want to do and behind the scenes it figures out all the stuff you did not tell it. Like not actually checking X again if you checked it earlier and X could not have changed, even though you told it to check X again because it was easier for you to write it that way.

      That said, we have words for this in Perl5/Perl6, DWIM (Do What I Mean) and WAT (acronym open to co
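
      A minimal example of the kind of redundancy an optimizer quietly removes (the function and values are invented for illustration): the condition on x is written twice for the programmer's convenience, and a compiler that can prove x is unchanged will typically evaluate it once.

      #include <stdio.h>

      /* x is a local whose value cannot change between the two tests, so an
         optimizing compiler is free to evaluate the condition only once. */
      static int classify(int x) {
          int result = 0;
          if (x > 10)
              result += 1;      /* first check of x */
          if (x > 10)
              result += 2;      /* same check, written again because it was convenient */
          return result;
      }

      int main(void) {
          printf("%d %d\n", classify(5), classify(20));  /* prints "0 3" */
          return 0;
      }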

    • Attempts have been made in the past to automate programming, it's never worked very well

      On the contrary. The first attempt to automate programming produced assembly language, which automated the assignment of addresses to variables and instructions. The second one produced FORTRAN, which automated the "programming" of formulae into sequences of individual operations. Every time we successfully automate some programming activity, the nature of programming changes.

      Mike O'Donnell

    • Attempts have been made in the past to automate programming, it's never worked very well

      On the contrary, automated programming has worked repeatedly, each time redefining "programming":

      1. Machine language automated the programming of patch boards.
      2. Assembly language automated programming in machine language, particularly the assignment of addresses.
      3. FORTRAN automated assembly language programming, particularly the programming of formulae into sequences of operations.

      Each time someone automated "programming," the word stopped referring to the automated part, and referred to the remaining part of algorithm design.

  • I'll stick with C
    • Anything with this guy's name on it makes me want to distance myself from it. Alpha was a tracking disaster and I still receive junk mail from this clown.
    • Say brother, can you spare a pointer?
      • I have one going cheap here. It's just a copy of a pointer to a char which I am using globally in a multithreaded program with no semaphores or mutexes. It will probably work as long as you use it quickly, and only read its contents.
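
        For anyone tempted by that bargain, a minimal pthread sketch (names and loop counts invented) of what it takes to make a shared char pointer safe: every access, reads included, goes through the mutex. Drop the lock and each access becomes a data race, which C11 leaves undefined even when it "usually works". Build with: cc race.c -pthread

        #include <pthread.h>
        #include <stdio.h>

        /* Shared pointer, guarded by a mutex. */
        static const char *msg = "hello";
        static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

        static void *writer(void *arg) {
            (void)arg;
            for (int i = 0; i < 100000; i++) {
                pthread_mutex_lock(&lock);
                msg = (i & 1) ? "hello" : "world";   /* swap the pointer under the lock */
                pthread_mutex_unlock(&lock);
            }
            return NULL;
        }

        int main(void) {
            pthread_t t;
            int saw_world = 0;
            pthread_create(&t, NULL, writer, NULL);
            for (int i = 0; i < 100000; i++) {
                pthread_mutex_lock(&lock);
                if (msg[0] == 'w') saw_world++;      /* read only while holding the lock */
                pthread_mutex_unlock(&lock);
            }
            pthread_join(t, NULL);
            printf("saw \"world\" %d times\n", saw_world);
            return 0;
        }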

      • 0x3a28213a
        0x6339392c
        0x7363682e

      • by Valdrax ( 32670 )

        Sure, buddy! (I know I dropped one somewhere around here...)

      • by mwehle ( 2491950 )

        Say brother, can you spare a pointer?

        Should have a heap of them around here somewhere. Let me peek in the register. Ah yes, here's a stack!

    • by jythie ( 914043 )
      I think that around the time C matured we had as many actual languages as we needed. But for some reason people keep coming up with new ones, with syntax that is different enough to be incompatible with each other but similar enough that one wonders why they created a whole new language rather than a library or framework to link an existing language to a new domain.
      • The problem is not what languages we need. We don't actually need a high-level language other than C. The problem is finding useful languages for various purposes.

        There are a lot of programs in assorted languages that would be very difficult to write in C, and are much easier to write in other languages. Many things are much easier to write in C++, Lisp, or Prolog than in C, to name three languages. The different syntax is necessary to get the ease of writing. You can write object-oriented C programs, but it is far more awkward than writing the same thing in C++.

  • by jopet ( 538074 ) on Friday November 15, 2013 @08:09AM (#45431883) Journal

    that you will have to pay a lot of money to use it?

    • that you will have to pay a lot of money to use it?

      If the work that needs to be done could be done quicker or simpler (i.e. cheaper) by paying a $1000 license fee rather than having a $300,000-per-year researcher go learn Python or R, then it is worth it to pay, no? The current options aren't going away.

    • If the programming language relies on remote servers (basically Wolfram Alpha) in order to function it would make sense that it would cost money. It costs money to hire people to make and improve a system like Wolfram Alpha.

      If people got over the idea of having everything on their computers for free the world would have a lot less corporate snooping and a lot less ad spamming. That would be nice.

      • If people got over the idea of having everything on their computers for free the world would have a lot less corporate snooping and a lot less ad spamming. That would be nice.

        And several of the current tech giants would shrivel up and die, and that would be even nicer. :-)

      • How cutely naive! If a programming language costs money and relies on remote servers you expect corporate snooping to decrease? I think hell would freeze over first.
        • How cutely naive! If a programming language costs money and relies on remote servers you expect corporate snooping to decrease? I think hell would freeze over first.

          I did not say that.

          If you have two programming languages that depend on remote servers, one that's free as in gratis and one that charges fees, I would expect the one that charges fees to value and respect your privacy more than the one that is free.

      • "Add a google Ad box to the upper left corner below the logo"
        "Make it fit under the logo nicely."
        "Make it blink."

        ...

        Noooooooooo!

  • by Phrogman ( 80473 ) on Friday November 15, 2013 @08:13AM (#45431897)

    that way if we make a programming error we can just comment "Bad Wolf" (too much exposure to Dr Who recently) :P

  • by paiute ( 550198 ) on Friday November 15, 2013 @08:20AM (#45431925)
    Wolfram announced his latest idea - that there needed to be some kind of pliable material available next to toilets with which to clean one's bum. This material, he said, is going to be really soft, probably a couple of layers thick, and needed to be on some kind of continuous dispenser mechanism which he is developing.
    • Wolfram announced his latest idea - that there needed to be some kind of pliable material available next to toilets with which to clean one's bum. This material, he said, is going to be really soft, probably a couple of layers thick, and needed to be on some kind of continuous dispenser mechanism which he is developing.

      And naturally, he'll call it Wolfram paper. :-)

  • Oh boy. (Score:5, Funny)

    by korbulon ( 2792438 ) on Friday November 15, 2013 @08:29AM (#45431961)

    First a new kind of SCIENCE, now a new kind of PROGRAMMING.

    Can't wait for a new kind of LOVE.

    • First a new kind of SCIENCE, now a new kind of PROGRAMMING.

      Can't wait for a new kind of LOVE.

      Given the challenges many face with the old kind I doubt we are ready to face a new kind...

    • You beat me to this one. I actually read that whole damn book, thinking it would be worth my time - what a laugh.
  • by korbulon ( 2792438 ) on Friday November 15, 2013 @08:35AM (#45432001)
    Stephen Wolfram is the George Lucas of scientific computing.
  • He won't publish it under a free software license...

  • So you can do anything you want with Wolfram language? The only limit is your imagination?

    Will the first project be the long-awaited 1.0 version of Zombo.com [zombo.com]?

  • by Celarent Darii ( 1561999 ) on Friday November 15, 2013 @08:42AM (#45432055)

    Well, either he's created the mother of all LISP macros, or it's simply vaporware. Love to see it when they publish it. Code or it didn't happen.

    Here is the obligatory xkcd [xkcd.com], panel two.

  • by umafuckit ( 2980809 ) on Friday November 15, 2013 @08:54AM (#45432131)
    This isn't even a story. The linked-to blog post is marketing fluff, full of big hazy promises and thin on substance. I read some of it and it sounds like some sort of data-centric OO language (it makes me think of the R plotting package ggplot2: http://ggplot2.org/ [ggplot2.org]). Beyond that, however, who knows what the hell this is?
  • by smpoole7 ( 1467717 ) on Friday November 15, 2013 @09:09AM (#45432265) Homepage

    I don't program for a living anymore, and I've always been more of a system-level, hardware driver kind of guy, so C/C++ work fine for me.

    But especially coming from that background, my need isn't for another programming language, it's for better documentation of available libraries. For any common task that I want to do, somebody has probably written a great library that I can just strap in and use.

    The problem is when I start trying to use it. The documentation has blank "TBD" pages, or really helpful comments like, "init_lib() -- initializes the library. You can specify the # of flickers per bleem ..."

    Or ... and this is my 2nd favorite ... the documentation is out of date. "Hey, I tried to do this the way your tutorial said and it didn't work?" "Oh, yeah, that tutorial is out of date; we've changed some stuff ..."

    My very most #1 favorite is automatically generated documentation that looks at (for example) a C++ class and then creates an HTML page. I might as well just look at the source code ... hoping, of course, that the people who wrote that source actually inserted more than a few, "does what it says" comments. Or that I don't have to play the Spaghetti Trace(tm) game, bouncing from one .c file to another .h file and back to a third .c (and this is after repeated greps in the "src" directory) to try to figure out what's happening to my poor variable while it's inside a function.

    Not criticizing FOSS, per se; I understand that it's written by volunteers (for whom I'm very grateful). But this, rather than needing a new way to "PRINT" or "SORT" in a programming language, is the far bigger problem, in my book.

    • I fully agree with this.

      Just finding libraries, configuring them, and learning to use them is pretty hard sometimes. .NET/Java makes this a bit easier: just import the jar/.dll and away you go.

      Some Perl distributions make this easier with a package manager.

      I have no idea what Wolfram has, but it would be pretty cool if it managed to do a lot of this. Centralized package management. Maybe it scans your code, sees what you're trying to do, and then chooses an optimal function in some library (hopefully offering you the choice).

    • by bertok ( 226922 )

      Interestingly, you claim your choice of programming language suits your requirements, but then you list a bunch of issues that are endemic to it yet mitigated or absent in other languages.

      For example, the need to sometimes, but not always, initialize objects, libraries, or whatever is typical of C/C++ code, but rare in Java or C#, where constructors or factory methods usually do that automatically for you on demand. The worst I've seen is some Microsoft C++ code where every object had both a C++ constructor and a separate Init() method you had to remember to call.
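
      A tiny sketch of the explicit-init pattern being described (the "library" and its names are invented for illustration): nothing in C forces the caller to run the init step, which is exactly the class of mistake a constructor makes impossible.

      #include <stdio.h>
      #include <string.h>

      /* A typical C-style library: state plus a separate init step the caller must remember. */
      struct counter { int initialized; int value; };

      static void counter_init(struct counter *c) {
          memset(c, 0, sizeof *c);
          c->initialized = 1;
      }

      static int counter_add(struct counter *c, int n) {
          if (!c->initialized)
              return -1;          /* best case: the library notices; often it won't */
          c->value += n;
          return c->value;
      }

      int main(void) {
          struct counter c;
          counter_init(&c);                     /* forget this call and c is indeterminate */
          printf("%d\n", counter_add(&c, 5));   /* prints 5 */
          return 0;
      }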

  • parable (Score:4, Insightful)

    by Megane ( 129182 ) on Friday November 15, 2013 @09:24AM (#45432427)

    One day the student came to the master and said "There are too many programming languages! I am tired of having to learn thirty programming languages! I shall write a new programming language to replace them all!"

    The master smacked the student upside the head. "Idiot! Then we would all have to learn thirty-one programming languages!" The student was enlightened.

    Unfortunately, it was only the one student who was enlightened. Now we all have to learn fifty programming languages.

  • Seriously, C is that awesome.

    If C doesn't work, import python.

  • by DdJ ( 10790 ) on Friday November 15, 2013 @10:05AM (#45432897) Homepage Journal

    This fellow needs to work on his self-esteem.

    • by Prune ( 557140 )
      He's been putting his name on everything he's produced since Mathematica. Actually, it looks like now even that has become "Wolfram Mathematica", according to the website.
  • Both did work with cellular automata, 50 years apart
  • I'm not saying Wolfram can't pull this off, but I've been programming for a long, long time and mastering the pesky "nitpicking complexity" is one thing good programmers do very well.

    I wish him well, but I remain skeptical. I hope the result doesn't devolve into "click here and here and here."
  • ... which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware ...

    Isn't that what fortran does?

  • by T.E.D. ( 34228 ) on Friday November 15, 2013 @11:23AM (#45433833)

    The most concrete detail I could find anywhere on his website about it was his repeated characterization of the language as "knowledge-based".

    Now, unless he has some whole new meaning in mind, that isn't a totally new concept in languages. We generally call such languages "AI languages" (or, more technically, Inference Engines [wikipedia.org] or Reasoning Engines [wikipedia.org] or whatever).

    The general idea is that the programmer's job is to write rules. Then you feed the engine your rules and a set of facts (or an operating environment it can go get "facts" from), and it will follow whatever rules it needs to. The language/system of this kind that programmers here will probably be most familiar with is make [gnu.org].

    It sounds cool, but I think a lot of folks here might find the concept of something like make being the answer to all their programming difficulties a case of the cure being worse than the disease.
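
    As a toy illustration of that rule-based style (the facts and rule names are invented; this is not how make or the Wolfram Language is implemented): the programmer supplies rules, and the engine decides which ones to fire and in what order, looping until nothing new can be derived.

    #include <stdio.h>

    /* Facts as bits; a rule fires when all its premise bits are known, adding its
       conclusion bit. The engine loops until no rule can add anything new. */
    enum { HAVE_SOURCE = 1, HAVE_COMPILER = 2, HAVE_OBJECT = 4, HAVE_BINARY = 8 };

    struct rule { unsigned premises, conclusion; const char *name; };

    static struct rule rules[] = {
        { HAVE_SOURCE | HAVE_COMPILER, HAVE_OBJECT, "compile" },
        { HAVE_OBJECT,                 HAVE_BINARY, "link"    },
    };

    int main(void) {
        unsigned facts = HAVE_SOURCE | HAVE_COMPILER;   /* the initial fact set */
        int changed = 1;
        while (changed) {
            changed = 0;
            for (unsigned i = 0; i < sizeof rules / sizeof rules[0]; i++) {
                struct rule *r = &rules[i];
                if ((facts & r->premises) == r->premises &&
                    (facts & r->conclusion) != r->conclusion) {
                    printf("firing rule: %s\n", r->name);
                    facts |= r->conclusion;
                    changed = 1;
                }
            }
        }
        printf("final facts: 0x%x\n", facts);           /* 0xf: everything derived */
        return 0;
    }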

    • I took a class in rule-based systems once. I started thinking of how I'd verify correctness, and didn't really think of much. The advantage of conventional programming languages is that, within very broad limits, you can tell what the computer is going to do with the program. I think that is largely lost with complex rule-based systems.

      Languages like make and yacc can get away with that because of their limited scope. If I screw up with an LALR(1) grammar, yacc will find the reduce-reduce errors and tell me about them.

  • by cyberchondriac ( 456626 ) on Friday November 15, 2013 @01:11PM (#45435291) Journal
    Personally, just for the coolness factor, I think he should name it, "Wolf", for maybe one of the following:

    * World Object Language Framework
    or
    * Wolfram Object Language Framework

    I'm just barking at the moon... I really have no idea what I'm talking about.
  • Stephen Wolfram is a brilliant businessman who has made a fortune charging what the market will bear for Mathematica and Alpha. Will that model break down with the Wolfram programming language? I think it will. ParcPlace tried to sell Smalltalk for a while and the language stagnated until Alan Kay was able to get Squeak going. I can't imagine anything becoming as popular as Python or C++ if it costs thousands of dollars to get into the game.

    Perhaps Wolfram will patent some of his ideas and then they will be off-limits to everyone else.
