Stephen Wolfram Developing New Programming Language

Nerval's Lobster writes "Stephen Wolfram, the chief designer of the Mathematica software platform and the Wolfram Alpha 'computational knowledge engine,' has another massive project in the works, although he's remaining somewhat vague about the details for the time being. In simplest terms, the project is a new programming language, which he's dubbing the 'Wolfram Language,' that will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog posting. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in. At this year's SXSW, Wolfram alluded to his decades of work coming together in 'a very nice way,' and this is clearly what he meant. And while it's tempting to dismiss anyone who makes sweeping statements about radically changing the existing paradigm, he does have a record of launching very big projects (Wolfram Alpha contains more than 10 trillion pieces of data cultivated from primary sources, along with tens of thousands of algorithms and equations) that function reliably. At many points over the past few years, he's also expressed a belief that simple equations and programs can converge to create and support enormously complicated systems. Combine all those factors, and it's clear that Wolfram's pronouncements, no matter how grandiose, can't simply be dismissed. But it remains to be seen how much of an impact he actually has on programming as an art and science."
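As a rough illustration of the style being described, here is a minimal sketch using functions that do exist in today's Wolfram Language/Mathematica (the toy classifier data is invented for this example):

    (* Algorithm selection is implicit: Integrate chooses its own strategy. *)
    Integrate[Exp[-x^2], {x, -Infinity, Infinity}]   (* returns Sqrt[Pi] *)

    (* Parallelizing a computation is a one-word change: *)
    primes = ParallelTable[Prime[k], {k, 1, 1000}];

    (* Automated machine learning: Classify picks a method by itself. *)
    c = Classify[{1 -> "odd", 2 -> "even", 3 -> "odd", 4 -> "even", 5 -> "odd"}];
    c[8]   (* method and features are chosen automatically; on toy data the prediction may vary *)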
Comments:
  • Re:Well... (Score:4, Insightful)

    by rudy_wayne ( 414635 ) on Friday November 15, 2013 @09:26AM (#45431953)

    Even really smart people come up with stupid ideas.

    Anything that is capable of doing complex things is complex itself. It's unavoidable. Even if every function by itself is extremely simple -- just press the green button -- what happens when there are a thousand buttons, and any one of them can interact with any other button?
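    To put a rough number on that worry (a back-of-the-envelope sketch in Mathematica, using the thousand buttons from the example above):

        Binomial[1000, 2]   (* 499500 possible pairwise interactions among 1000 buttons *)

    And that counts only the pairs; triples and longer interaction chains grow far faster.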

  • by korbulon ( 2792438 ) on Friday November 15, 2013 @09:35AM (#45432001)
    Stephen Wolfram is the George Lucas of scientific computing.
  • by smpoole7 ( 1467717 ) on Friday November 15, 2013 @10:09AM (#45432265) Homepage

    I don't program for a living anymore, and I've always been more of a system-level, hardware driver kind of guy, so C/C++ work fine for me.

    But especially coming from that background, my need isn't for another programming language, it's for better documentation of available libraries. For any common task that I want to do, somebody has probably written a great library that I can just strap in and use.

    The problem is when I start trying to use it. The documentation has blank "TBD" pages, or really helpful comments like, "init_lib() -- initializes the library. You can specify the # of flickers per bleem ..."

    Or ... and this is my 2nd favorite ... the documentation is out of date. "Hey, I tried to do this the way your tutorial said and it didn't work?" "Oh, yeah, that tutorial is out of date; we've changed some stuff ..."

    My very most #1 favorite is automatically generated documentation that looks at (for example) a C++ class and then creates an HTML page. I might as well just look at the source code ... hoping, of course, that the people who wrote that source actually inserted more than a few "does what it says" comments. Or that I don't have to play the Spaghetti Trace(tm) game, bouncing from one .c file to another .h file and back to a third .c (and this is after repeated greps in the "src" directory) to try to figure out what's happening to my poor variable while it's inside a function.

    Not criticizing FOSS, per se; I understand that it's written by volunteers (for whom I'm very grateful). But this, rather than needing a new way to "PRINT" or "SORT" in a programming language, is the far bigger problem, in my book.
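    (As an aside, the system under discussion happens to treat documentation as part of the language itself. A minimal Mathematica sketch, with a made-up function name and made-up semantics:)

        myRate::usage = "myRate[x] returns the flicker rate for a bleem count x.";
        myRate[x_?NumericQ] := 2 x + 1
        ?myRate   (* prints the usage string attached above *)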

  • by alispguru ( 72689 ) <bob,bane&me,com> on Friday November 15, 2013 @10:18AM (#45432365) Journal

    1. Wolfram is a notorious Lisp disser [ymeme.com], and Mathematica is arguably a shining example of Greenspun's tenth rule [wikipedia.org].

    2. Lisp has a long history of trying to help programmers, with mixed results. The term DWIM [wikipedia.org] was coined by Warren Teitelman in 1966 as part of a project based on BBN Lisp, the main predecessor of Interlisp; this project of his sounds like DWIM writ large.
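    (The Greenspun point is easy to see from inside Mathematica itself; a quick sketch:)

        FullForm[a + b*c]   (* Plus[a, Times[b, c]]: every expression is head[arguments...] *)

    That uniform expression tree is essentially an S-expression with square brackets, which is why the Lisp comparison keeps coming up.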

  • parable (Score:4, Insightful)

    by Megane ( 129182 ) on Friday November 15, 2013 @10:24AM (#45432427)

    One day the student came to the master and said "There are too many programming languages! I am tired of having to learn thirty programming languages! I shall write a new programming language to replace them all!"

    The master smacked the student upside the head. "Idiot! Then we would all have to learn thirty-one programming languages!" The student was enlightened.

    Unfortunately, it was only the one student who was enlightened. Now we all have to learn fifty programming languages.

  • Re:Well... (Score:5, Insightful)

    by plover ( 150551 ) on Friday November 15, 2013 @10:25AM (#45432443) Homepage Journal

    People seem to think that the problems with programming come from the languages. They're too weakly typed, too strongly typed, they use funny symbols, they don't have enough parentheses, they use significant whitespace.

    The biggest problems aren't coming from the languages. The problems come from managing the dependencies.

    Everything needs to change state to do useful work. But each state has all these dependencies on prior states, and is itself often setting up to perform yet another task. Non-programmers even have a cute phrase for it: "getting your ducks in a row" is an expression meaning that if you get everything taken care of in advance, your task will be successful.

    Ever notice that on a poorly done task it's so much easier to throw away the prior work and start over? That's because you've already solved the hard part: you learned through experience which things need to be placed in which order, and that was the root of the hard problem in the first place. When you redo it, you naturally organize the dependencies in their proper order, and the task becomes easy.

    What a good language has to do is encapsulate and manage these relationships between dependencies. It might be something like a cross between a PERT chart, a sequence diagram, a state chart, and a timeline. Better, the environment should understand the dependencies of every component to the maximum degree possible, and prevent you from assembling them in an unsuccessful order.
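    As a rough sketch of that idea in the Wolfram Language (the task names are made up):

        (* Model tasks as a directed graph: an edge a -> b means a must happen before b. *)
        deps = {"loadConfig" -> "fetchData", "fetchData" -> "cleanData",
                "cleanData" -> "train", "loadConfig" -> "train"};
        TopologicalSort[Graph[deps]]
        (* {"loadConfig", "fetchData", "cleanData", "train"}: one valid ducks-in-a-row ordering *)

    A cycle in that graph is exactly the kind of dependency knot such an environment should refuse to let you assemble.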

    Get the language to that level, and we won't even need the awkward syntax of "computer, tea, Earl Grey, hot."

  • Re:Well... (Score:5, Insightful)

    by Greyfox ( 87712 ) on Friday November 15, 2013 @10:55AM (#45432761) Homepage Journal
    It's very rare that I see a dev team throw something away, unless it's an entire project. Once something's written, people seem to treat it as carved in stone, and they never look at it again. I was looking at some code a while back that output a file full of numbers in a particular format. Over the years the format had changed a few times. Their solution for addressing that was to write a new piece of code that took the original output file, and reformatted it to the new output version. The next time a format change came along, they did the same thing again using the file their reformatter had output! There was a lot of other nonsense in there that was apparently written so that they'd never have to go back and change anything that had already been written. And that kind of mentality seems to be pervasive in the industry (though usually not to THAT extreme.)

    So people bitch about that, or about the business process, and I tell them, "Well, if it's not working for you, FIX it! It doesn't HAVE to be this way; we could just do things differently!" And they look at me as if I'd just suggested the Earth is flat.

  • Re:Well... (Score:5, Insightful)

    by fuzzyfuzzyfungus ( 1223518 ) on Friday November 15, 2013 @11:03AM (#45432875) Journal

    Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference are one thing, but assumptions are another matter entirely. It's like the danger in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.

    This is Stephen "A New Kind of Science" Wolfram. The guy who cataloged some cellular automata (and had his uncredited research peons catalog a bunch more) and then decided that he'd answered all the profound questions of metaphysics. I'm not sure that banal matters of 'software engineering' are his problem anymore.

    A very sharp guy. However, like many sharp guys, he seems to have entered his obsessive pseudoscience and grandiosity phase. (Same basic trajectory as Kurzweil, whose achievements shouldn't be underestimated, but who now basically evangelizes for nerd cyber-jesus full time.)

  • by wickerprints ( 1094741 ) on Friday November 15, 2013 @11:04AM (#45432881)

    Being primarily a mathematician and not a computer scientist or engineer, I have used Maple, Mathematica, and R. At one point I knew Pascal and C. I've dabbled in Python.

    Of all these programming languages, Mathematica was BY FAR the easiest language for me to learn to use. The way it does certain things makes so much more sense to me than the others--for example, how it handles functions and lists. Unlike C, it's a high-level language if you want it to be, although you aren't forced to use it in that way. Pattern matching is extremely powerful. And the syntax is totally unambiguous; brackets define functions, braces define lists, and parentheses are used only for algebraic grouping of terms.
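    A quick taste of those conventions (standard Mathematica; the toy definitions are mine):

        f[x_] := x^2 + 1               (* square brackets: function definition and application *)
        data = {1, 2, 3, 4};           (* braces: a list *)
        f /@ data                      (* map f over the list: {2, 5, 10, 17} *)
        g[x_Integer?EvenQ] := "even"   (* pattern matching with a test *)
        g[_] := "other"
        g[4]                           (* "even" *)
        Expand[(a + b) (a - b)]        (* parentheses only group terms: a^2 - b^2 *)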

    The major criticism I have of Mathematica is that it is comparatively slow, mainly because it makes so few assumptions about the nature of its inputs. Internally, it tries to preserve numerical precision: it works with arbitrary-precision arithmetic and doesn't assume values are machine precision. All this comes at a cost. Also, reading other people's code can be remarkably difficult, even if it's commented. The tendency is to write functions that do a lot of complicated things in one command, so code can be extremely dense.

    Most recently, I have had to learn R, due to its abundance of statistical algorithms, many of which have not been implemented in Mathematica. There was a simple example where I tried to calculate a Bayes factor, and the expression was something like (1 - x)/(1 - y), where x and y were very small positive numbers, on the order of 10^-15. This calculation totally failed in R -- the answer given was 1. Mathematica correctly calculated the ratio. Maybe I don't know enough about R to preserve the necessary numerical precision, but it shows that Mathematica handles such issues automatically; moreover, if there is a potential problem, Mathematica warns you.
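    The effect is easy to reproduce (a sketch that assumes x and y are exactly 10^-15 and 2*10^-15; the original values weren't given):

        x = 10^-15; y = 2*10^-15;
        N[(1 - x)/(1 - y), 20]              (* exact arithmetic first: 1.0000000000000010000... *)
        (1. - 10.^-15)/(1. - 2.*10.^-15)    (* machine precision: most digits of the tail are rounding noise *)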

    Anyway, this is all just personal opinion, really. The takeaway for me is that I see a lot of evidence that Stephen Wolfram is pretty good at designing computer languages for specific purposes. Yes, he's totally egocentric, but there's no denying that he is brilliant. When Wolfram|Alpha debuted, I remember thinking how totally stupid it was. And now ... every single high school and college math student knows about it. It is one of the most ingenious marketing ploys I have ever seen. And the scary thing is, it keeps improving. It's almost Star Trek-like in its ability to parse natural language input. And I think that's the direction computer programming will eventually evolve towards: programs will not be written in code, but stated as broad sentences, parsed by an AI which automatically performs the high-level task.

  • Re:Well... (Score:2, Insightful)

    by Anonymous Coward on Friday November 15, 2013 @01:10PM (#45434477)

    Or they don't rewrite a module from scratch because it is too intermingled with a hugely complicated system and you cannot guarantee you will not miss something. On my own personal Android projects I have no problem rewriting parts, no matter how involved or integrated with the whole program, because I understand how all the parts work together. At work the project is too big to understand how everything works together. Too many other parts assume that things work the way they have been, and if you change something you could easily miss testing some obscure section that seems totally unrelated but is still touched by those changes. It's much easier to make the changes you need in as small a way as possible, to avoid breaking things.

    The file conversion does seem pretty ridiculous; it would make sense to do each new conversion from the original rather than a conversion of a conversion. Or even to convert just once, so that all the files are kept in the new format and no more converting needs to be done. But if there are parts of the project that still need those intermediate versions, that complicates things: you'd need to change those parts, and maybe that breaks other parts, and so on. The risk can be too high for the little reward in some cases.
