Stephen Wolfram Developing New Programming Language
Nerval's Lobster writes "Stephen Wolfram, the chief designer of the Mathematica software platform and the Wolfram Alpha 'computational knowledge engine,' has another massive project in the works—although he's remaining somewhat vague about details for the time being. In simplest terms, the project is a new programming language—which he's dubbing the 'Wolfram Language'—that will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog post. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in. At this year's SXSW, Wolfram alluded to his decades of work coming together in 'a very nice way,' and this is clearly what he meant. And while it's tempting to dismiss anyone who makes sweeping statements about radically changing the existing paradigm, he does have a record of launching very big projects (Wolfram Alpha contains more than 10 trillion pieces of data cultivated from primary sources, along with tens of thousands of algorithms and equations) that function reliably. At many points over the past few years, he's also expressed a belief that simple equations and programming can converge to create and support enormously complicated systems.
Combine all those factors together, and it's clear that Wolfram's pronouncements—no matter how grandiose—can't simply be dismissed. But it remains to be seen how much of an impact he actually has on programming as an art and science."
Well... (Score:5, Interesting)
Hrm, another programming language...
Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).
Still, I look forward to seeing it; perhaps I'll be pleasantly surprised.
Re:Well... (Score:5, Informative)
Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference are one thing, but assumptions are another matter entirely. It's like the danger in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.
Re:Well... (Score:5, Insightful)
People seem to think that the problems with programming come from the languages. They're too weakly typed, too strongly typed, they use funny symbols, they don't have enough parentheses, they use significant whitespace.
The biggest problems aren't coming from the languages. The problems come from managing the dependencies.
Everything needs to change state to do useful work. But each state has all these dependencies on prior states, and is itself often setting up to perform yet another task. Non-programmers even have a cute phrase for it: "getting your ducks in a row" is an expression meaning that if you get everything taken care of in advance, your task will be successful.
Ever notice that on a poorly done task it's so much easier to throw away the prior work and start over? That's because you've already solved the hard part: you learned through experience what things need to be placed in which order, which was the root of the hard problem in the first place. When you redo it, you naturally organize the dependencies in their proper order, and the task becomes easy.
What a good language has to do is encapsulate and manage these relationships between dependencies. It might be something like a cross between a PERT chart, a sequence diagram, a state chart, and a timeline. Better, the environment should understand the dependencies of every component to the maximum degree possible, and prevent you from assembling them in an unsuccessful order.
Get the language to that level, and we won't even need the awkward syntax of "computer, tea, Earl Grey, hot."
Re:Well... (Score:5, Insightful)
So people bitch about that, or about business processes, and I tell them "Well, if it's not working for you, FIX it! It doesn't HAVE to be this way; we could just do things differently!" And they look at me as if I'd just suggested the Earth is flat.
Re:Well... (Score:4, Interesting)
Re: (Score:2, Insightful)
Or they don't rewrite a module from scratch because it is too intermingled with a hugely complicated system and you cannot guarantee you will not miss something. On my own personal Android projects I have no problem rewriting parts of it no matter how involved or integrated within the whole program it is because I understand how all the parts work together. At work the project is too big to understand how everything works together. There are too many other parts that are assuming that things work the way th
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
In theory, if a node-based graphical programming editor is created right, it should be able to output code in the syntax of any programming language you choose. Diagram what you want, click the button for C++/JavaScript/Python/etc., and you get the desired source saved out for use in your compiler or interpreter of choice. So it really wouldn't have to be a language, but more like a meta-language.
Some people seem to be confused by the idea that whatever meta-language a program is written in becomes the primary source code. If you write a UML diagram and generate code from it, the UML diagram is the actual source code, not the .c++ and .h files it spits out. It also makes developers uncomfortable when there are blocks of C++ code they must never touch. (Microsoft understands this, and so they carefully try to make their visual editors work both ways, where they parse the C# code to alter the values o
Re:Well... (Score:5, Insightful)
Perhaps, but I can't help thinking that making assumptions will lead to unpredictable and inconsistent behaviour. Convention over configuration and type inference are one thing, but assumptions are another matter entirely. It's like the danger in lower-level languages where a programmer assumes memory will be zeroed ... and _usually_ it is. It leads to obscure errors. There's a lot to be said for being explicit where possible.
This is Stephen "A New Kind of Science" Wolfram. The guy who cataloged some cellular automata (and had his uncredited research peons catalog a bunch more) and then decided that he'd answered all the profound questions of metaphysics. I'm not sure that banal matters of 'software engineering' are his problem anymore.
A very sharp guy. However, like many sharp guys, he seems to have entered his obsessive pseudoscience and grandiosity phase. Same basic trajectory as Kurzweil, whose achievements are not to be underestimated, but who now basically evangelizes for nerd cyber-jesus full time.
Re: (Score:3)
Which is exactly why this may be a fascinating language. Even if it's completely absurd and impractical, whatever ideas he's cooking up may at least be entertaining and/or thought-provoking.
Re: (Score:2)
Which is exactly why this may be a fascinating language. Even if it's completely absurd and impractical, whatever ideas he's cooking up may at least be entertaining and/or thought-provoking.
So a bit like Lisp?
Re:Well... (Score:5, Interesting)
Being explicit is precisely what makes programming laborious and tedious. It is entirely true that without such tediousness, you do not enjoy a full range of functionality. But the vast majority of the time you do not need a full range of functionality.
Speaking as someone in a scientific major, Wolfram|Alpha has quickly become the go-to resource for everyone looking to do quick, more-than-arithmetical calculation. It does a fantastic job of anticipating what you need and providing the appropriate options. If I need a differential equation integrated or the root of some expression I *can* ask for it explicitly, but usually I just type in the expression and everything I need will be generated by Wolfram automatically. For involved projects I do set up my problems in Python, but 99% of the time Wolfram|Alpha does just what I need for a hundredth of the effort. The fact my peers are using it the same way is notable because, while before Wolfram I might use Python or Maple or Mathematica, most everyone else would do these things by hand -- learning to use the available tools was something they considered too intimidating or not worth the effort.
If Stephen Wolfram can do something vaguely like Wolfram|Alpha with more ability to customize and automate what is happening, it's going to transform academics, maybe even down to the high school level. Imagine being able to easily develop a science fair project which requires solving some complicated ODEs, without having to take 3 years of college math first.
Re: (Score:2)
You will still need the 3 years of college math just to understand what you did.
Sometimes yes, and sometimes no.
For a simple example:
You need integral calculus to find the area underneath a curve, and a good understanding of limits and other pre-calculus to understand how the integration itself works.
Wolfram et al. don't change that.
However, you don't need integral calculus to ask: "what's the area under this curve" and understand the answer.
Similarly carbon dating uses ODEs, but even a child can understa
Re: (Score:2, Interesting)
Lisp worked well. So much so that most of the languages since C basically go "Here's our idea: we're going to be like C in base thinking, but extra. What extra? Well, we're going to add Lisp-like features, usually in a half-baked manner." The only really major variation is languages targeting multicore operation, but they tend to be functional, like Lisp.
Problem with C is that it's a high-level assembly. Great for computers as they were in the 1970s and 1980s.
Problem back then was lisp was too heavy. P
Re: (Score:3)
-Greenspun's Tenth Rule.
Re: (Score:2)
... that will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware from PCs and smartphones all the way up to datacenters and embedded systems. The language will leverage automation to cut out much of the nitpicking complexity that dominates current programming. 'The Wolfram Language does things automatically whenever you want it to,' he wrote in a recent blog post. 'Whether it's selecting an optimal algorithm for something. Or picking the most aesthetic layout. Or parallelizing a computation efficiently. Or figuring out the semantic meaning of a piece of data. Or, for that matter, predicting what you might want to do next. Or understanding input you've given in natural language.' In other words, he's proposing a general-purpose programming language with a mind-boggling number of functions built right in.
Well, that's pretty much a description of Common Lisp in the hands of a capable lisper. ;-)
Re: (Score:2)
C also has a couple of features that appear to have come from Lisp:
1. Pointers to functions, which allow functions to be passed, etc.
2. A macro expansion phase prior to compilation.
Re:Well... (Score:4, Insightful)
Even really smart people come up with stupid ideas.
Anything that is capable of doing complex things is complex itself. It's unavoidable. Even if every function by itself is extremely simple -- just press the green button -- what happens when there are a thousand buttons? And any one of them can interact with any other button.
Re:Well... (Score:5, Funny)
Hrm, another programming language...
Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).
Too many people think that programming is "just a lot of typing," which leads them to believe they should create a "new programming language" where you can just type "Make a new better version of Facebook" and be done with it.
Which leads to a lot of crap with "Visual" in its name. Hey look, you don't have to type! Just drag this widget from here to here. And we've seen how well that turned out.
Re: (Score:2)
Re: (Score:2)
You don't understand the real problem with visual programming systems. They can be extremely powerful, but they tend to use so much screen space to do anything that you can't see very much of what's going on. So you need to hold too much in memory.
I remember Prograph on the Mac (before it died trying to transition to MSWind95). It was an excellent dataflow language, and if it were alive now, it would be a natural for multi-CPU platforms. But it was too visual, which made it nearly impossible to have a pr
Re: (Score:2)
Attempts have been made in the past to automate programming; it's never worked very well (or at all, in some cases).
The places where it does work, you don't notice. Compilers/optimizers/JIT engines are automated programming. You tell the system what you want to do, and behind the scenes it figures out all the stuff you did not tell it. Like not actually checking X again if you checked it earlier and X could not have changed, even though you told it to check X again because it was easier for you to write it that way.
That said, we have words for this in Perl5/Perl6, DWIM (Do What I Mean) and WAT (acronym open to co
Automatic programming works, and redefines itself (Score:3)
Attempts have been made in the past to automate programming; it's never worked very well
On the contrary. The first attempt to automate programming produced assembly language, which automated the assignment of addresses to variables and instructions. The second one produced FORTRAN, which automated the "programming" of formulae into sequences of individual operations. Every time we successfully automate some programming activity, the nature of programming changes.
Mike O'Donnell
Automated programming succeeds, redefines itself (Score:3)
Attempts have been made in the past to automate programming; it's never worked very well
On the contrary, automated programming has worked repeatedly, each time redefining "programming":
Each time someone automated "programming," the word stopped referring to the automated part, and referred to the remaining part of alg
Meh... (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
I have one going cheap here. It's just a copy of a pointer to a char which I am using globally in a multithreaded program with no semaphores or mutexes. It will probably work as long as you use it quickly, and only read its contents.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
0x3a28213a
0x6339392c
0x7363682e
Re: (Score:2)
Sure, buddy! (I know I dropped one somewhere around here...)
Re: (Score:2)
Say brother, can you spare a pointer?
Should have a heap of them around here somewhere. Let me peek in the register. Ah yes, here's a stack!
Re: (Score:2)
Re: (Score:2)
The problem is not what languages we need. We don't actually need a high-level language other than C. The problem is finding useful languages for various purposes.
There are a lot of programs in assorted languages that would be very difficult to write in C, and are much easier to write in other languages. Many things are much easier to write in C++, Lisp, or Prolog than in C, to name three languages. The different syntax is necessary to get the ease of writing. You can write object-oriented C program
The main innovation of course being ... (Score:5, Interesting)
that you will have to pay a lot of money to use it?
Re: (Score:3)
that you will have to pay a lot of money to use it?
If the work that needs to be done could be done quicker or simpler (i.e. cheaper) by paying a $1000 license rather than having a $300,000-per-year researcher go learn Python or R, then it is worth it to pay, no? The current options aren't going away.
Re: (Score:2)
Thanks for mentioning that. And thanks for fleshing out the bugs in Rekonq!
Re: (Score:3)
If the programming language relies on remote servers (basically Wolfram Alpha) in order to function it would make sense that it would cost money. It costs money to hire people to make and improve a system like Wolfram Alpha.
If people got over the idea of having everything on their computers for free the world would have a lot less corporate snooping and a lot less ad spamming. That would be nice.
Re: (Score:2)
And several of the current tech giants would shrivel up and die, and that would be even nicer. :-)
Re: (Score:2)
Re: (Score:2)
How cutely naive! If a programming language costs money and relies on remote servers you expect corporate snooping to decrease? I think hell would freeze over first.
I did not say that.
If you have two programming languages that depend on remote servers, one that's free as in gratis and one that has fees, I would expect the one that has fees to value and respect your privacy more than the one that is free.
Re: (Score:2)
"Add a google Ad box to the upper left corner below the logo"
"Make it fit under the logo nicely."
"Make it blink."
...
Noooooooooo!
Just Call It "Wolf" (Score:4, Funny)
that way if we make a programming error we can just comment "Bad Wolf" (too much exposure to Dr Who recently) :P
His next project is interesting (Score:5, Funny)
Re: (Score:3)
And naturally, he'll call it Wolfram paper. :-)
Re:His next project is interesting (Score:4, Funny)
I know you're trying to be funny by implying he's reinventing the wheel, but ironically, there's more than one way to clean your ass. In some countries, they use water streams rather than TP. There's not just one unique solution to each problem.
You just made my point. There are already multiple and satisfactory ways to clean one's ass.
Re: (Score:2)
I know you're trying to be funny by implying he's reinventing the wheel, but ironically, there's more than one way to clean your ass. In some countries, they use water streams rather than TP. There's not just one unique solution to each problem.
You just made my point. There are already multiple and satisfactory ways to clean one's ass.
So maybe he's developing the ultrasonic ass-cleaning device? Guaranteed to leave your ass sparkly clean in 1/10th the time it takes for 'traditional' methods, and no chance the dog will drag it all over the house while you're at work. Also takes care of unsightly butt-hair...
Re: (Score:2)
Unless you miss one drunken evening and find you are now a eunuch.
Oh boy. (Score:5, Funny)
First a new kind of SCIENCE, now a new kind of PROGRAMMING.
Can't wait for a new kind of LOVE.
Re: (Score:2)
First a new kind of SCIENCE, now a new kind of PROGRAMMING.
Can't wait for a new kind of LOVE.
Given the challenges many face with the old kind I doubt we are ready to face a new kind...
Re: (Score:2)
Phantom Minus Minus (Score:5, Insightful)
Let me guess: (Score:2)
He won't publish it under a free software license...
Automatic everything? (Score:2)
So you can do anything you want with Wolfram language? The only limit is your imagination?
Will the first project be the long-awaited 1.0 version of Zombo.com [zombo.com]?
One hell of a language (Score:5, Informative)
Well, either he's created the mother of all LISP macros, or it's simply vaporware. I'd love to see it when they publish it. Code or it didn't happen.
Here is the obligatory xkcd [xkcd.com], panel two.
That's funny twice, considering... (Score:4, Insightful)
1. Wolfram is a notorious Lisp disser [ymeme.com], and Mathematica is arguably a shining example of Greenspun's tenth rule [wikipedia.org].
2. Lisp has a long history of trying to help programmers, with mixed results. The term DWIM [wikipedia.org] was coined by Warren Teitelman in 1966 as part of a project based on BBN Lisp, the main predecessor of Interlisp; this project of his sounds like DWIM writ large.
Re: (Score:2)
Please get back to us when Lisp M-expressions are finished.
Here you go... (Score:2)
If Dylan [wikipedia.org] isn't good enough for you, your problem isn't non-S-expression syntax.
Not a story (Score:3)
Libraries And Documentation (Score:5, Insightful)
I don't program for a living anymore, and I've always been more of a system-level, hardware driver kind of guy, so C/C++ work fine for me.
But especially coming from that background, my need isn't for another programming language, it's for better documentation of available libraries. For any common task that I want to do, somebody has probably written a great library that I can just strap in and use.
The problem is when I start trying to use it. The documentation has blank "TBD" pages, or really helpful comments like, "init_lib() -- initializes the library. You can specify the # of flickers per bleem ..."
Or ... and this is my 2nd favorite ... the documentation is out of date. "Hey, I tried to do this the way your tutorial said and it didn't work?" "Oh, yeah, that tutorial is out of date; we've changed some stuff ..."
My very most #1 favorite is automatically generated documentation that looks at (for example) a C++ class and then creates an HTML page. I might as well just look at the source code ... hoping, of course, that the people who wrote that source actually inserted more than a few, "does what it says" comments. Or that I don't have to play the Spaghetti Trace(tm) game, bouncing from one .c file to another .h file and back to a third .c (and this is after repeated greps in the "src" directory) to try to figure out what's happening to my poor variable while it's inside a function.
Not criticizing FOSS, per se; I understand that it's written by volunteers (for whom I'm very grateful). But this, rather than needing a new way to "PRINT" or "SORT" in a programming language, is the far bigger problem, in my book.
Re: (Score:2)
I fully agree with this.
Just finding libraries, configuring them, and learning to use them is pretty hard sometimes. .NET/Java make this a bit easier: just import the jar/.dll and away you go.
Some Perl distributions make this easier with a package manager.
I have no idea what Wolfram has, but it would be pretty cool if it managed to do a lot of this. Centralized package management. Maybe it scans your code, sees what you're trying to do and then chooses an optimal function in some library (hopefully offers
Re: (Score:2)
Interestingly, you claim your choice of programming language suits your requirements, but you then list a bunch of issues endemic to it that are mitigated or absent in other languages.
For example, the need to sometimes, but not always, initialize objects, libraries, or whatever is typical of C/C++ code, but rare in Java or C#, where constructors or factory methods usually do that automatically for you on demand. The worst I've seen is some Microsoft C++ code where every object had both a C++ constructor and an
parable (Score:4, Insightful)
One day the student came to the master and said "There are too many programming languages! I am tired of having to learn thirty programming languages! I shall write a new programming language to replace them all!"
The master smacked the student upside the head. "Idiot! Then we would all have to learn thirty-one programming languages!" The student was enlightened.
Unfortunately, it was only the one student who was enlightened. Now we all have to learn fifty programming languages.
It's called C. (Score:2)
Seriously, C is that awesome.
If C doesn't work, import python.
"Wolfram Language"? (Score:5, Funny)
This fellow needs to work on his self-esteem.
Re: (Score:2)
like he [re]invented von Neumann's 1948 physics? (Score:2)
"cut out much of the nitpicking complexity" (Score:2)
I wish him well, but I remain skeptical. I hope the result doesn't devolve into "click here and here and here."
Sounds like something I have heard of before (Score:2)
... which will allow developers and software engineers to program a wide variety of complex functions in a streamlined fashion, for pretty much every single type of hardware ...
Isn't that what Fortran does?
Knowledge-based programming (Score:3)
The most concrete detail I could find anywhere on his website was his repeated characterization of the language as "knowledge-based".
Now, unless he has some whole new meaning in mind, that isn't a totally new concept in languages. We generally call such languages "AI languages" (or, more technically, Inference Engines [wikipedia.org] or Reasoning Engines [wikipedia.org] or whatever).
The general idea is that the programmer's job is to write rules. Then you feed the engine your rules and a set of facts (or an operating environment it can go get "facts" from), and it will follow what rules it needs to. The language/system of this kind that programmers here will probably be most familiar with is make [gnu.org].
It sounds cool, but I think a lot of folks here might find the concept of something like make being the answer to all their programming difficulties a case of the cure being worse than the disease.
Re: (Score:2)
I took a class in rule-based systems once. I started thinking of how I'd verify correctness, and didn't really come up with much. The advantage of conventional programming languages is that, within very broad limits, you can tell what the computer is going to do with the program. I think that is largely lost with complex rule-based systems.
Languages like make and yacc can get away with that because of their limited scope. If I screw up with an LALR(1) grammar, yacc will find reduce-reduce errors and tell
All in a name (Score:3)
* World Object Language Framework
or
* Wolfram Object Language Framework
I'm just barking at the moon... I really have no idea what I'm talking about.
Is Wolfram willing to make it free? (Score:2)
Stephen Wolfram is a brilliant businessman who has made a fortune charging what the market will bear for Mathematica and Alpha. Will that model break down with the Wolfram programming language? I think it will. ParcPlace tried to sell Smalltalk for a while, and the language stagnated until Alan Kay was able to get Squeak going. I can't imagine anything becoming as popular as Python or C++ if it costs thousands of dollars to get into the game.
Perhaps Wolfram will patent some of his ideas and then they will
Re:yet another programming language (Score:4, Informative)
But this one is ostensibly designed by Stephen Wolfram, who knows what scientists and physicists need from a programming language.
Python, C, Java, et al. were all designed by computer programmers for computer programmers. R and Matlab were designed by computer programmers for mathematicians, and thus work a lot better for expressing certain mathematical concepts and working with them (transformations, statistics). But there is much room for improvement, especially when looking at the problem from the scientist's point of view, not from the programmer's point of view.
Re:yet another programming language (Score:5, Interesting)
Consider that the answer may be completely the opposite of what you assume. Perhaps we just teach kids math with programming. Then, just like long division or integration, etc., they won't have a problem explaining their desires to computers.
Hell, I have a _BEST_SORT() macro which, together with my collections library's _GEN_PROFILED_H directive, will select the actual best sort method on the next compile after profiling to PROVE which sort is best for the scale of the problem space, instead of guessing. Predicting what I want to do next? Yep, my brain even does that for me automatically. All I have to do is explain to the computer what I want to have happen, and it happens. IMO, the problem is the way mathematics is taught. A sigma is a for loop. The latter is more verbose, but if they'd been taught for loops instead of sigmas they'd be programmers. It's sort of ridiculous when you think about teaching kids the old way: "I'll never use this in real life", meanwhile they can utilize programming in, say, JavaScript, to take better control of every damn device they own right now... Teachers just failed to tell them how.
Seriously, I've taught pre-teens how to code as a remedy for flunking out of mathematics; instantly they're able to see the utility of the tool. Humans are tool-using creatures; no wonder they have a hard time learning to use tools that aren't immediately useful to them. The flunkers are actually being smarter than the teachers.
Re: (Score:2)
That is a rather creative idea. I would love to see more practical examples of what you do with it, such as the Sigma example.
Re: (Score:2)
Part of the problem is that much of math education is based on rote memorization rather than problem solving, and writing a program to do your homework for you is frowned upon and often considered cheating. If writing programs to solve your math homework was generally accepted as legitimate, we would have both fewer kids flunk out of math and more kids going into programming, as those kids would be actually learning how both math and programming are used in the real world.
Re: (Score:2)
I think you really nailed it. I would upvote you if I had mod points today.
Re:yet another programming language (Score:5, Insightful)
Being primarily a mathematician and not a computer scientist or engineer, I have used Maple, Mathematica, and R. At one point I knew Pascal and C. I've dabbled in Python.
Of all these programming languages, Mathematica was BY FAR the easiest language for me to learn to use. The way it does certain things makes so much more sense to me than the others--for example, how it handles functions and lists. Unlike C, it's a high-level language if you want it to be, although you aren't forced to use it in that way. Pattern matching is extremely powerful. And the syntax is totally unambiguous; brackets define functions, braces define lists, and parentheses are used only for algebraic grouping of terms.
The major criticism I have of Mathematica is that it is comparatively slow, mainly because of its lack of assumptions regarding the nature of the inputs. Internally, it tries to preserve numerical precision, it works with arbitrary precision arithmetic, and it doesn't assume values are machine precision. All this comes at a cost. Also, reading other people's code can be remarkably difficult, even if it's commented. The tendency is to write functions that do a lot of complicated things in one command, so code can be remarkably dense.
Most recently, I have had to learn how to use R, due to its abundance of statistical algorithms, many of which have not been implemented in Mathematica. There was a simple example where I tried to calculate a Bayes factor, and the expression was something like (1 - x)/(1 - y), where x and y were very small positive numbers, somewhere around the order of 10^-15. This calculation totally failed in R--the answer given was 1. Mathematica correctly calculated the ratio. Maybe I don't know enough about R to know how to preserve the necessary numerical precision, but it sort of shows that in Mathematica, such issues are handled automatically; moreover, if there is a potential problem, Mathematica warns you.
Anyway, this is all just personal opinion, really. The takeaway for me is that I see a lot of evidence that Stephen Wolfram is pretty good at designing computer languages for specific purposes. Yes, he's totally egocentric, but there's no denying that he is brilliant. When Wolfram | Alpha debuted, I remember thinking how totally stupid it was. And now...every single high school and college math student knows about it. It is one of the most ingenious marketing ploys I have ever seen. And the scary thing is, it keeps improving. It's almost Star Trek-like in its ability to parse natural language input. And I think that's the eventual direction that computer programming will evolve towards. Programs will not be written in code, but instead, as broad sentences, parsed by an AI which automatically performs the high-level task.
Re: (Score:2)
It's almost Star Trek-like in its ability to parse natural language input. And I think that's the eventual direction that computer programming will evolve towards. Programs will not be written in code, but instead, as broad sentences, parsed by an AI which automatically performs the high-level task.
That is kinda how I would think of it. You make a request. The computer AI does its best to pick a starting point given what you described and starts running it. Then you explain to the computer what the AI is doing wrong in comparison to the running program. It tries again. Rinse and repeat until it has something that does everything you want it to.
From a programming point of view it is like starting with a similar project and using natural language to modify the existing program little by little.
This
Re: (Score:2)
Then once there is a prototype they could hand it to a software engineer for the parts that need optimization for final tweaking.
To use an analogy, would you cut your custom suit from an old plastic tarp using rusty garden shears and then hand it to your tailor for "final tweaking"? No? Then you'd best leave the programming to the experts who frankly don't need your "help".
Re: (Score:2)
Being primarily a mathematician and not a computer scientist or engineer I've used Maple, Mathematica, Matlab, Magma and R. I've also programmed in Python, Perl, C, and Java and dabbled in things like Lisp and Haskell.
All the "math" programs on that list are terrible programming languages; they work great as interactive environments for doing (potentially symbolic) computation, but writing code in them? Ugh. If I actually have to write scientific computing code it's going to be in Python using numpy and sym
Re: (Score:2)
| There was a simple example where I tried to calculate a Bayes factor, and the expression was something like (1 - x)/(1 - y), where x and y were very small positive numbers, somewhere around the order of 10^-15. This calculation totally failed in R--the answer given was 1.
Is this a failure, or is 1 the best approximation? Did you want (1 - x)/(1 - y) - 1 instead? Then
Mathematica is not good for numerical comp
Re: (Score:2)
As I implied in my previous response, the answer was not supposed to be 1. I also probably didn't remember the example correctly, but my point is that I could not get the correct value in R, but Mathematica did get it.
For me, having confidence that numerical results shown are correct to within the precision displayed is more important than speed of calculation. I totally get that such things have a speed penalty. Python might be able to do it better and faster, but as I noted, my experience with that pro
Re: (Score:2)
Well, to be truthful, even from a programmer's point of view they are all lacking.
I want a language with a built-in B+Tree (stored to a file), where structs are directly accessed, where classes are handles to the heap, where there is built-in support for sorted arrays AND for hash tables, where its serialization of structs that don't contain pointers is trivial (i.e., I want such things to be able to be the data items of the B+Tree without additional work), etc.
And I want object, struct, and array persisten
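To sketch what "trivial serialization of pointer-free structs" means (a minimal illustration, not a proposal for any particular language), a record with a fixed layout can be turned into bytes and back with no extra machinery; here it is using Python's struct module:

```python
import struct

# A pointer-free record: two 32-bit ints and a 64-bit float,
# packed to a fixed little-endian layout.
FMT = "<iid"

def pack_record(a, b, x):
    return struct.pack(FMT, a, b, x)

def unpack_record(buf):
    return struct.unpack(FMT, buf)

buf = pack_record(1, 2, 3.5)
assert unpack_record(buf) == (1, 2, 3.5)
# Fixed size, so records can be stored directly as B+Tree data items
# or written to a file at computed offsets.
assert len(buf) == struct.calcsize(FMT)
```

The point of the wishlist above is that this round-trip would be automatic for any pointer-free type, rather than requiring a hand-written format string per type.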
Then why does it use the engineer's 'j' for complex numbers instead of maths' and physics' 'i'?
Because 'i' is for index, I imagine.
Re: (Score:2)
Guido is also an extremely competent C programmer (see recent Slashdot article) and he did not design Python for scientists, but rather for programmers.
Re: (Score:2)
Python is actually a good example of why adding new languages is not the answer. One of the big reasons that Python has been so embraced in scientific computing is the libraries built on top of it that are well suited to those types of tasks.
That is very true; however, they still require one to express a problem in terms of lists, sets, dicts, strings, ints, floats, and complex numbers. Not all scientific concepts can be massaged into one of those datatypes.
The Python community did a reasonably good job of grafting domain-specific functionality in via libraries that are fairly accessible to people who are not primarily programmers, while still having the general-purpose language behind it for people who are, allowing programmers and non-programmers to collaborate easily. Which is why I tend to get annoyed with the whole "let's build a new language for this domain!" thing, since all it really does is increase the barrier between fields and produce yet another custom language that needs to be learned and maintained.
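numpy is the usual example of this grafting: a genuinely domain-specific datatype (the n-dimensional array, with its own vectorized semantics) living inside ordinary Python, so the scientist's math and the programmer's control flow coexist in one file. A small sketch:

```python
import numpy as np

# Domain datatype grafted onto a general-purpose language:
# vectorized arithmetic for the numerics, plain Python around it.
samples = np.array([1.0, 2.0, 4.0, 8.0])
zscores = (samples - samples.mean()) / samples.std()

assert samples.mean() == 3.75
assert abs(zscores.mean()) < 1e-12   # standardized data is centered on 0
```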
The counter argument is that each individual domain needs its own programming language in the same sense that each individual domain needs its own jargon. Each domain has its own unique intricacies, problems, methods, and context. The tools used should reflect that.
Re: (Score:2)
Re: (Score:2)
Libraries only provide new functions and types. Go look at Matlab or (shudder) LabVIEW for some examples of domain-specific datatypes (not simply classes built on the common primitives) and paradigms.
Surely you are not suggesting that the field of particle physics should be using the same tools as the field of psychiatry? That materials engineers should be using the same tools as palaeontologists?
Re: (Score:2)
Or, for that matter, on search.
Good at PR, though.
Re: (Score:2)
I'm sure I misread that ;-)
Re: (Score:2)
imo, problem definition comes before problem solution.
Re: (Score:2)
I think the point here is that, at some point, somebody's got to translate these human-level concepts into some formal system. I think that sort of translation is AI-complete (meaning that we can't automate it without strong AI). I don't think Wolfram's going to do it.
Re: (Score:2)
"social-science abomination like R with a screwball syntax."
R is a clone of the S language, which was invented at Bell Labs. SPSS is the social-science 'abomination'.