Python Gets a Big Data Boost From DARPA
itwbennett writes "DARPA (the U.S. Defense Advanced Research Projects Agency) has awarded $3 million to software provider Continuum Analytics to help fund the development of Python's data processing and visualization capabilities for big data jobs. The money will go toward developing new techniques for data analysis and for visually portraying large, multi-dimensional data sets. The work aims to extend beyond the capabilities offered by the NumPy and SciPy Python libraries, which are widely used by programmers for mathematical and scientific calculations, respectively. The work is part of DARPA's XData research program, a four-year, $100 million effort to give the Defense Department and other U.S. government agencies tools to work with large amounts of sensor data and other forms of big data."
Great. Just Great (Score:1, Insightful)
The work is part of DARPA's XData research program, a four-year, $100 million effort to give the Defense Department and other U.S. government agencies tools to work with large amounts of sensor data and other forms of big data.
Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.
For that matter anybody who trusts the govt and thinks the govt is your friend is pretty damn naive. Yeah I would like to believe that too. No I won't ignore the mountains of evidence to the contrary. I won't treat all the counterexamples as isolated cases. I see them for what they are: an amazingly consistent pattern. The rule, not the exception. Govt positions are really attractive to sociopath types who just love power and control and a feeling that they are important and they get that feeling by imposing their will on us.
Re:Great. Just Great (Score:5, Insightful)
Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.
For that matter anybody who trusts the govt and thinks the govt is your friend is pretty damn naive. Yeah I would like to believe that too. No I won't ignore the mountains of evidence to the contrary. I won't treat all the counterexamples as isolated cases. I see them for what they are: an amazingly consistent pattern. The rule, not the exception. Govt positions are really attractive to sociopath types who just love power and control and a feeling that they are important and they get that feeling by imposing their will on us.
So what you are saying is that DARPA funds will be used in a way to further the goals of DARPA/The government? Shocking. I haven't read anything that says which agencies will/won't have access to these tools - so I'd hazard a guess that any department that wants it can have it (including the famous three letter agencies).
FYI, Continuum Analytics is a company based on providing high-performance Python-based computing to clients. Any packages they might release will either be open source (and can be checked), or closed source (in which case you don't have to use them). They aren't hijacking the Numpy/Scipy libraries. They are developing libraries/tools for a client (who happens to be DARPA). (Frankly, I'd hope that Continuum Analytics open sources their development because it might be useful to the larger community). You do know that DARPA funds also go to improving robotics, that they supported the ARPANET, and that a lot of their space programs later got transferred to NASA?
Basically, I have no idea what you are ranting about. One government organization funded a project - it happens all the time. Do you rant about NSF/NIH/NASA money as well? If so, you'd better live in a cave - a lot of government sponsored research has gone into almost every modern convenience that we take for granted.
Re: (Score:2)
Poe's Law. [wikipedia.org]
In /., you never know.
Re:Great. Just Great (Score:5, Informative)
Frankly, I'd hope that Continuum Analytics open sources their development because it might be useful to the larger community
Open sourcing is a requirement of the XDATA program.
Re: (Score:2)
You have no idea what he's talking about? It was pretty clear: factions within the US government want these tools to datamine all the ISP data they have been snarfing up so they can spy on everyone in the world. Saying that you believe otherwise is a pretty extreme view.
He has no idea why there is ranting about open source code that everyone in the world can use for any purpose. Did you rant about git being open source? I'm betting the gov't can use that to manage code related to data mining. Do you rant about Postgres or any of the other databases used by the US gov't? Would Postgres suddenly become evil because the gov't threw some money its way?
Re: (Score:2)
Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.
Isn't this true of all useful open source projects?
I get the impression that (Score:5, Interesting)
Re: (Score:2)
I think you're right.
I love Ruby, it's a very fun and effective language, I could write it in my sleep but there are so many cool projects that are written in Python.
Those languages are *very* similar, and it's a shame that so much effort is being divided between communities.
I might get to learn Python one day but I'm afraid I'd become a so-so programmer in both languages.
Re:I get the impression that (Score:5, Interesting)
> I might get to learn Python one day but I'm afraid I'd become a so-so programmer in both languages.
I empathize, since conversely I only barely use Ruby. Once someone learns one of these languages, there is not that much that the other offers. But happily, one need not learn advanced Python to benefit from these projects.
> it's a shame that so much effort is being divided between communities
AFAIK, all scientific funding from the US and Europe is, and always has been, directed to Python, not Ruby. So Python is firmly established as a research language and there is not much effort being divided with Ruby (which seems to have a much spottier and more amateur movement in this direction), at least as far as scientific stuff is concerned (Ruby is more popular on the web app side). For me the tension for scientific use is not between Python and Ruby, but between Python and R. The Python community is replicating a lot of R functionality these days, but R still has a much bigger lead in science libraries. Happily, it is quite easy to call R from Python.
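For illustration, a minimal sketch of calling R from Python with rpy2 (assuming rpy2 and R are installed; the R functions used are standard base R, and the numbers are made up):

    from rpy2 import robjects

    # Run a snippet of R code and get the result back as an R vector
    samples = robjects.r('rnorm(100, mean = 0, sd = 1)')

    # Look up an R function by name and call it from Python
    t_test = robjects.r['t.test']
    result = t_test(robjects.FloatVector([1.2, 0.8, 1.1, 0.9, 1.3]), mu=1.0)
    print(result)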
Re: (Score:2)
I think I disagree. I think that it's great that both communities exist and each can develop languages in ways unconstrained by the particular historical choices that shaped the other languages (and that, in both cases, each has subcommunities around part
GIL is a non-issue. (Score:2)
I think you're right. I love Ruby, it's a very fun and effective language, I could write it in my sleep but there are so many cool projects that are written in Python. Those languages are *very* similar, and it's a shame that so much effort is being divided between communities. I might get to learn Python one day but I'm afraid I'd become a so-so programmer in both languages.
Both languages suffer from the global interpreter lock defect and will require a rewrite in the next 5-10 years if the languages have any chance of surviving on servers.
Gee, because there are no distributed enterprise solutions written in Python or Ruby <rolls eyes/>
It will take some very serious, dedicated, low level work and I just don't see it happening.
It already has happened. The solutions aren't just in the mainstream versions, though. Take Jython. On a typical JVM, it is the fastest Python in-the-trenches implementation available. Throw that over specialized Java-focused hardware (like the Azul Vega 3), and you are on fire.
Furthermore, a solution to the GIL problem is not necessary in the general case. In any modern system, the cost of communication between nodes dominates anyway, so per-process parallelism and message passing get you further than shared-memory threads.
Re: (Score:2)
Java/JEE never shines. It is total crap.
That's an invective, not an argument. Now go back and finish your programming homework.
Re: (Score:2)
It would be easier to get some of that DARPA money sent over to Pynie [bitbucket.org] and it will all run on Parrot (multithreaded and stable as of last month, apparently). Then you will be able to call Perl6 and Befunge when you get tired of indenting all the time (ducks)
Re: (Score:2)
Both languages suffer from the global interpreter lock defect and will require a rewrite in the next 5-10 years if the languages have any chance of surviving on servers.
You don't really understand big data if you think it needs to run on ONE computer.
This is only a problem if you think threading is the solution to scaling CPU computations across hundreds of computers. If you generalize your code to run on hundreds of computers, there is no reason you can't run one process per core on your multicore machines.
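As a rough sketch of the process-per-core approach (standard library only; each worker is its own process, so no single interpreter's GIL serializes the work):

    from multiprocessing import Pool, cpu_count

    def crunch(chunk):
        # CPU-bound work on one chunk, running in a separate process
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = list(range(1000000))
        workers = cpu_count()
        size = len(data) // workers + 1
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        pool = Pool(workers)
        total = sum(pool.map(crunch, chunks))
        pool.close()
        pool.join()
        print(total)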
Re: (Score:2)
I used to do "big data" and "cloud" computing when it was called clusters.
Did you run one process with multiple threads across all of those machines, or was threading less of an issue once you started thinking about distributed computing?
I can say this with certainty: Anything other than a compiled language with low level facilities is a pure waste of time and money.
Isn't that what Numba does? Compiling Python code using LLVM and being able to understand numpy data structures? I'm still not sure I understand what threading has to do with this. The OP said threading was an issue, but threading doesn't determine whether the code ends up compiled.
While with Java you at least get some safety for big projects
Safety? Job security?
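For reference, a minimal sketch of what Numba (one of the Continuum projects funded here, per the thread below) looks like in practice, assuming numba and numpy are installed:

    import numpy as np
    from numba import jit

    @jit(nopython=True)
    def sum_of_squares(a):
        # Ordinary-looking Python loop, compiled to machine code via LLVM;
        # Numba understands NumPy arrays natively, so elements aren't boxed.
        total = 0.0
        for i in range(a.shape[0]):
            total += a[i] * a[i]
        return total

    x = np.random.rand(1000000)
    print(sum_of_squares(x))  # first call compiles; later calls run at near-C speed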
Neither LANGUAGE has a GIL (Score:2)
No, they don't. The CPython and MRI/YARV implementations of Python and Ruby, respectively, have global interpreter locks, but those are implementation quirks, not language features. On the Python side, IronPython and Jython don't have a GIL; on the Ruby side, neither JRuby, MacRuby, IronRuby nor Rubinius does (the latter being particularly important)
Re: (Score:3)
Why would Fortran be any faster than any other compiled language?
Re: (Score:2)
Because the language is simpler, the compiler can make assumptions and generate better automatic optimizations. C/C++ are much harder to optimize (i.e., to generate optimal assembly instructions for).
Re: (Score:3)
FORTRAN does arrays in a way that's slightly easier for the compiler to optimise. But some modern techniques and data structures are much harder to do in FORTRAN compared to C++. It is also quite easy to call C, C++ or FORTRAN functions from Python.
Writing a loop in Python is slow. If you express that loop as a numpy array operation you get a substantial part of the way towards C speed, and if you use numexpr you can get something faster than a simple C version.
Processing big data is as much about moving the data around, and
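To make the loop-vs-array point above concrete, a minimal sketch comparing the three styles (assumes numpy and numexpr are installed; numexpr's edge comes from evaluating the whole expression in one multithreaded, cache-blocked pass):

    import numpy as np
    import numexpr as ne

    a = np.random.rand(1000000)
    b = np.random.rand(1000000)

    # Slow baseline: an explicit Python-level loop over every element
    slow = [2.0 * x + 3.0 * y for x, y in zip(a, b)]

    # Much faster: one vectorized NumPy expression (the loops run in C)
    vec = 2.0 * a + 3.0 * b

    # Often faster still: numexpr evaluates the expression in a single pass
    best = ne.evaluate("2.0 * a + 3.0 * b")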
Re: (Score:2)
If by expressing things efficiently you mean easy for the programmer to write, then you're wrong. What matters (doubly so for big data) is full control over the machine's resources, i.e. how data is laid out in memory, good control over I/O, etc. While this has always been the key to fast performance
Re:I get the impression that (Score:5, Interesting)
Compared to plain old Python, yes. But Cython offers a lot of capabilities that improve speed dramatically - just declaring types for your data in Cython gives programs a wonderful boost in speed.
As someone who uses Matlab for most of my programming, I have come to detest languages that do not force you to specify variable types and/or declare variables. Matlab offers neither, but it is a standard in some circles.
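A rough sketch of what that typing looks like, using Cython's "pure Python" mode so the file stays valid Python (assumes the cython package is installed; compiled with cythonize the loop becomes C, while run uncompiled the decorators are harmless no-ops):

    import cython

    @cython.locals(n=cython.int, i=cython.int, total=cython.double)
    def harmonic(n):
        # The type declarations above let Cython emit a tight C loop;
        # without them, every iteration goes through Python object machinery.
        total = 0.0
        for i in range(1, n + 1):
            total += 1.0 / i
        return total

    print(harmonic(10000000))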
Re:I get the impression that (Score:5, Insightful)
You're probably right, but you're also missing the point. Most scientists are not programmers who specialise in numerical methods and software optimisation. Just getting something that does what they want is hard enough for them, which is why they use high-level languages like Matlab and R. If things are too slow, they learn to rewrite their computations in matrix form, so that they get deferred to the built-in linear algebra function libraries (which are written in C or Fortran), which usually gets them to within an order of magnitude of these low-level languages.
If that still isn't good enough, they can either 1) choose a smaller data set and limit the scope of their investigations until things fit, 2) buy or rent a (virtual) machine with more CPU and more memory, or 3) hire a programmer to re-implement everything in a low-level language so that it can run in parallel on a cluster. The third option is rarely chosen, because it's expensive, good programmers are difficult to find, and in the course of research the software will have to be updated often as the research question and hypotheses evolve (scientific programming is like rapid prototyping, not like software engineering), which makes option 3) even more expensive and time-consuming.
So yes, operational weather forecasts and big well-funded projects that can afford to use it will continue to use Fortran and benefit from faster software. But for run-of-the-mill science, in which the data sets are currently growing rapidly, having a freely available "proper" programming language that is capable of relatively efficiently processing gigabytes of data while being easy enough to learn for an ordinary computer user is a godsend. R and Matlab and clones aren't it, but Python is pretty close, and this new library would be a welcome addition for many people.
Re:I get the impression that (Score:5, Insightful)
Which is exactly why FORTRAN is an excellent choice for them instead of something else fast (close to assembler) like C/C++, and why so many of the top fluid dynamics models continue to use it. It is simple (perhaps a function of its age) and because of that it is simple to do things like break up the calculation for MPI or tell the compiler to "vectorize this" or "automatically make it multi-threaded" in a way which is still a long way from maturity for other languages.
Can you guess which language MATLAB was originally written in? You know that funny row,column order on indexes? Any ideas on the history of that?
R is great and all, and is brilliant in its niche, but how's that RAM limitation thing going? It's not a solution for everything.
MATLAB is pretty good too, as are Octave and Scilab, and it has gotten a whole lot faster recently, but ever try much disk I/O or array resizing for something which couldn't be vectorized? It becomes slow as molasses.
heh. I don't think you know these people.
Many problems are I/O limited and require real machines with high speed low latency network traffic. VMs just don't cut it for many parallelized tasks which need to pass messages quickly.
Forgive me if I'm wrong, but your post sounds a bit like you think you're pretty good on the old computers, but don't know the first thing about FORTRAN, are feeling a bit defensive about that, and are attacking something out of ignorance.
Re: (Score:2)
You're not picking on me [slashdot.org], you're arguing your point. That's what this thing here is for, so no hard feelings at all.
I'll readily admit to not knowing Fortran (or much Python! ;-)); I'm a C++ guy myself, having got there through GW-Basic, Turbo Pascal and C. I now teach an introductory programming course using Matlab (and know of its history as an easy-to-use Fortran-alike), and I use R because it's what's commonly used in my field of computational ecology. I greatly dislike R, and I'm not too hot on Matlab
Re: (Score:2)
Buying VMs on the same farm is just another way of getting access to real machines on the cheap for a limited time. It's just another way of buying time on a supercomputer now.
Re: (Score:2)
That funny row/column order in matrix indices (aka column major order) is because it's the correct mathematical order.
Consider that you can only multiply two matrices if matrix A is of size [i,j], and matrix B is of size [j,k], i.e. the number of columns in A must be equal to the number of rows in B. The product C=AB is then of size [i,k]. This works for any number of matrices, so [i,j]*[j,k]*[k,l]*[l,m] is valid, and gives [i,m].
This naturally leads to the indexing you see in Fortran and Matlab, because i
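As a quick worked example of that dimension rule (NumPy used only for illustration):

    import numpy as np

    A = np.ones((2, 3))   # 2 rows, 3 columns
    B = np.ones((3, 4))   # 3 rows, 4 columns: columns of A match rows of B
    C = A.dot(B)          # valid product, shape (2, 4)
    print(C.shape)        # -> (2, 4)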
Re: (Score:2)
I don't mean for this to be pick-on-LourensV day, but I have another small nit to pick. You're presuming operational weather forecasting is well funded? I don't think funding has anything to do with it. Often it's what the original author knew that chose the language.
And have you seen what's been done to NOAA's budget over the last decade?? Well funded.
Re: (Score:2)
Perhaps he means it's well funded in the sense that they have dedicated programmers at all. "Run of the mill" science is done by investigating scientists or their jack-of-all-trades research assistants, collaborators or grads/post-docs, etc., most of whom are unlikely to have substantial software engineering experience or training in their background.
Nonetheless, they write code - very useful, productive code - but it's in whatever tool or high-level language is popular among their peers/discipline (Matlab, R,
Re: (Score:2)
You lost me at the ``Most scientists are not programmers...'' schtick. Whether it was my Mechanical Engineering professors fluent in Ada, C, Fortran, C++ or Pascal, or my EE professors in the same, or my Mathematics professors in the same, not a single CS professor could hold a candle to them, unless we started dicking around with LISP, Smalltalk or Visual Basic for shits and giggles. In fact, they became proficient in these languages because they had to write custom software to model nonlinear dynamic systems
Re: (Score:2)
I resorted to using Fortran and MATLAB for visualisation. So I managed to learn basic Fortran over the weekend and then used it to write a working program for a cluster, all within a week's time. I just don't think I could have done that with Python.
Python with SciPy is a lot like MATLAB. Python, the language, is far superior to MATLAB's language; I hate 1-based array indexing, for example. MATLAB's language does have a few special features for matrices that Python lacks, but that is just syntactic sugar (there are functions to do everything in Python).
Re: (Score:2)
Sadly, with the exception of a few times where I get to sum an array, pretty much my whole model needs to be run in a fast language like C or Fortran (I use C, my supervisor uses F77). It's the kind of model (a spatial stochastic disease simulation) that doesn't really lend itself to coding up in Python. No matrices, just lots of little bits of data interrogation, calculating one event at a time, and so many loops (unavoidable) that it would just crawl in Python. If you try to start in Python and replace th
Re: (Score:2)
Thanks for the link. My problem is that there isn't any one bit you can point to and say "that's the slow bit" (unless it's telling the code which parameters to use, varying the parameters, and then graphing the results when done -- I'm currently doing those parts with bash and Octave, and to be fair I would probably be better off doing both of those in Python).
The main work is the simulation, and it's where I've got a trivially small amount of data (say a 20x20 lattice of sites containing the number of susceptible and infected individuals at each site)
Re: (Score:2)
Python with SciPy is a lot like MATLAB. Python, the language, is far superior to MATLAB's language; I hate 1-based array indexing, for example. MATLAB's language does have a few special features for matrices that Python lacks, but that is just syntactic sugar (there are functions to do everything in Python).
Even as a MATLAB user I agree, as long as we're strictly talking about the language. Many of GNU Octave's woes (though they're getting JIT now!) can be blamed on the poor language design.
But there are many things that SciPy doesn't have. Yes, MATLAB is an unnecessarily expensive choice for data analysis, but my employer only uses it for that (not "big data," mind you) because it's already our design tool, so it's an ideal rapid prototyping environment. That's where it really shines: Simulink and code generation.
Re: (Score:2)
PyPy [pypy.org] might change that in the future, especially with the Transactional Memory [pypy.org] branch.
Re: (Score:2)
You're dead wrong, nothing quite beats Fortran in speed when it comes to number crunching. If you need to go through hundreds of gigabytes of data and performance is important there's only one realistic choice: Fortran. Python isn't fit to run on a large cluster to simulate things, too much overhead. And let's not forget what sort of efficiency you can get if you use a good compiler (Intel Composer). You won't find Fortran on the way out over here, it's here to stay!
Isn't that the point of DARPA funding this project - to make it so Python is fit to run on a large cluster to simulate things? I do agree, though, that Fortran is here to stay. However, it is so specialized in what it does that a complete solution often requires multiple languages to get the task accomplished.
Back in the day (1970s) I had a professor who would say that you can write anything in anything. For instance you could write a business app in Fortran and you can use COBOL for plotting trajectories
Re: (Score:3)
The core processing in SciPy/NumPy is done in compiled C or Fortran libraries (LAPACK is used extensively where available), not in Python.
I'm unaware of a widely-used interpreted version of Python. Whether Python is byte-compiled (CPython), JIT'd (Psyco, PyPy, IronPython, many Jython stacks), or compiled ahead of time to machine code (Jython+gcj, ShedSkin) depends on which Python implementation you're talking about.
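For the curious, you can watch CPython's byte-compilation step directly with the standard library's dis module (a small sketch, nothing assumed beyond CPython itself):

    import dis

    def f(x):
        return x * x + 1

    # CPython compiles f to bytecode once; its virtual machine then
    # interprets that bytecode each time f is called.
    dis.dis(f)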
Re: (Score:2)
Most of those are still interpreted. Compiling to bytecode doesn't mean it's not interpreted. In fact, even your CPU interprets complex instructions and executes them using a set of simple instructions in a lot of cases (yay for RISC/CISC hybrids).
Okay, then Fortran's an interpreted language too. What was the point of your original post, then?
Moving the goalposts like this in the middle of a conversation is pointless--sure, there's a semi-rational definition under which x86 assembler is an interpreted language
Re: (Score:2)
Yes, but Python is still an interpreted language and very slow compared to Fortran.
Nope, it's not. Never was.
Re: (Score:2)
Sorry, but you're wrong: it is. Or did you forget where the PYC files come from? You might want to read the official Python documentation on this one: http://docs.python.org/3/glossary.html [python.org]. Go to "interpreted" in case you're too lazy to find it yourself. And by the definition we use over at the electronics department, Python is an interpreted language no matter what you wish to claim.
You're conflating implementations with languages.
Not every Python implementation even has .pyc files. When I co
good lord no. (Score:2)
Well.. there's C, of course...
I work with C and C++ on a daily basis, and I have to ask/answer: For parallelized scientific computation or data crunching? No thank you. You don't use a Phillips screwdriver to unscrew a hexagonal bolt, do you? Know your tools, their strengths and limitations.
Re: (Score:2)
I disagree; Fortran and C are very good for parallel scientific computations. If you are doing computations you care about speed, and the closer you are to the iron (and the OS), the faster it runs and the more work you can do per unit of time. You have nice tools like OpenMP, UPC, Cilk and MPI, etc. POSIX SHM is the best for local IPC/RPC.
Python may be a nice language to work with, but it is a slow dog.
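For balance, Python can drive the same MPI machinery through mpi4py; a minimal sketch (assumes mpi4py, numpy, and an MPI implementation are installed, launched with something like mpirun -n 4 python sum.py):

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Each rank computes a partial sum over its own slice of the problem
    local = np.arange(rank, 10000000, size, dtype="float64")
    partial = local.sum()

    # Combine the partial results on rank 0
    total = comm.reduce(partial, op=MPI.SUM, root=0)
    if rank == 0:
        print("total:", total)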
Re: (Score:2)
No one seems to be pining away for Fortran programmers. At least not much anyways. A quick 'n' dirty search on Dice.com yields 46 results (and no doubt a few are duplicates).
Re: (Score:2, Informative)
I guess the problem is that people who speak about Fortran actually think about FORTRAN. The last FORTRAN standard was from 1977, and that shows. After that, there had been no new standard and little new development until the Fortran 90 standard (note the different capitalization). Fortran 90 got rid of the old punch-card-based restrictions by giving it completely new, much more reasonable code parsing rules (it still accepts old-form code for backwards compatibility, but you cannot mix both forms in one file
Re:I get the impression that (Score:4, Informative)
The entire point of Fortran is that it has difficult-to-deal-with aliasing rules that make the compiler more free to produce optimized code. That's why it is suitable for things that require every last bit of performance you can wring out of it. Today probably you can get the same thing with C or C++ provided you are prepared to use things like restrict, but it used to be you couldn't, so Fortran ruled certain topics.
Python is an easy-to-use system with abysmal performance - expect 10-100x slowdown for code that runs in pure Python over a similar C version. If you can get things set up so Python is only gluing other C components together and the data never has to touch native Python data structures or loops, then performance will be fine, but now you aren't really coding in Python any more.
The point is, the purpose of Fortran and the purpose of Python are entirely opposed. They are exactly the opposite of each other. So it boggles the mind how you can think that Python can be Fortran "done right". So much so that now I suspect I got trolled. Well done, sir.
Yes I understand, and many people made the same point. However, Fortran was for a lot of scientists and engineers the hammer to crack any nut. It was used for simple "try outs" where performance wasn't needed, simply because it was the language that engineers knew. I think the same thing is happening with Python now; it is the first and sometimes only language that many engineers know. Now for the performance issue: it will not give the best performance, but packages like SciPy and NumPy do give very good performance (arguably by using these libraries you are just using Python to string C functions together, but it is properly integrated). Tests show that you are getting about a third of the performance of Fortran [nasa.gov] (with the exception of the Fortran DGEMM matrix multiply, which greatly outperforms the Python and other Fortran variants). The typical engineering reaction to performance needs is to throw hardware at the problem, then optimise your algorithm, and only change language if absolutely necessary!
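If you want to see the "pure Python loop vs. library call" gap on your own machine, a minimal timing sketch (numbers will vary; only numpy and the standard library are assumed):

    import timeit

    setup = "import numpy as np; x = np.random.rand(1000000); xs = list(x)"

    # Pure-Python loop over a list: every element is a boxed float object
    py_loop = timeit.timeit("sum(v * v for v in xs)", setup=setup, number=10)

    # The same reduction done entirely inside NumPy's compiled code
    np_dot = timeit.timeit("np.dot(x, x)", setup=setup, number=10)

    print("pure Python loop:", py_loop)
    print("numpy dot:      ", np_dot)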
Re: (Score:2)
I don't think that being easy is python's main advantage. Using a dynamic environment where you can
Matlab (Score:1)
Bye-bye Matlab. I liked your plotting capabilities, but that was about it.
Re: (Score:2)
matplotlib already does this in conjunction with NumPy and SciPy - its plotting quality and flexibility compare favourably to Matlab's.
Its biggest drawback is that it is pretty glacial even by Matlab's standards when rendering large datasets (think millions of points). I'm not sure whether matplotlib or the interactive backend is at fault, but anything DARPA can do to improve the situation would be welcome.
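Until that improves, the usual workaround is to thin the data before it reaches the renderer; a rough sketch (assumes numpy and matplotlib, and plain stride-based decimation rather than anything statistically clever):

    import numpy as np
    import matplotlib.pyplot as plt

    t = np.linspace(0, 1, 5000000)
    y = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)

    # A screen only has a few thousand horizontal pixels, so plotting
    # millions of points mostly burns CPU; decimate to ~100k points first.
    step = max(1, t.size // 100000)

    fig, ax = plt.subplots()
    ax.plot(t[::step], y[::step], rasterized=True)
    fig.savefig("big_series.png", dpi=150)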
Re: (Score:2)
Still nothing for Simulink.
Re: (Score:2)
Sage doesn't do anything Simulink does.
Python 2 or 3? (Score:4, Interesting)
So is this going to focus on Python 2 or 3? Might be a reason to upgrade..
Re:Python 2 or 3? (Score:5, Informative)
Both. The prebuilt "Anaconda" distro defaults to Python 2.7, but it also works with 3.3 and 2.6.
Wrong language (Score:5, Funny)
They put the money in the wrong place. They should have put it into R, which very popularly interfaces with Python.
Re:Wrong language (Score:4, Informative)
DARPA runs a lot of these research seed programs, putting a couple of million dollars into a bunch of different but related research projects. In this case the program budget is $100 million in total, and Continuum got $3 million for their Python work (Numba, Blaze, etc). Some of the program money may have gone to R as well; there are a couple of dozen research groups, but I don't have a full list.
Re: (Score:2)
Wow, I hope not. As much as I am actually a Ruby fan at heart, and as much as I appreciate the R community and everything R has done, it always seems much easier to write slow and/or memory-intensive R code than in Python. Perhaps I never quite spent enough time with it, but there are many corners to the language which seem unnecessarily tedious. And no references - variables are all copied around the place, which is expensive. I know, I know... worrying about pass-by-value and efficiency of assignment statements
Re: (Score:3)
R is a statistical programming language. It has lots of neat methods and functions implemented, and rules the world of statistical analysis... which is kinda cool, since it's also open source.
It sits pretty much halfway between Matlab and Python. It's pretty usable and convenient because of the huge library, but as a programming language it just, well, sucks. Building up the objects some of the methods there need, if you get data from an un
Re: (Score:2)
Others have complained about limitations of R in this very thread, so it doesn't seem as cut-and-dried as you make it out to be. Python is the popular language of this particular fifteen-minute period, so it's the logical choice to put the effort into. Scientists would like to benefit from language popularity too.
Re: (Score:2)
Perhaps. After all, it is in the nature of companies to ask for as much money as possible for as little work as possible.
Good news for the Python community (Score:3, Funny)
As a full time Python developer for going on 6 years, this is good to hear! Now if we can get a Python-lite to replace Javascript in the browser.
Re: (Score:2)
Yeah, the issue is that Python is pretty hard to sandbox, being the hugely dynamic language it is. I imagine it would take a lot to get the browsers to stop working on their JavaScript implementations that they have sunk insane amounts of time and effort into, and start something brand new.
Trust me, I'd love to see it happen, but I don't think it will.
depends (Score:2)
Yeah, the issue is that Python is pretty hard to sandbox, being the hugely dynamic language it is.
Forgive me, but JavaScript is also hugely dynamic. How does this prevent effective sandboxing in the general sense?
I imagine it would take a lot to get the browsers to stop working on their JavaScript implementations that they have sunk insane amounts of time and effort into, and start something brand new.
Another solution is to program in a subset of Python that gets verified at compile time with additional restrictions, and then compiled into JavaScript (the way CoffeeScript does.) That way we re-capture the investment already made in browser-side JavaScript technology.
Trust me, I'd love to see it happen, but I don't think it will.
That sounds more like a solution looking for a problem. No need to reinvent the browser VM wheel. Reuse what's there to greater effect.
So... (Score:3)
So, they're porting R and Perl PDL to Python, then?
There is also Pandas (Score:2)
Pandas http://pandas.pydata.org/ [pydata.org] is another great tool for data analysis. It uses NumPy and is highly optimized, with critical code paths written in C.
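A tiny sketch of the kind of analysis Pandas makes trivial (the file name and column names here are made up for illustration; assumes pandas is installed):

    import pandas as pd

    # Hypothetical CSV of sensor readings: timestamp, sensor_id, value
    df = pd.read_csv("readings.csv", parse_dates=["timestamp"])

    # Group-by and aggregation run in optimized compiled code paths
    summary = df.groupby("sensor_id")["value"].agg(["mean", "std", "count"])
    print(summary.head())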
Python? (Score:3)
Re: (Score:2, Insightful)
Okay, look. I used Octave for a long time on Linux and on Windows. On Linux (Ubuntu) it generally worked rather well and I used it for classwork where possible. On Windows, it works well as long as you don't need to plot anything. I can't tell you the number of times I installed/uninstalled various versions of Octave on Windows to find out that the plotting was broken in some way. MATLAB is great until you run into licensing issues.
Then I found out about the combination of IPython/Numpy/Scipy/Matplotlib
110 reasons to pick Python over Matlab (Score:2)
Re: (Score:2)
Have they heard of Matlab?
Have you heard of SciPy?
I predict that a tipping point is coming, and after we reach that tipping point, Matlab will become a legacy language and all the new projects will be SciPy.
Right now Matlab is benefiting from the network effect: everyone uses Matlab because everyone uses Matlab. It's the standard; you expect to see everyone using it in certain industries. But it's a proprietary product controlled by a single company that is doing its best to extract maximum revenue from it.
Me
Re: (Score:2)
grab one of the other million and 1/2 great open source mathematical packages
Okay, why?
The scientific community is already coalescing around SciPy. You are arguing that DARPA should send money to anything but SciPy but you didn't give a reason.
Re: (Score:2)
The scientific community is already coalescing around SciPy.
Maybe some of it is, but I'm going to bet the vast majority aren't even touching Python. I would never use Python for scientific computing; it's not designed for it, simply put. Sure, you can do light scientific computing in SciPy, maybe even some more advanced functions, but if Python has to go balls to the wall, it simply won't measure up!
So if you're looking for a system that can handle all your big data and your large storage, why not look towards a system which can handle most of it out of the box, that's
YAY (Score:3)
Now China can win!
Big Data != Analytics (Score:3)
The summary and article seem to conflate Big Data with Analytics. These days the two often go together, but it's quite possible to have either one without the other. Big Data is "more data than can fit on one machine", and analytics means "applying statistics to data". E.g. many Big Data projects start out as "capture now, analyze a year or two from now," and maybe just do simple counts in the interim, which is not "analytics". And of course, many useful analytics take place in the sub-terabyte range.
The irony with this story is that Python is useful for in-memory processing, and not "Big Data" per se. To process "Big Data" typically requires (today, based on available tools, not inherent language advantages) JVM-based tools, namely Hadoop or GridGain, and distributed data processing tasks on those platforms require Java or Scala. Both of those platforms leverage the uniformity of the JVM to launch distributed processes across a heterogeneous set of computers.
The real use case here is that one first reduces Big Data using the JVM platform, and only then, once it fits into the RAM of a single workstation, uses Python, R, etc. to analyze the reduced data. So typically, yes, these Python libraries will be used in Big Data scenarios, but pedantically, analytics doesn't require Big Data and Python isn't even capable (generally, based on today's tools) of processing raw Big Data.
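One caveat to that: Hadoop's Streaming interface does let Python scripts act as the map and reduce stages themselves, since any executable that reads stdin and writes tab-separated key/value lines to stdout will do. A minimal mapper sketch (the field layout is hypothetical; a companion reducer would sum the counts per key, and both scripts get passed to the hadoop-streaming jar):

    #!/usr/bin/env python
    # mapper.py: emit "<key>\t1" for every input record
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            # Treat the first column as the grouping key (hypothetical layout)
            print("%s\t1" % fields[0])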
Imagine the research if we took all lobbying (Score:2)