
Python Gets a Big Data Boost From DARPA

itwbennett writes "DARPA (the U.S. Defense Advanced Research Projects Agency) has awarded $3 million to software provider Continuum Analytics to help fund the development of Python's data processing and visualization capabilities for big data jobs. The money will go toward developing new techniques for data analysis and for visually portraying large, multi-dimensional data sets. The work aims to extend beyond the capabilities offered by the NumPy and SciPy Python libraries, which are widely used by programmers for mathematical and scientific calculations, respectively. The work is part of DARPA's XData research program, a four-year, $100 million effort to give the Defense Department and other U.S. government agencies tools to work with large amounts of sensor data and other forms of big data."
  • Great. Just Great (Score:1, Insightful)

    by Anonymous Coward

    The work is part of DARPA's XData research program, a four-year, $100 million effort to give the Defense Department and other U.S. government agencies tools to work with large amounts of sensor data and other forms of big data.

    Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.

    For that matter anybody who trusts the govt and thinks the govt is your friend is pretty damn naive. Yeah I would like to believe that too. No I won't ignore the mountains of evidence to the contrary. I won't treat all the counterexamples as isolated cases. I see them for what they are: an amazingly consistent pattern. The rule, not the exception. Govt positions are really attractive to sociopath types who just love power and control and a feeling that they are important and they get that feeling by imposing their will on us.

    • by Kwyj1b0 ( 2757125 ) on Wednesday February 06, 2013 @04:07AM (#42806151)

      Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.

      For that matter anybody who trusts the govt and thinks the govt is your friend is pretty damn naive. Yeah I would like to believe that too. No I won't ignore the mountains of evidence to the contrary. I won't treat all the counterexamples as isolated cases. I see them for what they are: an amazingly consistent pattern. The rule, not the exception. Govt positions are really attractive to sociopath types who just love power and control and a feeling that they are important and they get that feeling by imposing their will on us.

      So what you are saying is that DARPA funds will be used in a way that furthers the goals of DARPA/the government? Shocking. I haven't read anything that says which agencies will/won't have access to these tools - so I'd hazard a guess that any department that wants it can have it (including the famous three-letter agencies).

      FYI, Continuum Analytics is a company that is based on providing high-performance Python-based computing to clients. Any packages they might release will either be open source (and can be checked) or closed source (in which case you don't have to use them). They aren't hijacking the NumPy/SciPy libraries. They are developing libraries/tools for a client (who happens to be DARPA). (Frankly, I'd hope that Continuum Analytics open sources their development because it might be useful to the larger community.) You do know that DARPA funds also go to improve robotics, that they supported ARPANET, and that a lot of their space programs later got transferred to NASA?

      Basically, I have no idea what you are ranting about. One government organization funded a project - it happens all the time. Do you rant about NSF/NIH/NASA money as well? If so, you'd better live in a cave - a lot of government-sponsored research has gone into almost every modern convenience that we take for granted.

      • by Anonymous Coward on Wednesday February 06, 2013 @06:27AM (#42806621)
        What is this ARPANET thing? It sounds like some useless crap-loaded acronym to me.
      • Re:Great. Just Great (Score:5, Informative)

        by sdaug ( 681230 ) on Wednesday February 06, 2013 @08:54AM (#42807227)

        Frankly, I'd hope that Continuum Analytics open sources their development because it might be useful to the larger community

        Open sourcing is a requirement of the XDATA program.

    • Yeah the govt needs better systems to manage the huge databases and dossiers they are building on everybody with their warrantless wiretaps and reading everybody's emails. Anybody who helps with this project is pretty damn naive if they don't think it will also be used for this.

      Isn't this true of all useful open source projects?

  • by Chrisq ( 894406 ) on Wednesday February 06, 2013 @03:35AM (#42806023)
    I get the impression that in the Engineering and Scientific community Python is the new Fortran. I hope so, because it would be "Fortran done right".
    • I think you're right.
      I love Ruby, it's a very fun and effective language, I could write it in my sleep but there are so many cool projects that are written in Python.
      Those languages are *very* similar, and it's a shame that so much effort is being divided between communities.
      I might get to learn Python one day but I'm afraid I'd become a so-so programmer in both languages.

      • by jma05 ( 897351 ) on Wednesday February 06, 2013 @04:16AM (#42806185)

        > I might get to learn Python one day but I'm afraid I'd become a so-so programmer in both languages.

        I empathize; conversely, I only barely use Ruby. Once someone learns one of these languages, there is not that much that the other offers. But happily, one need not learn advanced Python to benefit from these projects.

        > it's a shame that so much effort is being divided between communities

        AFAIK, all scientific funding from the US and Europe is/was always directed to Python, not Ruby. So Python is firmly established as a research language, and there is not much effort being divided with Ruby (which seems to have a much spottier and more amateur movement in this direction), at least as far as scientific stuff is concerned (Ruby is more popular on the web app side). For me the tension for scientific use is not between Python and Ruby, but between Python and R. The Python community is replicating a lot of R functionality these days, but R still has a big lead in science libraries. Happily, it is quite easy to call R from Python, as the sketch below shows.
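        As a concrete taste of that last point, here is a minimal sketch of calling R from Python through the rpy2 bridge (this assumes rpy2 and an R installation are available; the data values are made up):

            import rpy2.robjects as robjects

            # Build an R vector from Python data and call R's built-in t.test.
            data = robjects.FloatVector([1.2, 3.4, 2.2, 5.6])
            result = robjects.r['t.test'](data, mu=3.0)

            # Pull the p-value back out of the R result object.
            print(result.rx2('p.value')[0])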

      • I love Ruby, it's a very fun and effective language, I could write it in my sleep but there are so many cool projects that are written in Python. Those languages are *very* similar, and it's a shame that so much effort is being divided between communities.

        I think I disagree. I think that it's great that both communities exist and each can develop languages in ways unconstrained by the particular historical choices that shaped the other languages (and that, in both cases, each has subcommunities around part

    • by solidraven ( 1633185 ) on Wednesday February 06, 2013 @04:18AM (#42806195)
      You're dead wrong; nothing quite beats Fortran in speed when it comes to number crunching. If you need to go through hundreds of gigabytes of data and performance is important, there's only one realistic choice: Fortran. Python isn't fit to run on a large cluster to simulate things; too much overhead. And let's not forget what sort of efficiency you can get if you use a good compiler (Intel Composer). You won't find Fortran on the way out over here; it's here to stay!
      • by ctid ( 449118 )

        Why would Fortran be any faster than any other compiled language?

        • by Anonymous Coward on Wednesday February 06, 2013 @04:46AM (#42806261)
          Short answer, Fortran has stricter aliasing rules so the compiler has more optimization opportunities. Long answer, see Stack Overflow [stackoverflow.com].
        • Why would Fortran be any faster than any other compiled language?

          Because the language is simpler, the compiler can make more assumptions and generate better automatic optimizations. C/C++ are much harder to optimize (i.e., to generate optimal assembly instructions for).

      • by ssam ( 2723487 )

        FORTRAN does arrays in a way that's slightly easier for the compiler to optimise. But some modern techniques and data structures are much harder to do in FORTRAN compared to C++. It is also quite easy to call C, C++ or FORTRAN functions from Python.

        Writing a loop in Python is slow. If you express that loop as a NumPy array operation, you get a substantial way towards C speed. If you use numexpr, you can get something faster than a simple C version (sketched below).

        Processing big data is as much about moving the data around, and minimising latency in this movement, as the raw processing speed. So a language that lets you express things efficiently will win in the end.
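        A minimal sketch of that progression (the last step assumes the numexpr package is installed; actual timings vary by machine):

            import numpy as np
            import numexpr as ne

            a = np.random.rand(10000000)
            b = np.random.rand(10000000)

            # A pure-Python loop over ten million elements would crawl:
            #   c = [a[i] * b[i] + 2.0 * a[i] for i in range(len(a))]

            # The same expression as a NumPy array operation runs near C speed,
            # but allocates a temporary array for each intermediate result:
            c = a * b + 2.0 * a

            # numexpr evaluates the whole expression in cache-sized, multithreaded
            # blocks, avoiding the temporaries; this is where it can beat naive C:
            c2 = ne.evaluate("a * b + 2.0 * a")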

          Processing big data is as much about moving the data around, and minimising latency in this movement, as the raw processing speed. So a language that lets you express things efficiently will win in the end.

          If by expressing things efficiently you mean easy for the programmer to write, then you're wrong. What matters (doubly so for big data) is full control over the machine's resources, i.e. how data is laid out in memory, good control over I/O, etc. While this has always been the key to fast performan

      • by Kwyj1b0 ( 2757125 ) on Wednesday February 06, 2013 @05:14AM (#42806355)

        Compared to plain old Python, yes. But Cython offers a lot of capabilities that improve speed dramatically; just declaring a type for your data in Cython gives programs a wonderful boost in speed.

        As someone who uses Matlab for most of my programming, I have come to detest languages that do not force specifying a variable type and/or declaring variables. Matlab offers neither, but it is a standard in some circles.

      • by LourensV ( 856614 ) on Wednesday February 06, 2013 @06:06AM (#42806555)

        You're probably right, but you're also missing the point. Most scientists are not programmers who specialise in numerical methods and software optimisation. Just getting something that does what they want is hard enough for them, which is why they use high-level languages like Matlab and R. If things are too slow, they learn to rewrite their computations in matrix form, so that they get deferred to the built-in linear algebra function libraries (which are written in C or Fortran), which usually gets them to within an order of magnitude of these low-level languages.

        If that still isn't good enough, they can either 1) choose a smaller data set and limit the scope of their investigations until things fit, 2) buy or rent a (virtual) machine with more CPU and more memory, or 3) hire a programmer to re-implement everything in a low-level language so that it can run in parallel on a cluster. The third option is rarely chosen, because it's expensive, good programmers are difficult to find, and in the course of research the software will have to be updated often as the research question and hypotheses evolve (scientific programming is like rapid prototyping, not like software engineering), which makes option 3) even more expensive and time-consuming.

        So yes, operational weather forecasts and big well-funded projects that can afford to use it will continue to use Fortran and benefit from faster software. But for run-of-the-mill science, in which the data sets are currently growing rapidly, having a freely available "proper" programming language that is capable of relatively efficiently processing gigabytes of data while being easy enough to learn for an ordinary computer user is a godsend. R and Matlab and clones aren't it, but Python is pretty close, and this new library would be a welcome addition for many people.

        • by nadaou ( 535365 ) on Wednesday February 06, 2013 @07:41AM (#42806929) Homepage

          You're probably right, but you're also missing the point. Most scientists are not programmers who specialise in numerical methods and software optimisation.

          Which is exactly why FORTRAN is an excellent choice for them instead of something else fast (close to assembler) like C/C++, and why so many of the top fluid dynamics models continue to use it. It is simple (perhaps a function of its age), and because of that it is simple to do things like break up the calculation for MPI or tell the compiler to "vectorize this" or "automatically make it multi-threaded" in a way which is still a long way from maturity for other languages.

          Can you guess which language MATLAB was originally written in? You know that funny row, column order on indexes? Any ideas on the history of that?

          R is great and all, and is brilliant in its niche, but how's that RAM limitation thing going? It's not a solution for everything.

          MATLAB is pretty good too, as are Octave and SciLab, and it has gotten a whole lot faster recently, but ever try much disk I/O or array resizing for something which couldn't be vectorized? It becomes slow as molasses.

          If that still isn't good enough, they can either 1) choose a smaller data set and limit the scope of their investigations until things fit,

          heh. I don't think you know these people.

          2) buy or rent a (virtual) machine with more CPU and more memory,

          Many problems are I/O limited and require real machines with high speed low latency network traffic. VMs just don't cut it for many parallelized tasks which need to pass messages quickly.

          Forgive me if I'm wrong, but your post sounds a bit like you think you're pretty good on the old computers, but don't know the first thing about FORTRAN, are feeling a bit defensive about that, and are attacking something out of ignorance.

          • You're not picking on me [slashdot.org], you're arguing your point. That's what this thing here is for, so no hard feelings at all.

            I'll readily admit to not knowing Fortran (or much Python! ;-)); I'm a C++ guy myself, having got there through GW-Basic, Turbo Pascal and C. I now teach an introductory programming course using Matlab (and know of its history as an easy-to-use Fortran-alike), and I use R because it's what's commonly used in my field of computational ecology. I greatly dislike R, and I'm not too hot on Matlab

          • by gl4ss ( 559668 )

            Buying VMs on the same farm is just another way of getting access to real machines on the cheap for a limited time. It's just another way of buying time on a supercomputer.

          • That funny row/column order in matrix indices (aka column major order) is because it's the correct mathematical order.

            Consider that you can only multiply two matrices if matrix A is of size [i,j] and matrix B is of size [j,k], i.e. the number of columns in A must be equal to the number of rows in B. The product C=AB is then of size [i,k]. This works for any number of matrices, so [i,j]*[j,k]*[k,l]*[l,m] is valid, and gives [i,m].

            This naturally leads to the indexing you see in Fortran and Matlab, because i
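            A quick NumPy illustration of the shape rule and of Fortran-style column-major layout:

                import numpy as np

                A = np.arange(6).reshape(2, 3)    # 2x3
                B = np.arange(12).reshape(3, 4)   # 3x4

                # Columns of A (3) match rows of B (3), so the product is 2x4.
                C = np.dot(A, B)
                print(C.shape)                    # (2, 4)

                # NumPy defaults to C (row-major) layout but also supports the
                # column-major layout that Fortran and Matlab use internally:
                F = np.asfortranarray(A)
                print(A.flags['C_CONTIGUOUS'], F.flags['F_CONTIGUOUS'])   # True True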

        • by nadaou ( 535365 )

          So yes, operational weather forecasts and big well-funded projects that can afford to use it will continue to use Fortran and benefit from faster software.

          I don't mean for this to be pick-on-LourensV day, but I have another small nit to pick. You're presuming operational weather forecasting is well funded? I don't think funding has anything to do with it. Often the language was chosen simply because it's what the original author knew.
          And have you seen what's been done to NOAA's budget over the last decade? Well funded, indeed.

          • by csirac ( 574795 )

            Perhaps he means it's well funded in the sense that they have dedicated programmers at all. "Run of the mill" science is done by investigating scientists or their jack-of-all-trades research assistants, collaborators, or grads/post-docs, etc., most of whom are unlikely to have substantial software engineering experience or training in their background.

            Nonetheless, they write code - very useful, productive code - but it's in whatever tool or high-level language popular among their peers/discipline (matlab, R,

        • by tyrione ( 134248 )

          You lost me at the ``Most scientists are not programmers...'' schtick. Whether it was my Mechanical Engineering professors fluent in Ada, C, Fortran, C++ or Pascal, or my EE professors in the same, to my Mathematics professors all in the same, not a single CS professor could hold a candle to them, unless we started dicking around with LISP, Smalltalk or Visual Basic for shits and giggles. In fact, they became proficient in these languages because they had to write custom software to model nonlinear-dynamic system

        • I disagree partially with what you said based on personal experience. As an EE student I had to learn to use Fortran for my thesis. I needed to run a large EM simulation and not a single affordable commercial program was able to run on a small cluster of computers that was available. So I resorted to using Fortran and MATLAB for visualisation. So I managed to learn basic Fortran over the weekend and then use it to write a working program for a cluster, all within 1 week time. I just don't think I could have done that with Python.
          • by steveha ( 103154 )

            I resorted to using Fortran and MATLAB for visualisation. So I managed to learn basic Fortran over the weekend and then use it to write a working program for a cluster, all within 1 week time. I just don't think I could have done that with Python.

            Python with SciPy is a lot like MATLAB. Python, the language, is far superior to MATLAB's language; I hate 1-based array indexing, for example. MATLAB's language does have a few special features for matrices that Python lacks, but that is just syntactic sugar (there are functions to do everything in Python).
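            To make the indexing difference concrete, a quick plain-NumPy illustration:

                import numpy as np

                a = np.array([10, 20, 30, 40])
                print(a[0])      # 0-based: the first element (Matlab's a(1))
                print(a[-1])     # negative indices count back from the end
                print(a[1:3])    # half-open slice: elements 1 and 2 -> [20 30]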

            • Sadly, with the exception of a few times where I get to sum an array, pretty much my whole model needs to be run in a fast language like C or Fortran (I use C, my supervisor uses F77). It's the kind of model (a spatial stochastic disease simulation) that doesn't really lend itself to coding up in Python. No matrices, just lots of little bits of data interrogation, calculating one event at a time, and so many loops (unavoidable) that it would just crawl in Python. If you try to start in Python and replace th

            • Python with SciPy is a lot like MATLAB. Python, the language, is far superior to MATLAB's language; I hate 1-based array indexing, for example. MATLAB's language does have a few special features for matrices that Python lacks, but that is just syntactic sugar (there are functions to do everything in Python).

              Even as a MATLAB user I agree, as long as we're strictly talking about the language. Many of GNU Octave's woes (though they're getting JIT now!) can be blamed on the poor language design.

              But there are many things that SciPy doesn't have. Yes, MATLAB is an unnecessarily expensive choice for data analysis, but my employer only uses it for that (not "big data," mind you) because it's already our design tool, so it's an ideal rapid prototyping environment. That's where it really shines: Simulink and code gen

            • I doubt SciPy would have been as easy to expand for running on a cluster. These sorts of things come naturally to Fortran. Additionally, if I write my code in Fortran the compiler can optimize it a lot further than Python will allow me to. Hence the speed advantage is still in Fortran's hands, which is important if you don't have access to the latest hardware and time on large clusters.
      • PyPy [pypy.org] might change that in the future, especially with the Transactional Memory [pypy.org] branch.

      • You're dead wrong; nothing quite beats Fortran in speed when it comes to number crunching. If you need to go through hundreds of gigabytes of data and performance is important, there's only one realistic choice: Fortran. Python isn't fit to run on a large cluster to simulate things; too much overhead. And let's not forget what sort of efficiency you can get if you use a good compiler (Intel Composer). You won't find Fortran on the way out over here; it's here to stay!

        Isn't that the point of DARPA funding this project - to make Python fit to run on a large cluster to simulate things? I do agree, though, that Fortran is here to stay. However, it is so specialized in what it does that a solution often requires multiple languages to get the task accomplished.

        Back in the day (1970s) I had a professor who would say that you can write anything in anything. For instance, you could write a business app in Fortran, and you could use COBOL for plotting trajectories.

        • Sure you can; any language that has a full feature set can do any task that the system is capable of. But efficiency is also important, and Fortran simply has so many advantages over Python. Complex data structures aren't needed for most simulations, while they make optimisation so much harder. Additionally, interpretation is a serious bottleneck.
    • by SpzToid ( 869795 )

      No one seems to be pining away for Fortran programmers. At least not much, anyway. A quick 'n' dirty search on Dice.com yields 46 results (and no doubt a few are duplicates).

  • by Anonymous Coward

    Bye-bye Matlab. I liked your plotting capabilities, but that was about it.

    • matplotlib already does this in conjunction with NumPy and SciPy - its plotting quality and flexibility compare favourably to Matlab.

      Its biggest drawback is that it is pretty glacial even by Matlab's standards when rendering large datasets (think millions of points). I'm not sure whether matplotlib or the interactive backend is at fault, but anything DARPA can do to improve the situation would be welcome.
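      One common workaround is to rasterize the point cloud rather than keep millions of vector primitives alive; a minimal sketch (this helps file size and redraw cost, though it is no claim about fixing the interactive backend):

          import numpy as np
          import matplotlib.pyplot as plt

          x = np.random.randn(2000000)
          y = np.random.randn(2000000)

          fig, ax = plt.subplots()
          # ',' draws single-pixel markers; rasterized=True flattens the cloud
          # into a bitmap instead of two million vector objects.
          ax.plot(x, y, ',', rasterized=True, alpha=0.3)
          fig.savefig('cloud.png', dpi=150)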

    • Still nothing for Simulink.

  • Python 2 or 3? (Score:4, Interesting)

    by toQDuj ( 806112 ) on Wednesday February 06, 2013 @03:51AM (#42806095) Homepage Journal

    So is this going to focus on Python 2 or 3? Might be a reason to upgrade...

  • by Dishwasha ( 125561 ) on Wednesday February 06, 2013 @04:03AM (#42806135)

    They put the money in the wrong place. They should have put it into R, which already interfaces with Python quite popularly.

    • Re:Wrong language (Score:4, Informative)

      by SQL Error ( 16383 ) on Wednesday February 06, 2013 @05:09AM (#42806337)

      DARPA runs a lot of these research seed programs, putting a couple of million dollars into a bunch of different but related research projects. In this case the program budget is $100 million in total, and Continuum got $3 million for their Python work (Numba, Blaze, etc.). Some of the program money may have gone to R as well; there are a couple of dozen research groups, but I don't have a full list.
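      For the curious, a minimal sketch of the kind of thing Numba does: JIT-compiling a plain-Python numeric loop to machine code (the function below is a made-up toy):

          import numpy as np
          from numba import jit

          @jit(nopython=True)
          def sum_of_squares(a):
              # Compiled on first call; the explicit loop then runs at
              # native speed instead of interpreter speed.
              total = 0.0
              for i in range(a.shape[0]):
                  total += a[i] * a[i]
              return total

          print(sum_of_squares(np.arange(1000000.0)))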

    • by csirac ( 574795 )

      Wow, I hope not. As much as I am actually a Ruby fan at heart; and as much as I appreciate the R community and everything R has done, it always seems much easier to write slow and/or memory-intensive R code than in Python. Perhaps I never quite spent enough time with it but there are many corners to the language which seem unnecessarily tedious. And no references - variables are all copied around the place, which is expensive. I know, I know... worrying about pass-by-value and efficiency of assignment state
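      To make the copy-vs-reference contrast concrete, a tiny Python illustration (R, by contrast, behaves as if every assignment makes a copy):

          a = [1, 2, 3]
          b = a                  # no copy: b is another name for the same list
          b.append(4)
          print(a)               # [1, 2, 3, 4]

          import copy
          c = copy.deepcopy(a)   # an actual copy has to be requested explicitly
          c.append(5)
          print(a)               # still [1, 2, 3, 4]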

      • by hyfe ( 641811 )
        http://en.wikipedia.org/wiki/R_(programming_language) [wikipedia.org]

        R is a statistical programming language. It has lots of neat methods and functions implemented, and it rules the world of statistical analysis... which is kinda cool, since it's also open source.

        It sits pretty much halfway between Matlab and Python. It's pretty usable and convenient because of the huge library, but as a programming language it just, well, sucks balls. Building up the objects some of the methods there need, if you get data from an un

    • Others have complained about limitations of R in this very thread, so it doesn't seem as cut-and-dried as you make it out to be. Python is the popular language of this particular fifteen-minute period, so it's the logical choice to put the effort into. Scientists would like to benefit from language popularity too.

  • by kauaidiver ( 779239 ) on Wednesday February 06, 2013 @04:04AM (#42806145)

    As a full-time Python developer for going on 6 years, this is good to hear! Now if we can get a Python-lite to replace JavaScript in the browser.

    • Yeah, the issue is that Python is pretty hard to sandbox, being the hugely dynamic language it is. I imagine it would take a lot to get the browsers to stop working on their JavaScript implementations that they have sunk insane amounts of time and effort into, and start something brand new.

      Trust me, I'd love to see it happen, but I don't think it will.

      • Yeah, the issue is that Python is pretty hard to sandbox, being the hugely dynamic language it is.

        Forgive me, but JavaScript is also hugely dynamic. How does this prevent effective sandboxing in the general sense?

        I imagine it would take a lot to get the browsers to stop working on their JavaScript implementations that they have sunk insane amounts of time and effort into, and start something brand new.

        Another solution is to program in a subset of Python that gets verified at compile time with additional restrictions, and is then compiled into JavaScript (the way CoffeeScript does). That way we recapture the investment already made in browser-side JavaScript technology.

        Trust me, I'd love to see it happen, but I don't think it will.

        That sounds more like a solution looking for a problem. No need to reinvent the browser VM wheel. Reuse what's there to greate

      • It could easily be done but there are too many people who are heavily invested in JS being broken.
  • by CAIMLAS ( 41445 ) on Wednesday February 06, 2013 @04:37AM (#42806237)

    So, they're porting R and Perl PDL to Python, then?

  • Pandas http://pandas.pydata.org/ [pydata.org] is another great tool for data analysis. It uses NumPy and is highly optimized, with critical code paths written in C.
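    A small taste of the pandas workflow (the column names and data here are invented for illustration):

        import numpy as np
        import pandas as pd

        df = pd.DataFrame({'sensor': np.random.choice(['a', 'b', 'c'], size=1000),
                           'value': np.random.randn(1000)})

        # Grouped aggregation runs through pandas' optimized C/Cython paths.
        print(df.groupby('sensor')['value'].agg(['mean', 'std', 'count']))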

  • It's strange that this article focused on Python and Continuum when there is a much bigger story to be had. The XDATA program is being run in a very open source manner, and there will be a multitude of open source tools created and delivered by the end of the contract. The program is focusing on two major tasks: the analytics/algorithmic tools to process big data; and the visualization/interaction tools that go along with them.
  • by Murdoch5 ( 1563847 ) on Wednesday February 06, 2013 @08:42AM (#42807179) Homepage
    Have they heard of Matlab?
    • Have you heard of Open Source?
      • Fine, then use Octave or one of the other open source mathematical packages. The issue is that they want to adapt a system instead of using an existing one.
        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Okay, look. I used Octave for a long time on Linux and on Windows. On Linux (Ubuntu) it generally worked rather well and I used it for classwork where possible. On Windows, it works well as long as you don't need to plot anything. I can't tell you the number of times I installed/uninstalled various versions of Octave on Windows to find out that the plotting was broken in some way. MATLAB is great until you run into licensing issues.

          Then I found out about the combination of IPython/Numpy/Scipy/Matplotlib

          • by naroom ( 1560139 )
            Thanks! As a scientist looking to switch away from Matlab, this was really informative! Somebody get this guy some mod points :)
      • Well, it's group-voted, so it's not like I can argue with the list. However, that being said, Matlab or any mathematical computing language is still better suited for big data. The lack of skill of a programmer should never be blamed on the language; that's an easy way out.
        • by naroom ( 1560139 )
          You may not be familiar with SciPy / NumPy. They are the scientific computing side of Python. They support matrix operations and linear algebra at least as well as Matlab does. Underneath, both NumPy and Matlab are just LAPACK anyway. Here, have a relevant wiki article [wikipedia.org].
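          For example, a dense solve in NumPy goes straight down to LAPACK, the same family of routines Matlab calls:

              import numpy as np

              # Solve Ax = b; numpy.linalg hands this to LAPACK under the hood.
              A = np.random.rand(500, 500)
              b = np.random.rand(500)
              x = np.linalg.solve(A, b)

              print(np.allclose(np.dot(A, x), b))   # True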
    • Proprietary languages, lol.
    • by steveha ( 103154 )

      Have they heard of Matlab?

      Have you heard of SciPy?

      I predict that a tipping point is coming, and after we reach that tipping point, Matlab will become a legacy language and all the new projects will use SciPy.

      Right now Matlab is benefiting from the network effect: everyone uses Matlab because everyone uses Matlab. It's the standard, you expect to see everyone using it in certain industries. But it's a proprietary product controlled by a single company that is doing its best to extract maximum revenue from it.

      Me

      • Fine, then grab one of the other million-and-a-half great open source mathematical packages and run with it. Basically, a ton of money is being sunk into something that could be solved by moving platforms. Open or not, there is software which fulfills the need, and for $100 million you can get a lot of anything.
        • by steveha ( 103154 )

          grab one of the other million and 1/2 great open source mathematical packages

          Okay, why?

          The scientific community is already coalescing around SciPy. You are arguing that DARPA should send money to anything but SciPy but you didn't give a reason.

          • The scientific community is already coalescing around SciPy.

            Maybe some of it is, but I'm going to bet the vast majority aren't even touching Python. I would never use Python for scientific computing; it's not designed for it, simply put. Sure, you can do light scientific computing in SciPy, maybe even some more advanced functions, but if Python has to go balls to the wall, it simply won't measure up!

            So if you're looking for a system that can handle all your big data and your large storage, why not look towards a system which can handle most of it out of the box, that's

  • by sproketboy ( 608031 ) on Wednesday February 06, 2013 @09:05AM (#42807293)

    Now China can win!

  • by michaelmalak ( 91262 ) <michael@michaelmalak.com> on Wednesday February 06, 2013 @11:15AM (#42808527) Homepage

    The summary and article seem to conflate Big Data with Analytics. These days the two often go together, but it's quite possible to have either one without the other. Big Data is "more data than can fit on one machine", and analytics means "applying statistics to data". E.g. many Big Data projects start out as "capture now, analyze a year or two from now," and maybe just do simple counts in the interim, which is not "analytics". And of course, many useful analytics take place in the sub-terabyte range.

    The irony with this story is that Python is useful for in-memory processing, and not "Big Data" per se. To process "Big Data" typically requires (today, based on available tools, not inherent language advantages) JVM-based tools, namely Hadoop or GridGain, and distributed data processing tasks on those platforms require Java or Scala. Both of those platforms leverage the uniformity of the JVM to launch distributed processes across a heterogeneous set of computers.

    The real use case here is that one first reduces Big Data using the JVM platform, and only then, once it can fit into the RAM of a single workstation, uses Python, R, etc. to analyze the reduced data. So typically, yes, these Python libraries will be used in Big Data scenarios, but pedantically, analytics doesn't require Big Data, and Python isn't even capable (generally, based on today's tools) of processing raw Big Data.
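    A sketch of that two-stage pattern, assuming an upstream Hadoop job has already written a summary small enough for one machine (the file name and column names are hypothetical):

        import pandas as pd

        # Stage 2: the JVM side has reduced terabytes of raw events to a
        # per-category summary; now it fits in RAM and Python takes over.
        summary = pd.read_csv('hdfs_export/daily_counts.csv')
        top = (summary.groupby('category')['count']
                      .sum()
                      .sort_values(ascending=False))
        print(top.head(10))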

  • cash and put it to advancing applied sciences to better the nation. We piss billions down the drain marketing to morons and yet whine about spending billions on DARPA, DoE and whatnot. This country is truly too stupid for its own well-being.
