
The Power of the R Programming Language 382

BartlebyScrivener writes "The New York Times has an article on the R programming language. The Times describes it as: 'a popular programming language used by a growing number of data analysts inside corporations and academia. It is becoming their lingua franca partly because data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models. Companies as diverse as Google, Pfizer, Merck, Bank of America, the InterContinental Hotels Group and Shell use it.'"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward

    ... most others keep thinking that M$ Excel is the silver bullet.

    Sad, but f****** true.

    • by Samschnooks ( 1415697 ) on Wednesday January 07, 2009 @07:44PM (#26366339)

      ... most others keep thinking that M$ Excel is the silver bullet.

      The folks I know who use Excel for analysis use it because it's the package everyone gets in their organization, there's a shitload of material on the web that uses Excel, there are plenty of add-ons for it (no need to reinvent the wheel), and when sharing data and analysis, everyone is familiar with it. An engineer I know who uses Excel chose it because it was the fastest way to connect to his testing equipment. R is relatively new; as more folks who know it come into the workforce, we'll see it replace Excel for the functions it is better suited for.

      • by Anonymous Coward on Wednesday January 07, 2009 @07:49PM (#26366395)

        I guess I was thinking of analysts using Excel to develop "complicated" statistical analyses. Sure, Excel is unbeatable at handling small, tabular datasets and doing basic or even fairly involved arithmetic with them.

        When it comes to doing more elaborate analysis, using Excel IS reinventing the wheel. Plus, it is IMPOSSIBLE to understand later.

        • by jcupitt65 ( 68879 ) on Thursday January 08, 2009 @08:43AM (#26370893)
          I've come across a couple of examples of inappropriate use of Excel
          • A friend worked at the UK Treasury as a statistician. One of his jobs was testing and improving the Treasury's model of the UK economy. I was impressed and asked what tools they used for this. Erm, none, it's just a huge Excel spreadsheet.
          • One of my jobs is modelling for a project using FDG-PET to investigate COPD and asthma. I was horrified to discover that these large 4D images were being analysed in ... Excel.
          • Re: (Score:3, Interesting)

            I made a living once reaching into Excel spreadsheets from Python to do math because Excel was bone slow. A process that took *three* hours in Excel took Python about 45 seconds. Maybe my loops were smarter (hint), but it was still a very, very dramatic experience. Just sayin'.
      • by jaxtherat ( 1165473 ) on Wednesday January 07, 2009 @08:14PM (#26366655) Homepage

        Sorry, but R is not relatively new; it's been around for at least 10 years. I was taught how to use R at university back in 2001, and S and later S+ (of which R is a FOSS implementation) have been around even longer, since the mid-'70s.

      • by colinrichardday ( 768814 ) <colin.day.6@hotmail.com> on Wednesday January 07, 2009 @08:32PM (#26366845)

        Has Microsoft corrected its percentile function? Or does it still put the largest datum in the 100th percentile, as well as assign fractional percentiles?

      • by zippthorne ( 748122 ) on Wednesday January 07, 2009 @09:39PM (#26367379) Journal

        Pfft. Matlab is the fastest way to connect to his testing equipment.

        Well... LabVIEW, actually, but no one in their right mind would want to actually use it. Anyway, Simulink gets you a lot of the graphical programming features if you need them.

        • Re: (Score:3, Insightful)

          by PeterBrett ( 780946 )

          Pfft. Matlab is the fastest way to connect to his testing equipment.

          One of MATLAB's few redeeming features is the Instrument Control Toolbox, especially since it works well with most of the top-end Agilent/Tektronix kit. It's nice to be able to automate acquisition and analysis of instrument data from a single environment.

      • Re: (Score:3, Informative)

        by dave1791 ( 315728 )

        >The folks I know who use Excel for analysis use it because it's the package that everyone gets in their organization, there's a shit load of material on the web that uses excel, there's plenty of add-ons for it (no need to reinvent the wheel), and when sharing data and analysis, everyone is familiar with it

        Back when I was in grad school ten years ago, Excel was the preferred data analysis tool for most physical and biological scientists I knew, even when they had high-end analysis tools installed.

      • Re: (Score:3, Funny)

        by blueskies ( 525815 )

        Why don't they just use Word if they need a database??

        http://www.neopoleon.com/home/blogs/neo/archive/2003/09/29/5458.aspx [neopoleon.com]

    • Re: (Score:3, Insightful)

      by Hatta ( 162192 )

      Do analysts who use R get better returns than those who use Excel?

  • by Anonymous Coward on Wednesday January 07, 2009 @07:39PM (#26366279)

    R!

  • popular? no (Score:5, Insightful)

    by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Wednesday January 07, 2009 @07:39PM (#26366283) Homepage Journal

    Growing in use? sure.

  • There appear to be duplicate links in the summary :)
  • by bogaboga ( 793279 ) on Wednesday January 07, 2009 @07:47PM (#26366369)

    My request to those in the know: show me some example code that does something useful. Then compare that code to code in other languages that accomplishes the same task.

    Include reasons to support the notion that the R language is [necessarily] better at what it does.

    • by transonic_shock ( 1024205 ) on Wednesday January 07, 2009 @07:58PM (#26366501) Homepage

      FTA
      "I think it addresses a niche market for high-end data analysts that want free, readily available code," said Anne H. Milley, director of technology product marketing at SAS. She adds, "We have customers who build engines for aircraft. I am happy they are not using freeware when I get on a jet.""

      Seriously, does this person know what she is talking about?

      1. Yes, CFD and Structural Analysis software is increasingly written using open source tools and run on open source OS (Linux running on clusters)

      2. SAS is not used to design any part of the aircraft.

      I have noticed SAS uses the same kind of FUD to counter R as M$ uses to counter Linux.

      • by visible.frylock ( 965768 ) on Wednesday January 07, 2009 @08:04PM (#26366555) Homepage Journal

        Seriously, does this person know what she is talking about?

        Let's see, Director of technology product marketing. I'm gonna go with a big NO.

      • by Peaquod ( 1200623 ) on Wednesday January 07, 2009 @10:49PM (#26367921)
        Yeah, that's a poorly informed comment. C is a freeware language, and it is used in virtually every embedded system on Earth, like the control system for the laser that cuts your cornea at the neighborhood LASIK shop. No doubt R is staggeringly less mature than C, but the fact that it is free has no bearing on its quality.
    • I remember once years ago freaking my colleagues out with a largish app written in R... with nary a loop anywhere.

      Actually that wasn't why I used R, just a fun addendum. The reason to use R is the huge body of statistics, data mining and graphics facilities. Superb.
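
      A tiny illustrative sketch of the kind of one-liners those facilities give you, using R's built-in iris data set rather than anything from the app above:

      data(iris)                                             # a built-in example data set
      fit <- lm(Petal.Length ~ Sepal.Length, data = iris)    # regression in one line
      summary(fit)                                           # coefficients, R-squared, p-values
      plot(Petal.Length ~ Sepal.Length, data = iris)         # scatterplot via the formula interface
      abline(fit)                                            # overlay the fitted line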

      Of course, the problem with any statistical library is you have to turn your brain on first. Nothing produces "Garbage in Garbage out" quite like statistical analysis.

      With R you tend to spend far more time thinking about why you are doing something, and what the answer means, than in, say, vanilla C/Ruby programming.

      Which is actually not a Bad Thing at all.

      The worst thing about R programming is its name. Googling for "R" turns up way too much noise and way too little signal.

      • by Anonymous Coward on Wednesday January 07, 2009 @08:29PM (#26366815)

        "The worse thing about R programming is its name. Googling for "R" turns up way to much noise and way too little signal"

        Try searching from http://rseek.org/ [rseek.org] instead of directly from Google.

      • by fm6 ( 162816 ) on Wednesday January 07, 2009 @08:47PM (#26366977) Homepage Journal

        I remember once years ago freaking my colleagues out with a largish app written in R... with nary a loop anywhere.

        That's a feature of functional languages, a class that also includes Scheme and XSLT. The basic idea is that programs should not have state, because state makes them harder to debug. A for or while loop, by definition, has state, so you have to do your iteration some other way, namely Tail Recursion [wikipedia.org].

        I suppose that makes sense, but I've never been able to teach myself to think that way. It's the main reason I never managed to get through The Wizard Book [mit.edu].
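
        An illustrative sketch of loop-free iteration in R (a made-up example; note that R does not actually optimize tail calls, so in practice the vectorized primitive is what R code reaches for):

        recsum <- function(v, acc = 0) {
            if (length(v) == 0) return(acc)     # base case: nothing left to add
            recsum(v[-1], acc + v[1])           # recurse on the tail, carrying the running total
        }
        recsum(1:10)    # 55
        sum(1:10)       # 55 -- the vectorized built-in does the same job without recursion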

      • The worst thing about R programming is its name. Googling for "R" turns up way too much noise and way too little signal.

        Yep. There are a couple of dedicated R search engines that can help with that: http://www.dangoldstein.com/search_r.html [dangoldstein.com] and http://www.rseek.org/ [rseek.org]. It may also sometimes be useful to Google on "Splus (whatever)" since most R and S+ code is pretty much interchangeable.

    • r-project.org (Score:4, Informative)

      by Kludge ( 13653 ) on Wednesday January 07, 2009 @08:02PM (#26366541)

      The language is very well documented online and the mailing lists contain thousands of examples. It is primarily for statistical analysis, and the libraries available for doing such analysis are unparalleled.

    • by Anonymous Coward on Wednesday January 07, 2009 @08:05PM (#26366579)

      It may not be "better" in the sense of "calculating stuff with higher efficiency" (i reckon you can do the same stuff in C, given the right libraries :P), but for statistical and data mining/visualization purposes it is a quite simple object-oriented functional language with many useful built-in procedures and lots of freely available packages/libraries that is simple enough for "non-programmers" and, so far, it does what i want it to do fast enough and.. it's free.

      So.. probably not the best all-purpose programming language, but fits nicely in the "statistical software environment/language" niche and, unlike SPSS et al., it's free (as in "libre", as in "everyone can independently verify your results without having to shell out cash", which is useful in academia).

      Example code:

      results <- prcomp(datamatrix)

      This does a PCA (Principal Component Analysis [wikipedia.org]) on the data contained in "datamatrix" and dumps the results into the "results" variable.

      I have no idea how I would start to code that in C, Python, etc. in a way that's remotely efficient ;)
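
      A slightly fuller sketch of the same call, with made-up data (nothing from the post above):

      datamatrix <- matrix(rnorm(100 * 5), ncol = 5)    # 100 observations of 5 made-up variables
      results <- prcomp(datamatrix, scale. = TRUE)      # PCA on the standardized columns
      summary(results)      # proportion of variance explained by each component
      head(results$x)       # the observations projected onto the principal components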

      • by Jurily ( 900488 )

        I have no idea how I would start to code that in C, Python, etc. in a way that's remotely efficient ;)

        I'd go with

        #include "prcomp.h"

        Once someone has done the algorithm for you, any programming language is easy. I think the point of the language would be whether said algorithm is orders of magnitude easier to code, represent, argue about, etc. in R than it would be in "C, Python, etc."

        • R has vector operations. Every operator works componentwise on a vector, and does the right thing with scalars. This makes vector heavy code easier and clearer to write.
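
          A minimal sketch of that componentwise behaviour (made-up numbers):

          v <- c(1, 2, 3, 4)
          w <- c(10, 20, 30, 40)
          v + w       # 11 22 33 44 -- operators act element by element
          v * 2       # 2 4 6 8     -- scalars are recycled across the whole vector
          sqrt(v)     # most built-in functions map over vectors too
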
          • Re: (Score:3, Insightful)

            by zippthorne ( 748122 )

            But we already have a language that does vectors correctly. It's called Matlab and it's based on Fortran, which I guess technically also does vectors correctly, if you want to bother to learn it.

      • by stephentyrone ( 664894 ) on Wednesday January 07, 2009 @09:16PM (#26367173)

        I have no idea how I would start to code that in C, Python, etc. in a way that's remotely efficient ;)

        How about:

        #include <clapack.h>
        dgesdd( argument list );

        This sort of thing is a feature of libraries, not an inherent advantage of one language.

    • by Keyper7 ( 1160079 ) on Wednesday January 07, 2009 @08:12PM (#26366637)

      It's been a while since I worked with it and I don't have code examples with me at the moment, but think of it as the Matlab/Octave of statistics, including the preference for "function over each row/column" instead of loops.

      Compared to other languages, R makes statistical analysis tasks easy in the same way that Matlab/Octave makes linear algebra tasks easy.
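
      A small sketch of that "function over each row/column" style, with a made-up matrix:

      m <- matrix(rnorm(20), nrow = 4)   # 4 x 5 matrix of made-up values
      apply(m, 1, mean)                  # apply mean over each row (MARGIN = 1)
      apply(m, 2, sd)                    # standard deviation of each column (MARGIN = 2)
      colMeans(m)                        # the most common cases have dedicated primitives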

      Plus, as other posts stated above, there's excellent documentation and tons of useful libraries (take a peek at the libraries available at the Debian repositories), Bioconductor being the finest example.

      Oh, and nice emacs integration. :)

    • by lt. slock ( 1123781 ) on Wednesday January 07, 2009 @08:19PM (#26366715)

      I use R a great deal. Think of it as an alternative to MATLAB or Excel, rather than to C or Perl or Lisp or whatever you like to use as a general-purpose language. Compared to MATLAB, functions are first-class objects (rather like Lisp), so you can write functions that take functions as arguments and return them as well, just as though they were simple variables. It handles vectors rather easily, and has decent plotting tools.

      # quick example

      # function which, given numerical arguments a and b, and a function g,
      # returns a new function of x
      f <- function(a, b, g){
          function(x){ a * x + g(b * x) }
      }

      f1 <- f(1, 2.5, sin)               # f1(x) = x + sin(2.5 * x)
      x <- seq(-pi, pi, l = 100)         # 100 points from -pi to pi (l abbreviates length.out)
      plot(x, f1(x), type = 'l')         # line plot of the composed function

      • Re: (Score:2, Informative)

        by lt. slock ( 1123781 )
        note that the assignment operators (as in "f <- function...") are "<-" (left angle bracket, minus sign), the R assignment operator, as shown above; the lameness filter originally turned them into bare minus signs
    • by Anonymous Coward on Wednesday January 07, 2009 @08:19PM (#26366717)

      I'm a PhD student in biostatistics at a fairly prestigious American university. We use R almost exclusively, because it is better than the other statistical software options. Reasons for its superiority: (i) it's free, (ii) it's open source, and (iii) it's considerably more powerful than Stata, SPSS, SAS, etc.

      It is true that other languages can be quicker for many tasks. Proficiency in C is desirable, but C is not geared toward statistics, whereas many built-in libraries and user-contributed packages for R implement complex methodologies.

      I'm not as versed in C as I am in R, so I can't provide a direct comparison of the languages, but I have included a sample below. It's a function that fits a simple linear model, taking the outcome data and input data (as a matrix) and a couple of other parameters as inputs. It returns a variety of values, including the model coefficients and fitted values. There is an R function that does this exact thing, but we have to do something for homework.

      lm = function(y, x, returnHat = FALSE, addInt = FALSE){
              if(addInt){
                      # prepend a column of ones for the intercept term
                      x = cbind(matrix(1, nrow(x), 1), x)
              }
              # use a range around 0, for roundoff error
              if(det(t(x) %*% x) >= -1e-5 & det(t(x) %*% x) <= 1e-5){
                      stop("x'x not invertible", call. = F)
              }

              beta = solve(t(x) %*% x) %*% t(x) %*% y      # least-squares coefficients
              sigma = as.numeric(sqrt(var(y - (x %*% beta))))
              varbeta = sigma * (solve(t(x) %*% x))
              fitted = x %*% beta
              residuals = y - fitted

              if(!returnHat){
                      output = list(beta, sigma, varbeta, fitted, residuals)
                      names(output) = c("beta", "sigma", "varbeta", "fitted", "residuals")
              }

              if(returnHat){
                      hat = x %*% solve(t(x) %*% x) %*% t(x)
                      output = list(beta, sigma, varbeta, fitted, residuals, hat)
                      names(output) = c("beta", "sigma", "varbeta", "fitted", "residuals", "hat matrix")
              }

              output
      }
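
      A hypothetical call to the function above, on simulated data (not part of the original post):

      x <- matrix(rnorm(50), ncol = 1)          # one simulated predictor
      y <- 3 + 2 * x + rnorm(50, sd = 0.5)      # outcome with known intercept and slope
      fit <- lm(y, x, addInt = TRUE)            # the homework lm() defined above, not stats::lm()
      fit$beta                                  # estimates should be near 3 and 2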

      I'd also say that I'm glad to see some press for R. It's popular in some circles, but not as accepted by companies and some academics because it is open source. The idea is that software you have to pay a licensing fee for must be more reliable because, well, you paid for it (thinking I'm sure you're familiar with).
             

    • Re: (Score:3, Interesting)

      by garcia ( 6573 )

      My request is to those that are in the know to show me some example code, that does something useful. Then later, compare that code to code from other languages to accomplish the same task.

      Would you ask someone who utilizes SAS or SPSS to do the same thing? Because that's more or less what R is -- a free version of SAS or SPSS. I work in SAS all day long and I have been planning on using R to automate some of my personal website statistics/graphing that I run regularly because I don't really like doing the

    • Re: (Score:2, Informative)

      by dookiesan ( 600840 )
      In R you can easily extract elements of an array:

      x = 1:10                          # integers from 1 to 10

      # set all even elts of x that are less than 7 to -1
      x[(x < 7) & (x %% 2 == 0)] = -1

      # y is some big array with several dimensions;
      # I and J are vectors of integers
      z = y[I, , J, , , drop = F]       # 'z' is now a sub-array (dimensions kept)

      z = y[I, 2, J, 1, ]               # now z is a sub-array with fewer dimensions

    • Re: (Score:3, Insightful)

      by gringer ( 252588 )

      Okay, I'll take you up on that... here's some code that takes in a vector of genotypes (as a factor with levels AA,AC,CC,XX), and a matrix of columns to be used for different bootstraps, and spits out a list of genotype counts for those bootstraps:


      ## matmap -- maps a vector onto a matrix of indexes to the vector
      ## (a hack to get round something that R doesn't seem to do by default)
      matmap <- function(vector.in, matrix.indices){
      res <- vector.in[matrix.indices];
      if(is.null(di

  • Oh god... cue pirate jokes.

  • Free as in beer (Score:3, Insightful)

    by visible.frylock ( 965768 ) on Wednesday January 07, 2009 @08:00PM (#26366521) Homepage Journal

    "R is a real demonstration of the power of collaboration, and I don't think you could construct something like this any other way," Mr. Ihaka said. "We could have chosen to be commercial, and we would have sold five copies of the software."

    Very true. This is what I try to explain to people when they can't understand why some software is given away gratis. Because if they charged for it, given the current attitudes of the market, they wouldn't stand a chance and wouldn't ever get any market share to begin with.

  • by enilnomi ( 797821 ) on Wednesday January 07, 2009 @08:00PM (#26366525)
    FTFA:

    She [Anne H. Milley, director of technology product marketing at SAS] adds, "We have customers who build engines for aircraft. I am happy they are not using freeware when I get on a jet."

    Good thing Boeing's not using free software for aircraft simulation tools [sourceforge.net], space station labs [nasa.gov], sub hunters [com.com], or moon rockets [popularmechanics.com] ;-)

  • FUD from SAS (Score:4, Insightful)

    by idiot900 ( 166952 ) * on Wednesday January 07, 2009 @08:13PM (#26366647)

    "I think it addresses a niche market for high-end data analysts that want free, readily available code," said Anne H. Milley, director of technology product marketing at SAS. She adds, "We have customers who build engines for aircraft. I am happy they are not using freeware when I get on a jet."

    Wow... talk about FUD. Does SAS indemnify against plane crashes?

    • Re: (Score:3, Interesting)

      by belmolis ( 702863 )

      I guess nobody told her how proprietary Excel is inferior to the libre Gnumeric: Excel has quite a few errors in its statistical functions, and when apprised of errors the Gnumeric folks fixed them quickly, while the Excel folks either never fixed them, fixed them slowly, or introduced new errors. See the report [csdassn.org] by Drexel University statistician B. D. McCullough.

  • by idiot900 ( 166952 ) * on Wednesday January 07, 2009 @08:15PM (#26366665)

    Actually it may not suck. But having used it on and off over the past few years while not being a statistics pro, I find the R language bletcherous and annoying. <- as an assignment operator?

    • Well, crap, hit Submit instead of Preview. Meant to say, <- as an assignment operator (I know = works now, but still...)? Bizarre data frame and object semantics? R is quite useful but I really dislike writing anything nontrivial in it.

    • Re: (Score:3, Informative)

      by tmoertel ( 38456 )

      The R language is optimized for writing statistical code. It's going to seem a little weird, especially if you have a traditional programming background. Once you spend some serious time writing R code, however, you will probably begin to appreciate many of the things that initially seemed odd.

      For example, consider the way R handles function calls [moertel.com]:

      • It allows you to pass function arguments by name and abbreviate the names, which is handy during live sessions when you want to call statistical routines that
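
        A quick sketch of the by-name and abbreviated argument passing being described, using the standard rnorm() function (values made up):

        rnorm(5, mean = 10, sd = 2)   # arguments passed by full name
        rnorm(5, m = 10, s = 2)       # names may be abbreviated as long as they stay unambiguous
        rnorm(5, 10, 2)               # or simply passed by position
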
  • I have high regard for Ashlee Vance and miss the podcasts he did while he was at The Register. It puzzles me that he included this old FUD chestnut. Seems like a throwback from the '90s.

    Anne H. Milley, director of technology product marketing at SAS ... adds, "We have customers who build engines for aircraft. I am happy they are not using freeware when I get on a jet."

  • From TFA:

    It is becoming their lingua franca partly because data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models.

    The "smart set" needs a such a high level lingua franca to express infinite precision financial models of no accuracy whatsoever!

  • Or at least in the context it's made out to be in this article. Isn't it a language suited mostly to statistics? For that use, I hear that it's one of the best.
  • by golodh ( 893453 ) on Wednesday January 07, 2009 @09:26PM (#26367257)
    I'll pitch in because R deserves better than the usual Slashdot cocktail of random ignorance and immature jokes.

    The R language (yes, it's a language; an interpreted language is a language too) has developed into the language of choice for statisticians (both academics and sundry statistical researchers) around the world. It is used in those cases where researchers feel the need for customized computations rather than a package like SAS or SPSS.

    R has become popular through a snowball effect and history. It started as a FOSS re-implementation-from-scratch of the "S" language designed for statistical work at Bell Labs (see http://en.wikipedia.org/wiki/S_(programming_language) [wikipedia.org]). Some academics and researchers of repute used it (the S language) because at that time (1975) it was very innovative and far better than most alternatives, and others followed. The S language gained a measure of acceptance among statisticians. Then, when R became available, the cycle intensified because of the much improved availability of the interpreter and its libraries. This cycle continued to the point that by now probably most professional statisticians use it.

    As far as I can see, the R language isn't especially sophisticated or elegant, and may strike people used to more modern languages as a bit repugnant. It does however excel in three respects:

    (a) it allows for easy access of Fortran and C library routines

    (b) it allows you to pass large blobs of data by name

    (c) it makes it easy to pass data to and from your own compiled C and Fortran routines

    The first reason is particularly important because it allows one to use, e.g., pre-compiled linear algebra packages like LAPACK, Fourier transforms, or special-function evaluations, and thereby gain execution speeds comparable to C despite being an interpreted language (just like Matlab, Octave, Scilab, Gauss, Ox and suchlike): the hard work is carried out by a compiled library routine which is made easily accessible through the interpreted language. Any algorithm needed in statistics that's available as C or Fortran code can be linked in and called without too much effort.

    The second reason is important because it slows down execution much less than any pass-by-value interpreted language would, and it allows you to change data that is passed into a function.

    The third reason is particularly important because it helps researchers be more productive. Reading in your data, examining it, graphing it, tracing outliers and cleaning them up is best done in an interactive environment in an interpreted language. Coding such things in C or Fortran is an awful waste of time, and besides, researchers aren't code-monkeys and don't enjoy coding inane for-loops to read, clean, and display data. Vector and matrix primitives are far more powerful, and usually preferable unless they are so inefficient that you have to wait for the result. However, there are times when you just need to carry out standard algorithms (linear algebra, calculation of mathematical or statistical functions) or simply time-consuming repetitive algorithms that run so much faster in a genuine compiled language. You could start out by coding the algorithm in an interpreted language to check if it's working, and then isolate the computationally expensive part and code it up in C or Fortran. Using R (or Matlab or Scilab) you can *call* the compiled subroutine, pass it your (cleaned) data, and get the result back in an environment where you can easily analyze it.
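
    A minimal, hypothetical sketch of that last step (the C entry point "my_convolve" and the file names are made up; dyn.load() and .C() are the standard R interfaces):

    # compiled once outside R with:  R CMD SHLIB convolve.c
    dyn.load("convolve.so")                    # load the shared library into the session

    x <- rnorm(1e6)                            # data prepared and cleaned interactively in R
    out <- .C("my_convolve",                   # hypothetical C entry point
              as.double(x),                    # arguments are coerced to the C types expected
              as.integer(length(x)),
              result = double(length(x)))$result   # .C returns the (possibly modified) arguments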

    That's why languages like R, Matlab, Scilab, Octave, Gauss, and Ox are so productive: you get the best of both worlds. Both the convenience, interactiveness, and terseness of a high-level interpreted language and the speed of compiled languages.

    So why R, and why not Gauss or Matlab or whatever?

    Well, part of that is cultural. If you're an econometrician you'll have been weane

    • by radtea ( 464814 ) on Wednesday January 07, 2009 @10:32PM (#26367769)

      (a) it allows for easy access of Fortran and C library routines

      (b) it allows you to pass large blobs of data by name

      (c) it makes it easy to pass data to and from your own compiled C and Fortran routines

      So, it's exactly like Python, except with an outdated 1970's syntax that was frankly pretty weird to start with :-)

      I've used R and found it useful for some of its relatively esoteric capabilities, but I now use it almost exclusively via rpy, the Python binding to R.

      Furthermore, I've been using it less in recent years as the native statistical capability of Python has continued to improve. I can appreciate that people who work strictly in data analysis could find R an appropriate tool, but as someone whose work spans multiple areas, from analysis to application design and development, R is too limiting a tool, and using it always feels a little alien and weird.

      • by verySmartApe ( 1053716 ) on Wednesday January 07, 2009 @11:12PM (#26368087)

        I second that. R is terribly useful for the wide variety of libraries available and esoteric statistical procedures. But you would *never* want to write a long/complex program in R.

        As you say, it's most convenient to work in some other language that's actually designed to be scalable, object-oriented, and easy to debug. It's usually straightforward to call R libraries when you need them. I find that python+scipy+rpy is an almost ideal environment for day-to-day scientific programming.

        • Well yes and no.

          I agree with the first part of your post: to me R is something to code in when you have to, and to keep the resulting code as short and simple as possible. If I ever had to code a real application with a GUI that needed the statistical strengths of R, I would almost certainly not use R.

          On the other hand I'd probably use Java and link to R as a server (see my other post about R and Java) instead of using Python.

  • I think we all know how well that's turned out, eh? So is that the fault of the language or programmer error?

  • by thethibs ( 882667 ) on Wednesday January 07, 2009 @11:49PM (#26368323) Homepage

    You have to play with it. As with APL you'll either love it or hate it.

    If you like the idea of a language that includes relational tables as a primitive data type, that extends most operators to do the right thing when you feed them vectors and matrices, that has linear regression and equation solving built-in, you'll probably like R.
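
    A minimal sketch of those features in action (made-up numbers, nothing from the parent post):

    d <- data.frame(x = 1:10, y = 2 * (1:10) + rnorm(10))  # a relational-table-like structure
    d$y2 <- d$y^2                 # operators work componentwise over whole columns
    fit <- lm(y ~ x, data = d)    # linear regression is built in
    coef(fit)                     # intercept and slope estimates

    A <- matrix(c(2, 1, 1, 3), nrow = 2)
    solve(A, c(1, 2))             # solve the linear system A %*% b == c(1, 2)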
