May 4, 2010

Revolution Analytics seems very confused

Revolution Analytics is a relaunch of a company previously known as REvolution Computing, built around the open source R language. Last week they sent around email claiming they were a new company (false), and asking for briefings in connection with an embargo this morning. I talked to Revolution Analytics yesterday, and they told me the embargo had been moved to Thursday.* However, Revolution apparently neglected to tell the press the same thing, and there’s an article out today — quoting me, because I’d given quotes in line with the original embargo, before I’d had the briefing myself. And what’s all this botched timing about? Mainly, it seems to be for a “statement of direction” about software Revolution Analytics hasn’t actually developed yet.

*More precisely, they spoke as if the embargo had been Thursday all along.

Anyhow, so far as I could tell, the Revolution Analytics story is something like:

Comments

13 Responses to “Revolution Analytics seems very confused”

  1. Ken Williams on May 4th, 2010 2:22 pm

    I don’t know very much about Revolution, but I do use R a lot. One of your points above isn’t true; namely, it’s quite possible to use R on data sets larger than your RAM. You can address data that’s in a backing database (e.g. SQLite or a traditional DBMS), or use R to build mappers & reducers for Hadoop, and so on. Revolution themselves have done some good work in this area, some of which is available free.
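
    For instance (a minimal sketch; the database file, table, and column names below are hypothetical), the data can stay in a SQLite file and only the query result is pulled into R’s memory:

        # Query a backing SQLite database from R; only the aggregated
        # result set comes into RAM, never the full underlying table.
        library(DBI)
        library(RSQLite)

        con <- dbConnect(RSQLite::SQLite(), "sales.sqlite")   # hypothetical database file

        # Let the database do the heavy lifting; R just receives the summary.
        monthly <- dbGetQuery(con, "
          SELECT strftime('%Y-%m', order_date) AS month,
                 SUM(amount)                   AS revenue
          FROM   orders
          GROUP  BY month
        ")

        dbDisconnect(con)
        head(monthly)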

  2. CommeRcial R- Integration in software « DECISION STATS on May 4th, 2010 10:46 pm

    […] They probably need to hire more people now – Curt Monash, noted all-things-data software guru has the inside dope here […]

  3. Ross Ihaka on May 6th, 2010 3:00 am

    I’m one of the guys who created R. These guys are distributing GPLed software and not feeding anything back to the rights holders. Please don’t support them. Get the free version from the R-project instead and support the R Foundation.

  4. FlashInThePan on May 6th, 2010 8:51 am

    Another flash in the pan, non-production stable, premature-to-market, money-hungry, get-rich-quick off a noble open source effort company with less than 30 people to support the globe. Sure.

  5. Ken Williams on May 6th, 2010 2:38 pm

    Ross: can you be a bit more specific? There’s nothing wrong with “distributing GPLed software” so I’m guessing there’s something more that you’re not saying.

  6. Turadg Aleahmad on May 6th, 2010 4:20 pm

    I can confirm it’s not Eclipse-based. I had thought it was, but I just downloaded the academic version and it is built on Visual Studio. From the about box:

    Revolution R Enterprise powered by Visual Studio Shell.
    Copyright © 2007 Microsoft Corporation.

  7. Curt Monash on May 7th, 2010 1:18 am

    Giving a free single-server version of all their closed-source code to academics != Revolution Analytics giving NOTHING back.

    But it does give me pause when an open source commercialization company doesn’t have any core contributors as employees.

  8. Curt Monash on May 7th, 2010 1:19 am

    @Turadg,

    Good find. Thanks!

    Perhaps it’s not coincidental that Revolution development is in Seattle?

  9. Bob Muenchen on May 7th, 2010 7:59 pm

    The Revolution Analytics web site (http://revolutionanalytics.com/aboutus/leadership.php) lists the following as a member of their board of directors:

    “Robert Gentleman has been an innovator in the statistical computing world for more than 20 years. In 1996, he and fellow University of Auckland (NZ) professor Ross Ihaka released the “R” statistical computing language as a free software package after five years of development.”

    It appears the two creators of R don’t agree on this topic.

  10. Curt Monash on May 8th, 2010 9:04 pm

    @Ken,

    But see Michael Wexler’s comment on http://www.dbms2.com/2010/05/07/in-database-sas-teradata-netezza-aster/

    To what do you attribute this difference in perception?

    Thanks,

    CAM

  11. Ken Williams on May 10th, 2010 10:58 am

    @Curt,

    Well, the way people *usually* use R is to import their data sets into in-memory data structures. They start loading bigger & bigger files this way, then hit memory limits, and say “R can’t handle big files”.

    Handling big data requires different methods; see, for example, the ‘ff’, ‘biglm’, ‘bigglm’, and ‘sqldf’ packages, and in general anything on http://cran.r-project.org/web/views/HighPerformanceComputing.html .
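
    As a rough sketch of that incremental, chunked style (purely illustrative: the built-in mtcars data split into pieces stands in for a large file read chunk by chunk, and the formula is arbitrary), ‘biglm’ lets a model be fit one chunk at a time:

        # Fit a linear model incrementally with biglm: start from one chunk,
        # then update() with each further chunk; only the model's summary
        # statistics stay in memory, never the whole data set.
        library(biglm)

        # Pretend the data arrives in four chunks, as a big file would.
        chunks <- split(mtcars, rep(1:4, length.out = nrow(mtcars)))

        fit <- biglm(mpg ~ wt + hp, data = chunks[[1]])   # fit on the first chunk
        for (chunk in chunks[-1]) {
          fit <- update(fit, chunk)                       # fold in each remaining chunk
        }
        summary(fit)   # essentially the same coefficients as lm() on all of mtcars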

  12. Curt Monash on May 11th, 2010 6:13 am

    @Ken,

    Is there any serious downside to that stuff?
