Analytic technologies
Discussion of technologies related to information query and analysis. Related subjects include:
- Business intelligence
- Data warehousing
- (in Text Technologies) Text mining
- (in The Monash Report) Data mining
- (in The Monash Report) General issues in analytic technology
Use cases for low-latency analytics
At various times I’ve noted the varying latency requirements of different analytic use cases, which can differ as much as a turtle’s speed differs from the speed of light. In particular, back when I wrote more about CEP (Complex Event Processing), I listed some applications for super-low-latency and not-so-low-latency CEP alike. Even better were some longish lists of “active data warehousing” use cases I got from Teradata in August 2009, generally focused on interactive customer response (e.g. personalization, churn prevention, upsell, antifraud) or in some cases logistics.
In the slide deck for the Teradata 6680/solid-state drive announcement, however, Teradata went in a slightly different direction. In its list of “hot data use case examples”, Teradata suggested: Read more
| Categories: Data warehousing, Teradata | 2 Comments |
Revolution Analytics update
I wasn’t too impressed when I spoke with Revolution Analytics at the time of its relaunch last year. But a conversation Thursday evening was much clearer. And I even learned some cool stuff about general predictive modeling trends (see the bottom of this post).
Revolution Analytics business and business model highlights include:
- Revolution Analytics is an open-core vendor built around the R language. That is, Revolution Analytics offers proprietary code and support, with subscription pricing, that help in the use of open source software.
- Unlike most open-core vendors I can think of, Revolution Analytics takes little responsibility for the actual open source part. Some “grants” for developing certain open source R pieces seem to be the main exception. While this has caused some hard feelings, I don’t have an accurate sense for their scope or severity.
- Revolution Analytics also sells a single-user/workstation version of its product, freely admitting that this is mainly a lead generation strategy or, in my lingo, a “break-even leader.”
- Revolution Analytics boasts around 100 customers, split about 70-30 between the workstation seeding stuff and the real server product.
- Revolution Analytics has “about” 37 employees. Headquarters are at 101 University Avenue (do I have to say in what city? 🙂 ). There are also a development office in Seattle and a sales office in New York.
- Revolution Analytics’ pricing is by size of server. “Small” servers — i.e. up to 12 cores — start at $25K/year.
- Unsurprisingly, adoption is more alongside SAS et al. than rip-and-replace.
| Categories: Health care, Investment research and trading, Open source, Parallelization, Predictive modeling and advanced analytics, Pricing, Revolution Analytics, SAS Institute | 2 Comments |
So can logistic regression be parallelized or not?
A core point in SAS’ pitch for its new MPI (Message-Passing Interface) in-memory technology seems to be that logistic regression is really important, and that shared-nothing MPP doesn’t let you parallelize it. The Mahout/Hadoop folks also seem to despair of parallelizing logistic regression.
On the other hand, Aster Data said it had parallelized logistic regression a year ago. (Slides 6-7 from a mid-2010 Aster deck may be clearer.) I’m guessing Fuzzy Logix might make a similar claim, although I’m not really sure.
What gives?
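For what it’s worth, the textbook data-parallel formulation is straightforward: in gradient-based training, each iteration computes partial gradients shard by shard and sums them, which suits shared-nothing architectures fine; the catch is that training takes many iterations, each with a synchronization step. Below is a minimal sketch of that approach (my own illustration, not any vendor’s actual implementation), with Python list slices standing in for nodes:

```python
# Minimal sketch of data-parallel logistic regression training.
# Rows are partitioned across "nodes" (here, just list slices);
# each node computes a partial gradient over its shard, and a
# coordinator sums them -- one MPP-friendly pass per iteration.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def partial_gradient(X_shard, y_shard, w):
    """Gradient contribution of one shard (runs node-locally)."""
    preds = sigmoid(X_shard @ w)
    return X_shard.T @ (preds - y_shard)

def train(shards, n_features, iters=100, lr=0.1):
    w = np.zeros(n_features)
    n = sum(len(y) for _, y in shards)
    for _ in range(iters):
        # Per-shard gradients are independent, so each pass is
        # embarrassingly parallel; only the sum needs communication.
        grad = sum(partial_gradient(X, y, w) for X, y in shards)
        w -= lr * grad / n
    return w

# Toy usage: two "nodes", each holding half the data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
print(train([(X[:500], y[:500]), (X[500:], y[500:])], 3))
```

If the objection is to that per-iteration synchronization cost rather than to the per-pass math, it would at least explain how both camps’ claims could be simultaneously defensible.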
| Categories: Aster Data, Hadoop, Parallelization, Predictive modeling and advanced analytics, SAS Institute | 45 Comments |
Comments on EMC Greenplum
I am annoyed with my former friends at Greenplum, who took umbrage at a brief sentence I wrote in October, namely “eBay has thrown out Greenplum”. Their reaction included:
- EMC Greenplum no longer uses my services.
- EMC Greenplum no longer briefs me.
- EMC Greenplum reneged on a commitment to fund an effort in the area of privacy.
The last one really hurt, because in trusting them, I put in quite a bit of effort, and discussed their promise with quite a few other people.
Short-request and analytic processing
A few years ago, I suggested that database workloads could be divided into two kinds — transactional and analytic. The advent of non-transactional NoSQL has suggested that we need a replacement term for “transactional” or “OLTP”, but finding one has been a bit difficult. Numerous tries, including high-volume simple processing, online request processing, internet request processing, network request processing, short request processing, and rapid request processing, have turned out to be imperfect, as per discussion at each of those links. But then, no category name is ever perfect anyway. I’ve finally settled on short-request processing, largely because I think it does a good job of preserving the distinction between analytic workloads and rapid-fire non-analytic ones.
The easy part of the distinction goes roughly like this (a pair of illustrative queries follows the list):
- Anything transactional or “OLTP” is short-request.
- Anything “OLAP” is analytic.
- Updates of small amounts of data are probably short-request, be they transactional or not.
- Retrievals of one or a few records in the ordinary course of update-intensive processing are probably short-request.
- Queries that return or aggregate large amounts of data — even in intermediate result sets — are probably analytic.
- Queries that would take a long time to run on a badly chosen or badly configured DBMS are probably analytic (even if they run nice and fast on your actual system).
- Analytic processes that go beyond querying or simple arithmetic are — you guessed it! — analytic.
- Anything expressed in MDX is probably analytic.
- Driving a dashboard is usually analytic.
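To make the easy part concrete, here is an illustrative pair of queries, in a minimal sketch that uses Python’s sqlite3 as a stand-in DBMS (the table and column names are hypothetical):

```python
# Illustrative only: a short-request lookup vs. an analytic
# aggregation, with SQLite standing in for the DBMS.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders"
    " (order_id INTEGER, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, i * 1.5) for i in range(1000)],
)

# Short-request: fetch one record by key, ideally in milliseconds.
one_order = conn.execute(
    "SELECT * FROM orders WHERE order_id = ?", (42,)
).fetchone()

# Analytic: scan and aggregate many rows (and, in real life,
# possibly produce large intermediate result sets).
top_customers = conn.execute(
    "SELECT customer_id, SUM(amount) AS total FROM orders"
    " GROUP BY customer_id ORDER BY total DESC LIMIT 10"
).fetchall()

print(one_order)
print(top_customers)
```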
Where the terminology gets more difficult is in a few areas of what one might call real-time or near-real-time analytics. My first takes are: Read more
| Categories: Analytic technologies, Data warehousing, MySQL, NoSQL, OLTP | 34 Comments |
Analytic performance — the persistent need for speed
Analytic DBMS and other analytic platform technologies are much faster than they used to be, both in absolute and price/performance terms. So the question naturally arises, “When is the performance enough?” My answer, to a first approximation, is “Never.” Obviously, your budget limits what you can spend on analytics, and anyhow the benefit of incremental expenditure at some point can grow quite small. But if analytic processing capabilities were infinite and free, we’d do a lot more with analytics than anybody would consider today.
I have two lines of argument supporting this view. One is application-oriented. Machine-generated data will keep growing rapidly, so using it will require ever more processing resources. Analytic growth, rah-rah-rah; company valuation, sis-boom-bah. Application areas include but are not at all limited to marketing, law enforcement, investing, logistics, resource extraction, health care, and science.
The other approach is to point out some computational areas where vastly more analytic processing resources could be used than are available today. Consider, if you will, statistical modeling, graph analytics, optimization, and stochastic planning. Read more
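To give one concrete flavor of that bottomless appetite: Monte Carlo simulation, a staple of stochastic planning, has error that shrinks only with the square root of the sample count, so each extra digit of accuracy costs roughly 100X the compute. A toy sketch:

```python
# Toy illustration of why stochastic methods can absorb arbitrary
# compute: Monte Carlo error shrinks roughly as 1/sqrt(n), so each
# additional digit of accuracy costs about 100x more samples.
import math
import random

def estimate_pi(n_samples):
    hits = sum(
        1
        for _ in range(n_samples)
        if random.random() ** 2 + random.random() ** 2 <= 1.0
    )
    return 4.0 * hits / n_samples

for n in (1_000, 100_000, 10_000_000):
    est = estimate_pi(n)
    print(f"n={n:>10,}  estimate={est:.6f}  error={abs(est - math.pi):.6f}")
```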
| Categories: Analytic technologies, RDF and graphs | Leave a Comment |
Hadapt (commercialized HadoopDB)
Hadapt, the company commercializing HadoopDB, is finally launching, albeit with code rewritten from scratch. As you may recall, the core idea of HadoopDB is to put a DBMS on every node, and use MapReduce to talk to the whole database. The idea is to get the same SQL/MapReduce integration as you get if you use Hive, but with much better performance* and perhaps somewhat better SQL functionality.** Advantages vs. a DBMS-based analytic platform that includes MapReduce — e.g. Aster Data — are less clear. Read more
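For illustration, here is my own toy simplification of the HadoopDB idea (emphatically not Hadapt’s code): each “node” is a local SQLite database holding one partition of the data, the map phase pushes a SQL fragment down to every node, and the reduce phase merges the partial results.

```python
# Toy simplification of the HadoopDB architecture. Each "node" is
# a local SQLite database; the map step pushes SQL down to every
# node's DBMS, and the reduce step merges the partial aggregates.
import sqlite3

def make_node(rows):
    """Stand-in for a per-node DBMS holding one data partition."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE clicks (user_id INTEGER, n INTEGER)")
    db.executemany("INSERT INTO clicks VALUES (?, ?)", rows)
    return db

# Three "nodes", each with its own partition of the data.
nodes = [
    make_node([(1, 5), (2, 3)]),
    make_node([(1, 2), (3, 7)]),
    make_node([(2, 1), (3, 4)]),
]

def map_phase(sql):
    """Run the pushed-down SQL on every node's local DBMS."""
    for db in nodes:
        yield from db.execute(sql)

def reduce_phase(pairs):
    """Merge the per-node partial sums by key."""
    totals = {}
    for user_id, partial in pairs:
        totals[user_id] = totals.get(user_id, 0) + partial
    return totals

partials = map_phase("SELECT user_id, SUM(n) FROM clicks GROUP BY user_id")
print(reduce_phase(partials))  # {1: 7, 2: 4, 3: 11}
```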
MySQL soundbites
Oracle announced MySQL enhancements, plus intentions to use MySQL to compete against Microsoft SQL Server. My thoughts, lightly edited from an instant message Q&A, include:
- Given how hard Oracle fought the antitrust authorities to keep MySQL at the time of the Sun acquisition, we always knew they were serious about the business.
- We’ll know they’re even more serious if they buy MySQL enhancements such as Infobright, dbShards, or Schooner MySQL.
- Oracle-quality MySQL’s most obvious target is SQL Server.
- But if you’ve bought into the Windows stack, why not stay bought-in?
- MySQL vs. SQL Server competition is mainly about new applications; few users will actually switch.
- A lot of SaaS vendors use Oracle Standard Edition, and have some MySQL somewhere as well. They don’t want to pay up for Oracle Enterprise Edition or Exadata. Good MySQL could suit them.
- Mainly, I see the Short Request Processing market as being a battle between MySQL versions and NoSQL systems. (I’m a VoltDB pessimist.)
The last question was “Is there an easy shorthand to describe how Oracle DB is superior to MySQL even with these improvements?” My responses, again lightly edited, were: Read more
| Categories: Analytic technologies, Exadata, MySQL, NoSQL, Oracle, Software as a Service (SaaS) | 2 Comments |
So how many columns can a single table have anyway?
I have a client who is hitting a 1000-column-per-table limit in Oracle Standard Edition. As you might imagine, I’m encouraging them to consider columnar alternatives. Be that as it may, just what ARE the table width limits in various analytic or general-purpose DBMS products?
By the way — the answer SHOULD be “effectively unlimited.” Like it or not,* there are a bunch of multi-thousand-column marketing-prospect-data tables out there.
*Relational purists may dislike the idea for one reason, privacy-concerned folks for quite another.
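If you’d rather probe a limit empirically than trust the documentation, a short script can binary-search for it. Here is a sketch against SQLite, whose default cap is (I believe) 2,000 columns; for another product you’d swap in the appropriate driver and DDL:

```python
# Empirically probe a DBMS's column-per-table limit, using SQLite
# as the example target. For other products, substitute the right
# driver, connection, and CREATE TABLE syntax.
import sqlite3

def max_columns(db, hi=50_000):
    """Binary-search the largest column count CREATE TABLE accepts."""
    lo = 1
    while lo < hi:
        mid = (lo + hi + 1) // 2
        cols = ", ".join(f"c{i} INTEGER" for i in range(mid))
        try:
            db.execute(f"CREATE TABLE probe ({cols})")
            db.execute("DROP TABLE probe")
            lo = mid  # mid columns worked; try more
        except sqlite3.OperationalError:
            hi = mid - 1  # mid columns failed; try fewer
    return lo

print(max_columns(sqlite3.connect(":memory:")))  # typically 2000
```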
| Categories: Data warehousing, Surveillance and privacy | 37 Comments |
Notes for my March 10 Investigative Analytics webinar
It turns out that the slide deck I posted a couple of days ago underwent more changes than I expected. Here’s a more current version. A number of the changes arose when I thought more about how to categorize analytic business benefits; hence that blog post a few minutes ago with more detail on the same subject.
Unchanged, however, is the more technical list of six things you can do with analytic technology, taken from a blog post late last year. Also unaltered are my definitions of investigative analytics and machine-generated data.
I write extensively on privacy. This technological overview of privacy threats doubles as a survey of advanced investigative analytics techniques now coming into practical use.
And finally, on a happier note — if you enjoyed the xkcd cartoon, here are links to that one and a few more.
| Categories: Analytic technologies, Presentations | Leave a Comment |
