Analytic technologies

Discussion of technologies related to information query and analysis.

April 10, 2011

Use cases for low-latency analytics

At various times I’ve noted the varying latency requirements of different analytic use cases, which can be as different as the speed of a turtle is from the speed of light. In particular, back when I wrote more about CEP (Complex Event Processing), I listed some applications for super-low-latency and not-so-low-latency CEP alike. Even better were some longish lists of “active data warehousing” use cases I got from Teradata in August, 2009, generally focused on interactive customer response (e.g. personalization, churn prevention, upsell, antifraud) or in some cases logistics.

In the slide deck for the Teradata 6680/solid-state drive announcement, however, Teradata went in a slightly different direction. In its list of “hot data use case examples”, Teradata suggested:  Read more

April 8, 2011

Revolution Analytics update

I wasn’t too impressed when I spoke with Revolution Analytics at the time of its relaunch last year. But a conversation Thursday evening was much clearer. And I even learned some cool stuff about general predictive modeling trends (see the bottom of this post).

Revolution Analytics business and business model highlights include:

Read more

April 6, 2011

So can logistic regression be parallelized or not?

A core point in SAS’ pitch for its new MPI (Message-Passing Interface) in-memory technology seems to be that logistic regression is really important, and that shared-nothing MPP doesn’t let you parallelize it. The Mahout/Hadoop folks also seem to despair of parallelizing logistic regression.

On the other hand, Aster Data said it had parallelized logistic regression a year ago. (Slides 6-7 from a mid-2010 Aster deck may be clearer.) I’m guessing Fuzzy Logix might make a similar claim, although I’m not really sure.

What gives?
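Part of the answer may be algorithmic rather than architectural: the log-likelihood gradient of logistic regression is a simple sum over rows, so a batch-gradient formulation distributes naturally across shared-nothing partitions, whereas the classical iteratively reweighted least squares approach is harder to carve up. Below is a minimal Python sketch of that first point (emphatically not SAS’, Aster’s, or Mahout’s actual code; the function names and toy data are invented for illustration). Each "node" computes a partial gradient over its own shard, and only those small vectors travel to a central combining step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def partial_gradient(X_part, y_part, w):
    """Gradient contribution from one partition (one 'node') of the data."""
    p = sigmoid(X_part @ w)
    return X_part.T @ (p - y_part)

def fit_logistic(partitions, n_features, lr=0.1, iters=200):
    """Batch gradient descent: each iteration is one 'map' (per-partition
    gradients) plus one 'reduce' (summing them); the raw rows never move."""
    w = np.zeros(n_features)
    total_rows = sum(len(y) for _, y in partitions)
    for _ in range(iters):
        grad = sum(partial_gradient(X, y, w) for X, y in partitions)
        w -= lr * grad / total_rows
    return w

# Toy usage: three "nodes", each holding a shard of the rows.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
shards = [(X[i::3], y[i::3]) for i in range(3)]
print(fit_logistic(shards, n_features=3))
```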

April 5, 2011

Comments on EMC Greenplum

I am annoyed with my former friends at Greenplum, who took umbrage at a brief sentence I wrote in October, namely “eBay has thrown out Greenplum”. Their reaction included:

The last one really hurt, because in trusting them, I put in quite a bit of effort, and discussed their promise with quite a few other people.

Read more

March 30, 2011

Short-request and analytic processing

A few years ago, I suggested that database workloads could be divided into two kinds — transactional and analytic. The advent of non-transactional NoSQL has suggested that we need a replacement term for “transactional” or “OLTP”, but finding one has been a bit difficult. Numerous tries, including high-volume simple processing, online request processing, internet request processing, network request processing, short request processing, and rapid request processing, have turned out to be imperfect, as per discussion at each of those links. But then, no category name is ever perfect anyway. I’ve finally settled on short-request processing, largely because I think it does a good job of preserving the distinction between analytic workloads and quick, non-analytic ones.

The easy part of the distinction goes roughly like this:

Where the terminology gets more difficult is in a few areas of what one might call real-time or near-real-time analytics. My first takes are:  Read more

March 24, 2011

Analytic performance — the persistent need for speed

Analytic DBMS and other analytic platform technologies are much faster than they used to be, both in absolute and price/performance terms. So the question naturally arises, “When is the performance enough?” My answer, to a first approximation, is “Never.” Obviously, your budget limits what you can spend on analytics, and anyhow the benefit of incremental expenditure at some point can grow quite small. But if analytic processing capabilities were infinite and free, we’d do a lot more with analytics than anybody would consider today.

I have two lines of argument supporting this view. One is application-oriented: machine-generated data will keep growing rapidly, and putting that data to use requires correspondingly more processing resources. Application areas include, but are not at all limited to, marketing, law enforcement, investing, logistics, resource extraction, health care, and science.

The other approach is to point out some computational areas where vastly more analytic processing resources could be used than are available today. Consider, if you will, statistical modeling, graph analytics, optimization, and stochastic planning.  Read more

March 23, 2011

Hadapt (commercialized HadoopDB)

The HadoopDB company Hadapt is finally launching, based on the HadoopDB project, albeit with code rewritten from scratch. As you may recall, the core idea of HadoopDB is to put a DBMS on every node of a Hadoop cluster, and to use MapReduce to tie those node-local databases together. The goal is the same SQL/MapReduce integration you get if you use Hive, but with much better performance* and perhaps somewhat better SQL functionality.** Advantages vs. a DBMS-based analytic platform that includes MapReduce — e.g. Aster Data — are less clear. Read more
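To make the pattern concrete, here is a toy Python sketch (not Hadapt’s code; the table, data, and helper names are invented for illustration). A couple of in-memory SQLite databases stand in for node-local DBMSs, the scan and partial aggregation are pushed down to each of them as SQL, and a small “reduce” step merges the partial results.

```python
import sqlite3
from collections import defaultdict

def make_node(rows):
    """One 'node': an in-memory SQLite database holding its shard of the table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE clicks (country TEXT, revenue REAL)")
    conn.executemany("INSERT INTO clicks VALUES (?, ?)", rows)
    return conn

nodes = [
    make_node([("US", 10.0), ("DE", 4.0), ("US", 2.5)]),
    make_node([("DE", 1.0), ("FR", 7.0), ("US", 3.0)]),
]

# "Map" side: push the scan and partial aggregation down into SQL on each node,
# instead of streaming raw rows through the MapReduce layer.
pushdown = "SELECT country, SUM(revenue), COUNT(*) FROM clicks GROUP BY country"
partials = [node.execute(pushdown).fetchall() for node in nodes]

# "Reduce" side: merge the small per-node partial aggregates.
totals = defaultdict(lambda: [0.0, 0])
for part in partials:
    for country, revenue, cnt in part:
        totals[country][0] += revenue
        totals[country][1] += cnt

for country, (revenue, cnt) in sorted(totals.items()):
    print(country, revenue, cnt)
```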

March 15, 2011

MySQL soundbites

Oracle announced MySQL enhancements, plus intentions to use MySQL to compete against Microsoft SQL Server. My thoughts, lightly edited from an instant message Q&A, include:

The last question was “Is there an easy shorthand to describe how Oracle DB is superior to MySQL even with these improvements?” My responses, again lightly edited, were:  Read more

March 13, 2011

So how many columns can a single table have anyway?

I have a client who is hitting a 1,000-column-per-table limit in Oracle Standard Edition. As you might imagine, I’m encouraging them to consider columnar alternatives. Be that as it may, just what ARE the table width limits in various analytic or general-purpose DBMS products?

By the way — the answer SHOULD be “effectively unlimited.” Like it or not,* there are a bunch of multi-thousand-column marketing-prospect-data tables out there.

*Relational purists may dislike the idea for one reason, privacy-concerned folks for quite another.
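For what it’s worth, limits like this are easy to probe empirically against any DBMS that accepts dynamic DDL. Here is a toy Python sketch (the max_columns helper and the wide_probe table name are just illustrative), using SQLite as the guinea pig; SQLite’s default build caps tables at 2,000 columns.

```python
import sqlite3

def max_columns(connect, upper=100_000):
    """Binary-search the widest CREATE TABLE the target database will accept.
    `connect` is any zero-argument callable returning a DB-API connection."""
    lo, hi = 1, upper
    while lo < hi:
        mid = (lo + hi + 1) // 2
        ddl = "CREATE TABLE wide_probe (%s)" % ", ".join(
            "c%d INTEGER" % i for i in range(mid))
        conn = connect()
        try:
            conn.execute(ddl)
            lo = mid       # mid columns worked; try something wider
        except Exception:  # e.g. "too many columns"; exact error class varies by driver
            hi = mid - 1   # too wide; back off
        finally:
            conn.close()
    return lo

# With SQLite's default 2,000-column cap, this typically prints 2000.
print(max_columns(lambda: sqlite3.connect(":memory:")))
```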

March 10, 2011

Notes for my March 10 Investigative Analytics webinar

It turns out that the slide deck I posted a couple of days ago underwent more changes than I expected. Here’s a more current version. A number of the changes arose when I thought more about how to categorize analytic business benefits; hence that blog post a few minutes ago with more detail on the same subject.

Unchanged, however, is the more technical list of six things you can do with analytic technology, taken from a blog post late last year. Also unaltered are my definitions of investigative analytics and machine-generated data.

I write extensively on privacy. This technological overview of privacy threats doubles as a survey of advanced investigative analytics techniques now coming into practical use.

And finally, on a happier note — if you enjoyed the xkcd cartoon, here are two links to that one and a few more.
