Oracle announced its in-memory columnar option Sunday. As usual, I wasn’t briefed; still, I have some observations. For starters:
- Oracle, IBM (Edit: See the rebuttal comment below), and Microsoft are all doing something similar …
- … because it makes sense.
- The basic idea is to take the technology that manages indexes — which are basically columns+pointers — and massage it into an actual column store. However …
- … the devil is in the details. See, for example, my May post on IBM’s version, called BLU, outlining all the engineering IBM did around that feature.
- Notwithstanding certain merits of this approach, I don’t believe it’s a complete alternative to dedicated analytic RDBMS. The rise of analytic DBMS oriented toward multi-structured data just strengthens that point.
I’d also add that Larry Ellison’s pitch “build columns to avoid all that index messiness” sounds like 80% bunk. The physical overhead should be at least as bad, and the main saving in administrative overhead should be that, in effect, you’re indexing ALL columns rather than picking and choosing.
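To make that concrete, here’s a toy sketch — my own illustration, not Oracle’s or IBM’s actual design — of the “indexes are basically columns+pointers” idea. Each column is kept as a sorted list of (value, row_id) pairs, which is essentially the payload of a B-tree index; keeping one such structure per column gives you a crude column store in which every column is, in effect, indexed.

```python
import bisect
from collections import defaultdict

class ToyColumnStore:
    """Toy illustration only: each column is a sorted list of
    (value, row_id) pairs -- roughly the payload of a B-tree index.
    One such structure per column yields a column store in which
    every column is, in effect, indexed."""

    def __init__(self, rows):
        # rows: list of dicts sharing the same keys
        self.columns = defaultdict(list)
        for row_id, row in enumerate(rows):
            for col, value in row.items():
                self.columns[col].append((value, row_id))
        for col in self.columns:
            self.columns[col].sort()  # index-style sorted order

    def row_ids_where(self, col, value):
        """Index-style point lookup: binary search on the sorted column."""
        pairs = self.columns[col]
        lo = bisect.bisect_left(pairs, (value,))
        ids = set()
        while lo < len(pairs) and pairs[lo][0] == value:
            ids.add(pairs[lo][1])
            lo += 1
        return ids

    def sum_where(self, agg_col, filter_col, value):
        """Columnar scan of one column, restricted by a lookup on another."""
        ids = self.row_ids_where(filter_col, value)
        return sum(v for v, rid in self.columns[agg_col] if rid in ids)

store = ToyColumnStore([
    {"region": "east", "amount": 10},
    {"region": "west", "amount": 20},
    {"region": "east", "amount": 5},
])
print(store.sum_where("amount", "region", "east"))  # -> 15
```

Note that each column carries a row id alongside every value — pointer-like overhead comparable to indexing that column outright — which is the sense in which the physical cost should be at least as bad as conventional indexing, with the saving coming only in administration.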
Anyhow, this technology should be viewed as applying to traditional business transaction data, much more than to — for example — web interaction logs, or other machine-generated data. My thoughts around that distinction start:
- I argued back in 2011 that traditional databases will wind up in RAM, basically because …
- … Moore’s Law will make it ever cheaper to store them there.
- Still, cheaper != cheap, so this is a technology only to use with your most valuable data — i.e., that transactional stuff.
- These are very tabular technologies, without much in the way of multi-structured data support.
But in a bit of evidence that disconfirms my case, one of the first SAP applications to require HANA was something called “Smart Meter Analytics”.
To see more specifically where this technology could be useful, let’s map it against my 2011 analytic database taxonomy.
- If you’re managing a partial EDW (Enterprise Data Warehouse) on the same technology as your OLTP (OnLine Transaction Processing) databases, but are running out of steam, in-memory columnar could provide some acceleration.
- Traditional data marts are somewhat obsolete, and establishing a new one would be mainly a cost play. So the fit is questionable.
- Investigative data marts could be a good fit, but only if you’re fairly unimaginative as to the kinds of data you want to include.
- Several other categories are no fit at all.
- There’s a good fit for certain kinds of operational analytics.
I’ll finish by expanding on that last point.
Operational applications have always had analytics blended in. If nothing else, there were a lot of straight reports; sometimes there’s a bit of optimization as well. Workday, for example, has BI and search as two of its core OLTP UI metaphors, and has a lot of other BI snippets called worklets as well. (And by the way, a lot of Workday’s database is in-memory.) I’ve thought for years that operational/analytic blending would be a major area of competition between Oracle and SAP; hence — I believe — SAP’s acquisitions of Business Objects and KXEN. Columnar in-memory Oracle features, and similarly SAP HANA, seem well-suited to support such application elements.