April 10, 2011

Use cases for low-latency analytics

At various times I’ve noted the varying latency requirements of different analytic use cases, which can be as different as the speed of a turtle is from the speed of light. In particular, back when I wrote more about CEP (Complex Event Processing), I listed some applications for super-low-latency and not-so-low-latency CEP alike. Even better were some longish lists of “active data warehousing” use cases I got from Teradata in August, 2009, generally focused on interactive customer response (e.g. personalization, churn prevention, upsell, antifraud) or in some cases logistics.

In the slide deck for the Teradata 6680/solid-state drive announcement, however, Teradata went in a slightly different direction. Its list of “hot data use case examples” suggested:

To me, four things stand out about that list. The first two are:

Since a lot of customer-response applications use tiny result sets, I imagine those two features are not coincidental. Even so, some customer-response applications can benefit from serious real-time analysis, such as graph-analytic techniques, which can be applied to antifraud and influencer-identifying anti-churn alike.
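To make the graph-analytic point a little more concrete, here is a minimal sketch of influencer-oriented anti-churn: score each subscriber in a call graph by how many distinct contacts they talk to, and prioritize retention offers for the high scorers, on the theory that their departure would pull others along. The sample call records, field names, and cutoff are illustrative assumptions of mine, not anything from Teradata's material.

```python
# Hypothetical sketch: rank subscribers in a call graph by distinct-contact
# count, a crude "influencer" score for anti-churn targeting. The sample
# records and the top-3 cutoff are illustrative only.
from collections import defaultdict

call_records = [
    # (caller, callee) pairs; in practice these would stream from a call-detail feed
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("eve", "alice"), ("dave", "alice"),
]

# Build an undirected contact graph: subscriber -> set of distinct contacts.
contacts = defaultdict(set)
for caller, callee in call_records:
    contacts[caller].add(callee)
    contacts[callee].add(caller)

# Degree centrality as a simple influence proxy: subscribers with many
# distinct contacts are the ones whose churn is most likely to spread.
influence = {subscriber: len(peers) for subscriber, peers in contacts.items()}

top_influencers = sorted(influence, key=influence.get, reverse=True)[:3]
print("Prioritize retention offers for:", top_influencers)
```

A real deployment would presumably use richer centrality measures and incremental updates rather than a batch pass, but the shape of the computation is the same.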

And thus I’ve shown that a list of bullet points, sized to fit on a single marketing slide, is imperfect. Oh, the humanity!

The two other notable points are:

Frankly, I think low-latency monitoring is going to be one of the hot areas over the next few years. “Real-time” is cool, and big monitors with constantly changing graphics are cooler yet.

Comments

2 Responses to “Use cases for low-latency analytics”

  1. J. Andrew Rogers on April 11th, 2011 12:49 am

    A frustrating limitation of low-latency or “real-time” analytics is that, for many application domains, there are no visualization systems designed to be driven that way.

    If you have a logically contiguous model of reality that supports a very high update rate — some geo-sensing databases are like this — then it becomes a problem of detecting intersections with a visualization viewport on the backend and pushing that event to the frontend. For these applications it is efficient neither to poll the database nor to let the visualization client decide what to render by drinking from the firehose. The backend needs to decide whether an update needs to be rendered in a current visualization context and push it there (a rough sketch of this pattern follows the comments below).

    While it is possible to build backends like this, it turns out to be quite difficult to find existing visualization frontends designed to support this “real-time” use case. Architecturally, some of the closest software designs are server-rendered game engines. Real-time is cool, but there seems to be little frontend support for it except in the trivial cases that fit inside the old models, which is a limiting factor.

  2. Which analytic technology problems are important to solve for whom? | DBMS 2 : DataBase Management System Services on April 9th, 2015 7:53 am

    […] Use cases for low-latency analytics (April, 2011) […]
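The push-driven viewport filtering described in the first comment above might look roughly like the following sketch: clients register a bounding-box viewport with the backend, and each incoming update is tested against the registered viewports and pushed only to the clients whose view it intersects, with no polling and no client-side firehose. The class names, the (x, y) update format, and the print-based stand-in for a push channel are my assumptions, not taken from any particular product.

```python
# Hypothetical sketch of backend-side viewport filtering for "real-time"
# visualization: the backend, not the client, decides which updates are
# worth rendering, and pushes only those.
from dataclasses import dataclass


@dataclass
class Viewport:
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def contains(self, x: float, y: float) -> bool:
        return self.min_x <= x <= self.max_x and self.min_y <= y <= self.max_y


class ViewportRouter:
    """Tracks each client's current viewport and routes updates to it."""

    def __init__(self) -> None:
        self.viewports: dict[str, Viewport] = {}

    def register(self, client_id: str, viewport: Viewport) -> None:
        # Called whenever a client opens, pans, or zooms its visualization.
        self.viewports[client_id] = viewport

    def on_update(self, x: float, y: float, payload: dict) -> None:
        # Called once per observation arriving from the high-rate feed.
        for client_id, viewport in self.viewports.items():
            if viewport.contains(x, y):
                self.push(client_id, payload)  # only matching clients hear about it

    def push(self, client_id: str, payload: dict) -> None:
        # Stand-in for a WebSocket or similar server-initiated channel.
        print(f"push to {client_id}: {payload}")


router = ViewportRouter()
router.register("dashboard-1", Viewport(0, 0, 10, 10))
router.on_update(3.5, 7.2, {"sensor": "a17", "reading": 42})   # pushed
router.on_update(50.0, 50.0, {"sensor": "b03", "reading": 7})  # filtered out
```

At real update rates the linear scan over viewports would give way to a spatial index such as an R-tree or a grid, and the push would travel over a persistent connection, but the division of labor is the comment's point: the backend decides what a given visualization context needs to render.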
