The most straightforward approach to the applications business is:
- Take general-purpose technology and think through how to apply it to a specific application domain.
- Produce packaged application software accordingly.
However, this strategy is not as successful in analytics as in the transactional world, for two main reasons:
- Analytic applications of that kind are rarely complete.
- Incomplete applications rarely sell well.
I first realized all this about a decade ago, after Henry Morris coined the term analytic applications and business intelligence companies thought it was their future. In particular, when Dave Kellogg ran marketing for Business Objects, he rattled off an argument to the effect that Business Objects had generated more analytic app revenue over the lifetime of the company than Cognos had. I retorted, with only mild hyperbole, that the lifetime numbers he was citing amounted to “a bad week for SAP”. Somewhat hoist by his own petard, Dave quickly conceded that he agreed with my skepticism, and we changed the subject accordingly.
Reasons that analytic applications are commonly less complete than the transactional kind include:
- Transactional apps often serve to automate rigid business processes. Analytic technology use is inherently more flexible and varied.
- Transactional apps are often used by cheaper/lower-status people. Analytic technology may be used by managers who treasure the right of individualized decision making.
There are indeed scenarios in which incomplete analytic applications can be useful. For example:
- If a user has sufficiently simple needs, cookie-cutter analytic apps — perhaps offered on a SaaS (Software as a Service) basis — might suffice.
- Small teams of technical workers can kick-start their analytic efforts with pre-built booster kits. Two examples come to mind:
- SAS Institute has done quite well with statistical “applications” that really are just accelerators for custom statistical work of the usual kind.
- Starter-kit data models for data warehousing have some value as well.
But otherwise, I think the best opportunities for application-specific analytic technology aren’t really classical “analytic apps”. Rather, they arise in three sometimes-overlapping areas, adjacent to the analytic application core:
- Operational applications enhanced with some analytics so as to improve routine business processes.
- Information services enhanced with some analytic technology that retrieves (and perhaps also helps analyze) the information.
- Analytic-application-specific “platform” technology.
Operational applications have been enhanced with analytics for as long as we have had reports. Indeed, meeting that reporting need was the core business for Crystal Reports, the only business intelligence company ever to build a large OEM/VAR business (it was eventually merged into Business Objects). Analytic enhancement is also a major direction for application behemoths Oracle and SAP, but I won’t address that aspect in this post.
If you offer a service whose essence is tabular-structured information — e.g. a third-party data source or some stakeholder-facing analytics — then you also need to provide business intelligence capability to the information’s consumers. Too often, however, those BI capabilities are unimpressive; upgrading them is an “easy” improvement that should come before more ambitious analytic-app capabilities are attempted.
What I’m most excited about right now is analytic-application-specific “platform” technology, an area in which I’ve sensed a groundswell of interest over the past 6-12 months. It’s at the heart of a significant fraction of the new startup ideas I’m hearing, and rightly so; on the other hand, it’s also been going on for decades. Here is a grab-bag of examples.
- Simulation and optimization have been around since the 1970s, if not before. One cool effort was by River Logic, which in the 1990s developed a visual programming language especially geared to profitability/logistics kinds of simulations. (While still around, the company unfortunately doesn’t seem to have done much for the past decade or decade and a half.)
- Much more established is SAP’s APO (Advanced Planner and Optimizer), dating back to at least the 1990s. Given the magnitude of the mixed-integer programming problems it tackles, I would conjecture it includes some built-in domain-specific heuristics you might not find in generic mathematical programming packages.
- The financial services industry has long featured domain-specific technology. From the 1800s through the 1970s, this was focused on communications, from stock tickers (one of Thomas Edison’s first important inventions) to networks of stock quote machines. In the 1980s, that expanded to include what we’d recognize even today as real-time business intelligence tools, and then also to complex security-valuation analytics.
- What’s more, the whole area of CEP/streaming has traditionally been focused on financial trading, for reasons including low latency, time series orientation, and the opportunity to parameterize queries across a broad set of ticker symbols.
- Despite a lot of application potential, general-purpose text analytics technology has floundered. Text analytics specifically extended for marketing applications has fared better. Indeed, marketing applications don’t use general-purpose text mining to its fullest power; rather, they add the relatively new technique of sentiment analysis, along with capabilities to analyze short, ungrammatical “verbatims” such as text messages.
- My clients at Metamarkets — Mike Driscoll et al. — have built a pretty cool technology stack focused on real-time/in-memory BI, well-suited for digital advertising and similar markets. I question whether it has much applicability outside of that space, however, because every industry that I can think of that needs real-time BI needs something rather different.
- WibiData is focused in a similar area, but on actually personalizing things rather than on monitoring personalization’s effects. WibiData believes this requires aggressive use of derived data and the associated schema evolution.
- Log analyzer Sumo Logic probably doesn’t rely on an off-the-shelf machine learning engine.
- Other apparent examples showed up in the comment thread to my November 2011 post on agile predictive analytics.
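The CEP/streaming point above — a single query template parameterized across a broad set of ticker symbols — can be sketched in a few lines. This is purely an illustrative toy (the class and window size are my own invention, not any vendor’s actual engine): one “moving average over the last N ticks” query is instantiated per symbol as events arrive.

```python
from collections import defaultdict, deque

class SlidingAverage:
    """Toy CEP-style query template: average price over the last
    `window` ticks, instantiated separately for each ticker symbol."""

    def __init__(self, window):
        # One bounded buffer per symbol; old ticks fall off automatically.
        self.ticks = defaultdict(lambda: deque(maxlen=window))

    def on_tick(self, symbol, price):
        """Ingest one event and return the current windowed average
        for that symbol."""
        q = self.ticks[symbol]
        q.append(price)
        return sum(q) / len(q)

engine = SlidingAverage(window=3)
for sym, px in [("IBM", 100.0), ("IBM", 102.0),
                ("MSFT", 30.0), ("IBM", 104.0)]:
    engine.on_tick(sym, px)

# Only the last three IBM ticks (102, 104, 98) count toward the average.
print(round(engine.on_tick("IBM", 98.0), 2))  # → 101.33
```

The point of the sketch is the parameterization: the query logic is written once, while the per-symbol state multiplies across however many instruments the feed carries — one reason financial trading was such a natural first home for this technology.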
It will be fascinating to see how this all plays out.