January 12, 2009

Database SaaS gains a little visibility

Way back in the 1970s, a huge fraction of analytic database management was done via timesharing, specifically in connection with the RAMIS and FOCUS business-intelligence-precursor fourth-generation languages.  (Both were written by Gerry Cohen, who built his company Information Builders around the latter one.)  The market for remote-computing business intelligence has never wholly gone away since. Indeed, it’s being revived now, via everything from the analytics part of Salesforce.com to the service category I call data mart outsourcing.

Less successful to date are efforts in the area of pure database software-as-a-service.  It seems that if somebody is going for SaaS anyway, they usually want a more complete, integrated offering. The most noteworthy exceptions I can think of to this general rule are Kognitio and Vertica, and they only have a handful of database SaaS customers each. To wit: Read more

January 12, 2009

Gartner’s 2008 data warehouse database management system Magic Quadrant is out

February, 2011 edit: I’ve now commented on Gartner’s 2010 Data Warehouse Database Management System Magic Quadrant as well.

Gartner’s annual Magic Quadrant for data warehouse DBMS is out.  Thankfully, vendors don’t seem to be taking it as seriously as usual, so I didn’t immediately hear about it.  (I finally noticed it in a Greenplum pay-per-click ad.)  Links to Gartner MQs tend to come and go, but as of now here are two working links to the 2008 Gartner Data Warehouse Database Management System MQ.  My posts on the 2007 and 2006 MQs have also been updated with working links. Read more

January 10, 2009

Some reasons business intelligence is in a funk

I wrote recently that BI is in a “funk”.  Let me now offer a few ideas as to why that is so. Read more

January 8, 2009

The business intelligence funk

Gartner analyst Andreas Bitterer’s rarely-updated blog has gotten some recent attention because of his kerfuffle with Yves de Montcheuil of Talend.   Reading same, I went on to notice another post by Andreas that captured my own feelings, to wit:

Sure, the BI market has enjoyed consistent growth, has seen a lot of action on the M&A front, technology has advanced significantly (I remember when gigabytes were considered wild), and yet we are still discussing same old business intelligence. I keep hearing vendors announce that the next version of their tool will be able to address that untapped market within their customer base, growing penetration beyond those 10-15% that are using BI today. I heard this 5 years ago already, but what has changed since then? Not much.

Of course, that post was probably met with considerable PR/AR outreach to the effect “I’m so glad you said that. OUR firm really IS different, and we’d love to tell you about how.”

January 7, 2009

Pervasive DataRush

I’ve made a few references to Pervasive DataRush in the past — like this one — but I’ve never gotten around to seriously writing it up.  I’ll now try to make partial amends.  The key points about Pervasive DataRush are:

More details may be found at the rather rich Pervasive DataRush website, or in the following excerpt from an email by Pervasive’s Steve Hochschild: Read more

January 4, 2009

Expressor pre-announces a data loading benchmark leapfrog

Expressor Software plans to blow the Vertica/Syncsort “benchmark” out of the water, to wit:

What I know already is that our numbers will be between 7 and 8 min to load one TB of data and will set another world record for the tpc-h benchmark.
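For context, here is a back-of-the-envelope conversion of that claim into an hourly rate, assuming the figure means wall-clock minutes to load one terabyte:

```python
def tb_per_hour(minutes_per_tb: float) -> float:
    """Convert a minutes-per-terabyte load time into terabytes per hour."""
    return 60.0 / minutes_per_tb

# Expressor's claimed range of 7 to 8 minutes per TB:
slow = tb_per_hour(8)   # slower end of the claim
fast = tb_per_hour(7)   # faster end of the claim
print(f"{slow:.1f} to {fast:.1f} TB/hour")  # 7.5 to 8.6 TB/hour
```

In other words, the claim amounts to a sustained load rate of roughly 7.5–8.6 TB/hour.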

The whole blog post has a delightful air of skepticism, e.g.:

Sometimes the mention of a join and lookup are documented but why? If the files are load ready what is there to join or lookup?

… If the files are load ready and the bulk load interface is used, what exactly is done with the DI product?

My guess… nothing.

…  But what I can’t figure out is what is so complex about this test in the first place?

January 3, 2009

More from Vertica on data warehouse load speeds

Last month, when Vertica published its “benchmark” of data warehouse load speeds, I didn’t realize it had previously released some actual customer-experience load rates as well.  In a July, 2008 white paper that seems thankfully free of any registration requirements, Vertica cited four examples:

Read more

January 3, 2009

ParAccel’s market momentum

After my recent blog post, ParAccel is once again angry that I haven’t given it proper credit for its accomplishments. So let me try to redress the failing.

Uh, that’s about all I can think of. What else am I forgetting? Surely that can’t be ParAccel’s entire litany of market success!

December 29, 2008

ParAccel actually uses relatively little PostgreSQL code

I often find it hard to write about ParAccel’s technology, for a variety of reasons:

ParAccel is quick, however, to send email if I post anything about them they think is incorrect.

All that said, I did get careless when I neglected to double-check something I already knew. Read more

December 29, 2008

Ordinary OLTP DBMS vs. memory-centric processing

A correspondent from China wrote in to ask about products that matched the following application scenario: Read more

