I was tired and cranky when I talked with my former clients at Rainstor (formerly Clearpace) yesterday, so our call was shorter than it otherwise might have been. Anyhow, there’s a new version called Rainstor 4, the two main themes of which are:
- Compliance-specific features.
- Bottleneck Whack-A-Mole.
The point is that Rainstor is focusing its efforts on enterprises that:
- Have a compliance mandate to keep detailed information, either now or coming down the pike.
- Would like to query the information, either as part of the compliance mandate or for the usual business reasons one does analysis (or, for that matter, for pinpoint lookups of historical information).
- Might want to delete the information as soon as the compliance mandate runs out. (That’s a new feature. Frankly, I think the clients demanding it are being foolish. Information is valuable and should never be thrown away if one can afford to keep it.)
- Might want to annotate the information, even though it is being preserved immutably. (Also a new feature. I think that one is smart.)
“Application retirement” was mentioned only in the context of Rainstor’s flagship Informatica partnership, and even then mainly for clients who had a compliance reason to keep old application data around. “Cloud” and “private cloud” got mentioned, but they don’t seem to be as central as Rainstor was previously hoping they would be. (This is one area we could and probably should have touched on more had I been more awake.)
One thing that hasn’t changed: “Information preservation,” which I coined for Rainstor at our first meeting, is still the company catchphrase.
So far as I could tell, the big point on Rainstor 4 Bottleneck Whack-A-Mole is this: When you load data into Rainstor (bulk or otherwise), it likes to do some metadata analysis first. (I imagine this is related to the sophisticated Rainstor compression scheme.) Well, that isn’t much of a performance hit for schemas with small numbers of tables, but it is a bigger deal for more complex schemas. The Rainstor 4 fix is to remember/persist some of that analysis from one database update to the next. Sounds obvious, but so do a lot of bottleneck fixes once they are made.
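To make the idea concrete, here is a minimal sketch of what "persist the analysis between loads" could look like. Rainstor's actual internals aren't public, so every name and function here is hypothetical; the point is just the general technique of keying expensive per-table analysis by a schema fingerprint and caching it to disk so a repeat load of an unchanged schema skips the work.

```python
import hashlib
import json
import pickle
from pathlib import Path

# All names below are illustrative, not Rainstor's real API.
CACHE_PATH = Path("metadata_cache.pkl")

def schema_fingerprint(schema: dict) -> str:
    """Stable hash of a table's column names/types."""
    blob = json.dumps(schema, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def analyze_schema(schema: dict) -> dict:
    """Stand-in for the expensive per-table analysis done before a load."""
    return {"columns": sorted(schema), "width": len(schema)}

def load_cache() -> dict:
    if CACHE_PATH.exists():
        return pickle.loads(CACHE_PATH.read_bytes())
    return {}

def prepare_load(tables: dict) -> dict:
    """Return analysis for each table, reusing persisted results when possible."""
    cache = load_cache()
    analyses = {}
    for name, schema in tables.items():
        key = schema_fingerprint(schema)
        if key not in cache:                 # only analyze schemas not seen before
            cache[key] = analyze_schema(schema)
        analyses[name] = cache[key]
    CACHE_PATH.write_bytes(pickle.dumps(cache))  # persist for the next load
    return analyses
```

With many tables, the first load pays the full analysis cost and every subsequent load with the same schemas is nearly free, which matches the shape of the bottleneck described above.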