The Only Constant Is Change

By Perry Rotella | July 25, 2013

Year after year, we're astounded by the implications of Moore's Law. That's the observation that the number of transistors on an integrated circuit has doubled approximately every two years since the microchip was invented, and that data density has doubled roughly every 18 months.
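For a sense of what those rates compound to, here is a quick back-of-the-envelope sketch in Python; the ten-year horizon is chosen purely for illustration:

```python
# Illustrative arithmetic only: how the doubling rates cited above
# compound over time. The ten-year horizon is an arbitrary example.
def growth_factor(years: float, doubling_period_years: float) -> float:
    """How many times capacity multiplies over `years`."""
    return 2 ** (years / doubling_period_years)

print(growth_factor(10, 2.0))   # transistor count over a decade: 32.0 (~32x)
print(growth_factor(10, 1.5))   # data density over a decade: ~101.6 (~100x)
```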

A new class of massively parallel processing (MPP) systems has emerged, processing large amounts of data faster and at lower cost than ever before. Users can develop and run predictive models on a single platform that also hosts the data being queried, avoiding the slow process of moving data across networks. We call this new capability in-database analytics. The performance improvements offered by such analytics platforms allow multiple iterations or tests to be run in a single day on very large data sets, enabling predictive model development on hundreds of millions of records.
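As a concrete, if toy-sized, sketch of that pattern: the query below asks the database itself to reduce the rows to the handful of aggregates a simple one-variable regression needs, so only five numbers ever leave the engine. Python's built-in sqlite3 stands in for an MPP warehouse, and the claims table and its columns are hypothetical.

```python
# A minimal sketch of in-database analytics: push the heavy computation
# to the engine holding the data instead of pulling raw rows over the
# network. sqlite3 stands in for an MPP warehouse; the schema is made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (sq_footage REAL, replacement_cost REAL);
    INSERT INTO claims VALUES (1200, 150000), (1800, 210000),
                              (2400, 290000), (3100, 360000);
""")

# One query computes the sufficient statistics for a least-squares fit;
# the raw records never leave the database.
n, sx, sy, sxx, sxy = conn.execute("""
    SELECT COUNT(*), SUM(sq_footage), SUM(replacement_cost),
           SUM(sq_footage * sq_footage),
           SUM(sq_footage * replacement_cost)
    FROM claims
""").fetchone()

slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
print(f"replacement_cost ~ {slope:.1f} * sq_footage + {intercept:.1f}")
```

The same division of labor is what makes rapid model iteration on hundreds of millions of records practical: the aggregates stay small even when the table does not.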

Smartphones and tablets now let us generate and examine significantly more content anywhere and at any time. And that level of access helps drive an appetite for yet more data, still better analytics, and ever better ways to handle both. For example, an article in the Armed Forces Journal describes how “in Iraq, U.S. forces who recovered computers used by al-Qaida consistently found Google Maps information on them. Insurgents were using the same databases as U.S. forces to view streets, consider get-away routes and plan ambushes.”

So preserving an advantage in access to data became a pressing concern. The article points out: “The discoveries showed how one kind of U.S. tactical advantage was eroding — and they underscore the vital need to improve the information flow to U.S. troops, all the way down to individual soldiers on patrol. We will derive our future advantage from the ability to store, access and analyze unique data, and deliver those resources through our networks to the point of need, better and faster than our enemies.”

With that in mind, business and technology leaders must assimilate the principle that the bigger the data, the better. And that goes for analytics, too. Models that only a short time ago took hours or days to run can now return results in seconds. The future is heading toward even greater analytics sophistication, delivered in next to no time, or in real time.

Still, there are challenges. The number of people doing their work on iPads and other mobile devices is exploding, and that will necessitate more mobile-accessible analytics. The trend will accelerate and change how we interact with computing resources across the enterprise. In insurance, one current example is products that give claims adjusters access to analytics in the field, so they can run replacement cost estimates quickly and precisely.

It stands to reason, then, that as devices evolve and change in form and functionality, so will the requirements for data and analytics. And that's what keeps things interesting.