Hazelcast CEO Kelly Herrell: Neoclassical economics – it’s time to augment Adam Smith’s ‘Wealth Of Nations’ with a new real-time chapter.
2020 IVAN APFEL
Business accelerates. Throughout every tier of the first, second and third industrial revolutions (and especially within the AI-driven fourth), business has been accelerating the pace at which it manufactures, creates, trades, operates and, in some cases where innovations don’t work, fails.
This perennial reality is perhaps one of the reasons why the technology industry has always been obsessed with the move from batch data to real-time data.
The death of batch
Where data sits in applications, databases, web services and other environments that exchange information back and forth, its non-real-time status is typically denoted by the fact that it must be compiled, parsed, managed, saved and processed in batches at the end of the day, or at some other defined interval. This is why real-time advocates are fond of talking about the so-called ‘end of batch’ (or ‘death of batch’) era today.
When and where data moves in real-time streams, there is enough computing capacity and crucial enabling real-time software engine intelligence to make a human user perceive that a process has happened instantaneously. In reality, even real-time processing takes a number of milliseconds, so it’s not quite ‘real’ in the sense of our existence on the planet, but that’s a philosophical argument for another day.
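The contrast between the two models can be sketched in a few lines of code. This is an illustrative example only (the function and event names are invented for the sketch, not part of Hazelcast or any vendor API): a batch system produces one answer after the collection period closes, while a stream system updates its answer the moment each event arrives.

```python
# Hypothetical sales events collected during a trading period.
events = [("order", 120.0), ("order", 80.0), ("refund", -30.0)]

def batch_total(collected_events):
    """Batch model: events accumulate and are summed only at period end."""
    return sum(amount for _, amount in collected_events)

def stream_totals(event_iter):
    """Stream model: state is updated and acted on per event, on arrival."""
    running = 0.0
    for _, amount in event_iter:
        running += amount
        yield running  # available immediately, before the period closes

print(batch_total(events))          # one answer, after the batch closes: 170.0
print(list(stream_totals(events)))  # an answer per event: [120.0, 200.0, 170.0]
```

Both paths reach the same final figure; the difference real-time advocates care about is that the streaming version could have acted on the intermediate totals milliseconds after each event, rather than hours later.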
A keen protagonist of real-time technologies is Hazelcast. The company is known for its open source in-memory data grid technology, which shares information out across compute clusters to enable rapidly scaled data processing. But the company is realistic: it says that true real-time (being able to act on information in the moment) is hard to achieve, especially if we are talking about acting while, during or before an event, not after it.
“Many providers still claim to deliver real-time, while waiting on databases…