5 August 2019 | Categories: Thought Leadership



By Gary Allemann, Managing Director of Master Data Management

The Internet of Things (IoT) and Machine to Machine (M2M) communications are by no means new concepts, having existed for many years already.

What is new and revolutionary is what we are now able to do with the data these technologies generate. From predictive maintenance to consumer behaviour analytics, facial recognition to fraud detection and prevention, there is a wealth of value to be gained. However, these applications require a shift in mindset, from storing all data for retrospective analytics to analysing data on the fly and then dumping it. This new paradigm of streaming analytics is key to leveraging value from M2M and IoT data.

The genuinely revolutionary aspect of IoT and M2M is their ability to generate always-on data. Human-generated data is and always has been finite. While the volume has increased over the years, it is still subject to limits that sensors and machines simply do not have: these devices can generate thousands of data points every minute, and they are inexhaustible sources. While this means there are now limitless sources of data for analysis, it also means that, unless something changes, we have to find a way of storing infinite data volumes. That is simply not possible, and it would not be financially viable even if it were.

The constant and relentless nature of ‘always-on’ data generation also makes real-time analytics even more important. Much of this data is only relevant now, in the moment; once it is historical it no longer has worth. It must be analysed immediately and then deleted, otherwise no value can be gained. For example, a machine that sends a signal every 30 seconds to report that its status is good is communicating important information. However, that signal does not need to be stored, because it is worthless once it has been acted on. It also only requires action if something changes, which can only be ascertained if the data is being analysed in real time.
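To make this concrete, here is a minimal Python sketch of that analyse-then-discard pattern. The stream, machine IDs and alert hook are hypothetical, illustrative assumptions rather than any particular product's API: each status message is checked the moment it arrives, an alert fires only on a change of state, and the message itself is never persisted.

```python
# A minimal sketch of the analyse-then-discard pattern described above.
# The heartbeat stream, machine IDs and alerting hook are illustrative
# assumptions, not a specific vendor's API.

from typing import Iterable


def watch_heartbeats(messages: Iterable[dict]) -> None:
    """Act only when a machine's status changes; never store the stream."""
    last_status: dict[str, str] = {}  # one small entry per machine, not per message

    for msg in messages:
        machine, status = msg["machine_id"], msg["status"]

        # Only a change in state is worth acting on.
        if last_status.get(machine) not in (None, status):
            print(f"ALERT: {machine} changed from {last_status[machine]} to {status}")

        last_status[machine] = status
        # msg now goes out of scope and is garbage-collected:
        # the routine "status OK" signals are never persisted.


# Example usage with a hypothetical in-memory stream:
watch_heartbeats([
    {"machine_id": "pump-7", "status": "OK"},
    {"machine_id": "pump-7", "status": "OK"},
    {"machine_id": "pump-7", "status": "FAULT"},  # triggers the alert
])
```

Note that the state kept in memory grows with the number of machines, not with the volume of signals, which is what makes an inexhaustible stream tractable.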

Without this instantaneous and continuous analysis, critical information might be missed. This is driving the emergence of the streaming analytics stack, which enables organisations to analyse data on the fly without storing it. However, this technology is still in its infancy, and while a number of open source technologies exist, such as Apache Kafka, they are by no means enterprise-ready: there are no audit trails, no governance, and no disaster or error recovery protocols.

To be of use in an enterprise setting, streaming analytics needs some form of failover, so that data remains available for analysis in the event of, for example, a network outage. Innovative commercial solutions are essential when it comes to solving problems that are only just beginning to emerge. Our partner, Syncsort, is leading this charge with enterprise-ready streaming built into broader data integration stacks, helping to pave the way for success in streaming data pipelines.
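As a rough illustration of what failover can look like at the consumption layer, the sketch below uses the open source kafka-python client; the topic name, broker address and consumer group id are assumptions for the example. Kafka consumer groups rebalance partitions to surviving consumers when one fails, and committing offsets only after successful processing lets a restarted consumer resume where it left off.

```python
# A sketch of one failover building block, using the open source
# kafka-python client. The topic name, broker address and group id
# below are assumptions for illustration only.

import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "machine-status",                    # assumed topic of heartbeat events
    bootstrap_servers="localhost:9092",  # assumed broker address
    group_id="status-monitors",          # consumers in this group share partitions
                                         # and take over from a failed peer
    enable_auto_commit=False,            # commit manually, only after processing
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for record in consumer:
    event = record.value
    # ... analyse the event on the fly here, then discard it ...
    consumer.commit()  # mark progress so a replacement consumer resumes here
```

As the article notes, though, this kind of client-level resilience on its own still falls short of the audit, governance and recovery capabilities an enterprise needs.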

While the need for such solutions has emerged from the growth of IoT and M2M data, the same real-time analytics platforms have far broader business application. They can deliver value in any analytics scenario, offering greater processing speed, flexibility and agility. Real-time decision-making is the future of analytics, and we need to gear up for it or risk being left behind.
