By Morne Bekker, Country Manager at NetApp South Africa
Big Data has become part of every conversation among business leaders, IT decision makers and public stakeholders. It is much more than a buzzword or the latest technology hype; it is now a reality, and industry leaders are doing their best to figure out how to harness, understand and, most importantly, leverage it. This is part 1 of a three-article series in which we look at how big data is transforming different industries.
The automotive sector is in the midst of a digital revolution. It is taking the lead in smart manufacturing, and with sales and profits moving towards the aftermarket and smart mobility services, OEMs and suppliers must make their IT architectures flexible and scalable and establish efficient data management. With Forrester predicting that more than 50% of global enterprises will rely on at least one public cloud platform to drive digital transformation in 2018, the automotive sector is now in a position where data management is no longer a luxury, but a necessity to meet the needs of tech-savvy customers.
Systematising the production line
Within the manufacturing phase, OEMs deploy solutions that drive smart manufacturing. Connected production lines deliver sensor data that can be analysed and used to optimise machinery and production processes.
Predictive maintenance can also be extended with analysis that helps monitor product quality and intervene where required. To date, however, this kind of large-scale analysis has not been carried out across the complete supply chain, and the picture differs from supplier to supplier: medium-sized and smaller suppliers have not reached this stage yet, and these are the companies that need external IT specialists to help interconnect their production.
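To make the predictive-maintenance idea concrete, here is a minimal sketch of one simple approach: flagging sensor readings that deviate sharply from a moving average. The function name, window size and tolerance are illustrative assumptions, not a description of any specific vendor's analytics.

```python
from collections import deque

def detect_anomalies(readings, window=5, tolerance=0.2):
    """Flag readings that deviate more than `tolerance` (as a fraction)
    from the moving average of the previous `window` values."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            avg = sum(history) / window
            if abs(value - avg) > tolerance * avg:
                anomalies.append((i, value))
        history.append(value)
    return anomalies

# Example: spindle temperature readings with one outlier
temps = [70.1, 70.4, 69.9, 70.2, 70.0, 70.3, 85.7, 70.1]
print(detect_anomalies(temps))  # → [(6, 85.7)]
```

In production, such a rule would trigger a maintenance ticket or quality intervention rather than just printing a result; more sophisticated systems replace the moving-average rule with trained models.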
Altering the after-market
From an after-market perspective, sensor-enabled vehicles constantly collect data such as tyre pressure, GPS position and even motor temperature. Intelligent analysis of this huge amount of gathered data can then provide new insights to the vehicle user.
Late last year, NetApp predicted that this very same data would become self-aware in 2018. In other words, the flow between data, applications and storage elements will be mapped in real time, with the data delivering the exact information a user needs at the exact time they need it. This also introduces the ability for data to self-govern: the data itself will determine who has the right to access, share and use it, which could have wider implications for external data protection, privacy, governance and sovereignty. As data becomes self-aware and even more diverse than it is today, metadata will make it possible for the data to proactively transport, categorise, analyse and protect itself.
For example, if you are in a car accident, there may be a number of different groups that want or demand access to the data from your car. A judge or insurance company may need it to determine liability, while an auto manufacturer may want it to optimise the performance of the brakes or other mechanical systems. When data is self-aware, it can be tagged so that it controls who sees which parts of it and when, without additional time-consuming and potentially error-prone human intervention to subdivide, approve and disseminate the valuable data.
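The tagging idea in the accident example can be sketched in a few lines. This is purely a hypothetical illustration of metadata-driven access control — the field names, roles and values are invented for the example, not part of any real self-governing data system:

```python
# Hypothetical crash record: each field carries an access tag listing
# which roles are allowed to see it.
RECORD = {
    "speed_at_impact": {"value": 87,            "allow": {"judge", "insurer"}},
    "brake_pressure":  {"value": 0.93,          "allow": {"manufacturer"}},
    "gps_position":    {"value": (-26.2, 28.0), "allow": {"judge"}},
}

def visible_fields(record, role):
    """Return only the fields whose access tag includes this role."""
    return {name: field["value"]
            for name, field in record.items()
            if role in field["allow"]}

print(visible_fields(RECORD, "insurer"))       # → {'speed_at_impact': 87}
print(visible_fields(RECORD, "manufacturer"))  # → {'brake_pressure': 0.93}
```

The point of the sketch is that the policy travels with the data itself, so no human has to subdivide and approve each request.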
The data path in five steps
When implementing a digital smart manufacturing project, automotive companies should keep in mind the five-phase model developed by data management specialist NetApp. The path of the data through the factory and onto the open road runs through five process stages: collect, transport, store, analyse and archive.
• First, the sensor data is recorded. The measured values must be converted to IP-compatible information, which may require retrofitting for older systems. So-called edge gateways can translate data from older systems and then transfer it.
• Further data transport is secured by switching, routing, wireless and firewall technologies. Communication between machines can be implemented via protocols such as the popular MQTT (originally MQ Telemetry Transport).
• In the third phase, the IT infrastructure has to store the compiled sensor data and make it available for analysis. Fast flash resources are best suited to stream analytics. For large amounts of data, users are advised to implement a data lake with object storage and integrated cloud storage, because this type of storage system is highly scalable.
• Hadoop and NoSQL ("not only SQL") database solutions are ideally suited to analysing large data volumes in the fourth step.
• Finally, a cost-effective long-term archive for sensor data is required. Rule-based, automated data classification enables the system to delete data automatically once the legally mandated retention period has expired, or to hand it over to storage tiering, which distributes data across storage classes according to how frequently it is accessed.
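The transport stage above names MQTT as a typical machine-to-machine protocol. One distinctive MQTT feature is topic filtering, where `+` matches one topic level and `#` matches all remaining levels. The stdlib-only helper below is a hypothetical sketch of that matching logic (real deployments would use an MQTT client library and broker); the topic names are invented for the example:

```python
def topic_matches(pattern, topic):
    """MQTT-style topic filter matching: '+' matches exactly one
    level, '#' matches this level and everything below it."""
    p, t = pattern.split("/"), topic.split("/")
    for i, part in enumerate(p):
        if part == "#":
            return True
        if i >= len(t) or (part != "+" and part != t[i]):
            return False
    return len(p) == len(t)

# A subscriber to "plant/+/temperature" receives temperature readings
# from every machine on the line, whatever its identifier.
print(topic_matches("plant/+/temperature", "plant/press7/temperature"))  # → True
print(topic_matches("plant/#", "plant/press7/vibration"))                # → True
```

This publish/subscribe topic scheme is what lets many analytics consumers tap the same sensor streams without the machines knowing who is listening.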
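The archiving stage's rule-based classification can also be sketched briefly. The data classes, retention periods and record layout below are illustrative assumptions, not NetApp's actual policy engine — the point is only that a rule table, not a human, decides what is deleted and what is retained:

```python
from datetime import datetime, timedelta

# Hypothetical retention rules per data class (names and periods invented).
RETENTION = {
    "quality_metrics": timedelta(days=3650),  # long legally mandated retention
    "raw_sensor":      timedelta(days=90),    # short-lived raw stream data
}
DEFAULT_RETENTION = timedelta(days=30)

def classify(records, now):
    """Split archived records into (keep, delete) by age and data class."""
    keep, delete = [], []
    for rec in records:
        limit = RETENTION.get(rec["class"], DEFAULT_RETENTION)
        (keep if now - rec["created"] <= limit else delete).append(rec)
    return keep, delete

archive = [
    {"id": 1, "class": "raw_sensor",      "created": datetime(2018, 1, 1)},
    {"id": 2, "class": "quality_metrics", "created": datetime(2015, 1, 1)},
]
keep, delete = classify(archive, now=datetime(2018, 6, 1))
print([r["id"] for r in keep], [r["id"] for r in delete])  # → [2] [1]
```

In a tiered archive, the "delete" bucket might instead be routed to a cheaper storage tier before final removal.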
NetApp's technologies support enterprise-wide data management and create a link between on-premises systems and public cloud resources. As a result, the automotive industry can achieve high flexibility in the use of its IT resources and can move data and workloads across all of them.
This creates the basis for the efficient infrastructure that big data projects require, and car manufacturers can use the cloud to future-proof their businesses. With its Data Fabric concept, NetApp provides a suitable solution for implementing a multi-cloud infrastructure for big data: companies can use cloud resources from different vendors while retaining full control over their data. Using cloud resources puts companies in a position to integrate the most powerful data analysis engines without heavy investment in new on-premises IT infrastructure.
Look out for the second article in this series, which will look closely at how big data is changing the financial services industry.