05/09/2024 | News release | Distributed by Public on 05/09/2024 13:54
Data Management and Data Integration as Key Topics at This Year's Hannover Messe |
NEWS |
Earlier in April 2024, Hannover Messe once again set the stage for a remarkable display of technological innovation, highlighting how rapid advancements are shaping the future of industry. As technologies like Artificial Intelligence (AI), which require copious amounts of data to unfold their full transformative potential across enterprise verticals, gain traction, manufacturers around the world are showing increased interest in aggregating and harmonizing the data they already gather, for example, to feed Large Language Models (LLMs) during AI training. At the same time, soaring energy costs put a price tag on transferring and processing data, pushing manufacturers to weigh the cost of data collection and to ensure these data are used as effectively as possible. After all, depending on the industry, several Petabytes (PB) of data are gathered from the factory floor each year. Consequently, the normalization, contextualization, and integration of enterprise Operational Technology (OT) data for further processing were an important subject of discussion at Hannover Messe this year.
A range of exhibitors presented approaches and products addressing this integration challenge. Microsoft presented its Data Fabric software, which relies heavily on contextualizing, harmonizing, and normalizing data in the cloud. AWS' Industrial Data Fabric (IDF) solution, on the other hand, performs data integration in edge deployments on enterprise premises, while workloads such as LLM training run in the cloud. In a similar fashion, Hewlett Packard Enterprise (HPE) showcased its Ezmeral software for data integration at its booth in Hannover.
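To make the normalization and contextualization step concrete, the sketch below shows, in highly simplified form, what harmonizing OT data can look like at the edge. All names (the `Reading` schema, the vendor payload fields, the asset identifiers) are hypothetical illustrations, not part of any of the products mentioned above: two machines report the same temperature metric with different field names and units, and a small mapping step brings both onto one canonical schema and attaches asset context before the records are forwarded for further processing.

```python
from dataclasses import dataclass

# Illustrative only: hypothetical vendor payloads and schema,
# not taken from Microsoft Fabric, AWS IDF, or HPE Ezmeral.

@dataclass
class Reading:
    asset_id: str   # which machine produced the value (context)
    site: str       # where the machine is located (context)
    metric: str     # canonical metric name
    value: float    # value in the canonical unit
    unit: str       # canonical unit for the metric

def normalize(raw: dict, context: dict) -> Reading:
    """Map a vendor-specific payload onto the canonical schema."""
    if "temp_f" in raw:                          # vendor A reports Fahrenheit
        value = (raw["temp_f"] - 32) * 5 / 9     # convert to Celsius
    elif "temperature_c" in raw:                 # vendor B reports Celsius
        value = raw["temperature_c"]
    else:
        raise ValueError(f"unrecognized payload: {raw}")
    return Reading(
        asset_id=context["asset_id"],
        site=context["site"],
        metric="temperature",
        value=round(value, 2),
        unit="degC",
    )

# Two differently shaped payloads end up in one comparable form.
a = normalize({"temp_f": 212.0}, {"asset_id": "press-01", "site": "hannover"})
b = normalize({"temperature_c": 100.0}, {"asset_id": "press-02", "site": "hannover"})
```

Whether this mapping runs in the cloud or at the on-premises edge is exactly the architectural choice that differentiates the vendor approaches described above.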
Different Approaches to Data Integration |
IMPACT |
Extensive Edge Presence and Partnership Networks Are Key to Success |
RECOMMENDATIONS |
In providing a data integration solution to industrial verticals, ABI Research is convinced that on-premises edge deployments will be an important determinant of success for a number of reasons:
There are several ways in which software vendors, hyperscalers, and telco providers can strengthen their footprint at the on-premises edge for industrial environments: