
In today’s complex digital landscape, achieving true data interoperability between systems, devices, and AI agents is not just a nice-to-have – it’s a necessity. As someone who’s worked extensively in this field, I can attest that traditional data integration is a time-consuming, labor-intensive process that requires significant technical expertise. Despite efforts by third-party service providers and standards development organizations like GS1 to ease integration tasks, data integration remains slow and error-prone. The question is, can emerging technologies be the catalyst for change?
In this article, I’ll share with you the reasons behind the complexity, slowness, and drudgery of current data integration practices. More importantly, I’ll examine the potential of emerging technologies like AI/ML, knowledge graphs, and digital identity tech to truly revolutionize the way we do data integrations. Without a doubt, these innovative technologies can increase data interoperability through better security, clarity, and high-velocity information exchange. By leveraging these advancements, businesses can unlock new opportunities for growth, innovation, and success. Indeed, the future of data interoperability is about to get a lot more interesting.
- 1. Traditional Data Integration: Setups Take Too Long, Require Much Expertise, and Transferred Data Is Often Not Understood.
- 2. Emerging Tech to Make Data Integrations Understandable, Secure, And High Velocity.
- a. AI / ML that Streamlines and Automates Data Integration.
- b. Machine Learning (ML) for Accelerating Data Standard Development.
- c. Knowledge Graph Interoperability Opportunities: Incorporate Data with Structured Relationships and Shared Meaning.
- d. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
1. Traditional Data Integration: Setups Take Too Long, Require Much Expertise, and Transferred Data Is Often Not Understood.
Traditional data integration practices are a bottleneck to seamless interoperability. Primarily this is due to their time-consuming setups, significant technical expertise requirements, and complex systems with many competing data access methods. Furthermore, many data integration efforts fail because organizational leaders don’t clearly specify their objectives, nor define business definitions and glossaries. As a result, transferred data is not always understood. So, let’s look at these data integration challenges and, more importantly, what businesses and new, emerging technologies can do to overcome these obstacles.
a. IT Data Integrators Require Detailed Interface Documentation.
First, IT data integrators require detailed interface documentation to implement new business-to-business data interfaces. These technical documents are either proprietary or Standards Development Organization (SDO) specifications. For example, proprietary documentation could include a “data dictionary” from a commercial SaaS platform. Ideally, this document will include descriptions of data elements and example use cases. As for SDO documentation, an example is the GS1 General Specifications standard for barcodes. This document provides universally accepted data specifications, but it may require payment to the SDO to access, as well as supplemental documentation from the lead integrator. In all cases, whether the interface is an API or a file transfer, technical documentation is required to implement it.
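To make this concrete, here is a minimal sketch of what a single data-dictionary entry might look like; the platform, field names, and allowed values are hypothetical, not drawn from any real specification.

```python
# Hypothetical data-dictionary entry for a SaaS shipping platform.
# Field name, allowed values, and use case are illustrative only.
SHIPMENT_STATUS_FIELD = {
    "name": "shipment_status",
    "type": "string",
    "allowed_values": ["label_created", "picked_up", "in_transit", "delivered"],
    "description": (
        "Carrier-confirmed state of the shipment. 'picked_up' requires "
        "a physical carrier scan, not just label creation."
    ),
    "example_use_case": (
        "Trigger a customer notification when status changes to 'in_transit'."
    ),
}
```

Even a simple entry like this answers the two questions integrators ask most: what the values mean, and when to use them.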
b. Most System Integrations Lack Quality Data and Business Definitions.
One of the biggest stumbling blocks for organizations exchanging data is that the data is of low quality. Besides organizations transferring data that is out-of-date and incomplete, the data transferred is often ambiguous and not understandable by the receiving organization. Today, most data interoperability challenges are more of a business problem than a tech problem. Here’s why:
- Source Systems Lack Quality Data. In many cases, the owners of the data, the business owners, neglect their data. At best, they leave it to their IT staff, who are not business experts, to define data elements and determine which data to keep up-to-date.
- Integrators Need Business Data Know-How. What’s more, data integration teams need both business data know-how and the technical skills to implement. Without this know-how, data may get transferred without any technical error, but it may not be understood by the receiving organization.
c. Resulting Systemic Integration Challenges, and What Can We Do About Them?
Today, business-to-business data integration is complicated! Worse, it takes too long, and in many cases the data that is sent is not understood by the receiving organization. Here’s why:
Systemic Integration Challenges
- Scarce Highly-Skilled IT Integrators. Indeed, every organization involved in a data transfer requires a competent IT integrator. This is costly, and usually only the largest businesses can justify the expense. Smaller organizations will turn to third-party integrators. However, in many cases this is a gamble, resulting in either getting overcharged or the data integration project falling short.
- Data Interface Spec Docs Lacking or Too Complicated. Also, because there is a shortage of skilled IT integrators, documentation is usually lacking and not well maintained. On the other hand, an organization with highly skilled integrators will often provide overly complex documentation and implementation methodologies. As a result, spec docs are not understood, not followed, or result in meaningless data exchange.
- Lack of Common, Measurable Business Definitions and Glossaries. Surprisingly, many data integrations fall short not for technical reasons, but because the businesses exchanging data do not have a common agreement on business terms. For example, a transportation carrier may transmit that a package is “shipped”, but in reality only the shipping label has been printed, and the package is not yet in the carrier’s possession.
Now, there are ways to fix these systemic issues. First and foremost, organizations need common business glossaries with agreed-upon, measurable business definitions. Regardless of how good the technology or the technical implementation team is, they cannot make ambiguous, incomplete data better. For more insights on this subject, see my article, Poor Operational Definitions Impede Supply Chain Tech Adoption: Now Is the Time For A Big Change.
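As an illustration of what “measurable” means in practice, here is a minimal sketch of a shared glossary that ties each business term to a verifiable condition; the terms and event names are hypothetical examples, not an industry standard.

```python
# Hypothetical shared business glossary: each term is bound to a
# measurable, testable condition that both trading partners agree on.
BUSINESS_GLOSSARY = {
    "shipped": "A carrier origin scan exists; printing a label does not qualify.",
    "delivered": "A proof-of-delivery scan or signature exists at the destination.",
}

def is_shipped(events: list[dict]) -> bool:
    """Test the 'shipped' definition: an actual carrier scan event
    must exist, matching the glossary entry above."""
    return any(e.get("type") == "carrier_origin_scan" for e in events)

# The label was printed, but there is no carrier scan yet -> not 'shipped'.
print(is_shipped([{"type": "label_created"}]))  # False
```

When both sides can run the same test against the same definition, the “shipped but not really shipped” ambiguity disappears.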
At the same time, there are emerging technologies that can help achieve better data interoperability that I will explain in the remainder of this article.
2. Emerging Tech to Make Data Integrations Understandable, Secure, And High Velocity.
Emerging technologies like artificial intelligence (AI), machine learning (ML), knowledge graphs, and digital identity tech are poised to revolutionize data interoperability. Indeed, these technologies can streamline integration setups, and accelerate data standards development through self-learning. What’s more, these new innovations can enhance shared understanding of business data via knowledge graphs, and increase trust between data-sharing systems and devices. By leveraging these innovations, businesses can achieve more efficient, secure, and high-velocity data integrations. Below, I’ll explain how these emerging technologies will make a difference.
a. AI / ML that Streamlines and Automates Data Integration.
First, AI and ML offer several opportunities in the area of traditional data integration tasks. For instance:
- Streamline Labor-Intensive Integration Tasks. Example use cases include data discovery, data mapping, data quality improvements, data transformation, and metadata management.
- Use AI Agents to Autonomously Implement Data Integrations. For instance, AI agents could use data dictionaries for guidance, interact with their environment, collate data, and use the data to perform self-determined tasks to meet predetermined goals.
- Leverage Computer Vision AI to Reduce Formatted Data Requirements and Increase Understandability. For example, Machine Learning (ML) can easily “translate” an image-based shipment status that includes a picture of a box on a carrier’s truck with a local time and GPS location. In contrast, traditional, proprietary status updates have complex text formats riddled with ambiguity. Click here for more insights on using Computer Vision AI for image-based tracking.
For a more detailed discussion on AI data integration opportunities, see AIMultiple’s article, Machine Learning in Data Integration: 8 Challenges & Use Cases.
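To make the data-mapping use case concrete, here is a minimal sketch that proposes source-to-target field mappings using simple name similarity; the schemas are hypothetical, and a production ML pipeline would also learn from column statistics, sample values, and embeddings rather than names alone.

```python
from difflib import SequenceMatcher

def propose_field_mappings(source_fields, target_fields, threshold=0.6):
    """Suggest source->target field mappings by name similarity.
    This stands in for what a trained matching model would do."""
    mappings = {}
    for src in source_fields:
        best, best_score = None, 0.0
        for tgt in target_fields:
            score = SequenceMatcher(None, src.lower(), tgt.lower()).ratio()
            if score > best_score:
                best, best_score = tgt, score
        if best_score >= threshold:
            mappings[src] = best
    return mappings

# Hypothetical schemas from two trading partners.
print(propose_field_mappings(
    ["ShipDate", "ConsigneeName", "TrackingNo"],
    ["ship_date", "consignee_name", "tracking_number"],
))
# -> {'ShipDate': 'ship_date', 'ConsigneeName': 'consignee_name',
#     'TrackingNo': 'tracking_number'}
```

Automating even this first-pass mapping removes hours of spreadsheet work from a typical integration setup, leaving the integrator to review exceptions instead of building the map by hand.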
b. Machine Learning (ML) for Accelerating Data Standard Development.
ML presents an opportunity to advance data standard development and increase data interoperability across industries. Specifically, ML technology excels at classifying data and predicting events. We could, therefore, apply ML to statistically analyze large datasets to identify new additions for data models and standards.
Essentially, this automated learning process would rapidly unearth new insights by examining vast amounts of data. This approach promises significant labor savings and a faster pace in maturing data standards and models. For a more detailed discussion on leveraging AI in advancing semantic interoperability, see IEC’s white paper, Semantic interoperability: challenges in the digital transformation age.
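As a toy sketch of this idea, the snippet below scans field lists from many partner datasets and flags recurring, non-standard attributes as candidates for the next revision of a data standard; the field names and frequency threshold are hypothetical.

```python
from collections import Counter

def candidate_standard_fields(datasets, standard_fields, min_frequency=0.5):
    """Flag non-standard attributes that recur across many partner
    datasets as candidates for inclusion in the data standard."""
    counts = Counter()
    for record_fields in datasets:
        for field in set(record_fields) - set(standard_fields):
            counts[field] += 1
    total = len(datasets)
    return [f for f, c in counts.items() if c / total >= min_frequency]

# Hypothetical field lists observed across partner data feeds.
feeds = [
    ["tracking_number", "ship_date", "co2_emissions_kg"],
    ["tracking_number", "co2_emissions_kg", "pallet_id"],
    ["ship_date", "co2_emissions_kg"],
]
print(candidate_standard_fields(feeds, ["tracking_number", "ship_date"]))
# -> ['co2_emissions_kg']
```

A real standards body would feed far larger corpora through far richer models, but the principle holds: let the data itself surface what the standard is missing.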
c. Knowledge Graph Interoperability Opportunities: Incorporate Data with Structured Relationships and Shared Meaning.
Knowledge graphs bring a transformative approach to data interoperability by incorporating data with meaning and defining relationships in a contextual framework. Indeed, these semantic networks enable a more nuanced and rich representation of data, transcending the limitations of traditional databases and specification documentation. By mapping out entities and the connections between them, knowledge graphs facilitate a more intuitive understanding of complex data models and standards. Knowledge Graph Tech brings the following capabilities to accelerate data interoperability:
Knowledge Graph Tech Capabilities
- Fact Verification. Enables transferred data to be more trustworthy by providing the ability to verify data quality.
- Superior Contextual Understanding. Knowledge graphs can put data in context via linking and semantic metadata.
- Fact Ranking. Moreover, knowledge graphs can increase understanding of data by ranking varying and conflicting information.
- Linking Related Entities. Knowledge graphs provide more depth, helping to discover additional facts as well as provide better, explainable insights.
- Contextually Linking Data From Disparate Data Sources. Lastly, knowledge graphs help to better discern meaning and correlations when data comes from different functional domains and data types.
So, Knowledge Graph Tech offers data interoperability many new capabilities. Moreover, this type of technology is very versatile: it can support anything from smart contracts and international ecommerce to the Internet of Things (IoT). Also, coupled with AI, knowledge graphs can greatly facilitate the exchange of meaningful data using autonomous AI agents. For more information, see my article, Knowledge Graph Tech: Enabling A More Discerning Perspective For AI.
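For a flavor of how this works in practice, here is a minimal sketch using the open-source rdflib Python package; the entities, predicates, and namespace are hypothetical. It records a shipment, its carrier, and its status as explicit relationships, then asks the graph a contextual question:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical logistics vocabulary; real deployments would reuse a
# shared ontology so partners attach the same meaning to each term.
EX = Namespace("http://example.org/logistics/")

g = Graph()
g.add((EX.shipment42, RDF.type, EX.Shipment))
g.add((EX.shipment42, EX.carriedBy, EX.acmeCarrier))
g.add((EX.shipment42, EX.hasStatus, Literal("picked_up")))
g.add((EX.acmeCarrier, EX.operatesMode, Literal("truckload")))

# Contextual query: which carriers move shipments that have
# actually been picked up?
results = g.query("""
    PREFIX ex: <http://example.org/logistics/>
    SELECT ?carrier WHERE {
        ?shipment ex:hasStatus "picked_up" ;
                  ex:carriedBy ?carrier .
    }
""")
for row in results:
    print(row.carrier)  # -> http://example.org/logistics/acmeCarrier
```

Because the relationships are explicit, the receiving system does not have to guess what a status value refers to; it can traverse the links and verify the context for itself.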
d. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
An increasingly critical component of data interoperability is having trust in who or what is the source of the data. For that reason, data owners are increasingly turning to digital identity tech. Specifically, digital identity tech instills trusted interoperability because it:
- Equips businesses with control over who and what systems can access their data.
- Enables compliance with regulations.
- Provides protection from bad actors such as hackers.
- Empowers businesses to securely leverage AI agents and IoT devices through digital credentials for data access.
For more insights on deploying digital identity technology, especially for the supply chain industry, see my article, Digital Identity In Logistics And What To Know – The Best Security, Scary Risks.
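As a minimal sketch of the mechanics behind this trust, the snippet below uses Ed25519 digital signatures (via the widely used Python cryptography package) so a receiver can verify who sent a status message; real deployments layer credential formats such as W3C Verifiable Credentials and key management on top of this. The message content is hypothetical.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Sender: holds the private key and shares only the public key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b'{"shipment": "42", "status": "picked_up"}'
signature = private_key.sign(message)

# Receiver: verifies the data really came from the key holder
# and was not altered in transit.
try:
    public_key.verify(signature, message)
    print("Signature valid: the data source is trusted.")
except InvalidSignature:
    print("Signature invalid: reject the data.")
```

The same pattern extends to AI agents and IoT devices: each holds a credential, and every exchange can be checked against it before the data is accepted.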
Conclusion.
Without a doubt, emerging technologies such as AI/ML, knowledge graphs, and digital identity tech can go a long way toward helping businesses achieve data interoperability. Specifically, these new technologies hold the promise of making data exchange more secure and more understandable while increasing business data velocity. However, we need knowledgeable business leaders and professionals to ensure this tech is implemented right and achieves their business objectives. For more insights on businesses achieving Logistics Interoperability, see my article, Achieving Logistics Interoperability: The Best Way to Breakthrough The Tangle Of Dumb Data Integrations.
Lastly, if you are in the supply chain industry and need help with moving forward in the area of data interoperability, please contact me to discuss next steps. I’m Randy McClure, a supply chain tech advisor. I have implemented 100s of tech pilot projects and innovative solutions across the supply chain as well as all transportation modes. I specialize in proof-of-concepts (POC) for emerging technologies and data-centric software development methods. To reach me, click here to access my contact form or you can find me on LinkedIn.
Also, for more from SC Tech Insights, see the latest articles on Interoperability and Information Technology.