In this digital era, business leaders are realizing the necessity of true data interoperability. The old ways of data silos locked within proprietary enterprise systems simply won’t cut it. Nor is cobbling these systems together with more custom-built data interfaces the answer. Indeed, traditional data integration is a slow-moving technical process that demands significant know-how. Previously, standards development organizations (SDOs) like GS1 have helped ease integration tasks such as inventory tracking with standardized barcodes. SDOs still offer hope for facilitating data interoperability today. However, integrating a data interface remains a slow and excruciating process. The question is: can emerging technology help us achieve true data interoperability?
In this article, I’ll first identify why data integration requires so much expertise to do right, why it is so slow, and, more importantly, why much of the data that does get transferred is not understood by the receiver. I’ll also feature emerging technologies that can help both SDOs and businesses achieve data interoperability. Indeed, this is tech that offers security, increases understandability, and enables high-velocity information exchange. It includes Artificial Intelligence (AI) / Machine Learning (ML), knowledge graphs, and digital identity tech.
- 1. Traditional Data Integration: Setups Take Too Long, Much Expertise Needed, and Data Transferred Not Understandable.
- 2. Emerging Tech to Make Data Integrations Understandable, Secure, And High Velocity.
- a. AI / ML Data Interoperability Opportunities: Streamline Setups and Self-Learning Data Models.
- b. Knowledge Graph Interoperability Opportunities: Incorporate Data With Meaning and Define Relationships.
- c. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
1. Traditional Data Integration: Setups Take Too Long, Much Expertise Needed, and Data Transferred Not Understandable.

To understand why traditional data integration efforts are a bottleneck on the path to seamless interoperability, we need to know why data interfaces take so long to set up. Further, when I refer to data integration, this includes many types of data access methods: APIs, file transfers, ELT, replication, and data streaming, to name a few. With complex systems and vast amounts of data, a data integration setup is not just time-consuming but also demands a significant level of technical expertise. So, let’s look at the challenging tasks of data integration and why many of these interfaces fail to transmit meaningful data.
a. First, All IT Integrators Have To Rely on Complicated Interface Spec Documentation.
Many business professionals may not realize that their IT integrator needs to rely on a complicated spec document to implement a new business-to-business (B2B) data interface. For the most part, these data specifications are communicated through online documentation such as a PDF document or a web page. This is true for both proprietary specifications and standards development organization (SDO) specifications. Indeed, there are two types of specs that IT integrators can use: proprietary or standard. See below for a description of each type.
Two Types of Data Interface Documents
- Proprietary Data Interface Documentation. For instance, a commercial Software as a Service (SaaS) platform will normally provide proprietary “data dictionaries” spec documents. These specifications usually include a description of each data element and example use cases.
- Standards Development Organization (SDO) Documentation. For example, GS1 General Specifications Standard provides universally accepted data specifications for barcodes. These SDO specs are then followed by integrators to help businesses identify products and track packages.
Now, with both types of data interface documents, the IT integrator will follow the same setup practices. There are some differences, however: SDO specs are not proprietary and are open for any organization to follow, though in some cases SDOs require the integrating organization to pay for the documentation. Also, when using a particular SDO standard, the lead integrator, such as a transportation carrier or manufacturer, may need to provide supplemental documentation for how it implements its particular instance of the data interface.
b. Further, Most Business-To-Business (B2B) Integrations Take Too Long, Need Much Expertise, and the Data Transferred Is Not Understandable.
As described above, data integration is complicated! Worse, it takes too long, and in many cases the data that is sent is not understood by the receiving organization. Here’s why:
- Scarce Highly-Skilled IT Integrators. Indeed, both organizations involved in the data transfer require a competent IT integrator.
- Integrators Need Business Data Know-How. What’s more, these integrators need both business data know-how and the technical skills to implement. Without this know-how, data may get transferred without any technical error, but it may not be understood by the receiving organization.
- Data Interface Spec Docs Lacking or Too Complicated. First, because there is a lack of skilled IT integrators, documentation is usually lacking and not well maintained. On the other hand, organizations with highly skilled integrators often produce documentation and implementation methodologies so complex they become unwieldy. As a result, spec docs are not understood, not followed, or result in meaningless data exchange.
- Source Systems Lack Quality Data. Worse, in many cases the data source does not have any meaningful information to begin with, so the data exchange becomes a waste of effort.
c. How Can We Make Data Integration Better?
There is not necessarily a quick fix, but there are actions on both the technical and business side that can immensely improve data interoperability. These include:
- Need for Common, Measurable Business Definitions. Indeed, how can businesses share information via data interfaces if they do not have a common agreement on critical business terms? For a more detailed discussion on this subject, see my article, Poor Operational Definitions Impede Supply Chain Tech Adoption: Now Is the Time For A Big Change.
- Emerging Tech Solutions. Now, there are emerging technologies that can both help SDOs and businesses achieve better data interoperability. Indeed, these technologies hold much promise to increase security, make data more understandable, and increase the velocity of adding new data interfaces. The remainder of this article will introduce you to these new technologies.
2. Emerging Tech to Make Data Integrations Understandable, Secure, And High Velocity.
Emerging technologies that can improve data interoperability include artificial intelligence (AI) / machine learning (ML), knowledge graphs, and digital identity tech. See below for details.
a. AI / ML Data Interoperability Opportunities: Streamline Setups and Self-Learning Data Models.
The advent of artificial intelligence (AI) and machine learning (ML) technologies holds great promise for data interoperability. Indeed, AI offers the promise of streamlined setups and intelligent, self-learning data models. Further, these technologies have the capability to parse through vast datasets, discern patterns, and help automate data integration processes. Thus, this type of tech offers the potential to significantly reduce the time and expertise required to set up new interfaces. Moreover, as these types of systems have the capability to learn and adapt, they become more proficient at anticipating needs and preempting issues.
1) AI / ML Data Integration Automation Support.
First, AI and ML offer several opportunities in the area of traditional data integration tasks. Basically, AI can automate and streamline many labor-intensive tasks. Example use cases include data discovery, mapping, data quality improvements, data transformation, and metadata management, to name a few.
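To make one of these use cases concrete, here is a minimal Python sketch of automated data mapping. It uses simple name similarity as a stand-in for a real ML model, and the field names are hypothetical examples, not drawn from any actual system or SDO standard:

```python
from difflib import SequenceMatcher

def propose_field_mappings(source_fields, target_fields, threshold=0.5):
    """Suggest source-to-target field mappings by name similarity.

    A real ML-based tool would also compare data types, value
    distributions, and metadata; string similarity is a simple proxy.
    """
    def score(a, b):
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    mappings = {}
    for src in source_fields:
        # Pick the most similar target field; keep it only if it clears
        # the confidence threshold.
        best = max(target_fields, key=lambda tgt: score(src, tgt))
        if score(src, best) >= threshold:
            mappings[src] = best
    return mappings

# Hypothetical fields from a shipper's system and a standards-style schema.
source = ["ship_to_addr", "po_number", "carrier_scac"]
target = ["ShipToAddress", "PurchaseOrderNumber", "CarrierSCAC"]
print(propose_field_mappings(source, target))
```

The point is not the matching heuristic itself but the workflow: a tool proposes mappings with confidence scores, and a human integrator reviews them instead of building every mapping by hand.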
Also in the near future, AI agents could autonomously follow SDO-based data standards and establish new system-to-system interfaces. Indeed, these AI software programs will have the ability to interact with their environment, collect data, and use the data to perform self-determined tasks to meet predetermined goals.
However, before we can fully leverage AI and autonomous data integration we need to mature our data standards. This means eliminating the ambiguity that exists in much of our data today. For a more detailed discussion on AI data integration opportunities, see AIMultiple’s article, Machine Learning in Data Integration: 8 Challenges & Use Cases.
2) Embedded Learning Capability to Rapidly Mature Data Models.
Embedding a learning capability into our data standards also presents an AI opportunity to enhance data interoperability. Specifically, ML technology excels in classifying and predicting events. We could, therefore, apply ML to statistically analyze large datasets to identify new additions for data models and standards.
Essentially, this automated learning process would rapidly unearth new insights by examining vast amounts of data. This approach promises significant labor savings and a faster pace in maturing data models. For a more detailed discussion on leveraging AI in advancing semantic interoperability, see IEC’s white paper, Semantic interoperability: challenges in the digital transformation age.
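As a simplified illustration of this learning loop, the Python sketch below flags frequently occurring values that fall outside a hypothetical standard's code list, surfacing them as candidate additions. A production implementation would apply ML classification over much larger datasets; frequency counting just illustrates the idea:

```python
from collections import Counter

# Hypothetical code list from a current data standard.
KNOWN_STATUS_CODES = {"PICKED_UP", "IN_TRANSIT", "DELIVERED"}

def candidate_codes(records, min_count=2):
    """Flag frequently seen values missing from the standard's code list."""
    unknown = Counter(
        r["status"] for r in records if r["status"] not in KNOWN_STATUS_CODES
    )
    # Only values seen repeatedly become candidates for the data model.
    return [code for code, n in unknown.items() if n >= min_count]

events = [
    {"status": "IN_TRANSIT"},
    {"status": "CUSTOMS_HOLD"},
    {"status": "CUSTOMS_HOLD"},
    {"status": "DELIVERED"},
]
print(candidate_codes(events))
```

Here the repeated, unrecognized status would be surfaced to the standards body for review, rather than waiting for a human to notice the gap.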
b. Knowledge Graph Interoperability Opportunities: Incorporate Data With Meaning and Define Relationships.
Knowledge graphs bring a transformative approach to data interoperability by incorporating data with meaning and defining relationships in a contextual framework. Indeed, these semantic networks enable a more nuanced and rich representation of data, transcending the limitations of traditional databases and specification documentation. By mapping out entities and the connections between them, knowledge graphs facilitate a more intuitive understanding of complex data models and standards.
Specifically, graph tech defines relationships and contexts between data elements, storing those relationships themselves in a graph as data. Indeed, knowledge graph tech is well suited for supporting and documenting data interoperability because it can support both standards development and implementation. For instance, with supply chains, knowledge graph tech can support anything from smart contracts to international ecommerce to the Internet of Things (IoT). Coupled with AI, knowledge graphs can greatly facilitate the exchange of meaningful data using autonomous AI agents.
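To illustrate the core idea, here is a minimal Python sketch of a triple store, the data structure underlying knowledge graphs. The entities and predicates are hypothetical supply chain examples, not from any real standard:

```python
# A knowledge graph stores facts as subject-predicate-object triples,
# so the relationships themselves are queryable data.
triples = {
    ("Shipment42", "carriedBy", "AcmeFreight"),
    ("Shipment42", "contains", "Pallet7"),
    ("Pallet7", "holdsProduct", "GTIN-00012345"),
    ("AcmeFreight", "isA", "MotorCarrier"),
}

def related(subject, predicate):
    """Follow one relationship type outward from an entity."""
    return {o for s, p, o in triples if s == subject and p == predicate}

def describe(subject):
    """List every relationship an entity participates in."""
    return {(p, o) for s, p, o in triples if s == subject}

print(related("Shipment42", "contains"))
print(describe("AcmeFreight"))
```

Because meaning lives in the graph rather than in a PDF spec, both humans and software can traverse these relationships directly, which is what makes the pairing with AI agents so promising.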
For more information, see these knowledge graph tech references:
- Kevin Doubleday’s article, Semantic Interoperability: Exchanging Data with Meaning
- My article, Knowledge Graph Tech: Enabling A More Discerning Perspective For AI.
- Pierre Levy’s blog posting, Semantic Interoperability and the Future of AI for an example of how we can implement knowledge graph technologies to further advance semantic interoperability.
c. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
An increasingly critical component of data interoperability is having trust in who or what is the source of data. For that reason, data owners are increasingly turning to digital identity tech: they need to know who and what is accessing their data. At the same time, trusted interoperability is needed to comply with regulations. Further, there is the need to protect organizations from bad actors such as hackers.
Moreover, in this age of digitalization, there is an increasing need for trusted data interoperability. This is because businesses now need to share and receive data with more and more systems. What’s more, non-traditional systems such as AI agents and Internet of Things (IoT) devices are beginning to need digital credentials for sharing data. Thus, there is an increasing need to leverage digital identity technologies. To gain a better understanding of the importance of digital identity tech, below is a definition of what a digital identity is and what it encompasses.
Definition of Digital Identity
“A digital identity is an online presence that represents and acts on behalf of an external actor in an ecosystem. An identity could belong to a legal entity, a financial intermediary, or a physical object, for example. Ideally, a digital identity is verified by a trust anchor, or something confirming the legitimacy of an actor, so that those interacting with that actor’s digital identity have confidence the actor is who and what it claims to be.”
World Economic Forum
Indeed, digital identity technology and methodologies, more than ever, are critical to achieve data interoperability. For a detailed discussion of digital identity technology, especially for the supply chain industry, see my article, Digital Identity In Logistics And What To Know – The Best Security, Scary Risks.
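As a simplified illustration of how a receiver gains confidence in a data source, the Python sketch below signs and verifies a credential using a shared secret. This is only a sketch: real digital identity systems rely on public-key cryptography, such as X.509 certificates or W3C Verifiable Credentials, and the key and claims here are hypothetical:

```python
import hashlib
import hmac

# Hypothetical shared secret between a trust anchor and a data receiver.
# Production systems would use asymmetric keys instead of a shared secret.
TRUST_ANCHOR_KEY = b"demo-secret-for-illustration"

def sign_credential(claims: str) -> str:
    """Trust anchor attests to an actor's claims by signing them."""
    return hmac.new(TRUST_ANCHOR_KEY, claims.encode(), hashlib.sha256).hexdigest()

def verify_credential(claims: str, signature: str) -> bool:
    """Receiver checks the signature before trusting the data source."""
    expected = sign_credential(claims)
    # compare_digest avoids timing side channels during comparison.
    return hmac.compare_digest(expected, signature)

claims = "actor=AcmeFreight;role=MotorCarrier"
sig = sign_credential(claims)
print(verify_credential(claims, sig))            # genuine credential accepted
print(verify_credential("actor=Imposter", sig))  # tampered claims rejected
```

The mechanics differ across identity systems, but the pattern is the same: a trust anchor vouches for an actor, and the receiver verifies that attestation before exchanging data.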
Conclusion.
So, emerging technologies such as AI / ML, knowledge graphs, and digital identity tech can go a long way toward helping businesses achieve data interoperability. Specifically, these new technologies hold the promise to make data exchange more secure and more understandable, and to increase business data velocity. However, we need knowledgeable business leaders and professionals to ensure this tech is implemented right to achieve their business objectives. For more discussions on businesses achieving Logistics Interoperability, see my article, Achieving Logistics Interoperability: The Best Way to Breakthrough The Tangle Of Dumb Data Integrations.
Lastly, if you are in the supply chain industry and need help with moving forward in the area of data interoperability, please contact me to discuss next steps. I’m Randy McClure, a supply chain tech advisor. I have implemented 100s of tech pilot projects and innovative solutions across the supply chain as well as all transportation modes. I specialize in proof-of-concepts (POC) for emerging technologies and data-centric software development methods. To reach me, click here to access my contact form or you can find me on LinkedIn.
Also, for more from SC Tech Insights, see the latest articles on Interoperability and Information Technology.