In this digital era, business leaders are realizing the necessity of true data interoperability. The old ways of data silos locked within proprietary enterprise systems simply won’t cut it. Further, cobbling together these systems with more custom-built data interfaces is not the answer. Indeed, traditional data integration is a slow-moving technical process that demands significant know-how. Previously, standards development organizations (SDOs) like GS1 have helped ease integration tasks such as inventory tracking with standardized barcodes. SDOs still offer hope today for facilitating data interoperability. However, integrating data interfaces is still a slow and excruciating process. The question is: can emerging technology help us achieve true data interoperability?
In this article, I’ll first highlight why data integration requires so much expertise to do right and why it takes so long. I’ll also highlight emerging technologies that can help both SDOs and businesses achieve data interoperability. Indeed, this is tech that offers security, increases understandability, and enables high-velocity information exchange. It includes Artificial Intelligence (AI) / Machine Learning (ML), knowledge graphs, and digital identity tech.
- 1. Traditional Data Integration: Why Setups Take Too Long and Demand Too Much Expertise.
- 2. AI / ML Data Interoperability Opportunities: Streamlined Setups and Self-Learning Data Models.
- 3. Knowledge Graph Interoperability Opportunities: Incorporate Data With Meaning and Define Relationships.
- 4. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
1. Traditional Data Integration: Why Setups Take Too Long and Demand Too Much Expertise.

Traditional data integration often represents a bottleneck in the path to seamless interoperability. With complex systems and vast amounts of data, a data integration setup is not just time-consuming but also demands a significant level of technical expertise. So, let’s look at the challenging task of data integration.
a. First, All IT Integrators Have To Reference Interface Spec Documentation.
To implement a new business-to-business (B2B) data interface, an integrator needs a spec document. For the most part, these data specifications are communicated through online documentation such as a PDF document or a web page. This is true for both proprietary data specifications and standards development organization (SDO) specifications. Specifically, there are two types of specs that IT integrators can use: proprietary or standard.
1) Proprietary Data Interface Documentation.
For instance, a commercial Software as a Service (SaaS) platform will normally provide proprietary “data dictionary” spec documents. These cover both the platform’s database data elements and its associated application programming interfaces (APIs) for data exchange. These specifications usually include a description of each data element and example use cases.
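To make the idea concrete, here is a minimal sketch of what a data dictionary entry can look like, expressed as a Python structure; the element names and attributes are hypothetical, not taken from any particular vendor’s spec.

```python
# A hypothetical data dictionary entry, modeled on the kind of
# element descriptions a SaaS platform's spec document provides.
data_dictionary = {
    "shipDate": {
        "type": "string",
        "format": "YYYY-MM-DD",
        "description": "Date the shipment left the origin facility.",
        "required": True,
        "example": "2024-03-15",
    },
    "trackingNumber": {
        "type": "string",
        "description": "Carrier-assigned identifier used to track a package.",
        "required": True,
        "example": "1Z999AA10123456784",
    },
}
```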
2) Standards Development Organization (SDO) Documentation.
With SDO documentation, an IT integrator follows the same practice, except the SDO specs are not proprietary and are open for any organization to follow. For example, the GS1 General Specifications Standard provides universally accepted data specifications for barcodes. Integrators then follow these SDO specs to help businesses identify products and track packages. Indeed, these barcode data specifications are now used within most supply chains.
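For a taste of what these barcode specs define, below is a minimal sketch of the GS1 mod-10 check digit calculation used for GTINs (the numbers encoded in retail barcodes); it follows the published algorithm, though real implementations also validate length and format.

```python
def gtin_check_digit(digits: str) -> int:
    """Compute the GS1 mod-10 check digit for a GTIN body
    (the GTIN without its final check digit)."""
    total = 0
    # Working right to left, weight digits alternately 3, 1, 3, 1, ...
    for i, ch in enumerate(reversed(digits)):
        total += int(ch) * (3 if i % 2 == 0 else 1)
    return (10 - total % 10) % 10

# Example: the first 12 digits of the GTIN-13 "4006381333931".
assert gtin_check_digit("400638133393") == 1
```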
In some cases, SDOs require the integrating organization to pay for this documentation. Additionally, integrators may sometimes need supplemental documentation covering how a specific organization implements a particular SDO standard.
b. Further, Most Business-To-Business (B2B) Integrations Require Too Much Expertise and Time to Implement.
The challenge with the way we approach B2B integration today is that both organizations require highly skilled IT integrators. These integrators need both business data know-how and the technical skills to implement the interface. Plus, they need detailed interface specification documents. The problem is that most organizations do not have much depth in data integration expertise. Further, the interface spec documents are in many cases lacking or not followed during implementation. Worse, in many cases the data source and its system interface do not provide meaningful information to the intended receivers of the data.
c. How Can We Make Data Integration Better?
First, it is time to move away from the tangled web of proprietary data interfaces. Businesses and organizations need to work more closely with SDOs such as ISO, GS1, W3C, and ASTM International. This includes participation by business leaders. In fact, organizations that exchange data need to ensure that they and their systems share a common understanding of terms and definitions with their business partners; otherwise, technology cannot help. For a more detailed discussion, see my article, Poor Operational Definitions Impede Supply Chain Tech Adoption: Now Is the Time For A Big Change.
Now, there are emerging technologies that can help both SDOs and businesses achieve data interoperability. Indeed, these technologies hold much promise to increase security, make data more understandable, and increase the velocity of adding new data interfaces. These technologies include artificial intelligence (AI) / machine learning (ML), knowledge graphs, and digital identity tech. Continue reading for more information and references about these new technologies that can help us better achieve data interoperability.
2. AI / ML Data Interoperability Opportunities: Streamlined Setups and Self-Learning Data Models.
The advent of AI and ML technologies holds great promise for data interoperability. Indeed, AI offers the promise of streamlined setups and intelligent, self-learning data models. Further, these technologies have the capability to parse through vast datasets, discern patterns, and help automate data integration processes. Thus, this type of tech offers the potential to significantly reduce the time and expertise required to set up new interfaces. Moreover, as these types of systems have the capability to learn and adapt, they become more proficient at predicting needs and preempting issues.
a. AI / ML Data Integration Automation Support.
First, AI and ML offer several opportunities in the area of traditional data integration tasks. Basically, AI can automate and streamline many labor-intensive tasks. Example use cases include data discovery, data mapping, data quality improvement, data transformation, and metadata management.
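As a toy illustration of automated data mapping, the sketch below uses plain string similarity to propose matches between a source schema and a target standard. Real ML-based tools learn from data profiles and past mappings rather than names alone, and all field names here are made up.

```python
import difflib

# Hypothetical source fields and hypothetical target (standard) fields.
source_fields = ["ship_dt", "trk_no", "dest_zip"]
target_fields = ["shipDate", "trackingNumber", "destinationPostalCode"]

def propose_mappings(source, target, cutoff=0.3):
    """Suggest a target field for each source field by name similarity,
    a simple stand-in for the learned matchers real tools use."""
    normalized = [t.lower() for t in target]
    mappings = {}
    for s in source:
        best = difflib.get_close_matches(
            s.lower().replace("_", ""), normalized, n=1, cutoff=cutoff)
        if best:
            mappings[s] = target[normalized.index(best[0])]
    return mappings

print(propose_mappings(source_fields, target_fields))
# {'ship_dt': 'shipDate', 'trk_no': 'trackingNumber',
#  'dest_zip': 'destinationPostalCode'}
```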
Also in the near future, AI agents could autonomously follow SDO-based data standards and establish new system-to-system interfaces. Indeed, these AI software programs will have the ability to interact with their environment, collect data, and use the data to perform self-determined tasks to meet predetermined goals.
However, before we can fully leverage AI and autonomous data integration, we need to mature our data standards and eliminate the ambiguity that exists in much of our data today. For a more detailed discussion of AI data integration opportunities, see AIMultiple’s article, Machine Learning in Data Integration: 8 Challenges & Use Cases.
b. Embedded Learning Capability to Rapidly Mature Data Models.
Embedding a learning capability into our data standards also presents an AI opportunity to enhance data interoperability. Specifically, Machine Learning (ML) technology excels in classifying and predicting events. We could, therefore, apply ML to statistically analyze large datasets to identify new additions for data models and standards.
Essentially, this automated learning process would rapidly unearth new insights by examining vast amounts of data. This approach promises significant labor savings and a faster pace in maturing data models. For a more detailed discussion on leveraging AI in advancing semantic interoperability, see IEC’s white paper, Semantic interoperability: challenges in the digital transformation age.
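As a minimal sketch of this idea under simplified assumptions, the snippet below counts non-standard fields that keep appearing across partner datasets and flags recurring ones as candidates for the data model; a real ML pipeline would layer classification and clustering on top of this kind of signal. The field names are invented for illustration.

```python
from collections import Counter

# Hypothetical non-standard fields observed across partner datasets.
observed_custom_fields = [
    "carbonFootprintKg", "palletId", "carbonFootprintKg",
    "tempExcursionFlag", "carbonFootprintKg", "palletId",
]

# Fields that recur across partners are candidates for standardization.
counts = Counter(observed_custom_fields)
candidates = [field for field, n in counts.most_common() if n >= 2]
print(candidates)  # ['carbonFootprintKg', 'palletId']
```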
3. Knowledge Graph Interoperability Opportunities: Incorporate Data With Meaning and Define Relationships.
Knowledge graphs bring a transformative approach to data interoperability by incorporating data with meaning and defining relationships in a contextual framework. Indeed, these semantic networks enable a more nuanced and rich representation of data, transcending the limitations of traditional databases and specification documentation. By mapping out entities and the connections between them, knowledge graphs facilitate a more intuitive understanding of complex data models and standards.
Specifically, graph tech defines relationships and contexts between data elements, storing those relationships themselves in the graph as data. Indeed, knowledge graph tech is well suited for supporting and documenting data interoperability because it can support both standards development and implementation. For instance, with supply chains, knowledge graph tech can support anything from smart contracts to international ecommerce to the Internet of Things (IoT). Coupled with AI, knowledge graphs can greatly facilitate the exchange of meaningful data using autonomous AI agents.
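To show what “storing relationships as data” looks like in practice, here is a minimal sketch using the open-source Python rdflib library; the supply chain vocabulary and identifiers are hypothetical.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical vocabulary for a supply chain knowledge graph.
SC = Namespace("https://example.com/supplychain/")

g = Graph()
g.add((SC.product123, RDF.type, SC.Product))
g.add((SC.product123, SC.gtin, Literal("04006381333931")))
g.add((SC.product123, SC.suppliedBy, SC.acmeCorp))  # the relationship is data
g.add((SC.acmeCorp, RDF.type, SC.Supplier))

# Because relationships are stored as data, we can query them directly.
for _, predicate, obj in g.triples((SC.product123, None, None)):
    print(predicate, obj)
```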
For more information on knowledge graph technology, see Kevin Doubleday’s article, Semantic Interoperability: Exchanging Data with Meaning, and my article, Knowledge Graph Tech: Enabling A More Discerning Perspective For AI. Also, see Pierre Levy’s blog posting, Semantic Interoperability and the Future of AI, for an example of how we can implement knowledge graph technologies to further advance semantic interoperability.
4. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
An increasingly critical component of data interoperability is having trust in who or what is the source of the data. Further, for data owners, digital identity tech is increasingly key to knowing who and what is accessing their data. Of course, one reason for this need for trusted interoperability is to comply with regulations and to protect organizations from bad actors such as hackers.
However, another reason for trusted interoperability is the increasing need for businesses to share and receive data across more and more systems. What’s more, non-traditional systems such as AI agents and Internet of Things (IoT) devices are beginning to share and consume astronomical amounts of data. Thus, there is an increasing need to leverage digital identity technologies. To better understand the importance of digital identity tech, below is a comprehensive definition of what a digital identity is and what it encompasses.
Definition of Digital Identity
“A digital identity is an online presence that represents and acts on behalf of an external actor in an ecosystem. An identity could belong to a legal entity, a financial intermediary, or a physical object, for example. Ideally, a digital identity is verified by a trust anchor, or something confirming the legitimacy of an actor, so that those interacting with that actor’s digital identity have confidence the actor is who and what it claims to be.”
World Economic Forum
Now, in the early days of data transfer, digital identity was not much of an issue. First, there were not many connections to deal with. Second, IT staff could easily handle new connections by just issuing a password to authorized users. Also, for a data integration project, IT could just work with the other organization’s IT staff to set up the initial data connection. That was it! Now, in our increasingly digital world, keeping our data networks secure involves innumerable challenges. These include the astronomical number of connections needed and an increase in bad actors such as hackers. Further, there are more data security compliance requirements, increasingly higher levels of automation, and the dynamic need for new data connections.
Thus, digital identity technology and methodologies, more than ever, are critical to achieving data interoperability. For a detailed discussion of digital identity technology, especially for the supply chain industry, see my article, Digital Identity In Logistics And What To Know – The Best Security, Scary Risks.
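As one concrete pattern, here is a minimal sketch that uses the open-source PyJWT library to verify that inbound data was actually signed by a known partner before trusting it; the key handling, claim names, and identifiers are simplified assumptions, not a production design.

```python
import jwt  # PyJWT
from jwt.exceptions import InvalidTokenError

# Hypothetical: the partner's public key, obtained out of band from a
# trust anchor (e.g., a key registry or a verifiable credential).
with open("partner_public_key.pem") as f:
    PARTNER_PUBLIC_KEY = f.read()

def accept_shipment_update(token: str) -> dict | None:
    """Verify the sender's signature and identity claims
    before trusting the data payload the token carries."""
    try:
        claims = jwt.decode(
            token,
            PARTNER_PUBLIC_KEY,
            algorithms=["RS256"],            # reject unsigned tokens
            audience="urn:example:our-api",  # token is meant for us
            issuer="urn:example:partner",    # and from the expected partner
        )
    except InvalidTokenError:
        return None  # identity not verified: do not ingest the data
    return claims
```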
Conclusion.
So, emerging technologies such as AI / ML, knowledge graphs, and digital identity tech can go a long way toward helping businesses achieve data interoperability. Specifically, these new technologies hold the promise to make data exchange more secure and more understandable, and to increase business data velocity. However, we need knowledgeable business leaders and professionals to ensure this tech is implemented right and achieves their business objectives. For more discussion of businesses achieving Logistics Interoperability, see my article, Achieving Logistics Interoperability: The Best Way to Breakthrough The Tangle Of Dumb Data Integrations.

Achieving Logistics Interoperability: The Best Way to Breakthrough The Tangle Of Dumb Data Integrations.
Nowadays, logistics organizations know that their data holds the key to incredible insights and a competitive edge. However, many struggle to harness this power due to their data being scattered across many systems, linked by a labyrinth of data integrations. Worse, even if this data gets transmitted to another system, the meaning of the data often gets lost in translation. Indeed, there is a better data integration approach, and it is called semantic interoperability.
Click here, where I’ll explain what semantic interoperability is and why it is the best approach for businesses to exchange meaningful data across their supply chains. Further, I’ll detail the benefits of semantic interoperability and how other industries and disciplines are advancing it. Also, I offer four recommendations on how logistics and standards organizations can best move forward with achieving semantic interoperability. This includes the importance of knowledgeable business leaders leading these interoperability efforts.
For more from SC Tech Insights, see the latest articles on Interoperability and Information Technology.
Greetings! As an independent supply chain tech advisor with 30+ years of hands-on experience, I take great pleasure in providing actionable insights and solutions to logistics leaders. My focus is to drive transformation within the logistics industry by leveraging emerging LogTech, applying data-centric solutions, and increasing interoperability within supply chains. I have a wide range of experience, including successfully leading the development of 100s of innovative software solutions across supply chains and delivering business intelligence (BI) solutions to 1,000s of shippers. Click here for more info.