
In the complex, digital world of business, it's clear that true data interoperability between our systems, devices, and, now, AI agents is a necessity. Traditional data integration is a slog, requiring scarce and expensive technical expertise. Without a doubt, standards development organizations (SDOs) like GS1 have helped ease integration tasks such as inventory tracking through standardized barcodes. However, data integration remains a painfully slow, error-prone process. Can emerging technologies be the catalyst for change?
In this article, I'll first identify the reasons why data integration today is so complex, slow, and full of drudgery. Worse still, much of the data that does get transferred is not usable, much less actionable, by the receiving system. The good news is that there are new, emerging technologies that can help both businesses and SDOs achieve true data interoperability. These include AI/ML, knowledge graphs, and digital identity tech, which promise security, clarity, and high-velocity information exchange. The future of data interoperability is about to get a lot more interesting.
- 1. Traditional Data Integration: Setup Takes Too Long, Much Expertise Needed, and Transferred Data Not Understandable.
- 2. Emerging Tech to Make Data Integrations Understandable, Secure, and High Velocity.
- a. AI / ML that Streamlines and Automates Data Integration.
- b. Machine Learning (ML) for Accelerating Data Standard Development.
- c. Knowledge Graph Interoperability Opportunities: Incorporate Data with Structured Relationships and Shared Meaning.
- d. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
1. Traditional Data Integration: Setup Takes Too Long, Much Expertise Needed, and Transferred Data Not Understandable.
Traditional data integration practices are a bottleneck to seamless interoperability. First, they are time-consuming to set up, and second, they require significant technical expertise. One reason for this predicament is the combination of complex systems, large data sets, and the many types of data access methods, including APIs, file transfers, ELT, replication, and data streaming, to name a few. At the same time, many data integration efforts fall short because organizational leaders fail to specify their objectives and to define their business terms. As a result, in many cases, the data transferred is not understandable.
So, let’s look at these data integration challenges and what businesses and new, emerging technology can do to overcome these obstacles.
a. IT Data Integrators Require Detailed Interface Documentation.
First, IT requires very detailed spec documents to implement a new business-to-business (B2B) data interface. For the most part, the lead IT integration team will provide these data specifications to the IT counterparts that are part of the integration effort. Normally, these specs are online documentation such as a PDF document or a web page. Also, these detailed specs are needed for both proprietary data specifications and standards development organization (SDO) specifications. See below for a description of these specification types.
Two Types of Data Interface Documents
- Proprietary Data Interface Documentation. For instance, a commercial Software as a Service (SaaS) platform will normally provide proprietary "data dictionary" spec documents. These specifications usually include a description of each data element and example use cases.
- Standards Development Organization (SDO) Documentation. For example, the GS1 General Specifications Standard provides universally accepted data specifications for barcodes. Integrators then follow these SDO specs to help businesses identify products and track packages.
Now, with both types of data interface documents, the IT integrator will follow the same setup practices. There are some differences, however: SDO specs are not proprietary and are open for any organization to follow, although in some cases the SDO requires the integrating organization to pay for the documentation. Also, when using a particular SDO standard, the lead integrator, such as a transportation carrier or manufacturer, may need to provide supplemental documentation describing how it implements its particular instance of the data interface.
b. Most System Integrations Lack Quality Data and Business Definitions.
One of the biggest stumbling blocks for organizations exchanging data is that the data is of low quality. Besides being out-of-date and incomplete, the transferred data is often ambiguous and not understandable by the receiving organization. This is fundamentally a business problem. Here's why:
- Source Systems Lack Quality Data. In many cases, the owners of the data, the business owners, are neglectful of their data. They leave it up to their IT staff, who are not business experts, to define data elements and determine which data elements to keep up-to-date.
- Integrators Need Business Data Know-How. What's more, data integration teams need both business data know-how and the technical skills to implement the interface. Without this know-how, data may get transferred without any technical error, but it may not be understood by the receiving organization.
c. Much IT Expertise Needed to Implement Business-To-Business Integrations.
As described above, data integration is complicated! Worse, it takes too long, and in many cases the data that is sent is not understood by the receiving organization. Here's why:
- Scarce, Highly Skilled IT Integrators. Indeed, both organizations involved in the data transfer require a competent IT integrator, and this technical expertise is scarce and expensive.
- Data Interface Spec Docs Lacking or Too Complicated. First, because skilled IT integrators are scarce, documentation is usually lacking and not well maintained. At the other extreme, an organization with highly skilled integrators may produce documentation and implementation methodologies so complex that they are unwieldy. As a result, spec docs are not understood, not followed, or result in meaningless data exchanges.
d. How Can We Make Data Integration Better?
There is not necessarily a quick fix, but there are actions on both the technical and business side that can immensely improve data interoperability. These include:
- Need for Common, Measurable Business Definitions. Indeed, how can businesses share information via data interfaces if they do not have a common agreement on critical business terms? For a more detailed discussion on this subject, see my article, Poor Operational Definitions Impede Supply Chain Tech Adoption: Now Is the Time For A Big Change.
- Emerging Tech Solutions. Now, there are emerging technologies that can help both standards development organizations (SDOs) and businesses achieve better data interoperability. Indeed, these technologies hold much promise to increase security, make data more understandable, and increase the velocity of adding new data interfaces.
The remainder of this article will introduce you to these new, emerging technologies and methodologies that can help us achieve better data interoperability.
2. Emerging Tech to Make Data Integrations Understandable, Secure, and High Velocity.
Emerging technologies that can improve data interoperability include artificial intelligence (AI) / machine learning (ML), knowledge graphs, and digital identity tech. These technologies hold great promise for providing revolutionary new integration capabilities, including streamlining integration setups and accelerating data standard development with self-learning capabilities. Also, emerging tech can use knowledge graph structures to increase shared understanding of business data. Lastly, new digital identity tech can increase trust between systems, agents, and IoT devices that share data. See below for details.
a. AI / ML that Streamlines and Automates Data Integration.
First, AI and ML offer several opportunities in the area of traditional data integration tasks. Basically, AI can automate and streamline many labor-intensive tasks. Example use cases include data discovery, mapping, data quality improvement, data transformation, and metadata management. The sketch below illustrates one of these use cases.
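For instance, here is a minimal sketch of automated field mapping, the kind of labor-intensive task ML can streamline. The schemas, field names, and the 0.6 threshold are illustrative assumptions, not any particular product's API; it uses only the Python standard library.

```python
# Minimal sketch: propose field mappings between two partner schemas.
# Field names and the 0.6 threshold are illustrative assumptions.
from difflib import SequenceMatcher

source_fields = ["ship_to_addr", "po_number", "qty_ordered", "est_delivery_dt"]
target_fields = ["shipping_address", "purchase_order_no", "order_quantity",
                 "estimated_delivery_date"]

def similarity(a: str, b: str) -> float:
    """Score how alike two field names are, ignoring case and underscores."""
    norm = lambda s: s.lower().replace("_", " ")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

# Propose the best-scoring target field for each source field.
for src in source_fields:
    best = max(target_fields, key=lambda tgt: similarity(src, tgt))
    score = similarity(src, best)
    status = "auto-map" if score >= 0.6 else "flag for human review"
    print(f"{src:16} -> {best:24} score={score:.2f} [{status}]")
```

In practice, teams would swap the simple string matcher for trained models or embeddings, but the workflow is the same: the machine proposes the mappings, and humans review only the ambiguous ones.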
Also, in the near future, AI agents could autonomously follow SDO-based data standards and establish new system-to-system interfaces. Indeed, these AI software programs would have the ability to interact with their environment, collect data, and use that data to perform self-determined tasks that meet predetermined goals.
However, before we can fully leverage AI and autonomous data integration, we need to mature our data standards. This means eliminating the ambiguity that exists in much of our data today. For a more detailed discussion on AI data integration opportunities, see AIMultiple's article, Machine Learning in Data Integration: 8 Challenges & Use Cases.
b. Machine Learning (ML) for Accelerating Data Standard Development.
ML presents an opportunity to advance data standard development and increase data interoperability across industries. Specifically, ML technology excels at classification and prediction. We could, therefore, apply ML to statistically analyze large datasets to identify new additions for data models and standards.
Essentially, this automated learning process would rapidly unearth new insights by examining vast amounts of data. This approach promises significant labor savings and a faster pace in maturing data standards and models. For a more detailed discussion on leveraging AI in advancing semantic interoperability, see IEC’s white paper, Semantic interoperability: challenges in the digital transformation age.
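As a toy illustration of this idea, the sketch below mines the "custom" fields that partners have bolted onto their data feeds to surface candidates for formal standardization. The partners and field names are invented assumptions; a real effort would analyze thousands of feeds with ML clustering rather than exact name counts.

```python
# Toy sketch: mine partner data feeds for standardization candidates.
# Partners and fields are invented; a real effort would use ML
# clustering to group variants like "temp_celsius" and "temperature_c".
from collections import Counter

# Non-standard ("custom") fields each partner added on its own.
partner_custom_fields = {
    "carrier_a": ["temp_celsius", "pallet_id", "co2_kg"],
    "carrier_b": ["temperature_c", "pallet_id", "driver_id"],
    "shipper_c": ["pallet_id", "co2_kg", "dock_door"],
}

# Count how many partners independently needed each field.
counts = Counter(f for fields in partner_custom_fields.values() for f in fields)

# Fields that several partners invented separately are strong
# candidates for formal addition to the shared data standard.
for field, n in counts.most_common():
    if n >= 2:
        print(f"candidate for the standard: {field} (seen in {n} feeds)")
```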
c. Knowledge Graph Interoperability Opportunities: Incorporate Data with Structured Relationships and Shared Meaning.
Knowledge graphs bring a transformative approach to data interoperability by incorporating data with meaning and defining relationships in a contextual framework. Indeed, these semantic networks enable a more nuanced and rich representation of data, transcending the limitations of traditional databases and specification documentation. By mapping out entities and the connections between them, knowledge graphs facilitate a more intuitive understanding of complex data models and standards.
Specifically, graph tech defines relationships and contexts between data elements, storing those relationships themselves in the graph as data. Indeed, knowledge graph tech is well suited to supporting and documenting data interoperability because it can support both standards development and implementation. For instance, with supply chains, knowledge graph tech can support anything from smart contracts to international ecommerce to the Internet of Things (IoT). Coupled with AI, knowledge graphs can greatly facilitate the exchange of meaningful data using autonomous AI agents. A small example follows.
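As a small, hypothetical illustration, the sketch below builds a tiny supply chain knowledge graph with the open-source rdflib Python library and queries it with SPARQL. The namespace and entities are invented; a real graph would use shared vocabularies such as GS1's web vocabulary.

```python
# Minimal sketch of a supply chain knowledge graph, using the
# open-source rdflib library (pip install rdflib). The namespace and
# entities are invented for illustration.
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/supplychain#")
g = Graph()
g.bind("ex", EX)

# Entities and their relationships are stored in the graph as data,
# not buried in a spec document.
g.add((EX.pallet42, RDF.type, EX.Pallet))
g.add((EX.pallet42, EX.contains, EX.sku123))
g.add((EX.pallet42, EX.locatedAt, EX.warehouseChicago))
g.add((EX.sku123, EX.manufacturedBy, EX.acmeCorp))

# Any partner that shares the vocabulary can query for shared meaning.
results = g.query("""
    PREFIX ex: <http://example.org/supplychain#>
    SELECT ?item ?maker WHERE {
        ?pallet ex:contains ?item .
        ?item ex:manufacturedBy ?maker .
    }
""")
for item, maker in results:
    print(f"{item} was made by {maker}")
```

The design point is that the relationships themselves carry the meaning, so a receiving system can ask questions of the data instead of reverse-engineering a PDF spec.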
For more information, see these knowledge graph tech references:
- Kevin Doubleday’s article, Semantic Interoperability: Exchanging Data with Meaning
- My article, Knowledge Graph Tech: Enabling A More Discerning Perspective For AI.
- Pierre Levy's blog posting, Semantic Interoperability and the Future of AI, for an example of how knowledge graph technologies can be implemented to further advance semantic interoperability.
d. Trusted Interoperability: Leveraging Digital Identity Tech to Achieve Confidence in the Data Exchanged by Partners and Entities.
An increasingly critical component of data interoperability is having trust in who or what is the source of data. For that reason, data owners are increasingly turning to digital identity tech: they need to know who and what is accessing their data. Trusted interoperability is also needed to comply with regulations and to protect organizations from bad actors such as hackers.
Moreover, in this age of digitalization, there is an increasing need for trusted data interoperability, because businesses now need to share and receive data with more and more systems. What's more, non-traditional systems such as AI agents and Internet of Things (IoT) devices are beginning to need digital credentials for sharing data, as in the sketch below. Thus, there is an increasing need to leverage digital identity technologies. For a detailed discussion of digital identity technology, especially for the supply chain industry, see my article, Digital Identity In Logistics And What To Know – The Best Security, Scary Risks.
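As a minimal illustration of credential-based trust, the sketch below issues and verifies a signed access token using the open-source PyJWT library. The shared secret and claims are assumptions for this sketch; production deployments would typically use asymmetric keys or W3C Verifiable Credentials rather than a shared secret.

```python
# Minimal sketch: issue and verify a signed credential with PyJWT
# (pip install pyjwt). The secret and claims are assumptions for this
# sketch; real systems would favor asymmetric keys or W3C Verifiable
# Credentials over a shared HS256 secret.
import jwt

SECRET = "demo-secret-do-not-use-in-production"  # illustrative only

# The data owner issues a signed credential to a partner system,
# AI agent, or IoT device.
token = jwt.encode(
    {"sub": "agent-007", "org": "acme-logistics", "scope": "shipments:read"},
    SECRET,
    algorithm="HS256",
)

# Before releasing data, the receiving API verifies who is asking.
try:
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    print(f"trusted caller: {claims['sub']} from {claims['org']}")
except jwt.InvalidTokenError:
    print("rejecting request: credential could not be verified")
```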
Conclusion.
So, emerging technologies such as AI / ML, knowledge graphs, and digital identity tech can go a long way toward helping businesses achieve data interoperability. Specifically, these new technologies hold the promise to make data exchange more secure and more understandable, and to increase business data velocity. However, we need knowledgeable business leaders and professionals to ensure this tech is implemented right to achieve their business objectives. For more discussion on businesses achieving Logistics Interoperability, see my article, Achieving Logistics Interoperability: The Best Way to Breakthrough The Tangle Of Dumb Data Integrations.
Lastly, if you are in the supply chain industry and need help moving forward in the area of data interoperability, please contact me to discuss next steps. I'm Randy McClure, a supply chain tech advisor. I have implemented 100s of tech pilot projects and innovative solutions across the supply chain and all transportation modes. I specialize in proof-of-concepts (POC) for emerging technologies and data-centric software development methods. To reach me, click here to access my contact form or you can find me on LinkedIn.
Also, for more from SC Tech Insights, see the latest articles on Interoperability and Information Technology.
Greetings! As a supply chain tech advisor with 30+ years of hands-on experience, I take great pleasure in providing actionable insights and solutions to logistics leaders. My focus is to drive transformation within the logistics industry by leveraging emerging LogTech, applying data-centric solutions, and increasing interoperability within supply chains. My wide range of experience includes successfully leading the development of 100s of innovative software solutions across supply chains and delivering business intelligence (BI) solutions to 1,000s of shippers. Click here for more info.