The new generation of integration platforms brings Data and APIs closer together in a common approach (2/2)
- Jan 30
APIs are often in the spotlight in the integration and middleware field. On their own, however, they cannot cover the full diversity of integration patterns: several integration technologies must be combined to address all use cases. For example, the two major patterns in the integration market (modular logic supported by API Management and bulk logic supported by Data integration) can be combined to form a common approach to integration: a second-generation hybrid integration platform.
The Service/API approach and the Data approach can thus be federated through ingestion, security, and, above all, integration scenarios. We will clarify these different levels of integration in this article.
We concluded the first part of this article by introducing the topic of integration scenarios, which accompany the structuring of middleware.
For years, companies have been stacking integration capabilities across several generations of technologies, some of them redundant. At the end of 2019, we carried out an assignment at a company in the building distribution sector that had accumulated twelve middleware solutions, without any governance.
As part of this assignment, we issued recommendations, compiled in a middleware and integration master plan, which defined the company's integration strategy.
It recommended merging the following into a single platform:
- Digital integration capabilities, focused on APIs and based on real time and messaging;
- Data integration capabilities, focused on data and based on volume, push, and streaming.
The goal is to be able to:
- Combine the ability to handle single items and high volumes, transactions and analytics, depending on the use case;
- Streamline existing middleware and reduce the technical debt inherited from previous generations with a tailored platform.
This convergence can also be seen in Asia, where strong growth, the central role of e-commerce, and partner ecosystems are driving this type of platform build, alongside the rise of a well-known discipline: architecture.
This type of platform is based on a selection of components, brought together by a DevOps model. Our work generally consists of building and deploying this unified view, regardless of the technologies chosen. Indeed, there is currently no vendor capable of providing an end-to-end solution.
Furthermore, as we have observed with data platforms, companies are reluctant to commit to a single vendor for such strategic projects.
This is where the concept of integration scenarios comes in: pre-identified, pre-wired use cases that recur from one company to the next, templates that can be implemented directly on the platform.
In the diagram above, we have listed nine, but there are many others:
Event-triggered IoT: the goal is to capture event messages from connected objects and correlate them to make them easier to interpret. This correlation naturally comes from applying data science/machine learning algorithms, which in turn trigger new events pushed to subscribers.
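As a minimal, illustrative sketch of this pattern (the device id, event shape, and the naive statistical threshold below are assumptions, the latter standing in for a real machine learning model):

```python
# Minimal sketch of an event-triggered IoT flow: raw device events are consumed,
# correlated against a rolling window, and a derived event is pushed to
# subscribers when a reading deviates. The z-score rule is a naive stand-in
# for a real data science / machine learning model.
from collections import deque
from statistics import mean, stdev

WINDOW = deque(maxlen=20)   # rolling window of recent readings
subscribers = []            # callables notified when a derived event is emitted

def publish(event):
    """Push a derived event to every subscriber (stand-in for an MQTT/Kafka publish)."""
    for notify in subscribers:
        notify(event)

def on_device_event(event):
    """Correlate a raw reading with the recent window and emit an alert if it deviates."""
    value = event["temperature"]
    if len(WINDOW) >= 5:
        spread = stdev(WINDOW)
        if spread > 0 and abs((value - mean(WINDOW)) / spread) > 3:
            publish({"type": "anomaly", "device": event["device"], "value": value})
    WINDOW.append(value)

# Example: a subscriber that simply prints derived events.
subscribers.append(print)
for t in [20.1, 20.3, 20.2, 20.4, 20.2, 20.3, 35.0]:   # last reading is an outlier
    on_device_event({"device": "sensor-42", "temperature": t})
```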
Data Virtualization: an integration pattern that generally relies on a set of technologies dominated by SQL. These patterns can be covered by a platform of this type, with the advantage that Data Virtualization no longer has to sit outside the integration strategy or be added as yet another technology.
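To illustrate the idea rather than any particular product, here is a minimal sketch in which two in-memory SQLite databases stand in for heterogeneous sources and a single federated SQL query plays the role of the virtualization layer:

```python
# Minimal sketch of the data virtualization idea: one federated SQL query spans
# two physically separate sources without copying data into a new store. Two
# in-memory SQLite databases stand in for heterogeneous systems; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")       # e.g. the CRM source
conn.execute("ATTACH DATABASE ':memory:' AS billing")   # e.g. the billing source

conn.execute("CREATE TABLE crm.customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE billing.invoices (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO billing.invoices VALUES (?, ?)",
                 [(1, 120.0), (1, 80.0), (2, 45.5)])

# The consumer works against a single SQL surface; the cross-source join happens
# in the virtualization layer, not in a duplicated dataset.
rows = conn.execute("""
    SELECT c.name, SUM(i.amount) AS revenue
    FROM crm.customers AS c
    JOIN billing.invoices AS i ON i.customer_id = c.id
    GROUP BY c.name
""").fetchall()
print(rows)   # e.g. [('Acme', 200.0), ('Globex', 45.5)]
```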
Batch Data Flow: this involves creating data aggregates, enriched by orchestrations, and made available in batches to consumers.
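A minimal sketch of such a flow, with hypothetical records, lookup table, and file name, where enrichment and aggregation end in a batch file dropped for consumers:

```python
# Minimal sketch of a batch data flow: source records are enriched through a
# reference lookup, aggregated, and delivered to consumers as a periodic batch
# file. The records, lookup table, and file name are illustrative placeholders.
import csv
from collections import defaultdict
from datetime import date

orders = [                              # stand-in for records extracted from a source
    {"country": "FR", "amount": 120.0},
    {"country": "FR", "amount": 80.0},
    {"country": "DE", "amount": 45.5},
]
regions = {"FR": "EMEA", "DE": "EMEA"}  # enrichment step (reference data lookup)

totals = defaultdict(float)
for order in orders:
    region = regions.get(order["country"], "UNKNOWN")
    totals[(region, order["country"])] += order["amount"]

batch_name = f"revenue_{date.today().isoformat()}.csv"  # hypothetical drop-zone file
with open(batch_name, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["region", "country", "revenue"])
    for (region, country), revenue in sorted(totals.items()):
        writer.writerow([region, country, revenue])
print(f"batch written: {batch_name}")
```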
Cross-functional processes: these are processes that describe the lifecycle of information in the information system, and for which there are business performance metrics that only the platform can determine due to its cross-functional nature.
Digital Twin Experience: built around the Event-Triggered IoT scenario, as events received are sent back to the digital twin for real-time interpretation of its operation. This scenario is then enriched with metadata to converge events into a single view of the twin, justifying connection to an enterprise architecture repository or data catalog.
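A rough sketch of the convergence step, assuming a hypothetical catalog lookup and event shape, could look like this:

```python
# Rough sketch of converging events into a single view of a digital twin.
# The catalog content stands in for metadata fetched from an enterprise
# architecture repository or data catalog; asset ids and fields are assumptions.
catalog = {
    "pump-7": {"site": "Lyon plant", "criticality": "high"},
}

twins = {}   # asset id -> latest consolidated view

def on_event(event):
    """Merge an incoming event into the twin and enrich it with catalog metadata."""
    asset_id = event["asset"]
    twin = twins.setdefault(asset_id, {"asset": asset_id, **catalog.get(asset_id, {})})
    twin.update({k: v for k, v in event.items() if k != "asset"})   # last-write-wins merge
    return twin

on_event({"asset": "pump-7", "temperature": 71.2})
view = on_event({"asset": "pump-7", "vibration": 0.4})
print(view)
# {'asset': 'pump-7', 'site': 'Lyon plant', 'criticality': 'high',
#  'temperature': 71.2, 'vibration': 0.4}
```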
RPA Integration: the use of an RPA solution sometimes requires access to APIs exposed by the platform to avoid surface automation, which can cause information synchronization issues. This scenario is therefore also a means of controlling data quality in different sources by comparing them and reporting discrepancies.
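The reconciliation side of this scenario can be sketched as follows, with purely illustrative ERP and CRM contents:

```python
# Sketch of the data quality angle: the same records held in two sources are
# compared and every discrepancy is reported rather than silently re-keyed.
# The ERP/CRM contents below are illustrative placeholders.
erp = {"C001": {"email": "ops@acme.example"}, "C002": {"email": "it@globex.example"}}
crm = {"C001": {"email": "ops@acme.example"}, "C002": {"email": "sales@globex.example"}}

def reconcile(source_a, source_b):
    """Yield one discrepancy per record/field whose values differ between sources."""
    for key in sorted(source_a.keys() | source_b.keys()):
        a, b = source_a.get(key), source_b.get(key)
        if a is None or b is None:
            yield {"id": key, "issue": "missing in one source"}
            continue
        for field in sorted(a.keys() | b.keys()):
            if a.get(field) != b.get(field):
                yield {"id": key, "field": field, "erp": a.get(field), "crm": b.get(field)}

for discrepancy in reconcile(erp, crm):
    print(discrepancy)   # e.g. {'id': 'C002', 'field': 'email', 'erp': ..., 'crm': ...}
```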
File 2 API 2 File: files have not disappeared, and it is sometimes necessary to transform a file into multiple messages, or to aggregate messages into a file, or sometimes both in an end-to-end process. This also allows for the connection between historical assets and new means of exchange.
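A minimal sketch of this flow, with hypothetical file names and the API call stubbed out by a local function:

```python
# Minimal sketch of a File-to-API-to-File flow: a legacy CSV file is split into
# one message per line, each message is handed to an API call, and the responses
# are aggregated back into an output file. File names are hypothetical and the
# HTTP call is stubbed out by a local function.
import csv

def send_to_api(message: dict) -> dict:
    """Stand-in for an HTTP POST to an API exposed by the platform."""
    return {"order_id": message["order_id"], "status": "ACCEPTED"}

def file_to_api_to_file(input_path: str, output_path: str) -> None:
    responses = []
    with open(input_path, newline="") as f:
        for row in csv.DictReader(f):              # file -> one message per line
            responses.append(send_to_api(row))     # message -> API
    with open(output_path, "w", newline="") as f:  # API responses -> file
        writer = csv.DictWriter(f, fieldnames=["order_id", "status"])
        writer.writeheader()
        writer.writerows(responses)

# Example run with a small generated input file.
with open("orders.csv", "w", newline="") as f:
    f.write("order_id,amount\nA1,120\nA2,80\n")
file_to_api_to_file("orders.csv", "order_status.csv")
```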
AI & ML support: this scenario aims to supply learning algorithms with the necessary and sufficient information, drawn from the databases local to the platform. These scenarios arise when the integration platform acts as a fully-fledged data platform.
MDM Workflows: the platform can handle the creation and consumption of reference data.
These scenarios make it possible to qualify a company's integration strategy and the profile of a federated integration platform, and to select the best components for the intended uses.
In many cases, an audit of the existing system will be necessary to determine the extent of the technical debt and how to resolve it through correction and migration scenarios that are inseparable from the construction of this new asset.


