Follow-up report on the “Data in Motion” forum conference
Setting data in motion
About 150 guests attended Confluent’s “Data in Motion” conference in Frankfurt. The main focus was on the added value of streaming technology in everyday business and organizational life. The new Vice President of Sales for CEMEA, Roger Illing, also introduced himself.
The well-attended German conference “Data in Motion” at the Steigenberger Frankfurter Hof was all about Apache Kafka and Confluent, whether deployed on-premises, in the cloud, hybrid, or multi-cloud.
Confluent’s new CEMEA sales chief, Roger Illing, has first set himself the goal of “speaking personally to the 200 largest companies in Germany”. He draws the line here at “about two billion euros in revenue”. Illing brings plenty of IT and management experience, for example from SAP, OpenText, and Alteryx. Smaller customers are served by partners; Confluent has about 20 of these in Germany, for example the system integrator SVA.
Completing the Kafka core
The Confluent Kafka distribution is based on the complete open-source Kafka core – according to Illing, a unique selling point compared with other Kafka platforms. After all, Confluent founder and CEO Jay Kreps helped develop Apache Kafka. Confluent embeds the open-source core in enterprise features such as security and governance.
Kafka acts as a central data streaming system and as an alternative to legacy ETL or messaging tools. A brief overview of the core function: Kafka continuously receives event data from all sides. The events are arranged in the chronological order of their arrival in so-called topics, that is, data containers organized by subject. Consumers interested in particular topics or event data subscribe to them and are automatically supplied with current data at the required interval or in near real time.
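A minimal sketch of this publish/subscribe pattern using the standard Apache Kafka Java clients; the broker address, consumer group, and the topic name “orders” are assumptions for illustration, not details from the conference.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TopicDemo {
    public static void main(String[] args) {
        // Producer: appends an event to the "orders" topic (name assumed for illustration).
        Properties prodProps = new Properties();
        prodProps.put("bootstrap.servers", "localhost:9092");
        prodProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        prodProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps)) {
            producer.send(new ProducerRecord<>("orders", "order-4711", "{\"status\":\"created\"}"));
        }

        // Consumer: subscribes to the topic and is continuously supplied with new events.
        Properties consProps = new Properties();
        consProps.put("bootstrap.servers", "localhost:9092");
        consProps.put("group.id", "order-dashboard");
        consProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```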
Additional software extends the range of functions: with ksqlDB, for example, streamed data can be queried with SQL-like syntax, much as in an SQL database. Kafka-based streaming platforms, among which Confluent claims to be the current leader, are intended to replace traditional siloed data architectures and to replace or supplement data lakes and data warehouses. They matter wherever event data is needed as quickly as possible, is used multiple times and, if necessary, is enriched with other data sources.
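The idea of running a continuous, SQL-like query over a stream can also be sketched with the Kafka Streams Java API; the topic names, the payment threshold, and the roughly comparable ksqlDB statement in the comment are illustrative assumptions, not details from the talks.

```java
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class LargePaymentsFilter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "large-payments-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Continuous query over the "payments" topic, roughly comparable to a ksqlDB statement like:
        //   CREATE STREAM large_payments AS SELECT * FROM payments WHERE amount > 10000;
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments");
        payments
                .filter((key, value) -> extractAmount(value) > 10_000)
                .to("large-payments");

        new KafkaStreams(builder.build(), props).start();
    }

    // Naive extraction purely for illustration; a real application would use a proper JSON serde.
    private static double extractAmount(String json) {
        Matcher m = Pattern.compile("\"amount\"\\s*:\\s*([0-9.]+)").matcher(json);
        return m.find() ? Double.parseDouble(m.group(1)) : 0.0;
    }
}
```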
Kai Wähner, Field CTO at Confluent, gave insight into potential application areas:
- In the transportation sector, calculating arrival times, matching drivers and passengers (according to Wähner, most Uber-like services use Kafka) or preventive maintenance of vehicles.
- In banking, instant payment services, fraud detection, mobile banking, or improving customer experience.
- In retail, real-time advertising, customer personalization, or pricing.
What does companies’ use of Confluent Cloud look like in concrete terms? Frankfurt offered several examples. According to Wähner, the deciding factor for the customer is in every case the individual added value. Illing, the CEMEA sales chief: “We have three value engineers in Europe who calculate concrete business cases with customers.”
The approach at Allianz
Dr. Annegret Junker of Allianz gave insight into building Kafka-based microservice structures at the insurer. As an example she chose the onboarding of new customers. For the Kafka-based redesign of the process, which had previously been implemented with conventional systems, research was first done on site: How exactly does the business process work? Who is involved, and how? The result is visual domain storytelling with all the actors, data, and actions. This representation is then formalized and implemented as independent microservices that communicate with one another without being tightly coupled. Where a process crosses domain boundaries, APIs have to be developed.
For the Kafka architecture, Allianz also encapsulated the integrated legacy systems with its own software layer, which provides an interface to the rest of the system landscape. In this way, the legacy environment can be changed without affecting the entire system; at most the additional software layer is affected. Junker: “This is how old components can be hollowed out step by step from the inside.”
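A minimal sketch of such an encapsulation layer, assuming a hypothetical legacy policy system and topic name; this illustrates the pattern only and is not Allianz’s actual code.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

/**
 * Thin wrapper around a legacy system: the rest of the landscape only sees Kafka
 * events, so the legacy backend can be changed or replaced without touching consumers.
 * LegacyPolicySystem and the topic name "policy-events" are hypothetical.
 */
public class LegacyPolicyAdapter {
    private final LegacyPolicySystem legacy = new LegacyPolicySystem(); // stand-in for the old backend
    private final KafkaProducer<String, String> producer;

    public LegacyPolicyAdapter(Properties kafkaProps) {
        this.producer = new KafkaProducer<>(kafkaProps); // serializers configured by the caller
    }

    /** Called from the legacy integration; publishes a normalized event for all microservices. */
    public void onPolicyCreated(String policyId) {
        String payload = legacy.exportPolicyAsJson(policyId); // read from the old system
        producer.send(new ProducerRecord<>("policy-events", policyId, payload));
    }
}

/** Minimal stand-in for the legacy backend so the sketch compiles. */
class LegacyPolicySystem {
    String exportPolicyAsJson(String policyId) {
        return "{\"policyId\":\"" + policyId + "\",\"status\":\"created\"}";
    }
}
```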
Mercedes-Benz improves product data management
Car manufacturer Mercedes-Benz began implementing Confluent Kafka in the third quarter of 2018. The first systems went live in the first quarter of 2020. Today, more than 70 use cases are running across departments. These also include hybrid scenarios in which part of the Confluent solution runs on premises and part in the cloud.
The resulting platform is fully managed, multi-tenant capable, and has multiple namespaces, each with specific rights and roles. Mercedes-Benz’s self-programmed connector is an automated self-service module for communicating with existing data lakes, allowing easy integration of event data and deployment pipelines. Access to the data lake is secured with a vault solution from HashiCorp.
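As an illustration of how namespace-specific rights can be modeled with standard Kafka ACLs – prefixed per-tenant topic names with a read-only service account – here is a short sketch using the Kafka Admin API; the namespace prefix, principal, and broker address are assumptions, not details of the Mercedes-Benz platform.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class NamespaceAcls {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address

        try (Admin admin = Admin.create(props)) {
            // All topics of the (hypothetical) "after-sales" namespace share a common prefix;
            // the team's service account is only allowed to read within that prefix.
            ResourcePattern namespaceTopics =
                    new ResourcePattern(ResourceType.TOPIC, "after-sales.", PatternType.PREFIXED);
            AccessControlEntry readOnly =
                    new AccessControlEntry("User:after-sales-svc", "*",
                            AclOperation.READ, AclPermissionType.ALLOW);

            admin.createAcls(List.of(new AclBinding(namespaceTopics, readOnly))).all().get();
        }
    }
}
```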
Product data management, for example, benefits from the platform. “Our BOM system had four batch runs with different formats and runtimes,” says Henrik Berner, who has been project manager of the Kafka implementation for several years. “Today the data is streamed and thus delivered almost in real time.” That saves a lot of time. Another application is a 360-degree view of customers who consent to data processing.
Siemens says goodbye to batch runs for product data
Siemens has improved its product master data management with Confluent and Kafka. The corresponding project has been running for three years. The source of the product master data is a complex SAP system; consumers are, for example, regional sales systems. The sales branches have had an electronic catalog for quite some time, which of course must always be up to date.
Before the introduction of the Confluent solution, changes to product master data sometimes took weeks and led to many queries. In the meantime, a solution for on-demand data delivery has been implemented with the help of Kafka. Order tracking is planned as a further use case.
GLS replaces monolithic software
The parcel service provider GLS operates, among other things, two interconnected software modules for the digital processing of orders from the company’s 250,000 customers worldwide: a data hub and a replication system from a third-party vendor. These are now being “sliced up” step by step, according to Dirk Boening, Senior Director of Customer Solutions. The data replication system in particular caused problems: it gave up the ghost at more than 1,000 connected clients.
In the meantime, the data replication has been replaced with Kafka using change data capture: a change in the database triggers a Kafka event, which is then processed in the streaming environment. The new solution should be able to handle up to 10,000 customer connections; 2,000 are already in place.
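A minimal sketch of the downstream side of this change-data-capture flow: a CDC tool (the talk did not name one; Debezium is a common choice) publishes each database change as an event, and a service in the streaming environment consumes and forwards it. The topic name, consumer group, and broker address are assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ShipmentChangeConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
        props.put("group.id", "client-replication");       // assumed consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // CDC topic carrying one event per row change in a hypothetical shipments table.
            consumer.subscribe(List.of("cdc.orders.shipments"));
            while (true) {
                for (ConsumerRecord<String, String> change : consumer.poll(Duration.ofMillis(500))) {
                    // A typical change event carries the row state before and after the change;
                    // here it is simply logged in place of forwarding it to the connected clients.
                    System.out.printf("change for key %s: %s%n", change.key(), change.value());
                }
            }
        }
    }
}
```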
So that more such applications and customers can be added, Confluent’s CEMEA sales chief Illing wants to increase the workforce in Germany from currently 60 to 80 in 2023. Used correctly, streaming solutions can save an enormous amount of time and money.