To increase business velocity, to strengthen competitiveness through new products and services, and to react quickly to sudden shifts in the market, data and event streams must be shared, processed, and analyzed in real time. Apache Kafka has established itself as the industry standard for event streaming. Whether Connected Car, Industry 4.0, or Customer 360: all of these forward-looking topics require fast communication, efficient networking, and real-time processing of enormous data volumes.
7. Would you blindly cross the street with
traffic information that is 5 minutes old?
8. Event Streaming Enables New Outcomes
Cars / Transport
  Without Event Streaming: call for driver availability; no knowledge of driver arrival; no data on feature usage
  With Event Streaming: real-time driver-rider match; real-time ETA; real-time sensor diagnostics

Banking
  Without Event Streaming: nightly updated account balance; batch fraud checks; batch regulatory reporting
  With Event Streaming: real-time account updates; real-time credit card fraud alerts; real-time regulatory reporting

Retail
  Without Event Streaming: post-order "out of stock" emails; no upsell through personalization; batch point-of-sale reports
  With Event Streaming: real-time inventory; real-time recommendations; real-time sales reporting
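To make the "real-time credit card fraud alerts" row above concrete, here is a minimal sketch of the underlying stream-processing pattern in plain Python, with no Kafka dependency; the window size, threshold, and event shape are illustrative assumptions, not anything from the slides.

```python
from collections import defaultdict, deque

# Minimal sketch of a real-time fraud check: flag a card once more than
# MAX_TXNS transactions arrive within WINDOW_SECS. The threshold and the
# (timestamp, card_id, amount) event shape are hypothetical.
WINDOW_SECS = 60
MAX_TXNS = 3

def fraud_alerts(events):
    """events: iterable of (timestamp_secs, card_id, amount); yields (ts, card)."""
    recent = defaultdict(deque)  # card_id -> timestamps inside the window
    for ts, card, _amount in events:
        window = recent[card]
        window.append(ts)
        # Evict timestamps that fell out of the sliding window.
        while window and ts - window[0] > WINDOW_SECS:
            window.popleft()
        if len(window) > MAX_TXNS:
            yield (ts, card)

stream = [
    (0, "card-1", 12.0), (10, "card-1", 99.0), (20, "card-2", 5.0),
    (25, "card-1", 40.0), (30, "card-1", 7.5),  # 4th card-1 hit inside 60 s
]
alerts = list(fraud_alerts(stream))
# alerts == [(30, "card-1")]
```

In a real deployment the same per-key windowed logic would run continuously against a Kafka topic rather than a Python list.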
9. The Rise of Event Streaming
2010: Apache Kafka created at LinkedIn by Confluent's founders
2014: Confluent founded
2020: 80% of Fortune 100 companies trust and use Apache Kafka
10. Event Streaming is the central nervous system for today's enterprises. Apache Kafka® is the technology.
11. If you knew the state of every event in your business and could reason on top of that data, what problems would you solve?
13. Enterprise Data Platform Requirements Are Shifting
1. From Transient to Persistent + Durable
   Value: build mission-critical apps with zero data loss (e.g. instant payments)
2. From Raw Data to Enriched Data
   Value: add context & situational awareness (e.g. ride-sharing ETA)
3. From Built for Historical Data to Built for Real-Time Events
   Value: trigger real-time workflows (e.g. real-time order management)
4. From Scalable for Transactional Data to Scalable for ALL Data
   Value: scale across the enterprise (e.g. customer 360)
14. Examples of Event Streaming Solutions in Automotive Based on Confluent (Kafka)
Traffic Data Quality Monitor
The client wants to ensure consistently high quality of traffic-flow information for their customers. For this, geolocation data from cars is used to compare provider data with real field data across all regions.
Connection of Production Data to Big Data
Cluster
The client intends to leverage data from the
production process to increase quality and
reduce costs by identifying root causes of errors.
Deloitte connects the factory data to the big data
cluster.
Warranty Analytics Center
Automatic detection of outliers (including dealer fraud) was implemented, with detected patterns automatically ranked by cost-saving potential and recency.
Connected Distribution
Cars can act as sensors, as they are equipped with GPS and mobile data connections. The car itself is used as part of the Internet of Things (IoT) to track the global internal Order-to-Delivery (OtD) process.
Streaming data analysis for vehicles
Streaming data analysis architecture to support
connected car use cases. The architecture was
validated through a fully functional prototype
including a user interface using connected car
traffic data from the client’s fleet.
Predictive Maintenance
Supporting the Industry 4.0 initiative, the client developed a proof of concept for detecting potential failures within production processes. Deloitte supports the industrialization of this PoC using Big Data technologies to ensure continuous analysis of the streamed data.
Real-time Production
The client places high emphasis on vehicle manufacturing and on the capability to recognize bottlenecks and quality issues early. Deloitte delivered an end-to-end application that streams machine logs in near real time, with the goal of achieving quality and time improvements.
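The predictive-maintenance and real-time-production examples above boil down to continuously scoring a stream of machine readings as they arrive. A minimal sketch of such a check, assuming a simple rolling-average rule (window size, threshold, and sample data are hypothetical, not the client's actual model):

```python
from collections import deque

# Illustrative predictive-maintenance check: flag a sensor reading that
# deviates strongly from the rolling average of the last `window` readings.
def detect_anomalies(readings, window=5, threshold=10.0):
    """readings: iterable of floats; returns indices of anomalous readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mean = sum(history) / window
            if abs(value - mean) > threshold:
                anomalies.append(i)
        history.append(value)
    return anomalies

# Hypothetical machine temperatures with one spike at index 6.
temps = [70, 71, 70, 72, 71, 70, 95, 71, 70]
anomalies = detect_anomalies(temps)
# anomalies == [6]
```

In production this kind of scoring would typically run as a Kafka Streams or ksqlDB job over the machine-log topic rather than over an in-memory list.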
15. Confluent Platform
Apache Kafka: Open Source | Community licensed | Enterprise Support
Freedom of Choice: Fully Managed Cloud Service or Self-Managed Software
Committer-Driven Expertise: Partners | Training | Professional Services | Enterprise Support

Unrestricted Developer Productivity (Developer)
- SQL-based Stream Processing: KSQL (ksqlDB)
- Rich Pre-built Ecosystem: Connectors | Hub | Schema Registry
- Multi-language Development: non-Java clients | REST Proxy

Efficient Operations at Scale (Operator)
- GUI-driven Management & Monitoring: Control Center
- Flexible DevOps Automation: Operator | Ansible
- Dynamic Performance & Elasticity: Auto Data Balancer | Tiered Storage

Production-Stage Prerequisites (Architect)
- Enterprise-grade Security: RBAC | Secrets | Audit Logs
- Data Compatibility: Schema Registry | Schema Validation
- Global Resilience: Multi-Region Clusters | Replicator
Partnership for Business Success
- Complete Engagement Model
- Revenue / Cost / Risk Impact
- TCO / ROI Management
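The "SQL-based Stream Processing" capability listed for Confluent Platform refers to KSQL/ksqlDB, where a continuous query aggregates an unbounded stream, for example a tumbling-window count per item. A rough in-memory equivalent in plain Python (stream shape, topic name, and window size are illustrative assumptions):

```python
from collections import defaultdict

# Rough in-memory equivalent of a ksqlDB tumbling-window count, roughly:
#   SELECT item, COUNT(*) FROM orders
#   WINDOW TUMBLING (SIZE 60 SECONDS) GROUP BY item EMIT CHANGES;
def tumbling_counts(events, window_secs=60):
    """events: iterable of (timestamp_secs, item);
    returns {(window_start, item): count}."""
    counts = defaultdict(int)
    for ts, item in events:
        # Each event falls into exactly one fixed, non-overlapping window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, item)] += 1
    return dict(counts)

orders = [(5, "shirt"), (42, "shirt"), (61, "mug"), (70, "shirt")]
result = tumbling_counts(orders)
# result == {(0, "shirt"): 2, (60, "mug"): 1, (60, "shirt"): 1}
```

The real ksqlDB engine additionally handles out-of-order events, state persistence, and emitting incremental updates, which this sketch deliberately omits.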