Building Real-Time Pipelines With FLaNK
Timothy Spann, Principal Developer Advocate, Streaming - Cloudera; Future of Data Meetup; Startup Grind; AI Camp
The combination of Apache Flink, Apache NiFi, and Apache Kafka for building real-time data processing pipelines is extremely powerful, as demonstrated by this case study using the FLaNK-MTA project. The project leverages these technologies to process and analyze real-time data from the New York City Metropolitan Transportation Authority (MTA). FLaNK-MTA demonstrates how to efficiently collect, transform, and analyze high-volume data streams, enabling timely insights and decision-making.
Apache NiFi
Apache Kafka
Apache Flink
Apache Iceberg
LLM
Generative AI
Slack
PostgreSQL
3. B102: Enabling Real-Time Analytics
Wednesday, May 8, 2024
12:00 p.m. - 12:45 p.m.
Real-time analytics supports building scalable and fault-tolerant data processing pipelines.
4. Tim Spann
Twitter: @PaasDev // Blog: datainmotion.dev
Principal Developer Advocate.
ex-Pivotal, ex-Hortonworks, ex-StreamNative,
ex-PwC, ex-HPE, ex-E&Y.
https://medium.com/@tspann
https://github.com/tspannhw
12. Kafka Connect, NiFi, Flink? Which engine to choose? Or all 3?

Kafka Connect (Already using Kafka?)
● Simple setup for many tables
● Want metadata augmented data
● Don't need low latency?
● Visual monitoring
● Easy manual scaling
● Easy to combine with NiFi
● Debezium

NiFi (Already using NiFi?)
● Simple JDBC queries?
● Transform individual records?
● Want easy development with UI?
● Lots of small files, events, records, rows?
● Continuous stream of rows
● Support many different sources
● Debezium coming

Flink (Need for fast Flink?)
● Strong control of tables and joins
● Want high throughput?
● Want low latency?
● Want advanced windowing and state?
● Automatic records immediately
● Pure SQL
● Debezium
13. FLaNK Pipelines
External Context Ingest
Ingesting, routing, cleaning, enriching, transforming, parsing, chunking and vectorizing structured, unstructured, semi-structured and binary data and documents
Prompt engineering
Crafting and structuring queries to optimize
LLM responses
Context Retrieval
Enhancing LLM with external context such as
Retrieval Augmented Generation (RAG)
Roundtrip Interface
Act as a Discord, REST, Kafka, SQL, Slack bot to
roundtrip discussions
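The ingest stage above mentions chunking and vectorizing documents before they can serve as LLM context. A minimal sketch of the chunking step in plain Python, assuming a hypothetical `chunk_text` helper (not a NiFi or Flink API):

```python
# Sketch: split a document into fixed-size, overlapping character
# chunks ahead of vectorization for Retrieval Augmented Generation.
# chunk_text is an illustrative helper, not part of any framework.

def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into chunks of `size` characters, carrying `overlap`
    characters over between consecutive chunks."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

document = "NiFi ingests and routes events. " * 20
chunks = chunk_text(document, size=100, overlap=20)
```

The overlap keeps a sentence that straddles a boundary retrievable from at least one chunk; real pipelines often chunk on tokens or sentences instead of raw characters.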
25.
● FLaNK for Halifax Canada Transit — NiFi, Kafka, Flink, SQL, GTFS-RT | by Tim Spann | Cloudera | Dec 2023 | Medium
● Never Get Lost in the Stream. NiFi-Kafka-Flink for getting to work… | by Tim Spann | Cloudera | Dec 2023 | Medium
● Iteration 1: Building a System to Consume All the Real-Time Transit Data in the World At Once | by Tim Spann | Cloudera | Medium
● Watching Airport Traffic in Real-Time | by Tim Spann | Cloudera | Medium
28. • Record Readers - Avro, CSV, Grok, IPFIX, JASN1, JSON, Parquet, Scripted, Syslog5424, Syslog, WindowsEvent, XML
• Record Writers - Avro, CSV, FreeFormText, JSON, Parquet, Scripted, XML
• Record Reader and Writer support referencing a schema
registry for retrieving schemas when necessary.
• Enable processors that accept any data format without
having to worry about the parsing and serialization logic.
• Allows us to keep FlowFiles larger, each consisting of
multiple records, which results in far better performance.
UNSTRUCTURED DATA WITH NIFI
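The reader/writer split above can be imitated in plain Python: a format-specific reader parses a payload into records, a format-specific writer serializes them back out, and the logic in between never touches either wire format. A stdlib-only sketch (function names are illustrative, not NiFi APIs):

```python
# Rough analogue of NiFi's record reader/writer abstraction:
# CSV in, the same records out as JSON.
import csv
import io
import json

def read_csv_records(text: str) -> list[dict]:
    """Format-specific "reader": parse CSV into a list of record dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def write_json_records(records: list[dict]) -> str:
    """Format-specific "writer": serialize the records as JSON."""
    return json.dumps(records)

flowfile = "id,speed\n1,42\n2,17\n"
records = read_csv_records(flowfile)
payload = write_json_records(records)
```

As on the slide, keeping many records per payload (per FlowFile) amortizes parsing and framing cost over the whole batch.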
37. Parse Addresses
● Python 3.10+
● PYAP Library
● Simple library for extracting addresses embedded in text
● Address Parsing
● Address Detecting
● MIT Licensed
● Looking at other libraries, GenAI, DL, ML
https://github.com/tspannhw/FLaNK-python-processors
38. Address To Lat/Long
● Python 3.10+
● geopy Library
● Nominatim
● OpenStreetMap (OSM)
● openstreetmap.org/copyright
● Returns as attributes and JSON file
● Works with partial addresses
● Categorizes location
● Bounding Box
https://github.com/tspannhw/FLaNKAI-Boston
40. Let’s do a metamorphosis on your data. Don’t fear changing data.
You don’t need to be a brilliant writer to stream data.
Franz Kafka was a German-speaking Bohemian novelist and short-story writer, widely regarded as one of the major figures of 20th-century literature. His work fuses elements of realism and the fantastic.
--Wikipedia
YES, FRANZ, IT’S KAFKA
41. ● Open Source
● Log
● Distributed Event Store
● Highly Scalable, Exactly Once
● High-throughput, Low-latency
● Binary TCP-based protocol that is optimized for efficiency
● Source/Sinks: Debezium CDC, JDBC, Kafka, HTTP, JMS, InfluxDB, HDFS, Kudu, S3, Syslog, MQTT, SFTP
47. ● Open Source Performant Format for Large Analytic Tables
● Support for multiple engines like Spark, Hive, Impala, Trino, Flink, Presto and more
● ACID Transactions
● Time Travel
● Rollback
● Partitioning
● Data Compaction
● Schema Evolution
48. FLINK & ICEBERG INTEGRATION
Robust Next Generation Architecture for Data Driven Business
Unified Processing Engine (Flink)
Massive Open Table Format (Iceberg)
• Maximally open
• Maximally flexible
• Ultra high performance for MASSIVE data
• Can be used as Source and Sink
• Supports batch and streaming modes
• Supports time travel
51. CSP Community Edition
● Docker compose file of CSP to run from command line w/o any
dependencies, including Flink, SQL Stream Builder, Kafka, Kafka
Connect, Streams Messaging Manager and Schema Registry.
○ $ docker compose up
● Licensed under the Cloudera Community License
● Unsupported Commercially (Community Help - Ask Tim)
● Community Group Hub for CSP
● Find it on docs.cloudera.com (see QR Code)
● Kafka, Kafka Connect, SMM, SR, Flink, Flink SQL, MV, PostgreSQL, SSB
● Develop apps locally
52. Open Source Edition
• Apache NiFi in Docker
• Try new features quickly
• Develop applications locally
● Docker NiFi
○ docker run --name nifi -p 8443:8443 -d \
    -e SINGLE_USER_CREDENTIALS_USERNAME=admin \
    -e SINGLE_USER_CREDENTIALS_PASSWORD=ctsBtRBKHRAx69EqUghvvgEvjnaLjFEB \
    apache/nifi:latest
● Licensed under the ASF License
● Unsupported
● NiFi 1.25 and NiFi 2.0.0-M2
https://hub.docker.com/r/apache/nifi
53. https://github.com/tspannhw/FLaNK-Transit
SELECT n.speed, n.travel_time, n.borough, n.link_name, n.link_points,
n.latitude, n.longitude, DISTANCE_BETWEEN(CAST(t.latitude as STRING),
CAST(t.longitude as STRING),
m.VehicleLocationLatitude, m.VehicleLocationLongitude) as miles,
t.title, t.`description`, t.pubDate, t.latitude, t.longitude,
m.VehicleLocationLatitude, m.VehicleLocationLongitude,
m.StopPointRef, m.VehicleRef,
m.ProgressRate, m.ExpectedDepartureTime, m.StopPoint,
m.VisitNumber, m.DataFrameRef, m.StopPointName,
m.Bearing, m.OriginAimedDepartureTime, m.OperatorRef,
m.DestinationName, m.ExpectedArrivalTime, m.BlockRef,
m.LineRef, m.DirectionRef, m.ArrivalProximityText,
m.DistanceFromStop, m.EstimatedPassengerCapacity,
m.AimedArrivalTime, m.PublishedLineName,
m.ProgressStatus, m.DestinationRef, m.EstimatedPassengerCount,
m.OriginRef, m.NumberOfStopsAway, m.ts
FROM jsonmta /*+ OPTIONS('scan.startup.mode' = 'earliest-offset') */ m
FULL OUTER JOIN jsontranscom /*+ OPTIONS('scan.startup.mode' = 'earliest-offset') */ t
ON (t.latitude >= CAST(m.VehicleLocationLatitude as float) - 0.3)
AND (t.longitude >= CAST(m.VehicleLocationLongitude as float) - 0.3)
AND (t.latitude <= CAST(m.VehicleLocationLatitude as float) + 0.3)
AND (t.longitude <= CAST(m.VehicleLocationLongitude as float) + 0.3)
FULL OUTER JOIN nytrafficspeed /*+ OPTIONS('scan.startup.mode' = 'earliest-offset') */ n
ON (n.latitude >= CAST(m.VehicleLocationLatitude as float) - 0.3)
AND (n.longitude >= CAST(m.VehicleLocationLongitude as float) - 0.3)
AND (n.latitude <= CAST(m.VehicleLocationLatitude as float) + 0.3)
AND (n.longitude <= CAST(m.VehicleLocationLongitude as float) + 0.3)
WHERE m.VehicleRef is not null
AND t.title is not null
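The join above uses a ±0.3 degree bounding box to pre-filter nearby vehicles, then computes an exact distance. DISTANCE_BETWEEN is a SQL Stream Builder function; a rough Python equivalent using the haversine formula is:

```python
# Great-circle distance in miles between two (lat, lon) points,
# a stand-in for the DISTANCE_BETWEEN function in the SQL above.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8

def distance_between(lat1: float, lon1: float,
                     lat2: float, lon2: float) -> float:
    """Haversine distance in miles between two points on Earth."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

# e.g. Times Square to the Empire State Building, well under a mile:
d = distance_between(40.7580, -73.9855, 40.7484, -73.9857)
```

The coarse bounding box is cheap to evaluate per joined row; the exact distance is only computed on the candidates that survive it.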
I Can Haz Data?