Generate Avro schemas from Python dataclasses, Pydantic models and Faust Records. Code generation from Avro schemas. Serialize/deserialize Python instances with Avro schemas.
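A minimal sketch of the workflow this description covers, assuming the dataclasses-avroschema package; the model class and its fields are illustrative:

```python
from dataclasses import dataclass

from dataclasses_avroschema import AvroModel


@dataclass
class User(AvroModel):
    name: str
    age: int


# Generate the Avro schema (as a JSON string) from the dataclass definition.
print(User.avro_schema())

# Serialize an instance to Avro bytes and deserialize it back.
payload = User(name="alice", age=30).serialize()
print(User.deserialize(payload))
```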
A simple Python-based distributed workflow engine
Python bindings for RocksDB used by faust-streaming
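The sketch below is a hedged illustration of why those bindings matter: a Faust application can persist its table changelogs locally through RocksDB by selecting the rocksdb:// store. Broker URL, topic, and table names are assumptions, and the RocksDB extra is assumed to be installed (e.g. via the faust-streaming[rocksdb] extra):

```python
import faust

app = faust.App(
    "page-views",
    broker="kafka://localhost:9092",
    store="rocksdb://",  # keep table state in a local RocksDB database
)

views_topic = app.topic("page_views", value_type=str)
counts = app.Table("page_view_counts", default=int, partitions=1)


@app.agent(views_topic)
async def count_views(views):
    # Increment a persistent, changelog-backed counter per page.
    async for page in views:
        counts[page] += 1


if __name__ == "__main__":
    app.main()
```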
Projects completed in the Udacity Data Streaming Nanodegree program. Tech used: Apache Kafka, Kafka Connect, KSQL, Faust Stream Processing, Spark Structured Streaming
This project aims to optimize a bank marketing model by building an event streaming pipeline around Apache Kafka and its ecosystem that communicates with a machine learning model microservice, using it to display the likelihood and status of bank customers in real time.
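As a rough illustration of that architecture, the sketch below shows a Faust agent forwarding each customer event to a hypothetical model microservice over HTTP and publishing the returned likelihood to a downstream topic; the service URL, topic names, and payload fields are assumptions, not the project's actual interfaces:

```python
import aiohttp
import faust

app = faust.App("bank-marketing", broker="kafka://localhost:9092")

customer_events = app.topic("bank.customer_events")   # JSON-encoded customer events
customer_scores = app.topic("bank.customer_scores")   # predictions for the dashboard

MODEL_URL = "http://model-service:8000/predict"  # hypothetical ML microservice endpoint


@app.agent(customer_events)
async def score_customers(events):
    async with aiohttp.ClientSession() as session:
        async for event in events:
            # Ask the model microservice for a likelihood for this customer.
            async with session.post(MODEL_URL, json=event) as resp:
                prediction = await resp.json()
            await customer_scores.send(value={
                "customer_id": event.get("customer_id"),
                "likelihood": prediction.get("likelihood"),
            })


if __name__ == "__main__":
    app.main()
```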
A highly configurable, real-time data quality monitoring tool designed for streaming data
Django with Kafka, Debezium, and Faust for Email Sending using Change Data Capture
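A hedged sketch of that change-data-capture flow: a Faust agent consumes a Debezium change-event topic for a Django user table and sends an email on each row insert. The topic name, payload fields, and the send_welcome_email helper are illustrative assumptions:

```python
import faust

app = faust.App("cdc-email", broker="kafka://localhost:9092")

# Debezium publishes one topic per captured table, e.g. <server>.<schema>.<table>.
user_changes = app.topic("dbserver1.public.users_user")


def send_welcome_email(address: str) -> None:
    # Placeholder for a real email backend (e.g. Django's send_mail).
    print(f"Sending welcome email to {address}")


@app.agent(user_changes)
async def handle_user_changes(events):
    async for event in events:
        payload = (event or {}).get("payload", {})
        if payload.get("op") == "c":  # "c" marks a row insert in Debezium's envelope
            send_welcome_email(payload["after"]["email"])


if __name__ == "__main__":
    app.main()
```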
Graph streaming framework
This project covers the skills needed to process data in real time by building fluency in modern data engineering tools such as Apache Spark, Kafka, Spark Streaming, and Kafka Streaming.
An end-to-end web event stream processing pipeline