Bufstream is a drop-in replacement for Apache Kafka that's 10x cheaper to operate. While Bufstream works well for any Kafka workload, it excels when paired with Protobuf. By integrating with the Buf Schema Registry, Bufstream can enforce data quality and governance policies without relying on opt-in client configuration.
In this demonstration, you'll run a Bufstream agent locally, use the `franz-go` library to publish and consume messages, and explore Bufstream's schema enforcement features.
Before starting, complete the following steps to prepare your environment for the demo.
- Clone this repo:

  ```
  $ git clone https://github.com/bufbuild/bufstream-demo.git
  $ cd ./bufstream-demo
  ```
The Bufstream demo application simulates two small applications that produce and consume `EmailUpdated` event messages. The demo producer publishes these events to a Bufstream topic, and a separate consumer fetches from the same topic and "verifies" the change. The demo app uses the off-the-shelf Kafka client `franz-go` to interact with Bufstream.
The demo attempts to publish and consume three payloads (a sketch of how a producer might construct them follows the list):

- A semantically valid, correctly formatted version of the `EmailUpdated` message.
- A correctly formatted, but semantically invalid version of the message.
- A malformed message.
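To make the three cases concrete, here's a minimal sketch of producing them with franz-go. It isn't the demo's actual code (see `cmd/bufstream-demo-produce` for that); the generated-code import path and the `EmailUpdated` field names are assumptions, and serialization details such as schema-registry framing are omitted:

```go
package main

import (
	"context"
	"log"

	"github.com/twmb/franz-go/pkg/kgo"
	"google.golang.org/protobuf/proto"

	// Assumed import path for the demo's generated Protobuf code.
	demov1 "github.com/bufbuild/bufstream-demo/gen/bufstream/demo/v1"
)

func main() {
	ctx := context.Background()

	// Bufstream speaks the Kafka wire protocol, so franz-go connects to it
	// exactly as it would to any Kafka broker.
	client, err := kgo.NewClient(kgo.SeedBrokers("localhost:9092"))
	if err != nil {
		log.Fatalf("failed to create client: %v", err)
	}
	defer client.Close()

	// Payload 1: correctly formatted and semantically valid.
	valid, _ := proto.Marshal(&demov1.EmailUpdated{
		OldAddress: "old@example.com", // field names assumed from the demo's proto
		NewAddress: "new@example.com",
	})

	// Payload 2: correctly formatted Protobuf, but semantically invalid:
	// the new address isn't a plausible email address.
	invalid, _ := proto.Marshal(&demov1.EmailUpdated{
		OldAddress: "old@example.com",
		NewAddress: "not-an-email",
	})

	// Payload 3: malformed bytes that aren't a valid EmailUpdated message.
	malformed := []byte{0xFF, 0xFF, 0xFF}

	for _, value := range [][]byte{valid, invalid, malformed} {
		results := client.ProduceSync(ctx, &kgo.Record{
			Topic: "email-updated",
			Value: value,
		})
		if err := results.FirstErr(); err != nil {
			log.Printf("produce failed: %v", err)
		}
	}
}
```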
Some relevant sections to check out:
- `cmd/bufstream-demo-produce`: The demo producer application.
- `cmd/bufstream-demo-consume`: The demo consumer application.
- `proto/bufstream/demo/v1/demo.proto`: Defines the `EmailUpdated` message type, which is produced to and consumed from the `email-updated` topic.
- `pkg`: Contains helper packages for both the producer and consumer.
- Boot up the Bufstream and demo apps:

  ```
  $ docker compose up --build
  ```

- The app logs both the publishing of the events on the producer side and, shortly after, the consumption of those events. The consumer deserializes the first two messages correctly, while the final message fails with a deserialization error, as in the consumer sketch below.
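A rough sketch of the consumer side, again with franz-go. The group name and import path are illustrative; the real implementation lives in `cmd/bufstream-demo-consume`:

```go
// Consume from the same topic and attempt to deserialize each record.
consumer, err := kgo.NewClient(
	kgo.SeedBrokers("localhost:9092"),
	kgo.ConsumeTopics("email-updated"),
	kgo.ConsumerGroup("bufstream-demo"), // illustrative group name
)
if err != nil {
	log.Fatalf("failed to create consumer: %v", err)
}
defer consumer.Close()

for {
	fetches := consumer.PollFetches(ctx)
	fetches.EachRecord(func(record *kgo.Record) {
		var event demov1.EmailUpdated
		if err := proto.Unmarshal(record.Value, &event); err != nil {
			// The third, malformed payload fails here.
			log.Printf("deserialization error: %v", err)
			return
		}
		log.Printf("email updated: %s -> %s",
			event.GetOldAddress(), event.GetNewAddress())
	})
}
```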
At this point, you've used Bufstream as a drop-in replacement for Apache Kafka.
Thus far, Bufstream hasn't applied any quality enforcement on the data passing through it — addresses can be invalid or malformed. To ensure the quality of data flowing through Bufstream, you can configure policies that require data to conform to a pre-determined schema, pass semantic validation, and even redact sensitive information on the fly.
We'll demonstrate this functionality using the Buf Schema Registry, but Bufstream also works with any registry that implements Confluent's REST API.
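Bufstream applies these checks broker-side, with no opt-in from clients. To see what "semantic validation" means here, the following sketch runs the same kind of check client-side with protovalidate-go, assuming the demo's proto attaches a `buf.validate` email constraint to the address fields (the import paths and field names are assumptions):

```go
package main

import (
	"log"

	protovalidate "github.com/bufbuild/protovalidate-go"

	// Assumed import path for the demo's generated Protobuf code.
	demov1 "github.com/bufbuild/bufstream-demo/gen/bufstream/demo/v1"
)

func main() {
	validator, err := protovalidate.New()
	if err != nil {
		log.Fatalf("failed to create validator: %v", err)
	}

	// Correctly formatted Protobuf, but the value violates the (assumed)
	// email-format constraint, so semantic validation fails.
	event := &demov1.EmailUpdated{NewAddress: "not-an-email"}
	if err := validator.Validate(event); err != nil {
		log.Printf("semantically invalid: %v", err)
	}
}
```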
- Uncomment the `data_enforcement` block in `config/bufstream.yaml`.
- Uncomment the `--csr` command options under the `demo` service in `docker-compose.yaml`.
- To pick up the configuration changes, terminate the Docker apps via `ctrl+c` and restart them with `docker compose up`.
- The app again logs both the attempted publishing and consumption of the three events. The first event reaches the consumer successfully, but with Bufstream's data quality enforcement enabled, the second and third messages are rejected (see the sketch after this list).
- For the message that reaches the consumer, notice the empty `old_address` in the logs. This field has been redacted by Bufstream because it's labeled with the `debug_redact` option.
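With enforcement enabled, a rejected record never reaches the topic; the failure surfaces to the producer as an ordinary produce error. A minimal sketch of observing this with franz-go, reusing the client and payloads from the earlier producer sketch (the exact error depends on the configured enforcement action):

```go
// With data_enforcement enabled, Bufstream rejects the produce request
// for the semantically invalid payload instead of appending it.
results := client.ProduceSync(ctx, &kgo.Record{
	Topic: "email-updated",
	Value: invalid, // the semantically invalid payload from the earlier sketch
})
if err := results.FirstErr(); err != nil {
	log.Printf("rejected by Bufstream: %v", err)
}
```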
By configuring Bufstream with a few data quality and governance policies, you've ensured that consumers receive only well-formed, semantically valid data.
After completing the demo, you can stop Docker and remove the demo's images from your machine with the following command:

```
$ docker compose down --rmi all
```
To learn more about Bufstream, check out the launch blog post, dig into the benchmark and cost analysis, or join us in the Buf Slack!