All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
- Use Confluent Cloud for Apache Flink to handle the Avro Flight Consolidator App.
- Upgraded the Snowflake Terraform Provider to version `1.0.0`.
- Upgraded the AWS Terraform Provider to version `5.81.0`.
- Upgraded the Confluent Terraform Provider to version `2.12.0`.
- Issue #543.
- Upgraded Apache Flink to version `1.20.0`.
- Upgraded Apache Iceberg to version `1.7.0`.
- Now have Avro- and JSON-formatted data versions of the Java Flink App.
- Updated the AWS Terraform Provider to version `5.75.0`.
- Updated the Confluent Terraform Provider to version `2.9.0`.
- Updated the Snowflake Terraform Provider to version `0.97.0`.
- Issue #444.
- Issue #449.
- Issue #455.
- Issue #390.
- Issue #393.
- Issue #399.
- Issue #404.
- Issue #407.
- Issue #411.
- Issue #414.
- Issue #417.
- Issue #420.
- Issue #435.
- Issue #384.
- Issue #382.
- Issue #316.
- The `How to create a User-Defined Table Function (UDTF) in PyFlink to fetch data from an external source for your Flink App?` markdown (a minimal sketch of the technique follows below).
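  A minimal sketch of what that markdown covers, not the project's actual code: the endpoint URL, field names, and sample table below are illustrative placeholders. A PyFlink UDTF is a generator function that may emit zero, one, or many rows per input row:

  ```python
  import json
  import urllib.request

  from pyflink.table import DataTypes, EnvironmentSettings, TableEnvironment
  from pyflink.table.udf import udtf


  # Each call hits a (hypothetical) REST endpoint and yields one row per record.
  @udtf(result_types=[DataTypes.STRING(), DataTypes.STRING()])
  def fetch_flights(airline: str):
      url = f"https://example.com/api/flights?airline={airline}"  # placeholder source
      with urllib.request.urlopen(url) as response:
          for record in json.loads(response.read()):
              yield record["flight_id"], record["status"]


  t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
  t_env.create_temporary_function("fetch_flights", fetch_flights)
  t_env.create_temporary_view(
      "airlines", t_env.from_elements([("AirlineA",), ("AirlineB",)], ["airline"])
  )

  # LATERAL TABLE applies the UDTF to every row of the driving table.
  t_env.execute_sql("""
      SELECT a.airline, f.flight_id, f.status
      FROM airlines AS a,
      LATERAL TABLE(fetch_flights(a.airline)) AS f(flight_id, status)
  """).print()
  ```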
- Tweaked the `README.md`s.
- Tweaked code.
- Tweaked the `README.md`s.
- Issue #325.
- Tweaked main `README.md`.
- Added the `KNOWNISSUES.md`.
- Refactored the Java models.
- Refactored the Dockerfiles.
- Sink the data in DataGeneratorApp datastreams to Apache Iceberg Tables (a minimal sketch of the sink pattern follows below).
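  A minimal sketch of that sink pattern in Flink SQL, assuming a Hadoop-type Iceberg catalog on S3 with the `iceberg-flink-runtime` jar on the classpath; the bucket, database, and table names are placeholders, and a `datagen` table stands in for DataGeneratorApp's output:

  ```python
  from pyflink.table import EnvironmentSettings, TableEnvironment

  t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

  # A throwaway datagen source standing in for the DataGeneratorApp datastream.
  t_env.execute_sql("""
      CREATE TEMPORARY TABLE data_generator_source (
          flight_id STRING,
          airline   STRING
      ) WITH (
          'connector' = 'datagen',
          'rows-per-second' = '5'
      )
  """)

  # Register an Iceberg catalog backed by S3 (placeholder warehouse path).
  t_env.execute_sql("""
      CREATE CATALOG iceberg_catalog WITH (
          'type' = 'iceberg',
          'catalog-type' = 'hadoop',
          'warehouse' = 's3a://example-bucket/warehouse'
      )
  """)
  t_env.execute_sql("CREATE DATABASE IF NOT EXISTS iceberg_catalog.airlines")
  t_env.execute_sql("""
      CREATE TABLE IF NOT EXISTS iceberg_catalog.airlines.flight_data (
          flight_id STRING,
          airline   STRING
      )
  """)

  # Sinking to Iceberg is a plain INSERT INTO against the catalog table.
  t_env.execute_sql("""
      INSERT INTO iceberg_catalog.airlines.flight_data
      SELECT * FROM data_generator_source
  """)
  ```

  In streaming mode Iceberg commits data files on Flink checkpoints, so a real job would also enable checkpointing.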
- No longer makes a distinction between languages when creating the Docker containers.
- Markdowns that explain in detail the `run-terraform-locally.sh` and `run-flink-locally.sh`, respectively.
- Tweaked main `README.md`.
- Typo in the comments on the `run-flink-locally.sh` BASH script.
- Tweaked the Java and Python `README.md`s, respectively.
- Automatically zip the Python files via the `run-flink-locally.sh` BASH script.
- Converted FlightImporterApp.java to flight_importer_app.py.
- Converted UserStatisticsApp.java to user_statistics_app.py.
- Sink FlightImporterApp and UserStatisticsApp to Apache Iceberg Tables.
- Use a User-Defined Table Function in Python to call AWS Services (a minimal sketch follows below).
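  A minimal sketch of that pattern, with boto3 inside the UDTF body; the bucket name is a placeholder, and any AWS service client works the same way:

  ```python
  import boto3
  from pyflink.table import DataTypes
  from pyflink.table.udf import udtf


  # Emits one row per S3 object under the given prefix. The client is created
  # inline to keep the sketch short; a real job would reuse one per instance.
  @udtf(result_types=[DataTypes.STRING(), DataTypes.BIGINT()])
  def list_s3_objects(prefix: str):
      s3 = boto3.client("s3")  # credentials come from the environment
      response = s3.list_objects_v2(Bucket="example-bucket", Prefix=prefix)
      for obj in response.get("Contents", []):
          yield obj["Key"], obj["Size"]
  ```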
- try-with-resources
- Terraform the AWS S3 bucket that is used for Apache Iceberg file storage.
- Updated the `run-terraform-locally.sh` Bash script so it handles both the plan/apply and destroy actions.
- Terraform the AWS Secrets Manager Secrets for the Snowflake credentials.
- Terraform Snowflake resources.
- Add Unit Tests.
- Split the Terraform configuration into different files based on the jobs of each.
- Updated the `README.md` files.
- Removed the Project Nessie and MinIO docker containers from the `docker-compose.yml`.
- Upgrade Terraform AWS Provider to `5.66.0`, Terraform Snowflake Provider to `0.95.0`, and Terraform Confluent Provider to `2.1.0`.
- Replaced deprecated Confluent and Snowflake resource/data blocks with updated resource/data blocks.
- Refactor the `run-terraform-locally.sh` BASH script to accommodate all the new arguments for Snowflake.
- Use the `service_account_user` variable as the secrets insert value, and to customize the naming of all the resources.
- Store and use the arguments passed to the `ConfluentClientConfigurationLookup` constructor.
- Added GitHub Workflow/Actions to execute the Terraform configuration in the cloud.
- Removed all deprecated calls in the creation of the Apache Flink custom Data Source.
- The `terraform.tfvars` is auto-generated in the script.
- New Apache Iceberg dependencies.
- Commented the code in various modules.
- Updated `README.md` files.
- Refactored the project file organization.
- Updated main `README.md` file.
- Terraform the Confluent Cloud and AWS resources now.
- Created the `force-aws-secrets-destory.sh` Bash script.
- Added instructions to the main `README.md` to run Terraform locally.
- Updated `README.md` files.
- Updated Kafka Topic names.
- Now pass `AWS_SESSION_TOKEN` to the Apache Flink docker containers.
- Created Docker containers for Apache Flink, and load them from the newly added `docker-compose.yml`.
- Created the `run-flink-locally.sh` Bash script to get the AWS credentials and pass them to `docker-compose`.
- Added the `Glossary.md`.
- Refactored all the DAGs.
- Refactored the `README.md` files.
- Now refer to the Flink Jobs as Flink Apps.
- Updated main `README.md`.
- Fixed typo(s) in Java build.
- Fixed typo(s) in Java-built DAGs.
- First release.