The application creates test H3 Hexagon hierarchical data in Kafka, then loads it into SingleStore via Pipelines
This project was built with Python 3 and Docker.
- Set up a Kafka server as needed - Kafka Quick Setup
- Obtain an application server - AWS can be used, and it can share the Kafka server
- In AWS, add the application server to the inbound rules of the Kafka server's Security Group
- Install Docker and supporting tools -
sudo apt install docker.io docker-compose nmon kafkacat -y
- Add the user to the Docker group
sudo usermod -a -G docker ubuntu
- Log out and back in for the user to gain access to Docker.
- Make a local copy of the application code from GitHub
git clone https://github.com/JohnRTurner/h3hexagon.git
- Build the Docker image
docker build h3hexagon -t h3hexagon
- Run the Image
docker run -d --name h3hexagon -e KAFKA_SERVER=$(hostname):29092 -e KAFKA_TOPIC=h3hexagon -e RESOLUTION=7 -t h3hexagon
- View the logs
docker logs -f h3hexagon
- Proceed to load the data - SingleStore Setup
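After the container is running, the generated messages can be inspected directly from the broker with kafkacat (installed above). This is a verification sketch that assumes the broker and topic names used in the run command; it requires the live Kafka server and is not runnable in isolation:

```shell
# Consume the first 5 messages from the h3hexagon topic and exit.
# -b broker, -t topic, -C consume, -o start offset, -c message count, -e exit at end
kafkacat -b $(hostname):29092 -t h3hexagon -C -o beginning -c 5 -e
```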
Option | Description |
---|---|
BATCH_SIZE | Number of records per batch |
KAFKA_TOPIC | Kafka topic name (created if it does not exist) |
PROC_COUNT | Number of processes to run concurrently |
KAFKA_SERVER | Kafka server (host:port) |
RESOLUTION | H3 resolution (0-15; 7 in the example above) |
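The options above are passed as environment variables (as in the `docker run` command). A minimal sketch of how the application might read them at startup; the option names come from the table, but the defaults and the function name are illustrative assumptions, not the project's actual code:

```python
import os

def load_config():
    """Read generator settings from environment variables.

    The defaults below are illustrative assumptions, not the
    project's actual values.
    """
    return {
        "kafka_server": os.environ.get("KAFKA_SERVER", "localhost:29092"),
        "kafka_topic": os.environ.get("KAFKA_TOPIC", "h3hexagon"),
        "batch_size": int(os.environ.get("BATCH_SIZE", "1000")),
        "proc_count": int(os.environ.get("PROC_COUNT", "4")),
        "resolution": int(os.environ.get("RESOLUTION", "7")),
    }
```

Numeric options are cast to `int` so a missing or malformed value fails fast at startup rather than deep inside the generator loop.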
The code can be viewed on GitHub
Filename | Description |
---|---|
main.py | Main module takes parameters and runs generator |
h3csv.py | Module to create gzip'ed CSV files |
kafka.py | Wrapper for Kafka calls |
README.md | This file |
.dockerignore | Files excluded from the Docker build context |
Dockerfile | Build definition for the Docker image |
requirements.txt | Python library requirements |
kafkasetup/README.md | Instructions to set up Kafka in Docker |
kafkasetup/docker-compose.yml | Sample docker-compose.yml |
singlestoresetup/README.md | Instructions to set up SingleStore with Pipelines |
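The h3csv.py module is described above as creating gzip'ed CSV files. A stdlib-only sketch of that idea, suitable for feeding a SingleStore Pipeline; the function name and column names are illustrative assumptions, not the module's actual schema:

```python
import csv
import gzip

def write_gzip_csv(path, rows, header=("h3_cell", "lat", "lng")):
    """Write rows to a gzip-compressed CSV file.

    The default header columns are illustrative assumptions,
    not h3csv.py's actual schema.
    """
    # "wt" opens the gzip stream in text mode so csv can write strings;
    # newline="" lets the csv module control line endings.
    with gzip.open(path, "wt", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
```

Writing through `gzip.open` compresses on the fly, so no uncompressed intermediate file is ever created.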