3. Example
It is possible to create random logs to test the project by running the commands below.
First of all, start the main container, the Nginx proxy server and, if you want to visualize the data on Elasticsearch, Kibana:
sudo docker-compose up -d buffalogs nginx kibana
Then, load the Elastic Common Schema template with:
buffalogs_module/config/elasticsearch$ ./load_templates.sh
Finally, create the random test data by launching the Python script from the buffalogs_module/examples folder:
python random_example.py
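The script populates Elasticsearch with synthetic login events. A minimal sketch of what such a generator could look like — the field names follow the Elastic Common Schema, but the user list, countries, and batch size here are assumptions, not the actual contents of random_example.py:

```python
import json
import random
from datetime import datetime, timezone

# Hypothetical test data; the real random_example.py may use
# different users, fields and helpers.
USERS = ["alice", "bob", "carol"]
COUNTRIES = ["IT", "US", "DE", "JP"]

def random_login_event():
    """Build one synthetic ECS-style login document."""
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "user": {"name": random.choice(USERS)},
        "source": {
            "ip": ".".join(str(random.randint(1, 254)) for _ in range(4)),
            "geo": {"country_iso_code": random.choice(COUNTRIES)},
        },
        "event": {"outcome": "success", "category": "authentication"},
    }

if __name__ == "__main__":
    # Print a small batch as NDJSON, ready for the Elasticsearch bulk API.
    for _ in range(5):
        print(json.dumps(random_login_event()))
```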
Now you can see the newly uploaded log data at localhost:5601.
At this stage, you can run the application either manually or automatically (using Celery).
To run the application manually, launch the command below from the buffalogs_module/buffalogs folder:
python manage.py impossible_travel
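The impossible_travel task looks for pairs of logins that would require an implausible travel speed between their two locations. A minimal sketch of the core idea — this illustrates the general technique, not the project's actual implementation, and the 900 km/h threshold is an assumption:

```python
import math
from datetime import datetime, timezone

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def is_impossible_travel(prev, curr, max_speed_kmh=900):
    """Flag two consecutive logins if the implied speed exceeds an airliner's."""
    dist = haversine_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
    hours = (curr["time"] - prev["time"]).total_seconds() / 3600
    return hours > 0 and dist / hours > max_speed_kmh

# Rome -> New York (~6900 km) in one hour is clearly impossible.
rome = {"lat": 41.9, "lon": 12.5, "time": datetime(2023, 1, 1, 12, tzinfo=timezone.utc)}
nyc = {"lat": 40.7, "lon": -74.0, "time": datetime(2023, 1, 1, 13, tzinfo=timezone.utc)}
print(is_impossible_travel(rome, nyc))  # True
```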
To run it automatically, start up all the tools with:
sudo docker-compose up -d
The previous command starts the remaining containers; in particular, it activates Celery and Celery Beat, respectively the distributed task queue and the scheduler.
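Celery Beat triggers the detection task on a fixed schedule. A hypothetical beat configuration sketch — the task path and the 30-minute interval are assumptions, not the project's actual settings:

```python
from datetime import timedelta

# Hypothetical Celery Beat schedule; the real project defines its own
# task names and periods in its Django/Celery settings module.
CELERY_BEAT_SCHEDULE = {
    "detect-impossible-travel": {
        "task": "impossible_travel.tasks.process_logs",  # assumed task path
        "schedule": timedelta(minutes=30),               # assumed interval
    },
}
```

With a schedule like this, Celery Beat enqueues the task periodically and any running Celery worker picks it up, so no manual `manage.py` invocation is needed.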
In both cases, the results are available at localhost:80.
Only the details of the logins with different user agents or countries are shown.