3. Example
Lorena Goldoni edited this page Jul 25, 2023
You can create random logs to test the project by following the steps below:

- Launch Elasticsearch and Kibana with the `docker-compose -f docker-compose.yaml -f docker-compose.elastic.yaml up -d elasticsearch kibana` command
- Load the Elastic Common Schema template by running the `./load_templates.sh` script
- Open Kibana at `localhost:5601`, go to Stack Management → Index Patterns and create a new Index Pattern with `cloud-*` as the name, selecting `@timestamp` as the timestamp field
- Create the random test data by launching the Python script from the `buffalogs_module/examples` folder: `python random_example.py`
- Check: you should now see a Docs count of 2000 at Stack Management → Index Management for the `cloud-<today_date>` index

You can then analyze the newly uploaded log data at `localhost:5601`.
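To give an idea of what the generated test data looks like, here is a minimal sketch of a random-login generator in the spirit of `random_example.py`. The field names follow the Elastic Common Schema, but the exact fields, values, and counts used by the real script are assumptions; check the script in `buffalogs_module/examples` for the authoritative version.

```python
import json
import random
from datetime import datetime, timedelta, timezone

# Hypothetical values; the real script defines its own pools of users,
# countries, and user agents.
COUNTRIES = ["IT", "US", "DE", "JP", "BR"]
USER_AGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64)",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X)",
]

def random_login_docs(n, start=None):
    """Build n ECS-style authentication documents spread over the last hour."""
    start = start or datetime.now(timezone.utc) - timedelta(hours=1)
    docs = []
    for i in range(n):
        docs.append({
            "@timestamp": (start + timedelta(seconds=i)).isoformat(),
            "user": {"name": f"user{random.randint(1, 20)}"},
            "source": {
                "ip": f"10.0.{random.randint(0, 255)}.{random.randint(1, 254)}",
                "geo": {"country_iso_code": random.choice(COUNTRIES)},
            },
            "user_agent": {"original": random.choice(USER_AGENTS)},
            "event": {"outcome": "success", "category": "authentication"},
        })
    return docs

docs = random_login_docs(2000)
print(len(docs))
print(json.dumps(docs[0], indent=2))
# Indexing these into the cloud-<today_date> index would then go through the
# Elasticsearch client, e.g. elasticsearch.helpers.bulk, with each dict as _source.
```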
At this stage, you can choose to run the application manually or automatically (using Celery).
To run the application manually, launch the management command below from `buffalogs_module/buffalogs`:

`python manage.py impossible_travel`
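The core idea behind an "impossible travel" check can be sketched as follows: flag consecutive logins of the same user whose implied travel speed is physically infeasible. This is a hedged illustration of the general technique, not BuffaLogs' actual implementation; the threshold and field names are assumptions.

```python
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km ≈ Earth's mean radius

def is_impossible_travel(prev, curr, max_speed_kmh=900):
    """Flag a login pair if the implied speed exceeds roughly a plane's cruise speed.

    max_speed_kmh=900 is an illustrative default, not the project's setting.
    """
    hours = (curr["ts"] - prev["ts"]).total_seconds() / 3600
    if hours <= 0:
        return True  # simultaneous logins from two places are always suspicious
    distance = haversine_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
    return distance / hours > max_speed_kmh

rome = {"ts": datetime(2023, 7, 25, 10, 0), "lat": 41.9, "lon": 12.5}
nyc = {"ts": rome["ts"] + timedelta(hours=1), "lat": 40.7, "lon": -74.0}
print(is_impossible_travel(rome, nyc))  # Rome → New York in one hour → True
```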
You can also clear all the data saved in the database by running:

`python manage.py clear_models`
To run it in an automated way, just start up all the tools with:

`sudo docker-compose up -d`
In both cases, the results are available at `localhost:80`; only the details of logins with different user agents or countries are shown.