Powers the central services required to run the following projects:
If you wish to contribute, please join our Discord located at: https://discord.gg/7xF65ap
The PS2Alerts project utilises Kubernetes as its deployment and containerisation solution. It matches our current infrastructure and solves a TON of headaches when it comes to getting code out to the world, particularly SSL certificates. Fuck SSL certificate management. Locally, however, for the sake of lowering local dev environment complexity and "things doing weird stuff", we use Ansible to provision and maintain the local development environment. It installs not only the services required to run the application, but also a set of standardised commands shared across all developers.
Debian Linux and macOS Catalina are the only supported operating systems for development. Development on Windows is possible but is a hassle; if you feel the need to develop on Windows, WSL2 will be much easier to set up. macOS works, with Homebrew filling in some gaps. Windows is not officially supported.
If you use Windows:
- Install WSL2 by opening PowerShell with admin rights and running `wsl --install`.
- Restart, and Ubuntu will be available as an app or in Windows Terminal.
- When you first start Ubuntu you will have to choose your username and password (this is your sudo password).
- To check your WSL version, run `wsl -l -v` in PowerShell.
- For Docker, install the Windows version and integrate it with WSL by ticking the WSL integration checkbox in Docker Desktop's settings and applying it. Install Ansible as listed in the requirements, and MongoDB Compass if you are working with data.
- Then add the entries below to your hosts file. To get to it, paste `c:\Windows\System32\Drivers\etc\hosts` into Explorer:
127.0.0.1 dev.api.ps2alerts.com
127.0.0.1 dev.router.ps2alerts.com
127.0.0.1 dev.ps2alerts.com
127.0.0.1 dev.aggregator.ps2alerts.com
127.0.0.1 dev.aggregator-ps4eu.ps2alerts.com
127.0.0.1 dev.aggregator-ps4us.ps2alerts.com
- To ensure that the project will run properly, install NVM, then run `nvm install --lts` to get a long-term support version of Node.js. Install yarn via `npm install --global yarn`, then run `yarn install` in each of the repos that you cloned to ensure you have the necessary files. Then, in the stack, run `ps2alerts-init` followed by `ps2alerts-website-init` for the first time, and `ps2alerts-website-dev` when you run this project again in the future.
- Checking the site is working: go to `http://localhost:8080` to check that Traefik is showing the services running properly, then go to `dev.ps2alerts.com`. You will get an HTTPS warning, but you can click Advanced in Firefox/Edge and continue. Note that due to not having SSL you won't have any data, but you can see that the site itself is working (a quick curl check is also sketched after this list).
- To get HTTPS, follow the Generating SSL Certificates steps.
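If you'd rather sanity-check from the WSL/Ubuntu shell, the sketch below does the same checks with curl. This is not part of the official steps; the `-k` flag just skips verification of the self-signed dev certificate.

```bash
# Optional sanity checks from a WSL/Ubuntu shell (not part of the official steps).
curl -I http://localhost:8080        # Traefik dashboard should answer with an HTTP status line
curl -kI https://dev.ps2alerts.com   # -k skips certificate verification for the self-signed dev cert
```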
- Ansible
- Docker (including post install steps for Linux)
- A terminal program / PowerShell. For Linux I recommend Terminator.
- A good IDE. I recommend PHPStorm (paid) / IntelliJ IDEA (free).
- MongoDB Compass - if you're going to be interacting with data
Ensure you have git cloned all 4 projects in the organisation down to your local machine. You need to have them all as siblings, e.g.:
/path/to/your/code/folder/ps2alerts
-- aggregator
-- api
-- stack
-- website
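As an example, here is a minimal clone sketch, assuming the repositories live under the ps2alerts GitHub organisation (swap in SSH URLs or your own forks as needed):

```bash
# Clone the four projects as siblings (HTTPS org URLs are an assumption; adjust to your forks).
cd /path/to/your/code/folder
mkdir -p ps2alerts && cd ps2alerts
git clone https://github.com/ps2alerts/aggregator.git
git clone https://github.com/ps2alerts/api.git
git clone https://github.com/ps2alerts/stack.git
git clone https://github.com/ps2alerts/website.git
```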
Run `ansible-playbook init.yml -K` and provide your sudo password. Ansible will ensure you have the correct commands, etc.
Simply execute `ps2alerts-init` in your terminal to begin! This will go off to each project and install all of its dependencies. Grab a Snickers, and once it's done, we're good to go.

Once the project has fully initialized, you can start it from now on using `ps2alerts-start`!
We have designed the API project to initialize the database for you. See "How to get data collection going" section below.
If you want to access the dev environment over HTTPS (Say, because Chrome forces it and the main domain uses HSTS), you'll need to follow some additional steps:
The following instructions are based on Mac (alternatives for Linux are listed); YMMV on other platforms.
- The certificates must be generated and placed in the `~/ps2alerts/certs` directory. To generate local self-signed certs, we're going to use `mkcert` (see the mkcert documentation for more context):
  - `cd ~/ps2alerts/certs`
  - `brew install mkcert nss` (nss is for Firefox; you can omit it if you don't use Firefox). Further install methods, including for Linux, are listed in the mkcert documentation.
  - `mkcert -install` (this will install the root certificate into your system; you will be prompted for your user password)
  - `mkcert -cert-file dev.ps2alerts.com.pem -key-file dev.ps2alerts.com-key.pem dev.ps2alerts.com dev.api.ps2alerts.com dev.aggregator.ps2alerts.com dev.aggregator-ps4eu.ps2alerts.com dev.aggregator-ps4us.ps2alerts.com`
- Restart the stack with `ps2alerts-stack-restart` if you're already running it, or start it with `ps2alerts-stack-start`, to apply the rest of the certificate changes.
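To confirm the stack is now serving your generated certificate, one option is an openssl check. This is a sketch, assuming openssl is installed and the stack is running:

```bash
# Inspect the certificate presented on dev.ps2alerts.com (requires openssl; stack must be running).
openssl s_client -connect dev.ps2alerts.com:443 -servername dev.ps2alerts.com </dev/null 2>/dev/null \
  | openssl x509 -noout -issuer -subject -dates
```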
- Run `ps2alerts-start`.
- Run `ps2alerts-aggregator-msg`.
- Choose an open continent etc.
- Open Rabbit or run `ps2alerts-aggregator-msg` and see it doing things.
- Now you have data!
We are using MongoDB as our data document storage solution. In order to connect to Mongo for local development work, open up the Mongo Compass client and put the following in the connection string:
mongodb://root:foobar@localhost:27017
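If you prefer a terminal over Compass, the same connection string works with mongosh. This is a sketch, assuming you have mongosh installed (it is not part of the stack's requirements):

```bash
# Connect to the local dev MongoDB with the credentials above (mongosh installed separately).
mongosh "mongodb://root:foobar@localhost:27017"
```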
The project utilises RabbitMQ (MQs 101) for both storage of incoming data to be consumed by the API, and for administration of the aggregator. Once you have started the stack, you can access the dev environment version of RabbitMQ by going to the following URL:
Using credentials: `guest` / `guest`
There you can see the channels and queues created by us, which are provisioned via the Ansible script. We are currently creating an exchange (for future purposes) but are directly asserting and binding queues in our applications for now. On local dev we don't use a vhost; on production we do, as it's a shared service.
Our queue topics are described below:
- `aggregatorAdmin` - Administrative messages manually triggered by developers, e.g. `instance metagame start 10 8` to start a metagame instance on Miller, Esamir. To inject messages into your local environment, run `ps2alerts-aggregator-msg`.
- `api-queue` - Messages to be consumed by the API and persisted, one queue per environment.
- `api-queue-delay-46min` - In order to figure out brackets, all GlobalAggregator messages must be delayed until the end of the alert; this is the 46 minute alert version.
- `api-queue-delay-91min` - Ditto, the 91 minute version.
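Once the stack is running, you can also confirm these queues exist from the command line. The sketch below assumes the RabbitMQ container is named ps2alerts-mq; that name is an assumption, so check `docker ps` for the real one.

```bash
# List queues and their message counts inside the RabbitMQ container.
# The container name is an assumption — confirm it with `docker ps`.
docker exec ps2alerts-mq rabbitmqctl list_queues name messages
```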
Chances are Rabbit is toppling over due to a massive message backlog. To fix this, simply wipe the data partition for Rabbit's volume in Docker. On Mac this is `rm -rf ~/ps2alerts/mq/*`; it should be similar for Linux. Then restart Rabbit.
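As a rough sequence, here is a sketch that again assumes the RabbitMQ container is named ps2alerts-mq (an assumption, not confirmed by this README):

```bash
# Stop Rabbit, wipe the backlog, and bring it back up (Mac path from this README).
docker stop ps2alerts-mq
rm -rf ~/ps2alerts/mq/*
docker start ps2alerts-mq
```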
We use Grafana to monitor metrics from within the applications.
- Create / update dashboards, alerts and other matters on Grafana.
- NOTE: If you plan to edit alerts, be aware that alert provisioning in Grafana is currently quite tedious. You are able to use the file provisioner to create the alerts; however, after they are created you cannot edit them without copying them first. Unfortunately, this means you will have two versions of the same alert (with one of them slightly tweaked).
- You will also need to export the alerts to the YML file, replace the old alert with your new one, and then commit that corrected file.
- It is recommended you do the following:
  - Provision the usual set of alerts by uncommenting the line in `start-dev.yml`.
  - Create a copy of your alert.
  - Edit the alert.
  - Export the alert list.
  - Merge the alert into `alerts.yml` under `monitoring/provisioning/alerting/alerts.yml`.
  - Delete your "new" alert.
  - Restart the Grafana container.
  - If you're finding that the alerts are being duplicated, you need to blow away your Grafana instance: `sudo rm -rf ~/ps2alerts/monitoring/grafana && docker restart ps2alerts-grafana`
  - This is not ideal, but unfortunately it is the faff that presents itself with Grafana alerts right now.
- Alternatively, you may wish to simply create the alert in question and merge that into `alerts.yml`. This will mean you don't have all alerts, but at least you won't have alerts you cannot control.
- Export Dashboards / Alerts to their JSON models:
  - Dashboards
    - Go to the dashboard's settings and look under JSON model. Copy and paste this model to `monitoring/lib/dashboards/<folder>/<dashboard>.json`.
  - Alerts
    - Go to the alert settings, click on Export, and save the file to `monitoring/provisioning/alerting/alerts.yml`.
- Commit these files.
Deployment is currently done manually by Maelstromeous. Upon merge into `master`, run `./sync-to-ceres.sh`.