Merge pull request #174 from mapswipe/dev
Deploy Script
Hagellach37 authored Sep 24, 2019
2 parents f681335 + 4524dec commit 905583f
Showing 7 changed files with 35 additions and 28 deletions.
deploy.sh (13 changes: 9 additions & 4 deletions)
@@ -1,6 +1,11 @@
 git pull
 python3 test_config.py
-sudo docker-compose build --no-cache
-sudo docker-compose up -d --force-recreate
-sudo docker logs firebase_deploy
-sudo docker ps -a
+docker-compose build --no-cache postgres
+docker-compose build --no-cache firebase_deploy
+docker-compose build --no-cache mapswipe_workers
+docker-compose build --no-cache manager_dashboard
+docker-compose build --no-cache nginx
+docker-compose build --no-cache api
+docker-compose up -d --force-recreate postgres firebase_deploy mapswipe_workers manager_dashboard nginx api
+docker logs firebase_deploy
+docker ps -a
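The updated script builds and recreates each compose service explicitly instead of building everything at once. A minimal sketch (not part of the commit) of how the same list of services could be driven by a loop:

```bash
#!/bin/bash
# Sketch only: the per-service build and recreate from the new deploy.sh,
# expressed as a loop over the compose services named in the diff.
services="postgres firebase_deploy mapswipe_workers manager_dashboard nginx api"

git pull
python3 test_config.py

for service in $services; do
    docker-compose build --no-cache "$service"
done

# Deliberately unquoted so the list expands into separate service arguments.
docker-compose up -d --force-recreate $services

docker logs firebase_deploy
docker ps -a
```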
docs/source/configuration.md (10 changes: 2 additions & 8 deletions)
@@ -33,10 +33,7 @@ Firebase is a central part of MapSwipe. In our setup we use *Firebase Database*,
 1. `your_project_id`: This is the name of your Firebase project (e.g. *dev-mapswipe*)
 2. `your_database_name`: This is the name of your Firebase database. It is very likely that this will be the same as your Firebase project name as well.)

-The `mapswipe_workers` module uses the [Firebase Python SDK](https://firebase.google.com/docs/reference/admin/python) to access *Firebase Database* services as administrator, you must generate a Firebase Service Account Key in JSON format. You can get it from Firebase.
-1. In the Firebase console, open Settings > Service Accounts.
-2. Click Generate New Private Key
-3. Store the JSON file under `mapswipe_workers/config/serviceAccountKey.json`
+The `mapswipe_workers` module uses the [Firebase Python SDK](https://firebase.google.com/docs/reference/admin/python) to access *Firebase Database* services as administrator, you must generate a Service Account Key file in JSON format. For this we use the previously generated Service Account Key. (Check the *Google APIs and Services Credentials* section again if you don't have it.) Copy the file to `mapswipe_workers/config/serviceAccountKey.json`.

 The `mapswipe_workers` module further uses the [Firebase Database REST API](https://firebase.google.com/docs/reference/rest/database) to access *Firebase Database* either as a normal user or project manager.
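For context, a minimal sketch of how such a service account key is typically loaded with the Firebase Admin SDK; the database URL and reference path below are placeholders, not values taken from this commit:

```python
# Sketch only: initialise the Firebase Admin SDK with the service account key.
# The database URL and the reference path are placeholder values.
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("mapswipe_workers/config/serviceAccountKey.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://your_database_name.firebaseio.com"
})

# Read some data back as administrator to confirm the credentials work.
projects = db.reference("v2/projects").get()
print(projects)
```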

@@ -98,10 +95,7 @@ First, create a new cloud storage bucket:
 3. Select storage location > `Multi-Region` > `eu`
 4. Select storage class > `Coldline`

-Then, generate a Google Cloud Service Account Key:
-1. Google Cloud Platform > IAM & Management > Service Accounts
-2. Create new Service Account > Select Name > e.g. `your_project_id_postgres_backup`
-3. Select Role > `Storage-Administrator`
+We need to access Google Cloud Storage. For this we use the previously generated Service Account Key. (Check the *Google APIs and Services Credentials* section again if you don't have it.) Copy the file to `postges/serviceAccountKey.json`.

 ```bash
 GCS_Link_URL=https://console.cloud.google.com/storage/browser/your_project_id_postgres_backup
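To illustrate how the copied key and the backup bucket might be used together, here is a hedged sketch; the database name, dump path, and bucket name are assumptions that mirror the `GCS_Link_URL` above rather than values from this commit:

```bash
# Sketch only: authenticate with the copied service account key and push a
# compressed Postgres dump to the backup bucket. All names are placeholders.
gcloud auth activate-service-account --key-file=postgres/serviceAccountKey.json
pg_dump --username=mapswipe_workers mapswipe | gzip > /tmp/mapswipe_backup.sql.gz
gsutil cp /tmp/mapswipe_backup.sql.gz gs://your_project_id_postgres_backup/
```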
mapswipe_workers/tests/test_00_main.sh (10 changes: 5 additions & 5 deletions)
@@ -1,4 +1,4 @@
-python test_02_create_projects.py
+python test_01_create_projects.py
 if [[ $? = 0 ]]; then
 echo "success"
 else
@@ -7,7 +7,7 @@ else
 fi


-python test_03_mapping.py
+python test_02_mapping.py
 if [[ $? = 0 ]]; then
 echo "success"
 else
@@ -16,7 +16,7 @@ else
 fi


-python test_04_firebase_to_postgres.py
+python test_03_firebase_to_postgres.py
 if [[ $? = 0 ]]; then
 echo "success"
 else
@@ -25,7 +25,7 @@ else
 fi


-python test_05_generate_stats.py
+python test_04_generate_stats.py
 if [[ $? = 0 ]]; then
 echo "success"
 else
@@ -34,7 +34,7 @@ else
 fi


-python test_06_clean_up.py
+python test_05_clean_up.py
 if [[ $? = 0 ]]; then
 echo "success"
 else
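The renumbered tests keep the same run-and-check pattern as before. A compact alternative sketch (not part of the commit) that stops at the first failing test:

```bash
#!/bin/bash
# Sketch only: run the renumbered tests in order and abort on the first failure.
set -e
for t in test_01_create_projects test_02_mapping test_03_firebase_to_postgres \
         test_04_generate_stats test_05_clean_up; do
    python "$t.py"
    echo "success"
done
```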
postgres/.dockerignore (2 changes: 2 additions & 0 deletions)
@@ -0,0 +1,2 @@
+# remove folder from build context
+/scripts
postgres/scripts/v1_to_v2/copy_groups_from_csv.sql (17 changes: 11 additions & 6 deletions)
@@ -25,6 +25,16 @@ UPDATE v1_groups
 SET progress = 100
 WHERE required_count <= 0;

+UPDATE v1_groups
+SET group_id = cast(v1_group_id as varchar);
+
+UPDATE groups
+SET (finished_count, required_count, progress) =
+(SELECT finished_count, required_count, progress
+FROM v1_groups
+WHERE groups.group_id = v1_groups.group_id
+AND groups.project_id = v1_groups.project_id);
+
 -- Insert or update data of temp table to the permant table.
 -- Note that the special excluded table is used to
 -- reference values originally proposed for insertion
@@ -48,9 +58,4 @@ SELECT
 project_type_specifics
 FROM
 v1_groups
-ON CONFLICT (project_id, group_id) DO UPDATE
-SET
-finished_count = excluded.finished_count,
-required_count = excluded.required_count,
-progress = excluded.progress
-;
+ON CONFLICT (project_id, group_id) DO NOTHING;
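With the upsert changed to `DO NOTHING`, the counts appear to be carried over by the new `UPDATE groups` statement instead. A small sanity-check query (not part of the commit) to compare the copied values:

```sql
-- Sketch only: spot-check that the counts and progress copied into groups
-- match the values computed in v1_groups.
SELECT g.project_id,
       g.group_id,
       g.finished_count,
       v.finished_count AS v1_finished_count,
       g.progress,
       v.progress AS v1_progress
FROM groups AS g
JOIN v1_groups AS v
  ON g.group_id = v.group_id
 AND g.project_id = v.project_id
LIMIT 10;
```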
postgres/scripts/v1_to_v2/copy_to_csv.sql (2 changes: 1 addition & 1 deletion)
@@ -1,6 +1,6 @@
 -- Export v1 MapSwipe data to csv.
 -- Rename attributes to conform to v2.
 \copy (SELECT archive, image, importkey as "import_id", isfeatured AS "is_featured", lookfor AS "look_for", name, progress, projectdetails AS "project_details", project_id, project_type, state AS "status", info AS "project_type_specifics" FROM projects WHERE project_id in (select project_id from results group by project_id) AND project_id = 5549) TO projects.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);
-\copy (SELECT i.import_id, i.info FROM imports i, projects p WHERE import_id in (select importkey as import_id from projects) AND p.project_id = 5549 AND p.importkey = i.import_id) TO imports.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);
+\copy (SELECT i.import_id, i.info FROM imports i , projects p WHERE import_id in (select importkey as import_id from projects) AND p.project_id = 5549 AND p.importkey = i.import_id) TO imports.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);
 \copy (SELECT project_id, group_id, count as "number_of_tasks", completedcount as "finished_count", verificationcount as "required_count", info as "project_type_specifics" FROM groups WHERE project_id in (select project_id from results group by project_id) AND project_id = 5549 ) TO groups.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);
 \copy (SELECT project_id, group_id, task_id, info as "project_type_specifics" FROM tasks WHERE project_id in (select project_id from results group by project_id) AND project_id = 5549) TO tasks.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);
postgres/scripts/v1_to_v2/generate_copy_to_csv.py (9 changes: 5 additions & 4 deletions)
@@ -20,18 +20,19 @@ def get_query(project_ids):
 f'-- Rename attributes to conform to v2.\n' \
 f'\copy (SELECT archive, image, importkey as "import_id", isfeatured AS "is_featured", lookfor AS "look_for", name, progress, projectdetails AS "project_details", project_id, project_type, state AS "status", info AS "project_type_specifics" FROM projects {clause}) TO projects.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);\n' \
 f'\copy (SELECT i.import_id, i.info FROM imports i {clause_import}) TO imports.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);\n' \
-f'\copy (SELECT project_id, group_id, count as "number_of_tasks", completedcount as "finished_count", verificationcount as "required_count", info as "project_type_specifics" FROM groups {clause} ) TO groups.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);\n' \
+f'\copy (SELECT project_id, group_id as "v1_group_id", count as "number_of_tasks", completedcount as "finished_count", verificationcount as "required_count", info as "project_type_specifics" FROM groups {clause} ) TO groups.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);\n' \
 f'\copy (SELECT project_id, group_id, task_id, info as "project_type_specifics" FROM tasks {clause}) TO tasks.csv WITH (FORMAT CSV, DELIMITER ",", HEADER TRUE);\n'
 return query


 def get_result_query(project_ids):
+clause = 'WHERE project_id in (SELECT project_id FROM projects GROUP BY project_id)'
 if project_ids is None:
-clause = ''
+pass
 elif len(project_ids) == 1:
-clause = f'WHERE project_id = {project_ids[0]}'
+clause = f'{clause} AND project_id = {project_ids[0]}'
 else:
-clause = f'WHERE project_id = {project_ids[0]}'
+clause = f'{clause} AND project_id = {project_ids[0]}'
 for project_id in project_ids[1:]:
 clause = clause + f' OR project_id = {project_id}'
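For reference, a short sketch (not part of the commit) of what the updated clause in `get_result_query` evaluates to for a hypothetical list of project ids:

```python
# Sketch only: reproduce the updated clause-building logic for an example input.
project_ids = [5549, 5550]

clause = 'WHERE project_id in (SELECT project_id FROM projects GROUP BY project_id)'
if project_ids is None:
    pass
elif len(project_ids) == 1:
    clause = f'{clause} AND project_id = {project_ids[0]}'
else:
    clause = f'{clause} AND project_id = {project_ids[0]}'
    for project_id in project_ids[1:]:
        clause = clause + f' OR project_id = {project_id}'

print(clause)
# WHERE project_id in (SELECT project_id FROM projects GROUP BY project_id) AND project_id = 5549 OR project_id = 5550
```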