Adds README for gp service
EdisonOrellana-NOAA committed Sep 30, 2024
1 parent 7335f68 commit 32f29eb
Showing 2 changed files with 32 additions and 2 deletions.
2 changes: 1 addition & 1 deletion Core/LAMBDA/viz_functions/main.tf
```diff
@@ -948,7 +948,7 @@ resource "aws_lambda_function" "viz_publish_service" {
       GIS_USERNAME        = "hydrovis.proc"
       PUBLISH_FLAG_BUCKET = var.python_preprocessing_bucket
       S3_BUCKET           = var.viz_authoritative_bucket
-      SD_S3_PATH          = "viz_sd_files/"
+      SD_S3_PATH          = "viz_sd_files"
       SERVICE_TAG         = local.service_suffix
       EGIS_DB_HOST        = var.egis_db_host
       EGIS_DB_DATABASE    = var.egis_db_name
```
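The one-line Terraform change above drops the trailing slash from `SD_S3_PATH`. A minimal sketch of why that likely matters, assuming (this is an assumption, not stated in the diff) that the consuming lambda joins the prefix and the `.sd` file name with an explicit `/`; the helper and service name below are hypothetical:

```python
def build_sd_key(sd_s3_path: str, service_name: str) -> str:
    # Hypothetical join: prefix + "/" + file name, mirroring how the
    # lambda might construct the S3 key for the uploaded .sd file.
    return f"{sd_s3_path}/{service_name}.sd"

# New value: clean key
print(build_sd_key("viz_sd_files", "example_service"))   # viz_sd_files/example_service.sd
# Old value with trailing slash: accidental double slash in the key,
# which S3 treats as a distinct (usually unintended) object key.
print(build_sd_key("viz_sd_files/", "example_service"))  # viz_sd_files//example_service.sd
```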
# Publish/Republish Geoprocessing Service
*This service is used in conjunction with the `Core/LAMBDA/viz_functions/viz_publish_service/lambda_function.py` Lambda function to automatically publish/republish Hydrovis services.*

### Overview
This geoprocessing service takes a `.mapx` file as input and outputs a `.sd` (service definition) file, which the `viz_publish_service` lambda then uses to publish services.

#### Environment variables for Publish Service
As enumerated in `Core/LAMBDA/viz_functions/main.tf`:
```hcl
GIS_PASSWORD = var.egis_portal_password
GIS_HOST = local.egis_host
GIS_USERNAME = "hydrovis.proc"
PUBLISH_FLAG_BUCKET = var.python_preprocessing_bucket
S3_BUCKET = var.viz_authoritative_bucket
SD_S3_PATH = "viz_sd_files"
SERVICE_TAG = local.service_suffix
EGIS_DB_HOST = var.egis_db_host
EGIS_DB_DATABASE = var.egis_db_name
EGIS_DB_USERNAME = jsondecode(var.egis_db_user_secret_string)["username"]
EGIS_DB_PASSWORD = jsondecode(var.egis_db_user_secret_string)["password"]
ENVIRONMENT = var.environment
```
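The variable names above come straight from `main.tf`; how the lambda consumes them is not shown here, but a minimal, hypothetical sketch of reading them with `os.environ` and failing fast when one is unset might look like:

```python
import os

# Names taken from the main.tf block above; reading them this way is an
# illustrative assumption about the lambda, not its actual implementation.
REQUIRED_VARS = [
    "GIS_PASSWORD", "GIS_HOST", "GIS_USERNAME",
    "PUBLISH_FLAG_BUCKET", "S3_BUCKET", "SD_S3_PATH",
    "SERVICE_TAG", "EGIS_DB_HOST", "EGIS_DB_DATABASE",
    "EGIS_DB_USERNAME", "EGIS_DB_PASSWORD", "ENVIRONMENT",
]

def load_config(env=os.environ):
    """Collect the publish-service settings, raising if any are missing."""
    missing = [name for name in REQUIRED_VARS if name not in env]
    if missing:
        raise KeyError(f"missing environment variables: {missing}")
    return {name: env[name] for name in REQUIRED_VARS}
```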

### How to install the geoprocessing service on the ArcGIS Server stack

1. On an EC2 instance with ArcGIS Pro, copy the script in this PR (`mapxtosd.py`) to the machine, and place a folder called `files` containing `Empty_Project.aprx` in that same directory (the script needs this file; it can be found alongside the script in the repo's utils folder). You will also need to sign in to the Portal of the EGIS you're planning to publish to with the admin credentials.
2. In Catalog, create a new toolbox (a regular toolbox is fine; it doesn't need to be a Python toolbox), and create a new script inside that toolbox.
3. Set the name, point the Script File to the script contained in this PR, and check the box for Import script (no other options on the first page need to be checked). On the parameters page, set up parameters for all the input arguments listed in the script (make sure `egis_db_password` is optional, since retrieving the secret from an environment variable is the more secure method). Click OK to create the script tool.
4. Open the new script tool and set up a test run with a real service. Getting it to run successfully may take some troubleshooting.
5. Once it runs successfully, click `Open History`, right-click the successful run, and go to `Share As` -> `Share Web Tool` (this option will be greyed out if you're not logged in as the admin account). Assign the description and tags you want, choose to copy data to the server, and choose the server to publish to. (I chose GP in TI, which makes sense, but this will need to be done in UAT and Prod in coordination with the EGIS folks, to ensure it's the server they want running these workflows.) Don't share with everyone (i.e. the public), only authenticated users.
6. When you validate, you'll probably get some warnings about `Empty_Project.aprx` and a connection string being uploaded to the server; that's fine and expected. You should then be able to test the tool through a REST job, as with the test service I created here: https://maps-testing.water.noaa.gov/gp/rest/services/
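Step 6 mentions testing the tool through a REST job. A sketch of building (not sending) a `submitJob` request URL following the standard ArcGIS Server geoprocessing REST pattern, using the `gp` web adaptor path from the URL above; the service name, task name, and parameter are hypothetical placeholders:

```python
from urllib.parse import urlencode

def gp_submit_job_url(host: str, service: str, task: str, params: dict) -> str:
    # ArcGIS Server geoprocessing services expose asynchronous tasks at
    # .../rest/services/<service>/GPServer/<task>/submitJob
    base = f"https://{host}/gp/rest/services/{service}/GPServer/{task}/submitJob"
    return f"{base}?{urlencode({'f': 'json', **params})}"

url = gp_submit_job_url(
    "maps-testing.water.noaa.gov",
    "MapxToSD",                                 # hypothetical service name
    "MapxToSD",                                 # hypothetical task name
    {"mapx_file": "s3://bucket/service.mapx"},  # hypothetical parameter
)
print(url)
```

Submitting that URL returns a job ID whose status can then be polled at the task's `jobs/<jobId>` endpoint.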
