Undeploy SFMS #3828

Merged 1 commit on Aug 8, 2024
20 changes: 0 additions & 20 deletions .github/workflows/deployment.yml
@@ -99,26 +99,6 @@ jobs:
oc login "${{ secrets.OPENSHIFT_CLUSTER }}" --token="${{ secrets.OC4_DEV_TOKEN }}"
bash openshift/scripts/oc_provision_nats_server_config.sh ${SUFFIX} apply

deploy-sfms-dev:
name: Deploy SFMS API to dev
if: github.triggering_actor != 'renovate'
needs: [build-api-image, deploy-dev-queue, configure-nats-server-name]
runs-on: ubuntu-22.04
steps:
- name: Set Variables
shell: bash
run: |
echo "SUFFIX=pr-${{ github.event.number }}" >> $GITHUB_ENV

- name: Checkout
uses: actions/checkout@v4

- name: Configure
shell: bash
run: |
oc login "${{ secrets.OPENSHIFT_CLUSTER }}" --token="${{ secrets.OC4_DEV_TOKEN }}"
MODULE_NAME=api SECOND_LEVEL_DOMAIN="apps.silver.devops.gov.bc.ca" VANITY_DOMAIN="${SUFFIX}-dev-psu.apps.silver.devops.gov.bc.ca" ENVIRONMENT="development" bash openshift/scripts/oc_deploy_sfms.sh ${SUFFIX} apply

deploy-dev:
name: Deploy to Dev
if: github.triggering_actor != 'renovate'
2 changes: 1 addition & 1 deletion .sonarcloud.properties
@@ -16,7 +16,7 @@ sonar.test.exclusions=*.feature
sonar.tests.inclusions=**/*.test.tsx

# Exclude duplication in fba tests due to many similar calculation numbers, ignore sample code as it's temporary, ignore sfms entrypoint, ignore util tests, ignore temporary fwi folder
sonar.cpd.exclusions=api/app/tests/fba_calc/*.py, api/app/weather_models/wind_direction_sample.py, api/app/sfms.py, web/src/features/moreCast2/util.test.ts, web/src/utils/fwi
sonar.cpd.exclusions=api/app/tests/fba_calc/*.py, api/app/weather_models/wind_direction_sample.py, web/src/features/moreCast2/util.test.ts, web/src/utils/fwi

# Encoding of the source code. Default is default system encoding
sonar.sourceEncoding=UTF-8
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -85,6 +85,7 @@
"maxy",
"miny",
"morecast",
"nats",
"ndarray",
"numba",
"ORJSON",
@@ -101,8 +102,10 @@
"rocketchat",
"sfms",
"sqlalchemy",
"starlette",
"tobytes",
"upsampled",
"uvicorn",
"vectorize",
"VSIL",
"vsimem",
1 change: 0 additions & 1 deletion Dockerfile
@@ -75,7 +75,6 @@ COPY ./api/alembic.ini /app
# Copy pre-start.sh (it will be run on startup):
COPY ./api/prestart.sh /app
COPY ./api/start.sh /app
COPY ./api/start_sfms.sh /app

# Copy installed Python packages
COPY --from=builder /home/worker/.cache/pypoetry/virtualenvs /home/worker/.cache/pypoetry/virtualenvs
154 changes: 58 additions & 96 deletions api/app/routers/sfms.py
@@ -1,4 +1,5 @@
""" Router for SFMS """
"""Router for SFMS"""

import io
import logging
from datetime import datetime, date
@@ -21,10 +22,11 @@
prefix="/sfms",
)

SFMS_HOURLIES_PERMISSIONS = 'public-read'
SFMS_HOURLIES_PERMISSIONS = "public-read"


class FileLikeObject(io.IOBase):
""" Very basic wrapper of the SpooledTemporaryFile to expose the file-like object interface.
"""Very basic wrapper of the SpooledTemporaryFile to expose the file-like object interface.

The aiobotocore library expects a file-like object, but we can't pass the SpooledTemporaryFile
object directly to aiobotocore. aiobotocore looks for a "tell" method, which isn't present
@@ -48,34 +50,16 @@


def get_meta_data(request: Request) -> dict:
""" Create the meta-data for the s3 object.
"""Create the meta-data for the s3 object.
# NOTE: No idea what timezone this is going to be. Is it UTC? Is it PST? Is it PDT?
"""
last_modified = datetime.fromisoformat(request.headers.get(
'Last-modified'))
create_time = datetime.fromisoformat(request.headers.get(
'Create-time'))
return {
'last_modified': last_modified.isoformat(),
'create_time': create_time.isoformat()}

@router.get('/ready')
async def get_ready():
""" A simple endpoint for OpenShift readiness """
return Response()


@router.get('/health')
async def get_health():
""" A simple endpoint for Openshift Healthchecks. """
return Response()


@router.post('/upload')
async def upload(file: UploadFile,
request: Request,
background_tasks: BackgroundTasks,
_=Depends(sfms_authenticate)):
last_modified = datetime.fromisoformat(request.headers.get("Last-modified"))
create_time = datetime.fromisoformat(request.headers.get("Create-time"))
return {"last_modified": last_modified.isoformat(), "create_time": create_time.isoformat()}


@router.post("/upload")
async def upload(file: UploadFile, request: Request, background_tasks: BackgroundTasks, _=Depends(sfms_authenticate)):
"""
Trigger the SFMS process to run on the provided file.
The header MUST include the SFMS secret key.
@@ -89,20 +73,17 @@
-F 'file=@hfi20220812.tif;type=image/tiff'
```
"""
logger.info('sfms/upload/')
logger.info("sfms/upload/")
# Get an async S3 client.
async with get_client() as (client, bucket):
# We save the Last-modified and Create-time as metadata in the object store - just
# in case we need to know about it in the future.
key = get_target_filename(file.filename)
logger.info('Uploading file "%s" to "%s"', file.filename, key)
meta_data = get_meta_data(request)
await client.put_object(Bucket=bucket,
Key=key,
Body=FileLikeObject(file.file),
Metadata=meta_data)
await client.put_object(Bucket=bucket, Key=key, Body=FileLikeObject(file.file), Metadata=meta_data)
await file.close()
logger.info('Done uploading file')
logger.info("Done uploading file")
try:
# We don't want to hold back the response to the client, so we'll publish the message
# as a background task.
@@ -122,10 +103,9 @@
# and can't be given that level of responsibility.
return Response(status_code=200)

@router.post('/upload/hourlies')
async def upload_hourlies(file: UploadFile,
request: Request,
_=Depends(sfms_authenticate)):

@router.post("/upload/hourlies")
async def upload_hourlies(file: UploadFile, request: Request, _=Depends(sfms_authenticate)):
"""
Trigger the SFMS process to run on the provided file for hourlies.
The header MUST include the SFMS secret key.
@@ -139,7 +119,7 @@
-F 'file=@hfi20220812.tif;type=image/tiff'
```
"""
logger.info('sfms/upload/hourlies')
logger.info("sfms/upload/hourlies")

if is_ffmc_file(file.filename):
# Get an async S3 client.
@@ -149,40 +129,34 @@
key = get_hourly_filename(file.filename)
logger.info('Uploading file "%s" to "%s"', file.filename, key)
meta_data = get_meta_data(request)
await client.put_object(Bucket=bucket,
Key=key,
ACL=SFMS_HOURLIES_PERMISSIONS,
Body=FileLikeObject(file.file),
Metadata=meta_data)
await client.put_object(Bucket=bucket, Key=key, ACL=SFMS_HOURLIES_PERMISSIONS, Body=FileLikeObject(file.file), Metadata=meta_data)
await file.close()
logger.info('Done uploading file')
logger.info("Done uploading file")
return Response(status_code=200)


@router.get('/hourlies', response_model=HourlyTIFs)
@router.get("/hourlies", response_model=HourlyTIFs)
async def get_hourlies(for_date: date):
"""
Retrieve hourly FFMC TIF files for the given date.
Retrieve hourly FFMC TIF files for the given date.
Files are named in the format: "fine_fuel_moisture_codeYYYYMMDDHH.tif", where HH is the two digit day hour in PST.
"""
logger.info('sfms/hourlies')
logger.info("sfms/hourlies")

async with get_client() as (client, bucket):
logger.info('Retrieving hourlies for "%s"', for_date)
bucket = config.get('OBJECT_STORE_BUCKET')
response = await client.list_objects_v2(Bucket=bucket, Prefix=f'sfms/uploads/hourlies/{str(for_date)}')
if 'Contents' in response:
hourlies = [HourlyTIF(url=f'https://nrs.objectstore.gov.bc.ca/{bucket}/{hourly["Key"]}') for hourly in response['Contents']]
logger.info(f'Retrieved {len(hourlies)} hourlies')
bucket = config.get("OBJECT_STORE_BUCKET")
response = await client.list_objects_v2(Bucket=bucket, Prefix=f"sfms/uploads/hourlies/{str(for_date)}")
if "Contents" in response:
hourlies = [HourlyTIF(url=f'https://nrs.objectstore.gov.bc.ca/{bucket}/{hourly["Key"]}') for hourly in response["Contents"]]
logger.info(f"Retrieved {len(hourlies)} hourlies")
return HourlyTIFs(hourlies=hourlies)
logger.info(f'No hourlies found for {for_date}')
logger.info(f"No hourlies found for {for_date}")
return HourlyTIFs(hourlies=[])


@router.post('/manual')
async def upload_manual(file: UploadFile,
request: Request,
background_tasks: BackgroundTasks):

@router.post("/manual")
async def upload_manual(file: UploadFile, request: Request, background_tasks: BackgroundTasks):
"""
Trigger the SFMS process to run on the provided file.
The header MUST include the SFMS secret key.
@@ -198,31 +172,27 @@
-F 'file=@hfi20220812.tif;type=image/tiff'
```
"""
logger.info('sfms/manual')
forecast_or_actual = request.headers.get('ForecastOrActual')
issue_date = datetime.fromisoformat(str(request.headers.get('IssueDate')))
secret = request.headers.get('Secret')
if not secret or secret != config.get('SFMS_SECRET'):
logger.info("sfms/manual")
forecast_or_actual = request.headers.get("ForecastOrActual")
issue_date = datetime.fromisoformat(str(request.headers.get("IssueDate")))
secret = request.headers.get("Secret")
if not secret or secret != config.get("SFMS_SECRET"):

Check warning (Codecov / codecov/patch): added lines api/app/routers/sfms.py#L175-L179 were not covered by tests
return Response(status_code=401)
# Get an async S3 client.
async with get_client() as (client, bucket):
# We save the Last-modified and Create-time as metadata in the object store - just
# in case we need to know about it in the future.
key = os.path.join('sfms', 'uploads', forecast_or_actual, issue_date.isoformat()[:10], file.filename)
key = os.path.join("sfms", "uploads", forecast_or_actual, issue_date.isoformat()[:10], file.filename)

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L185 was not covered by tests
# create the filename
logger.info('Uploading file "%s" to "%s"', file.filename, key)
meta_data = get_meta_data(request)
await client.put_object(Bucket=bucket,
Key=key,
Body=FileLikeObject(file.file),
Metadata=meta_data)
await client.put_object(Bucket=bucket, Key=key, Body=FileLikeObject(file.file), Metadata=meta_data)

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L189 was not covered by tests
await file.close()
logger.info('Done uploading file')
logger.info("Done uploading file")

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L191 was not covered by tests
return add_msg_to_queue(file, key, forecast_or_actual, meta_data, issue_date, background_tasks)


def add_msg_to_queue(file: UploadFile, key: str, forecast_or_actual: str, meta_data: dict,
issue_date: datetime, background_tasks: BackgroundTasks):
def add_msg_to_queue(file: UploadFile, key: str, forecast_or_actual: str, meta_data: dict, issue_date: datetime, background_tasks: BackgroundTasks):
try:
# We don't want to hold back the response to the client, so we'll publish the message
# as a background task.
@@ -231,14 +201,14 @@
if is_hfi_file(filename=file.filename):
logger.info("HFI file: %s, putting processing message on queue", file.filename)
for_date = get_date_part(file.filename)
message = SFMSFile(key=key,
run_type=forecast_or_actual,
last_modified=datetime.fromisoformat(meta_data.get('last_modified')),
create_time=datetime.fromisoformat(meta_data.get('create_time')),
run_date=issue_date,
for_date=date(year=int(for_date[0:4]),
month=int(for_date[4:6]),
day=int(for_date[6:8])))
message = SFMSFile(

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L204 was not covered by tests
key=key,
run_type=forecast_or_actual,
last_modified=datetime.fromisoformat(meta_data.get("last_modified")),
create_time=datetime.fromisoformat(meta_data.get("create_time")),
run_date=issue_date,
for_date=date(year=int(for_date[0:4]), month=int(for_date[4:6]), day=int(for_date[6:8])),
)
background_tasks.add_task(publish, stream_name, sfms_file_subject, message, subjects)
except Exception as exception:
logger.error(exception, exc_info=True)
@@ -251,30 +221,22 @@
return Response(status_code=200)


@router.post('/manual/msgOnly')
async def upload_manual_msg(message: ManualSFMS,
background_tasks: BackgroundTasks,
secret: str | None = Header(default=None)):
@router.post("/manual/msgOnly")
async def upload_manual_msg(message: ManualSFMS, background_tasks: BackgroundTasks, secret: str | None = Header(default=None)):
"""
Trigger the SFMS process to run on a tif file that already exists in s3.
Client provides, key, for_date, runtype, run_date and an
SFMS message is queued up on the message queue.
"""
logger.info('sfms/manual/msgOnly')
logger.info("sfms/manual/msgOnly")

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L231 was not covered by tests
logger.info("Received request to process tif: %s", message.key)
if not secret or secret != config.get('SFMS_SECRET'):
if not secret or secret != config.get("SFMS_SECRET"):

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L233 was not covered by tests
return Response(status_code=401)

async with get_client() as (client, bucket):
tif_object = await client.get_object(Bucket=bucket,
Key=message.key)
logger.info('Found requested object: %s', tif_object)
tif_object = await client.get_object(Bucket=bucket, Key=message.key)
logger.info("Found requested object: %s", tif_object)

Check warning (Codecov / codecov/patch): added lines api/app/routers/sfms.py#L237-L238 were not covered by tests
last_modified = datetime.fromisoformat(tif_object["Metadata"]["last_modified"])
create_time = datetime.fromisoformat(tif_object["Metadata"]["create_time"])
message = SFMSFile(key=message.key,
run_type=message.runtype,
last_modified=last_modified,
create_time=create_time,
run_date=message.run_date,
for_date=message.for_date)
message = SFMSFile(key=message.key, run_type=message.runtype, last_modified=last_modified, create_time=create_time, run_date=message.run_date, for_date=message.for_date)

Check warning (Codecov / codecov/patch): added line api/app/routers/sfms.py#L241 was not covered by tests
background_tasks.add_task(publish, stream_name, sfms_file_subject, message, subjects)
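
As a side note on the endpoints retained by this change: below is a minimal sketch of how the GET /sfms/hourlies route shown in the diff could be queried once deployed. The base URL (host and any path prefix) and the example date are assumptions for illustration only, not taken from this PR.

```python
import requests  # third-party HTTP client, assumed available in the caller's environment

# Hypothetical base URL; the real host and any path prefix are deployment-specific assumptions.
BASE_URL = "https://example.apps.silver.devops.gov.bc.ca"

# GET /sfms/hourlies takes the date as a query parameter (FastAPI parses the ISO string into a date).
response = requests.get(f"{BASE_URL}/sfms/hourlies", params={"for_date": "2022-08-12"})
response.raise_for_status()

# The response body matches the HourlyTIFs model: {"hourlies": [{"url": "..."}, ...]}
for hourly in response.json().get("hourlies", []):
    print(hourly["url"])
```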