Refactor to make S3 optional and simplify usage (#6)
* Simplify docker usage

* Use local files

* Upload image to GitHub

* Remove assets directory
fivegrant authored Oct 12, 2023
1 parent d2a0315 commit 7f58a4e
Showing 15 changed files with 77 additions and 63 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/publish.yaml
@@ -21,7 +21,7 @@ jobs:
       - tag-generator
     uses: darpa-askem/.github/.github/workflows/bake-publish.yml@main
     with:
-      file: 'docker-bake.hcl'
+      file: 'docker/docker-bake.hcl'
      group: 'prod'
      registry: 'ghcr.io'
      organization: ${{ github.repository_owner }}
2 changes: 1 addition & 1 deletion .gitignore
@@ -159,4 +159,4 @@ cython_debug/
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 #.idea/
 
-outputs/
+outputs/ta*
16 changes: 14 additions & 2 deletions Makefile
@@ -1,6 +1,18 @@
.env:
	cp sample.env .env

PHONY:init
init:.env

PHONY:dev-init
dev-init:.env
	poetry install


PHONY:up
up: .env
	docker build -f docker/Dockerfile -t integration-dashboard .
	docker run --name dashboard -p8501:8501 -e USE_LOCAL='TRUE' -d integration-dashboard


PHONY:down
down:
	docker kill dashboard; docker rm dashboard
30 changes: 16 additions & 14 deletions README.md
@@ -9,12 +9,25 @@ Specifically, the dashboard indicates
 Currently, the dashboard is testing `knowledge-middleware` integration
 but other services and TAs might be checked in the future.
 
-![TA1 Dashboard Screenshot](./assets/ta1-dashboard-example.png)
+![TA1 Dashboard Screenshot](https://github.com/DARPA-ASKEM/integration-dashboard/assets/14170067/da57d762-6e22-4130-ad34-ff790ef590e2)
 
 
 ## Usage
-To set up the project, run
+
+To view the current status, start the [Streamlit](https://streamlit.io/) app
+by running:
 ```
-make init
+make up
 ```
+Upon execution, you can pass the following environment variables (with `docker run` do `-e ENV_NAME='ENV_VAL'` for each variable).
+
+- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`: Standard credentials for reading and writing to S3
+- `BUCKET`: The bucket you'd like to read and write to.
+
+
+To set up the project for development, run
+```
+make dev-init
+```
To add a new report, run from [`knowledge-middleware`](https://github.com/DARPA-ASKEM/knowledge-middleware) (NOT THIS REPO)
@@ -26,14 +39,3 @@
 This uploads a `report_{datetime}.json` to S3 which the dashboard reads
 off of directly.
 
-
-To view the current status, start the [Streamlit](https://streamlit.io/) app
-by running:
-```
-docker build . -t integration-dashboard
-docker run --name dashboard -p8501:8501 integration-dashboard
-```
-Upon execution, you can pass the following environment variables (with `docker run` do `-e ENV_NAME='ENV_VAL'` for each variable).
-
-- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`: Standard credentials for reading and writing to S3
-- `BUCKET`: The bucket you'd like to read and write to.
-- `SKEMA_RS_URL`, `TA1_UNIFIED_URL`, and `MIT_TR_URL`: Provide overrides for the services being used.
Binary file removed assets/ta1-dashboard-example.png
29 changes: 7 additions & 22 deletions dashboard/ui/Home.py
@@ -7,32 +7,17 @@
 """
 # Integration Dashboard
-This dashboard tracks the status of TA1-TA4 integration by viewing
-reports generated by TA4's [`knowledge-middleware`](https://github.com/DARPA-ASKEM/knowledge-middleware/).
-Currently, the following features are shown on the dashboard:
-- Scenario Overview: The name and description of the scenario.
-- Integration Status: The status of `knowledge-middleware` calling an operation on a specific scenario.
-- Execution Time: How long it took for `knowledge-middleware` to peform a certain operation.
-- Logs: Unfiltered logs from `knowledge-middleware`
-Terarium regularly uploads new reports to S3. Additionally, a
-report can be manually generated by running the 'Report' action
-on `knowledge-middleware`'s GitHub repo (NOTE: THIS MANUAL ACTION WILL LIKELY BE
-REMOVED SOON).
-New scenarios may be added by creating adding a new directory in scenarios. For the
-operations you'd like to test over that scenario, make sure you have the proper resources
-available. To see which resource files need to exist for a specific operation, please
-check [`tests/resources.yaml`](https://github.com/DARPA-ASKEM/knowledge-middleware/blob/main/tests/resources.yaml).
-Make sure to include a `config.yaml` inside your scenario directory which includes the fields:
-- name
-- description
-- enabled (a bulleted list the operations you want to run for the scenario)
+This dashboard tracks the status of TA1-TA4 and TA3-TA4 integration by viewing
+reports generated by TA4's [`knowledge-middleware`](https://github.com/DARPA-ASKEM/knowledge-middleware/)
+and [`simulation-integration`](https://github.com/DARPA-ASKEM/simulation-integration).
+Terarium regularly uploads new reports to S3. See [here](https://github.com/DARPA-ASKEM/simulation-integration)
+to add TA1 scenarios and [here](https://github.com/DARPA-ASKEM/simulation-integration) for TA3.
 """
 
 st.sidebar.markdown("""
-Integration Dashboard
+Homepage for the Integration Dashboard
 """)

4 changes: 2 additions & 2 deletions dashboard/ui/pages/1_TA1.py
@@ -7,8 +7,8 @@
 import streamlit as st
 import pandas as pd
 
-from dashboard.ui.utils.storage import select_report
-from dashboard.ui.utils.formatting import custom_title
+from dashboard.utils.storage import select_report
+from dashboard.utils.formatting import custom_title
 
 # Let the user select a report based on formatted timestamps
 st.title("TA1 Integration Dashboard")
32 changes: 16 additions & 16 deletions dashboard/ui/pages/2_TA3.py
@@ -8,8 +8,8 @@
 import streamlit as st
 import pandas as pd
 
-from dashboard.ui.utils.storage import select_report
-from dashboard.ui.utils.formatting import custom_title
+from dashboard.utils.storage import select_report
+from dashboard.utils.formatting import custom_title
 
 
 st.title("TA3 Integration Dashboard")
@@ -26,26 +26,26 @@
 TA3 integration status
 """)
 
-# """
-# ### Tests Overview
-# """
+"""
+### Tests Overview
+"""
 
-if services is not None:
-    st.write("### Service Info")
-    service_names = list(services.keys())
-    service_data = {
-        "Service": service_names,
-        "Version": [services[name]["version"] for name in service_names],
-    }
-    st.dataframe(pd.DataFrame(service_data), hide_index=True)
+# if services is not None:
+#     st.write("### Service Info")
+#     service_names = list(services.keys())
+#     service_data = {
+#         "Service": service_names,
+#         "Version": [services[name]["version"] for name in service_names],
+#     }
+#     st.dataframe(pd.DataFrame(service_data), hide_index=True)
 
 proper_names = {
     "pyciemss": "PyCIEMSS",
     "sciml": "SciML"
 }
 
 
 for service in proper_names:
     test_results = defaultdict(lambda: defaultdict())
@@ -60,7 +60,7 @@
     dataframes = {name: pd.DataFrame(index=scenarios, columns=operations) for name in tests}
 
 
-    st.write(f"### {proper_names[service]} Overview")
+    st.write(f"## {proper_names[service]} Overview")
 
     for test in tests:
         df = dataframes[test]
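The loop above builds one `pandas` DataFrame per test, indexed by scenario with one column per operation, then fills each cell with a status marker. A minimal standalone sketch of that grid-building pattern (the scenario and operation names here are hypothetical, not taken from real reports):

```python
import pandas as pd

# Hypothetical parsed report data: (scenario, operation) -> pass/fail.
scenarios = ["scenario_a", "scenario_b"]
operations = ["simulate", "calibrate"]
results = {
    ("scenario_a", "simulate"): True,
    ("scenario_a", "calibrate"): False,
    ("scenario_b", "simulate"): True,
}

# One status grid: rows are scenarios, columns are operations.
df = pd.DataFrame(index=scenarios, columns=operations)
for (scenario, operation), passed in results.items():
    df.loc[scenario, operation] = "✅" if passed else "❌"
# Mark operations that produced no result for a scenario.
df = df.fillna("–")
```

A frame shaped this way renders directly via `st.dataframe(df)`, which is essentially what the dashboard page does per test.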
File renamed without changes.
File renamed without changes.
17 changes: 16 additions & 1 deletion dashboard/ui/utils/storage.py → dashboard/utils/storage.py
@@ -8,6 +8,7 @@
 import streamlit as st
 
 BUCKET = os.environ.get("BUCKET")
+USE_LOCAL = os.environ.get("USE_LOCAL", "FALSE").lower() == "true"
 s3 = boto3.client("s3")
 
@@ -24,6 +25,14 @@ def format_timestamp_from_filename(filename):
     raise Exception("Extra file was included")
 
 
+def fetch_local(ta):
+    files = glob("report*.json", root_dir=f"outputs/{ta}")
+    return {
+        file: json.load(open(f"outputs/{ta}/{file}"))
+        for file in files
+    }
+
+
 def download(ta):
     objects = s3.list_objects(Bucket=BUCKET, Prefix=ta)
     handles = [content["Key"] for content in objects['Contents']]
 
@@ -36,7 +45,13 @@ def generate_timestamp_to_filenames(ta):
 
 
 def select_report(ta):
-    report_files = download(ta)
+    if not USE_LOCAL:
+        report_files = download(ta)
+    else:
+        report_files = fetch_local(ta)
+    if len(report_files) == 0:
+        st.warning("No reports available")
+        st.stop()
     timestamp_to_filename = {format_timestamp_from_filename(f): f for f in report_files}
     selected_timestamp = st.selectbox("Select a report", sorted(timestamp_to_filename.keys(), reverse=True))
     return report_files[timestamp_to_filename[selected_timestamp]]
1 change: 1 addition & 0 deletions Dockerfile → docker/Dockerfile
@@ -11,6 +11,7 @@ COPY dashboard dashboard
 RUN poetry install
 
 COPY .streamlit .streamlit
+COPY outputs outputs
 ENV AWS_ACCESS_KEY_ID notprovided
 ENV AWS_SECRET_ACCESS_KEY notprovided
 ENV BUCKET notprovided
2 changes: 1 addition & 1 deletion docker-bake.hcl → docker/docker-bake.hcl
@@ -43,7 +43,7 @@ target "_platforms" {
 }
 
 target "integration-dashboard-base" {
-  context = "."
+  context = ".."
   tags = tag("integration-dashboard", "", "")
   dockerfile = "Dockerfile"
 }
1 change: 1 addition & 0 deletions outputs/notice.txt
@@ -0,0 +1 @@
+Place your local reports in `outputs/ta1` and `outputs/ta3`.
4 changes: 1 addition & 3 deletions sample.env
@@ -1,6 +1,4 @@
 AWS_ACCESS_KEY_ID=aaaaaaaaaaaaaaaaaaaa
 AWS_SECRET_ACCESS_KEY=aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 BUCKET=somebucket
-SKEMA_RS_URL=http://skema-rs.staging.terarium.ai
-TA1_UNIFIED_URL=https://api.askem.lum.ai
-MIT_TR_URL=http://3.83.68.208
+USE_LOCAL=FALSE
