# Data management API for NAV

It serves a REST API for managing data products and provides self-service access to the data sources.

## Getting started with local development

1. Install required dependencies
2. Configure gcloud so you can access Nais clusters
3. Log in to GCP and configure docker:

   ```
   gcloud auth login --update-adc
   gcloud auth configure-docker europe-north1-docker.pkg.dev

   # There is also a make target for logging in to docker:
   make docker-login
   ```

4. (Optional) If you are on a Mac with an ARM CPU (M1, M2, M3, etc.), install Rosetta:

   ```
   softwareupdate --install-rosetta
   ```

5. Run some build commands:

   ```
   # Build all binaries
   make build

   # Run the tests
   make test
   ```

### Run with fully local resources

Note: this does not currently work, and we might not try to revive it.

With this configuration all dependencies run as containers, as can be seen in `docker-compose.yaml`:

- Google BigQuery using bigquery-emulator, with additional mocks for the IAM Policy endpoints (a client connection sketch follows after the steps below)
- Google Cloud Storage using fake-gcs-server
- Metabase with a patch for enabling use of bigquery-emulator
- Fake API servers for teamkatalogen and naisconsole

There are still a couple of services missing, though much functionality should work without them:

- Fetching of Google Groups
- Creating Google Cloud Service Accounts

1. Start the dependencies and API:

   ```
   # Starts the dependencies in the background, and runs the API in the foreground
   $ make run
   ```

2. (Optional) Start the nada-frontend

3. (Optional) Take a look at the locally running Metabase; the username is nada@nav.no, and the password is superdupersecret1
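
If you want to poke at the emulated BigQuery instance directly, a Go client can be pointed at the emulator instead of the real Google API. The sketch below is a minimal, hedged example: the endpoint, port, and project ID are assumptions, so check `docker-compose.yaml` for the values actually used locally.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()

	// NOTE: the endpoint and project ID are assumptions for illustration;
	// docker-compose.yaml is the source of truth for the local setup.
	client, err := bigquery.NewClient(ctx, "test-project",
		option.WithEndpoint("http://localhost:9050"),
		option.WithoutAuthentication(),
	)
	if err != nil {
		log.Fatalf("creating client: %v", err)
	}
	defer client.Close()

	// List the datasets the emulator exposes.
	it := client.Datasets(ctx)
	for {
		ds, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("iterating datasets: %v", err)
		}
		fmt.Println(ds.DatasetID)
	}
}
```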

## Making changes to the database or generated models and queries

1. Migrations allow you to modify the existing database; these are applied automatically during startup of the application
2. Queries let you generate new models and queries based on the existing structure (see the sketch below)

NB: If you make changes to the Queries, remember to run the generate command so your changes are propagated:

```
$ make generate
```
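
To make the "generated models and queries" step more concrete, the generated Go code is roughly shaped like the sketch below. This is only an illustration of the pattern, assuming sqlc-style generation, and not the project's actual generated code; the package, type, and query names are hypothetical.

```go
// Package gensql is a hypothetical name for the generated package.
package gensql

import (
	"context"
	"database/sql"
)

// Dataproduct is a hypothetical generated model mirroring a table.
type Dataproduct struct {
	ID   string
	Name string
}

// Queries wraps a database handle, as query generators typically do.
type Queries struct {
	db *sql.DB
}

// GetDataproduct is the kind of type-safe method generated from a named
// SQL query (e.g. "-- name: GetDataproduct :one").
func (q *Queries) GetDataproduct(ctx context.Context, id string) (Dataproduct, error) {
	row := q.db.QueryRowContext(ctx, "SELECT id, name FROM dataproducts WHERE id = $1", id)
	var dp Dataproduct
	err := row.Scan(&dp.ID, &dp.Name)
	return dp, err
}
```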

## Making changes to the API or Data Transfer Objects (DTOs)

1. We use tygo (https://github.com/gzuidhof/tygo) to generate TypeScript types for the DTOs.
2. When you change or add a DTO, it should be put in the pkg/service folder.
3. Run

   ```
   $ make generate-ts
   ```

   to generate the types by parsing the .go files in pkg/service.

4. Changes to API URLs and request definitions require manual coding in the frontend API client; that code is organized in frontend/lib/rest.

NB: Using inheritance (embedding) in Go types requires an extra annotation for tygo: `tstype:",extends"`. A sketch follows below; look for existing examples if you are not familiar with it.
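
As a minimal illustration of that annotation, the hypothetical DTOs below mark an embedded struct so that tygo emits a TypeScript `extends` clause instead of inlining the embedded fields:

```go
package service

// Metadata is a hypothetical DTO that other DTOs embed.
type Metadata struct {
	ID   string `json:"id"`
	Name string `json:"name"`
}

// Dataproduct embeds Metadata; the tstype:",extends" tag tells tygo to
// generate `interface Dataproduct extends Metadata { ... }` rather than
// copying the embedded fields into Dataproduct.
type Dataproduct struct {
	Metadata    `tstype:",extends"`
	Description string `json:"description"`
}
```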

## Bumping the Metabase version

The file .metabase_version controls the version of Metabase that is used in tests and for deployment to dev and prod. Check the Metabase releases page for available versions; we follow the Metabase Enterprise track.

When you bump this version and open a PR, the following happens:

1. We build a Metabase image, which is used during integration tests and for local development
2. We run the nada-backend integration tests using the new version of Metabase
3. We deploy the new version of Metabase to dev

On merge to main:

1. We deploy the new version of Metabase to prod

## Bumping the Mocks version

In the Makefile we set the target version for the mocks. If you change the mocks, you also need to bump the MOCKS_VERSION, so we get the latest changes.

## Update the images locally

We build and push images for the patched Metabase and the customized BigQuery emulator to speed up local development and integration tests. If you need to make changes to these:

1. Make changes to the base images

   Note: building the BigQuery emulator requires quite a bit of memory, so if you see something like `clang++: signal: killed`, you need to increase the amount of memory allocated to your container runtime.

2. Build the new images locally:

   ```
   $ make build-all
   ```

3. (Optional) Push the images to the container registry; this requires that you have run `make docker-login`:

   ```
   $ make push-all
   ```

## Debugging a running process with IntelliJ

While this doc is written for IntelliJ, I believe it should work pretty much the same for VSCode or other editors.

1. Follow the setup guide: https://www.jetbrains.com/help/go/attach-to-running-go-processes-with-debugger.html#attach-to-a-process-on-a-local-machine
2. Step (2) in the guide linked above is covered by `make run-online-dbg`

## Architecture

```mermaid
flowchart TD
    %% Define the layers
    Transport["Transport (e.g., HTTP)"] --> Router["Router (METHOD /path)"]
    Router --> Endpoint["Encoding and decoding (JSON)"]
    Endpoint --> Handler["Handler (e.g., Request Handlers)"]
    Handler --> Service1["Service1 (e.g., Data Processing Service)"]
    Handler --> Service2["Service2 (e.g., Authentication Service)"]
    Handler --> ServiceN["ServiceN"]
    Service1 --> Model1["Model1 (e.g., Big Query Model)"]
    Service2 --> Model2["Model2 (e.g., Data access)"]
    ServiceN --> ModelN["ModelN (e.g., Metabase)"]
    Service1 --> Storage1["Storage1 (e.g., PostgreSQL)"]
    Service2 --> Storage2["Storage2 (e.g., MongoDB)"]
    Service2 --> StorageN["StorageN"]
    Service1 --> API1["External API 1 (e.g., GCP Big Query API)"]
    Service2 --> API2["External API 2 (e.g., Metabase API)"]
    ServiceN --> APIN

    %% Styling classes
    classDef service fill:#f9f,stroke:#333,stroke-width:2px;
    class Service1,Service2,ServiceN service;
    classDef model fill:#bbf,stroke:#333,stroke-width:2px;
    class Model1,Model2,ModelN model;
    classDef storage fill:#ffb,stroke:#333,stroke-width:2px;
    class Storage1,Storage2,StorageN storage;
    classDef api fill:#bfb,stroke:#333,stroke-width:2px;
    class API1,API2,APIN api;
```
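
As a rough sketch of how these layers can map to Go code (the type names, route, and in-memory storage below are illustrative, not the actual nada-backend types): the handler decodes the request and delegates to a service, which depends only on interfaces for storage and external APIs.

```go
package main

import (
	"context"
	"encoding/json"
	"log"
	"net/http"
)

// Dataproduct is a stand-in model; the real models live in pkg/service.
type Dataproduct struct {
	ID   string `json:"id"`
	Name string `json:"name"`
}

// DataproductStorage abstracts persistence (e.g. PostgreSQL) away from the service.
type DataproductStorage interface {
	GetDataproduct(ctx context.Context, id string) (*Dataproduct, error)
}

// DataproductService holds the business logic and depends only on interfaces.
type DataproductService struct {
	storage DataproductStorage
}

func (s *DataproductService) Get(ctx context.Context, id string) (*Dataproduct, error) {
	return s.storage.GetDataproduct(ctx, id)
}

// getDataproductHandler decodes the request, calls the service, and encodes the JSON response.
func getDataproductHandler(svc *DataproductService) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		dp, err := svc.Get(r.Context(), r.URL.Query().Get("id"))
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		_ = json.NewEncoder(w).Encode(dp)
	}
}

// memoryStorage is a toy in-memory implementation used only for this sketch.
type memoryStorage struct{}

func (memoryStorage) GetDataproduct(_ context.Context, id string) (*Dataproduct, error) {
	return &Dataproduct{ID: id, Name: "example"}, nil
}

func main() {
	svc := &DataproductService{storage: memoryStorage{}}
	http.Handle("/dataproducts", getDataproductHandler(svc))
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```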