Merge pull request #714 from joshunrau/v1.0
joshunrau authored Feb 25, 2024
2 parents a4cdb0e + 25b75e7 commit a598cd1
Showing 36 changed files with 372 additions and 173 deletions.
17 changes: 10 additions & 7 deletions .env.template
@@ -3,14 +3,19 @@
## ---------------------------------

# The domain name to use for your site in the production compose stack
# If you set this to your own domain name, HTTPS will be enabled by default
# Otherwise, if you do not want to use HTTPS, for example because your domain
# is handled by an additional, system-level reverse proxy, you can disable it
# by specifying the HTTP port in the address (e.g., localhost:80)
SITE_ADDRESS=localhost:80
# The domain name to use for the gateway service in the production compose stack
GATEWAY_SITE_ADDRESS=localhost:3500

# The port to listen on
# The ports to listen on
APP_PORT=5500
PLAYGROUND_PORT=3750
GATEWAY_PORT=3500

# The Docker release to use (latest = stable)
RELEASE_CHANNEL=main
# The MongoDB version to use
MONGODB_VERSION=7.0

## ---------------------------------
## PRODUCTION + DEVELOPMENT
@@ -41,8 +46,6 @@ GITHUB_REPO_URL=https://github.com/DouglasNeuroInformatics/OpenDataCapture
GATEWAY_ENABLED=true
# A link to the license governing distribution of the platform and all derivative work
LICENSE_URL=https://www.gnu.org/licenses/agpl-3.0.en.html
# The base URL for the gateway. In development, this should correspond to GATEWAY_DEV_SERVER_PORT
GATEWAY_BASE_URL=http://localhost:3500
# The interval (in milliseconds) at which the API will attempt to resync with the gateway
GATEWAY_REFRESH_INTERVAL=10000
# The path to the SQLite file for the gateway (e.g., file:/path/to/database.db)
11 changes: 10 additions & 1 deletion .github/workflows/pull-request.yaml
@@ -3,12 +3,21 @@ on:
pull_request:
types: [opened, synchronize]
jobs:
configure:
runs-on: ubuntu-latest
outputs:
mongodb_version: ${{ steps.define.outputs.MONGODB_VERSION }}
steps:
- id: define
run: |
grep MONGODB_VERSION .env.template >> "$GITHUB_OUTPUT"
build:
name: Build, Lint, and Test
runs-on: ubuntu-latest
needs: configure
services:
mongo:
image: mongo:7.0
image: mongo:${{needs.configure.outputs.mongodb_version}}
ports:
- 27017:27017
steps:
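The `configure` job above works because any `KEY=value` line appended to the file at `$GITHUB_OUTPUT` becomes a step output, so the `MONGODB_VERSION=7.0` line that `grep` finds in `.env.template` is exposed as `steps.define.outputs.MONGODB_VERSION`. The following sketch illustrates that parsing mechanism; it is an illustration only, not GitHub Actions' actual implementation (which also supports a multiline heredoc syntax).

```typescript
// Parse KEY=value lines as GitHub Actions does for the $GITHUB_OUTPUT file.
// Illustrative only: real Actions output handling also supports heredocs.
function parseStepOutputs(githubOutput: string): Record<string, string> {
  const outputs: Record<string, string> = {};
  for (const line of githubOutput.split('\n')) {
    const eq = line.indexOf('=');
    if (eq > 0) {
      outputs[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
    }
  }
  return outputs;
}

// The line grep finds in .env.template:
const outputs = parseStepOutputs('MONGODB_VERSION=7.0');
// outputs.MONGODB_VERSION is then referenced downstream as
// ${{ needs.configure.outputs.mongodb_version }}
```

This keeps the MongoDB version pinned in exactly one place (`.env.template`), so CI and the compose stack cannot drift apart.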
27 changes: 7 additions & 20 deletions README.md
@@ -28,39 +28,26 @@

![license](https://img.shields.io/github/license/DouglasNeuroInformatics/OpenDataCapture)
![version](https://img.shields.io/github/package-json/v/DouglasNeuroInformatics/OpenDataCapture)
![build](https://github.com/DouglasNeuroInformatics/OpenDataCapture/actions/workflows/build.yaml/badge.svg)

<!-- ![build](https://github.com/DouglasNeuroInformatics/OpenDataCapture/actions/workflows/build.yaml/badge.svg) -->
<!-- [![codecov](https://codecov.io/gh/DouglasNeuroInformatics/OpenDataCapture/branch/main/graph/badge.svg?token=XHC7BY6PJ1)](https://codecov.io/gh/DouglasNeuroInformatics/OpenDataCapture) -->

</div>
<hr />

## About

Open Data Capture is an integrated suite of applications tailored for the continuous and longitudinal collection of both clinical and research data. This platform is anchored on a few foundational principles:
Open Data Capture is a web-based platform for continuous clinical data collection. Designed with clinician-researchers in mind, it supports both remote and in-person evaluations, covering a range of applications, from patient questionnaires completed at home to interactive memory tasks conducted in clinical settings. A robust security framework ensures that all collected data is securely stored in a structured, standardized manner. The platform offers an intuitive, user-friendly interface for filtering the stored data according to various criteria; once filtered, the data can be used to generate dynamic graphs and tables, or exported for research purposes.

- Versatility in Instruments: Whether it's surveys, clinical questionnaires, interactive tasks, or neuroimaging data, our platform can handle it all under the umbrella of a generic instrument.
- User-Friendly Design: Designed with the user in mind, its intuitive interface ensures that even those without specialized knowledge can navigate and utilize the platform with ease.
- Streamlined Deployment: With our one-liner deployment solution, leverage Docker Compose for a hassle-free, automated setup.
## Quick Start

## Deployment
Assuming that Docker and Docker Compose are already installed on your system, you can deploy an instance of Open Data Capture using the following command:

### Download Repository

```shell
git clone https://github.com/DouglasNeuroInformatics/OpenDataCapture
cd OpenDataCapture
```

### Launch Application

```shell
./scripts/generate-env.sh
docker compose up -d
docker compose exec mongo mongosh --eval "rs.initiate({_id: 'rs0', members: [{_id: 0, host: 'localhost:27017'}]});"
```sh
./scripts/generate-env.sh && docker compose up -d
```

By default, the application will run on port 5500. So, navigate to `http://localhost:5500` in your browser and you should be greeted with the setup screen.
By default, the application will run on port 5500. So, navigate to `http://localhost:5500` in your browser and you should be greeted with the setup screen. After getting started, we highly recommend reading our [deployment guide](http://localhost:4000/en/tutorials/deployment/) for additional information on how to configure Open Data Capture to best meet the needs of your organization.

## Contribution

11 changes: 6 additions & 5 deletions apps/api/Dockerfile
@@ -1,4 +1,4 @@
FROM node:lts-iron as base
FROM node:lts-alpine as base
WORKDIR /app
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
@@ -8,10 +8,14 @@ RUN corepack enable
# PRUNE WORKSPACE
FROM base AS builder
COPY . .
RUN apk add --no-cache libc6-compat
RUN apk update
RUN pnpm dlx turbo prune @open-data-capture/api --docker

# INSTALL DEPENDENCIES
FROM base AS installer
RUN apk add --no-cache libc6-compat
RUN apk update
COPY .gitignore jsconfig.json ./
COPY --from=builder /app/out/json/ .
RUN pnpm install --frozen-lockfile
@@ -25,9 +29,6 @@ RUN pnpm dlx turbo build --filter=@open-data-capture/api
FROM base AS runner
COPY --from=installer /app/apps/api/dist/ /app/dist/
COPY --from=installer /app/apps/api/public/ /app/public/
RUN addgroup --system --gid 1001 app
RUN adduser --system --uid 1001 app
RUN chown -R app:app /app
USER app
RUN echo '{ "type": "module" }' > package.json
USER node
CMD [ "node", "./dist/app.mjs" ]
3 changes: 2 additions & 1 deletion apps/api/package.json
@@ -10,7 +10,7 @@
"dev": "NODE_ENV=development env-cmd -f ../../.env concurrently \"pnpm build:watch\" \"sleep 1 && nodemon dist/app.mjs\"",
"format": "prettier --write src",
"lint": "tsc && eslint --fix src",
"start": "env-cmd -f ../../.env --use-shell 'NODE_OPTIONS=\"--enable-source-maps\" API_PROD_SERVER_PORT=$API_DEV_SERVER_PORT NODE_ENV=production ESBUILD_BINARY_PATH=dist/bin/esbuild node dist/app.mjs'",
"start": "env-cmd -f ../../.env --use-shell 'NODE_OPTIONS=\"--enable-source-maps\" API_PROD_SERVER_PORT=$API_DEV_SERVER_PORT NODE_ENV=production ESBUILD_BINARY_PATH=dist/bin/esbuild GATEWAY_SITE_ADDRESS=http://localhost:$GATEWAY_DEV_SERVER_PORT node dist/app.mjs'",
"test": "env-cmd -f ../../.env vitest run"
},
"dependencies": {
@@ -53,6 +53,7 @@
"devDependencies": {
"@nestjs/testing": "^10.3.0",
"@open-data-capture/esbuild-plugin-native-modules": "workspace:*",
"@open-data-capture/esbuild-plugin-prisma": "workspace:*",
"@open-data-capture/esbuild-plugin-runtime": "workspace:*",
"@types/express": "^4.17.21",
"@types/lodash": "^4.14.202",
21 changes: 4 additions & 17 deletions apps/api/scripts/build.js
@@ -4,6 +4,7 @@ import path from 'path';
import url from 'url';

import { nativeModulesPlugin } from '@open-data-capture/esbuild-plugin-native-modules';
import { prismaPlugin } from '@open-data-capture/esbuild-plugin-prisma';
import { runtimePlugin } from '@open-data-capture/esbuild-plugin-runtime';
import esbuild from 'esbuild';
import esbuildPluginTsc from 'esbuild-plugin-tsc';
@@ -36,21 +37,7 @@ const { __dirname, __filename, require } = await (async () => {
})();
`;

// Copy Prisma
async function copyPrisma() {
const coreDatabasePath = path.dirname(require.resolve('@open-data-capture/database/core'));
const files = await fs.readdir(coreDatabasePath);
const engineFilename = files.find((filename) => {
return filename.startsWith('libquery_engine') && filename.endsWith('.node');
});
if (!engineFilename) {
throw new Error(`Failed to resolve prisma engine from path: ${coreDatabasePath}`);
}
await fs.mkdir(path.join(outdir, 'core'));
await fs.copyFile(path.join(coreDatabasePath, engineFilename), path.join(outdir, 'core', engineFilename));
}

// Copy Prisma
// Copy ESBuild
async function copyEsbuild() {
const filepath = require.resolve('esbuild/bin/esbuild');
await fs.copyFile(filepath, path.join(binDir, 'esbuild'));
@@ -73,6 +60,7 @@ const options = {
tsconfigPath: tsconfig
}),
runtimePlugin({ outdir }),
prismaPlugin({ outdir: path.join(outdir, 'core') }),
nativeModulesPlugin()
],
target: ['node18', 'es2022'],
@@ -82,13 +70,12 @@
if (process.argv.includes('--watch')) {
const ctx = await esbuild.context({
...options,
external: [...options.external, 'esbuild'],
external: [...(options.external ?? []), 'esbuild'],
sourcemap: true
});
await ctx.watch();
console.log('Watching...');
} else {
await copyPrisma();
await copyEsbuild();
await esbuild.build(options);
console.log('Done!');
12 changes: 9 additions & 3 deletions apps/api/src/assignments/assignments.service.ts
@@ -15,14 +15,20 @@ import { CreateAssignmentDto } from './dto/create-assignment.dto';

@Injectable()
export class AssignmentsService {
private readonly gatewayBaseUrl: string;
private readonly assignmentBaseUrl: string;

constructor(
@InjectModel('Assignment') private readonly assignmentModel: Model<'Assignment'>,
configurationService: ConfigurationService,
private readonly gatewayService: GatewayService
) {
this.gatewayBaseUrl = configurationService.get('GATEWAY_BASE_URL');
if (configurationService.get('NODE_ENV') === 'production') {
const siteAddress = configurationService.get('GATEWAY_SITE_ADDRESS')!;
this.assignmentBaseUrl = siteAddress.origin;
} else {
const gatewayPort = configurationService.get('GATEWAY_DEV_SERVER_PORT')!;
this.assignmentBaseUrl = `http://localhost:${gatewayPort}`;
}
}

async create({ expiresAt, instrumentId, subjectId }: CreateAssignmentDto): Promise<Assignment> {
@@ -44,7 +50,7 @@ export class AssignmentsService {
id: subjectId
}
},
url: `${this.gatewayBaseUrl}/assignments/${id}`
url: `${this.assignmentBaseUrl}/assignments/${id}`
}
});
try {
72 changes: 55 additions & 17 deletions apps/api/src/configuration/configuration.schema.ts
@@ -1,22 +1,60 @@
import { $BooleanString } from '@open-data-capture/common/core';
import { z } from 'zod';

export const $Configuration = z.object({
API_DEV_SERVER_PORT: z.coerce.number().positive().int().optional(),
API_PROD_SERVER_PORT: z.coerce.number().positive().int().default(80),
DEBUG: $BooleanString,
GATEWAY_API_KEY: z.string().min(32),
GATEWAY_BASE_URL: z.string().url(),
GATEWAY_ENABLED: $BooleanString,
GATEWAY_REFRESH_INTERVAL: z.coerce.number().positive().int(),
MONGO_DIRECT_CONNECTION: z.string().optional(),
MONGO_REPLICA_SET: z.string().optional(),
MONGO_RETRY_WRITES: z.string().optional(),
MONGO_URI: z.string().url(),
MONGO_WRITE_CONCERN: z.string().optional(),
NODE_ENV: z.enum(['development', 'production', 'test']),
SECRET_KEY: z.string().min(32),
VERBOSE: $BooleanString
});
const $OptionalURL = z.preprocess(
(arg) => arg || undefined,
z
.string()
.url()
.optional()
.transform((arg) => (arg ? new URL(arg) : undefined))
);

export const $Configuration = z
.object({
API_DEV_SERVER_PORT: z.coerce.number().positive().int().optional(),
API_PROD_SERVER_PORT: z.coerce.number().positive().int().default(80),
DEBUG: $BooleanString,
GATEWAY_API_KEY: z.string().min(32),
GATEWAY_DEV_SERVER_PORT: z.coerce.number().positive().int().optional(),
GATEWAY_ENABLED: $BooleanString,
GATEWAY_INTERNAL_NETWORK_URL: $OptionalURL,
GATEWAY_REFRESH_INTERVAL: z.coerce.number().positive().int(),
GATEWAY_SITE_ADDRESS: $OptionalURL,
MONGO_DIRECT_CONNECTION: z.string().optional(),
MONGO_REPLICA_SET: z.string().optional(),
MONGO_RETRY_WRITES: z.string().optional(),
MONGO_URI: z
.string()
.url()
.transform((arg) => new URL(arg)),
MONGO_WRITE_CONCERN: z.string().optional(),
NODE_ENV: z.enum(['development', 'production', 'test']),
SECRET_KEY: z.string().min(32),
VERBOSE: $BooleanString
})
.superRefine((env, ctx) => {
if (env.NODE_ENV === 'production') {
if (!env.GATEWAY_SITE_ADDRESS) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: 'GATEWAY_SITE_ADDRESS must be defined in production'
});
}
} else if (env.NODE_ENV === 'development') {
if (!env.API_DEV_SERVER_PORT) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: 'API_DEV_SERVER_PORT must be defined in development'
});
}
if (!env.GATEWAY_DEV_SERVER_PORT) {
ctx.addIssue({
code: z.ZodIssueCode.custom,
message: 'GATEWAY_DEV_SERVER_PORT must be defined in development'
});
}
}
});

export type Configuration = z.infer<typeof $Configuration>;
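The new `$OptionalURL` schema treats an empty string (an unset env var) as `undefined`, and transforms any non-empty value into a `URL` object so the rest of the codebase can read `.origin` and `.hostname` directly. A plain-TypeScript sketch of that coercion, written without zod as a standalone function, is shown below; the real schema expresses the same behavior via `z.preprocess` and `.transform`.

```typescript
// Mirrors $OptionalURL from configuration.schema.ts without zod:
// falsy input (unset or empty env var) -> undefined; otherwise parse as URL.
function optionalUrl(value: string | undefined): URL | undefined {
  if (!value) return undefined; // '' or undefined -> undefined
  return new URL(value); // throws on an invalid URL, like zod's .url() check
}

const site = optionalUrl('http://localhost:3500');
// site?.origin strips any trailing path or query from the configured address
const unset = optionalUrl('');
// unset is undefined, so superRefine can enforce presence per NODE_ENV
```

Converting to `URL` at the validation boundary is what lets `superRefine` and the services below compare hostnames and origins instead of doing ad hoc string parsing.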
2 changes: 1 addition & 1 deletion apps/api/src/configuration/configuration.service.ts
@@ -8,6 +8,6 @@ export class ConfigurationService {
constructor(private readonly configService: ConfigService) {}

get<TKey extends Extract<keyof Configuration, string>>(key: TKey) {
return this.configService.get<Configuration[TKey]>(key)!;
return this.configService.get<Configuration[TKey]>(key) as Configuration[TKey];
}
}
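The one-character-looking change above matters: now that the schema contains optional keys (e.g. `GATEWAY_SITE_ADDRESS`), a lookup can legitimately return `undefined`, and the old non-null assertion (`!`) would hide that from the type system. The `as` cast preserves the declared type, including optionality, and callers assert non-null only where `superRefine` guarantees presence. A simplified sketch (the names here are illustrative, not the real NestJS `ConfigService`):

```typescript
// Simplified stand-in for the configuration types in this commit.
interface Configuration {
  NODE_ENV: 'development' | 'production' | 'test';
  GATEWAY_DEV_SERVER_PORT?: number;
}

class ConfigurationService {
  constructor(private readonly values: Partial<Configuration>) {}

  get<TKey extends Extract<keyof Configuration, string>>(key: TKey) {
    // `as Configuration[TKey]` keeps `number | undefined` for optional keys;
    // a trailing `!` would falsely claim the value is always present.
    return this.values[key] as Configuration[TKey];
  }
}

const config = new ConfigurationService({ NODE_ENV: 'development' });
const port = config.get('GATEWAY_DEV_SERVER_PORT'); // typed number | undefined
```

This is why the call sites in `AssignmentsService` and `GatewayModule` use explicit `!` after checking `NODE_ENV`: the validation schema, not the getter, is what guarantees the value exists in that branch.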
14 changes: 0 additions & 14 deletions apps/api/src/core/config/config.schema.ts

This file was deleted.

14 changes: 14 additions & 0 deletions apps/api/src/gateway/gateway.module.ts
@@ -17,7 +17,21 @@ import { GatewaySynchronizer } from './gateway.synchronizer';
HttpModule.registerAsync({
inject: [ConfigurationService],
useFactory: (configurationService: ConfigurationService) => {
let baseURL: string;
if (configurationService.get('NODE_ENV') === 'production') {
const internalNetworkUrl = configurationService.get('GATEWAY_INTERNAL_NETWORK_URL');
const siteAddress = configurationService.get('GATEWAY_SITE_ADDRESS')!;
if (siteAddress.hostname === 'localhost' && internalNetworkUrl) {
baseURL = internalNetworkUrl.origin;
} else {
baseURL = siteAddress.origin;
}
} else {
const gatewayPort = configurationService.get('GATEWAY_DEV_SERVER_PORT')!;
baseURL = `http://localhost:${gatewayPort}`;
}
return {
baseURL,
headers: {
Authorization: `Bearer ${configurationService.get('GATEWAY_API_KEY')}`
}
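The `useFactory` above encodes a precedence rule: in production, if the gateway's public site address is `localhost` (i.e., a system-level reverse proxy fronts the stack) and an internal network URL is configured, the API talks to the gateway over the Docker network; otherwise the public site address wins; in development, the dev-server port is used. A standalone sketch of that resolution, under the assumption that `superRefine` has already guaranteed the required values per `NODE_ENV` (hence the `!` assertions):

```typescript
// Sketch of the GatewayModule base-URL precedence; not the module itself.
// siteAddress is assumed present in production (enforced by superRefine).
function resolveGatewayBaseUrl(
  nodeEnv: 'development' | 'production' | 'test',
  siteAddress?: URL,
  internalNetworkUrl?: URL,
  devServerPort?: number
): string {
  if (nodeEnv === 'production') {
    if (siteAddress!.hostname === 'localhost' && internalNetworkUrl) {
      return internalNetworkUrl.origin; // reach the gateway via the Docker network
    }
    return siteAddress!.origin; // public address is routable as-is
  }
  return `http://localhost:${devServerPort}`; // development / test
}
```

Note that `URL.origin` omits default ports (e.g. `new URL('http://gateway:80').origin` is `'http://gateway'`), which is harmless here since the origin remains routable either way.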
