This installation is meant for developers of Open Forms. If you are looking to install Open Forms to try it out, or to run it in production, please consult the documentation for alternative installation methods.
Note
The Developer documentation on Read the Docs provides working internal links, in case you are reading this on GitHub or elsewhere.
The (backend) project is developed in Python using the Django framework.
For the SDK (frontend), see the Storybook documentation. This covers all SDK related development and will include custom application development documentation in the future.
You need the following libraries and/or programs:
- Python 3.12
- Python Virtualenv and Pip
- PostgreSQL 12 or above
- Redis for Celery to work
- Node.js, see .nvmrc for the exact version. Using nvm is recommended (see the example below this list).
- npm
- gettext
- gdal-bin (should pull in libgeos)
- chromedriver
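If you use nvm as recommended, it picks up the required Node.js version from .nvmrc when invoked without arguments in the project directory:
$ nvm install
$ nvm use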
You will also need the following operating-system libraries:
- pkg-config
- libmagic1
- libxml2-dev
- libxmlsec1-dev
- libxmlsec1-openssl
- libpq-dev
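As a sketch, on a Debian/Ubuntu based system these libraries can typically be installed with apt; package names may differ on other distributions:
$ sudo apt-get install pkg-config libmagic1 libxml2-dev libxmlsec1-dev libxmlsec1-openssl libpq-dev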
Developers can follow the steps below to set up the project on their local development machine.
Navigate to the location where you want to place your project.
Get the code:
$ git clone git@github.com:open-formulieren/open-forms.git
$ cd open-forms
Install all required libraries.
$ virtualenv env
$ source env/bin/activate
$ pip install -r requirements/dev.txt
Install and build the frontend libraries:
$ npm ci --legacy-peer-deps
$ npm run build
Create a .env file with database settings. See dotenv.example for an example:
$ cp dotenv.example .env
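For illustration, a minimal .env could look like the sketch below. The variable names are the ones documented in the settings section further down, and the values shown are just the defaults; dotenv.example remains the reference:
DB_NAME=openforms
DB_USER=openforms
DB_PASSWORD=openforms
DB_HOST=localhost
DB_PORT=5432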
Activate your virtual environment and create the statics and database:
$ python src/manage.py collectstatic --link
$ python src/manage.py compilemessages
$ python src/manage.py migrate
Create a superuser to access the management interface:
$ python src/manage.py createsuperuser
You can now run your installation and point your browser to the address given by this command:
$ python src/manage.py runserver
Note: If you are making local (machine specific) changes, add them to your local .env file or src/openforms/conf/local.py. You can base this file on the example file included in the same directory.
The backend project (open-forms, as opposed to open-forms-sdk) comes with its own SASS/Javascript build pipeline.
For a one-off build:
npm run build
However, while developing on frontend code, it's recommended to start a watch process that performs incremental builds:
npm start
The Docker image build copies the build artifacts of the SDK into the backend container. This is not available during local development, but can be mimicked by symlinking or fully copying a build of the SDK to Django's staticfiles. This enables you to use this particular SDK build for local backend dev and testing.
Note
This only builds the SDK once so that you can use it from within the backend project. For actual SDK development, please review the appropriate SDK documentation.
First, ensure you have checked out the SDK repository and made a production build:
cd /path/to/code/
git clone git@github.com:open-formulieren/open-forms-sdk.git
cd open-forms-sdk
npm install
npm run build
This produces the production build artifacts in the dist folder; it should contain the open-forms-sdk.js and open-forms-sdk.css files. Next, symlink this so it gets picked up by Django's staticfiles:
$ ln -s /path/to/code/open-forms-sdk/dist src/openforms/static/sdk
Finally, you can run collectstatic to verify it all works as expected.
$ python src/manage.py collectstatic
If you're using a tagged version with the SDK code in a subdirectory, you can set the SDK_RELEASE environment variable - it defaults to latest in the dev settings.
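For example, to pin it to a hypothetical tag 1.2.3, you could add the variable to your .env (or export it in your shell):
SDK_RELEASE=1.2.3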
When updating an existing installation:
Activate the virtual environment:
$ cd open-forms
$ source env/bin/activate
Update the code and libraries:
$ git pull
$ pip install -r requirements/dev.txt
$ npm install --legacy-peer-deps
$ npm run build
Update the statics and database:
$ python src/manage.py collectstatic --link
$ python src/manage.py migrate
The documentation lives in the docs folder.
The documentation build includes the SDK changelog, which requires you to set up a symlink (or a dummy file). The preferred approach is using the real symlink.
Ensure you have a clone of the SDK repository, e.g. in /path/to/open-forms-sdk. Set up the symlink to the changelog file:
$ ln -s /path/to/open-forms-sdk/CHANGELOG.rst docs/changelog-sdk.rst
Build the docs to verify it's all correct:
$ cd docs
$ make html
Alternatively, instead of symlinking you can also just create the file docs/changelog-sdk.rst with some dummy content.
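A minimal sketch of that approach (depending on how the file is pulled into the docs build, you may need to give it a proper RST heading):
$ echo "SDK changelog placeholder" > docs/changelog-sdk.rst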
To run the test suite:
playwright install
python src/manage.py test src
See the detailed developer documentation for more information and different strategies.
A number of common settings/configurations can be modified by setting environment variables. You can persist these in your local.py settings file or as part of the (post)activate of your virtualenv; a short sketch follows the list below.
- SECRET_KEY: the secret key to use. A default is set in dev.py.
- DB_NAME: name of the database for the project. Defaults to openforms.
- DB_USER: username to connect to the database with. Defaults to openforms.
- DB_PASSWORD: password to use to connect to the database. Defaults to openforms.
- DB_HOST: database host. Defaults to localhost.
- DB_PORT: database port. Defaults to 5432.
- SENTRY_DSN: the DSN of the project in Sentry. If set, the Sentry SDK is enabled as a logger and errors/logging are sent to Sentry. If unset, the Sentry SDK is disabled.
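As a sketch, persisting one of these in your virtualenv's activate script (or a postactivate hook) simply means exporting it there, for example with a hypothetical non-default database port:
export DB_PORT=5433  # hypothetical value; 5432 is the default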
All settings for the project can be found in src/openforms/conf. The file local.py overwrites settings from the base configuration.
We use Celery as our background task queue.
You can run Celery beat and worker(s) in a shell to activate asynchronous task queue processing:
To start beat which triggers periodic tasks:
$ ./bin/celery_beat.sh
To start the background workers executing tasks:
$ CELERY_WORKER_CONCURRENCY=4 ./bin/celery_worker.sh
Note
You can tweak CELERY_WORKER_CONCURRENCY to your liking; the default is 1.
To start flower for task monitoring:
$ ./bin/celery_flower.sh
Commands can be executed using:
$ python src/manage.py <command>
You can always get a full list of available commands by running:
$ python src/manage.py help
There are a number of developer management commands available in this project.
Performs various appointment plugin calls.
Evaluate a particular decision definition.
List the available decision definitions for a given engine.
Check all forms and report duplicated component keys.
Export a form.
Import a form.
List the files in MS Sharepoint.
List the registered prefill plugins and the attributes they expose.
Execute the registration machinery for a given submission.
Render the summary/confirmation into a PDF for a given submission.
Render a summary for a given submission in a particular render mode.
Generate a submission and test the completion process flow.
The bin folder contains some utility scripts that are used sporadically.
Wrapper around bumpversion which takes care of package-lock.json too.
This allows bumping the version according to semver, e.g.:
./bin/bumpversion.sh minor
Wrapper script around pip-compile. New dependencies should be added to the relevant .in file in requirements, and then you run the compile script:
./bin/compile_dependencies.sh
You should also use this to upgrade existing dependencies to a newer version, for example:
./bin/compile_dependencies.sh -P django
Any additional arguments passed to the script are passed down to the underlying pip-compile call.
A utility that checks the JavaScript translation catalogs and detects strings that may still need translation.
After configuring the application groups in the admin through point-and-click, you call this script to dump the configuration into a fixture which will be loaded on all other installations.
After configuring the user groups with the appropriate permissions in the admin, you can run this script to dump the configuration into a fixture which will be loaded on all other installations.
This script generates the OpenAPI specification from the API endpoint implementations.
You must call this after making changes to the (public) API.
Script to extract the backend and frontend translation messages into their catalogs for translation.