fastai-gpu-data-science-project

Repository containing scaffolding for a Python 3-based data science project with GPU acceleration using the Fast.ai ecosystem.

Creating a new project from this template

Follow GitHub's instructions for creating a repository from a template to set up a new project repository based on this one.
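
Alternatively, if you have the GitHub CLI (gh) installed, the same thing can be done from the command line. This is only a sketch: the project name below is a placeholder, and you may want a different visibility flag.

$ gh repo create <your-project-name> --template kaust-vislab/fastai-gpu-data-science-project --private --clone
$ cd <your-project-name>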

Project organization

Project organization is based on ideas from Good Enough Practices for Scientific Computing.

  1. Put each project in its own directory, which is named after the project.
  2. Put external scripts or compiled programs in the bin directory.
  3. Put raw data and metadata in a data directory.
  4. Put text documents associated with the project in the doc directory.
  5. Put all Docker related files in the docker directory.
  6. Install the Conda environment into an env directory.
  7. Put all notebooks in the notebooks directory.
  8. Put files generated during cleanup and analysis in a results directory.
  9. Put project source code in the src directory.
  10. Name all files to reflect their content or function.
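
If you ever need to reproduce this layout by hand (for example, in a project that was not started from this template), the directories can be created directly. This is only an illustration of the structure above; the env directory is added later by Conda (see below).

$ mkdir -p bin data doc docker notebooks results src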

Using Conda

Creating the Conda environment

After adding any necessary dependencies to the Conda environment.yml file, you can create the environment in a sub-directory of your project directory by running the following command.

$ conda env create --prefix ./env --file environment.yml

Once the new environment has been created you can activate the environment with the following command.

$ conda activate ./env

Note that the env directory is not under version control as it can always be re-created from the environment.yml file as necessary.
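
Once the environment is active, a quick sanity check that GPU acceleration is actually available to the Fast.ai/PyTorch stack is to query CUDA from Python. This assumes PyTorch is among the dependencies pulled in via environment.yml.

(/path/to/project-dir/env)$ python -c "import torch; print(torch.cuda.is_available())"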

Updating the Conda environment

If you add or remove dependencies in the environment.yml file after the environment has already been created, then you can update the environment with the following command.

$ conda env update --prefix ./env --file environment.yml --prune
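
Because the env directory is not under version control, another option when an in-place update misbehaves is simply to delete the environment and re-create it from the environment.yml file.

$ conda deactivate
$ rm -rf ./env
$ conda env create --prefix ./env --file environment.yml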

Listing the full contents of the Conda environment

The explicit dependencies for the project are listed in the environment.yml file. To see the full list of packages installed into the environment, run the following command.

$ conda list --prefix ./env
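
If you also want a record of the exact package versions installed in the environment (for example, to make results easier to reproduce), Conda can export a fully pinned specification; the output file name below is just a suggestion.

$ conda env export --prefix ./env > environment.lock.yml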

Installing the JupyterLab extensions

If you wish to make use of the JupyterLab extensions included in the environment.yml file, then you will need to run the postBuild script after activating the environment to rebuild the client-side components of the extensions. Note that this step only needs to be done once (unless you add additional JupyterLab extensions).

$ conda activate ./env
(/path/to/project-dir/env)$ . postBuild
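
With the extensions built, JupyterLab can be started from the activated environment as usual. The flags below are optional and mainly useful when the GPU machine is remote.

(/path/to/project-dir/env)$ jupyter lab --no-browser --ip 0.0.0.0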

Using Docker

In order to build Docker images for your project and run containers with GPU acceleration you will need to install Docker, Docker Compose and the NVIDIA Docker runtime.

Detailed instructions for using Docker to build an image and launch containers can be found in docker/README.md.
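
Independent of this project's images, a quick way to confirm that Docker and the NVIDIA runtime are installed correctly is to run nvidia-smi inside a CUDA base container; substitute a CUDA image tag that matches your installed driver.

$ docker run --rm --gpus all nvidia/cuda:11.8.0-base-ubuntu22.04 nvidia-smi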