@page readme FastPL README
This work is based on Olivier Sobrie's work on preference learning algorithms (Python code: https://github.com/oso/pymcda, thesis: https://tel.archives-ouvertes.fr/tel-01370555/document).
The objective of this repository is to translate the previous code (pymcda) into C++ and to set up the parallelisation tools needed to process a greater number of entry points.
This README explains how to run the application and helps future developers set up their environment.
The full description and documentation of the project can be found at https://mostah.github.io/fastPL/.
- include: Header files
- src: Source files
- test: Test files
- extsrc: External sources and dependencies
- data: Data (datasets and models) directory
- doc: Doxygen documentation directory
- .circleci: CircleCI pipelines configuration
First things first, build the Docker image (see the Docker documentation if you are not familiar with Docker).
docker build . -t fastpl
Currently, building all the dependencies takes about one hour.
The following command will show the run config options (helper):
docker run fastpl ./Main -h
To run the app on a specific dataset:
docker run fastpl ./Main -d $dataset_path -o $output_path
At the end of the algorithm, the model will be stored at $output_path.
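Note that $output_path is resolved inside the container, so the learned model ends up in the container's filesystem; it can be copied back to the host with docker cp after the run, or written directly to a host directory by mounting a volume. A sketch of the volume approach (the in-container path /home/fastPL/data is an assumption based on the paths used elsewhere in this README):
docker run -v $(pwd)/data:/home/fastPL/data fastpl ./Main -d $dataset_path -o $output_path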
Run all tests:
docker run fastpl ./Test
Run specified tests:
docker run fastpl ./Test --gtest_filter=TestGeneralName.TestPreciseName # run one specific test
docker run fastpl ./Test --gtest_filter=TestGeneralName.* # run all tests of the TestGeneralName suite
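googletest can also list the available test names and exclude tests with a negative filter, which helps when looking up the exact names to pass to --gtest_filter:
docker run fastpl ./Test --gtest_list_tests # list all available test names
docker run fastpl ./Test --gtest_filter=-TestGeneralName.* # run everything except the TestGeneralName suite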
Get the id of the running fastpl container with docker ps, then open a shell inside it:
docker exec -it $container_id /bin/bash
cd /home/fastpl/logs
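Alternatively, the logs can be copied out of the container to the host with docker cp (this also works if the container has already exited):
docker cp $container_id:/home/fastpl/logs $YOUR_PATH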
The following requires CMake to be installed on your machine.
From the root of the project directory:
git submodule init
git submodule update
mkdir build && cd build
cmake .. -DBUILD_DEPS:BOOL=ON -DUSE_SCIP=OFF && make
Currently, building all the dependencies takes about one hour.
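On a multi-core machine the dependency build can usually be shortened by running make in parallel; a variant of the command above (nproc works on Linux, use sysctl -n hw.ncpu on macOS):
cmake .. -DBUILD_DEPS:BOOL=ON -DUSE_SCIP=OFF && make -j$(nproc)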
From the build directory:
The following command will show the run config options (helper):
./Main -h
To run the app on a specified dataset:
./Main -d $dataset_path -o $output_path
At the end of the algorithm, the model will be stored at $output_path.
From the build directory:
Run all tests:
./Test
Run specified tests:
./Test --gtest_filter=TestGeneralName.TestPreciseName # run one specific test
./Test --gtest_filter=TestGeneralName.* # run all tests of the TestGeneralName suite
The application configuration can be found at: app-config
The application configuration holds the general config of the learning algorithms.
- log_level: log filter; possible values are INFO, ERROR, DEBUG
- log_file: path of the log file
- data_dir: data directory path. When changed, the -d and -o arguments passed to ./Main are resolved relative to the data directory configured here
- model_batch_size: model population size used in the metaheuristic
- max_iterations: maximum number of iterations of the metaheuristic before terminating the application
- n_profile_update: number of profile update iterations for one weight update
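As an illustration, a configuration with these keys might look like the following, assuming a YAML-style layout (the actual file name, syntax, and defaults live in app-config and may differ; the values below are purely illustrative):
log_level: INFO
log_file: ../logs/app_log.txt
data_dir: ../data/
model_batch_size: 128
max_iterations: 500
n_profile_update: 20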
CircleCI pipelines configuration can be found at: .circleci
- Build: builds the whole project
- Tests (requires Build): runs all the tests
- Doc generation (requires Build): generates the documentation
The documentation of this project is generated by Doxygen. In order to generate the documentation locally or update the online documentation, the following packages must be installed: doxygen, doxygen-doc, doxygen-gui, graphviz.
The documentation configuration and files can be found at: doc
The online documentation is currently hosted on GitHub Pages: https://mostah.github.io/fastPL/.
GitHub Pages displays static HTML files from a specified branch. The files used by GitHub Pages to display the latest documentation live on the gh-pages branch and must not be changed manually.
The following script updates the gh-pages branch by generating the documentation from the latest master commit.
From root directory:
sh doc_generation.sh
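Conceptually the script follows the usual GitHub Pages workflow; the steps below are a sketch of that pattern, not the actual contents of doc_generation.sh:
# 1. generate the static html from the latest master commit (same steps as the manual generation below)
# 2. switch to the gh-pages branch
# 3. replace its contents with the freshly generated doc/html files
# 4. commit and push, which updates https://mostah.github.io/fastPL/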
From the /doc directory:
cmake . && doxygen Doxyfile.Doxigen
Open the documentation in your browser, from the /doc/html directory:
open index.html
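open is the macOS opener; on most Linux distributions the equivalent command is xdg-open:
xdg-open index.html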
Profiling requires the app to be run in a Docker container. The following assumes the fastpl image has already been built with Docker.
docker run -it fastpl /bin/bash
./$PROGRAM
gprof $PROGRAM | python3 gprof2dot/gprof2dot.py | dot -Tpng -o analysis.png
gprof $PROGRAM > analysis.txt
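gprof reads the gmon.out file that the instrumented binary writes in its working directory when it exits (this assumes the profiling image compiles the targets with -pg); if the commands above complain about a missing gmon.out, run ./$PROGRAM again from the build directory first:
ls gmon.out # should exist next to $PROGRAM after a complete run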
Keep this terminal open, and open a new one.
Get the $ID of the running fastpl container with docker ps.
docker cp $ID:/home/fastPL/build/analysis.{txt or png} $YOUR_PATH
The profiling data should now be at $YOUR_PATH on your machine.