From fe687eb69e735745b730c6f20fa90a647b9e9d1d Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Rapha=C3=ABl?= <23459708+Rapfff@users.noreply.github.com>
Date: Wed, 5 Oct 2022 14:28:20 +0000
Subject: [PATCH 1/2] Update README.md

---
 README.md | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index 8855020..fcfbdac 100644
--- a/README.md
+++ b/README.md
@@ -13,8 +13,8 @@
 ## Introduction
 
 `jajapy` is a python library implementing the **Baum-Welch** algorithm on various kinds of Markov models.
+`jajapy` generates models which are compatible with the Stormpy model checker. Thus, `jajapy` can be used as a learning extension to the Storm model checker.
 
-Please cite this repository if you use this library.
 
 ## Main features
 `jajapy` provides:
@@ -32,13 +32,15 @@
 
+By default, `jajapy` generates Stormpy models (except for GoHMM and MGoHMM).
+
 ## Installation
 
 ``pip install jajapy``
 
 ## Requirements
 
 - numpy
 - scipy
-- stormpy (recommended)
+- stormpy (recommended: if stormpy is not installed, `jajapy` will generate models in `jajapy` format instead).
 
 ## Documentation
 Available on [readthedoc](https://jajapy.readthedocs.io/en/latest/?)

From ca6f4a58994438b11dba5c8ee8595184bc15832e Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Rapha=C3=ABl?= <23459708+Rapfff@users.noreply.github.com>
Date: Wed, 5 Oct 2022 14:33:52 +0000
Subject: [PATCH 2/2] Update tuto.rst

---
 docs/source/tuto.rst | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/docs/source/tuto.rst b/docs/source/tuto.rst
index a891575..ee43674 100644
--- a/docs/source/tuto.rst
+++ b/docs/source/tuto.rst
@@ -75,10 +75,11 @@ Let now use our training set to learn ``original_model`` with the Baum-Welch alg
 
 .. code-block:: python
 
-   output_model = ja.BW_HMM().fit(training_set, nb_states=5)
+   output_model = ja.BW_HMM().fit(training_set, nb_states=5, stormpy_output=False)
    print(output_model)
 
-For the initial model we used a randomly generated HMM with 5 states.
+For the initial model we used a randomly generated HMM with 5 states. Since we are not planning to use Storm on this model,
+we set the ``stormpy_output`` parameter to False.
 
 Evaluating the BW output model
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -160,7 +161,7 @@ At each iteration, the library will generate a new model with 7 states.
    >>> best_model = None
    >>> quality_best = -1024
    >>> for n in range(1,nb_trials+1):
-   ...     current_model = ja.BW_MC().fit(training_set,nb_states=7,pp=n)
+   ...     current_model = ja.BW_MC().fit(training_set,nb_states=7,pp=n, stormpy_output=False)
    ...     current_quality = current_model.logLikelihood(test_set)
    ...     if quality_best < current_quality: #we keep the best model only
    ...         quality_best = current_quality
@@ -324,7 +325,8 @@ scheduler with probability 0.25.
    learning_rate = 0
   output_model = ja.Active_BW_MDP().fit(training_set,learning_rate,
                                          nb_iterations=20, nb_sequences=50,
-                                         epsilon_greedy=0.75, nb_states=9)
+                                         epsilon_greedy=0.75, nb_states=9,
+                                         stormpy_output=False)
   output_quality = output_model.logLikelihood(test_set)
 
   print(output_model)
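The two patches document the same pattern: when stormpy is installed, ``fit`` returns a Stormpy-compatible model by default, and passing ``stormpy_output=False`` keeps the result in ``jajapy``'s own format. Below is a minimal sketch of that usage, assuming ``jajapy`` is imported as ``ja`` and that ``training_set`` and ``test_set`` were already generated earlier in the tutorial (both are assumptions, not part of the patches).

.. code-block:: python

   import jajapy as ja  # assumed import alias, as used throughout the tutorial

   # Learn an HMM with 5 states; stormpy_output=False keeps the result as a
   # jajapy model instead of a Stormpy-compatible one. training_set and
   # test_set are assumed to have been built earlier in the tutorial.
   output_model = ja.BW_HMM().fit(training_set, nb_states=5, stormpy_output=False)

   # A jajapy-format model can still be evaluated directly, e.g. by log-likelihood.
   print(output_model.logLikelihood(test_set))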