Merge pull request #107 from neurolib-dev/fix/add_subjectwise_dmats
loadData: add subjectwise length matrices ds.Dmats
caglorithm authored Nov 10, 2020
2 parents ad2d9f2 + 78ddecc commit 4cf93d5
Showing 5 changed files with 141 additions and 20 deletions.
103 changes: 103 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,103 @@
**v0.5.10**

- models now have the parameter `sampling_dt`, which downsamples the output to a specified step size (in ms)
- loadData: add subject-wise length matrices `ds.Dmats` (see the usage sketch below)
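
A minimal usage sketch of the two entries above (the dataset name `"hcp"` and the parameter values are illustrative assumptions; `sampling_dt` and `ds.Dmats` are taken from this release):

```python
# Hedged sketch -- dataset name and parameter values are assumptions for illustration.
from neurolib.utils.loadData import Dataset
from neurolib.models.aln import ALNModel

ds = Dataset("hcp")                 # structural connectivity + empirical data
print(len(ds.Dmats))                # one fiber-length matrix per subject (new)
print(ds.Dmat.shape)                # averaged length matrix, as before

model = ALNModel(Cmat=ds.Cmat, Dmat=ds.Dmat)
model.params["duration"] = 10000    # ms
model.params["sampling_dt"] = 10.0  # downsample the output to 10 ms steps (new)
model.run()
```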

**v0.5.9**

- `ALN` model added to the multimodel framework
- `ThalamicMassModel` now works with autochunk for very long simulations with minimal RAM usage!
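
A minimal sketch of a long autochunk simulation with the thalamus model; the import path `neurolib.models.thalamus` is an assumption, the rest follows the entries above:

```python
# Hedged sketch -- import path and duration are assumptions for illustration.
from neurolib.models.thalamus import ThalamicMassModel

model = ThalamicMassModel()
model.params["duration"] = 5 * 60 * 1000  # 5 minutes in ms
model.run(chunkwise=True)                 # autochunk keeps RAM usage minimal
```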

**v0.5.8**

- Hotfix: include `pypet_logging.ini` in pypi package
- Evolution: new method `getIndividualFromHistory()`

**v0.5.7**

- `example-0.5`: Demonstrating the use of external stimuli on brain networks
- `example-1.3`: 2D bifurcation diagrams using `pypet`
- `bold`: BOLD numerical overflow bug fixed
- `evolution`: dfEvolution and dfPop fix
- `exploration`: fix seed for random initial conditions
- various minor bugfixes

**v0.5.5**

- Hotfix for RNG seed in exploration: Seed `None` is now converted to `"None"` for `pypet` compatibility only when saving the `model.params` to the trajectory.
- Fix: `dfEvolution` drops duplicate entries from the `evolution.history`.

**v0.5.4**

- New function `func.construct_stimulus()` (see the sketch below)
- New example of stimulus usage: `examples/example-0.5-aln-external-stimulus.ipynb`
- Fixed RNG seed bug where the seed value `None` was converted to 0 (because of `pypet`) and led to a predictable random number generator
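
A hedged sketch of the new stimulus helper; the stimulus type `"ac"` and the target parameter `ext_exc_current` follow the example notebook listed above and are assumptions rather than guarantees of this entry:

```python
# Hedged sketch -- stimulus type and target parameter are assumptions.
from neurolib.models.aln import ALNModel
from neurolib.utils import functions as func

model = ALNModel()
model.params["duration"] = 2000  # ms
stimulus = func.construct_stimulus("ac", duration=model.params["duration"], dt=model.params["dt"])
model.params["ext_exc_current"] = stimulus  # drive the excitatory population
model.run()
```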

**v0.5.3**

- `ALNModel` now records adaptation currents! Accessible via `model.outputs.IA`

**v0.5.1**

*Evolution:*

- NSGA-2 algorithm implemented (Deb et al. 2002)
- Preselect complete algorithms (using `algorithm="adaptive"` or `"nsga2"`); see the usage sketch after this list
- Implement custom operators for all evolutionary operations
- Keep track of the evolution history using `evolution.history`
- Genealogy `evolution.tree` available from `evolution.buildEvolutionTree()` that is `networkx` compatible [1]
- Continue working: `saveEvolution()` and `loadEvolution()` can load an evolution from another session [2]
- Overview dataframe `evolution.dfPop` now has all fitness values as well
- Get scores using `getScores()`
- Plot evolution progress with `evolutionaryUtils.plotProgress()`
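
A usage sketch of the evolution features above. The constructor arguments (`evalFunction`, `parameterSpace`, `weightList`, `model`), the evaluation-function contract, and `getModelFromTraj()` follow the usual neurolib examples and are assumptions here; `algorithm="nsga2"`, `dfPop`, `history`, and `buildEvolutionTree()` are from this release:

```python
# Hedged sketch -- constructor arguments and evaluation-function contract are assumptions.
from neurolib.optimize.evolution import Evolution
from neurolib.utils.parameterSpace import ParameterSpace
from neurolib.models.aln import ALNModel

def evaluate(traj):
    model = evolution.getModelFromTraj(traj)  # `evolution` is created below
    model.run()
    fitness = float(model.output.max())       # toy single-objective fitness
    return (fitness,), {}

model = ALNModel()
model.params["duration"] = 1000  # ms, keep individual runs short

# bounds of the evolved parameters
pars = ParameterSpace({"mue_ext_mean": [0.0, 3.0], "mui_ext_mean": [0.0, 3.0]})

evolution = Evolution(evaluate, pars, weightList=[1.0], model=model, algorithm="nsga2")
evolution.run()

print(evolution.dfPop)                 # population overview including fitness values
print(len(evolution.history))          # per-generation history
tree = evolution.buildEvolutionTree()  # networkx-compatible genealogy
```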

*Exploration:*

- Use `loadResults(all=True)` to load all simulated results from disk to memory (available as `.results`) or use `all=False` to load runs individually from hdf. Both options populate `dfResults`.
- `loadResults()` has a memory cap to avoid filling up RAM
- `loadDfResults()` creates the parameter table from a previous simulation
- `explorationUtils.plotExplorationResults()` for plotting 2D slices of the explored results, with advanced options such as alpha maps and contours for predefined regions (see the usage sketch below).
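
A sketch of the exploration round trip. The `BoxSearch` call and the `plot_key` column are assumptions based on common neurolib usage; `loadResults(all=True)`, `dfResults`, and `plotExplorationResults()` are from the list above:

```python
# Hedged sketch -- BoxSearch call signature and the plotted column name are assumptions.
import neurolib.optimize.exploration.explorationUtils as eu
from neurolib.optimize.exploration import BoxSearch
from neurolib.utils.parameterSpace import ParameterSpace
from neurolib.models.aln import ALNModel

model = ALNModel()
model.params["duration"] = 1000  # ms
pars = ParameterSpace({"mue_ext_mean": [0.0, 1.0, 2.0, 3.0], "mui_ext_mean": [0.0, 1.0, 2.0, 3.0]})

search = BoxSearch(model, pars)
search.run()
search.loadResults(all=True)  # populate .results and .dfResults from disk

eu.plotExplorationResults(
    search.dfResults,
    par1=["mue_ext_mean", "Input to E"],
    par2=["mui_ext_mean", "Input to I"],
    plot_key="max_output",  # assumed column, e.g. added during post-processing
)
```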

*devUtils*

- A module that we are using for development and research with some nice features. Please do not rely on this file since there might be breaking changes in the future.
- `plot_outputs()` like a true numerical simlord
- `model_fit()` to compute the model's FC and FCD fit to the dataset, could be useful for everyone
- `getPowerSpectrum()` does what it says
- `getMeanPowerSpectrum()` same
- a very neat `rolling_window()` from a `numpy` PR that never got accepted

*Other:*

- Data loading:
- `Dataset` can load different SC matrix normalizations: `"max", "waytotal", "nvoxel"`
- Can precompute FCD matrices to avoid having to do it later (`fcd=True`)
- `neurolib/utils/atlas.py` added with aal2 region names (thanks @jajcayn) and coordinates of centers of regions (from scans of @caglorithm's brain 🤯)
- `ParameterSpace` has `.lowerBound` and `.upperBound` (see the sketch after this list).
- `pypet` finally doesn't create a billion log files anymore due to a custom log config
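
A short sketch of the data-loading options and the `ParameterSpace` bounds; passing `normalizeCmats` and `fcd` through the `Dataset` constructor is an assumption (only the `loadDataset()` signature appears in the diff below), and `"gw"` is a placeholder dataset name:

```python
# Hedged sketch -- constructor pass-through and dataset name are assumptions.
from neurolib.utils.loadData import Dataset
from neurolib.utils.parameterSpace import ParameterSpace

ds = Dataset("gw", normalizeCmats="waytotal", fcd=True)  # fcd=True precomputes FCD matrices

pars = ParameterSpace({"mue_ext_mean": [0.0, 3.0], "mui_ext_mean": [0.0, 3.0]})
print(pars.lowerBound, pars.upperBound)  # per-parameter lower / upper bounds
```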

**v0.5.0**

- **New model**: Thalamus model `ThalamicMassModel` (thanks to @jajcayn)
- Model by Costa et al. 2016, PLOS Computational Biology
- New tools for parameter exploration: `explorationUtils.py` aka `eu`
- Postprocessing of exploration results using `eu.processExplorationResults()`
- Find parameters of explored simulations using `eu.findCloseResults()`
- Plot exploration results via `eu.plotExplorationResults()` (see example image below)
- Custom transformation of the inputs to the `BOLDModel`.
- This is particularly handy for phenomenological models (such as `FHNModel`, `HopfModel` and `WCModel`) which do not produce firing rate outputs with units in `Hz`.
- Improvements
- Models can now generate random initial conditions using `model.randomICs()`
- `model.params['bold'] = True` forces BOLD simulation
- `BoxSearch` class: `search.run()` passes arguments to `model.run()`
- BOLD output time array renamed to `t_BOLD`
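
A sketch of the BOLD-related changes; `model.params['bold']` and `model.randomICs()` are from the list above, while the output attribute names `model.BOLD.t_BOLD` and `model.BOLD.BOLD` are assumptions:

```python
# Hedged sketch -- BOLD output attribute names are assumptions.
from neurolib.models.fhn import FHNModel

model = FHNModel()
model.params["duration"] = 60000  # ms; BOLD needs long simulations
model.params["bold"] = True       # force BOLD simulation alongside the raw output
model.randomICs()                 # random initial conditions (assumed to set them in place)
model.run()
print(model.BOLD.t_BOLD[-1], model.BOLD.BOLD.shape)
```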

**v0.4.1**

- **New model:** Wilson-Cowan neural mass model implemented (thanks to @ChristophMetzner)
- Simulations now start their output at `t=dt` (as opposed to `t=0` before). Everything before is now considered an initial condition.
- Fix: Running a simulation chunkwise (using `model.run(chunkwise=True)`) and normally (using `model.run()`) now produces output of the same length (see the check sketch after this list)
- Fix: `aln` network coupling bug, which became apparent when simulating chunkwise with `model.run(chunkwise=True, chunksize=1)`
- Fix: Correct use of seed for RNG
- Fix: Matrices are not normalized to max-1 anymore before each run.
- Fix: Kolmogorov distance of FCD matrices and timeseries
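
A small check sketch for the chunkwise fixes above, assuming an `ALNModel` with default parameters:

```python
# Hedged sketch -- checks that normal and chunkwise runs now produce outputs of equal length.
from neurolib.models.aln import ALNModel

m1, m2 = ALNModel(), ALNModel()
m1.params["duration"] = m2.params["duration"] = 1000  # ms
m1.run()
m2.run(chunkwise=True)

assert m1.output.shape == m2.output.shape  # same length with and without chunking
assert m1.t[0] == m2.t[0]                  # both start at t = dt now
```
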
51 changes: 32 additions & 19 deletions neurolib/optimize/exploration/explorationUtils.py
@@ -1,4 +1,5 @@
import os
import logging

import numpy as np
import pandas as pd
@@ -15,6 +16,7 @@
from ...utils import functions as func
from ...utils import paths as paths


def plotExplorationResults(
dfResults,
par1,
@@ -28,6 +30,7 @@ def plotExplorationResults(
one_figure=False,
contour=None,
alpha_mask=None,
multiply_axis=None,
savename=None,
**kwargs,
):
@@ -92,9 +95,7 @@
if nan_to_zero:
df_pivot = df_pivot.fillna(0)

plot_clim = (
kwargs["plot_clim"] if "plot_clim" in kwargs else (np.nanmin(df_pivot.values), np.nanmax(df_pivot.values))
)
plot_clim = kwargs.get("plot_clim", (np.nanmin(df_pivot.values), np.nanmax(df_pivot.values)))

if symmetric_colorbar:
plot_clim = (-np.max(np.abs(plot_clim)), np.max(np.abs(plot_clim)))
@@ -104,10 +105,10 @@
# -----
# alpha mask
if alpha_mask is not None:
mask_threshold = kwargs["mask_threshold"] if "mask_threshold" in kwargs else 1
mask_alpha = kwargs["mask_alpha"] if "mask_alpha" in kwargs else 0.5
mask_style = kwargs["mask_style"] if "mask_style" in kwargs else None
mask_invert = kwargs["mask_invert"] if "mask_invert" in kwargs else False
mask_threshold = kwargs.get("mask_threshold", 1)
mask_alpha = kwargs.get("mask_alpha", 0.5)
mask_style = kwargs.get("mask_style", None)
mask_invert = kwargs.get("mask_invert", False)

# alpha_mask can either be a pd.DataFrame or an np.ndarray that is
# laid over the image, a string that is a key in the results df
@@ -134,10 +135,10 @@
# ------------------
# plot contours
if contour is not None:
contour_color = kwargs["contour_color"] if "contour_color" in kwargs else "white"
contour_levels = kwargs["contour_levels"] if "contour_levels" in kwargs else None
contour_alpha = kwargs["contour_alpha"] if "contour_alpha" in kwargs else 1
contour_kwargs = kwargs["contour_kwargs"] if "contour_kwargs" in kwargs else dict()
contour_color = kwargs.get("contour_color", "white")
contour_levels = kwargs.get("contour_levels", None)
contour_alpha = kwargs.get("contour_alpha", 1)
contour_kwargs = kwargs.get("contour_kwargs", dict())

def plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_kwargs):
# check if this is a dataframe
@@ -167,9 +168,11 @@ def plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_
# check if contour is alist of variables, e.g. ["max_output", "domfr"]
if isinstance(contour, list):
for ci in range(len(contour)):
plot_contour(contour[ci], contour_color[ci], contour_levels[ci], contour_alpha[ci], contour_kwargs[ci])
plot_contour(
contour[ci], contour_color[ci], contour_levels[ci], contour_alpha[ci], contour_kwargs[ci]
)
else:
plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_kwargs)
plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_kwargs)

# colorbar
if one_figure == False:
@@ -185,16 +188,27 @@ def plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_
ax.set_xlabel(par1_label)
ax.set_ylabel(par2_label)

# tick marks
ax.tick_params(
axis="both", direction="out", length=3, width=1, bottom=True, left=True,
)

# multiply / rescale axis
if multiply_axis:
ax.set_xticklabels(np.round(np.multiply(ax.get_xticks(), multiply_axis), 2))
ax.set_yticklabels(np.round(np.multiply(ax.get_yticks(), multiply_axis), 2))

# single by-values need to become tuple
if not isinstance(i, tuple):
i = (i,)
if by != ["_by"]:
title = " ".join([f"{bb}={bi}" for bb, bi in zip(by_label, i)])
title = "-".join([f"{bb}={bi}" for bb, bi in zip(by_label, i)])
ax.set_title(title)
if one_figure == False:
if savename:
save_fname = os.path.join(paths.FIGURES_DIR, f"{title}_{savename}")
plt.savefig(save_fname)
plt.savefig(save_fname)
logging.info(f"Saving to {save_fname}")
plt.show()
else:
axi += 1
@@ -204,6 +218,7 @@ def plot_contour(contour, contour_color, contour_levels, contour_alpha, contour_
if savename:
save_fname = os.path.join(paths.FIGURES_DIR, f"{savename}")
plt.savefig(save_fname)
logging.info(f"Saving to {save_fname}")
plt.show()


@@ -227,10 +242,8 @@ def contourPlotDf(

# unpack, why necessary??
contour_kwargs = contour_kwargs["contour_kwargs"]

contours = ax.contour(
Xi, Yi, dataframe, colors=color, levels=levels, zorder=1, alpha=alpha, **contour_kwargs,
)

contours = ax.contour(Xi, Yi, dataframe, colors=color, levels=levels, zorder=1, alpha=alpha, **contour_kwargs,)

clabel = contour_kwargs["clabel"] if "clabel" in contour_kwargs else False
if clabel:
1 change: 1 addition & 0 deletions neurolib/utils/loadData.py
@@ -82,6 +82,7 @@ def loadDataset(self, datasetName, normalizeCmats="max", fcd=False):
assert self.has_subjects

self.Cmats = self._normalizeCmats(self.getDataPerSubject("cm"), method=normalizeCmats)
self.Dmats = self.getDataPerSubject("len")

# take the average of all
self.Cmat = np.mean(self.Cmats, axis=0)
2 changes: 1 addition & 1 deletion setup.py
@@ -11,7 +11,7 @@

setuptools.setup(
name="neurolib",
version="0.5.9",
version="0.5.10",
description="Easy whole-brain neural mass modeling",
long_description=long_description,
long_description_content_type="text/markdown",
4 changes: 4 additions & 0 deletions tests/multimodel/test_aln.py
@@ -252,6 +252,8 @@ def test_compare_w_neurolib_native_model(self):
aln_neurolib = ALNModel(seed=SEED)
aln_neurolib.params["duration"] = DURATION
aln_neurolib.params["dt"] = DT
aln_neurolib.params["mue_ext_mean"] = 0.0
aln_neurolib.params["mui_ext_mean"] = 0.0
aln_neurolib.run()
for (var_multi, var_neurolib) in NEUROLIB_VARIABLES_TO_TEST:
corr_mat = np.corrcoef(aln_neurolib[var_neurolib], multi_result[var_multi].values.T)
@@ -311,6 +313,8 @@
# delays <-> length matrix
aln_neurolib.params["signalV"] = 1.0
aln_neurolib.params["sigma_ou"] = 0.0
aln_neurolib.params["mue_ext_mean"] = 0.0
aln_neurolib.params["mui_ext_mean"] = 0.0
# match initial state at least for current - this seems to be enough
aln_neurolib.params["mufe_init"] = np.array(
[aln_multi[0][0].initial_state[0], aln_multi[1][0].initial_state[0]]
