
Commit

[doc] update jsonlab and neurojson toolbox download links
fangq committed May 14, 2024
1 parent ea67ea9 commit 56aa355
Showing 3 changed files with 60 additions and 65 deletions.
52 changes: 25 additions & 27 deletions README.md
Original file line number Diff line number Diff line change
@@ -521,10 +521,10 @@ where possible parameters include (the first value in [*|*] is the default)
tx3 - GL texture data for rendering (GL_RGBA32F)
the bnii/jnii formats support compression (-Z) and generate small files
load jnii (JSON) and bnii (UBJSON) files using below lightweight libs:
MATLAB/Octave: JNIfTI toolbox https://github.com/NeuroJSON/jnifti,
MATLAB/Octave: JSONLab toolbox https://github.com/NeuroJSON/jsonlab,
Python: PyJData: https://pypi.org/project/jdata
JavaScript: JSData: https://github.com/NeuroJSON/jsdata
MATLAB/Octave: JNIfTI toolbox https://neurojson.org/download/jnifti
MATLAB/Octave: JSONLab toolbox https://neurojson.org/download/jsonlab
Python: PyJData: https://neurojson.org/download/pyjdata
JavaScript: JSData: https://neurojson.org/download/jsdata
-Z [zlib|...] (--zip) set compression method if -F jnii or --dumpjson
is used (when saving data to JSON/JNIfTI format)
0 zlib: zip format (moderate compression,fast)
@@ -964,11 +964,11 @@ the ND array with built-in compression, one should call JData compatible
libraries, which can be found at https://neurojson.org/#software

Specifically, to parse/save .jnii files in MATLAB, you should use
- JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
- JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

To parse/save .jnii files in Python, you should use
- PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
- PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['NIFTIData']`
is a NumPy `ndarray` object storing the volumetric data.
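The built-in compression that keeps .jnii files small follows the JData annotated-array convention. The snippet below is a simplified, self-contained sketch of that round-trip (real files should be read with the PyJData module above; the helper names here are hypothetical and only the `_Array*_` field names come from the JData format):

```python
import base64
import json
import zlib

import numpy as np

def encode_jdata_array(arr):
    """Pack an ndarray as a JData annotated array with zlib compression
    (simplified sketch of the _ArrayZipData_ convention used by .jnii)."""
    raw = zlib.compress(arr.tobytes())
    return {
        "_ArrayType_": arr.dtype.name,
        "_ArraySize_": list(arr.shape),
        "_ArrayZipType_": "zlib",
        "_ArrayZipData_": base64.b64encode(raw).decode("ascii"),
    }

def decode_jdata_array(obj):
    """Reverse of encode_jdata_array: base64-decode, inflate, reshape."""
    raw = zlib.decompress(base64.b64decode(obj["_ArrayZipData_"]))
    return np.frombuffer(raw, dtype=obj["_ArrayType_"]).reshape(obj["_ArraySize_"])

vol = np.arange(24, dtype="float32").reshape(2, 3, 4)
jnii_like = {"NIFTIData": encode_jdata_array(vol)}
roundtrip = decode_jdata_array(json.loads(json.dumps(jnii_like))["NIFTIData"])
assert np.array_equal(roundtrip, vol)
```

Because the annotated array is plain JSON, the compressed volume survives any standards-compliant JSON serializer, which is what makes .jnii files portable across languages.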
@@ -991,11 +991,11 @@ the ND array with built-in compression, one should call JData compatible
libraries, which can be found at https://neurojson.org/#software

Specifically, to parse/save .jnii files in MATLAB, you should use one of
- JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
- JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

To parse/save .jnii files in Python, you should use
- PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
- PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['NIFTIData']`
is a NumPy `ndarray` object storing the volumetric data.
@@ -1091,11 +1091,11 @@ Although .jdat and .jnii have different suffixes, they are both JSON/JData files and
can be opened/written by the same JData compatible libraries mentioned above, i.e.

For MATLAB
- JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
- JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
- `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

For Python
- PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
- PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['MCXData']['PhotonData']`
stores the photon data, `data['MCXData']['Trajectory']` stores the trajectory data etc.
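As a sketch of that layout, the snippet below builds a minimal `.jdat`-style payload by hand and accesses the fields named above (the key nesting follows the text; the numeric values are made up purely for illustration, and real files carry more metadata and should be read with the libraries listed):

```python
import json

# Hypothetical minimal .jdat-style payload: key layout follows the text
# above; the numeric values are invented for illustration only.
jdat = {
    "MCXData": {
        "PhotonData": {"detid": [1, 1, 2], "ppath": [[0.5], [1.2], [0.9]]},
        "Trajectory": {"id": [0, 1, 1]},
    }
}

text = json.dumps(jdat)   # a .jdat file is plain JSON on disk,
data = json.loads(text)   # so json.load/json.loads can reopen it
ndetected = len(data["MCXData"]["PhotonData"]["detid"])
ntraj = len(data["MCXData"]["Trajectory"]["id"])
print(ndetected, ntraj)  # → 3 3
```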
@@ -1317,19 +1317,17 @@ Best practices guide
To maximize MCX's performance on your hardware, you should follow the best
practices guide listed below:

### Use dedicated GPUs
### Use a mid-range or enthusiast-grade GPU, and use multiple GPUs if possible

A dedicated GPU is a GPU that is not connected to a monitor. If you use a
non-dedicated GPU, any kernel (GPU function) can not run more than a few
seconds. This greatly limits the efficiency of MCX. To set up a dedicated GPU,
it is suggested to install two graphics cards on your computer, one is set up
for displays, the other one is used for GPU computation only. If you have a
dual-GPU card, you can also connect one GPU to a single monitor, and use the
other GPU for computation (selected by `-G` in mcx). If you have to use a
non-dedicated GPU, you can either use the pure command-line mode (for Linux,
you need to stop X server), or use the `-r` flag to divide the total
simulation into a set of simulations with less photons, so that each simulation
only lasts a few seconds.
MCX is highly scalable, providing near-linear speedup as long as you supply the
GPU cores it can use. As a result, the better the GPU you use, the higher the
speed you can get. An enthusiast-grade GPU, such as the RTX 4070 Ti (~$700), can
be 12x faster than a low-end laptop RTX 4050 GPU, even within the same generation.

MCX can readily take advantage of multiple GPUs if they are installed. The
MCX simulation speed scales nearly linearly as the number of GPUs increases.
So, to maximize MCX performance, get at least a mid-range or high-end
consumer-grade GPU; if you need more speed, adding more GPUs will cut down the runtime.
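The near-linear multi-GPU scaling described above can be turned into a back-of-envelope runtime estimate (the `efficiency` factor is an assumption modeling small per-GPU launch/merge overheads, not a measured MCX number):

```python
def est_runtime(single_gpu_seconds, ngpu, efficiency=0.95):
    """Estimated wall time when the photon load is split across ngpu
    devices; efficiency < 1 models small launch/merge overheads."""
    return single_gpu_seconds / (ngpu * efficiency)

# a 2-minute single-GPU run on 4 comparable GPUs
print(round(est_runtime(120.0, 4), 1))  # → 31.6
```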

### Launch as many threads as possible

@@ -1389,7 +1387,7 @@ open-source projects (with a compatible license).
### ZMat data compression unit

- Files: src/zmat/*
- Copyright: 2019-2020 Qianqian Fang
- Copyright: 2019-2023 Qianqian Fang
- URL: https://github.com/fangq/zmat
- License: GPL version 3 or later, https://github.com/fangq/zmat/blob/master/LICENSE.txt

@@ -1461,6 +1459,6 @@ Links:
- [1] http://developer.nvidia.com/cuda-downloads
- [2] http://www.nvidia.com/object/cuda_gpus.html
- [3] http://en.wikipedia.org/wiki/Row-major_order
- [4] http://iso2mesh.sourceforge.net/cgi-bin/index.cgi?jsonlab
- [4] https://neurojson.org/jsonlab
- [5] http://science.jrank.org/pages/60024/particle-fluence.html
- [6] http://www.opticsinfobase.org/oe/abstract.cfm?uri=oe-17-22-20178
53 changes: 25 additions & 28 deletions README.txt
@@ -493,10 +493,10 @@ where possible parameters include (the first value in [*|*] is the default)
tx3 - GL texture data for rendering (GL_RGBA32F)
the bnii/jnii formats support compression (-Z) and generate small files
load jnii (JSON) and bnii (UBJSON) files using below lightweight libs:
MATLAB/Octave: JNIfTI toolbox https://github.com/NeuroJSON/jnifti,
MATLAB/Octave: JSONLab toolbox https://github.com/NeuroJSON/jsonlab,
Python: PyJData: https://pypi.org/project/jdata
JavaScript: JSData: https://github.com/NeuroJSON/jsdata
MATLAB/Octave: JNIfTI toolbox https://neurojson.org/download/jnifti
MATLAB/Octave: JSONLab toolbox https://neurojson.org/download/jsonlab
Python: PyJData: https://neurojson.org/download/pyjdata
JavaScript: JSData: https://neurojson.org/download/jsdata
-Z [zlib|...] (--zip) set compression method if -F jnii or --dumpjson
is used (when saving data to JSON/JNIfTI format)
0 zlib: zip format (moderate compression,fast)
@@ -950,11 +950,11 @@ the ND array with built-in compression, one should call JData compatible
libraries, which can be found at https://neurojson.org/#software

Specifically, to parse/save .jnii files in MATLAB, you should use
* JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
* JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

To parse/save .jnii files in Python, you should use
* PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
* PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['NIFTIData']`
is a NumPy `ndarray` object storing the volumetric data.
@@ -977,11 +977,11 @@ the ND array with built-in compression, one should call JData compatible
libraries, which can be found at https://neurojson.org/#software

Specifically, to parse/save .jnii files in MATLAB, you should use one of
* JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
* JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

To parse/save .jnii files in Python, you should use
* PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
* PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['NIFTIData']`
is a NumPy `ndarray` object storing the volumetric data.
@@ -1077,11 +1077,11 @@ Although .jdat and .jnii have different suffixes, they are both JSON/JData files and
can be opened/written by the same JData compatible libraries mentioned above, i.e.

For MATLAB
* JSONLab for MATLAB (https://github.com/fangq/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://github.com/fangq/jsonlab)
* JSONLab for MATLAB (https://neurojson.org/download/jsonlab) or install `octave-jsonlab` on Fedora/Debian/Ubuntu
* `jsonencode/jsondecode` in MATLAB + `jdataencode/jdatadecode` from JSONLab (https://neurojson.org/download/jsonlab)

For Python
* PyJData module (https://pypi.org/project/jdata/) or install `python3-jdata` on Debian/Ubuntu
* PyJData module (https://neurojson.org/download/pyjdata) or install `python3-jdata` on Debian/Ubuntu

In Python, the volumetric data is loaded as a `dict` object where `data['MCXData']['PhotonData']`
stores the photon data, `data['MCXData']['Trajectory']` stores the trajectory data etc.
@@ -1302,19 +1302,16 @@ MCX or MCXLAB prints a progress bar showing the percentage of completion.
To maximize MCX's performance on your hardware, you should follow the
best practices guide listed below:

=== Use dedicated GPUs ===
A dedicated GPU is a GPU that is not connected to a monitor. If you use
a non-dedicated GPU, any kernel (GPU function) can not run more than a
few seconds. This greatly limits the efficiency of MCX. To set up a
dedicated GPU, it is suggested to install two graphics cards on your
computer, one is set up for displays, the other one is used for GPU
computation only. If you have a dual-GPU card, you can also connect
one GPU to a single monitor, and use the other GPU for computation
(selected by -G in mcx). If you have to use a non-dedicated GPU, you
can either use the pure command-line mode (for Linux, you need to
stop X server), or use the "-r" flag to divide the total simulation
into a set of simulations with less photons, so that each simulation
only lasts a few seconds.
=== Use a mid-range or enthusiast-grade GPU, and use multiple GPUs if possible ===
MCX is highly scalable, providing near-linear speedup as long as you
supply the GPU cores it can use. As a result, the better the GPU you
use, the higher the speed you can get. An enthusiast-grade GPU, such as
the RTX 4070 Ti (~$700), can be 12x faster than a low-end laptop RTX
4050 GPU, even within the same generation.

MCX can readily take advantage of multiple GPUs if they are installed.
The MCX simulation speed scales nearly linearly as the number of GPUs
increases. So, to maximize MCX performance, get at least a mid-range or
high-end consumer-grade GPU; if you need more speed, adding more GPUs
will cut down the runtime.

=== Launch as many threads as possible ===
It has been shown that MCX's speed is related to the thread number (-t).
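As a toy illustration of choosing the thread number, -t values are typically rounded up to a multiple of the block size (-T); the helper below is hypothetical, and the 64-thread default block size is an assumption:

```python
def pick_thread_count(requested, blocksize=64):
    """Round a requested -t value up to a multiple of the -T block size,
    so every launched block is fully populated with threads."""
    return ((requested + blocksize - 1) // blocksize) * blocksize

print(pick_thread_count(10000))  # → 10048
```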
@@ -1368,7 +1365,7 @@ the "optimal" thread number when you are not sure what to use.
=== ZMat data compression unit ===

* Files: src/zmat/*
* Copyright: 2019-2020 Qianqian Fang
* Copyright: 2019-2023 Qianqian Fang
* URL: https://github.com/fangq/zmat
* License: GPL version 3 or later, https://github.com/fangq/zmat/blob/master/LICENSE.txt

@@ -1435,6 +1432,6 @@ Links:
[1] http://developer.nvidia.com/cuda-downloads
[2] http://www.nvidia.com/object/cuda_gpus.html
[3] http://en.wikipedia.org/wiki/Row-major_order
[4] http://iso2mesh.sourceforge.net/cgi-bin/index.cgi?jsonlab
[4] https://neurojson.org/jsonlab
[5] http://science.jrank.org/pages/60024/particle-fluence.html
[6] http://www.opticsinfobase.org/oe/abstract.cfm?uri=oe-17-22-20178
20 changes: 10 additions & 10 deletions src/mcx_utils.c
@@ -618,13 +618,13 @@ void mcx_savebnii(float* vol, int ndim, uint* dims, float* voxelsize, char* name
ubjw_begin_object(root, UBJ_MIXED, 0);
ubjw_write_key(root, "Python");
ubjw_begin_array(root, UBJ_STRING, 2);
ubjw_write_string(root, "https://pypi.org/project/jdata");
ubjw_write_string(root, "https://pypi.org/project/bjdata");
ubjw_write_string(root, "https://neurojson.org/download/pyjdata");
ubjw_write_string(root, "https://neurojson.org/download/pybjdata");
ubjw_end(root);
ubjw_write_key(root, "MATLAB");
ubjw_begin_array(root, UBJ_STRING, 2);
ubjw_write_string(root, "https://github.com/NeuroJSON/jnifty");
ubjw_write_string(root, "https://github.com/NeuroJSON/jsonlab");
ubjw_write_string(root, "https://neurojson.org/download/jnifty");
ubjw_write_string(root, "https://neurojson.org/download/jsonlab");
ubjw_end(root);
ubjw_write_key(root, "JavaScript");
ubjw_begin_array(root, UBJ_STRING, 2);
@@ -757,8 +757,8 @@ void mcx_savejnii(float* vol, int ndim, uint* dims, float* voxelsize, char* name
FILE* fp;
char fname[MAX_FULL_PATH] = {'\0'};
int affine[] = {0, 0, 1, 0, 0, 0};
const char* libpy[] = {"https://pypi.org/project/jdata", "https://pypi.org/project/bjdata"};
const char* libmat[] = {"https://github.com/NeuroJSON/jnifty", "https://github.com/NeuroJSON/jsonlab"};
const char* libpy[] = {"https://neurojson.org/download/pyjdata", "https://neurojson.org/download/pybjdata"};
const char* libmat[] = {"https://neurojson.org/download/jnifty", "https://neurojson.org/download/jsonlab"};
const char* libjs[] = {"https://www.npmjs.com/package/jda", "https://www.npmjs.com/package/bjd"};
const char* libc[] = {"https://github.com/DaveGamble/cJSON", "https://github.com/NeuroJSON/ubj"};

@@ -5555,10 +5555,10 @@ where possible parameters include (the first value in [*|*] is the default)\n\
tx3 - GL texture data for rendering (GL_RGBA32F)\n\
the bnii/jnii formats support compression (-Z) and generate small files\n\
load jnii (JSON) and bnii (UBJSON) files using below lightweight libs:\n\
MATLAB/Octave: JNIfTI toolbox https://github.com/NeuroJSON/jnifti,\n\
MATLAB/Octave: JSONLab toolbox https://github.com/NeuroJSON/jsonlab,\n\
Python: PyJData: https://pypi.org/project/jdata\n\
JavaScript: JSData: https://github.com/NeuroJSON/jsdata\n\
MATLAB/Octave: JNIfTI toolbox https://neurojson.org/download/jnifty\n\
MATLAB/Octave: JSONLab toolbox https://neurojson.org/download/jsonlab\n\
Python: PyJData: https://neurojson.org/download/pyjdata\n\
JavaScript: JSData: https://neurojson.org/download/jsdata\n\
-Z [zlib|...] (--zip) set compression method if -F jnii or --dumpjson\n\
is used (when saving data to JSON/JNIfTI format)\n\
0 zlib: zip format (moderate compression,fast) \n\
