An error was encountered while fitting DKI #4

Open
406354348 opened this issue Mar 22, 2022 · 7 comments

Comments

@406354348

Traceback (most recent call last):
File "/home/jht/anaconda3/envs/gnn/lib/python3.7/multiprocessing/queues.py", line 242, in _feed
send_bytes(obj)
File "/home/jht/anaconda3/envs/gnn/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
self._send_bytes(m[offset:offset + size])
File "/home/jht/anaconda3/envs/gnn/lib/python3.7/multiprocessing/connection.py", line 393, in _send_bytes
header = struct.pack("!i", n)
struct.error: 'i' format requires -2147483648 <= number <= 2147483647
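
For context on this traceback (my explanation, not from the thread): on Python 3.7, multiprocessing pickles each queue item in a background feeder thread and prefixes the message with a 4-byte signed length header (`struct.pack("!i", n)`), so no single object larger than 2 GiB (2**31 − 1 bytes) can be sent; Python 3.8 lifted this limit. A minimal sketch that reproduces the same traceback on Python 3.7, assuming roughly 4 GB of free RAM for the payload and its pickled copy:

```python
import multiprocessing as mp

if __name__ == "__main__":
    q = mp.Queue()
    # put() hands the object to a feeder thread, which pickles it and
    # writes a 4-byte signed length header before the payload. An object
    # of 2**31 bytes overflows that header on Python 3.7, and the feeder
    # thread prints exactly the traceback shown above:
    #   struct.error: 'i' format requires -2147483648 <= number <= 2147483647
    q.put(bytes(2**31))
    q.close()
    q.join_thread()  # the oversized item is silently dropped; nothing arrives
```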

@kodiweera
Owner

Can you please try fitting DKI on a single dataset? Do you have sufficient RAM?

@406354348
Author

Can you please try fitting DKI on a single dataset? Do you have sufficient RAM?
I use just one dataset for fitting and have enough RAM. My dMRI data is from the HCP, and the data dimensions are (145, 174, 145, 288).
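
A quick back-of-the-envelope on that shape (my arithmetic; the dtype the pipeline uses internally is an assumption): a (145, 174, 145, 288) volume holds about 1.05 billion voxels, so even a single float32 copy already exceeds the 2 GiB per-message limit behind the traceback above:

```python
import numpy as np

shape = (145, 174, 145, 288)        # HCP volume dimensions from this thread
voxels = int(np.prod(shape))        # 1,053,604,800 voxels
print(voxels * 4 / 2**30)           # float32 copy: ~3.9 GiB
print(voxels * 8 / 2**30)           # float64 copy: ~7.8 GiB
print((2**31 - 1) / 2**30)          # Python 3.7 pickle message cap: ~2.0 GiB
```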

@kodiweera
Owner

How many shells does your DWI data have? Did you correctly specify the number of b0 images and the b-values? How did your DTI fitting go?

@406354348
Author

How many shells does your DWI data have? Did you correctly specify the number of b0 images and the b-values? How did your DTI fitting go?

My DWI data has three shells (b=1000, 2000, and 3000), each with 90 directions, plus 18 b0 images. The command I use is `python -m difit '/home/jht/gnn_pnt/data/102311' '/home/jht/gnn_pnt/data/102311/out' '/home/jht/gnn_pnt/data/work' --models dki --dki_b_values 1000 2000 3000 --dki_b0_images 18 --mem 9 --nprocs 1`. I have not tried DTI fitting. It is also possible that setting --mem to 9 is too large.

@kodiweera
Owner

Please give it a try with fewer b0 images, and increase --nprocs to 2. If you have more RAM, you can also increase --mem. How big is the dataset?
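
A hypothetical retry along these lines, reusing only the flags from the reporter's original command; the specific --mem value is illustrative, and fitting with fewer b0 images would also require changing --dki_b0_images to match the data:

```bash
python -m difit '/home/jht/gnn_pnt/data/102311' \
    '/home/jht/gnn_pnt/data/102311/out' '/home/jht/gnn_pnt/data/work' \
    --models dki --dki_b_values 1000 2000 3000 --dki_b0_images 18 \
    --mem 16 --nprocs 2   # --nprocs raised to 2; --mem raised if RAM allows
```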

@kodiweera
Owner

You can try the Docker/Singularity image to rule out any system-related issues. If you are doing a longitudinal study, the container approach is best.
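
A hypothetical Singularity invocation of the same run; the image file name below is a placeholder, since this thread does not say where the difit container is published:

```bash
# difit.sif is a placeholder image name; bind-mount the data tree so the
# containerized run sees the same paths as the native one.
singularity exec --bind /home/jht/gnn_pnt:/home/jht/gnn_pnt difit.sif \
    python -m difit '/home/jht/gnn_pnt/data/102311' \
    '/home/jht/gnn_pnt/data/102311/out' '/home/jht/gnn_pnt/data/work' \
    --models dki --dki_b_values 1000 2000 3000 --dki_b0_images 18 \
    --mem 9 --nprocs 2
```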

@406354348
Author

Thank you for your answer. My data size is 1.15 GB; I'll try your suggestions.
