
predict_md.sh (Middlebury 2014 dataset) GPU running out of memory #34

Open
andrea-unity opened this issue May 5, 2022 · 1 comment
andrea-unity commented May 5, 2022

I'm able to run inference on the KITTI 2015 dataset.
Do you know how I can run prediction on Middlebury 2014 on a single GPU with 24 GB?
It always runs out of memory. Should I downsize the input?

I'm using MiddEval3-data-H -> 1000 × 1500 images

Exception has occurred: RuntimeError
CUDA out of memory. Tried to allocate 5.49 GiB (GPU 0; 23.68 GiB total capacity; 16.71 GiB already allocated; 3.46 GiB free; 18.42 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
File "/home/andreaa/dev/stereo_depth/LEAStereo/retrain/skip_model_3d.py", line 47, in forward
s1 = F.interpolate(s1, [feature_size_d, feature_size_h, feature_size_w], mode='trilinear', align_corners=True)
File "/home/andreaa/dev/stereo_depth/LEAStereo/retrain/skip_model_3d.py", line 155, in forward
out10= self.cells[10](out9[0], out9[1])
File "/home/andreaa/dev/stereo_depth/LEAStereo/retrain/LEAStereo.py", line 41, in forward
cost = self.matching(cost)
File "/home/andreaa/dev/stereo_depth/LEAStereo/utils/multadds_count.py", line 21, in comp_multadds
_ = model(input_data, input_data)
File "/home/andreaa/dev/stereo_depth/LEAStereo/predict.py", line 48, in
mult_adds = comp_multadds(model, input_size=(3,opt.crop_height, opt.crop_width)) #(3,192, 192))
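For what it's worth, the traceback shows the OOM happening inside comp_multadds, which runs a dummy forward pass at the full crop size just to count multiply-adds; the commented-out (3, 192, 192) on that predict.py line suggests a smaller probe size was used at some point. Beyond that, a minimal sketch of the "downsize the input" idea is below. It is not code from this repo: the helper name, the target size, and the model call signature are assumptions. The stereo pair is shrunk before the forward pass, and the predicted disparity is resized back and rescaled by the horizontal factor.

```python
# Hypothetical sketch (not part of LEAStereo): downscale a stereo pair before
# inference to cut GPU memory, then upscale the predicted disparity map.
import torch
import torch.nn.functional as F

def predict_downscaled(model, left, right, target_hw=(504, 744)):
    """left/right: (1, 3, H, W) tensors; returns disparity at the original H x W."""
    _, _, H, W = left.shape
    th, tw = target_hw
    # Shrink both views; bilinear resampling is fine for RGB inputs.
    left_s = F.interpolate(left, size=(th, tw), mode='bilinear', align_corners=False)
    right_s = F.interpolate(right, size=(th, tw), mode='bilinear', align_corners=False)
    with torch.no_grad():
        disp_s = model(left_s, right_s)  # assumed to return (1, th, tw) or (1, 1, th, tw)
    if disp_s.dim() == 3:
        disp_s = disp_s.unsqueeze(1)
    # Resize back to full resolution and rescale disparity magnitudes by the width ratio.
    disp = F.interpolate(disp_s, size=(H, W), mode='bilinear', align_corners=False)
    disp = disp * (W / tw)
    return disp.squeeze(1)
```

Separately, the allocator hint printed in the error can be tried as an environment variable, e.g. PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128, but that only mitigates fragmentation; it won't help if the peak allocation genuinely exceeds 24 GB.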

andrea-unity changed the title from "predict.py GPU running out of memory" to "predict_md.sh (Middlebury 2014 dataset) GPU running out of memory" on May 5, 2022
@AiYoWeiYL

I have the same problem. Have you solved it?
