Copying (FieldmapNFFTOp) #152
I would expect composition to make shallow copies, i.e., to reuse existing object memory as much as possible. I haven't actually tested that with Julia's base …
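To illustrate the shared-memory expectation (a minimal sketch with a hypothetical `Composite` type, not the actual MRIReco.jl internals): a composite operator that stores references to its components reuses their memory rather than copying it.

```julia
# Hypothetical composite operator: it stores references to its parts,
# so composing two operators allocates almost nothing new.
struct Composite{A,B}
    left::A
    right::B
end

A = rand(3, 3)
B = rand(3, 3)
C = Composite(A, B)

# The composite holds the *same* arrays, not copies:
@assert C.left === A
@assert C.right === B
```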
Thanks @JeffFessler, that makes sense to me in the interest of minimizing allocations. You've convinced me that having a shallow copy isn't cause for concern, but after some more digging this morning I've come across something interesting (and still related to copying FieldmapNFFTOp; I've changed the title to reflect this). Whether we shallow or deep copy, we should end up with a copy of the operator that uses at most the same amount of memory as the original, and ideally less if we are clever. It seems that at the moment, copied FieldmapNFFTOp structs grow with each subsequent copy. Evaluating an extension of the above example (with N = 256):

```julia
F_fmap_nfft = FieldmapNFFTOp((N,N), tr, cmap, symmetrize=false)
F_fmap_nfft_copy = copy(F_fmap_nfft)
```

and running …
If this behaviour is unexpected (and feel free to let me know if it isn't) then I will investigate a fix.
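One generic way to check for this kind of growth (a sketch using a hypothetical `Op` struct as a stand-in for the real FieldmapNFFTOp) is to compare `Base.summarysize` before and after copying:

```julia
# Hypothetical stand-in for an operator with a mutable `plans` field.
# With a well-behaved `copy`, the copy should be no larger than the original.
mutable struct Op
    plans::Vector{Vector{Float64}}
end

# Shallow field-wise copy: the outer vector is duplicated,
# but the inner plan vectors are shared.
Base.copy(op::Op) = Op(copy(op.plans))

op  = Op([rand(100) for _ in 1:4])
op2 = copy(op)

# The copy shares the inner plans and uses at most the original's memory:
@assert op2.plans[1] === op.plans[1]
@assert Base.summarysize(op2) <= Base.summarysize(op)
```

If repeated copies kept growing, `Base.summarysize` of each successive copy would increase, which is the symptom described above.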
Interesting. Normally I would not expect a …
I have not had a deeper look into this issue so far. If I recall correctly, our … Having said that, we might need to rename all of our …
```julia
julia> A = rand(3)
3-element Vector{Float64}:
 0.996877573163787
 0.6833890929190753
 0.7392402674660837

julia> B = copy(A)
3-element Vector{Float64}:
 0.996877573163787
 0.6833890929190753
 0.7392402674660837

julia> A .= 0
3-element Vector{Float64}:
 0.0
 0.0
 0.0

julia> B
3-element Vector{Float64}:
 0.996877573163787
 0.6833890929190753
 0.7392402674660837
```

This looks deep. But it's a call to …
Here's an example that is more surprising (to me). I did not expect:

```julia
julia> x = [zeros(2), "hi"]
2-element Vector{Any}:
 [0.0, 0.0]
 "hi"

julia> y = copy(x)
2-element Vector{Any}:
 [0.0, 0.0]
 "hi"

julia> y[1] = 7;

julia> y
2-element Vector{Any}:
 7
 "hi"

julia> x
2-element Vector{Any}:
 [0.0, 0.0]
 "hi"
```
@tknopp @JeffFessler I pushed a branch that contains a possible fix for this issue with the name …
#153 addresses this issue @JeffFessler and @tknopp |
Great, thanks for fixing @alexjaffray. So I am closing this issue for now. Please reopen if there are still issues. |
@nbwuzhe, @mrikasper and I are debugging some intermittent segmentation faults (or a memory leak, we're not sure which) in our (GIRF+B0)-aware spiral reconstruction workflow, and in the process we have stumbled across what is likely a bug. It seems as though composition of operators with `FieldmapNFFTOp` results in a shallow copy of at least the `plans` field of the `FieldmapNFFTOp`. This is illustrated in the following MWE:

…

This example fails on the last assert.

@tknopp would it be possible to do a true deepcopy of the FieldmapNFFTOp?
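A sketch of one possible fix (hypothetical `MyOp` struct and field names; the real `FieldmapNFFTOp` lives in MRIReco.jl and may differ): define `Base.copy` for the operator so that it copies the mutable `plans` field rather than aliasing it, giving each copy its own plan storage without paying for a full `deepcopy`.

```julia
# Hypothetical operator standing in for FieldmapNFFTOp.
struct MyOp
    shape::Tuple{Int,Int}
    plans::Vector{Vector{ComplexF64}}
end

# Copy each plan individually so the copy owns its own storage
# and two operators never share (or grow) the same plans.
function Base.copy(op::MyOp)
    MyOp(op.shape, [copy(p) for p in op.plans])
end

op  = MyOp((4, 4), [rand(ComplexF64, 8) for _ in 1:2])
op2 = copy(op)

@assert op2.plans[1] == op.plans[1]        # same values...
@assert !(op2.plans[1] === op.plans[1])    # ...independent storage
```

Whether copying the plans is safe depends on whether the underlying NFFT plans are themselves safely copyable; if not, regenerating them in `copy` may be needed instead.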