
Update hdf5_puff.f90.in so that new fields are written (separate files for ranks) #21

Closed
jdasmith opened this issue May 16, 2016 · 6 comments

Comments

@jdasmith
Collaborator

No description provided.

@jdasmith jdasmith self-assigned this May 16, 2016
@jdasmith
Collaborator Author

I have added some infrastructure to simplify the writing of a field mesh and its limits. This appears to be working quite nicely, and has allowed me to write the front, real field to a file, which can be visualised. I now need to abstract out everything apart from the dataset-specific bits (i.e. the parts I expect to derive from the range in z). These new blocks will also make it easy to write the electron and field datasets in parallel, since the limits are shared.

@jdasmith
Collaborator Author

Note - I'm going to commit only once I've got the six files there, otherwise I might be causing more trouble than I'm saving.

@cabs46
Collaborator

cabs46 commented Jun 11, 2016

Jesus Jonny, get to bed!
B.


@jdasmith
Collaborator Author

jdasmith commented Jun 12, 2016

I've committed what I have, though some questions remain. I fixed the particles so that they again work with the distributed field, and tried the current version of the parallel-field branch on 8 cores. I looked at the sizes of the datasets produced by the various nodes, as I've not done the stitching back together yet (and stitching will be done in a Fortran-to-C sense while the issues around Fortran ordering in VizSchema remain). See either http://slideplayer.com/slide/3265765/ (slide 10 of 27) or https://ice.txcorp.com/trac/vizschema/wiki/DataOrdering for details on the fact that Fortran array ordering should be supported. I'm quite likely to be making some off-by-one errors in the array sizes in the meantime.
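The Fortran-vs-C ordering issue mentioned above can be illustrated with a small numpy sketch (this is a generic illustration of the ordering mismatch, not code from Puffin or VizSchema):

```python
import numpy as np

# A 3D field stored in Fortran (column-major) order: the first index
# varies fastest in memory, the opposite of C (row-major) order.
a = np.arange(24).reshape((2, 3, 4), order='F')

# Reading the same flat buffer back with C-order conventions reverses
# the axes: a {nx, ny, nz2} Fortran array appears as {nz2, ny, nx} to a
# C-order reader, which is why the h5ls output below shows nz2 first.
flat = a.flatten(order='F')
b = flat.reshape((4, 3, 2), order='C')

# The C-order view equals the transposed Fortran array, element for element.
assert np.array_equal(b, a.transpose(2, 1, 0))
```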

The front field is present on ranks 0–3 only (it seems there are four cells in that region).
The back field is present on all ranks for a dump at timestep 60. Its size is 186 cells on ranks 0 and 1 and 185 on the others (1482 in total), which makes sense, but the 'active' field is 139 cells on all ranks except rank 7, where it is 245 (1218 in total). The sum is 2704 cells in nz2, which is correct, but why the uneven allocation of the active field? (The rank 7 active fields are correspondingly nearly twice the size of those on the other ranks.)

See below:
-bash-4.1$ for i in $(ls ~/testpuffin/testfig7-pf/fig7_aperp_*60.h5); do echo $i; h5ls $i | grep -i data; done
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_0_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_1_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_2_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_3_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_4_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_5_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_6_60.h5
aperp_active_imag Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_imag_7_60.h5
aperp_active_imag Dataset {245, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_0_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_1_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_2_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_3_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_4_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_5_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_6_60.h5
aperp_active_real Dataset {139, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_active_real_7_60.h5
aperp_active_real Dataset {245, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_0_60.h5
aperp_back_imag Dataset {186, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_1_60.h5
aperp_back_imag Dataset {186, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_2_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_3_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_4_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_5_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_6_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_imag_7_60.h5
aperp_back_imag Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_0_60.h5
aperp_back_real Dataset {186, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_1_60.h5
aperp_back_real Dataset {186, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_2_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_3_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_4_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_5_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_6_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_back_real_7_60.h5
aperp_back_real Dataset {185, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_imag_0_60.h5
aperp_front_imag Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_imag_1_60.h5
aperp_front_imag Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_imag_2_60.h5
aperp_front_imag Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_imag_3_60.h5
aperp_front_imag Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_real_0_60.h5
aperp_front_real Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_real_1_60.h5
aperp_front_real Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_real_2_60.h5
aperp_front_real Dataset {1, 64, 64}
/gpfs/stfc/local/HCP084/bwm06/jds89-bwm06/testpuffin/testfig7-pf/fig7_aperp_front_real_3_60.h5
aperp_front_real Dataset {1, 64, 64}
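The per-rank nz2 extents in the listing above can be sanity-checked with a few lines of Python (the numbers are transcribed from the h5ls output, nothing else is assumed):

```python
# Per-rank nz2 extents reported by h5ls above, for 8 MPI ranks.
back = [186, 186] + [185] * 6    # ranks 0-7
active = [139] * 7 + [245]       # rank 7 is the uneven one
front = [1] * 4                  # present on ranks 0-3 only

print(sum(back), sum(active), sum(front))    # 1482 1218 4
print(sum(back) + sum(active) + sum(front))  # 2704 cells in nz2
```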

@jdasmith
Collaborator Author

I think the effect above is understood. A utility has been provided to stitch these files back together, and it has been 'tested' in the sense that I've checked the data structure is the right shape, and appears to have signal on axis and not off-axis. There is some signal towards the back of the domain fairly early on though, which may not be physical. Handed over to Lawrence for further testing.
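For reference, the core of such a stitching step reduces to concatenating the per-rank slabs along the nz2 axis. The sketch below is a hypothetical illustration in Python/numpy (the actual utility is in the Puffin repo and works on the HDF5 files directly; the stand-in arrays here just mimic the 'back' field sizes from the h5ls listing above):

```python
import numpy as np

def stitch(chunks):
    """Join per-rank slabs (each {nz2_local, ny, nx}) into one field
    along the first axis, i.e. along nz2 as h5ls reports it."""
    return np.concatenate(chunks, axis=0)

# Stand-ins for the 'back' field at timestep 60: 186 cells on
# ranks 0-1, 185 on ranks 2-7, each slab 64x64 transversely.
chunks = [np.zeros((n, 64, 64)) for n in [186, 186] + [185] * 6]
full = stitch(chunks)
print(full.shape)   # (1482, 64, 64)
```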

@jdasmith
Collaborator Author

jdasmith commented Jun 30, 2016

The above issues don't seem to be as problematic as feared; this works, subject to the qSeparate bug #39. There is a separate ticket for the test (#28, #29), so marking this closed.
