
Commit

Built site for gh-pages
ASKabalan committed Jun 5, 2024
1 parent 96985d8 commit 7625f46
Showing 85 changed files with 6,620 additions and 5,866 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
@@ -1 +1 @@
a55ddfcb
d98d715f
Binary file added marseille2024/assets/lsst_bg.jpg
1,344 changes: 757 additions & 587 deletions marseille2024/index.html

Large diffs are not rendered by default.

13 changes: 3 additions & 10 deletions search.json
@@ -1,17 +1,10 @@
[
{
"objectID": "marseille2024/index.html",
"href": "marseille2024/index.html",
"title": "Differentiable and distributed Particle-Mesh n-body simulations",
"section": "",
"text": "lambda CMD model\n\n\n\n\n\n\n \n\n\n\nPhysical baryon density \\(\\Omega_b\\)\nPhysical dark matter density \\(\\Omega_{cdm}\\)\nThe age of the universe \\(t_0\\)\nScalar spectral index \\(n_s\\)\nCurvature fluctuation amplitude \\(A_s\\)\nReionization optical depth \\(\\tau\\)"
},
{
"objectID": "marseille2024/index.html#the-lambdacdm-view-of-the-universe",
"href": "marseille2024/index.html#the-lambdacdm-view-of-the-universe",
"title": "Differentiable and distributed Particle-Mesh n-body simulations",
"section": "",
"text": "lambda CMD model\n\n\n\n\n\n\n \n\n\n\nPhysical baryon density \\(\\Omega_b\\)\nPhysical dark matter density \\(\\Omega_{cdm}\\)\nThe age of the universe \\(t_0\\)\nScalar spectral index \\(n_s\\)\nCurvature fluctuation amplitude \\(A_s\\)\nReionization optical depth \\(\\tau\\)"
"section": "the \\(\\Lambda\\)CDM view of the universe",
"text": "the \\(\\Lambda\\)CDM view of the universe\n\n\n\n\n\n\n\nlambda CMD model\n\n\n\n\n\n\n \nCosmological Parameters:\n\nPhysical baryon density \\(\\Omega_b\\)\nPhysical dark matter density \\(\\Omega_{cdm}\\)\nThe age of the universe \\(t_0\\)\nScalar spectral index \\(n_s\\)\nCurvature fluctuation amplitude \\(A_s\\)\nReionization optical depth \\(\\tau\\)"
},
{
"objectID": "marseille2024/index.html#cosmological-probes",
@@ -25,6 +18,6 @@
"href": "marseille2024/index.html#traditional-cosmological-inference",
"title": "Differentiable and distributed Particle-Mesh n-body simulations",
"section": "Traditional cosmological inference",
"text": "Traditional cosmological inference\n\n\n\n\n\n\n\n\n (Hikage et al. 2018)\n\n\n\n\n\n\n\n\n\n\n\n(Hikage et al. 2018)\n\n\n\n ➢  Measure the ellipticity \\(e = \\gamma + e_i\\) of all galaxies\n\n\n ➢  Compute summary statistics based on the 2-point correlation function of the shear field\n\n\n ➢  Run an MCMC chain to recover the posterior distribution of the cosmological parameters, using an analytical likelihood \\[p(\\theta | x ) \\propto \\underbrace{p(x | \\theta)}_{\\mathrm{likelihood}} \\ \\underbrace{p(\\theta)}_{\\mathrm{prior}}\\]\n\n\n\n\n\n\nLimitations\n\n\n\nSimple summary statistics works well for Gaussian fields\nThe need to compute from theory the likelihood for simple summary statistics\n\n\n\n\n\n\nBeyond 2 point statistics : Forward modeling\n\n\n\n\n\n\n\n➕  No longer need to compute the likelihood analytically ➖  We need to infer the joint posterior \\(p(\\theta, z | x)\\) before marginalization to get \\(p(\\theta | x) = \\int p(\\theta, z | x) \\, dz\\)\n\n\n\n\nPossible solutions\n\n\n\nHamiltonian Monte Carlo\nVariational Inference\nDimensionality reduction using Fisher Information Matrix\n\nAll require a differentiable fast forward model\n\n\n\n\n\n\nForward Models in Cosmology\n \n\n\n\n\n\n\nInitial Conditions\n\n\n\n\n\n\nFinal Dark Matter\n\n\n\n\n\n\nLinear field\n\n\n\n\n\n\nLinear field\n\n\n\n\n\n\n\n\nCosmological simulations WIP\n\n\n\n\nFast Particle-Mesh WIP\nHaving a full n body simulation is very expensive, so we use a particle mesh simulation\nThe idea is to put particles in a grid and evolve the particles in the grid\nyou lose some precision on the small scales, but on the large scale it is pretty accurate.\nNumerical cheme\nCloud in Cell binning\n\n\n\nalt text\n\n\n\n\n\nalt text\n\n\nwhen we have the forces on the grid point we interpolate the forces on the particles\n\n\n\nalt text\n\n\n\n\n\nWhere does LSST fit into this WIP\nRubin observatory will provide us with a 15 TB of data per night for 10 years\nsay that we need to make a cube of big part of the volume of the survey.\nPicture of a 3D force vector field\nHigh end GPUs have a reached 80 GBs A100 next generation H100\n\n\n\nProblems to consider WIP\n\nDiffrentability\nFast GPU\nDistribution of the data\nMulti Host distribution\n\nJAX\n\n\n\nJAXDECOMP WIP\nexample of numpy code and grad\nwe can also run multiple devices in a single controlle set up\n80 w=x 80\nDifferentiable multi host distributed n body simulation that runs on GPUs\nRelated work\nFastPM Poqueres nbody 64^3 pmwd 512^3\n\n\n\nJAXPM WIP\nbuilt on top of JAXDECOMP\nexample code of JAXPM\nThree steps\nGenerate the linear field\nthen LPT simulation\nthen the final nbody simulation (use any differential equation solver like diffrax)\nFinally we put the final grid back to the particles\n\n\n\nAutoDiff capabilities WIP\nWe can simply differentiate the entire simulation using jax.grad\nWe can also interface it with machine learning frameworks like flax for the dark matter halo painting and galaxy painting\n\n\n\nBenchmark WIP\n\n\nAnimation WIP\nUnique solutions : no, two particles crossing each other yoy can’t tell which one is which"
"text": "Traditional cosmological inference\n\n\n\n\n\n\n\n\n (Hikage et al. 2018)\n\n\n\n\n\n\n\n\n\n\n\n\n\n ➢  Measure the ellipticity \\(e = \\gamma + e_i\\) of all galaxies\n\n\n ➢  Compute summary statistics based on the 2-point correlation function of the shear field\n\n\n ➢  Run an MCMC chain to recover the posterior distribution of the cosmological parameters, using an analytical likelihood \\[p(\\theta | x ) \\propto \\underbrace{p(x | \\theta)}_{\\mathrm{likelihood}} \\ \\underbrace{p(\\theta)}_{\\mathrm{prior}}\\]\n\n\n\n\n\n\nLimitations\n\n\n\nSimple summary statistics works well for Gaussian fields\nThe need to compute from theory the likelihood for simple summary statistics\n\n\n\n\n\n(Hikage et al. 2018)"
}
]
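The slide text in the search.json diff above describes a particle-mesh forward model with Cloud-in-Cell binning, differentiated end to end with jax.grad. The following is a minimal illustrative sketch only: it does not use the actual JAXPM or jaxDecomp APIs, and the function names, grid size, and particle count are hypothetical. It shows a 1-D Cloud-in-Cell painting step in JAX and a gradient of a toy summary statistic with respect to the particle positions.

import jax
import jax.numpy as jnp

def cic_paint_1d(positions, n_cells):
    # Deposit unit-mass particles on a periodic 1-D grid with Cloud-in-Cell weights.
    left = jnp.floor(positions).astype(jnp.int32)   # index of the cell to the left of each particle
    frac = positions - left                         # fractional distance into that cell
    grid = jnp.zeros(n_cells)
    # Each particle contributes (1 - frac) to its left cell and frac to the right cell.
    grid = grid.at[left % n_cells].add(1.0 - frac)
    grid = grid.at[(left + 1) % n_cells].add(frac)
    return grid

def summary(positions, n_cells=64):
    # Toy scalar summary: mean squared overdensity of the painted field (hypothetical choice).
    rho = cic_paint_1d(positions, n_cells)
    delta = rho / rho.mean() - 1.0
    return jnp.mean(delta ** 2)

key = jax.random.PRNGKey(0)
positions = jax.random.uniform(key, (256,)) * 64.0   # hypothetical particle positions in grid units
value = summary(positions)                           # forward pass
grads = jax.grad(summary)(positions)                 # d(summary)/d(positions), shape (256,)
print(value, grads.shape)

The same scatter-add pattern extends to 3-D with eight neighbouring cells per particle, and jax.grad flows through the whole pipeline because the Cloud-in-Cell weights depend smoothly on the positions.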
