Overlap fixing and more (#107)

* Update README

* Update density field reader

* Update name of SDSSxALFALFA

* Fix quick bug

* Add little fixes

* Update README

* Put back fit_init

* Add paths to initial snapshots

* Add export

* Remove some choices

* Edit README

* Add Jens' comments

* Organize imports

* Rename snapshot

* Add additional print statement

* Add paths to initial snapshots

* Add masses to the initial files

* Add normalization

* Edit README

* Update README

* Fix bug in CSiBORG1 so that it does not read fof_00001

* Edit README

* Edit README

* Overwrite comments

* Add paths to init lag

* Fix Quijote path

* Add lagpatch

* Edit submits

* Update README

* Fix numpy int problem

* Update README

* Add a flag to keep the snapshots open when fitting

* Add a flag to keep snapshots open

* Comment out some path issue

* Keep snapshots open

* Access snapshot directly

* Add lagpatch for CSiBORG2

* Add treatment of x-z coordinates flipping

* Add radial velocity field loader

* Update README

* Add lagpatch to Quijote

* Fix typo

* Add setter

* Fix typo

* Update README

* Add output halo cat as ASCII

* Add import

* Add halo plot

* Update README

* Add evaluation of the field at a radial distance

* Add field shell evaluation

* Add enclosed mass computation

* Add BORG2 import

* Add BORG boxsize

* Add BORG paths

* Edit run

* Add BORG2 overdensity field

* Add bulk flow calculation

* Update README

* Add new plots

* Add nbs

* Edit paper

* Update plotting

* Fix overlap paths to contain simname

* Add normalization of positions

* Add default paths to CSiBORG1

* Add overlap path simname

* Fix little things

* Add CSiBORG2 catalogue

* Update README

* Add import

* Add TNG density field constructor

* Add TNG density

* Add draft of calculating BORG ACL

* Fix bug

* Add ACL of enclosed density

* Add nmean ACL

* Add galaxy bias calculation

* Add BORG acl notebook

* Add enclosed mass calculation

* Add TNG300-1 dir

* Add TNG300 and BORG1 dir

* Update nb
Richard Stiskalek 2024-01-30 16:14:07 +00:00 committed by GitHub
parent 0984191dc8
commit 9e4b34f579
30 changed files with 10037 additions and 248 deletions


@ -4,24 +4,37 @@ Tools for analysing the suite of Constrained Simulations in BORG (CSiBORG) simul
## Ongoing projects
### Data to calculate
- [x] Process all CSiBORG1 snapshots (running).
- [ ] Calculate halo properties for CSiBORG1
- [x] Calculate initial properties for CSiBORG1
- [ ] Calculate halo properties for CSiBORG2
- [x] Calculate initial properties for CSiBORG2
- [ ] Process all Quijote simulations.
- [ ] Calculate halo properties for Quijote
- [ ] Calculate initial properties for Quijote
### General
- [ ] Add new halo properties to the catalogues.
- [x] Add initial halo properties to the catalogues.
- [x] Add a new flag for flipping x- and z-coordinates for catalogues, snapshots and field readers.
- [x] Add radial velocity field loader.
### Consistent halo reconstruction
- [ ] Make a sketch of the overlap definition and add it to the paper.
- [ ] Improve the storage system for overlaps and calculate it for all simulations.
- [ ] Re-calculate the overlaps for CSiBORG1, Quijote and CSiBORG2
- [x] Fix the script to calculate the initial lagrangian positions etc.
### Environmental dependence of galaxy properties
- [ ] Prepare the CSiBORG one particle files for SPH.
- [x] Prepare the CSiBORG one particle files for SPH.
- [ ] Transfer, calculate the SPH density field for CSiBORG1 and transfer back.
- [x] Check that the velocity-field flipping of x and z coordinates is correct.
- [x] Evaluate and share the density field for SDSS and SDSSxALFALFA for both CSiBORG2 and random fields.
- [ ] Check and verify the density field of galaxy colours (cannot do this now! Glamdring is super slow.)
#### Calculated data
##### SPH-density & velocity field
- *CSiBORG2_main*, *CSiBORG2_random*, *CSiBORG2_varysmall*
- Evaluated for SDSS and SDSSxALFALFA in: *CSiBORG2_main*, *CSiBORG2_random*
#### Radial velocity field
- *CSiBORG2_main*
- [x] Check and verify the density field of galaxy colours (cannot do this now! Glamdring is super slow.)
- [x] Calculate the radial velocity field for random realizations (submitted)
- [x] Send Catherine concatenated data.
- [ ] Start analyzing DiSPERSE results.
### Mass-assembly of massive clusters
@ -33,13 +46,35 @@ Tools for analysing the suite of Constrained Simulations in BORG (CSiBORG) simul
### Effect of small-scale noise
- [ ] Study how the small-scale noise variation affects the overlap measure, halo concentration and spin.
- [ ] Add uncertainty on the halo concentration.
### Gravitational-wave and large-scale structure
- [ ] Validate the velocity field results against Supranta data sets.
- [x] Write code to estimate the enclosed mass and bulk flow.
- [ ] Write code to estimate the average radial velocity in a spherical shell.
- [ ] Write code to calculate the power spectrum of velocities.
- [ ] Estimate the amplitude of the velocity field in radial shells around the observer, estimate analogous results for random simulations, and see if they agree within cosmic variance.
- [ ] Calculate power spectra of velocities and maybe velocity dispersion.
- [ ] Make the velocity field data available.
### CSiBORG meets X-ray
- [ ] Make available one example snapshot from the simulation. Mention the issue with x- and z-coordinates.
- [x] Make available one example snapshot from the simulation. Mention the issue with x- and z-coordinates.
- [ ] Answer Johan and make a comparison to the Planck clusters.
### CSiBORG advertising
- [ ] Decide on the webpage design and what to store there.
- [ ] Write a short letter describing the simulations.
### Calculated data
#### Enclosed mass & bulk velocity
- *CSiBORG2_main*, *CSiBORG2_varysmall*, *CSiBORG2_random*
#### SPH-density & velocity field
- *CSiBORG2_main*, *CSiBORG2_random*, *CSiBORG2_varysmall*
- Evaluated for SDSS and SDSSxALFALFA in: *CSiBORG2_main*, *CSiBORG2_random*
#### Radial velocity field
- *CSiBORG2_main*, *CSiBORG2_random*


@ -17,7 +17,8 @@ from csiborgtools import clustering, field, halo, match, read, summary
from .utils import (center_of_mass, delta2ncells, number_counts, # noqa
periodic_distance, periodic_distance_two_points, # noqa
binned_statistic, cosine_similarity, fprint, # noqa
hms_to_degrees, dms_to_degrees, great_circle_distance) # noqa
hms_to_degrees, dms_to_degrees, great_circle_distance, # noqa
radec_to_cartesian) # noqa
from .params import paths_glamdring, simname2boxsize # noqa
@ -52,7 +53,9 @@ class SDSSxALFALFA:
if fpath is None:
fpath = "/mnt/extraspace/rstiskalek/catalogs/5asfullmatch.fits"
sel_steps = self.steps if apply_selection else None
return read.SDSS(fpath, h=1, sel_steps=sel_steps)
survey = read.SDSS(fpath, h=1, sel_steps=sel_steps)
survey.name = "SDSSxALFALFA"
return survey
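Setting `survey.name` lets downstream readers label output files by survey; an illustrative use (the call pattern mirrors the notebook added later in this commit):

import csiborgtools

survey = csiborgtools.SDSSxALFALFA()(apply_selection=False)
print(survey.name)  # "SDSSxALFALFA", used e.g. to name interpolated-field files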
###############################################################################


@ -17,5 +17,6 @@ from .density import (DensityField, PotentialField, TidalTensorField,
overdensity_field) # noqa
from .interp import (evaluate_cartesian, evaluate_sky, field2rsp, # noqa
fill_outside, make_sky, observer_peculiar_velocity, # noqa
nside2radec, smoothen_field) # noqa
smoothen_field, field_at_distance) # noqa
from .corr import bayesian_bootstrap_correlation # noqa
from .utils import nside2radec # noqa


@ -15,7 +15,6 @@
"""
Tools for interpolating 3D fields at arbitrary positions.
"""
import healpy
import MAS_library as MASL
import numpy
import smoothing_library as SL
@ -23,7 +22,7 @@ from numba import jit
from tqdm import tqdm, trange
from ..utils import periodic_wrap_grid, radec_to_cartesian
from .utils import divide_nonzero, force_single_precision
from .utils import divide_nonzero, force_single_precision, nside2radec
###############################################################################
@ -219,18 +218,47 @@ def make_sky(field, angpos, dist, boxsize, verbose=True):
return out
def nside2radec(nside):
"""
Generate RA [0, 360] deg. and declination [-90, 90] deg. for HEALPix pixel
centres at a given nside.
"""
pixs = numpy.arange(healpy.nside2npix(nside))
theta, phi = healpy.pix2ang(nside, pixs)
###############################################################################
# Average field at a radial distance #
###############################################################################
ra = 180 / numpy.pi * phi
dec = 90 - 180 / numpy.pi * theta
return numpy.vstack([ra, dec]).T
def field_at_distance(field, distance, boxsize, smooth_scales=None, nside=128,
verbose=True):
"""
Evaluate a scalar field at uniformly spaced angular coordinates at a
given distance from the observer.
Parameters
----------
field : 3-dimensional array of shape `(grid, grid, grid)`
Field to be interpolated.
distance : float
Distance from the observer in `Mpc / h`.
boxsize : float
Box size in `Mpc / h`.
smooth_scales : (list of) float, optional
Smoothing scales in `Mpc / h`. If `None`, no smoothing is performed.
nside : int, optional
HEALPix nside. Used to generate the uniformly spaced angular
coordinates. Recommended to be >> 1.
verbose : bool, optional
Smoothing verbosity flag.
Returns
-------
vals : n-dimensional array of shape `(npix, len(smooth_scales))`
"""
# Get positions of HEALPix pixels on the sky and then convert those to
# box Cartesian coordinates. We take HEALPix pixels because they are
# uniformly distributed on the sky.
angpos = nside2radec(nside)
X = numpy.hstack([numpy.ones(len(angpos)).reshape(-1, 1) * distance,
angpos])
X = radec_to_cartesian(X) / boxsize + 0.5
return evaluate_cartesian(field, pos=X, smooth_scales=smooth_scales,
verbose=verbose)
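A hedged usage sketch of the new helper; the field loader, MAS and grid values are assumptions taken from elsewhere in this commit, not a prescribed workflow:

# Illustrative only: evaluate a CSiBORG2 density field on a 100 Mpc/h shell.
reader = CSiBORG2Field(nsim, "main")              # `nsim` is illustrative
field = reader.density_field("SPH", 1024)
vals = field_at_distance(field, distance=100, boxsize=676.6, nside=128)
print(vals.mean(), vals.std())                    # monopole and scatter on the shell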
###############################################################################


@ -18,6 +18,7 @@ imports.
"""
from numba import jit
import numpy
import healpy
def force_single_precision(x):
@ -42,3 +43,26 @@ def divide_nonzero(field0, field1):
for k in range(kmax):
if field1[i, j, k] != 0:
field0[i, j, k] /= field1[i, j, k]
def nside2radec(nside):
"""
Generate RA [0, 360] deg. and declination [-90, 90] deg for HEALPix pixel
centres at a given nside.
Parameters
----------
nside : int
HEALPix nside.
Returns
-------
angpos : 2-dimensional array of shape (npix, 2)
"""
pixs = numpy.arange(healpy.nside2npix(nside))
theta, phi = healpy.pix2ang(nside, pixs)
ra = 180 / numpy.pi * phi
dec = 90 - 180 / numpy.pi * theta
return numpy.vstack([ra, dec]).T
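A quick sanity check of the helper (assumes `healpy` is installed; the shape follows from npix = 12 * nside**2):

angpos = nside2radec(4)
print(angpos.shape)        # (192, 2) -- 12 * 4**2 HEALPix pixel centres
# column 0: RA in [0, 360) deg, column 1: declination in [-90, 90] deg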


@ -202,10 +202,9 @@ class RealisationsMatcher(BaseMatcher):
# in the reference simulation from the cross simulation in the initial
# snapshot.
match_indxs = radius_neighbours(
catx.knn(in_initial=True, subtract_observer=False, periodic=True),
cat0["lagpatch_coordinates"], radiusX=cat0["lagpatch_radius"],
radiusKNN=catx["lagpatch_radius"], nmult=self.nmult,
enforce_int32=True, verbose=verbose)
catx.knn(in_initial=True), cat0["lagpatch_coordinates"],
radiusX=cat0["lagpatch_radius"], radiusKNN=catx["lagpatch_radius"],
nmult=self.nmult, enforce_int32=True, verbose=verbose)
# We next remove neighbours whose mass is too large/small.
if self.dlogmass is not None:
@ -367,6 +366,7 @@ class ParticleOverlap(BaseMatcher):
cellmin = self.box_size // 2 - self.bckg_halfsize
cellmax = self.box_size // 2 + self.bckg_halfsize
ncells = cellmax - cellmin
boxsize_mpc = cat.boxsize
# We then pre-allocate the density field/check it is of the right shape
if delta is None:
delta = numpy.zeros((ncells,) * 3, dtype=numpy.float32)
@ -382,6 +382,7 @@ class ParticleOverlap(BaseMatcher):
for hid in iterator:
try:
pos = cat.snapshot.halo_coordinates(hid, is_group=True)
pos /= boxsize_mpc
except ValueError as e:
# If not particles found for this halo, just skip it.
if str(e).startswith("Halo "):
@ -852,6 +853,8 @@ def load_processed_halo(hid, cat, ncells, nshift):
pos = cat.snapshot.halo_coordinates(hid, is_group=True)
mass = cat.snapshot.halo_masses(hid, is_group=True)
pos /= cat.boxsize
pos = pos2cell(pos, ncells)
mins, maxs = get_halo_cell_limits(pos, ncells=ncells, nshift=nshift)
return pos, mass, numpy.sum(mass), mins, maxs
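The new `pos /= boxsize` lines matter because the cell assignment expects box-unit positions; a minimal sketch of the assumed convention (`pos2cell` itself is repository code not shown in this diff):

import numpy

def pos2cell_sketch(pos, ncells):
    # assumed behaviour: map [0, 1) box coordinates onto integer cell indices
    return numpy.floor(pos * ncells).astype(numpy.int32)

pos_mpc = numpy.array([[338.3, 100.0, 600.0]])    # Mpc / h in a 676.6 Mpc/h box
print(pos2cell_sketch(pos_mpc / 676.6, 2048))     # roughly [[1024  302 1816]]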


@ -34,6 +34,8 @@ def simname2boxsize(simname):
"csiborg2_main": 676.6,
"csiborg2_varysmall": 676.6,
"csiborg2_random": 676.6,
"borg1": 677.7,
"borg2": 676.6,
"quijote": 1000.
}
@ -52,6 +54,8 @@ paths_glamdring = {
"csiborg2_random_srcdir": "/mnt/extraspace/rstiskalek/csiborg2_random", # noqa
"postdir": "/mnt/extraspace/rstiskalek/csiborg_postprocessing/",
"quijote_dir": "/mnt/extraspace/rstiskalek/quijote",
"borg2_dir": "/mnt/extraspace/rstiskalek/BORG_STOPYRA_2023",
"tng300_1_dir": "/mnt/extraspace/rstiskalek/TNG300-1/",
}


@ -14,8 +14,10 @@
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
from .catalogue import (CSiBORG1Catalogue, CSiBORG2Catalogue, # noqa
CSiBORG2MergerTreeReader, QuijoteCatalogue) # noqa
from .snapshot import (CSIBORG1Snapshot, CSIBORG2Snapshot, QuijoteSnapshot, # noqa
CSiBORG1Field, CSiBORG2Field, QuijoteField) # noqa
from .snapshot import (CSiBORG1Snapshot, CSiBORG2Snapshot, QuijoteSnapshot, # noqa
CSiBORG1Field, CSiBORG2Field, QuijoteField, BORG2Field, # noqa
BORG1Field) # noqa
from .obs import (SDSS, MCXCClusters, PlanckClusters, TwoMPPGalaxies, # noqa
TwoMPPGroups, ObservedCluster, match_array_to_no_masking) # noqa
TwoMPPGroups, ObservedCluster, match_array_to_no_masking, # noqa
cols_to_structured) # noqa
from .paths import Paths # noqa


@ -69,6 +69,7 @@ class BaseCatalogue(ABC):
self._observer_location = None
self._observer_velocity = None
self._flip_xz = False
self._boxsize = None
self._cache = OrderedDict()
@ -81,7 +82,7 @@ class BaseCatalogue(ABC):
def init_with_snapshot(self, simname, nsim, nsnap, paths, snapshot,
bounds, boxsize, observer_location,
observer_velocity, cache_maxsize=64):
observer_velocity, flip_xz, cache_maxsize=64):
self.simname = simname
self.nsim = nsim
self.nsnap = nsnap
@ -89,6 +90,7 @@ class BaseCatalogue(ABC):
self.boxsize = boxsize
self.observer_location = observer_location
self.observer_velocity = observer_velocity
self.flip_xz = flip_xz
self.cache_maxsize = cache_maxsize
@ -211,6 +213,24 @@ class BaseCatalogue(ABC):
raise TypeError("`boxsize` must be an integer or float.")
self._boxsize = float(boxsize)
@property
def flip_xz(self):
"""
Whether to flip the x- and z-coordinates to undo the MUSIC bug to match
observations.
Returns
-------
bool
"""
return self._flip_xz
@flip_xz.setter
def flip_xz(self, flip_xz):
if not isinstance(flip_xz, bool):
raise TypeError("`flip_xz` must be a boolean.")
self._flip_xz = flip_xz
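The flip applied throughout this commit is a plain in-place column swap; a minimal numpy illustration (not repository code):

import numpy

pos = numpy.array([[1., 2., 3.],
                   [4., 5., 6.]])
pos[:, [0, 2]] = pos[:, [2, 0]]    # swap the x- and z-columns in place
# pos is now [[3., 2., 1.], [6., 5., 4.]]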
@property
def cache_maxsize(self):
"""
@ -592,6 +612,10 @@ class BaseCatalogue(ABC):
elif key == "redshift_dist":
out = self["__cartesian_redshift_pos"]
out = numpy.linalg.norm(out - self.observer_location, axis=1)
elif key == "lagpatch_radius":
out = self.lagpatch_radius
elif key == "lagpatch_coordinates":
out = self.lagpatch_coordinates
elif key == "npart":
out = self.npart
elif key == "totmass":
@ -650,16 +674,23 @@ class CSiBORG1Catalogue(BaseCatalogue):
a boolean.
observer_velocity : 1-dimensional array, optional
Observer's velocity in :math:`\mathrm{km} / \mathrm{s}`.
flip_xz : bool, optional
Whether to flip the x- and z-coordinates to undo the MUSIC bug to match
observations.
cache_maxsize : int, optional
Maximum number of cached arrays.
"""
def __init__(self, nsim, paths=None, snapshot=None, bounds=None,
observer_velocity=None, cache_maxsize=64):
observer_velocity=None, flip_xz=True, cache_maxsize=64):
super().__init__()
if paths is None:
paths = Paths(**paths_glamdring)
super().init_with_snapshot(
"csiborg1", nsim, max(paths.get_snapshots(nsim, "csiborg1")),
paths, snapshot, bounds, 677.7, [338.85, 338.85, 338.85],
observer_velocity, cache_maxsize)
observer_velocity, flip_xz, cache_maxsize)
self._custom_keys = []
@ -675,8 +706,11 @@ class CSiBORG1Catalogue(BaseCatalogue):
@property
def coordinates(self):
# NOTE: We flip x and z to undo MUSIC bug.
z, y, x = [self._read_fof_catalogue(key) for key in ["x", "y", "z"]]
x, y, z = [self._read_fof_catalogue(key) for key in ["x", "y", "z"]]
if self.flip_xz:
return numpy.vstack([z, y, x]).T
else:
return numpy.vstack([x, y, z]).T
@property
@ -698,11 +732,18 @@ class CSiBORG1Catalogue(BaseCatalogue):
@property
def lagpatch_coordinates(self):
raise RuntimeError("Lagrangian patch coordinates are not available.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
data = numpy.load(fpath)
if self.flip_xz:
return numpy.vstack([data["z"], data["y"], data["x"]]).T
else:
return numpy.vstack([data["x"], data["y"], data["z"]]).T
@property
def lagpatch_radius(self):
raise RuntimeError("Lagrangian patch radius is not available.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
return numpy.load(fpath)["lagpatch_size"]
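The `initial_lagpatch` file is the structured array written by the new `scripts/fit_init.py` further down in this diff; a sketch of inspecting it, with field names taken from that script and an illustrative path:

import numpy

data = numpy.load("initial_lagpatch.npy")                      # path illustrative
centres = numpy.vstack([data["x"], data["y"], data["z"]]).T    # Mpc / h
radii = data["lagpatch_size"]                                  # 99th-percentile patch size
ncells = data["lagpatch_ncells"]                               # cells with non-zero density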
###############################################################################
@ -730,15 +771,20 @@ class CSiBORG2Catalogue(BaseCatalogue):
a boolean.
observer_velocity : 1-dimensional array, optional
Observer's velocity in :math:`\mathrm{km} / \mathrm{s}`.
flip_xz : bool, optional
Whether to flip the x- and z-coordinates to undo the MUSIC bug to match
observations.
cache_maxsize : int, optional
Maximum number of cached arrays.
"""
def __init__(self, nsim, nsnap, kind, paths=None, snapshot=None,
bounds=None, observer_velocity=None, cache_maxsize=64):
bounds=None, observer_velocity=None, flip_xz=True,
cache_maxsize=64):
super().__init__()
super().init_with_snapshot(
f"csiborg2_{kind}", nsim, nsnap, paths, snapshot, bounds,
676.6, [338.3, 338.3, 338.3], observer_velocity, cache_maxsize)
676.6, [338.3, 338.3, 338.3], observer_velocity, flip_xz,
cache_maxsize)
self._custom_keys = ["GroupFirstSub", "GroupContamination",
"GroupNsubs", "Group_M_Crit200"]
@ -767,15 +813,15 @@ class CSiBORG2Catalogue(BaseCatalogue):
@property
def coordinates(self):
# Loading directly the Gadget4 output, flip x and z to undo MUSIC bug.
out = self._read_fof_catalogue("GroupPos")
if self.flip_xz:
out[:, [0, 2]] = out[:, [2, 0]]
return out
@property
def velocities(self):
# Loading directly the Gadget4 output, flip x and z to undo MUSIC bug.
out = self._read_fof_catalogue("GroupVel")
if self.flip_xz:
out[:, [0, 2]] = out[:, [2, 0]]
return out
@ -795,11 +841,28 @@ class CSiBORG2Catalogue(BaseCatalogue):
@property
def lagpatch_coordinates(self):
raise RuntimeError("Lagrangian patch coordinates are not available.")
if self.nsnap != 99:
raise RuntimeError("Lagrangian patch information is only "
"available for haloes defined at the final "
f"snapshot (indexed 99). Chosen {self.nsnap}.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
data = numpy.load(fpath)
if self.flip_xz:
return numpy.vstack([data["z"], data["y"], data["x"]]).T
else:
return numpy.vstack([data["x"], data["y"], data["z"]]).T
@property
def lagpatch_radius(self):
raise RuntimeError("Lagrangian patch radius is not available.")
if self.nsnap != 99:
raise RuntimeError("Lagrangian patch information is only "
"available for haloes defined at the final "
f"snapshot (indexed 99). Chosen {self.nsnap}.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
return numpy.load(fpath)["lagpatch_size"]
@property
def GroupFirstSub(self):
@ -1086,12 +1149,11 @@ class QuijoteCatalogue(BaseCatalogue):
Maximum number of cached arrays.
"""
def __init__(self, nsim, paths=None, snapshot=None, bounds=None,
observer_velocity=None,
cache_maxsize=64):
observer_velocity=None, cache_maxsize=64):
super().__init__()
super().init_with_snapshot(
"quijote", nsim, 4, paths, snapshot, bounds, 1000,
[500., 500., 500.,], observer_velocity, cache_maxsize)
[500., 500., 500.,], observer_velocity, False, cache_maxsize)
self._custom_keys = []
self._bounds = bounds
@ -1131,11 +1193,14 @@ class QuijoteCatalogue(BaseCatalogue):
@property
def lagpatch_coordinates(self):
raise RuntimeError("Lagrangian patch coordinates are not available.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
data = numpy.load(fpath)
return numpy.vstack([data["x"], data["y"], data["z"]]).T
@property
def lagpatch_radius(self):
raise RuntimeError("Lagrangian patch radius is not available.")
fpath = self.paths.initial_lagpatch(self.nsim, self.simname)
return numpy.load(fpath)["lagpatch_size"]
def pick_fiducial_observer(self, n, rmax):
r"""


@ -53,6 +53,12 @@ class Paths:
Path to the CSiBORG post-processing directory.
quijote_dir : str
Path to the Quijote simulation directory.
borg1_dir : str
Path to the BORG1 simulation directory.
borg2_dir : str
Path to the BORG2 simulation directory.
tng300_1_dir : str
Path to the TNG300-1 simulation directory.
"""
def __init__(self,
csiborg1_srcdir,
@ -61,13 +67,18 @@ class Paths:
csiborg2_varysmall_srcdir,
postdir,
quijote_dir,
borg1_dir,
borg2_dir,
tng300_1_dir
):
self.csiborg1_srcdir = csiborg1_srcdir
self.csiborg2_main_srcdir = csiborg2_main_srcdir
self.csiborg2_random_srcdir = csiborg2_random_srcdir
self.csiborg2_varysmall_srcdir = csiborg2_varysmall_srcdir
self.quijote_dir = quijote_dir
self.borg1_dir = borg1_dir
self.borg2_dir = borg2_dir
self.tng300_1_dir = tng300_1_dir
self.postdir = postdir
def get_ics(self, simname):
@ -83,10 +94,10 @@ class Paths:
-------
ids : 1-dimensional array
"""
if simname == "csiborg1":
if simname == "csiborg1" or simname == "borg1":
files = glob(join(self.csiborg1_srcdir, "chain_*"))
files = [int(search(r'chain_(\d+)', f).group(1)) for f in files]
elif simname == "csiborg2_main":
elif simname == "csiborg2_main" or simname == "borg2":
files = glob(join(self.csiborg2_main_srcdir, "chain_*"))
files = [int(search(r'chain_(\d+)', f).group(1)) for f in files]
elif simname == "csiborg2_random":
@ -175,25 +186,27 @@ class Paths:
str
"""
if simname == "csiborg1":
return join(self.csiborg1_srcdir, f"chain_{nsim}",
fpath = join(self.csiborg1_srcdir, f"chain_{nsim}",
f"snapshot_{str(nsnap).zfill(5)}.hdf5")
elif simname == "csiborg2_main":
return join(self.csiborg2_main_srcdir, f"chain_{nsim}", "output",
fpath = join(self.csiborg2_main_srcdir, f"chain_{nsim}", "output",
f"snapshot_{str(nsnap).zfill(3)}.hdf5")
elif simname == "csiborg2_random":
return join(self.csiborg2_random_srcdir, f"chain_{nsim}", "output",
f"snapshot_{str(nsnap).zfill(3)}.hdf5")
fpath = join(self.csiborg2_random_srcdir, f"chain_{nsim}",
"output", f"snapshot_{str(nsnap).zfill(3)}.hdf5")
elif simname == "csiborg2_varysmall":
return join(self.csiborg2_varysmall_srcdir,
fpath = join(self.csiborg2_varysmall_srcdir,
f"chain_16417_{str(nsim).zfill(3)}", "output",
f"snapshot_{str(nsnap).zfill(3)}.hdf5")
elif simname == "quijote":
return join(self.quijote_dir, "fiducial_processed",
fpath = join(self.quijote_dir, "fiducial_processed",
f"chain_{nsim}",
f"snapshot_{str(nsnap).zfill(3)}.hdf5")
else:
raise ValueError(f"Unknown simulation name `{simname}`.")
return fpath
def snapshot_catalogue(self, nsnap, nsim, simname):
"""
Path to the halo catalogue of a simulation snapshot.
@ -218,7 +231,7 @@ class Paths:
return join(self.csiborg2_main_srcdir, f"chain_{nsim}", "output",
f"fof_subhalo_tab_{str(nsnap).zfill(3)}.hdf5")
elif simname == "csiborg2_random":
return join(self.csiborg2_ranodm_srcdir, f"chain_{nsim}", "output",
return join(self.csiborg2_random_srcdir, f"chain_{nsim}", "output",
f"fof_subhalo_tab_{str(nsnap).zfill(3)}.hdf5")
elif simname == "csiborg2_varysmall":
return join(self.csiborg2_varysmall_srcdir,
@ -231,6 +244,40 @@ class Paths:
else:
raise ValueError(f"Unknown simulation name `{simname}`.")
def initial_lagpatch(self, nsim, simname):
"""
Path to the Lagrangian patch information of a simulation for halos
defined at z = 0.
Parameters
----------
nsim : int
IC realisation index.
simname : str
Simulation name.
Returns
-------
str
"""
if simname == "csiborg1":
return join(self.csiborg1_srcdir, f"chain_{nsim}",
"initial_lagpatch.npy")
elif simname == "csiborg2_main":
return join(self.csiborg2_main_srcdir, "catalogues",
f"initial_lagpatch_{nsim}.npy")
elif simname == "csiborg2_random":
return join(self.csiborg2_random_srcdir, "catalogues",
f"initial_lagpatch_{nsim}.npy")
elif simname == "csiborg2_varysmall":
return join(self.csiborg2_varysmall_srcdir, "catalogues",
f"initial_lagpatch_{nsim}.npy")
elif simname == "quijote":
return join(self.quijote_dir, "fiducial_processed",
f"chain_{nsim}", "initial_lagpatch.npy")
else:
raise ValueError(f"Unknown simulation name `{simname}`.")
def trees(self, nsim, simname):
"""
Path to the halo trees of a simulation snapshot.
@ -284,7 +331,7 @@ class Paths:
-------
str
"""
if simname == "csiborg":
if "csiborg" in simname:
fdir = join(self.postdir, "overlap")
elif simname == "quijote":
fdir = join(self.quijote_dir, "overlap")
@ -297,7 +344,7 @@ class Paths:
nsimx = str(nsimx).zfill(5)
min_logmass = float('%.4g' % min_logmass)
fname = f"overlap_{nsim0}_{nsimx}_{min_logmass}.npz"
fname = f"overlap_{simname}_{nsim0}_{nsimx}_{min_logmass}.npz"
if smoothed:
fname = fname.replace("overlap", "overlap_smoothed")
return join(fdir, fname)
@ -367,6 +414,13 @@ class Paths:
-------
str
"""
if simname == "borg2":
return join(self.borg2_dir, f"mcmc_{nsim}.h5")
if simname == "borg1":
# NOTE: the BORG1 field path is currently hard-coded rather than read from `borg1_dir`.
return f"/mnt/zfsusers/hdesmond/BORG_final/mcmc_{nsim}.h5"
if MAS == "SPH" and kind in ["density", "velocity"]:
if simname == "csiborg1":
raise ValueError("SPH field not available for CSiBORG1.")
@ -581,3 +635,13 @@ class Paths:
files = glob(join(fdir, f"{simname}_tpcf*"))
run = "__" + run
return [f for f in files if run in f]
def tng300_1(self):
"""
Path to the TNG300-1 simulation directory.
Returns
-------
str
"""
return self.tng300_1_dir


@ -18,6 +18,7 @@ should be implemented things such as flipping x- and z-axes, to make sure that
observed RA-dec can be mapped into the simulation box.
"""
from abc import ABC, abstractmethod, abstractproperty
from os.path import join
import numpy
from h5py import File
@ -35,14 +36,26 @@ class BaseSnapshot(ABC):
"""
Base class for reading snapshots.
"""
def __init__(self, nsim, nsnap, paths):
if not isinstance(nsim, int):
raise TypeError("`nsim` must be an integer")
self._nsim = nsim
def __init__(self, nsim, nsnap, paths, keep_snapshot_open=False,
flip_xz=False):
self._keep_snapshot_open = None
if not isinstance(nsnap, int):
if not isinstance(nsim, (int, numpy.integer)):
raise TypeError("`nsim` must be an integer")
self._nsim = int(nsim)
if not isinstance(nsnap, (int, numpy.integer)):
raise TypeError("`nsnap` must be an integer")
self._nsnap = nsnap
self._nsnap = int(nsnap)
if not isinstance(keep_snapshot_open, bool):
raise TypeError("`keep_snapshot_open` must be a boolean.")
self._keep_snapshot_open = keep_snapshot_open
self._snapshot_file = None
if not isinstance(flip_xz, bool):
raise TypeError("`flip_xz` must be a boolean.")
self._flip_xz = flip_xz
self._paths = paths
self._hid2offset = None
@ -106,6 +119,30 @@ class BaseSnapshot(ABC):
self._paths = Paths(**paths_glamdring)
return self._paths
@property
def keep_snapshot_open(self):
"""
Whether to keep the snapshot file open when reading halo particles.
This is useful for repeated access to the snapshot.
Returns
-------
bool
"""
return self._keep_snapshot_open
@property
def flip_xz(self):
"""
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
Returns
-------
bool
"""
return self._flip_xz
@abstractproperty
def coordinates(self):
"""
@ -221,6 +258,43 @@ class BaseSnapshot(ABC):
"""
pass
def open_snapshot(self):
"""
Open the snapshot file, particularly used in the context of loading in
particles of individual haloes.
Returns
-------
h5py.File
"""
if not self.keep_snapshot_open:
# Check if the snapshot path is set
if not hasattr(self, "_snapshot_path"):
raise RuntimeError("Snapshot path not set.")
return File(self._snapshot_path, "r")
# Here if we want to keep the snapshot open
if self._snapshot_file is None:
self._snapshot_file = File(self._snapshot_path, "r")
return self._snapshot_file
def close_snapshot(self):
"""
Close the snapshot file opened with `open_snapshot`.
Returns
-------
None
"""
if not self.keep_snapshot_open:
return
if self._snapshot_file is not None:
self._snapshot_file.close()
self._snapshot_file = None
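A sketch of the access pattern these two methods enable when reading particles of many haloes in a row (mirrors the calls in `scripts/fit_init.py` below; `nsim` and `cat` are illustrative):

# Keep the HDF5 file open across repeated halo reads, close it once at the end.
snap = CSiBORG1Snapshot(nsim, 1, keep_snapshot_open=True)
for hid in cat["index"]:
    pos = snap.halo_coordinates(hid, is_group=True)
    mass = snap.halo_masses(hid, is_group=True)
snap.close_snapshot()              # no-op when keep_snapshot_open is False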
def select_box(self, center, boxwidth):
"""
Find particle coordinates of particles within a box of size `boxwidth`
@ -248,10 +322,11 @@ class BaseSnapshot(ABC):
###############################################################################
class CSIBORG1Snapshot(BaseSnapshot):
class CSiBORG1Snapshot(BaseSnapshot):
"""
CSiBORG1 snapshot class with the FoF halo finder particle assignment.
CSiBORG1 was run with RAMSES.
CSiBORG1 was run with RAMSES. Note that the haloes are defined at z = 0 and
indexed from 1.
Parameters
----------
@ -261,9 +336,16 @@ class CSIBORG1Snapshot(BaseSnapshot):
Snapshot index.
paths : Paths, optional
Paths object.
keep_snapshot_open : bool, optional
Whether to keep the snapshot file open when reading halo particles.
This is useful for repeated access to the snapshot.
flip_xz : bool, optional
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
"""
def __init__(self, nsim, nsnap, paths=None):
super().__init__(nsim, nsnap, paths)
def __init__(self, nsim, nsnap, paths=None, keep_snapshot_open=False,
flip_xz=False):
super().__init__(nsim, nsnap, paths, keep_snapshot_open, flip_xz)
self._snapshot_path = self.paths.snapshot(
self.nsnap, self.nsim, "csiborg1")
self._simname = "csiborg1"
@ -272,6 +354,9 @@ class CSIBORG1Snapshot(BaseSnapshot):
with File(self._snapshot_path, "r") as f:
x = f[kind][...]
if self.flip_xz and kind in ["Coordinates", "Velocities"]:
x[:, [0, 2]] = x[:, [2, 0]]
return x
def coordinates(self):
@ -293,14 +378,19 @@ class CSIBORG1Snapshot(BaseSnapshot):
if not is_group:
raise ValueError("There is no subhalo catalogue for CSiBORG1.")
with File(self._snapshot_path, "r") as f:
f = self.open_snapshot()
i, j = self.hid2offset.get(halo_id, (None, None))
if i is None:
raise ValueError(f"Halo `{halo_id}` not found.")
x = f[kind][i:j + 1]
if not self.keep_snapshot_open:
self.close_snapshot()
if self.flip_xz and kind in ["Coordinates", "Velocities"]:
x[:, [0, 2]] = x[:, [2, 0]]
return x
def halo_coordinates(self, halo_id, is_group=True):
@ -313,8 +403,9 @@ class CSIBORG1Snapshot(BaseSnapshot):
return self._get_halo_particles(halo_id, "Masses", is_group)
def _make_hid2offset(self):
nsnap = max(self.paths.get_snapshots(self.nsim, "csiborg1"))
catalogue_path = self.paths.snapshot_catalogue(
self.nsnap, self.nsim, "csiborg1")
nsnap, self.nsim, "csiborg1")
with File(catalogue_path, "r") as f:
offset = f["GroupOffset"][:]
@ -326,7 +417,7 @@ class CSIBORG1Snapshot(BaseSnapshot):
# CSiBORG2 snapshot class #
###############################################################################
class CSIBORG2Snapshot(BaseSnapshot):
class CSiBORG2Snapshot(BaseSnapshot):
"""
CSiBORG2 snapshot class with the FoF halo finder particle assignment and
SUBFIND subhalo finder. The simulations were run with Gadget4.
@ -341,9 +432,16 @@ class CSIBORG2Snapshot(BaseSnapshot):
CSiBORG2 run kind. One of `main`, `random`, or `varysmall`.
paths : Paths, optional
Paths object.
keep_snapshot_open : bool, optional
Whether to keep the snapshot file open when reading halo particles.
This is useful for repeated access to the snapshot.
flip_xz : bool, optional
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
"""
def __init__(self, nsim, nsnap, kind, paths=None):
super().__init__(nsim, nsnap, paths)
def __init__(self, nsim, nsnap, kind, paths=None,
keep_snapshot_open=False, flip_xz=False):
super().__init__(nsim, nsnap, paths, keep_snapshot_open, flip_xz)
self.kind = kind
fpath = self.paths.snapshot(self.nsnap, self.nsim,
@ -390,6 +488,9 @@ class CSIBORG2Snapshot(BaseSnapshot):
else:
x = numpy.vstack([x, f[f"PartType5/{kind}"][...]])
if self.flip_xz and kind in ["Coordinates", "Velocities"]:
x[:, [0, 2]] = x[:, [2, 0]]
return x
def coordinates(self):
@ -408,7 +509,7 @@ class CSIBORG2Snapshot(BaseSnapshot):
if not is_group:
raise RuntimeError("While the CSiBORG2 subhalo catalogue exists, it is not currently implemented.") # noqa
with File(self._snapshot_path, "r") as f:
f = self.open_snapshot()
i1, j1 = self.hid2offset["type1"].get(halo_id, (None, None))
i5, j5 = self.hid2offset["type5"].get(halo_id, (None, None))
@ -425,9 +526,22 @@ class CSIBORG2Snapshot(BaseSnapshot):
else:
x1 = f[f"PartType1/{kind}"][i1:j1]
# Flipping of x- and z-axes (positions and velocities only)
if self.flip_xz and kind in ["Coordinates", "Velocities"]:
x1[:, [0, 2]] = x1[:, [2, 0]]
if i5 is not None and j5 - i5 > 0:
x5 = f[f"PartType5/{kind}"][i5:j5]
# Flipping of x- and z-axes
if self.flip_xz and kind in ["Coordinates", "Velocities"]:
x5[:, [0, 2]] = x5[:, [2, 0]]
# Close the snapshot file if we don't want to keep it open
if not self.keep_snapshot_open:
self.close_snapshot()
# Are we stacking high-resolution and low-resolution particles?
if i5 is None or j5 - i5 == 0:
return x1
@ -475,7 +589,7 @@ class CSIBORG2Snapshot(BaseSnapshot):
###############################################################################
class QuijoteSnapshot(CSIBORG1Snapshot):
class QuijoteSnapshot(CSiBORG1Snapshot):
"""
Quijote snapshot class with the FoF halo finder particle assignment.
Because of similarities with how the snapshot is processed with CSiBORG1,
@ -489,9 +603,12 @@ class QuijoteSnapshot(CSIBORG1Snapshot):
Snapshot index.
paths : Paths, optional
Paths object.
keep_snapshot_open : bool, optional
Whether to keep the snapshot file open when reading halo particles.
This is useful for repeated access to the snapshot.
"""
def __init__(self, nsim, nsnap, paths=None):
super().__init__(nsim, nsnap, paths)
def __init__(self, nsim, nsnap, paths=None, keep_snapshot_open=False):
super().__init__(nsim, nsnap, paths, keep_snapshot_open, flip_xz=False)
self._snapshot_path = self.paths.snapshot(self.nsnap, self.nsim,
"quijote")
self._simname = "quijote"
@ -515,13 +632,17 @@ class BaseField(ABC):
"""
Base class for reading fields such as density or velocity fields.
"""
def __init__(self, nsim, paths):
def __init__(self, nsim, paths, flip_xz=False):
if isinstance(nsim, numpy.integer):
nsim = int(nsim)
if not isinstance(nsim, int):
raise TypeError(f"`nsim` must be an integer. Received `{type(nsim)}`.") # noqa
self._nsim = nsim
if not isinstance(flip_xz, bool):
raise TypeError("`flip_xz` must be a boolean.")
self._flip_xz = flip_xz
self._paths = paths
@property
@ -548,6 +669,18 @@ class BaseField(ABC):
self._paths = Paths(**paths_glamdring)
return self._paths
@property
def flip_xz(self):
"""
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
Returns
-------
bool
"""
return self._flip_xz
@abstractmethod
def density_field(self, MAS, grid):
"""
@ -584,6 +717,24 @@ class BaseField(ABC):
"""
pass
@abstractmethod
def radial_velocity_field(self, MAS, grid):
"""
Return the pre-computed radial velocity field.
Parameters
----------
MAS : str
Mass assignment scheme.
grid : int
Grid size.
Returns
-------
field : 3-dimensional array
"""
pass
###############################################################################
# CSiBORG1 field class #
@ -600,9 +751,12 @@ class CSiBORG1Field(BaseField):
Simulation index.
paths : Paths, optional
Paths object. By default, the paths are set to the `glamdring` paths.
flip_xz : bool, optional
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
"""
def __init__(self, nsim, paths=None):
super().__init__(nsim, paths)
def __init__(self, nsim, paths=None, flip_xz=True):
super().__init__(nsim, paths, flip_xz)
self._simname = "csiborg1"
def density_field(self, MAS, grid):
@ -615,8 +769,7 @@ class CSiBORG1Field(BaseField):
else:
field = numpy.load(fpath)
# Flip x- and z-axes
if self._simname == "csiborg1":
if self.flip_xz:
field = field.T
return field
@ -634,8 +787,7 @@ class CSiBORG1Field(BaseField):
else:
field = numpy.load(fpath)
# Flip x- and z-axes
if self._simname == "csiborg1":
if self.flip_xz:
field[0, ...] = field[0, ...].T
field[1, ...] = field[1, ...].T
field[2, ...] = field[2, ...].T
@ -643,6 +795,14 @@ class CSiBORG1Field(BaseField):
return field
def radial_velocity_field(self, MAS, grid):
if not self.flip_xz and self._simname == "csiborg1":
raise ValueError("The radial velocity field is only implemented "
"for the flipped x- and z-axes.")
fpath = self.paths.field("radvel", MAS, grid, self.nsim, "csiborg1")
return numpy.load(fpath)
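The radial velocity field loaded here is pre-computed elsewhere; as a hedged reminder of the quantity it stores, a minimal numpy sketch of projecting a Cartesian velocity field onto the line of sight of an observer at the box centre (an assumed convention, not the repository's production script):

import numpy

def radial_velocity_sketch(vel):
    # vel has shape (3, grid, grid, grid); observer assumed at the box centre.
    grid = vel.shape[1]
    cell = (numpy.arange(grid) + 0.5) / grid - 0.5      # cell centres, box units
    x, y, z = numpy.meshgrid(cell, cell, cell, indexing="ij")
    norm = numpy.sqrt(x**2 + y**2 + z**2)
    norm[norm == 0] = 1.0                               # avoid division by zero
    return (vel[0] * x + vel[1] * y + vel[2] * z) / norm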
###############################################################################
# CSiBORG2 field class #
@ -661,10 +821,12 @@ class CSiBORG2Field(BaseField):
CSiBORG2 run kind. One of `main`, `random`, or `varysmall`.
paths : Paths, optional
Paths object. By default, the paths are set to the `glamdring` paths.
flip_xz : bool, optional
Whether to flip the x- and z-axes to undo the MUSIC bug so that the
coordinates are consistent with observations.
"""
def __init__(self, nsim, kind, paths=None):
super().__init__(nsim, paths)
def __init__(self, nsim, kind, paths=None, flip_xz=True):
super().__init__(nsim, paths, flip_xz)
self.kind = kind
@property
@ -696,7 +858,9 @@ class CSiBORG2Field(BaseField):
else:
field = numpy.load(fpath)
field = field.T # Flip x- and z-axes
if self.flip_xz:
field = field.T
return field
def velocity_field(self, MAS, grid):
@ -713,7 +877,7 @@ class CSiBORG2Field(BaseField):
else:
field = numpy.load(fpath)
# Flip x- and z-axes
if self.flip_xz:
field[0, ...] = field[0, ...].T
field[1, ...] = field[1, ...].T
field[2, ...] = field[2, ...].T
@ -721,6 +885,134 @@ class CSiBORG2Field(BaseField):
return field
def radial_velocity_field(self, MAS, grid):
if not self.flip_xz:
raise ValueError("The radial velocity field is only implemented "
"for the flipped x- and z-axes.")
fpath = self.paths.field("radvel", MAS, grid, self.nsim,
f"csiborg2_{self.kind}")
return numpy.load(fpath)
###############################################################################
# BORG1 field class #
###############################################################################
class BORG1Field(BaseField):
"""
BORG1 `z = 0` field class.
Parameters
----------
nsim : int
Simulation index.
paths : Paths, optional
Paths object. By default, the paths are set to the `glamdring` paths.
"""
def __init__(self, nsim, paths=None):
super().__init__(nsim, paths, False)
def overdensity_field(self):
fpath = self.paths.field(None, None, None, self.nsim, "borg1")
with File(fpath, "r") as f:
field = f["scalars/BORG_final_density"][:].astype(numpy.float32)
return field
def density_field(self):
field = self.overdensity_field()
omega0 = 0.307
rho_mean = omega0 * 277.53662724583074 # h^2 Msun / kpc^3
field += 1
field *= rho_mean
return field
def velocity_field(self, MAS, grid):
raise RuntimeError("The velocity field is not available.")
def radial_velocity_field(self, MAS, grid):
raise RuntimeError("The radial velocity field is not available.")
###############################################################################
# BORG2 field class #
###############################################################################
class BORG2Field(BaseField):
"""
BORG2 `z = 0` field class.
Parameters
----------
nsim : int
Simulation index.
paths : Paths, optional
Paths object. By default, the paths are set to the `glamdring` paths.
"""
def __init__(self, nsim, paths=None):
super().__init__(nsim, paths, False)
def overdensity_field(self):
fpath = self.paths.field(None, None, None, self.nsim, "borg2")
with File(fpath, "r") as f:
field = f["scalars/BORG_final_density"][:].astype(numpy.float32)
return field
def density_field(self):
field = self.overdensity_field()
omega0 = 0.3111
rho_mean = omega0 * 277.53662724583074 # h^2 Msun / kpc^3
field += 1
field *= rho_mean
return field
def velocity_field(self, MAS, grid):
raise RuntimeError("The velocity field is not available.")
def radial_velocity_field(self, MAS, grid):
raise RuntimeError("The radial velocity field is not available.")
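Both BORG field classes apply the same conversion from overdensity to matter density, rho = rho_mean * (1 + delta) with rho_mean = Omega_m * rho_crit and rho_crit ~ 277.54 h^2 Msun / kpc^3; a one-line check of the numbers used above (the `overdensity` array is illustrative):

rho_mean_borg1 = 0.307 * 277.53662724583074     # ~ 85.2 h^2 Msun / kpc^3
rho_mean_borg2 = 0.3111 * 277.53662724583074    # ~ 86.3 h^2 Msun / kpc^3
density = rho_mean_borg2 * (1 + overdensity)    # element-wise on the grid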
###############################################################################
# TNG300-1 field #
###############################################################################
class TNG300_1Field(BaseField):
"""
TNG300-1 dark matter-only `z = 0` field class.
Parameters
----------
paths : Paths, optional
Paths object. By default, the paths are set to the `glamdring` paths.
"""
def __init__(self, paths=None):
super().__init__(0, paths, False)
def overdensity_field(self, MAS, grid):
density = self.density_field(MAS, grid)
omega_dm = 0.3089 - 0.0486
rho_mean = omega_dm * 277.53662724583074 # h^2 Msun / kpc^3
density /= rho_mean
density -= 1
return density
def density_field(self, MAS, grid):
fpath = join(self.paths.tng300_1, "postprocessing", "density_field",
f"rho_dm_099_{grid}_{MAS}.npy")
return numpy.load(fpath)
def velocity_field(self, MAS, grid):
raise RuntimeError("The velocity field is not available.")
def radial_velocity_field(self, MAS, grid):
raise RuntimeError("The radial velocity field is not available.")
###############################################################################
# Quijote field class #
@ -739,7 +1031,7 @@ class QuijoteField(CSiBORG1Field):
Paths object.
"""
def __init__(self, nsim, paths):
super().__init__(nsim, paths)
super().__init__(nsim, paths, flip_xz=False)
self._simname = "quijote"


@ -21,61 +21,53 @@ from tqdm import tqdm
###############################################################################
def read_interpolated_field(survey_name, kind, galaxy_index, paths, MAS, grid,
in_rsp, rand_data=False, verbose=True):
def read_interpolated_field(survey, simname, kind, MAS, grid, paths,
verbose=True):
"""
Read in the interpolated field at the galaxy positions, and reorder the
data to match the galaxy index.
Parameters
----------
survey_name : str
Survey name.
survey : Survey
Survey object.
simname : str
Simulation name.
kind : str
Field kind.
galaxy_index : 1-dimensional array
Galaxy indices to read in.
paths : py:class:`csiborgtools.read.Paths`
Paths manager.
MAS : str
Mass assignment scheme.
grid : int
Grid size.
in_rsp : bool
Whether to read in the field in redshift space.
rand_data : bool, optional
Whether to read in the random field data instead of the galaxy field.
paths : py:class:`csiborgtools.read.Paths`
Paths manager.
verbose : bool, optional
Verbosity flag.
Returns
-------
3-dimensional array of shape (nsims, len(galaxy_index), nsmooth)
val : 3-dimensional array of shape (nsims, num_gal, nsmooth)
Scalar field values at the galaxy positions.
smooth_scales : 1-dimensional array
Smoothing scales.
"""
nsims = paths.get_ics("csiborg")
nsims = paths.get_ics(simname)
for i, nsim in enumerate(tqdm(nsims,
desc="Reading fields",
disable=not verbose)):
fpath = paths.field_interpolated(
survey_name, kind, MAS, grid, nsim, in_rsp=in_rsp)
fpath = paths.field_interpolated(survey.name, simname, nsim, kind, MAS,
grid)
data = numpy.load(fpath)
out_ = data["val"] if not rand_data else data["rand_val"]
out_ = data["val"]
if i == 0:
out = numpy.empty((len(nsims), *out_.shape), dtype=out_.dtype)
indxs = data["indxs"]
smooth_scales = data["smooth_scales"]
out[i] = out_
# Reorder the data to match the survey index.
ind2pos = {v: k for k, v in enumerate(indxs)}
ks = numpy.empty(len(galaxy_index), dtype=numpy.int64)
if survey.selection_mask is not None:
out = out[:, survey.selection_mask, :]
for i, k in enumerate(galaxy_index):
j = ind2pos.get(k, None)
if j is None:
raise ValueError(f"There is no galaxy with index {k} in the "
"interpolated field.")
ks[i] = j
return out[:, ks, :]
return out, smooth_scales


@ -32,7 +32,8 @@ def find_peak(x, weights, shrink=0.95, min_obs=5):
"""
Find the peak of a 1D distribution using a shrinking window.
"""
assert shrink <= 1.
if not shrink < 1:
raise ValueError("`shrink` must be less than 1.")
xmin, xmax = numpy.min(x), numpy.max(x)
xpos = (xmax + xmin) / 2
@ -58,9 +59,9 @@ class PairOverlap:
Parameters
----------
cat0 : :py:class:`csiborgtools.read.CSiBORGHaloCatalogue`
cat0 : instance of :py:class:`csiborgtools.read.BaseCatalogue`
Halo catalogue corresponding to the reference simulation.
catx : :py:class:`csiborgtools.read.CSiBORGHaloCatalogue`
catx : instance of :py:class:`csiborgtools.read.BaseCatalogue`
Halo catalogue corresponding to the cross simulation.
min_logmass : float
Minimum halo mass in :math:`\log_{10} M_\odot / h` to consider.
@ -305,17 +306,21 @@ class PairOverlap:
"""
assert (norm_kind is None or norm_kind in ("r200c", "ref_patch", "sum_patch")) # noqa
# Get positions either in the initial or final snapshot
pos0 = self.cat0().position(in_initial=in_initial)
posx = self.catx().position(in_initial=in_initial)
if in_initial:
pos0 = self.cat0("lagpatch_coordinates")
posx = self.catx("lagpatch_coordinates")
else:
pos0 = self.cat0("cartesian_pos")
posx = self.catx("cartesian_pos")
# Get the normalisation array if applicable
if norm_kind == "r200c":
norm = self.cat0("r200c")
if norm_kind == "ref_patch":
norm = self.cat0("lagpatch_size")
norm = self.cat0("lagpatch_radius")
if norm_kind == "sum_patch":
patch0 = self.cat0("lagpatch_size")
patchx = self.catx("lagpatch_size")
patch0 = self.cat0("lagpatch_radius")
patchx = self.catx("lagpatch_radius")
norm = [None] * len(self)
for i, ind in enumerate(self["match_indxs"]):
norm[i] = patch0[i] + patchx[ind]
@ -330,7 +335,7 @@ class PairOverlap:
dist[i] /= norm[i]
return numpy.array(dist, dtype=object)
def mass_ratio(self, mass_kind="totpartmass", in_log=True, in_abs=True):
def mass_ratio(self, in_log=True, in_abs=True):
"""
Pair mass ratio of matched halos between the reference and cross
simulations.
@ -350,7 +355,7 @@ class PairOverlap:
-------
ratio : array of 1-dimensional arrays of shape `(nhalos, )`
"""
mass0, massx = self.cat0(mass_kind), self.catx(mass_kind)
mass0, massx = self.cat0("totmass"), self.catx("totmass")
ratio = [None] * len(self)
for i, ind in enumerate(self["match_indxs"]):

notebooks/field_prop.ipynb (new file, 186 lines added)

@ -0,0 +1,186 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"from h5py import File\n",
"from scipy.stats import spearmanr\n",
"\n",
"import csiborgtools\n",
"\n",
"%matplotlib inline\n",
"%load_ext autoreload\n",
"%autoreload 2"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)\n",
"\n",
"# d = np.load(paths.field_interpolated(\"SDSS\", \"csiborg2_main\", 16817, \"density\", \"SPH\", 1024))"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [],
"source": [
"survey = csiborgtools.SDSS()(apply_selection=False)\n",
"# survey = csiborgtools.SDSSxALFALFA()(apply_selection=False)"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Reading fields: 0%| | 0/20 [00:00<?, ?it/s]"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"Reading fields: 100%|██████████| 20/20 [00:11<00:00, 1.80it/s]\n",
"Reading fields: 100%|██████████| 20/20 [00:10<00:00, 1.86it/s]\n"
]
}
],
"source": [
"for kind in [\"main\", \"random\"]:\n",
" x, smooth = csiborgtools.summary.read_interpolated_field(survey, f\"csiborg2_{kind}\", \"density\", \"SPH\", 1024, paths)\n",
" np .savez(f\"../data/{survey.name}_{kind}_density_SPH_1024.npz\", val=x, smooth_scales=smooth)\n",
"\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(20, 641409, 5)"
]
},
"execution_count": 37,
"metadata": {},
"output_type": "execute_result"
}
],
"source": []
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"array([[[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]],\n",
"\n",
" [[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]],\n",
"\n",
" [[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]],\n",
"\n",
" ...,\n",
"\n",
" [[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]],\n",
"\n",
" [[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]],\n",
"\n",
" [[nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" ...,\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan],\n",
" [nan, nan, nan, nan, nan]]], dtype=float32)"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"np.load(\"../data/SDSS_main_density_SPH_1024.npz\")[\"val\"]"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "venv_csiborg",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.4"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

notebooks/field_sample.ipynb (new file, 7220 lines added)

File diff suppressed because one or more lines are too long


@ -53,10 +53,10 @@ def density_field(nsim, parser_args):
# Read in the particle coordinates and masses
if parser_args.simname == "csiborg1":
snapshot = csiborgtools.read.CSIBORG1Snapshot(nsim, nsnap, paths)
snapshot = csiborgtools.read.CSiBORG1Snapshot(nsim, nsnap, paths)
elif "csiborg2" in parser_args.simname:
kind = parser_args.simname.split("_")[-1]
snapshot = csiborgtools.read.CSIBORG2Snapshot(nsim, nsnap, paths, kind)
snapshot = csiborgtools.read.CSiBORG2Snapshot(nsim, nsnap, kind, paths)
elif parser_args.simname == "quijote":
snapshot = csiborgtools.read.QuijoteSnapshot(nsim, nsnap, paths)
else:
@ -106,10 +106,10 @@ def velocity_field(nsim, parser_args):
nsnap = max(paths.get_snapshots(nsim, parser_args.simname))
if parser_args.simname == "csiborg1":
snapshot = csiborgtools.read.CSIBORG1Snapshot(nsim, nsnap, paths)
snapshot = csiborgtools.read.CSiBORG1Snapshot(nsim, nsnap, paths)
elif "csiborg2" in parser_args.simname:
kind = parser_args.simname.split("_")[-1]
snapshot = csiborgtools.read.CSIBORG2Snapshot(nsim, nsnap, kind, paths)
snapshot = csiborgtools.read.CSiBORG2Snapshot(nsim, nsnap, kind, paths)
elif parser_args.simname == "quijote":
snapshot = csiborgtools.read.QuijoteSnapshot(nsim, nsnap, paths)
else:

scripts/field_shells.py (new file, 94 lines added)

@ -0,0 +1,94 @@
# Copyright (C) 2022 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
NOTE: This script is pretty dodgy.
A script to calculate the mean and standard deviation of a field at different
distances from the center of the box such that at each distance the field is
evaluated at uniformly-spaced points on a sphere.
The script is not parallelized in any way, but it should not take very long; the
main bottleneck is reading the data from disk.
"""
from argparse import ArgumentParser
from os.path import join
import csiborgtools
import numpy
from tqdm import tqdm
def main(args):
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
boxsize = csiborgtools.simname2boxsize(args.simname)
distances = numpy.linspace(0, boxsize / 2, 101)[1:]
nsims = paths.get_ics(args.simname)
folder = "/mnt/extraspace/rstiskalek/csiborg_postprocessing/field_shells"
mus = numpy.zeros((len(nsims), len(distances)))
stds = numpy.zeros((len(nsims), len(distances)))
for i, nsim in enumerate(tqdm(nsims, desc="Simulations")):
# Get the correct field loader
if args.simname == "csiborg1":
reader = csiborgtools.read.CSiBORG1Field(nsim, paths)
elif "csiborg2" in args.simname:
kind = args.simname.split("_")[-1]
reader = csiborgtools.read.CSiBORG2Field(nsim, kind, paths)
elif args.simname == "borg2":
reader = csiborgtools.read.BORG2Field(nsim, paths)
else:
raise ValueError(f"Unknown simname: `{args.simname}`.")
# Get the field
if args.field == "density":
field = reader.density_field(args.MAS, args.grid)
elif args.field == "overdensity":
if args.simname == "borg2":
field = reader.overdensity_field()
else:
field = reader.density_field(args.MAS, args.grid)
csiborgtools.field.overdensity_field(field, make_copy=False)
elif args.field == "radvel":
field = reader.radial_velocity_field(args.MAS, args.grid)
else:
raise ValueError(f"Unknown field: `{args.field}`.")
# Evaluate this field at different distances
vals = [csiborgtools.field.field_at_distance(field, distance, boxsize)
for distance in distances]
# Calculate the mean and standard deviation
mus[i, :] = [numpy.mean(val) for val in vals]
stds[i, :] = [numpy.std(val) for val in vals]
# Finally save the output
fname = f"{args.simname}_{args.field}_{args.MAS}_{args.grid}.npz"
fname = join(folder, fname)
numpy.savez(fname, mean=mus, std=stds, distances=distances)
if __name__ == "__main__":
parser = ArgumentParser()
parser.add_argument("--field", type=str, help="Field type.",
choices=["density", "overdensity", "radvel"])
parser.add_argument("--simname", type=str, help="Simulation name.",
choices=["csiborg1", "csiborg2_main", "csiborg2_varysmall", "csiborg2_random", "borg2"]) # noqa
parser.add_argument("--MAS", type=str, help="Mass assignment scheme.",
choices=["NGP", "CIC", "TSC", "PCS", "SPH"])
parser.add_argument("--grid", type=int, help="Grid size.")
args = parser.parse_args()
main(args)
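A typical invocation of the script, using the argument choices defined above (values illustrative):

python field_shells.py --simname csiborg2_main --field density --MAS SPH --grid 1024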

scripts/fit_init.py (new file, 128 lines added)

@ -0,0 +1,128 @@
# Copyright (C) 2022 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
Script to calculate the particle centre of mass and Lagrangian patch size in
the initial snapshot. The initial snapshot particles are read from the sorted
files.
"""
from argparse import ArgumentParser
from datetime import datetime
import csiborgtools
import numpy
from mpi4py import MPI
from taskmaster import work_delegation
from tqdm import tqdm
from utils import get_nsims
def _main(nsim, simname, verbose):
"""
Calculate and save the Lagrangian halo centre of mass and Lagrangian patch
size in the initial snapshot.
Parameters
----------
nsim : int
IC realisation index.
simname : str
Simulation name.
verbose : bool
Verbosity flag.
Returns
-------
None
"""
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
cols = [("index", numpy.int32),
("x", numpy.float32),
("y", numpy.float32),
("z", numpy.float32),
("lagpatch_size", numpy.float32),
("lagpatch_ncells", numpy.int32),]
if simname == "csiborg1":
snap = csiborgtools.read.CSiBORG1Snapshot(nsim, 1, paths,
keep_snapshot_open=True)
cat = csiborgtools.read.CSiBORG1Catalogue(nsim, paths, snapshot=snap)
fout = f"/mnt/extraspace/rstiskalek/csiborg1/chain_{nsim}/initial_lagpatch.npy" # noqa
elif "csiborg2" in simname:
kind = simname.split("_")[-1]
snap = csiborgtools.read.CSiBORG2Snapshot(nsim, 0, kind, paths,
keep_snapshot_open=True)
cat = csiborgtools.read.CSiBORG2Catalogue(nsim, 99, kind, paths,
snapshot=snap)
fout = f"/mnt/extraspace/rstiskalek/csiborg2_{kind}/catalogues/initial_lagpatch_{nsim}.npy" # noqa
elif simname == "quijote":
snap = csiborgtools.read.QuijoteSnapshot(nsim, "ICs", paths,
keep_snapshot_open=True)
cat = csiborgtools.read.QuijoteHaloCatalogue(nsim, paths,
snapshot=snap)
fout = f"/mnt/extraspace/rstiskalek/quijote/fiducial_processed/chain_{nsim}/initial_lagpatch.npy" # noqa
else:
raise ValueError(f"Unknown simulation name `{simname}`.")
boxsize = csiborgtools.simname2boxsize(simname)
# Initialise the overlapper.
    if simname == "csiborg1" or "csiborg2" in simname:
kwargs = {"box_size": 2048, "bckg_halfsize": 512}
else:
kwargs = {"box_size": 512, "bckg_halfsize": 256}
overlapper = csiborgtools.match.ParticleOverlap(**kwargs)
out = csiborgtools.read.cols_to_structured(len(cat), cols)
for i, hid in enumerate(tqdm(cat["index"]) if verbose else cat["index"]):
out["index"][i] = hid
pos = snap.halo_coordinates(hid)
mass = snap.halo_masses(hid)
# Calculate the centre of mass and the Lagrangian patch size.
cm = csiborgtools.center_of_mass(pos, mass, boxsize=boxsize)
distances = csiborgtools.periodic_distance(pos, cm, boxsize=boxsize)
out["x"][i], out["y"][i], out["z"][i] = cm
out["lagpatch_size"][i] = numpy.percentile(distances, 99)
pos /= boxsize # need to normalize the positions to be [0, 1).
# Calculate the number of cells with > 0 density.
delta = overlapper.make_delta(pos, mass, subbox=True)
out["lagpatch_ncells"][i] = csiborgtools.delta2ncells(delta)
# Now save it
if verbose:
print(f"{datetime.now()}: dumping fits to .. `{fout}`.", flush=True)
with open(fout, "wb") as f:
numpy.save(f, out)
if __name__ == "__main__":
parser = ArgumentParser()
parser.add_argument("--simname", type=str,
choices=["csiborg1", "csiborg2_main", "csiborg2_random", "csiborg2_varysmall", "quijote"], # noqa
help="Simulation name")
parser.add_argument("--nsims", type=int, nargs="+", default=None,
help="IC realisations. If `-1` processes all.")
args = parser.parse_args()
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
nsims = get_nsims(args, paths)
def main(nsim):
_main(nsim, args.simname, MPI.COMM_WORLD.Get_size() == 1)
work_delegation(main, nsims, MPI.COMM_WORLD)

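For orientation, a self-contained numpy sketch of the centre-of-mass and patch-size step performed per halo above; the periodic centre of mass here uses the standard circular-mean trick, which may differ in detail from `csiborgtools.center_of_mass`, and the toy box size is arbitrary.

import numpy

def periodic_center_of_mass(pos, mass, boxsize):
    # Circular-mean centre of mass so that haloes straddling the periodic
    # boundary are handled correctly.
    theta = 2 * numpy.pi * pos / boxsize
    cm = numpy.arctan2((mass[:, None] * numpy.sin(theta)).sum(axis=0),
                       (mass[:, None] * numpy.cos(theta)).sum(axis=0))
    return (cm % (2 * numpy.pi)) * boxsize / (2 * numpy.pi)

def periodic_distance(pos, ref, boxsize):
    # Component-wise minimum-image distance to the reference point.
    d = numpy.abs(pos - ref)
    d = numpy.minimum(d, boxsize - d)
    return numpy.linalg.norm(d, axis=1)

rng = numpy.random.default_rng(0)
boxsize = 500.0  # arbitrary toy value in Mpc / h
# Toy Lagrangian patch straddling the box edge.
pos = (rng.normal(0., 2., size=(1000, 3)) + [1.0, 250.0, 499.0]) % boxsize
mass = numpy.ones(len(pos))
cm = periodic_center_of_mass(pos, mass, boxsize)
lagpatch_size = numpy.percentile(periodic_distance(pos, cm, boxsize), 99)
print(cm, lagpatch_size)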
376
scripts/mass_enclosed.py Normal file

@ -0,0 +1,376 @@
# Copyright (C) 2022 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
A script to calculate the enclosed mass or bulk flow at different distances
from the center of the box directly from the particles. Note that the velocity
of an observer is not being subtracted from the bulk flow.
The script is not parallelized in any way, but it should not take very long;
the main bottleneck is reading the data from disk.
"""
from argparse import ArgumentParser
from os.path import join
from gc import collect
import csiborgtools
import numpy
from tqdm import tqdm
from numba import jit
from datetime import datetime
###############################################################################
# Read in information about the simulation #
###############################################################################
def t():
return datetime.now()
def get_reader(simname, paths, nsim):
"""
    Get the appropriate snapshot reader for the simulation.
Parameters
----------
simname : str
Name of the simulation.
paths : csiborgtools.read.Paths
Paths object.
nsim : int
Simulation index.
Returns
-------
reader : instance of csiborgtools.read.BaseSnapshot
Snapshot reader.
"""
if simname == "csiborg1":
nsnap = max(paths.get_snapshots(nsim, simname))
reader = csiborgtools.read.CSiBORG1Snapshot(nsim, nsnap, paths,
flip_xz=True)
elif "csiborg2" in simname:
kind = simname.split("_")[-1]
reader = csiborgtools.read.CSiBORG2Snapshot(nsim, 99, kind, paths,
flip_xz=True)
else:
raise ValueError(f"Unknown simname: `{simname}`.")
return reader
def get_particles(reader, boxsize, get_velocity=True, verbose=True):
"""
Get the distance of particles from the center of the box and their masses.
Parameters
----------
reader : instance of csiborgtools.read.BaseSnapshot
Snapshot reader.
boxsize : float
Box size in Mpc / h.
get_velocity : bool, optional
Whether to also return the velocity of particles.
verbose : bool
Verbosity flag.
Returns
-------
dist : 1-dimensional array
Distance of particles from the center of the box.
mass : 1-dimensional array
Mass of particles.
vel : 2-dimensional array, optional
Velocity of particles.
"""
if verbose:
print(f"{t()},: reading coordinates and calculating radial distance.")
pos = reader.coordinates()
dtype = pos.dtype
pos -= boxsize / 2
dist = numpy.linalg.norm(pos, axis=1).astype(dtype)
del pos
collect()
if verbose:
print(f"{t()}: reading masses.")
mass = reader.masses()
if get_velocity:
if verbose:
print(f"{t()}: reading velocities.")
vel = reader.velocities().astype(dtype)
if verbose:
print(f"{t()}: sorting arrays.")
indxs = numpy.argsort(dist)
dist = dist[indxs]
mass = mass[indxs]
if get_velocity:
vel = vel[indxs]
del indxs
collect()
if get_velocity:
return dist, mass, vel
return dist, mass
###############################################################################
# Calculate the enclosed mass at each distance #
###############################################################################
@jit(nopython=True, boundscheck=False)
def _enclosed_mass(rdist, mass, rmax, start_index):
enclosed_mass = 0.
for i in range(start_index, len(rdist)):
if rdist[i] <= rmax:
enclosed_mass += mass[i]
else:
break
return enclosed_mass, i
def enclosed_mass(rdist, mass, distances):
"""
Calculate the enclosed mass at each distance.
Parameters
----------
rdist : 1-dimensional array
Distance of particles from the center of the box.
mass : 1-dimensional array
Mass of particles.
distances : 1-dimensional array
Distances at which to calculate the enclosed mass.
Returns
-------
enclosed_mass : 1-dimensional array
Enclosed mass at each distance.
"""
enclosed_mass = numpy.full_like(distances, 0.)
start_index = 0
for i, dist in enumerate(distances):
if i > 0:
enclosed_mass[i] += enclosed_mass[i - 1]
m, start_index = _enclosed_mass(rdist, mass, dist, start_index)
enclosed_mass[i] += m
return enclosed_mass
###############################################################################
# Calculate enclosed mass from a density field #
###############################################################################
@jit(nopython=True)
def _cell_rdist(i, j, k, Ncells, boxsize):
"""Radial distance of the center of a cell from the center of the box."""
xi = boxsize / Ncells * (i + 0.5) - boxsize / 2
yi = boxsize / Ncells * (j + 0.5) - boxsize / 2
zi = boxsize / Ncells * (k + 0.5) - boxsize / 2
return (xi**2 + yi**2 + zi**2)**0.5
@jit(nopython=True, boundscheck=False)
def _field_enclosed_mass(field, rmax, boxsize):
Ncells = field.shape[0]
cell_volume = (1000 * boxsize / Ncells)**3
mass = 0.
volume = 0.
for i in range(Ncells):
for j in range(Ncells):
for k in range(Ncells):
if _cell_rdist(i, j, k, Ncells, boxsize) < rmax:
mass += field[i, j, k]
volume += 1.
return mass * cell_volume, volume * cell_volume
def field_enclosed_mass(field, distances, boxsize):
"""
Calculate the approximate enclosed mass within a given radius from a
density field.
Parameters
----------
field : 3-dimensional array
Density field in units of `h^2 Msun / kpc^3`.
    distances : 1-dimensional array
        Radii in `Mpc / h` at which to calculate the enclosed mass.
boxsize : float
Box size in `Mpc / h`.
Returns
-------
enclosed_mass : 1-dimensional array
Enclosed mass at each distance.
enclosed_volume : 1-dimensional array
Enclosed grid-like volume at each distance.
"""
enclosed_mass = numpy.zeros_like(distances)
enclosed_volume = numpy.zeros_like(distances)
for i, dist in enumerate(distances):
enclosed_mass[i], enclosed_volume[i] = _field_enclosed_mass(
field, dist, boxsize)
return enclosed_mass, enclosed_volume
###############################################################################
# Calculate the enclosed momentum at each distance #
###############################################################################
@jit(nopython=True, boundscheck=False)
def _enclosed_momentum(rdist, mass, vel, rmax, start_index):
bulk_momentum = numpy.zeros(3, dtype=rdist.dtype)
for i in range(start_index, len(rdist)):
if rdist[i] <= rmax:
bulk_momentum += mass[i] * vel[i]
else:
break
return bulk_momentum, i
def enclosed_momentum(rdist, mass, vel, distances):
"""
Calculate the enclosed momentum at each distance.
Parameters
----------
rdist : 1-dimensional array
Distance of particles from the center of the box.
mass : 1-dimensional array
Mass of particles.
vel : 2-dimensional array
Velocity of particles.
distances : 1-dimensional array
Distances at which to calculate the enclosed momentum.
Returns
-------
bulk_momentum : 2-dimensional array
Enclosed momentum at each distance.
"""
bulk_momentum = numpy.zeros((len(distances), 3))
start_index = 0
for i, dist in enumerate(distances):
if i > 0:
bulk_momentum[i] += bulk_momentum[i - 1]
v, start_index = _enclosed_momentum(rdist, mass, vel, dist,
start_index)
bulk_momentum[i] += v
return bulk_momentum
###############################################################################
# Main & command line interface #
###############################################################################
def main_borg(args, folder):
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
boxsize = csiborgtools.simname2boxsize(args.simname)
nsims = paths.get_ics(args.simname)
distances = numpy.linspace(0, boxsize / 2, 101)[1:]
cumulative_mass = numpy.zeros((len(nsims), len(distances)))
cumulative_volume = numpy.zeros((len(nsims), len(distances)))
for i, nsim in enumerate(tqdm(nsims, desc="Simulations")):
if args.simname == "borg1":
reader = csiborgtools.read.BORG1Field(nsim)
field = reader.density_field()
elif args.simname == "borg2":
reader = csiborgtools.read.BORG2Field(nsim)
field = reader.density_field()
else:
raise ValueError(f"Unknown simname: `{args.simname}`.")
cumulative_mass[i, :], cumulative_volume[i, :] = field_enclosed_mass(
field, distances, boxsize)
# Finally save the output
fname = f"enclosed_mass_{args.simname}.npz"
fname = join(folder, fname)
numpy.savez(fname, enclosed_mass=cumulative_mass, distances=distances,
enclosed_volume=cumulative_volume)
def main_csiborg(args, folder):
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
boxsize = csiborgtools.simname2boxsize(args.simname)
nsims = paths.get_ics(args.simname)
distances = numpy.linspace(0, boxsize / 2, 101)[1:]
# Initialize arrays to store the results
cumulative_mass = numpy.zeros((len(nsims), len(distances)))
mass135 = numpy.zeros(len(nsims))
masstot = numpy.zeros(len(nsims))
cumulative_velocity = numpy.zeros((len(nsims), len(distances), 3))
for i, nsim in enumerate(tqdm(nsims, desc="Simulations")):
reader = get_reader(args.simname, paths, nsim)
rdist, mass, vel = get_particles(reader, boxsize, verbose=False)
# Calculate masses
cumulative_mass[i, :] = enclosed_mass(rdist, mass, distances)
mass135[i] = enclosed_mass(rdist, mass, [135])[0]
masstot[i] = numpy.sum(mass)
# Calculate velocities
cumulative_velocity[i, ...] = enclosed_momentum(
rdist, mass, vel, distances)
for j in range(3): # Normalize the momentum to get velocity out of it.
cumulative_velocity[i, :, j] /= cumulative_mass[i, :]
# Finally save the output
fname = f"enclosed_mass_{args.simname}.npz"
fname = join(folder, fname)
numpy.savez(fname, enclosed_mass=cumulative_mass, mass135=mass135,
masstot=masstot, distances=distances,
cumulative_velocity=cumulative_velocity)
if __name__ == "__main__":
parser = ArgumentParser()
parser.add_argument("--simname", type=str, help="Simulation name.",
choices=["csiborg1", "csiborg2_main", "csiborg2_varysmall", "csiborg2_random", "borg1", "borg2"]) # noqa
args = parser.parse_args()
folder = "/mnt/extraspace/rstiskalek/csiborg_postprocessing/field_shells"
if "csiborg" in args.simname:
main_csiborg(args, folder)
elif "borg" in args.simname:
main_borg(args, folder)
else:
raise ValueError(f"Unknown simname: `{args.simname}`.")

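As a cross-check of the running-index loops above, the same cumulative profiles can be obtained with `cumsum` and `searchsorted` once the particle distances are sorted (which `get_particles` already guarantees); this is only an illustrative sketch, not the implementation used here.

import numpy

def enclosed_mass_sorted(rdist, mass, distances):
    # `rdist` must be sorted in ascending order.
    cmass = numpy.cumsum(mass)
    idx = numpy.searchsorted(rdist, distances, side="right")
    out = numpy.zeros(len(distances))
    inside = idx > 0
    out[inside] = cmass[idx[inside] - 1]
    return out

rng = numpy.random.default_rng(42)
rdist = numpy.sort(rng.uniform(0, 300, size=100_000))
mass = rng.uniform(0.5, 1.5, size=100_000)
distances = numpy.linspace(10, 300, 30)
brute = numpy.array([mass[rdist <= d].sum() for d in distances])
assert numpy.allclose(enclosed_mass_sorted(rdist, mass, distances), brute)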

@ -97,32 +97,54 @@ def pair_match(nsim0, nsimx, simname, min_logmass, sigma, verbose):
"""
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
smooth_kwargs = {"sigma": sigma, "mode": "constant", "cval": 0}
bounds = {"lagpatch_size": (0, None)}
bounds = {"lagpatch_radius": (0, None)}
if simname == "csiborg1":
overlapper_kwargs = {"box_size": 2048, "bckg_halfsize": 512}
bounds |= {"dist": (0, 150), "totmass": (10**min_logmass, None)}
bounds |= {"dist": (0, 135), "totmass": (10**min_logmass, None)}
snap0 = csiborgtools.read.CSIBORG1Snapshot(nsim0, 0)
cat0 = csiborgtools.read.CSiBORG1Catalogue(nsim0, snapshot=snap0,
bounds=bounds)
# Reference simulation.
snap0 = csiborgtools.read.CSiBORG1Snapshot(
nsim0, 1, keep_snapshot_open=True)
cat0 = csiborgtools.read.CSiBORG1Catalogue(
nsim0, snapshot=snap0, bounds=bounds)
snapx = csiborgtools.read.CSIBORG1Snapshot(nsimx, 0)
catx = csiborgtools.read.CSiBORGCatalogue(nsimx, snapshot=snapx,
bounds=bounds)
# Cross simulation.
snapx = csiborgtools.read.CSiBORG1Snapshot(
nsimx, 1, keep_snapshot_open=True)
catx = csiborgtools.read.CSiBORG1Catalogue(
nsimx, snapshot=snapx, bounds=bounds)
elif "csiborg2" in simname:
raise RuntimeError("CSiBORG2 currently not implemented..")
kind = simname.split("_")[-1]
overlapper_kwargs = {"box_size": 2048, "bckg_halfsize": 512}
bounds |= {"dist": (0, 135), "totmass": (10**min_logmass, None)}
# Reference simulation.
snap0 = csiborgtools.read.CSiBORG2Snapshot(
nsim0, 99, kind, keep_snapshot_open=True)
cat0 = csiborgtools.read.CSiBORG2Catalogue(
nsim0, 99, kind, snapshot=snap0, bounds=bounds)
# Cross simulation.
snapx = csiborgtools.read.CSiBORG2Snapshot(
nsimx, 99, kind, keep_snapshot_open=True)
catx = csiborgtools.read.CSiBORG2Catalogue(
nsimx, 99, kind, snapshot=snapx, bounds=bounds)
elif simname == "quijote":
overlapper_kwargs = {"box_size": 512, "bckg_halfsize": 256}
bounds |= {"totmass": (10**min_logmass, None)}
snap0 = csiborgtools.read.QuijoteSnapshot(nsim0, "ICs")
cat0 = csiborgtools.read.QuijoteCatalogue(nsim0, snapshot=snap0,
bounds=bounds)
# Reference simulation.
snap0 = csiborgtools.read.QuijoteSnapshot(
nsim0, "ICs", keep_snapshot_open=True)
cat0 = csiborgtools.read.QuijoteCatalogue(
nsim0, snapshot=snap0, bounds=bounds)
snapx = csiborgtools.read.QuijoteSnapshot(nsimx, "ICs")
catx = csiborgtools.read.QuijoteCatalogue(nsimx, snapshot=snapx,
bounds=bounds)
# Cross simulation.
snapx = csiborgtools.read.QuijoteSnapshot(
nsimx, "ICs", keep_snapshot_open=True)
catx = csiborgtools.read.QuijoteCatalogue(
nsimx, snapshot=snapx, bounds=bounds)
else:
raise ValueError(f"Unknown simulation name: `{simname}`.")

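For readers unfamiliar with the `bounds` keyword used above, the following toy sketch shows the kind of (min, max) filtering it presumably applies to catalogue columns, with `None` meaning no limit; the real catalogue classes handle this internally and may differ.

import numpy

def apply_bounds(cat, bounds):
    # Keep rows whose columns lie within the given (min, max) ranges.
    mask = numpy.ones(len(cat), dtype=bool)
    for key, (lo, hi) in bounds.items():
        if lo is not None:
            mask &= cat[key] >= lo
        if hi is not None:
            mask &= cat[key] <= hi
    return cat[mask]

cat = numpy.zeros(4, dtype=[("totmass", numpy.float64), ("dist", numpy.float64)])
cat["totmass"] = [1e12, 5e13, 2e14, 3e13]
cat["dist"] = [50.0, 200.0, 100.0, 120.0]
print(apply_bounds(cat, {"totmass": (1e13, None), "dist": (0, 135)}))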

@ -0,0 +1,76 @@
# Copyright (C) 2023 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
Quick script to output either halo positions and masses or positions of
galaxies in a survey as an ASCII file.
"""
from os.path import join
import csiborgtools
import numpy
from tqdm import tqdm
DIR_OUT = "/mnt/extraspace/rstiskalek/csiborg_postprocessing/ascii_positions"
def process_simulation(simname):
"""Watch out about the distinction between real and redshift space."""
paths = csiborgtools.read.Paths(**csiborgtools.paths_glamdring)
if "csiborg2" in simname:
nsims = paths.get_ics(simname)
kind = simname.split("_")[-1]
for nsim in tqdm(nsims, desc="Looping over simulations"):
cat = csiborgtools.read.CSiBORG2Catalogue(nsim, 99, kind, paths)
pos = cat["cartesian_pos"]
mass = cat["totmass"]
# Stack positions and masses
x = numpy.hstack([pos, mass.reshape(-1, 1)])
# Save to a file
fname = join(DIR_OUT, f"halos_real_{simname}_{nsim}.txt")
numpy.savetxt(fname, x)
else:
raise RuntimeError("Simulation not implemented..")
def process_survey(survey_name, boxsize):
"""Watch out about the distance definition."""
if survey_name == "SDSS":
survey = csiborgtools.SDSS()()
dist, ra, dec = survey["DIST"], survey["RA"], survey["DEC"]
elif survey_name == "SDSSxALFALFA":
survey = csiborgtools.SDSSxALFALFA()()
dist, ra, dec = survey["DIST"], survey["RA_1"], survey["DEC_1"]
else:
raise RuntimeError("Survey not implemented..")
# Convert to Cartesian coordinates
X = numpy.vstack([dist, ra, dec]).T
X = csiborgtools.radec_to_cartesian(X)
# Center the coordinates in the box
X += boxsize / 2
fname = join(DIR_OUT, f"survey_{survey_name}.txt")
numpy.savetxt(fname, X)
if __name__ == "__main__":
# process_simulation("csiborg2_main")
boxsize = 676.6
for survey in ["SDSS", "SDSSxALFALFA"]:
process_survey(survey, boxsize)

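A rough sketch of the RA/dec-to-Cartesian step used above, under the assumption that `radec_to_cartesian` expects degrees and a distance in Mpc / h; the library's actual convention is not shown in this diff.

import numpy

def radec_to_cartesian_sketch(dist, ra, dec):
    # RA and dec in degrees; returns (N, 3) positions in the units of `dist`.
    ra, dec = numpy.deg2rad(ra), numpy.deg2rad(dec)
    return numpy.vstack([dist * numpy.cos(dec) * numpy.cos(ra),
                         dist * numpy.cos(dec) * numpy.sin(ra),
                         dist * numpy.sin(dec)]).T

boxsize = 676.6  # Mpc / h, as in the script above
X = radec_to_cartesian_sketch(numpy.array([100.0, 250.0]),
                              numpy.array([150.0, 30.0]),
                              numpy.array([20.0, -5.0]))
X += boxsize / 2  # place the observer at the box centre, as above
print(X)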

@ -68,7 +68,7 @@ def read_single_catalogue(args, config, nsim, run, rmax, paths, nobs=None):
Returns
-------
`csiborgtools.read.CSiBORGHaloCatalogue` or `csiborgtools.read.QuijoteHaloCatalogue` # noqa
instance of `csiborgtools.read.BaseCatalogue`
"""
selection = config.get(run, None)
if selection is None:


@ -0,0 +1,380 @@
# Copyright (C) 2024 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
Script to calculate the ACL of BORG voxels.
"""
from argparse import ArgumentParser
from glob import glob
from os.path import join
from re import search
import numpy as np
from h5py import File
from numba import jit
from tqdm import tqdm, trange
###############################################################################
# BORG voxels I/O #
###############################################################################
def find_mcmc_files(basedir):
"""
Find the MCMC files in the BORG run directory. Checks that the samples
are consecutive.
Parameters
----------
basedir : str
The base directory of the BORG run.
Returns
-------
files : list of str
"""
files = glob(join(basedir, "mcmc_*"))
print(f"Found {len(files)} BORG samples.")
# Sort the files by the MCMC iteration number.
indxs = [int(search(r"mcmc_(\d+)", f).group(1)) for f in files]
argsort_indxs = np.argsort(indxs)
indxs = [indxs[i] for i in argsort_indxs]
files = [files[i] for i in argsort_indxs]
if not all((indxs[i] - indxs[i - 1]) == 1 for i in range(1, len(indxs))):
raise ValueError("MCMC iteration numbers are not consecutive.")
return files
def load_borg_voxels(basedir, frac=0.25):
"""
Load the BORG density field samples of the central `frac` of the box.
Parameters
----------
basedir : str
The base directory of the BORG run.
frac : float
The fraction of the box to load. Must be <= 1.0.
Returns
-------
4-dimensional array of shape (n_samples, n_voxels, n_voxels, n_voxels)
"""
if frac > 1.0:
raise ValueError("`frac` must be <= 1.0")
files = find_mcmc_files(basedir)
start, end, x = None, None, None
for n, fpath in enumerate(tqdm(files, desc="Loading BORG samples")):
with File(fpath, 'r') as f:
if n == 0:
grid = f["scalars/BORG_final_density"].shape[0]
ncentral = int(grid * frac)
start = (grid - ncentral) // 2
end = (grid + ncentral) // 2
nvoxel = end - start
shape = (len(files), nvoxel, nvoxel, nvoxel)
x = np.full(shape, np.nan, dtype=np.float32)
x[n] = f["scalars/BORG_final_density"][start:end, start:end, start:end] # noqa
return x
def load_borg_galaxy_bias(basedir):
"""
Load the BORG `galaxy_bias` samples.
Parameters
----------
basedir : str
The base directory of the BORG run.
Returns
-------
samples : 2-dimensional array of shape (n_samples, jmax)
"""
files = find_mcmc_files(basedir)
x = None
for n, fpath in enumerate(tqdm(files, desc="Loading BORG samples")):
with File(fpath, 'r') as f:
# Figure out how many sub-samples there are.
if n == 0:
for j in range(100):
try:
bias = f[f"scalars/galaxy_bias_{j}"]
nbias = bias[...].size
except KeyError:
jmax = j - 1
x = np.full((len(files), jmax, nbias), np.nan,
dtype=np.float32)
break
for i in range(jmax):
x[n, i, :] = f[f"scalars/galaxy_bias_{i}"][...]
return x
###############################################################################
# ACL & ACF calculation #
###############################################################################
def calculate_acf(data):
"""
Calculates the autocorrelation of some data. Taken from `epsie` package
written by Collin Capano.
Parameters
----------
data : 1-dimensional array
The data to calculate the autocorrelation of.
Returns
-------
acf : 1-dimensional array
"""
# zero the mean
data = data - data.mean()
# zero-pad to 2 * nearest power of 2
newlen = int(2**(1 + np.ceil(np.log2(len(data)))))
x = np.zeros(newlen)
x[:len(data)] = data[:]
# correlate
acf = np.correlate(x, x, mode='full')
# drop corrupted region
acf = acf[len(acf)//2:]
# normalize
acf /= acf[0]
return acf
def calculate_acl(data):
"""
Calculate the autocorrelation length of some data. Taken from `epsie`
package written by Collin Capano. Algorithm used is from:
N. Madras and A.D. Sokal, J. Stat. Phys. 50, 109 (1988).
Parameters
----------
data : 1-dimensional array
The data to calculate the autocorrelation length of.
Returns
-------
acl : int
"""
# calculate the acf
acf = calculate_acf(data)
# now the ACL: Following from Sokal, this is estimated
# as the first point where M*tau[k] <= k, where
# tau = 2*cumsum(acf) - 1, and M is a tuneable parameter,
# generally chosen to be = 5 (which we use here)
m = 5
cacf = 2. * np.cumsum(acf) - 1.
win = m * cacf <= np.arange(len(cacf))
if win.any():
acl = int(np.ceil(cacf[np.where(win)[0][0]]))
else:
# data is too short to estimate the ACL, just choose
# the length of the data
acl = len(data)
return acl
def voxel_acl(borg_voxels):
"""
Calculate the ACL of each voxel in the BORG samples.
Parameters
----------
borg_voxels : 4-dimensional array of shape (n_samples, nvox, nvox, nvox)
The BORG density field samples.
Returns
-------
voxel_acl : 3-dimensional array of shape (nvox, nvox, nvox)
The ACL of each voxel.
"""
ngrid = borg_voxels.shape[1]
voxel_acl = np.zeros((ngrid, ngrid, ngrid), dtype=np.float32)
for i in trange(ngrid):
for j in range(ngrid):
for k in range(ngrid):
voxel_acl[i, j, k] = calculate_acl(borg_voxels[:, i, j, k])
return voxel_acl
def galaxy_bias_acl(galaxy_bias):
"""
Calculate the ACL of the galaxy bias parameters for each galaxy sub-sample.
Parameters
----------
galaxy_bias : 3-dimensional array of shape (n_samples, ncat, nbias)
The BORG `galaxy_bias` samples.
Returns
-------
acls_all : 2-dimensional array of shape (ncat, nbias)
"""
print("Calculating the ACL of the galaxy bias parameters.")
ncat = galaxy_bias.shape[1]
nbias = galaxy_bias.shape[2]
    acls_all = np.full((ncat, nbias), -1, dtype=int)  # -1 placeholder; NaN is not representable in an int array
for i in range(ncat):
acls = [calculate_acl(galaxy_bias[:, i, j]) for j in range(nbias)]
print(f"`galaxy_bias_{str(i).zfill(2)}` ACLs: {acls}.")
acls_all[i] = acls
return acls_all
def enclosed_density_acl(borg_voxels):
"""
Calculate the ACL of the enclosed overdensity of the BORG samples.
Parameters
----------
borg_voxels : 4-dimensional array of shape (n_samples, nvox, nvox, nvox)
The BORG density field samples.
Returns
-------
acl : int
"""
# Calculate the mean overdensity of the voxels.
x = np.asanyarray([np.mean(borg_voxels[i] + 1) - 1
for i in range(len(borg_voxels))])
mu = np.mean(x)
sigma = np.std(x)
acl = calculate_acl(x)
print("Calculating the boxed overdensity ACL.")
print(f"<delta_box> = {mu} +- {sigma}")
print(f"ACL = {acl}")
return acl
###############################################################################
# Voxel distance from the centre #
###############################################################################
@jit(nopython=True, boundscheck=False, fastmath=True)
def calculate_voxel_distance_from_center(grid, voxel_size):
"""
Calculate the distance in `Mpc / h` of each voxel from the centre of the
box.
Parameters
----------
grid : int
The number of voxels in each dimension. Assumed to be centered on the
box centre.
voxel_size : float
The size of each voxel in `Mpc / h`.
Returns
-------
voxel_dist : 3-dimensional array of shape (grid, grid, grid)
"""
x0 = grid // 2
dist = np.zeros((grid, grid, grid), dtype=np.float32)
for i in range(grid):
for j in range(grid):
for k in range(grid):
dist[i, j, k] = ((i - x0)**2 + (j - x0)**2 + (k - x0)**2)**0.5
return dist * voxel_size
if __name__ == "__main__":
parser = ArgumentParser()
parser.add_argument("kind", choices=["BORG1", "BORG2"],
help="The BORG run.", type=str)
parser.add_argument("--frac", help="The fraction of the box to load.",
default=0.25, type=float)
args = parser.parse_args()
dumpdir = "/mnt/extraspace/rstiskalek/dump"
outdir = "/mnt/extraspace/rstiskalek/csiborg_postprocessing/ACL"
if args.kind == "BORG1":
basedir = "/mnt/users/hdesmond/BORG_final"
grid = 256
boxsize = 677.6
elif args.kind == "BORG2":
basedir = "/mnt/extraspace/rstiskalek/BORG_STOPYRA_2023"
grid = 256
boxsize = 676.6
else:
raise ValueError(f"Unknown BORG run: `{args.kind}`.")
# First try to load the BORG samples from a dump file. If that fails, load
# them directly from the BORG samples.
fname = join(dumpdir, f"{args.kind}_{args.frac}.hdf5")
try:
with File(fname, 'r') as f:
print(f"Loading BORG samples from `{fname}`.")
borg_voxels = f["borg_voxels"][...]
except FileNotFoundError:
print("Loading directly from BORG samples.")
borg_voxels = load_borg_voxels(basedir, frac=args.frac)
with File(fname, 'w') as f:
print(f"Saving BORG samples to to `{fname}`.")
f.create_dataset("borg_voxels", data=borg_voxels)
enclosed_density_acl(borg_voxels)
# Calculate the voxel distance from the centre and their ACLs.
voxel_size = boxsize / grid
voxel_dist = calculate_voxel_distance_from_center(
borg_voxels.shape[1], voxel_size)
    voxel_acls = voxel_acl(borg_voxels)
# Save the voxel distance and ACLs to a file.
fout = join(outdir, f"{args.kind}_{args.frac}.hdf5")
print(f"Writting voxel distance and ACLs to `{fout}`.")
with File(fout, 'w') as f:
f.create_dataset("voxel_dist", data=voxel_dist)
f.create_dataset("voxel_acl", data=voxel_acl)
# Now load the galaxy_bias samples.
fname = join(dumpdir, f"{args.kind}_galaxy_bias_{args.frac}.hdf5")
try:
with File(fname, 'r') as f:
print(f"Loading BORG `galaxy_bias` samples from `{fname}`.")
galaxy_bias = f["galaxy_bias"][...]
except FileNotFoundError:
print("Loading `galaxy_bias` directly from BORG samples.")
galaxy_bias = load_borg_galaxy_bias(basedir)
with File(fname, 'w') as f:
print(f"Saving `galaxy_nmean` BORG samples to to `{fname}`.")
f.create_dataset("galaxy_bias", data=galaxy_bias)
galaxy_bias_acl(galaxy_bias)

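A quick sanity check (not part of the commit) of the Madras & Sokal ACL estimator implemented above, run on an AR(1) chain whose integrated autocorrelation time is known analytically, (1 + phi) / (1 - phi); the helper below mirrors `calculate_acf` and `calculate_acl` so the snippet is self-contained.

import numpy as np

def acl(data, m=5):
    # Same estimator as `calculate_acl` above: normalised autocorrelation,
    # tau = 2 * cumsum(acf) - 1, first lag k with m * tau[k] <= k.
    data = data - data.mean()
    n = int(2**(1 + np.ceil(np.log2(len(data)))))
    x = np.zeros(n)
    x[:len(data)] = data
    acf = np.correlate(x, x, mode="full")[n - 1:]
    acf /= acf[0]
    cacf = 2.0 * np.cumsum(acf) - 1.0
    win = m * cacf <= np.arange(len(cacf))
    return int(np.ceil(cacf[np.where(win)[0][0]])) if win.any() else len(data)

rng = np.random.default_rng(0)
phi, n = 0.9, 5000
chain = np.empty(n)
chain[0] = rng.normal()
for i in range(1, n):
    chain[i] = phi * chain[i - 1] + rng.normal()
# True integrated autocorrelation time is (1 + 0.9) / (1 - 0.9) = 19; the
# estimate should land in that ballpark, up to sampling noise.
print(acl(chain))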

@ -0,0 +1,56 @@
# Copyright (C) 2023 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
Script to iteratively load particles of a TNG simulation and construct the DM
density field.
"""
from glob import glob
from os.path import join
import MAS_library as MASL
import numpy as np
from h5py import File
from tqdm import trange
if __name__ == "__main__":
# Some parameters
basepath = "/mnt/extraspace/rstiskalek/TNG300-1"
snap = str(99).zfill(3)
grid = 1024
boxsize = 205000.0 # kpc/h
mpart = 0.00398342749867548 * 1e10 # Msun/h, DM particles mass
MAS = "PCS"
# Get the snapshot files
files = glob(join(basepath, "output", f"snapdir_{snap}", f"snap_{snap}.*"))
print(f"Found {len(files)} snapshot files.")
# Iterate over the snapshot files and construct the density field
rho = np.zeros((grid, grid, grid), dtype=np.float32)
for i in trange(len(files), desc="Reading snapshot files"):
with File(files[i], 'r') as f:
pos = f["PartType1/Coordinates"][...].astype(np.float32)
MASL.MA(pos, rho, boxsize, MAS, verbose=False)
# Convert to units h^2 Msun / kpc^3
rho *= mpart / (boxsize / grid)**3
# Save to file
fname = join(basepath, "postprocessing", "density_field",
f"rho_dm_{snap}_{grid}_{MAS}.npy")
print(f"Saving to {fname}.", flush=True)
np.save(fname, rho)

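As a sanity check on the field written above (a sketch, not part of the commit): summing the saved density over the grid and multiplying by the cell volume should recover the total DM mass of TNG300-1, i.e. 2500^3 particles times `mpart`.

import numpy as np

grid = 1024
boxsize = 205000.0                    # kpc / h, as above
mpart = 0.00398342749867548 * 1e10    # Msun / h per DM particle

rho = np.load("rho_dm_099_1024_PCS.npy")   # output of the script above
cell_volume = (boxsize / grid)**3          # (kpc / h)^3
total_mass = rho.sum(dtype=np.float64) * cell_volume
print(f"Mass in field:  {total_mass:.4e} Msun / h")
print(f"Expected total: {2500**3 * mpart:.4e} Msun / h")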

@ -98,7 +98,8 @@ if __name__ == "__main__":
if args.mode == "prepare":
if args.nsim == -1:
nsims = [7444 + n * 24 for n in range(101)]
# nsims = [7444 + n * 24 for n in range(101)]
nsims = [8404 + n * 24 for n in range(61)]
for nsim in nsims:
print(f"Processing simulation {nsim}.")
particles_path = join(args.scratch_space,


@ -174,8 +174,7 @@ class CSiBORG1Reader:
if which_snapshot == "initial":
self.nsnap = 1
raise RuntimeError("TODO not implemented")
self.source_dir = None
self.source_dir = f"/mnt/extraspace/rstiskalek/csiborg1/initial/ramses_out_{nsim}_new/output_00001" # noqa
elif which_snapshot == "final":
sourcedir = join(base_dir, f"ramses_out_{nsim}")
self.nsnap = max([int(basename(f).replace("output_", ""))
@ -195,7 +194,7 @@ class CSiBORG1Reader:
self.sph_file = f"/mnt/extraspace/rstiskalek/csiborg1/sph_temp/chain_{self.nsim}.hdf5" # noqa
def read_info(self):
filename = glob(join(self.source_dir, "info_*"))
filename = glob(join(self.source_dir, "info_*.txt"))
if len(filename) > 1:
raise ValueError("Found too many `info` files.")
filename = filename[0]
@ -675,6 +674,7 @@ def process_final_snapshot(nsim, simname):
flush=True)
# Lastly, create the halo mapping and default catalogue.
print(f"{now()}: writing `{reader.output_cat}`.")
print(f"{datetime.now()}: creating `GroupOffset`...")
halo_map, unique_halo_ids = make_offset_map(halo_ids)
# Dump the halo mapping.
@ -744,8 +744,9 @@ def process_initial_snapshot(nsim, simname):
del sort_indxs_final
collect()
print(f"{now()}: loading and sorting the initial particle position.")
print(f"{now()}: loading and sorting the initial particle information.")
pos = reader.read_snapshot("pos")[sort_indxs]
mass = reader.read_snapshot("mass")[sort_indxs]
del sort_indxs
collect()
@ -764,6 +765,8 @@ def process_initial_snapshot(nsim, simname):
with File(reader.output_snap, 'w') as f:
f.create_dataset("Coordinates", data=pos,
**hdf5plugin.Blosc(**BLOSC_KWARGS))
f.create_dataset("Masses", data=mass,
**hdf5plugin.Blosc(**BLOSC_KWARGS))
def process_initial_snapshot_csiborg2(nsim, simname):
@ -836,39 +839,6 @@ def process_initial_snapshot_csiborg2(nsim, simname):
**hdf5plugin.Blosc(**BLOSC_KWARGS))
###############################################################################
# Prepare CSiBORG1 RAMSES for SPH density field #
###############################################################################
def prepare_csiborg1_for_sph(nsim):
"""
Prepare a RAMSES snapshot for cosmotool SPH density & velocity field
calculation.
"""
reader = CSiBORG1Reader(nsim, "final")
print("------- Preparing CSiBORG1 for SPH -------")
print(f"Simulation index: {nsim}")
print(f"Output file: {reader.sph_file}")
print("-------------------------------------------------")
print(flush=True)
with File(reader.sph_file, 'w') as dest:
# We need to read pos first to get the dataset size
pos = reader.read_snapshot("pos")
dset = dest.create_dataset("particles", (len(pos), 7),
dtype=numpy.float32)
dset[:, :3] = pos
del pos
collect()
dset[:, 3:6] = reader.read_snapshot("vel")
dset[:, 6] = reader.read_snapshot("mass")
###############################################################################
# Command line interface #
###############################################################################
@ -883,8 +853,8 @@ if __name__ == "__main__":
"csiborg2_random", "csiborg2_varysmall"],
help="Simulation name.")
parser.add_argument("--mode", type=int, required=True,
choices=[0, 1, 2, 3],
help="0: process final snapshot, 1: process initial snapshot, 2: process both, 3: prepare CSiBORG1 for SPH.") # noqa
choices=[0, 1, 2],
help="0: process final snapshot, 1: process initial snapshot, 2: process both") # noqa
args = parser.parse_args()
if "csiborg2" in args.simname and args.mode in [0, 2]:
@ -897,8 +867,6 @@ if __name__ == "__main__":
process_final_snapshot(args.nsim, args.simname)
elif args.mode == 1:
process_initial_snapshot(args.nsim, args.simname)
elif args.mode == 2:
else:
process_final_snapshot(args.nsim, args.simname)
process_initial_snapshot(args.nsim, args.simname)
else:
prepare_csiborg1_for_sph(args.nsim)

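For reference, a minimal example of the Blosc-compressed HDF5 pattern used when writing the initial snapshot above; `BLOSC_KWARGS` is defined elsewhere in the repository and not shown in this diff, so generic settings are assumed here.

import h5py
import hdf5plugin
import numpy

blosc_kwargs = {"cname": "zstd", "clevel": 5, "shuffle": hdf5plugin.Blosc.SHUFFLE}
pos = numpy.random.random((1000, 3)).astype(numpy.float32)

with h5py.File("example_snapshot.hdf5", "w") as f:
    # The Blosc filter object unpacks into `create_dataset` compression kwargs.
    f.create_dataset("Coordinates", data=pos, **hdf5plugin.Blosc(**blosc_kwargs))

# Reading back only requires importing `hdf5plugin` so the filter is registered.
with h5py.File("example_snapshot.hdf5", "r") as f:
    assert numpy.allclose(f["Coordinates"][...], pos)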

@ -24,13 +24,13 @@ if __name__ == "__main__":
# simname = "csiborg2_varysmall"
# mode = 1
chains = [1] + [25 + n * 25 for n in range(19)]
simname = "csiborg2_random"
mode = 1
# chains = [1] + [25 + n * 25 for n in range(19)]
# simname = "csiborg2_random"
# mode = 1
# chains = [7444 + n * 24 for n in range(1, 101)]
# simname = "csiborg1"
# mode = 3
chains = [7444 + n * 24 for n in range(101)]
simname = "csiborg1"
mode = 2
env = "/mnt/zfsusers/rstiskalek/csiborgtools/venv_csiborg/bin/python"
memory = 64

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long