csiborgtools/scripts/pre_fithalos.py

# Copyright (C) 2022 Richard Stiskalek
# This program is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation; either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful, but
# WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
A script to fit halos (concentration, ...). The particle array of each CSiBORG
realisation must have been split in advance by `runsplit_halos`.
"""
from datetime import datetime
import numpy
from mpi4py import MPI
try:
    import csiborgtools
except ModuleNotFoundError:
    import sys
    sys.path.append("../")
    import csiborgtools
# Get MPI things
comm = MPI.COMM_WORLD
rank = comm.Get_rank()
nproc = comm.Get_size()
paths = csiborgtools.read.CSiBORGPaths(**csiborgtools.paths_glamdring)
partreader = csiborgtools.read.ParticleReader(paths)
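# Properties (and dtypes) collected for every clump; intended for combining the
# per-split outputs into a single halo catalogue (see the commented-out
# `combine_splits` call at the bottom of the script).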
cols_collect = [("npart", numpy.int64), ("totpartmass", numpy.float64),
                ("Rs", numpy.float64), ("vx", numpy.float64),
                ("vy", numpy.float64), ("vz", numpy.float64),
                ("Lx", numpy.float64), ("Ly", numpy.float64),
                ("Lz", numpy.float64), ("rho0", numpy.float64),
                ("conc", numpy.float64), ("rmin", numpy.float64),
                ("rmax", numpy.float64), ("r200", numpy.float64),
                ("r500", numpy.float64), ("m200", numpy.float64),
                ("m500", numpy.float64), ("lambda200c", numpy.float64)]
def fit_clump(particles, clump, box):
    """Fit a single clump and return its properties as a dictionary."""
    out = {}
    out["npart"] = clump.Npart
    out["rmin"] = clump.rmin
    out["rmax"] = clump.rmax
    out["totpartmass"] = clump.total_particle_mass
    # Mass-weighted mean velocity of the clump's particles
    out["vx"] = numpy.average(clump.vel[:, 0], weights=clump.m)
    out["vy"] = numpy.average(clump.vel[:, 1], weights=clump.m)
    out["vz"] = numpy.average(clump.vel[:, 2], weights=clump.m)
    out["Lx"], out["Ly"], out["Lz"] = clump.angular_momentum
    return out
for i, nsim in enumerate(paths.get_ics(tonew=False)):
    if rank == 0:
        print("{}: calculating {}th simulation `{}`."
              .format(datetime.now(), i, nsim), flush=True)
    nsnap = max(paths.get_snapshots(nsim))
    box = csiborgtools.read.BoxUnits(nsnap, nsim, paths)
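    # Unit conversions for this snapshot; provides e.g. the critical density
    # and G in box units used by `Clump.from_arrays` below.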
    # Archive of clump particle arrays; keys are the clump IDs.
    # NOTE: assumes `split_path` points to a `.npz` archive -- the `.files`
    # attribute and string-key access below require a loaded archive, not a
    # bare path.
    particle_archive = numpy.load(paths.split_path(nsnap, nsim))
    clumpsarr = partreader.read_clumps(nsnap, nsim,
                                       cols=["index", "x", "y", "z"])
    clumpid2arrpos = {ind: ii for ii, ind in enumerate(clumpsarr["index"])}
    nclumps = len(particle_archive.files)
    # Fit 5000 clumps at a time, then dump results
    batchsize = 5000
    # This rank does these `batchsize` clumps/halos
    jobs = csiborgtools.utils.split_jobs(nclumps, nclumps // batchsize)[rank]
    for clumpid in jobs:
        # Placeholder: collection of the fitted properties is not implemented.
        ... = fit_clump(particle_archive[str(clumpid)],
                        clumpsarr[clumpid2arrpos[clumpid]], box)
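    # Current working path: each rank loads whole particle splits and fits
    # every clump found in them.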
    jobs = csiborgtools.utils.split_jobs(nclumps, nproc)[rank]
    for nsplit in jobs:
        parts, part_clumps, clumps = csiborgtools.fits.load_split_particles(
            nsplit, nsnap, nsim, paths, remove_split=False)
        N = clumps.size
        cols = [("index", numpy.int64), ("npart", numpy.int64),
                ("totpartmass", numpy.float64), ("Rs", numpy.float64),
                ("rho0", numpy.float64), ("conc", numpy.float64),
                ("lambda200c", numpy.float64), ("vx", numpy.float64),
                ("vy", numpy.float64), ("vz", numpy.float64),
                ("Lx", numpy.float64), ("Ly", numpy.float64),
                ("Lz", numpy.float64), ("rmin", numpy.float64),
                ("rmax", numpy.float64), ("r200", numpy.float64),
                ("r500", numpy.float64), ("m200", numpy.float64),
                ("m500", numpy.float64)]
        out = csiborgtools.utils.cols_to_structured(N, cols)
        out["index"] = clumps["index"]
        for n in range(N):
            # Pick clump and its particles
            xs = csiborgtools.fits.pick_single_clump(n, parts, part_clumps,
                                                     clumps)
            clump = csiborgtools.fits.Clump.from_arrays(
                *xs, rhoc=box.box_rhoc, G=box.box_G)
            out["npart"][n] = clump.Npart
            out["rmin"][n] = clump.rmin
            out["rmax"][n] = clump.rmax
            out["totpartmass"][n] = clump.total_particle_mass
            out["vx"][n] = numpy.average(clump.vel[:, 0], weights=clump.m)
            out["vy"][n] = numpy.average(clump.vel[:, 1], weights=clump.m)
            out["vz"][n] = numpy.average(clump.vel[:, 2], weights=clump.m)
            out["Lx"][n], out["Ly"][n], out["Lz"][n] = clump.angular_momentum
            # Spherical overdensity radii and masses
            rs, ms = clump.spherical_overdensity_mass([200, 500])
            out["r200"][n] = rs[0]
            out["r500"][n] = rs[1]
            out["m200"][n] = ms[0]
            out["m500"][n] = ms[1]
out["lambda200c"][n] = clump.lambda200c
            # NFW profile fit
            if clump.Npart > 10 and numpy.isfinite(out["r200"][n]):
                nfwpost = csiborgtools.fits.NFWPosterior(clump)
                logRs, __ = nfwpost.maxpost_logRs()
                Rs = 10**logRs
                if not numpy.isnan(logRs):
                    out["Rs"][n] = Rs
                    out["rho0"][n] = nfwpost.rho0_from_Rs(Rs)
                    out["conc"][n] = out["r200"][n] / Rs
        csiborgtools.read.dump_split(out, nsplit, nsnap, nsim, paths)
    # Wait until all jobs finished before moving to another simulation
    comm.Barrier()
# # Use the rank 0 to combine outputs for this CSiBORG realisation
# if rank == 0:
# print("Collecting results!")
# partreader = csiborgtools.read.ParticleReader(paths)
# out_collected = csiborgtools.read.combine_splits(
# utils.Nsplits, nsnap, nsim, partreader, cols_collect,
# remove_splits=True, verbose=False)
# fname = paths.hcat_path(nsim)
# print("Saving results to `{}`.".format(fname))
# numpy.save(fname, out_collected)
#
# comm.Barrier()
#
# if rank == 0:
# print("All finished! See ya!")