# PineTree

A continuation of Tom Charnock's neural bias project, the Neural Physical Engine (NPE). For the original publication, see https://arxiv.org/abs/1909.06379. The model has since been extended in the recent publication https://arxiv.org/abs/2407.01391.
This repository implements the physics-informed, generative neural network for halo bias modelling and halo mock production from (approximate) dark matter overdensity fields. The network architecture consists of two parts: a convolutional network with symmetric kernels based on the multipole expansion, which reduces the number of independent weights in the kernel, and a log-normal Gaussian mixture density network that emulates the conditional halo mass function. For more details and results obtained with this model, please see the references above.
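To give an intuition for the symmetric kernels, here is a minimal numpy sketch (not the repository's actual code) of the monopole-like weight sharing: all voxels of a 3×3×3 kernel at the same distance from the centre share one weight, reducing the 27 free parameters to 4.

```python
import numpy as np

# Illustrative sketch only: build a 3x3x3 convolution kernel whose weights
# depend solely on the (squared) distance of each voxel from the centre.
offsets = np.array(np.meshgrid([-1, 0, 1], [-1, 0, 1], [-1, 0, 1],
                               indexing="ij")).reshape(3, -1).T
dist2 = (offsets ** 2).sum(axis=1)   # squared distance: 0, 1, 2 or 3
shells = np.unique(dist2)            # the distinct distance shells
weights = np.random.default_rng(0).normal(size=len(shells))

kernel = np.zeros(27)
for w, s in zip(weights, shells):
    kernel[dist2 == s] = w           # one shared weight per shell
kernel = kernel.reshape(3, 3, 3)

print(len(shells))  # -> 4 independent weights instead of 27
```

By construction the kernel is invariant under reflections and axis permutations, which is the symmetry the multipole expansion exploits.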
The main code is located in the npe/ folder, while the scripts/ and notes/ folders contain a (so far) unsorted collection of scripts and Jupyter notebooks that serve as examples of how to use the code. An example training script is provided at the end of this README.
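As a rough illustration of the log-normal Gaussian mixture mentioned above, the following hypothetical numpy sketch (not the repository's implementation) draws samples from a fixed two-component log-normal mixture; in the real model, the mixture parameters are predicted per voxel from the dark matter overdensity.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fixed, illustrative mixture parameters.
weights = np.array([0.7, 0.3])   # mixing coefficients, sum to 1
means = np.array([0.0, 1.0])     # means of log(x) per component
sigmas = np.array([0.5, 0.3])    # std devs of log(x) per component

n_samples = 10_000
component = rng.choice(2, size=n_samples, p=weights)
samples = np.exp(rng.normal(means[component], sigmas[component]))

print(samples.min() > 0.0)  # True: log-normal draws are strictly positive
```

Positivity is one reason a log-normal mixture is a natural choice for a conditional distribution over halo properties.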
For questions or bug reports, please open an issue or e-mail me at simon.ding@iap.fr
## Installation
Create the conda environment (this can be slow) by running

```shell
conda env create -f environment.yml
```

Then install the repository as a package via

```shell
conda activate npe
cd npe/
pip install -e .
```
This assumes that cd npe/ takes you to the root of the repository.
(Currently not working!) Test that everything runs by executing

```shell
pytest test
```
### Install the optional dependency bias-bench
The benchmarking and plotting routines in this repository rely on the bias-bench package. It should be installed into the same conda environment following the instructions here.
## Troubleshooting
It can happen that the conda install messes up some dependencies. In this case, the environment can be set up manually using the following steps:
Create a new conda environment with Python 3 by running

```shell
conda create --name npe_env 'python>=3.9'
```
Then install the core dependencies of this package using

```shell
conda install numpy pandas pyaml tqdm notebook 'h5py>=3.9.0' 'matplotlib>=3.6'
```
If you don't want to run JAX with GPU support, there is no need to install it separately, since it comes as a dependency of the flax package. For JAX with GPU support, please check the installation instructions at https://github.com/google/jax#installation. It is best to install flax after having installed jax. Be aware that a working CUDA installation and a matching jaxlib are required.
A known-working installation order is to begin with the conda dependencies

```shell
conda install pandas matplotlib pyaml tqdm h5py
```

Then install jaxlib, jax and flax:

```shell
pip install --upgrade pip
conda install jaxlib
```

and, depending on your CUDA version,

```shell
# CUDA 12 installation
pip install --upgrade "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
# CUDA 11 installation
pip install --upgrade "jax[cuda11_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
```

followed by

```shell
pip install flax
```
Finally, install the JAX variant of TensorFlow Probability via

```shell
pip install -Uq tfp-nightly[jax] > /dev/null
```
It is important to install the JAX variant of TensorFlow Probability, since it is independent of tensorflow. For more information, see https://www.tensorflow.org/probability/examples/TensorFlow_Probability_on_JAX.
In order to run the unit tests, please install pytest using

```shell
conda install pytest
```
## Extra: Configure git for Jupyter notebooks
In order to avoid checking in all outputs from Jupyter notebooks, one can configure a git filter via

```shell
git config filter.strip-notebook-output.clean 'jupyter nbconvert --clear-output --inplace --stdin --stdout --log-level=ERROR'
```

The .gitattributes file will then use this filter whenever a notebook is added via git. This operation does not affect the local notebook state.
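For reference, a clean filter named `strip-notebook-output` is typically wired up in .gitattributes through a line of this shape (check the repository's actual .gitattributes file for the exact contents):

```
*.ipynb filter=strip-notebook-output
```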
## Example use case
A minimal training example is provided by scripts/train_pinetree.py, with the necessary files in the example/ folder. It can be run via

```shell
python scripts/train_pinetree.py --model_config example/model_config.yml --iteration_count 10
```