After compiling, go to the pipeline directory.
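
For example, from the top of the source tree (adjust the path if
your checkout keeps the pipeline scripts somewhere else):

  cd pipeline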

Edit the parameters at the top of prepareGadgetCatalog.py: decide
where to put the outputs, how many redshifts to process, how many
slices, subdivisions, subsamples, etc.
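
As an illustration only (the real variable names and defaults are
whatever sits at the top of prepareGadgetCatalog.py, and may differ
from these), the block you edit looks roughly like this:

  # illustrative names; check the script itself for the real ones
  catalogDir    = "/path/to/output"      # where to put the outputs
  redshifts     = ["0.0", "0.5", "1.0"]  # snapshots to process
  numSlices     = 4                      # slices of the box
  numSubvolumes = 2                      # subdivisions per slice
  subSamples    = [1.0, 0.1, 0.01]       # subsampling fractions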

Note that prepareGadgetCatalog will eventually be replaced by the
more general and flexible prepareCatalogs.

prepareGadgetCatalog will produce a pipeline script for each
subsampling you choose. If you have multiple redshift particle
files and choose multiple slices and/or subdivisions, they will
all be packaged in the same pipeline script.

Run "./generateCatalog.py [name of pipeline script]" for each script
written by prepareGadgetCatalog. This will run generateMock, zobov,
and pruneVoids. At the end of it, you should have a void catalog for
each redshift, slice, and subdivision.
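
For example, if prepareGadgetCatalog wrote a script called
mySim_ss0.1.py (the actual name depends on your settings), you
would run:

  ./generateCatalog.py mySim_ss0.1.py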

Check the logfiles for any error messages.
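
For instance, assuming the logs end up next to your outputs with a
.out extension (adjust the path and pattern to wherever your run
actually writes them):

  grep -i error /path/to/output/*.out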

See the README of the public void catalog for the format of the
outputs.

I'm also working on incorporating plotting into the pipeline script,
so that you can immediately get some basic info about the voids.

Please do not change the outputs of pruneVoids etc. without
discussion, since further analysis relies on the current formats.

If you're wondering why these scripts are rather complex, it's
because they also support A-P (Alcock-Paczynski) analysis, which
is much more complicated :)

We can talk about ways to incorporate your analysis into the
pipeline and to bring your tools under this umbrella.

Good luck!