{
"info": {
"author": "Manodeep Sinha",
"author_email": "manodeep@gmail.com",
"bugtrack_url": null,
"classifiers": [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"Intended Audience :: Science/Research",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Operating System :: POSIX",
"Programming Language :: C",
"Programming Language :: Python",
"Programming Language :: Python :: 2",
"Programming Language :: Python :: 2.7",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6"
],
"description": "|logo| \n\n|Release| |PyPI| |MIT licensed| |ASCL| |Travis Build| |Issues| |RTD|\n\nDescription\n===========\n\nThis repo contains a set of codes to calculate correlation functions and \nother clustering statistics in a cosmological box (co-moving XYZ)\nor on a mock (RA, DEC, CZ). Read the documentation on `corrfunc.rtfd.io `_. \n\nWhy Should You Use It\n======================\n\n1. **Fast** Theory pair-counting is **7x** faster than ``SciPy cKDTree``, and at least **2x** faster than all existing public codes.\n2. **OpenMP Parallel** All pair-counting codes can run in parallel (with strong scaling efficiency >~ 95% up to 10 cores).\n3. **Python Extensions** Python extensions allow you to do the compute-heavy bits using C while retaining all of the user-friendliness of python. \n4. **Weights** All correlation functions now support *arbitrary, user-specified* weights for individual points.\n5. **Modular** The code is written in a modular fashion and is easily extensible to compute arbitrary clustering statistics. \n6. **Future-proof** As I get access to newer instruction-sets, the codes will get updated to use the latest and greatest CPU features. \n\n*If you use the codes for your analysis, please star this repo -- that helps us keep track of the number of users.*\n\nBenchmark against Existing Codes\n================================\n\nPlease see this\n`gist `__ for\nsome benchmarks with current codes. If you have a pair-counter that you would like to compare, please add a corresponding function and update the timings. \n\nInstallation\n============\n\nPre-requisites\n--------------\n\n1. ``make >= 3.80``\n2. An OpenMP-capable compiler like ``icc``, ``gcc>=4.6`` or ``clang >= 3.7``. If\n not available, please disable the ``USE_OMP`` option in\n ``theory.options`` and ``mocks.options``. 
You might need to ask your\n sys-admin for system-wide installs of the compiler; if you prefer to\n install your own, then ``conda install gcc`` (MAC/linux) or\n ``(sudo) port install gcc5`` (on MAC) should work. \n3. ``gsl >= 2.4``. Use either\n ``conda install -c conda-forge gsl``\n (MAC/linux) or ``(sudo) port install gsl`` (MAC) to install ``gsl``\n if necessary.\n4. ``python >= 2.7`` or ``python>=3.4`` for compiling the C extensions.\n5. ``numpy>=1.7`` for compiling the C extensions.\n\nPreferred Install Method\n-------------------------\n\n::\n\n $ git clone https://github.com/manodeep/Corrfunc/\n $ make \n $ make install\n $ python setup.py install (--user)\n $ make tests \n\nAssuming you have ``gcc`` in your ``PATH``, ``make`` and\n``make install`` should compile and install the C libraries + python\nextensions within the source directory. If you would like to install the\npython C extensions in your environment, then\n``python setup.py install (--user)`` should be sufficient. If you are primarily\ninterested in the ``python`` interface, you can condense all of the steps\nby using ``python setup.py install CC=yourcompiler (--user)`` after ``git clone``.\n\nCompilation Notes\n------------------\n\n- If python and/or numpy are not available, then the C extensions will not be compiled.\n\n- ``make install`` simply copies files into the ``lib/bin/include`` sub-directories. You do not need ``root`` permissions.\n\n- The default compiler on MAC is set to ``clang``; if you want to specify a different compiler, you will have to call ``make CC=yourcompiler``, ``make install CC=yourcompiler``, ``make tests CC=yourcompiler`` etc. 
If you want to permanently change the default compiler, then please edit the `common.mk `__ file in the base directory.\n\n- If you are directly using ``python setup.py install CC=yourcompiler (--user)``, please run a ``make distclean`` beforehand (especially if switching compilers).\n\n\nAlternate Install Method\n-------------------------\n\nThe python package is directly installable via ``pip install Corrfunc``. However, in that case you will lose the ability to recompile the code according to your needs. Installing via ``pip`` is **not** recommended; please open an install issue on this repo first, since doing so helps improve the code-base and saves future users from running into similar install issues. \n\nInstallation notes\n------------------\n\nIf compilation went smoothly, please run ``make tests`` to ensure the\ncode is working correctly. Depending on the hardware and compilation\noptions, the tests might take more than a few minutes. *Note that the\ntests are exhaustive and not traditional unit tests*.\n\nWhile I have tried to ensure that the package compiles and runs out of\nthe box, cross-platform compatibility turns out to be incredibly hard.\nIf you run into any issues during compilation and you have all of the\npre-requisites, please see the `FAQ `__ or `email\nthe Corrfunc mailing list `__. Also, feel free to create a new issue\nwith the ``Installation`` label.\n\nClustering Measures on a Cosmological box\n-----------------------------------------\n\nAll codes that work on cosmological boxes with co-moving positions are\nlocated in the ``theory`` directory. The various clustering measures\nare:\n\n1. ``DD`` -- Measures auto/cross-correlations between two boxes.\n The boxes do not need to be cubes.\n\n2. ``xi`` -- Measures the 3-d auto-correlation in a cubic cosmological box.\n Assumes PERIODIC boundary conditions.\n\n3. ``wp`` -- Measures the 2-d projected auto-correlation function in a\n cubic cosmological box. Assumes PERIODIC boundary conditions.\n\n4. 
``DDrppi`` -- Measures the auto/cross correlation function between\n two boxes, with pair counts binned in `(rp, pi)`. The boxes do not need to be cubes.\n\n5. ``DDsmu`` -- Measures the auto/cross correlation function between\n two boxes, with pair counts binned in `(s, mu)`. The boxes do not need to be cubes.\n\n6. ``vpf`` -- Measures the void probability function + counts-in-cells.\n\nClustering measures on a Mock\n-----------------------------\n\nAll codes that work on mock catalogs (RA, DEC, CZ) are located in the\n``mocks`` directory. The various clustering measures are:\n\n1. ``DDrppi_mocks`` -- The standard auto/cross correlation between two data\n sets. The outputs, DD, DR and RR, can be combined using ``wprp`` to\n produce the Landy-Szalay estimator for `wp(rp)`.\n\n2. ``DDsmu_mocks`` -- The standard auto/cross correlation between two data\n sets. The outputs, DD, DR and RR, can be combined using the python utility \n ``convert_3d_counts_to_cf`` to produce the Landy-Szalay estimator for `xi(s, mu)`.\n \n3. ``DDtheta_mocks`` -- Computes the angular correlation function between two data\n sets. The outputs from ``DDtheta_mocks`` need to be combined with\n ``wtheta`` to get the full `\omega(\theta)`.\n\n4. ``vpf_mocks`` -- Computes the void probability function on mocks.\n\nScience options\n===============\n\nIf you plan to use the command-line, then you will have to specify the\ncode runtime options at compile-time. For theory routines, these options\nare in the file `theory.options `__ while for the mocks, these options are\nin the file `mocks.options `__.\n\n**Note** All options can be specified at \nruntime if you use the python interface or the static libraries. Each of\nthe following ``Makefile`` options has a corresponding entry for the runtime\nlibraries. \n\nTheory (in `theory.options `__)\n-------------------------------------------------\n\n1. ``PERIODIC`` (ignored in case of wp/xi) -- switches periodic boundary\n conditions on/off. Enabled by default.\n\n2. ``OUTPUT_RPAVG`` -- switches on output of ``
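To make concrete what the pair-counting routines above tally, here is a deliberately naive, pure-NumPy sketch of a periodic pair count in radial bins. The helper name ``brute_force_dd`` is hypothetical and this is *not* Corrfunc's API; Corrfunc performs the same counting in optimized, cell-partitioned C.

```python
import numpy as np

def brute_force_dd(pos, bin_edges, boxsize=None):
    """Count unique pairs per radial separation bin.

    If ``boxsize`` is given, apply the minimum-image convention,
    i.e. periodic boundary conditions (cf. the PERIODIC option).
    """
    # All pairwise separation vectors, shape (N, N, 3).
    diff = pos[:, None, :] - pos[None, :, :]
    if boxsize is not None:
        # Wrap each component to the nearest periodic image.
        diff -= boxsize * np.round(diff / boxsize)
    r = np.sqrt((diff ** 2).sum(axis=-1))
    # Keep each pair once (upper triangle, excluding the diagonal).
    iu = np.triu_indices(len(pos), k=1)
    counts, _ = np.histogram(r[iu], bins=bin_edges)
    return counts

rng = np.random.default_rng(42)
pos = rng.uniform(0.0, 100.0, size=(500, 3))   # 500 random points in a 100^3 box
counts = brute_force_dd(pos, np.linspace(5.0, 25.0, 5), boxsize=100.0)
```

This O(N^2) approach is exactly what Corrfunc's gridded C kernels are built to avoid; it is shown only to pin down what a "pair count per bin" means.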