KADATH SPECTRAL SOLVER

The Frankfurt University/Kadath (FUKA) Initial Data solver branch is a collection of ID solvers aimed at delivering consistent initial data (ID) solutions to the eXtended Conformal Thin-Sandwich (XCTS) formulation of Einstein's field
equations for a variety of compact-object configurations, including extremely compact, asymmetric, and mixed-spin binaries. The additions to and modifications of the base Kadath library made in support of these ID solvers are the following:
  • Memory optimizations (which inspired portions of the optimization branch) enabling an ~8x speed-up of the binary solvers
  • Modification/addition of numerical spaces for the BH, BBH, BNS, and BHNS
  • Addition of an equation of state infrastructure utilizing Margherita standalone to handle tabulated and polytropic EOS
  • Addition of the Configurator framework to enable extensibility of solvers by managing controls, stages, and key variables
  • Addition of exporters for all the previously mentioned ID types to minimize the effort to import ID into an evolution scheme
These changes have been implemented by J. Papenfort and S. Tootle, with input from P. Grandclément and E. Most.

Getting the sources

The sources are hosted on a git server and are obtained with the following command:
git clone --branch fuka https://gitlab.obspm.fr/grandcle/Kadath.git
This will create a Kadath directory on the local computer.
Please note that users are currently NOT allowed to push changes to Kadath.

Compiling the library using CMake

  • Set the following environment variables in your ~/.bashrc file according to your setup (see the example snippet at the end of this section):
    • HOME_KADATH
    • KAD_CC - e.g. gcc
    • KAD_CXX - e.g. g++
    • KAD_NUMC - e.g. 7
      Note: this is the number of parallel build jobs to use when building Kadath (e.g. make -j $KAD_NUMC)
  • Go to the Kadath repository.
  • Go to build_release.
  • Create a build directory and enter it
  • Invoke cmake (options) .. (the CMakeLists.txt file is in ..)
  • The main cmake options are the following (the value in parentheses is the default):
    • -DPAR_VERSION=On/Off (On)
      Set to On to build the MPI parallel version of the library. The initial data codes within this branch are only designed for use with MPI.
    • -DCMAKE_BUILD_TYPE
      Specifies the build type (essentially Release or Debug)
    • -DMPI_CXX_COMPILER
      Path to the MPI C++ wrapper (when not automatically detected by cmake)
    • -DMPI_C_COMPILER
      Path to the MPI C wrapper (when not automatically detected by cmake)
  • Example using GNU+mpi compilers:
    cmake -DCMAKE_BUILD_TYPE=Release -DPAR_VERSION=On -DMPI_CXX_COMPILER=mpic++ -DMPI_C_COMPILER=mpicc ..
  • If cmake cannot find some of the required external libraries, their locations must be passed manually through the Kadath/Cmake/CMakeLocal.cmake file (the fftw and scalapack libraries usually have to be provided that way). Some templates for that file are provided in this directory.
  • Once cmake has been successfully invoked, use make -j $KAD_NUMC to start the compilation.
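For reference, a minimal ~/.bashrc snippet could look as follows; the HOME_KADATH path shown here is only an illustrative assumption and should point to your actual Kadath clone:
    export HOME_KADATH=$HOME/Kadath   # path to the cloned Kadath directory (adapt to your setup)
    export KAD_CC=gcc                 # C compiler
    export KAD_CXX=g++                # C++ compiler
    export KAD_NUMC=7                 # number of parallel build jobs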
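Putting the steps above together, a typical build might then look like this (a sketch assuming HOME_KADATH points to the Kadath repository and the MPI wrappers are named mpicc/mpic++):
    cd $HOME_KADATH/build_release
    mkdir build && cd build
    cmake -DCMAKE_BUILD_TYPE=Release -DPAR_VERSION=On -DMPI_CXX_COMPILER=mpic++ -DMPI_C_COMPILER=mpicc ..
    make -j $KAD_NUMC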

Compiling the library with the compile script

A script named compile is also provided to facilitate the installation process. As long as the above environment variables are set and the required libraries are found, no additional input is necessary.
Run the compile script from within build_release to build the library.
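For example (a minimal sketch, assuming the script is marked executable; otherwise it can be run through bash):
    cd $HOME_KADATH/build_release
    ./compile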

Generating executables

All of the initial data solvers can be found under the codes directory:
  • codes/BH
  • codes/NS
  • codes/BBH
  • codes/BNS
  • codes/BHNS
These codes are set up to be compiled with CMake. To do so manually:
  • Within a given ID folder (e.g. codes/BH), create a build directory and move into it
  • Run the same cmake command used above, e.g.:
    cmake -DCMAKE_BUILD_TYPE=Release -DPAR_VERSION=On -DMPI_CXX_COMPILER=mpic++ -DMPI_C_COMPILER=mpicc ..
  • Run make -j $KAD_NUMC
  • Your executable(s) should be in ../bin/Release/
Alternatively, one can copy $HOME_KADATH/build_release/compile, or create a symbolic link to it, in a given ID folder (e.g. codes/BH/).
Executing the compile script in the ID folder will then create the build directory and build the codes automatically, as in the sketch below.
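As a concrete sketch of both approaches for the BH solver (the other ID codes follow the same pattern; paths assume HOME_KADATH is set as above):
    # manual build
    cd $HOME_KADATH/codes/BH
    mkdir build && cd build
    cmake -DCMAKE_BUILD_TYPE=Release -DPAR_VERSION=On -DMPI_CXX_COMPILER=mpic++ -DMPI_C_COMPILER=mpicc ..
    make -j $KAD_NUMC
    # the resulting executable(s) should then be in ../bin/Release/

    # using the compile script instead
    cd $HOME_KADATH/codes/BH
    ln -s $HOME_KADATH/build_release/compile .
    ./compile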