Merged
107 changes: 0 additions & 107 deletions .github/workflows/test-spack.yml

This file was deleted.

1 change: 0 additions & 1 deletion MANIFEST.in
@@ -14,7 +14,6 @@ graft mechanisms
graft modcc
graft python
graft scripts
# graft spack
graft sup
graft test
# graft validation
1 change: 0 additions & 1 deletion README.md
@@ -1,5 +1,4 @@
[![ci](https://github.com/arbor-sim/arbor/actions/workflows/test-matrix.yml/badge.svg)](https://github.com/arbor-sim/arbor/actions/workflows/test-matrix.yml)
[![spack](https://github.com/arbor-sim/arbor/actions/workflows/test-spack.yml/badge.svg)](https://github.com/arbor-sim/arbor/actions/workflows/test-spack.yml)
[![pip](https://github.com/arbor-sim/arbor/actions/workflows/test-pip.yml/badge.svg)](https://github.com/arbor-sim/arbor/actions/workflows/test-pip.yml)
[![pythonwheels](https://github.com/arbor-sim/arbor/actions/workflows/build-pip-wheels.yml/badge.svg)](https://github.com/arbor-sim/arbor/actions/workflows/build-pip-wheels.yml)
[![gitpod](https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/arbor-sim/arbor)
17 changes: 5 additions & 12 deletions doc/contrib/dependency-management.rst
@@ -12,8 +12,6 @@ Arbor relies on a (small) number of dependencies. We can distinguish three kinds

Note that the actual dependencies of your build configuration may vary.

In addition, ``spack/package.py`` contains a copy of the Spack package definition `upstream <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/arbor/package.py>`_. Here instructions for both in-repo and configure-time dependencies are defined.

This document contains rules for when and how to update dependencies and what to be mindful of when doing so.

List of dependencies
@@ -44,27 +42,22 @@ are essential. These environments must be able to build Arbor without issue, if
Also, build instructions for each of them must be given in the documentation.

* Ubuntu LTS-latest
* Ubuntu LTS-latest-1
* Ubuntu LTS-latest -1
* MacOS-latest
* MacOS-latest-1
* Cray programming environment on Piz Daint
* Programming environment on Juwels Booster (**todo** CI at JSC)
* MacOS-latest -1
* Programming environment on Juwels Booster and Jupiter
* Github Action venvs, see `list <https://github.com/actions/virtual-environments>`_.
* Manylinux containers. For compatibility of manylinux tags, see `here <https://github.com/pypa/manylinux#readme>`_.
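For a quick local check related to the manylinux tags above, the standard library can report the platform string of the running interpreter; this is a simplification of the full tag-matching done by ``pip``/``packaging``, and the helper name here is hypothetical:

```python
import sysconfig

def local_platform_tag() -> str:
    """Return the normalized platform tag (e.g. 'linux_x86_64')
    in the form used by wheel filenames such as manylinux builds."""
    return sysconfig.get_platform().replace("-", "_").replace(".", "_")

print(local_platform_tag())
```

On a typical x86-64 Linux box this prints something like ``linux_x86_64``; a manylinux wheel additionally encodes the glibc baseline (e.g. ``manylinux_2_17_x86_64``), which this sketch does not capture.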

Dependency update rules
-----------------------

#. ``doc/dependencies.csv``, git submodules and ``spack/package.py`` shall be in sync.
#. ``doc/dependencies.csv``, CPM installs and the ``spack`` package shall be in sync.
#. Dependencies shall be set to a (commit hash corresponding to a) specific version tag. (All current dependencies use semver.)
#. The version shall be compatible with the user platforms (see above).
#. The version shall be compatible with the requirements in ``doc/dependencies.csv``.
#. The version shall be the lowest possible to facilitate the building of complex environments.
#. The submodule shall be set to the highest version provided by the latest Spack release ("Spack stable"). Spack CI tests both Spack stable and develop.
#. Moreover, dependencies shall not be updated past the most recent version of the dependency in Spack.

* This prevents Spack builds from pulling in ``master``, when a more recent version than available is required. `See here <https://spack.readthedocs.io/en/latest/packaging_guide.html#version-comparison>`_.
* This is a manual check, e.g. for pybind: `see pybind package.py <https://github.com/spack/spack/blob/develop/var/spack/repos/builtin/packages/py-pybind11/package.py>`_
#. The version shall be compatible with those available in the latest Spack release ("Spack stable").
#. Updating shall remain a manual process; an update may require nontrivial changes to Arbor and to Spack upstream (e.g. a PR for a pybind update).
#. A dependency update shall have a separate PR, and such a PR updates a single dependency at a time, unless the dependency update requires other dependencies to be updated.
#. This PR requires review by at least two reviewers.
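Rule 1 above (keeping ``doc/dependencies.csv`` and the CPM pins in sync) can be partially automated. A minimal sketch of such a consistency check, assuming a simplified two-column ``name,version`` CSV layout and a hand-maintained mapping of CPM-pinned versions; both the file excerpt and the mapping here are hypothetical:

```python
import csv
import io

# Hypothetical excerpt of doc/dependencies.csv (name, pinned version).
DEPENDENCIES_CSV = """name,version
pybind11,2.10.1
fmt,9.1.0
"""

# Hypothetical versions pinned via CPM in the CMake build.
CPM_PINNED = {"pybind11": "2.10.1", "fmt": "9.1.0"}

def out_of_sync(csv_text, pinned):
    """Return the names whose CSV version disagrees with the CPM pin."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["name"] for row in reader
            if pinned.get(row["name"]) != row["version"]]

print(out_of_sync(DEPENDENCIES_CSV, CPM_PINNED))  # → []
```

A check like this could run in CI and fail the dependency-update PR whenever the two sources drift apart.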
1 change: 0 additions & 1 deletion doc/contrib/release.rst
@@ -104,7 +104,6 @@ Post Release

#. Make a new PR setting ``VERSION`` to the next version with a trailing ``-dev``. E.g. if you just released ``3.14.15``, change ``VERSION`` to ``3.15.16-dev``. Make sure the number portion always consists of a triple; shorter versions are uninstallable by Spack (``spack install arbor@0.8`` will install v0.8.1, because anything shorter than a triple is interpreted as a version range).

- Update ``spack/package.py``. The checksum of the targz is the sha256sum.
- Include changes such as those to ``CITATIONS`` and ``doc/index.rst`` in the post-release PR. Copy the Zenodo BibTeX export to ``CITATIONS``.

#. Update ``scripts/check-all-tags.sh`` to check the current tag.
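The triple-plus-``-dev`` convention above is easy to get wrong by hand. A small sketch (function name hypothetical) of a check that accepts only full ``x.y.z`` triples, optionally suffixed with ``-dev``, and rejects the short forms Spack would treat as version ranges:

```python
import re

# A full x.y.z triple, optionally followed by '-dev'; shorter forms
# (e.g. '0.8') are rejected because Spack interprets them as ranges.
VERSION_RE = re.compile(r"\d+\.\d+\.\d+(-dev)?")

def is_valid_version(v):
    """Return True iff v is a complete version triple, with optional -dev."""
    return VERSION_RE.fullmatch(v) is not None

print(is_valid_version("3.15.16-dev"))  # → True
print(is_valid_version("0.8"))          # → False
```

Such a check could be wired into the release scripts so a malformed ``VERSION`` is caught before tagging.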
3 changes: 0 additions & 3 deletions doc/index.rst
@@ -6,9 +6,6 @@ Arbor
.. |ci| image:: https://github.com/arbor-sim/arbor/actions/workflows/test-matrix.yml/badge.svg
:target: https://github.com/arbor-sim/arbor/actions/workflows/test-matrix.yml

.. |spack| image:: https://github.com/arbor-sim/arbor/actions/workflows/test-spack.yml/badge.svg
:target: https://github.com/arbor-sim/arbor/actions/workflows/test-spack.yml

.. |pip| image:: https://github.com/arbor-sim/arbor/actions/workflows/test-pip.yml/badge.svg
:target: https://github.com/arbor-sim/arbor/actions/workflows/test-pip.yml

70 changes: 50 additions & 20 deletions doc/install/build_install.rst
@@ -645,22 +645,53 @@ Heterogeneous systems
Some HPC clusters offer different types of nodes, with different hardware and
where some may have GPUs. In order for the compilers to correctly target the
intended hardware and link to the appropriate libraries, it may be necessary to
load a top-level module for cross-compiling. For example, on the hybrid Piz
Daint system, one would execute:
load a top-level module for cross-compiling. For example, on the hybrid Juwels
Booster system, one would execute:

.. code-block:: bash

module load daint-gpu
module load Stages/2026 CUDA

This loads the required dependencies for the GPU node architecture.

Bull / Atos Systems
-------------------

We need to load implementations of MPI and some build tools:

.. code-block:: bash

module load OpenMPI GCC CMake

and if the Python interface is desired

.. code-block:: bash

module load Python SciPy-bundle mpi4py

Then, in a fresh build directory, we can configure the build. A common
selection of options might be:

.. code-block:: bash

# use SIMD and select a production build; enable MPI; enable GPU
# support for A100 and H100; enable the Python interface
cmake .. -DARB_VECTORIZE=ON -DCMAKE_BUILD_TYPE=release \
   -DARB_WITH_MPI=ON \
   -DARB_GPU=cuda -DCMAKE_CUDA_ARCHITECTURES="80;90" \
   -DARB_WITH_PYTHON=ON -DARB_BUILD_PYTHON_STUBS=OFF

If GPU support is not required, the ``-DARB_GPU`` line can be omitted; MPI or
Python support can similarly be disabled by setting the relevant options to
``OFF``. These options should be correct on JSC systems like JUWELS and
JUPITER. Alternatively, ParaStation MPI can be used, likewise available as a
module.


Cray systems
------------

The compiler used by the MPI wrappers is set using a "programming environment" module.
The first thing to do is change this module, which by default is set to the Cray
programming environment, to a compiler that can compile Arbor.
For example, to use the GCC compilers, select the GNU programming environment:
The compiler used by the MPI wrappers is set using a "programming environment"
module. The first thing to do is change this module, which by default is set to
the Cray programming environment, to a compiler that can compile Arbor. For
example, to use the GCC compilers, select the GNU programming environment:

.. note::

@@ -738,14 +769,14 @@ python version, which knows about the Cray system:

.. code-block:: bash

$ module load cray-python/3.9.4.1

Putting it all together, a typical workflow to build Arbor on a Cray system is:

.. code-block:: bash

export CRAYPE_LINK_TYPE=dynamic # only required if Cray PE version < 19.06

# For GPU setup
module load daint-gpu/21.09 # system specific
module load craype-accel-nvidia60 # system specific
@@ -951,41 +982,41 @@ If you hope to install Arbor from source in a virtual environment in order not t

.. code-block:: bash

#create a virtual environment
conda create --name arbor_test
conda activate arbor_test

#go to the folder and clone the Arbor source package from GitHub
cd ~/miniconda3/envs/arbor_test/
mkdir src
cd src
git clone https://github.com/arbor-sim/arbor.git --recurse-submodules

#install python and numpy in this environment
conda install python=3.12.2
conda install numpy

#start the build
cd arbor
mkdir build
cd build
cmake .. -GNinja -DCMAKE_CXX_COMPILER=$(which g++) -DCMAKE_C_COMPILER=$(which gcc) -DARB_WITH_PYTHON=ON -DARB_VECTORIZE=ON -DPython3_EXECUTABLE=$(which python3) -DARB_USE_BUNDLED_LIBS=ON

#activate ninja to install
ninja
sudo ninja install

#correct the path to the site packages and the libc files
#first request the right Python site package path
python -c 'import numpy; print(numpy.__path__)'

#load the right path to the one used for installing
#replace <site-packages> with the path you get from the previous operation before '/numpy'
cp -r ~/miniconda3/envs/arbor_test/src/arbor/build/python/arbor <site-packages>

#redirect the libc files such that the miniconda environment can access it
ln -sf /lib/x86_64-linux-gnu/libstdc++.so.6 ~/miniconda3/envs/arbor_test/bin/../lib/libstdc++.so.6

#go to any working directory and verify that arbor installed successfully by starting python and importing arbor
#optionally, check the version:
python -c 'import arbor; print(arbor.__version__)'
@@ -996,4 +1027,3 @@ If you hope to install Arbor from source in a virtual environment in order not t
conda activate arbor_test
python
>>> import arbor

2 changes: 1 addition & 1 deletion doc/install/spack.rst
@@ -11,7 +11,7 @@ To install Arbor using Spack, run ``spack install arbor``.
Build Options
-------------

Arbor can be built with various options, just like the regular CMake build. For instance, to have Spack build Arbor with MPI enabled, run ``spack install arbor +mpi``. For a full overview of the build options, please refer to the `our Spack package.yml <https://github.com/arbor-sim/arbor/blob/master/spack/package.py>`_.
Arbor can be built with various options, just like the regular CMake build. For instance, to have Spack build Arbor with MPI enabled, run ``spack install arbor +mpi``.

Why use Spack?
--------------
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -65,7 +65,7 @@ exclude = [
"doc/scripts/inputs.py",
"doc/scripts/make_images.py",
".*",
"spack/package.py"]
]

line-length = 88
indent-width = 4