Create a conda virtual environment and install the required libraries:
$ conda env create -f environment.yml
If the conda command above fails, follow this guide instead and then run:
$ pip install torch_geometric torch_scatter scikit-learn
The datasets used throughout our paper can be found here. Only the NeRF directory (nerf) is strictly required to run our code, whereas graphs (graph) and embeddings (emb) can be computed from scratch with the scripts provided in this repo, as detailed in the following sections.
By default, our scripts look for data in the ./data directory. Otherwise, you can set their --data-root command-line argument to your desired directory. The directory structure of --data-root is assumed to be the same as the one described here.
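The exact layout is described in the linked docs; as a hypothetical sketch, the three sub-directories named above could be prepared under the default `./data` root like this:

```python
# Hypothetical sketch only: creates the three sub-directories mentioned
# above (nerf, graph, emb) under the default ./data root.
from pathlib import Path

data_root = Path("data")
for sub in ("nerf", "graph", "emb"):
    (data_root / sub).mkdir(parents=True, exist_ok=True)
```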
Model weights can be downloaded from here. Once downloaded, place them in a directory called ./ckpts and keep the directory structure described here.
To train one of our models yourself, run train.py with the required command-line arguments. For example, to train the model with the l_rec_con loss, run:
$ python train.py --loss l_rec_con --wandb-user ... --wandb-project ... --data-root ...
The other choices for --loss are l_rec and l_con.
If graphs for training and validation NeRFs are not present in --data-root, train.py will compute them before training starts. Otherwise, it will skip the graph computation step and use the graphs found in --data-root.
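A minimal sketch of this skip-if-present behaviour, assuming graphs live in a `graph` directory under the data root (the actual check inside train.py may differ):

```python
# Hedged sketch: decide whether graph computation can be skipped.
# The real file layout and naming used by train.py may differ.
from pathlib import Path

def graphs_present(data_root) -> bool:
    """Return True if a non-empty 'graph' directory exists under data_root."""
    graph_dir = Path(data_root) / "graph"
    return graph_dir.is_dir() and any(graph_dir.iterdir())
```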
NeRF graphs can be downloaded from here. To compute them yourself, run export_graphs.py with the required command-line arguments. For example, to compute the graphs of NeRFs belonging to the test set of ShapeNet MLPs, run:
$ python export_graphs.py --data-root ... --dataset shapenet --arch mlp --split test
NeRF embeddings can be downloaded from here. To compute them yourself, download/export the corresponding NeRF graphs first (see previous section) and then run export_embs.py with the required command-line arguments. For example, to compute the embeddings produced by the model trained with the l_rec_con loss, run:
$ python export_embs.py --ckpt-name l_rec_con --data-root ... --dataset shapenet --arch mlp --split test
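The on-disk format of the exported embeddings is not specified here; purely as an illustration, a NumPy `.npz` round-trip of an embedding matrix looks like this:

```python
# Illustrative only: stand-in for loading exported NeRF embeddings.
# The real format written by export_embs.py may differ.
import numpy as np

embs = np.zeros((10, 128), dtype=np.float32)  # placeholder embeddings
np.savez("embs_demo.npz", embs=embs)
loaded = np.load("embs_demo.npz")["embs"]
```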
To perform classification, download/export the desired NeRF embeddings first (see previous section) and then run classify.py with the required command-line arguments. For example, to train a classifier on the embeddings produced by the model trained with the l_rec_con loss, run:
$ python classify.py --ckpt-name l_rec_con --wandb-user ... --wandb-project ... --data-root ... --arch mlp
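As a hedged sketch of the idea (not the classifier classify.py actually uses), a simple model can be fit on embedding vectors with scikit-learn; all arrays below are random placeholders standing in for real embeddings and labels:

```python
# Illustrative only: classify placeholder "embeddings" with scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))     # placeholder NeRF embeddings
y = rng.integers(0, 10, size=200)  # placeholder class labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # chance-level on random data
```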
To perform retrieval, download/export the desired NeRF embeddings first (see previous section) and then run retrieve.py with the required command-line arguments. For example, to perform the retrieval experiment with MLP queries and triplane galleries, run:
$ python retrieve.py --ckpt-name l_rec_con --wandb-user ... --wandb-project ... --data-root ... --query-arch mlp --gallery-arch triplane
If you find our work useful, please cite us:
@inproceedings{ballerini2026weight,
title = {Weight Space Representation Learning on Diverse {NeRF} Architectures},
author = {Ballerini, Francesco and Zama Ramirez, Pierluigi and Di Stefano, Luigi and Salti, Samuele},
booktitle = {The Fourteenth International Conference on Learning Representations},
year = {2026}
}