Lyz103/WWW26-R2NS


R2NS


WWW 2026 paper: "R2NS: Recall and Re-ranking of Negative Samples for Sequential Recommendation"


Overview

This repository is the official implementation of the WWW 2026 paper "R2NS: Recall and Re-ranking of Negative Samples for Sequential Recommendation".

The repository mainly contains:

  • Ours/: the implementation of R2NS
  • Baselines/: several negative sampling baselines
  • WWW_2026_Camera_Ready.pdf: the camera-ready paper

Repository Structure

R2NS/
├── Ours/
│   ├── data/
│   ├── run_finetune.bash
│   ├── run_finetune_full.py
│   ├── datasets.py
│   ├── models.py
│   ├── modules.py
│   ├── trainers.py
│   └── utils.py
├── Baselines/
│   ├── Neg_samples_DNS+/
│   ├── Neg_samples_gnno/
│   ├── Neg_samples_posmix/
│   ├── Neg_samples_srns/
│   └── Neg_samples_two_pass/
├── WWW_2026_Camera_Ready.pdf
└── README.md

Environment

This repository does not currently provide a pinned requirements.txt, so dependencies need to be installed manually.

Recommended environment:

  • Python 3.9+
  • PyTorch with CUDA support
  • NVIDIA GPU

Core dependencies used by the code include:

  • torch
  • numpy
  • scipy
  • pandas
  • scikit-learn
  • tqdm
  • openpyxl
  • texttable
  • transformers
  • recbole
  • mamba-ssm

A typical setup is:

conda create -n r2ns python=3.10 -y
conda activate r2ns

# Install PyTorch first according to your CUDA version
# Example:
# pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121

pip install numpy scipy pandas scikit-learn tqdm openpyxl texttable transformers recbole mamba-ssm

Data Format

Each dataset is expected to contain three split files:

  • DATASET_train.txt
  • DATASET_val.txt
  • DATASET_test.txt

Each line is a user interaction sequence represented by space-separated item IDs, for example:

1802 1314
1802 1314 1313

The current repository snapshot only includes example files for Beauty:

  • Ours/data/Beauty_train.txt
  • Ours/data/Beauty_val.txt
  • Ours/data/Beauty_test.txt

To use your own dataset, place the three files under a data/ directory and keep the naming convention unchanged.
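The split format above can be parsed in a few lines. The following is a minimal sketch of a reader for this layout, not the repository's own loader (datasets.py is the authoritative implementation):

```python
import io

def load_split(f):
    """Each line is one user's interaction sequence of space-separated item IDs."""
    return [[int(tok) for tok in line.split()] for line in f if line.strip()]

# The two sample lines shown above parse into nested ID lists:
sample = io.StringIO("1802 1314\n1802 1314 1313\n")
sequences = load_split(sample)
```

In practice you would call `load_split(open("data/Beauty_train.txt"))`; blank lines are skipped so trailing newlines are harmless.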

Run R2NS

Single run

Run from the Ours/ directory:

cd Ours

python run_finetune_full.py \
  --data_name Beauty \
  --backbone SASRec \
  --stage2_epoch 30 \
  --neg_sampler Uniform \
  --CL_type Gentle \
  --loss_type BCE \
  --M 5 \
  --N 100 \
  --K1 0.05

Common backbone choices in the codebase include:

  • SASRec
  • Mamba4Rec
  • GRU4Rec
  • LightSANs
  • Linrec
  • FMLPRecModel

Batch run

cd Ours
bash run_finetune.bash

Before launching the batch script, you should review:

  • models
  • datasets
  • K1
  • stage2_epoch
  • AVAILABLE_GPUS
  • JOBS_PER_GPU

The script launches multiple jobs in parallel and writes logs to its configured output directory.
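These variables typically sit near the top of run_finetune.bash. A hypothetical excerpt for orientation only (the variable names come from the list above; the values are placeholders, not the paper's settings):

```shell
# Edit to match your hardware and experiment grid.
models=("SASRec" "GRU4Rec")   # backbones to sweep
datasets=("Beauty")           # datasets present under data/
K1=0.05                       # R2NS hyperparameter
stage2_epoch=30
AVAILABLE_GPUS=(0 1)          # GPU IDs the script may use
JOBS_PER_GPU=2                # concurrent jobs per GPU
```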

Run Baselines

The repository includes the following baseline directories:

  • Baselines/Neg_samples_DNS+
  • Baselines/Neg_samples_gnno
  • Baselines/Neg_samples_posmix
  • Baselines/Neg_samples_srns
  • Baselines/Neg_samples_two_pass

Each baseline directory has its own run_finetune_full.py and run_finetune.bash.

Important: data path for baselines

Baseline scripts read data from ./data/ by default, while this repository currently only includes Ours/data/.

So before running a baseline, do one of the following:

  1. Create a symbolic link named data inside the target baseline directory.
  2. Run run_finetune_full.py manually and pass --data_dir ../../Ours/data/.

Example:

ln -s ../../Ours/data Baselines/Neg_samples_srns/data

cd Baselines/Neg_samples_srns
bash run_finetune.bash

Or:

cd Baselines/Neg_samples_srns

python run_finetune_full.py \
  --data_dir ../../Ours/data/ \
  --data_name Beauty \
  --backbone SASRec \
  --neg_sampler DNS \
  --CL_type Radical \
  --start_epoch 30 \
  --N 100

Outputs

Depending on the script and directory, the code may generate:

  • experiment log directories such as res/, res2/, or other custom folders
  • model checkpoints under output/
  • Excel summary files such as CL1.xlsx

Reproducibility Notes

The repository is sufficient for reading the code and extending the experiments, but check the following details before attempting a large-scale reproduction:

  • Ours/run_finetune_full.py writes results using args.start2_epoch, while the parser actually defines stage2_epoch.
  • Ours/utils.py expects args.K when writing Excel results, while the main R2NS script defines K1 and K2.
  • Several batch scripts contain manually overwritten dataset lists, model lists, GPU IDs, and output directories.
  • The current repository snapshot only includes Ours/data/Beauty_*.

If you plan to reproduce the final tables strictly, review the script arguments and result-writing logic before launching long runs.
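If you hit the attribute mismatches noted above, one low-risk workaround is to alias the names immediately after argument parsing. This is a hedged sketch: the attribute names come from the notes above, while the surrounding parser is illustrative, not the repository's actual argument list:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--stage2_epoch", type=int, default=30)
parser.add_argument("--K1", type=float, default=0.05)
args = parser.parse_args([])

# Aliases for the mismatched names noted above:
args.start2_epoch = args.stage2_epoch  # result-writing code reads start2_epoch
args.K = args.K1                       # utils.py's Excel writer reads args.K
```

Aliasing avoids touching the result-writing code itself, so diffs against upstream stay small.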

Citation

If you find this repository useful, please cite the corresponding WWW 2026 paper.
