First of all, thank you for sharing the code of your work with us.
The publication states that Fast-TGCN uses approx. 4.13 million learnable parameters. However, when I count the learnable parameters of the model from the code published here, I get approx. 24.44 million.
```python
def count_parameters(model):
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

model = Baseline()
print(f"Nr of parameters in million: {count_parameters(model)/1e6}")
```
Can you tell me if you used a different number of parameters in the experiments you conducted?
Thanks