
MNNGP: Deep Maxout Network Gaussian Process

This repository implements Bayesian inference based on the Deep Maxout Network Gaussian Process.

Contributors: Libin Liang, Ye Tian, and Ge Cheng.

Introduction

The maxout network was first proposed in Maxout Networks. Given proper initialization of the weights and biases, an infinite-width, deep maxout network converges to a Gaussian process with a deterministic kernel. The code here implements Bayesian inference with the deep maxout network kernel.
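For intuition, the layer-to-layer kernel recursion for a maxout activation can be approximated by Monte Carlo: given the previous layer's kernel values between two inputs, sample `rank` i.i.d. bivariate Gaussian pre-activation pairs, take the max over the rank dimension for each input, and average the products. This is a minimal sketch (the function name and sampling approach are illustrative assumptions, not the repository's actual closed-form kernel computation):

```python
import numpy as np

def maxout_kernel_step(k11, k12, k22, rank, sigma_w_sq, sigma_b_sq,
                       n_samples=200_000, rng=None):
    """Monte Carlo estimate of one layer of the maxout NNGP kernel recursion.

    k11, k22: previous-layer kernel values K(x, x) and K(x', x')
    k12:      previous-layer kernel value K(x, x')
    rank:     number of affine pieces in the maxout unit (e.g. 2, 3, or 4)
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    cov = np.array([[k11, k12], [k12, k22]])
    # (n_samples, rank, 2): `rank` i.i.d. bivariate pre-activation pairs per sample
    samples = rng.multivariate_normal(np.zeros(2), cov, size=(n_samples, rank))
    u = samples[..., 0].max(axis=1)  # maxout output for input x
    v = samples[..., 1].max(axis=1)  # maxout output for input x'
    return sigma_b_sq + sigma_w_sq * np.mean(u * v)
```

A quick sanity check: with `rank=1` the max is the identity, so the step reduces to the linear-network recursion `sigma_b_sq + sigma_w_sq * k12`.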

Implementation

    run_testing.py --dataset mnist \
                   --num_of_training 1000 \
                   --num_of_testing 1000 \
                   --maxout_rank 2 \
                   --depth 10 \
                   --sigma_w_sq 3 \
                   --sigma_b_sq 0.1

- `--dataset`: `mnist` or `cifar10`
- `--num_of_training`: number of training samples
- `--num_of_testing`: number of testing samples
- `--maxout_rank`: 2, 3, or 4
- `--depth`: depth of the network
- `--sigma_w_sq`: variance level of the weight initialization
- `--sigma_b_sq`: variance level of the bias initialization
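Once the deterministic kernel has been evaluated on the training and test inputs, Bayesian inference reduces to exact Gaussian process regression on the precomputed kernel blocks. A minimal sketch (hypothetical helper name; the repository's own inference code may be organized differently):

```python
import numpy as np

def gp_posterior(K_train, K_cross, K_test, y, noise=1e-8):
    """Exact GP posterior from precomputed kernel blocks.

    K_train: (n, n) kernel among training inputs
    K_cross: (n, m) kernel between training and test inputs
    K_test:  (m, m) kernel among test inputs
    y:       (n,) training targets
    noise:   jitter / observation-noise variance for numerical stability
    """
    n = K_train.shape[0]
    L = np.linalg.cholesky(K_train + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # (K + noise I)^{-1} y
    mean = K_cross.T @ alpha                             # posterior mean, (m,)
    v = np.linalg.solve(L, K_cross)
    cov = K_test - v.T @ v                               # posterior covariance, (m, m)
    return mean, cov
```

For classification (as on MNIST or CIFAR-10), a common NNGP approach treats one-hot labels as regression targets and predicts the argmax of the posterior mean.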
