3DPyranet

3DPyranet is a deep pyramidal neural network inspired by biological pyramidal neurons, introduced in [1].

Usage

Check the train.py file for the correct pipeline and model usage. The code uses sparse softmax cross-entropy as its loss function, so labels do not need one-hot encoding.
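
The snippet below is a minimal sketch (not the repository's exact code) of what this means in TensorFlow 1.x: the sparse variant of the loss takes integer class ids directly, so there is no one-hot conversion step.

```python
# Minimal sketch, assuming TF 1.x: sparse softmax cross-entropy consumes
# integer class labels, so no one-hot encoding is required.
import tensorflow as tf

logits = tf.placeholder(tf.float32, shape=[None, 6])  # network output, num_classes = 6
labels = tf.placeholder(tf.int64, shape=[None])       # integer class ids in [0, num_classes)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))
```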

Documentation

Documentation can be found here:

https://3dpyranet.readthedocs.io/

Dependencies

  • Python 3
  • TensorFlow 1.4+
  • tqdm
  • NumPy
-> WARNING <-
Check requirements.txt for the TensorFlow variant you need: by default tensorflow-gpu is enabled.
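
A quick way to verify which build you ended up with (a convenience snippet, not part of the repository):

```python
# Sanity check of the installed TensorFlow build.
import tensorflow as tf

print(tf.__version__)              # should be 1.4 or newer
print(tf.test.is_gpu_available())  # True only if the GPU build can see a device
```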

FLAGS

Checkpoint and evaluation

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| evaluate_every | float | 1 | Number of epochs between evaluations (decimals allowed) |
| test_milestones | list | 15,25,50 | Epochs at which the test set is evaluated |
| save_checkpoint | boolean | False | Whether to save checkpoints |
| checkpoint_name | string | 3dpyranet.ckpt | Name of the checkpoint file |
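
As a rough illustration of how the checkpoint flags fit together, here is a minimal `tf.train.Saver` sketch; train.py may wire this differently, and the paths below are placeholders.

```python
# Sketch only: saving a checkpoint under save_path with checkpoint_name.
import os
import tensorflow as tf

save_path = "/path/to/model"        # --save_path (placeholder)
checkpoint_name = "3dpyranet.ckpt"  # --checkpoint_name (default value)

w = tf.get_variable("w", shape=[3, 3], initializer=tf.zeros_initializer())
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, os.path.join(save_path, checkpoint_name))
```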

Input

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| train_path | string | // | Path to the .npy training set |
| train_labels_path | string | // | Path to the .npy training set labels |
| val_path | string | // | Path to the .npy validation/test set |
| val_labels_path | string | // | Path to the .npy validation/test set labels |
| save_path | string | // | Path where the network model is saved |
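
All inputs are plain NumPy .npy files, so loading them outside the pipeline looks like the following (file names are placeholders for the paths passed via the flags above):

```python
# Loading the .npy data and label files referenced by the input flags.
import numpy as np

train_set = np.load("train_set.npy")
train_labels = np.load("train_labels.npy")
val_set = np.load("val_set.npy")
val_labels = np.load("val_labels.npy")

print(train_set.shape, train_labels.shape)
```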

Input parameters

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| batch_size | int | 100 | Batch size |
| depth_frames | int | 16 | Number of consecutive frames per sample |
| height | int | 100 | Sample height |
| width | int | 100 | Sample width |
| in_channels | int | 1 | Number of channels per sample |
| num_classes | int | 6 | Number of classes |
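
These parameters describe 5-D video clips. Assuming the usual TensorFlow layout of [batch, depth, height, width, channels] (an assumption; train.py may order the axes differently), a matching input tensor with the default values looks like this:

```python
# Assumed NDHWC clip layout built from the default input parameters.
import tensorflow as tf

batch_size, depth_frames, height, width, in_channels = 100, 16, 100, 100, 1

clips = tf.placeholder(
    tf.float32, shape=[batch_size, depth_frames, height, width, in_channels])
labels = tf.placeholder(tf.int64, shape=[batch_size])
```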

Hyper-parameter settings

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| feature_maps | int | 3 | Number of feature maps (the strict model shares the same number of maps in each layer) |
| learning_rate | float | 0.00015 | Learning rate |
| decay_steps | int | 15 | Number of epochs between learning-rate decays |
| decay_rate | float | 0.1 | Learning-rate decay factor |
| max_steps | int | 50 | Maximum number of epochs |
| weight_decay | float | None | L2 regularization lambda |
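
One common way these decay flags are wired in TF 1.x is shown below; this is a sketch under the assumption that decay_steps (given in epochs) is converted to optimizer steps, not the repository's exact code.

```python
# Sketch: stepwise learning-rate decay driven by decay_steps / decay_rate.
import tensorflow as tf

learning_rate, decay_steps_epochs, decay_rate = 0.00015, 15, 0.1
steps_per_epoch = 100  # placeholder: training set size // batch_size

global_step = tf.train.get_or_create_global_step()
lr = tf.train.exponential_decay(
    learning_rate, global_step,
    decay_steps_epochs * steps_per_epoch, decay_rate, staircase=True)
```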

Optimization

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| optimizer | string | MOMENTUM | Optimization algorithm (GD, MOMENTUM, or ADAM) |
| use_nesterov | boolean | False | Use Nesterov momentum (effective only with the MOMENTUM optimizer) |
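
Selecting the optimizer from the flag value maps naturally onto the standard TF 1.x optimizers; the momentum value 0.9 below is an illustrative assumption, not taken from the repository.

```python
# Sketch: mapping the optimizer flag to a TF 1.x optimizer instance.
import tensorflow as tf

optimizer_name, use_nesterov, lr = "MOMENTUM", False, 0.00015

if optimizer_name == "GD":
    opt = tf.train.GradientDescentOptimizer(lr)
elif optimizer_name == "MOMENTUM":
    opt = tf.train.MomentumOptimizer(lr, momentum=0.9, use_nesterov=use_nesterov)
elif optimizer_name == "ADAM":
    opt = tf.train.AdamOptimizer(lr)
```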

References

[1] Ullah, Ihsan, and Alfredo Petrosino. "Spatiotemporal features learning with 3DPyraNet." International Conference on Advanced Concepts for Intelligent Vision Systems. Springer, Cham, 2016.
