Layers.c


In the current version of learning lab (30/1/2019) there are 4 different types of layers you can build:

  • Fully-Connected layers
  • Convolutional layers
  • Residual layers
  • Batch normalized layers

Fully-connected layers:

In this type of layer you have some input neurons and some output neurons, and every output neuron is connected to every input neuron. Each connection is a weight saved as a float number, and each output neuron also has a bias, again saved as a float number.
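
As a rough sketch (this is not the library's own feed-forward code, and the function name and the row-major layout of the output*input weights vector are assumptions based on the struct comments shown further down), the computation before the activation function looks like this:

void fc_pre_activation_sketch(const float* weights, const float* biases,
                              const float* in, float* out, int input, int output)
{
    int i, j;
    for (j = 0; j < output; j++) {
        out[j] = biases[j];                          /* one bias per output neuron */
        for (i = 0; i < input; i++)
            out[j] += weights[j*input + i] * in[i];  /* one weight per input-output connection */
    }
}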

Initialization of Fully-Connected layer:

fcl* fully_connected(int input, int output, int layer, int dropout_flag, int activation_flag, float dropout_threshold)

int input:= is the number of input neurons of this layer

int output:= is the number of output neurons of this layer

int layer:= is an index in [0, N] that indicates which position is occupied by this layer

For example:

layer = 0 means this layer is the first layer of the network; layer = n means this layer is in position n + 1 in the network.

int dropout_flag:= This parameter can be set as NO_DROPOUT, DROPOUT, DROPOUT_TEST.

If it is set to NO_DROPOUT, no dropout is applied after this layer; if it is set to DROPOUT, then during training each neuron of this layer has a probability of dropout_threshold (the last parameter of fully_connected) of not being considered; if it is set to DROPOUT_TEST, then during the test phase the outputs coming from this layer are scaled by dropout_threshold.

For example:

I used a threshold of 0.3 during training, so I set the dropout flag to DROPOUT. For the test phase I compute a new threshold, 1 - 0.3 = 0.7, set the dropout flag to DROPOUT_TEST and set the dropout threshold to 0.7.
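
In code, using the fully_connected constructor above and the struct fields described later on this page, that setup might look like the following sketch (the layer sizes 100 and 50 are arbitrary examples):

fcl* f = fully_connected(100, 50, 0, DROPOUT, RELU, 0.3); /* training: each neuron has a 0.3 probability of being dropped */

/* ... training ... */

f->dropout_flag = DROPOUT_TEST; /* switch the same layer to test mode */
f->dropout_threshold = 0.7;     /* 1 - 0.3: outputs are scaled by this factor */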

int activation_flag:= indicates what kind of activation you want to apply. These are the activations that can be applied for now:

#define NO_ACTIVATION 0
#define SIGMOID 1
#define RELU 2
#define SOFTMAX 3
#define TANH 4
#define LEAKY_RELU 5

float dropout_threshold:= is the threshold used during the dropout.
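
Putting all the parameters together, a minimal creation call could look like this (the sizes 784 and 10 are arbitrary examples, and 0 is passed as the unused dropout threshold):

fcl* f = fully_connected(784, 10, 0, NO_DROPOUT, SOFTMAX, 0); /* first layer of the network, no dropout, softmax activation */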

Structure of fully-connected layer

typedef struct fcl { //fully-connected-layers
    int input,output,layer,dropout_flag;//dropout flag = 1 if dropout must be applied
    int activation_flag; // activation flag = 0 -> no activation, flag = 1 -> sigmoid, = 2 -> relu, = 3 -> softmax, 4 -> tanh, 5 -> leaky relu
    float* weights;// output*input
    float* d_weights;// output*input
    float* d1_weights;// output*input
    float* d2_weights;// output*input
    float* biases; //output
    float* d_biases; //output
    float* d1_biases; //output
    float* d2_biases; //output
    float* pre_activation; //output
    float* post_activation; //output
    float* dropout_mask;//output
    float* dropout_temp;//output
    float* temp;//output
    float* temp3;//output
    float* temp2;//input
    float* error2;//input
    float dropout_threshold;
} fcl;

You can access each parameter with your_name_fcl_structure->param; the comments next to each vector parameter indicate the dimension of that vector.

The weights are the effective weights of the layer, and the same goes for the biases; every other vector is used for the feed forward, backpropagation and gradient descent.

If you want to access the result after the activation function, it is stored in the post_activation vector; if you want to access the result before the activation function, it is stored in the pre_activation vector.
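
For example, after a forward pass has filled these buffers (the feed-forward call itself is not shown here), you can read them directly from the struct:

int j;
for (j = 0; j < f->output; j++) /* requires <stdio.h> for printf */
    printf("neuron %d: pre = %f, post = %f\n", j, f->pre_activation[j], f->post_activation[j]);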

Free the space allocated by a fully-connected layer, and save a fully-connected layer

If you want to deallocate the space occupied by the fully-connected layer, you can do that with:

void free_fully_connected(fcl* f)

If you want to save your fully-connected layer in a .bin file, you can do that with:

void save_fcl(fcl* f, int n)

The layer f will be saved to a file named n.bin, where n is a number that will be converted to string format. To load your fcl* layer:

fcl* load_fcl(FILE* fr)
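
A typical save/load round trip might look like this sketch (the "r" open mode is an assumption, since this page only states that load_fcl takes a FILE*):

save_fcl(f, 3);                  /* writes the layer to 3.bin */
FILE* fr = fopen("3.bin", "r");  /* the open mode is an assumption */
fcl* loaded = load_fcl(fr);
fclose(fr);
free_fully_connected(f);         /* deallocate the original layer */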
