Network

This module contains the network class for LeNet. This contains the entire architecture design.

lenet.network.apply_adam(var_list, obj, learning_rate=0.0001)[source]

Sets up the ADAM optimizer

Parameters:
  • var_list – List of variables to optimize over.
  • obj – Node of the objective to minimize

Notes

learning_rate: What learning rate to run with. (Default = 0.0001, matching the signature above). Set with LR
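For reference, the update that the ADAM optimizer performs can be sketched in plain numpy. The function name adam_step is illustrative only; apply_adam itself wires TensorFlow's own Adam implementation into the graph.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8):
    """One textbook Adam update on a weight array (illustrative sketch)."""
    m = beta1 * m + (1 - beta1) * grad       # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second-moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias corrections for step t
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

Note that the first step moves each weight by roughly the learning rate, since the bias-corrected moments cancel on step 1.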

lenet.network.apply_gradient_descent(var_list, obj)[source]

Sets up the gradient descent optimizer

Parameters:
  • var_list – List of variables to optimize over.
  • obj – Node of the objective to minimize

Notes

learning_rate: What learning rate to run with. (Default = 0.01) Set with LR
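The update rule gradient descent applies is the simplest of the optimizers here; sketched in numpy (illustrative name, not the module's code):

```python
import numpy as np

def gradient_descent_step(w, grad, lr=0.01):
    # Plain gradient descent: step against the gradient, scaled by LR.
    return w - lr * grad
```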

lenet.network.apply_l1(var_list, name='l1')[source]

This method applies L1 Regularization to all weights and adds it to the objectives collection.

Parameters:
  • var_list – List of variables to apply L1 regularization to.
  • name – For the tensorflow scope.

Notes

What is the coefficient of the L1 weight? Set L1_COEFF. (Default = 0.0001)
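The penalty this adds is the L1 norm of the weights scaled by the coefficient; a numpy sketch (the function name l1_penalty is illustrative):

```python
import numpy as np

def l1_penalty(var_list, coeff=1e-4):
    # L1 penalty: coefficient times the sum of absolute weight values.
    return coeff * sum(np.abs(v).sum() for v in var_list)
```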

lenet.network.apply_regularizer(var_list)[source]

This method applies Regularization to all weights and adds it to the objectives collection.

Parameters: var_list – List of variables to apply regularization to.

Notes

What is the coefficient of the L1 weight? Set L1_COEFF. (Default = 0.0001)
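The role of the objectives collection can be sketched with a plain Python list standing in for a TensorFlow graph collection (names illustrative): each regularizer appends its penalty, and the final objective is the cost plus every collected penalty.

```python
def total_objective(cost, objectives):
    # The training objective is the cost plus every penalty that was
    # added to the objectives collection (a plain list in this sketch).
    return cost + sum(objectives)
```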

lenet.network.apply_rmsprop(var_list, obj)[source]

Sets up the RMS Prop optimizer

Parameters:
  • var_list – List of variables to optimize over.
  • obj – Node of the objective to minimize

Notes

  • learning_rate: What learning rate to run with. (Default = 0.001). Set LR
  • momentum: What momentum weight to run with. (Default = 0.7). Set MOMENTUM
  • decay: At what rate the learning rate should decay. (Default = 0.95). Set DECAY
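The three parameters above map onto the textbook RMSProp-with-momentum update, sketched here in numpy (illustrative only; the module delegates to TensorFlow's RMSProp optimizer):

```python
import numpy as np

def rmsprop_step(w, grad, avg_sq, mom, lr=0.001, decay=0.95,
                 momentum=0.7, eps=1e-10):
    """One RMSProp-with-momentum update (textbook form, sketch only)."""
    avg_sq = decay * avg_sq + (1 - decay) * grad ** 2  # running mean of g^2
    mom = momentum * mom + lr * grad / np.sqrt(avg_sq + eps)
    return w - mom, avg_sq, mom
```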

lenet.network.apply_weight_decay(var_list, name='weight_decay')[source]

This method applies L2 Regularization to all weights and adds it to the objectives collection.

Parameters:
  • name – For the tensorflow scope.
  • var_list – List of variables to apply weight decay to.

Notes

What is the coefficient of the L2 weight? Set WEIGHT_DECAY_COEFF. (Default = 0.0001)
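The L2 penalty this adds is the coefficient times the sum of squared weights; a numpy sketch (the function name is illustrative, and whether a 1/2 factor is included varies by implementation, so none is used here):

```python
import numpy as np

def l2_penalty(var_list, coeff=1e-4):
    # L2 / weight-decay penalty: coefficient times sum of squared weights.
    return coeff * sum((v ** 2).sum() for v in var_list)
```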

class lenet.network.lenet5(images)[source]

Definition of the lenet class of networks.

Notes

  • Produces the lenet model and returns the weights. A typical lenet has two convolutional layers with filter sizes 5x5 and 3x3, followed by two fully-connected layers and a softmax layer. This network model reproduces that architecture to be trained on MNIST images of size 28x28.
  • Most of the important parameters are stored in global_definitions in the file global_definitions.py.

Parameters: images – Placeholder for images

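The spatial shapes through such a network can be walked through in plain Python. The padding and pooling choices below ('same'-padded convolutions with 2x2 max-pooling) are common for this architecture but are assumptions here, not read from the module's code:

```python
def conv_out(size, kernel, stride=1, padding='same'):
    # Spatial output size of a convolution (integer arithmetic sketch).
    if padding == 'same':
        return -(-size // stride)          # ceiling division
    return (size - kernel) // stride + 1   # 'valid' padding

# Illustrative shape walk for a 28x28 MNIST image:
s = conv_out(28, 5)      # 5x5 conv, 'same' -> 28
s = s // 2               # 2x2 max-pool     -> 14
s = conv_out(s, 3)       # 3x3 conv, 'same' -> 14
s = s // 2               # 2x2 max-pool     -> 7, then flatten to the FCs
```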
images[source]

This is the placeholder for images. This needs to be fed in from lenet.dataset.mnist.

dropout_prob[source]

This is also a placeholder for dropout probability. This needs to be fed in.

logits[source]

Output node of the network before the softmax is applied. This is an output from a lenet.layers.dot_product_layer().

inference[source]

Output node of the softmax layer that produces inference.

predictions[source]

This is the predictions node, a tf.nn.argmax() of inference.

back_prop[source]

This is the optimizer node, to be used later by a lenet.trainer.trainer.

obj[source]

A cumulative objective tensor. This produces the total summed objective in a node.

cost[source]

Cost of the back prop error alone.

labels[source]

Placeholder for labels; needs to be fed in from the dataset class.

accuracy[source]

Tensor for accuracy. This is a node that measures the accuracy for the mini batch.

cook(labels)[source]

Prepares the network for training

Parameters: labels – Placeholder for labels

Notes

  • Each optimizer has many parameters; to change them, modify the code directly. Most run without taking inputs. Some parameters, such as learning rates, play a significant role in learning and are good choices to experiment with.
  • Which optimizer to run with. (Default = sgd); other options include ‘rmsprop’ and ‘adam’. Set OPTIMIZER
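The optimizer selection can be sketched as a simple string dispatch. This is illustrative only; the real cook() attaches the corresponding apply_* function to the TensorFlow graph, and the table values here are just the function names:

```python
def pick_optimizer(name='sgd'):
    # Dispatch on the OPTIMIZER setting (sketch, not the module's code).
    table = {'sgd': 'apply_gradient_descent',
             'rmsprop': 'apply_rmsprop',
             'adam': 'apply_adam'}
    if name not in table:
        raise ValueError('unknown optimizer: ' + name)
    return table[name]
```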

lenet.network.process_params(params)[source]

This method adds the params to two collections. The first element is added to regularizer_worthy_params. Both the first and second elements are added to trainable_parmas.

Parameters: params – A list of two elements.
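The bookkeeping described above can be sketched with plain lists standing in for TensorFlow graph collections. The interpretation of the two elements as [weights, bias] is an assumption for illustration:

```python
# Plain lists stand in for TensorFlow graph collections in this sketch.
regularizer_worthy_params = []
trainable_params = []

def process_params(params):
    # params is a list of two (assumed [weights, bias]): only the first
    # element is regularized, while both are trainable.
    regularizer_worthy_params.append(params[0])
    trainable_params.extend(params)
```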