Module¶
This script provides the Module class, which lets Python users use a computational graph in their models.
-
class singa.module.Module¶
Bases: object
Base class for your neural network modules.
Example usage:
import numpy as np
from singa import opt
from singa import tensor
from singa import device
from singa import autograd
from singa.module import Module

class Model(Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = autograd.Conv2d(1, 20, 5, padding=0)
        self.conv2 = autograd.Conv2d(20, 50, 5, padding=0)
        self.sgd = opt.SGD(lr=0.01)

    def forward(self, x):
        y = self.conv1(x)
        y = self.conv2(y)
        return y

    def loss(self, out, y):
        return autograd.softmax_cross_entropy(out, y)

    def optim(self, loss):
        self.sgd.backward_and_update(loss)
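The hooks shown above (forward, loss, optim) compose into a training step: run the forward pass, compute the loss, then update the parameters. The following is a minimal pure-Python sketch of that pattern only; MiniModule and Scale are hypothetical stand-ins for illustration, not SINGA code:

```python
class MiniModule:
    """Hypothetical stand-in illustrating the Module hook pattern."""

    def train_step(self, x, y):
        # one training step: forward pass, loss, then parameter update
        out = self.forward(x)
        l = self.loss(out, y)
        self.optim(l)
        return l


class Scale(MiniModule):
    """Toy model that learns y = w * x by manual gradient descent."""

    def __init__(self, lr=0.1):
        self.w = 0.0
        self.lr = lr

    def forward(self, x):
        self._x = x  # cache the input for the manual gradient below
        return self.w * x

    def loss(self, out, y):
        self._residual = out - y
        return self._residual ** 2

    def optim(self, loss):
        # d/dw (w*x - y)^2 = 2*(w*x - y)*x
        self.w -= self.lr * 2 * self._residual * self._x


model = Scale()
losses = [model.train_step(1.0, 3.0) for _ in range(20)]
```

Each call to train_step runs all three hooks in order, which is the same division of responsibilities the Model example above expresses with SINGA operators.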
-
eval()¶
Sets the module in evaluation mode.
-
forward(*input)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.
- Parameters
*input – the input training data for the module
- Returns
the outputs of the forward propagation.
- Return type
out
-
graph(mode=True, sequential=False)¶
Turns the computational graph on or off and specifies the execution mode.
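The core idea of graph mode is deferred execution: operations are buffered into a graph and then run, rather than executed eagerly one by one. The following TinyGraph class is a hypothetical illustration of that idea in plain Python (running buffered operations in order, as in a sequential mode); it is not SINGA's scheduler:

```python
class TinyGraph:
    """Hypothetical sketch: buffer operations, then execute them in order."""

    def __init__(self):
        self.ops = []

    def add(self, fn, *args):
        # record the operation instead of running it immediately
        self.ops.append((fn, args))

    def run(self):
        # execute the buffered operations sequentially
        return [fn(*args) for fn, args in self.ops]


g = TinyGraph()
g.add(lambda a, b: a + b, 2, 3)
g.add(lambda a: a * 10, 4)
print(g.run())  # -> [5, 40]
```

Buffering the whole computation before running it is what allows a graph executor to analyze dependencies and choose an execution order, instead of being bound to the order of eager calls.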
-
loss(*args, **kwargs)¶
Defines the loss function computed when training the module.
-
on_device(device)¶
Sets the target device.
Subsequent training will be performed on that device.
- Parameters
device (Device) – the target device
-
optim(*args, **kwargs)¶
Defines the optimization step performed in the backward pass.
-
train(mode=True)¶
Sets the module in training mode.
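train(mode) and eval() form the usual training-flag toggle pair: eval() is shorthand for train(False). A minimal sketch of that pattern, assuming the base class keeps a boolean training attribute (ModeModule and the attribute name are hypothetical, not SINGA internals):

```python
class ModeModule:
    """Hypothetical sketch of the train/eval toggle pattern."""

    def __init__(self):
        self.training = True  # modules start in training mode

    def train(self, mode=True):
        self.training = mode
        return self

    def eval(self):
        # evaluation mode is just training mode switched off
        return self.train(False)


m = ModeModule()
m.eval()
print(m.training)  # -> False
```

Layers that behave differently at training and inference time (e.g. dropout or batch normalization) consult such a flag to pick the right behavior.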