In this tutorial we assume the TensorFlow library and its Pharo bindings are installed in your Pharo image. Follow the installation procedure from libtensorflow-pharo-bindings to install them.
First of all, we can get the version of the TensorFlow C API:
self display openInJupyter: TensorFlowCAPI uniqueInstance version.
Let's create our first scalar tensor (a 0-tensor):
myScalar := Float pi asTensor.
self display openInJupyter: myScalar asNumbers.
self display openInJupyter: myScalar.
Since a tensor is an external C structure allocated in C memory, we hold a handle to its address.
Let's now create a 1-tensor (a vector) and a 2-tensor (a matrix).
myVector := TF_Tensor fromFloats: #(1 2 3 4 5).
self display openInJupyter: myVector
And a 2-tensor:
myMatrix := TF_Tensor fromFloats: #((1 2 3 4 5)(6 7 8 9 10)).
self display openInJupyter: myMatrix
We can ask a tensor for its rank or its shape. The rank is the number of dimensions of a tensor; the shape is the length of those dimensions. For example, the rank of myMatrix is 2, and its shape is 2x5.
self display openInJupyter: myMatrix rank
self display openInJupyter: myMatrix shape
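Rank and shape are not specific to TensorFlow. As a plain-Python illustration (the helpers `rank_of` and `shape_of` are invented for this sketch, not part of the Pharo bindings), the rank of a nested array is its nesting depth and the shape collects the lengths at each level:

```python
# Illustrative helpers: rank = number of dimensions, shape = length of each.
def shape_of(a):
    """Return the shape of a regular nested list as a tuple."""
    if not isinstance(a, list):
        return ()
    return (len(a),) + shape_of(a[0])

def rank_of(a):
    return len(shape_of(a))

my_matrix = [[1, 2, 3, 4, 5],
             [6, 7, 8, 9, 10]]
print(rank_of(my_matrix))   # → 2
print(shape_of(my_matrix))  # → (2, 5)
```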
A graph contains a set of Operation objects, which represent units of computation, and Tensor objects, which represent the units of data that flow between operations.
A graph can be serialized (as a protobuf) and exchanged between platforms.
graph := TF_Graph create.
An operation is a node in a TF_Graph that takes 0, 1, or n tensors as input and produces 0, 1, or n tensors as output.
Examples of TF operations used in this tutorial are Const, Add, MatMul, and Identity.
After the dataflow has been defined, the graph is executed within a session and on a device (CPU, GPU, ...).
A graph can also be split across many devices for distributed execution.
A dataflow graph is designed to be portable: it is a language-independent representation of the code, so it can be reused from other languages.
Let's use the graph we have created. We are going to create two constants, c1 and c2, holding the scalar values 3.0 and 4.0.
c1 := graph const: 'c1' value: 3.0 asTensor.
c2 := graph const: 'c2' value: 4.0 asTensor.
Next, we create an operation node summing the c1 and c2 tensors, and a session for our dataflow graph.
We want to show the result of the sum, so we use the #runOutput: method, providing the output of the sum operation, that is, output 0.
sum := c1 + c2.
session := TF_Session on: graph.
result := session runOutput: (sum output: 0).
self display openInJupyter: result asNumbers
self display
	extent: 400 @ 400;
	openInJupyter: graph asRoassalView
In order to multiply two matrices, we are going to create two 2-tensors from two 2-dimensional arrays and declare them as input constants. Following the same procedure as for the sum, we will build the operation node and the session; after that, we will run the output.
graph := TF_Graph create.
t1 := TF_Tensor fromFloats: #((1 2)(3 4)).
t2 := TF_Tensor fromFloats: #((5 6)(7 8)).
c1 := graph const: 'c1' value: t1.
c2 := graph const: 'c2' value: t2.
mult := c1 * c2.
session := TF_Session on: graph.
result := session runOutput: (mult output: 0).
self display
	extent: 400 @ 400;
	openInJupyter: result asNumbers
self display
	extent: 400 @ 400;
	openInJupyter: graph asRoassalView
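For reference, the same multiplication can be checked in plain Python (shown here only to verify the arithmetic, independently of TensorFlow): each result cell is the dot product of a row of t1 with a column of t2.

```python
t1 = [[1, 2], [3, 4]]
t2 = [[5, 6], [7, 8]]

# result[i][j] = sum over k of t1[i][k] * t2[k][j]
result = [[sum(t1[i][k] * t2[k][j] for k in range(2))
           for j in range(2)]
          for i in range(2)]
print(result)  # → [[19, 22], [43, 50]]
```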
Let's make a bigger example. First of all, let's create a tensor of 100x100x100 filled with zeros.
graph := TF_Graph create.
zeros := graph zerosShaped: #(100 100 100).
session := TF_Session on: graph.
result := session runOutput: (zeros output: 0).
As we can see, we have created a 3-tensor of 100x100x100: a million cells filled with zeros.
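The cell count follows directly from the shape: the number of elements of a tensor is the product of its dimensions, which a one-line plain-Python check confirms.

```python
from functools import reduce
from operator import mul

shape = (100, 100, 100)
cells = reduce(mul, shape, 1)  # product of all dimensions
print(cells)  # → 1000000
```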
The following example computes a regression on an arbitrary function. It is an implementation of http://cs.stanford.edu/people/karpathy/convnetjs/demo/regression.html
RegressionNNExample exampleTrainedAndPlot.
| aux |
graph := TF_Graph create.
inputSize := 1.
hidden1Size := 20.
hidden2Size := 20.
hidden3Size := 20.
outputSize := 1.
graph
fromBlock: [
aux := graph truncatedNormalRandomShaped: { inputSize. hidden1Size} stddev: 1.0 / inputSize sqrt.
weights1 := graph variable: 'weights1' initialValueFrom: aux.
aux := graph zerosShaped: { hidden1Size}.
biases1 := graph variable: 'biases1' initialValueFrom: aux.
aux := graph truncatedNormalRandomShaped: { hidden1Size. hidden2Size} stddev: 1.0 / hidden1Size sqrt.
weights2 := graph variable: 'weights2' initialValueFrom: aux.
aux := graph zerosShaped: {hidden2Size}.
biases2 := graph variable: 'biases2' initialValueFrom: aux.
aux := graph truncatedNormalRandomShaped: { hidden2Size. hidden3Size } stddev: 1.0 / hidden2Size sqrt.
weights3 := graph variable: 'weights3' initialValueFrom: aux.
aux := graph zerosShaped: { hidden3Size }.
biases3 := graph variable: 'biases3' initialValueFrom: aux.
aux := graph truncatedNormalRandomShaped: {hidden3Size. outputSize } stddev: 1.0 / hidden3Size sqrt.
weights4 := graph variable: 'weights4' initialValueFrom: aux.
aux := graph zerosShaped: { outputSize }.
biases4 := graph variable: 'biases4' initialValueFrom: aux]
named: 'parameters'.
hidden1 := graph
fromBlock: [:image |
input := image.
(image * weights1 + biases1) rectified]
named: 'layer1'.
hidden2 := graph fromBlock: [(hidden1 * weights2 + biases2) sigmoid] named: 'layer2'.
hidden3 := graph fromBlock: [(hidden2 * weights3 + biases3) sigmoid] named: 'layer3'.
prediction := graph fromBlock: [hidden3 * weights4 + biases4] named: 'layer4'.
loss := graph
fromBlock: [ :expected |
expectedLabel := expected.
(prediction - expectedLabel) squared meanOn: #(0) asInt32Tensor ]
inputTypes: {TF_Tensor typeFloat}
named: 'loss'.
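The loss above is a mean squared error: the squared difference between prediction and expected label, averaged over axis 0 (the batch). A plain-Python sketch of the same computation for a batch of scalar predictions (the sample values are made up for illustration):

```python
# Mean squared error over the batch, as in the 'loss' scope above.
predictions = [2.0, 3.5, 0.0]
expected    = [1.0, 3.0, 1.0]

squared_errors = [(p - e) ** 2 for p, e in zip(predictions, expected)]
mse = sum(squared_errors) / len(squared_errors)
print(mse)  # → 0.75
```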
| axis0 learningRate batchSize biasGradient one backprop learnBiases1 learnBiases2 learnBiases3 learnBiases4 learnWeights1 learnWeights2 learnWeights3 learnWeights4 |
learningRate := 0.1 asTensor.
batchSize := graph fromBlock: [(input sizeOn: 0) castTo: TF_Tensor typeFloat] named: 'batchSize'.
axis0 := graph const: #(0) asInt32Tensor.
one := 1.0 asTensor asOperationOn: graph.
graph
fromBlock: [ | gradient|
gradient := (prediction - expectedLabel).
biasGradient := gradient meanOn: axis0.
learnWeights4 := weights4 descent: hidden3 \* gradient @/ batchSize rate: learningRate.
learnBiases4 := biases4 descent: biasGradient rate: learningRate.
backprop := (gradient *\ weights4)]
named: 'learning4'.
graph
fromBlock: [ | gradient |
gradient := backprop @* hidden3 @* (one - hidden3).
biasGradient := gradient meanOn: axis0.
learnWeights3 := weights3 descent: hidden2 \* gradient @/ batchSize rate: learningRate.
learnBiases3 := biases3 descent: biasGradient rate: learningRate.
backprop := (gradient *\ weights3)]
named: 'learning3'.
graph fromBlock: [ | gradient |
gradient := backprop @* hidden2 @* (one - hidden2).
learnWeights2 := weights2 descent: hidden1 \* gradient @/ batchSize rate: learningRate.
learnBiases2 := biases2 descent: (gradient meanOn: axis0) rate: learningRate.
backprop := (gradient *\ weights2)]
named: 'learning2'.
graph fromBlock: [ | gradient |
gradient := backprop timesRectifiedGradOf: hidden1.
learnWeights1 := weights1 descent: input \* gradient rate: learningRate.
learnBiases1 := biases1 descent: (gradient meanOn: axis0) rate: learningRate]
named: 'learning1'.
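The manual backpropagation above uses the standard chain-rule identities: hidden2 and hidden3 are sigmoid layers, and for h = σ(z) the local gradient is h·(1−h), which is why the 'learning3' and 'learning2' scopes multiply the incoming gradient by hidden @* (one - hidden) (layer1 uses the rectifier gradient via #timesRectifiedGradOf: instead). A plain-Python sanity check of the sigmoid identity for a single unit:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.3
h = sigmoid(z)

# Analytic derivative of sigmoid: h * (1 - h)
analytic = h * (1.0 - h)

# Numerical derivative by central difference, for comparison
eps = 1e-6
numeric = (sigmoid(z + eps) - sigmoid(z - eps)) / (2 * eps)

print(abs(analytic - numeric) < 1e-8)  # → True
```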
learn := graph newOperation: 'Identity' named: 'learn' described: [:description |
description
addInput: loss output;
addControlInput: learnWeights1 output;
addControlInput: learnBiases1 output;
addControlInput: learnWeights2 output;
addControlInput: learnBiases2 output;
addControlInput: learnWeights3 output;
addControlInput: learnBiases3 output;
addControlInput: learnWeights4 output;
addControlInput: learnBiases4 output].
session := TF_Session on: graph.
graph initializeOn: session.
function := [ :x | | y |
y := x * 10 - 5.
y * y sin ].
rnd := Random new.
xs := (1 to: 100) collect: [:i | {rnd next}].
ys := xs collect: [:x | {function value: x first}].
interval := 1 to: xs size.
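Note the Smalltalk precedence in the block above: unary `sin` binds tighter than binary `*`, so the target function is f(x) = y·sin(y) with y = 10x − 5. The same function and sampling in plain Python, for reference (the fixed seed is an addition for reproducibility):

```python
import math
import random

def f(x):
    y = x * 10 - 5
    return y * math.sin(y)

rnd = random.Random(42)  # seeded here only so the sample is reproducible
xs = [[rnd.random()] for _ in range(100)]
ys = [[f(x[0])] for x in xs]

print(f(0.5))  # → 0.0  (y = 0 there)
```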
10000 timesRepeat: [
|x indices y results|
indices := (1 to: 60) collect: [:i| interval atRandom].
x := indices collect: [:index | xs at: index].
y := indices collect: [:index | ys at: index].
results := session
runInputs: {input input: 0. expectedLabel input: 0}
values: {x asFloatTensor. y asFloatTensor}
outputs: {loss output: 0. learn output}].
predictor := [ :x |
| result |
result := session
runInputs: {input input: 0}
values: {{{x}} asFloatTensor}
outputs: {prediction output: 0}.
result first asNumbers first first ].
xValues := 0 to: 1.0 by: 0.01.
b := RTGrapher new.
b extent: 300 @ 200.
ds := RTData new.
ds noDot.
ds points: xValues.
ds y: predictor.
ds x: #yourself.
ds connectColor: Color green.
b add: ds.
ds := RTData new.
ds noDot.
ds points: xValues.
ds y: function.
ds x: #yourself.
ds connectColor: Color red.
b add: ds.
self display
openInJupyter: b
self loadScript: IPRoassal js.
self display
interactionOn;
openInJupyter: graph asRoassalView