TensorFlow Print: Print The Value Of A Tensor Object In TensorFlow
TensorFlow Print - Print the value of a tensor object in TensorFlow by understanding the difference between building the computational graph and running it
We start by importing TensorFlow as tf.
import tensorflow as tf
Then we check the version of TensorFlow that we’re using.
We’re using 1.0.1.
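The version check is just a print of the library's version string; a minimal sketch:

```python
import tensorflow as tf

# Print the installed TensorFlow version; this tutorial was written against 1.0.1
print(tf.__version__)
```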
We’re going to start by building the computational graph.
That is, we’re going to define the variables and computations that will take place.
However, nothing is going to be evaluated at this time.
The first thing we do is we define a TensorFlow Variable.
random_tensor_var_one = tf.Variable(tf.random_uniform([2, 3, 4], minval=0, maxval=10, dtype=tf.int32, seed=None, name=None))
We’re going to name it random_tensor_var_one.
Inside this variable, we’re going to generate a tensor that is 2x3x4 and we’re going to fill it with a random uniform distribution with minimum value of 0, max value of 10, and the data type is going to be int32.
When we print this:
We can see that it is a tensor variable; it gives us the shape and the data type, but it has not been evaluated yet.
This is because we’re still building the computational graph.
So when we go to print the value, we’re not going to see anything in it.
We define a second TensorFlow Variable, which we name random_tensor_var_two, created the same way.
random_tensor_var_two = tf.Variable(tf.random_uniform([2, 3, 4], minval=0, maxval=10, dtype=tf.int32, seed=None, name=None))
When we print this TensorFlow Variable:
We see the same thing, that it is a tensor variable, we see the shape and the data type, but we don’t see any values.
This is because it has not been initialized for use in a graph yet, so printing just tells us what the object is.
Next, we’re going to create a TensorFlow Variable.
random_tensor_add_result = tf.Variable(tf.add(random_tensor_var_one.initialized_value(), random_tensor_var_two.initialized_value()))
We’re going to use tf.add to add random_tensor_var_one to random_tensor_var_two.
We’re going to call the .initialized_value() method on each of these TensorFlow Variables, and that addition is going to be assigned to the Python variable random_tensor_add_result.
The reason we use .initialized_value() is that when we run the global variables initializer later, we don’t want the addition to fail because its two operand tensors had not been initialized yet.
We print this variable and see that it is a tensor Variable with shape (2, 3, 4) and data type int32.
Now that we have created and defined all of the operations for the computational graph, it’s time to run the computational graph we built.
We launch the graph in the session.
sess = tf.Session()
Then, within a sess.run call, we use tf.global_variables_initializer() to initialize all the variables.
To print the value of random_tensor_var_one, we run it inside the session and then we print the result.
So we see that it is a 2x3x4 tensor.
We can now print the random_tensor_var_two TensorFlow Variable.
It’s run inside the session.
Then when we print it, we see the value there as well.
Finally, we can evaluate this addition.
So this addition uses the initialized values of random_tensor_var_one and random_tensor_var_two.
That’s going to be evaluated, the addition is going to happen, and then inside the sess.run, it’s going to return the result which we print.
That is what we would expect.
4+7 is 11, 3+7 is 10, 5+7 is 12, 8+0 is 8.
So adding the two tensors has worked and we were able to print the addition.
So we saw actual values for each of these prints, whereas while we were only building the computational graph, printing showed the variables’ metadata but no values.
Now that we’re done with the example, we can close the TensorFlow session, since the resources it holds are no longer required.