tf.random_uniform: Generate A Random Tensor In Tensorflow
tf.random_uniform - Generate a random tensor in TensorFlow and keep its values the same across multiple session runs
We import TensorFlow as tf.
import tensorflow as tf
Then we print out the TensorFlow version.
We see that we're using 1.0.1.
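The version check itself can be sketched as follows; TensorFlow exposes the version string as `tf.__version__`:

```python
import tensorflow as tf

# Print the installed TensorFlow version; this lesson uses 1.0.1
print(tf.__version__)
```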
We're going to create a Python variable that uses the TensorFlow random uniform functionality.
rand_tensor_ex = tf.random_uniform([1, 2, 3], minval=0, maxval=1, dtype=tf.float32, seed=None, name=None)
We're going to create a tensor that’s 1x2x3 with a minimum value of 0 and a max value of 1, and the data type that we're going to use is tf.float32.
We're not going to use a seed and we're not going to give it a name.
We're going to use this Python variable to show what happens every time you run a TensorFlow session.
This time, we're going to do the same, tf.random_uniform, but we're going to put it into a TensorFlow variable.
random_tensor_var_ex = tf.Variable(tf.random_uniform([2, 3, 4], minval=0, maxval=1, dtype=tf.float32, seed=None, name=None))
What this will do is sample the random values once, when the variable is initialized, and store them in the variable, so they stay the same no matter how many times we run them in the TensorFlow session.
We do a tf.random_uniform.
This time, we're going to make our tensor 2x3x4, a min value of 0, a max value of 1.
Data type again is tf.float32 with no seed and no name.
So the only difference between the Python variable rand_tensor_ex and random_tensor_var_ex is that the latter wraps the tf.random_uniform operation in a TensorFlow variable, whereas the former just assigns the operation itself to a Python variable.
Next, we create a TensorFlow operation that initializes all the global variables in the graph.
init_var = tf.global_variables_initializer()
Then we define a session Python variable that's going to contain our TensorFlow session.
sess = tf.Session()
Now that we have a session, we can initialize all the variables.
What we're going to do is we’re going to print our random Python variable and our random TensorFlow variable to show you what happens every time you run a session.
So with the first one, it's not in a TensorFlow variable, so what we expect it to do is generate new numbers every time.
With the second one, random_tensor_var_ex, we expect the random operation to run once, when the variable is initialized, and the values to stay the same for the rest of the session.
So we print the sess.run(rand_tensor_ex) and we see that it is in fact a tensor that is 1x2x3.
So we print it the first time.
We print it again and see that it's completely different.
For example, the first entry is 0.48 in one run and 0.42 in the next; all of the entries in the tensor are completely different.
So even though we have defined it and it is a Python variable, every time the TensorFlow session runs, it gives us a different result because it’s running this operation every time.
Now, we're going to print the result of the random_tensor_var_ex.
We see that it is a 2x3x4 tensor.
And if we print it again:
we see that it's the exact same values: 0.77 in both printouts, and 0.7666 in both printouts.
So we ran sess.run twice on the same variable.
But because we saved the generated uniform tensor into a TensorFlow variable, its values persisted across the session runs. With rand_tensor_ex, on the other hand, we used the TensorFlow random uniform functionality without saving the result into a TensorFlow variable, so every run generated different values.
So when you create a random tensor in TensorFlow that’s going to be used across multiple TensorFlow session runs, you want to make sure that you are slotting it into a TensorFlow variable.
That way, it remains the same across all the operations that you do in a TensorFlow session.
The final thing we do is close the TensorFlow session with sess.close() to release the resources the session used, as they're no longer required.