Tensorflow - Tensorboard Event Accumulator get Tensor from TensorEvent - numpy

I am working with Tensorflow and Tensorboard version 1.14.
I would like to perform some offline analysis on data I saved during training with tf.summary.tensor_summary().
I am not able to recover that data with the method described here, using tf.train.summary_iterator, which does recover scalar data but not the data I saved with the tensor_summary method.
With the EventAccumulator object, however, I am able to recover the data I saved, but it is returned as a TensorEvent object with the following attributes:
step
wall_time
tensor_proto
tensor_content
Thing is, I would like to convert this data into a NumPy array. The TensorEvent object surely has all the information needed (tensor_proto for type and shape, tensor_content for values), but since it is not a Tensor it has no .value or .numpy() method. So how do I transform a TensorEvent object into a NumPy array? Or, equivalently, into a Tensor object and then into a NumPy array?

You can use tf.make_ndarray to convert a TensorProto into a NumPy array:
tensor_np = tf.make_ndarray(tensor_event.tensor_proto)
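For a fuller picture, here is a self-contained sketch of the round trip. It uses the TF 2.x tf.summary.create_file_writer to generate an event file so the example can run anywhere (with 1.14's tf.summary.tensor_summary the read-back side is the same); the log directory and tag name are placeholders for your own run:

```python
import tempfile
import tensorflow as tf
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

logdir = tempfile.mkdtemp()  # stand-in for your real run directory

# Write one tensor summary so the example is self-contained (TF 2.x API).
writer = tf.summary.create_file_writer(logdir)
with writer.as_default():
    tf.summary.write("my_tensor_tag", tf.constant([[1.0, 2.0], [3.0, 4.0]]), step=0)
writer.close()

# Read it back: Tensors(tag) returns a list of TensorEvent objects.
acc = EventAccumulator(logdir)
acc.Reload()
for ev in acc.Tensors("my_tensor_tag"):
    arr = tf.make_ndarray(ev.tensor_proto)  # -> numpy.ndarray
    print(ev.step, arr.shape)
```

Each TensorEvent's tensor_proto carries the dtype and shape, so tf.make_ndarray needs no extra arguments.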

Related

Convert TensorFlow Tensor to NumPy array

I have a tensorflow.python.framework.ops.Tensor as output and need to convert it to a numpy array.
.numpy() doesn't work because it isn't an EagerTensor.
.eval() doesn't work either, because I'm using TensorFlow >= 2.0.
Is there any other way to fix this?
img_height = 330
img_width = 600
img_depth = 23
save_model = "saved_Models/wheatModel"
prediction_data_path = ["data/stacked/MOD13Q1.A2017.2738.tif",
                        "data/stacked/MOD13Q1.A2017.889.tif",
                        "data/stacked/MOD13Q1.A2017.923.tif"]
prediction_data = dataConv.preparePredictionData(prediction_data_path)
prediction_reshaped = dataConv.reshapeFiles(prediction_data, img_width, img_height, img_depth)
x_ds = tf.stack(prediction_reshaped)
model = tf.keras.models.load_model(save_model)
model.predict(x_ds)
image = model.get_layer(name='prediction_image').output
n, output_width, output_height, output_depth, output_channels = image.shape
print(type(image))
image = tf.reshape(image, (output_width, output_height, output_depth))
print(type(image))
image.numpy()
So in the code above I:
load my trained model,
predict on the given images,
get the output from the next-to-last layer,
reshape this data.
Now I want to convert this tensor to a NumPy array.
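A common way around this (sketched below with a tiny stand-in model, since the trained wheat model isn't available here) is to wrap the layer's symbolic .output in a second tf.keras.Model and call predict(), which returns a plain NumPy array. The layer name 'prediction_image' is taken from the question; the architecture is illustrative only:

```python
import tensorflow as tf

# Tiny stand-in model; in the question this would be the loaded wheat model.
inp = tf.keras.Input(shape=(8,))
x = tf.keras.layers.Dense(4, name="prediction_image")(inp)
out = tf.keras.layers.Dense(2)(x)
model = tf.keras.Model(inp, out)

# model.get_layer(...).output is a symbolic tensor holding no values;
# wrapping it in a sub-model and calling predict() yields concrete values.
sub_model = tf.keras.Model(model.input,
                           model.get_layer("prediction_image").output)
x_ds = tf.random.normal([3, 8])
image_np = sub_model.predict(x_ds)  # numpy.ndarray, shape (3, 4)
```

The key point is that .numpy() only exists on eager tensors that already hold data; a layer's .output never does, so you have to run data through a model to materialize it.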

I cannot convert my pandas dataframe to a tensorflow dataset - Get a Value Error

My data source - https://www.kaggle.com/vbookshelf/respiratory-sound-database
Tensorflow version - 2.4.0
After a bit of data cleaning, I had my pandas DataFrame ready.
My objective is to build a deep learning model for a classification task, so I read the audio files with scipy.io.wavfile and put each array as a feature in the DataFrame.
All the values of the audio have a shape of (882000,)
My problem is that I want to convert my Pandas Dataframe into a Tensorflow Dataset.
I get this error: ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type numpy.ndarray).
I tried using tf.convert_to_tensor and still get the same error. What should I do?
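A frequent cause of this error (a hedged sketch, since the actual DataFrame isn't shown) is an object-dtype column holding one array per row: TensorFlow cannot convert an array of arrays, so the usual fix is to stack the rows into one dense array first and build the dataset from that:

```python
import numpy as np
import pandas as pd
import tensorflow as tf

# Hypothetical stand-in: each row holds a 1-D audio array in an object
# column, as in the question (the real shape would be (882000,)).
df = pd.DataFrame({
    "audio": [np.zeros(10, dtype=np.float32) for _ in range(4)],
    "label": [0, 1, 0, 1],
})

# df["audio"].to_numpy() is an object array of arrays, which tf cannot
# convert; np.stack turns it into one dense (4, 10) float array first.
features = np.stack(df["audio"].to_numpy())
labels = df["label"].to_numpy()
ds = tf.data.Dataset.from_tensor_slices((features, labels))
```

With all 882000-sample clips this dense array becomes large; tf.data.Dataset.from_generator is a common alternative when the stacked array does not fit in memory.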

What does the .numpy() function do?

I tried searching for the documentation online but I can't find anything that gives me an answer. What does .numpy() function do? The example code given is:
y_true = []
for X_batch, y_batch in mnist_test:
    y_true.append(y_batch.numpy()[0].tolist())
In both PyTorch and TensorFlow, the .numpy() method is pretty much straightforward. It converts a tensor object into a numpy.ndarray object. This implicitly means that the converted tensor will now be processed on the CPU.
Whenever you have trouble understanding a PyTorch function, you can ask help():
import torch
t = torch.tensor([1,2,3])
help(t.numpy)
Out:
Help on built-in function numpy:
numpy(...) method of torch.Tensor instance
numpy() -> numpy.ndarray
Returns :attr:`self` tensor as a NumPy :class:`ndarray`. This tensor and the
returned :class:`ndarray` share the same underlying storage. Changes to
:attr:`self` tensor will be reflected in the :class:`ndarray` and vice versa.
This numpy() function is the converter from torch.Tensor to a NumPy array.
If we look at the code below, we see a simple example where .numpy() explicitly converts a Tensor to a NumPy array (while TensorFlow and NumPy operations convert between the two automatically):
import numpy as np
import tensorflow as tf
ndarray = np.ones([3, 3])
ndarray = np.ones([3, 3])
print("TensorFlow operations convert numpy arrays to Tensors automatically")
tensor = tf.multiply(ndarray, 42)
print(tensor)
print("And NumPy operations convert Tensors to numpy arrays automatically")
print(np.add(tensor, 1))
print("The .numpy() method explicitly converts a Tensor to a numpy array")
print(tensor.numpy())
In the second-to-last line of code, we see that the TensorFlow authors describe .numpy() as the explicit converter from a Tensor to a NumPy array.
You may check it out here

Feeding the input with Tensors instead of numpy arrays in TensorFlow

If the input data is in numpy array format, then we can declare a placeholder in the graph and feed the placeholder with the numpy array data. However, if the input data is already in Tensor format (this is the case when we load jpg files using tf.image.decode_jpeg), then we can't feed a Tensor to a placeholder. In this case, should we use non trainable TF Variables as placeholders, and feed the Tensor to these Variables by tf.assign?
Figured it out. You can simply feed batches of Tensors to the model. The model probably has a line that looks similar to op = optimizer.minimize(loss). Then, each time sess.run(op) is called, the model is trained on the batch provided to it. If we use tf.train.batch to provide the batches, each sess.run(op) call automatically receives a different batch.
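A minimal graph-mode sketch of this idea, written with tf.compat.v1 so it also runs under TF 2.x (the random tensor stands in for a decoded-image batch coming from tf.image.decode_jpeg / tf.train.batch; all names are illustrative):

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

# Stand-in for a batch that is already a Tensor; note that no placeholder
# (and no feed_dict) is involved anywhere -- the graph is built directly
# on the input Tensor.
images = tf1.random_normal([4, 8])
w = tf1.get_variable("w", shape=[8, 1])
pred = tf1.matmul(images, w)
loss = tf1.reduce_mean(tf1.square(pred))
op = tf1.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    for _ in range(3):
        _, loss_val = sess.run([op, loss])  # each run evaluates a fresh batch
```

Because the input pipeline is part of the graph, every sess.run(op) re-evaluates it and trains on whatever batch it produces, which is exactly the behavior described above.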

TensorFlow tensor to Pandas dataframe

My model learns W:
W = tf.Variable(tf.truncated_normal([pixels,h1],stddev=np.sqrt(2.0 / (pixels))))
I return W from a function that runs my TF graph / session.
In my notebook, I checked type of W:
type(W)
out: tensorflow.python.ops.variables.Variable
I also checked dimensionality of W:
W.get_shape()
out: TensorShape([Dimension(3072), Dimension(1024)])
I'd like to convert W into a Pandas dataframe (for examination, etc.).
How can I do this?
(Saw this answer on converting tensor to numpy with eval(), which could then be written to pandas of course. But that operation only seemed to work within the TF session.)
Variables only exist within a session. They are defined in the graph, as operations, but don't actually store any values in the graph itself; they only have values once a session is created from the graph and the initialize operation (or load) is called.
Of course, once you've loaded the value from the variable, in a session, using eval(), you're free to dispose of the session and use the resulting NumPy array just like any normal NumPy array.
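Under TF 2.x the same round trip is a one-liner on the eager Variable (under 1.x you would use w_np = sess.run(W) inside the session instead of W.numpy()); a sketch using the question's dimensions:

```python
import numpy as np
import pandas as pd
import tensorflow as tf

pixels, h1 = 3072, 1024
W = tf.Variable(tf.random.truncated_normal([pixels, h1],
                                           stddev=np.sqrt(2.0 / pixels)))

w_np = W.numpy()        # plain numpy array, shape (3072, 1024)
df = pd.DataFrame(w_np)  # one row per input pixel, one column per hidden unit
```

From there df.describe(), df.hist(), etc. work as on any DataFrame, which is handy for the kind of examination the question asks about.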