How to save a tensorflow model trained in google datalab notebook for offline prediction? - tensorflow

I am using Google Cloud Datalab notebook to train my tensorflow model. I want to save the trained model for offline prediction. However, I am clueless on how to save the model. Should I use any tensorflow model saving method or is there any datalab/google cloud storage specific method to do so? Any help in this regard is highly appreciated.

You can use any tensorflow model saving method, but I would suggest that you save it into a Google Cloud Storage bucket and not to local disk. Most tensorflow methods accept Google Cloud Storage paths in place of file names, using the gs:// prefix.
I would suggest using the SavedModelBuilder as it is currently the most portable. There is an example here: https://github.com/GoogleCloudPlatform/cloudml-samples/blob/master/flowers/trainer/model.py#L393
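For illustration only, here is a minimal sketch of such an export in TensorFlow 1.x, writing straight to a gs:// path (the bucket path, placeholder tensors and dense layer are stand-ins, not from your model):

import tensorflow as tf

export_dir = 'gs://my-bucket/my-model/1'  # hypothetical GCS path

# Stand-ins for your real input and output tensors
x = tf.placeholder(tf.float32, shape=[None, 10], name='x')
y = tf.layers.dense(x, 1, name='y')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'x': x}, outputs={'y': y})
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'serving_default': signature})
    builder.save()  # writes saved_model.pb and variables/ under export_dir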

Related

export_inference_graph with google function or cloudML serverless

I use the TensorFlow Models object detection API to train a model on the cloud with this tutorial, and I would like to know if there is an option to also export the model with the Cloud ML Engine or with a Google Cloud Function.
Their tutorial only shows a local export example.
I have trained the model and now I don't want to create an instance (or use my laptop) just to create the exported .pb file for inference.
Thanks for the help
Take a look at these tutorials:
https://cloud.google.com/ai-platform/docs/getting-started-keras
https://cloud.google.com/ai-platform/docs/getting-started-tensorflow-estimator

How to train a Keras model on GCP with fit_generator

I have an ML model developed in Keras and I can train it locally by calling its fit_generator and providing it with my custom generator. Now I want to use GCP to train this model. I've been following this article that shows how to train a Keras model on GCP, but it does not say what I should do if I need to load all my data into memory, process it, and then feed it to the model through a generator.
Does anyone know how I can use GCP if I have a generator?
In the example you are following, the Keras model gets converted into an estimator using the function model_to_estimator; this step is not necessary in order to use GCP, as GCP supports compiled Keras models. If you keep the model as a Keras model, you can call either its fit method (which supports generators since TensorFlow 1.12) or fit_generator, passing your generator as the first argument. If it works locally for you, then it should also work in GCP. I have been able to run models similar to the one in the URL you shared on GCP, using generators, without any problems.
Also be advised that the gcloud ml-engine commands are being replaced by gcloud ai-platform. I recommend you follow this guide, as it is more updated than the one you linked to.
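A rough sketch of that pattern (the layer sizes and the toy generator are placeholders for your own model and data pipeline):

import numpy as np
import tensorflow as tf

def data_generator(batch_size=32):
    # Toy generator standing in for your custom generator
    while True:
        x = np.random.rand(batch_size, 10).astype('float32')
        y = np.random.rand(batch_size, 1).astype('float32')
        yield x, y

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

# fit_generator works here; per the answer above, fit also accepts
# generators from TensorFlow 1.12 onwards.
model.fit_generator(data_generator(), steps_per_epoch=100, epochs=5)

If this runs locally, the same code can be packaged and submitted to AI Platform as described in the guide linked above.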

Can I use AWS Sagemaker without S3

If I am not using the notebook on AWS but instead just the Sagemaker CLI and want to train a model, can I specify a local path to read from and write to?
If you use local mode with the SageMaker Python SDK, you can train using local data:
from sagemaker.mxnet import MXNet

mxnet_estimator = MXNet('train.py',
                        train_instance_type='local',
                        train_instance_count=1)
mxnet_estimator.fit('file:///tmp/my_training_data')
However, this only works if you are training a model locally, not on SageMaker. If you want to train on SageMaker, then yes, you do need to use S3.
For more about local mode: https://github.com/aws/sagemaker-python-sdk#local-mode
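For comparison, a hedged sketch of the same estimator pointed at managed SageMaker instances, where the input channel must be an S3 URI (the role ARN, bucket and instance type are placeholders):

from sagemaker.mxnet import MXNet

mxnet_estimator = MXNet('train.py',
                        role='arn:aws:iam::123456789012:role/SageMakerRole',  # placeholder
                        train_instance_type='ml.m5.xlarge',
                        train_instance_count=1)
mxnet_estimator.fit('s3://my-bucket/my_training_data')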
As far as I know, you cannot do that. SageMaker's framework and estimator APIs make it easy for SageMaker to feed data into the model at every iteration or epoch. Feeding from local storage would drastically slow down the process.
That begs the question: why not use S3? It's cheap and fast.

Train Tensorflow on Google Cloud ML

I have a model that I am trying to train on my local machine, but it needs more RAM than I have on my computer.
Because of this, I wish to train this model on Google Cloud ML.
This model that I am trying to train uses Reinforcement Learning and takes some actions and receives rewards from an environment developed in Python that takes as input a CSV file.
How can I export these to be trained on Google Cloud ML?
Can these reward files be stored in Google Cloud Storage? TensorFlow reads such files natively if you use tf.gfile.
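As a rough sketch (assuming TensorFlow 1.x and a hypothetical bucket path), the environment's CSV file could be read from GCS like this:

import csv
import tensorflow as tf

# tf.gfile handles gs:// paths transparently
with tf.gfile.GFile('gs://my-bucket/rewards.csv', 'r') as f:
    rows = list(csv.reader(f))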

Use Google Cloud Machine Learning service to predict with a locally retrained Inception model

I have locally retrained the Inception model using the retrain.py file from Google Code Lab TensorFlow for Poets and want to use Google Cloud machine Learning service to make predictions.
Specifically, I want to modify the retrain.py file, so my TensorFlow application is prepared for
gcloud beta ml predict --instances=INSTANCES --model=MODEL
(i.e., prediction only; no need for Google Cloud ML training à la gcloud beta ml jobs submit training).
I understand conceptually that the retrain.py file must be modified as described in Preparing a Model.
But there is no complete answer showing all the lines of code in the retrain.py file after being modified. The popularity of Google Code Lab TensorFlow for Poets and Pete Warden's screencasts about retraining Inception suggests this is a very common image classification example in the TensorFlow community, so an answer will benefit many of its members.
Will someone please answer with their version of the retrain.py file after being modified as described in Preparing a Model?
Note 1:
I have researched my question to confirm it has not been answered…
… The question asked by Davide Biraghi and answered by JoshGC “Q: How predict an image in google machine learning” does not show any modifications to the retrain.py file that retrains the Inception model in Google Code Lab TensorFlow for Poets.
… The question asked by KlezFromSpace and answered by rhaertel80 (with helpful comments by Robert Lacok) “Q: Deploy Retrained inception model on Google cloud machine learning” does not show all the lines of code in the retrain.py file after being modified for: Defining outputs; Creating inputs; Supporting variable batch sizes; Using instance keys; Adding input and output collections to the graph; and Exporting (saving) the final model. (See above Preparing a Model.)
… The question asked by Vinkeet Kaushik and answered by Robert Lacok (with helpful comments by mrry) “Q: Export a basic Tensorflow model to Google Cloud ML” is not specific to the retrain.py file that retrains the Inception model in Google Code Lab TensorFlow for Poets.
Note 2:
I assume the JPEG image for which a prediction is to be made is supplied via
gcloud beta ml predict --instances=INSTANCES --model=MODEL
where INSTANCES is the path to a JSON file with information about the image as per the question asked by Davide Biraghi and answered by rhaertel80 “Q: How convert a jpeg image into json file in Google machine learning”
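For illustration, a hedged sketch of that JPEG-to-JSON conversion (the key names "key" and "image_bytes" are common choices but depend on the input signature your exported model actually defines):

import base64
import json
import sys

# Encode a local JPEG and write a single-instance JSON request file
with open(sys.argv[1], 'rb') as f:
    img = base64.b64encode(f.read()).decode('utf-8')

with open('request.json', 'w') as f:
    json.dump({'key': '0', 'image_bytes': {'b64': img}}, f)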
Note 3:
I assume I will manually store the EXPORT and EXPORT.META files saved by the modified retrain.py file at the URL I use to create MODEL in Google Cloud Console.
This posting yesterday by Google's Slaven Bilac appears to be the answer.