Where is the source for the TensorFlow gym environments implementation?

I need to implement a custom TensorFlow gym environment to use with TF-Agents.
Is there code on GitHub for a "standard" gym environment, e.g. cart pole?
Please note this is a TensorFlow-specific question, not an OpenAI one.

In addition to my comment:
OpenAI Gym provides a cartpole example:
Cartpole.py
If you need any additional information regarding Gym, CartPole integration, or actor-critics, I can highly recommend the following lectures:
Reinforcement Learning # FAU

According to this tutorial:
https://www.tensorflow.org/agents/tutorials/2_environments_tutorial
you either create a Python (OpenAI Gym-style) environment and then use a TensorFlow wrapper to make it more efficient (covered in the tutorial),
or
create a pure TensorFlow environment; a code example is here:
https://github.com/tensorflow/agents/blob/master/tf_agents/environments/tf_environment_test.py
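As a rough sketch of the first route, a custom environment is just a class with reset/step logic. The toy environment below mirrors the shape of the TF-Agents Python environment interface, but it is written dependency-free here: the `TimeStep` namedtuple and the step-type constants are simplified stand-ins, not the real tf_agents API.

```python
# Dependency-free sketch of a custom environment in the spirit of
# TF-Agents' PyEnvironment. The TimeStep tuple and step-type constants
# below are simplified stand-ins for the real tf_agents types.
from collections import namedtuple

TimeStep = namedtuple("TimeStep", ["step_type", "reward", "observation"])
FIRST, MID, LAST = 0, 1, 2  # stand-ins for ts.StepType.{FIRST,MID,LAST}

class CountToTenEnv:
    """Toy episode: reward 1 per step, episode ends after 10 steps."""

    def reset(self):
        self._count = 0
        return TimeStep(FIRST, 0.0, self._count)

    def step(self, action):
        self._count += 1
        if self._count >= 10:
            return TimeStep(LAST, 1.0, self._count)
        return TimeStep(MID, 1.0, self._count)
```

With the actual library you would subclass `tf_agents.environments.py_environment.PyEnvironment`, implement `action_spec()`, `observation_spec()`, `_reset()`, and `_step()`, and then wrap the instance with `tf_agents.environments.tf_py_environment.TFPyEnvironment` to use it in a TF-Agents training loop.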

What is planned for the tf model garden?

First, thanks for a great library. While it offers lots of great implementations, it seems that at least some parts of it do not keep up with the pace of TensorFlow development.
What is planned for the object detection stuff? Will tf-slim be replaced with something actively maintained? Is TF2 support planned?
The official repository provides a collection of example implementations of state-of-the-art (SOTA) models using TensorFlow 2's high-level APIs.
The TensorFlow Model Garden team is actively working on providing more TensorFlow 2 models.
Please read this blog for more information.
https://blog.tensorflow.org/2020/03/introducing-model-garden-for-tensorflow-2.html
Please also check the GitHub repository for more news.
https://github.com/tensorflow/models/tree/master/official#more-models-to-come
Please check the milestone for Object Detection API at https://github.com/tensorflow/models/milestones.
It will support TensorFlow 2 by early July.

How to use a custom model with Tensorflow Hub?

My goal is to test out Google's BERT algorithm in Google Colab.
I'd like to use a pre-trained custom model for Finnish (https://github.com/TurkuNLP/FinBERT). The model cannot be found in the TF Hub library, and I have not found a way to load it with TensorFlow Hub.
Is there a neat way to load and use a custom model with Tensorflow Hub?
Fundamentally: yes. Anyone can create the kind of models that TF Hub hosts, and I hope authors of interesting models consider doing that.
For TF1 and the hub.Module format tailored to it, see
https://www.tensorflow.org/hub/tf1_hub_module#creating_a_new_module
For TF2 and its revised SavedModel format, see
https://www.tensorflow.org/hub/tf2_saved_model#creating_savedmodels_for_tf_hub
That said, a sophisticated model like BERT requires a bit of attention to export it with all the bells and whistles, so it helps to have some tooling to build on. The BERT reference implementation for TF2 at https://github.com/tensorflow/models/tree/master/official/nlp/bert comes with an open-sourced export_tfhub.py script, and anyone can use that to export custom BERT instances created from that code base.
However, I understand from https://github.com/TurkuNLP/FinBERT/blob/master/nlpl_tutorial/training_bert.md#general-info that you are using Nvidia's fork of the original TF1 implementation of BERT. There are Hub modules created from the original research code, but the tooling to that end has not been open-sourced, and Nvidia doesn't seem to have added their own either.
If that's not changing, you'll probably have to do things the pedestrian way: get acquainted with their codebase and load their checkpoints into it.

How to serve PyTorch or scikit-learn models using TensorFlow Serving

I have found tutorials and posts which only say how to serve TensorFlow models using TensorFlow Serving.
In the model.conf file, there is a parameter model_platform in which tensorflow or any other platform can be specified. But how do we export other-platform models in the TensorFlow way so that they can be loaded by TensorFlow Serving?
I'm not sure you can. The tensorflow platform is designed to be flexible, but if you really want to use it, you'd probably need to implement a C++ library to load your saved model (in protobuf) and provide a servable to the TensorFlow Serving platform. Here's a similar question.
I haven't seen such an implementation, and the efforts I've seen usually go towards two other directions:
Pure Python code serving a model over HTTP or gRPC, for instance, such as what's being developed in Pipeline.AI
Dumping the model in PMML format and serving it with Java code.
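The first direction (pure Python serving over HTTP) can be sketched with the standard library alone. The `predict` function below is a placeholder stub standing in for a real scikit-learn or PyTorch model's `predict()` call; the request/response shape loosely imitates a JSON prediction API:

```python
# Minimal sketch of serving a model over HTTP with only the standard
# library. predict() is a placeholder stand-in: in practice you would
# load a scikit-learn or PyTorch model and call its predict() method.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Placeholder "model": returns the sum of each feature row as a score.
    return [sum(row) for row in features]

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body: {"instances": [[...], [...]]}
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        result = {"predictions": predict(body["instances"])}
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging
```

To run it standalone: `HTTPServer(("localhost", 8000), PredictHandler).serve_forever()`. This is obviously far from production-grade (no batching, no model versioning), which is exactly the gap tools like the ones below try to fill.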
Not answering the question, but since no better answers exist yet: as an addition to the alternative directions from adrin, these might be helpful:
Clipper (Apache License 2.0) is able to serve PyTorch and scikit-learn models, among others
Further reading:
https://www.andrey-melentyev.com/model-interoperability.html
https://medium.com/@vikati/the-rise-of-the-model-servers-9395522b6c58
Now you can serve your scikit-learn model with Tensorflow Extended (TFX):
https://www.tensorflow.org/tfx/guide/non_tf

What is the difference between TensorFlow's contrib.slim.nets and models/slim/nets?

In the GitHub repository we have tensorflow/models containing slim, and we also have slim in tensorflow.contrib.slim.
They both have similar names, functionality, and structure, and they provide similar nets, for example inception_v1.
Any reference for this brain split?
Why did they not just use a git submodule? Any discussion link?
Which is the most usable/stable/maintained?
Which is the real net used to train the pre-trained data: this one or this one?
Which one of those two is the real Slim Shady?
From https://github.com/tensorflow/models/blob/master/research/slim/slim_walkthrough.ipynb, under the section titled Installation and setup:
Since the stable release of TF 1.0, the latest version of slim has been available as tf.contrib.slim, although, to use TF-Slim for image classification (as we do in this notebook), you also have to install the TF-Slim image models library from here.

How does TensorFlow work?

I am new to TensorFlow.
I am looking for some help in understanding the minimum I would need to set up and work with a TensorFlow system.
Do I really need to read through the TensorFlow website documentation to understand the whole workflow?
The basics of TensorFlow are that we first create a model, called a computational graph, out of TensorFlow objects, and then we create a TensorFlow session in which we run all of the computation.
To install on Windows, I found this webpage: Installation of tensorflow in windows.
To learn more about TensorFlow, you can also see the tensorflow guide.
I hope this helps.
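The "build a graph first, run it in a session later" idea can be illustrated with a toy example in plain Python. This is deliberately not the TensorFlow API, just the deferred-execution concept it is based on:

```python
# Toy illustration of TensorFlow's graph-then-session idea in plain
# Python (NOT the TensorFlow API): building a node records the operation
# without computing anything; run() plays the role of Session.run.

def add(a, b):
    return ("add", a, b)  # record the op, don't compute yet

def mul(a, b):
    return ("mul", a, b)

def run(node, feed):
    """Like Session.run: evaluate a graph node given placeholder values."""
    if isinstance(node, str):          # a placeholder name like "x"
        return feed[node]
    op, a, b = node
    x, y = run(a, feed), run(b, feed)
    return x + y if op == "add" else x * y

# Build the graph (no computation happens here)...
graph = add(mul("x", "y"), "z")
# ...then execute it with concrete values fed in.
print(run(graph, {"x": 2, "y": 3, "z": 4}))  # → 10
```

In real TF1-style code the same shape appears as `tf.placeholder`, ops like `tf.add`/`tf.multiply`, and `sess.run(graph, feed_dict=...)`; TensorFlow 2 executes eagerly by default, so this two-phase pattern is mostly hidden unless you use `tf.function`.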
YES YOU SHOULD!
Here is an easier version of tutorial: https://pythonprogramming.net/tensorflow-introduction-machine-learning-tutorial/
An easier and more fun version: How to Make a Tensorflow Neural Network (LIVE)