How to write cross-framework machine learning code for TensorFlow and PyTorch?

Machine learning frameworks comprise, amongst other things, the following functions:
augmentations
metrics and losses
These functions are simple transformations of tensors and seem rather framework-independent. However, TensorFlow's categorical cross-entropy loss, for example, uses TensorFlow-specific functions like tf.convert_to_tensor() or tf.cast(), so it cannot easily be used in PyTorch. Also, to my knowledge, TensorFlow strongly prefers to work with TensorFlow tensors rather than NumPy arrays so that it can build TensorFlow graphs.
Are there any existing efforts or ideas on how to write such functions so that they can be used in both frameworks? I'm thinking of pure NumPy functions which can somehow be converted to either TensorFlow or PyTorch.
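One approach worth sketching (an assumption on my part, not an established pattern from either framework): both tf.Tensor and torch.Tensor overload the standard arithmetic operators, so a function that uses only those operators and receives its reduction as a parameter runs unchanged in either framework. The name mse below is purely illustrative.

def mse(y_true, y_pred, mean_fn):
    # Relies only on operator overloading, which tf.Tensor, torch.Tensor
    # and np.ndarray all support; the reduction is supplied by the caller.
    diff = y_pred - y_true
    return mean_fn(diff * diff)

# TensorFlow:
#   import tensorflow as tf
#   mse(tf.constant([1.0, 2.0]), tf.constant([1.5, 2.5]), tf.reduce_mean)
# PyTorch:
#   import torch
#   mse(torch.tensor([1.0, 2.0]), torch.tensor([1.5, 2.5]), torch.mean)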

Related

Why do we use keras back-end command in codes?

import tensorflow as tf
from tensorflow import keras
from keras import backend as K
What is the reason behind using the following command?
from keras import backend as K
What does it do? I would appreciate it if anyone could explain it in a simple way, so it does not get complicated in my mind.
You can find more information on what the Keras backend actually is here or here.
In simpler terms, here is what the Keras backend actually is:
Keras is a model-level library that provides high-level building blocks for developing deep learning models. Keras does not provide low-level operations such as tensor multiplication and convolution. Instead, it relies on a specialized, well-optimized tensor library that serves as Keras' "backend engine". Instead of choosing one single tensor library and tying your Keras implementation to that library, Keras handles the problem in a modular way, allowing you to seamlessly connect multiple different backend engines to Keras.
The Keras backend lets you write custom code, or in some cases a whole new "Keras module" for your use case, that supports both Theano and TensorFlow. For example, instead of tf.placeholder() you can write keras.backend.placeholder(), which works across both of the libraries mentioned earlier.
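As an illustration, here is a minimal sketch of a custom loss written purely against the backend API; the name root_mean_squared_error is an assumed example, not something defined above. Because it only calls K.* functions, the same code runs on whichever backend Keras is configured to use.

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    # K.square, K.mean and K.sqrt dispatch to the active backend
    # (TensorFlow, Theano or CNTK), so no tf.* calls are needed.
    return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1))

# model.compile(optimizer="adam", loss=root_mean_squared_error)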

What is the difference between tf.square, tf.math.square and tf.keras.backend.square?

I have been looking to learn TensorFlow and I have noticed that different functions are used for the same goal. To square a variable for instance, I have seen tf.square(), tf.math.square() and tf.keras.backend.square(). This is the same for most math operations. Are all these the same or is there any difference?
Mathematically, they should produce the same result. However, the functions under tf.math operate on TensorFlow tensors.
For example, when you write a custom loss or metric, the inputs and outputs should be TensorFlow tensors so that TensorFlow knows how to take gradients of the function. You can also use tf.keras.backend.* functions for custom losses and the like.
Try to use the tf.math functions whenever you can; native operations are preferred because they are officially documented and guaranteed to have backward compatibility between TF versions like TF 1.x and TF 2.x.
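As a quick sanity check (a sketch assuming TF 2.x, where tf.keras is bundled), all three spellings compute the same element-wise square:

import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0])

a = tf.square(x)                # alias of tf.math.square
b = tf.math.square(x)           # the canonical, documented location
c = tf.keras.backend.square(x)  # Keras backend wrapper over the same op

print(a.numpy(), b.numpy(), c.numpy())  # [1. 4. 9.] three times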

Should I use the standalone Keras library or tf.keras?

As Keras has become an API within TensorFlow, there is a lot of older Keras code around, such as https://github.com/keiserlab/keras-neural-graph-fingerprint/blob/master/examples.py
from keras import models
With the current version of TensorFlow, do we need to change every Keras import to the following?
from tensorflow.keras import models
You are mixing things up:
Keras (https://keras.io/) is a library independent from TensorFlow, which specifies a high-level API for building and training neural networks and is capable of using one of multiple backends (among which, TensorFlow) for low-level tensor computation.
tf.keras (https://www.tensorflow.org/guide/keras) implements the Keras API specification within TensorFlow. In addition, the tf.keras API is optimized to work well with other TensorFlow modules: you can pass a tf.data Dataset to the .fit() method of a tf.keras model, for instance, or convert a tf.keras model to a TensorFlow estimator with tf.keras.estimator.model_to_estimator. Currently, the tf.keras API is the high-level API to look for when building models within TensorFlow, and the integration with other TensorFlow features will continue in the future.
So to answer your question: no, you don't need to convert Keras code to tf.keras code. Keras code uses the Keras library, potentially even running on top of a different backend than TensorFlow, and will continue to work just fine in the future. Even more important: don't mix Keras and tf.keras objects within the same script, since this might produce incompatibilities, as you can see for example in this question.
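A minimal sketch of what keeping the two apart looks like (the imports and the tiny model below are illustrative, not taken from the linked question): pick one implementation and use it throughout the script.

# Problematic: layers from standalone Keras, the model class from tf.keras.
# The two libraries create different object types and don't interoperate.
#   from keras.layers import Dense
#   from tensorflow.keras.models import Sequential

# Consistent: everything from tf.keras (or everything from standalone Keras).
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential

model = Sequential([Dense(1, input_shape=(4,))])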
Update: Keras will be abandoned in favor of tf.keras: https://twitter.com/fchollet/status/1174019423541157888

TensorFlow and Keras

I am a newbie in deep learning and I tend to confuse Keras and TensorFlow. Knowing that TensorFlow is a framework and Keras is a library, what is the difference between using these two deep learning tools?
Keras' purpose is to make it easier to use a backend framework such as TensorFlow, Theano or CNTK.
For example, creating a simple convolutional model directly in TensorFlow can be hard,
while creating the same model in Keras is very intuitive.
The difference between TensorFlow/Theano/CNTK and Keras is the following:
Keras is a high-level library that uses the functions of TensorFlow/Theano/CNTK.
So Keras needs one of them underneath to do anything.
TensorFlow/Theano/CNTK (or others such as Caffe) can do everything by themselves.
But it is often harder to develop a model with them directly.
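For illustration, here is a minimal sketch of the kind of intuitive convolutional model the answer refers to, written with the Keras Sequential API (the layer sizes and input shape are illustrative assumptions):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# A small image classifier: Keras wires up the backend's convolution,
# pooling and dense ops behind these high-level layer objects.
model = Sequential([
    Conv2D(16, kernel_size=3, activation="relu", input_shape=(28, 28, 1)),
    MaxPooling2D(),
    Flatten(),
    Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()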

Why use keras as backend instead of using tensorflow?

I see that there are many similar functions between TensorFlow and Keras, like argmax, boolean_mask... I wonder why people have to use Keras as a backend along with TensorFlow instead of using TensorFlow alone.
Keras is not a backend; it is a high-level API for building and training neural networks. Keras is capable of running on top of TensorFlow, Theano and CNTK. Most people prefer Keras due to its simplicity compared to other libraries like TensorFlow. I recommend Keras for beginners in deep learning.
A Keras tensor is a tensor object from the underlying backend (Theano, TensorFlow or CNTK), which we augment with certain attributes that allow us to build a Keras model just by knowing the inputs and outputs of the model.
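A minimal sketch of what that quote describes (the layer sizes are illustrative): Keras tensors carry enough metadata that a model can be built from nothing but its input and output tensors.

from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(16,))   # a Keras tensor from the active backend
outputs = Dense(1)(inputs)    # another Keras tensor that records its history

model = Model(inputs=inputs, outputs=outputs)  # built from inputs and outputs only
model.summary()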
Theano vs Tensorflow
TensorFlow is necessary if you wish to use coremltools. Apple has promised support for architectures created using Theano, but I haven't seen it yet.
Keras requires slightly different syntactic sugar depending on the backend in use. I like the flexibility of TensorFlow input layers and the easy access to strong Google neural networks.