TensorFlow and KenLM

How does one use KenLM with tensorflow as decoder?
I know about the tensorflow-with-kenlm TF fork, but it is based on TF 1.1, which lacks many features that are important for my project.

Related

How to make predictions with Mask R-CNN and python 3.10

My problem:
I have weights for a Mask R-CNN Model, which has been trained using python 3.7 and tensorflow 1.13.1. I can use this environment to make predictions.
I am able to reproduce those predictions using python 3.8 and loading the weights with the Mask R-CNN for tensorflow 2 and tensorflow 2.4.1.
When I use python 3.10 and tensorflow 2.9.1, the prediction process runs without any errors, but the results do not make any sense: the class instances are just a few randomly distributed specks. The results look similar for python 3.8 with tensorflow 2.9.1.
Where I'm at
I need to use python 3.10; I don't care about the tensorflow version. I found requirements for an environment that should work for python 3.9 with tensorflow 2.7, but to my understanding, python 3.10 needs tensorflow 2.8 or higher.
What I need
I have no experience with tensorflow or Mask R-CNN, so I don't really know where to start. Has someone already encountered this kind of problem? Is it something typical, and does it point in a particular direction?

How to perform quantize aware training with tensorflow 1.15?

I am using tensorflow 1.15 due to dependencies on multiple other modules, and I am struggling to do quantization-aware training. I came across tensorflow_model_optimization, but it works with tensorflow 2.x. Is there any way quantization can be performed during training with tensorflow 1.15?
We cannot: quantization-aware training was introduced in TF 2.0, and there is no workaround for it in 1.x, so please upgrade to 2.x and let us know if you face any issues.

What is the difference between keras and tf.keras?

I'm learning TensorFlow and Keras. I'd like to try https://www.amazon.com/Deep-Learning-Python-Francois-Chollet/dp/1617294438/, and it seems to be written in Keras.
Would it be fairly straightforward to convert code to tf.keras?
I'm less interested in the portability of the code than in the true difference between the two.
The difference between tf.keras and keras is the TensorFlow-specific enhancement of the framework.
keras is an API specification that describes how a deep learning framework should implement certain parts related to model definition and training. It is framework-agnostic and supports different backends (Theano, TensorFlow, ...).
tf.keras is the TensorFlow-specific implementation of the Keras API specification. It adds TensorFlow-specific features to the framework, such as first-class support for tf.data.Dataset as input objects, support for eager execution, and more.
In TensorFlow 2.0, tf.keras will be the default, and I highly recommend starting to work with tf.keras.
At this point tensorflow has pretty much entirely adopted the keras API, and for good reason: it's simple, easy to use, and easy to learn, whereas "pure" tensorflow comes with a lot of boilerplate code. And yes, you can use tf.keras without any issues, though you might have to re-work the imports in your code. For instance:
from keras.layers.pooling import MaxPooling2D
Would turn into:
from tensorflow.keras.layers import MaxPooling2D
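If a codebase has many such imports, the rename is mechanical enough to script. Below is a minimal sketch in pure Python; the function name and regexes are my own, not part of either library. A purely textual rewrite cannot fix module paths that changed between the packages (such as keras.layers.pooling above, which becomes plain tensorflow.keras.layers), so review the output by hand.

```python
import re

def rewrite_keras_imports(source: str) -> str:
    """Rewrite standalone-keras imports to their tf.keras form.

    Purely textual: module paths that differ between the two packages
    (e.g. keras.layers.pooling) still need manual fixing afterwards.
    """
    # `from keras.xyz import ...` -> `from tensorflow.keras.xyz import ...`
    source = re.sub(r"^(\s*)from keras(\.|\s)",
                    r"\1from tensorflow.keras\2", source, flags=re.M)
    # bare `import keras` -> `import tensorflow.keras as keras`
    # (keeps the old `keras.` prefix usable in the rest of the file)
    source = re.sub(r"^(\s*)import keras\s*$",
                    r"\1import tensorflow.keras as keras", source, flags=re.M)
    return source

print(rewrite_keras_imports("from keras.models import Sequential"))
```

This only touches lines that start an import of the standalone package, so unrelated names like keras_tuner are left alone.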
The history of Keras vs. tf.keras is long and twisted.
Keras: a high-level (easy-to-use) API built by Google AI developer/researcher François Chollet. It is written in Python and capable of running on top of backend engines like TensorFlow, CNTK, or Theano.
TensorFlow: a library, also developed by Google, for the deep learning developer community, making deep learning applications accessible and usable to the public. It is open source and available on GitHub.
With the release of Keras v1.1.0, TensorFlow was made the default backend engine. That meant: if you installed Keras on your system, you were also installing TensorFlow.
Later, with TensorFlow v1.10.0, the tf.keras submodule was introduced into TensorFlow for the first time: the first step in integrating Keras within TensorFlow.
With the release of Keras 2.3.0:
- It was the first release of Keras in sync with tf.keras.
- It was the last major release to support multiple backend engines.
- Most importantly, going forward, it is recommended to switch code from keras to TensorFlow 2.0 and the tf.keras packages.
Refer to this tweet from François Chollet recommending tf.keras.
That means,
Change Everywhere
From
from keras.models import Sequential
from keras.models import load_model
To
from tensorflow.keras.models import Sequential
from tensorflow.keras.models import load_model
And In requirements.txt,
tensorflow==2.3.0
*Disclaimer: it might cause conflicts if you were using an older version of Keras. Do pip uninstall keras in that case.
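To check whether a standalone keras is still installed next to tensorflow before deciding to uninstall it, a quick probe like the following works. The helper name module_installed is mine, not a library function:

```python
import importlib.util

def module_installed(name: str) -> bool:
    """Return True if a module of this name can be found in the current environment."""
    return importlib.util.find_spec(name) is not None

# If both are present, the standalone keras can conflict with tf.keras.
if module_installed("keras") and module_installed("tensorflow"):
    print("standalone keras found next to tensorflow; consider `pip uninstall keras`")
```

This only checks importability; it does not tell you which package a given `import keras` will actually resolve to if both are on the path.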

Replace "from tensorflow.contrib import layers"

How do I replace from tensorflow.contrib import layers with the new core functionality? I want to move my TF 1.4 code to 1.12 in preparation for TF 2.0.
The core functionality corresponding to tf.contrib.layers is in tf.layers. Some of the differences are discussed in this question. However this will not prepare you for TF 2.0.
If your goal is to prepare your code for TF 2.0, consider that tf.contrib will be removed entirely (parts split out of TF, parts integrated into it), and that tf.layers will be removed too; the high-level API will reside under tf.keras. So to best prepare for TF 2.0, you should start using tf.keras.layers instead.
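As a rough migration aid, the common tf.contrib.layers functions correspond to tf.keras.layers classes roughly as follows. This is an approximate mapping I've assembled, not an official table; argument names and defaults differ between the two APIs (for example, contrib layers default to a relu activation while Keras layers default to linear), so check each call site.

```python
# Approximate tf.contrib.layers -> tf.keras.layers correspondence.
# Defaults and argument names differ between the two APIs; verify per call site.
CONTRIB_TO_KERAS = {
    "tf.contrib.layers.fully_connected": "tf.keras.layers.Dense",
    "tf.contrib.layers.conv2d":          "tf.keras.layers.Conv2D",
    "tf.contrib.layers.max_pool2d":      "tf.keras.layers.MaxPooling2D",
    "tf.contrib.layers.avg_pool2d":      "tf.keras.layers.AveragePooling2D",
    "tf.contrib.layers.batch_norm":      "tf.keras.layers.BatchNormalization",
    "tf.contrib.layers.dropout":         "tf.keras.layers.Dropout",
    "tf.contrib.layers.flatten":         "tf.keras.layers.Flatten",
}

print(CONTRIB_TO_KERAS["tf.contrib.layers.fully_connected"])
```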
Here is a blog post about some of the practical differences to expect with TF 2.0.

How do I identify the Keras version that is merged into the current TensorFlow?

I am trying to use Keras/TensorFlow, but some options are not supported (e.g. the TensorBoard callback's embeddings_freq). I want to know TensorFlow's merging policy for Keras, especially the synchronization schedule, and how to check the merged Keras version.
The Keras in tf.keras is a reimplementation of Keras, not a merge of any particular keras release. File issues if features you need are missing.
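That said, tf.keras does expose a version string indicating which Keras API version the bundled implementation corresponds to (historically with a -tf suffix, e.g. 2.1.6-tf). A quick check, assuming tensorflow is installed:

```python
import tensorflow as tf

# The Keras API version that this TensorFlow build's tf.keras implements.
print(tf.keras.__version__)
# The TensorFlow version itself, for reference.
print(tf.__version__)
```

The reported version tells you which API level to compare against the standalone keras docs, not that the code is identical to that keras release.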