DeepPavlov error loading the model from Tensorflow (from_tf=True) - python-3.8

I'm trying to load the ruBERT model into Deeppavlov as follows:
# is a dict
config_path = {
    "chainer": {
        "in": [
            "x"
        ],
        "in_y": [
            "y"
        ],
        "out": [
            "y_pred_labels",
            "y_pred_probas"
        ],
        "pipe": [
            ...
        ]
    }
}
model = build_model(config_path, download=False)
At the same time, I have all the files of the original ruBERT model locally. However, an error is thrown when building the model:
OSError: Error no file named pytorch_model.bin found in directory ruBERT_hFace2 but there is a file for TensorFlow weights. Use `from_tf=True` to load this model from those weights.
However, I can't find a clear explanation anywhere of how to pass this parameter through the build_model function.
How do I pass this parameter through build_model correctly?
UPDATE 1
At the moment, DeepPavlov 1.0.2 is installed.
The checkpoint of the model consists of the following files:

Currently there is no way to pass extra parameters via build_model. If a component needs an additional parameter, you should adjust the configuration file accordingly. Alternatively, you can change the configuration via Python code:
from deeppavlov import build_model, configs, evaluate_model
from deeppavlov.core.commands.utils import parse_config

config = parse_config("config.json")
...  # modify the parsed config dict here
model = build_model(config, download=True, install=True)
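If editing the configuration itself proves awkward, one workaround (my suggestion, not part of the original answer) is to convert the TensorFlow weights into a PyTorch checkpoint once with the transformers library, so that no extra parameter is needed at all. A minimal sketch, assuming transformers and TensorFlow are installed and ruBERT_hFace2 is the checkpoint directory from the error message:

from transformers import AutoModel

# Load the TensorFlow checkpoint and re-save it in PyTorch format;
# afterwards the directory contains pytorch_model.bin and the
# DeepPavlov config can load it without from_tf.
model = AutoModel.from_pretrained("ruBERT_hFace2", from_tf=True)
model.save_pretrained("ruBERT_hFace2")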
But first, please make sure that you are using the latest version of DeepPavlov. In addition, please take a look at our recent article on Medium. If you need further assistance, please provide more details.

Related

import @with_kw from Parameters inside my own package

I have created my own Julia package and I am new to such practices.
I previously had this code:
using Parameters

@with_kw mutable struct MWE @deftype String
    mwe1 = "default"; @assert mwe1 in ["default", "1", "2"]
    mwe2 = "test"
end
This worked well. Now that I have put it into my package, replacing using with import, I get the following error:
julia> import MyPackageMWE
[ Info: Precompiling RobustRSP [33e6bdf6-6d3e-458b-9f4e-8cd6eb784281]
[ Info: Loading JuMP
[ Info: Loading Gurobi
[ Info: Loading Combinatorics, DelimitedFiles, Dates and Random
[ Info: Loading Distributions, Graphs and Plots
[ Info: Loading Parameters and Formatting
[ Info: Loading Compose, Cairo and Fontconfig
[ Info: Loading .jl files 0%
ERROR: LoadError: UndefVarError: @with_kw not defined
Any ideas what went wrong?
Maybe it is related to @macros?
I don't know the Parameters package, but if @with_kw is exported by Parameters, then when you replace
using Parameters
with
import Parameters
you also need to change @with_kw to Parameters.@with_kw, as in the sketch below.
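For example, a minimal sketch of your struct (trimmed, with explicit field types instead of @deftype):

import Parameters

# With a plain `import`, the macro must be qualified by its module:
Parameters.@with_kw mutable struct MWE
    mwe1::String = "default"
    mwe2::String = "test"
end

Alternatively, import Parameters: @with_kw brings just the macro into scope, so the original unqualified @with_kw keeps working.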

Error after adding Nuxt to my Vue project (vue-style-loader css error)

Help me please. I added Nuxt (SSR) to my Vue project, moved everything to the root of the project, and created a pages folder. The following error appeared:
ERROR in ./.nuxt/components/nuxt-loading.vue?vue&type=style&index=0&lang=css& (./node_modules/css-loader/dist/cjs.js??ref--3-oneOf-1-1!./node_modules/vue-loader/lib/loaders/stylePostLoader.js!./node_modules/postcss-loader/src??ref--3-oneOf-1-2!./node_modules/vue-loader/lib??vue-loader-options!./.nuxt/components/nuxt-loading.vue?vue&type=style&index=0&lang=css&)
Module build failed (from ./node_modules/css-loader/dist/cjs.js):
ValidationError: Invalid options object. CSS Loader has been initialized using an options object that does not match the API schema.
- options.modules has an unknown property 'compileType'. These properties are valid:
object { auto?, mode?, exportGlobals?, localIdentName?, localIdentRegExp?, context?, hashPrefix?, getLocalIdent? }
at validate (/app/node_modules/css-loader/node_modules/schema-utils/dist/validate.js:98:11)
at Object.loader (/app/node_modules/css-loader/dist/index.js:36:28)
@ ./.nuxt/components/nuxt-loading.vue?vue&type=style&index=0&lang=css& (./node_modules/vue-style-loader??ref--3-oneOf-1-0!./node_modules/css-loader/dist/cjs.js??ref--3-oneOf-1-1!./node_modules/vue-loader/lib/loaders/stylePostLoader.js!./node_modules/postcss-loader/src??ref--3-oneOf-1-2!./node_modules/vue-loader/lib??vue-loader-options!./.nuxt/components/nuxt-loading.vue?vue&type=style&index=0&lang=css&) 4:14-327
@ ./.nuxt/components/nuxt-loading.vue?vue&type=style&index=0&lang=css&
@ ./.nuxt/components/nuxt-loading.vue
@ ./.nuxt/App.js
@ ./.nuxt/index.js
@ ./.nuxt/client.js
@ multi ./.nuxt/client.js
code from nuxt.config.js:
import { resolve } from 'path'

export default {
  alias: {
    style: resolve(__dirname, './assets/style'),
  },
}
As I understand it, the error is related to the Nuxt configuration.
In your package.json you have sass-loader set to ^12.1.0, but the package introduced a breaking change in v11.0.0: it requires webpack 5, which Nuxt is not compatible with as of today. If you downgrade it to v10.1.1 (and all related packages like node-sass), you should be fine.
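For reference, a sketch of the relevant package.json entries (the node-sass version here is illustrative; pick the one matching your Node version):

{
  "devDependencies": {
    "sass-loader": "10.1.1",
    "node-sass": "^4.14.1"
  }
}

Then remove node_modules and your lockfile and reinstall so the downgrade actually takes effect.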

How to create a tree visualization from a nested dictionary in Python 3

I would like to make a graphical visualization of a nested dictionary as a simple tree structure. I have tried several different solutions, but they are either too old (Python 2.7) or I get weird error messages, even after reinstalling the needed packages.
Here is an example of the nested dictionary. I can change the end nodes' values to whatever is most useful, and the dictionary should scale to a much bigger file structure.
{
    "Folder1": {},
    "Folder2": {
        "Folder21": {},
        "Folder22": {}
    },
    "Document1": {},
    "Document2": {},
    "Folder3": {
        "Document31": {},
        "Folder32": {
            "Document 321": {},
            "Document 322": {}
        },
        "Folder33": {
            "Document331": {}
        },
        "Folder34": {
            "Document341": {}
        }
    }
}
I have tried solutions using Mapping, NetworkX, GraphViz, pandas, matplotlib 3.1.3, Json, d3py 0.2.3, pyplot, numpy 1.18.1 and Pydot (pydot2 1.0.33, pydotplus 2.0.2).
I'm using pip3 18.1 to install the packages on Ubuntu 19.
The goal is to create something like the post below, but it is 7 years old and I can't get it to work after translating it from Python 2 to Python 3.
Python library for creating tree graphs out of nested Python objects (dicts)
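For what it's worth, a dependency-free sketch of my own (not from the post above) that renders such a nested dict as an indented text tree, which may be a useful starting point before reaching for a graphics library:

def print_tree(tree, indent=""):
    """Recursively print a nested dict as an indented tree."""
    for name, children in tree.items():
        print(indent + name)
        print_tree(children, indent + "    ")

print_tree({
    "Folder1": {},
    "Folder2": {"Folder21": {}, "Folder22": {}},
})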

How to configure ModelConfig.txt for TensorRT Models in Tensorflow Serving

Currently, in TensorFlow Serving you can specify a ModelConfig.txt that maps to a ModelConfig.proto file. This file contains a list of configurations for the multiple models that will run within the TensorFlow Serving instance.
For example:
model_config_list: {
    config: {
        name: "ssd_mobilenet_v1_coco",
        base_path: "/test_models/ssd_mobilenet_v1_coco/",
        model_platform: "tensorflow"
    },
    config: {
        name: "faster_rcnn_inception_v2_coco",
        base_path: "/test_models/faster_rcnn_inception_v2_coco/",
        model_platform: "tensorflow"
    }
}
As it stands, when I attempt to add a TensorRT-optimized model to the ModelConfig.txt, the system fails.
How can I resolve this?

Create different versions from one bootstrap file with require.js

I'm developing an iPad/iPhone web app. Both versions share some of the resources. Now I want to build a bootstrap JS file that looks like this:
requirejs(['app'], function(app) {
    app.start();
});
The app resource should be either ipadApp.js or iphoneApp.js, so I created the following build file for the optimizer:
{
    "appDir": "../develop",
    "baseUrl": "./javascripts",
    "dir": "../public",
    "modules": [
        {
            "name": "bootstrap",
            "out": "bootstrap-ipad.js",
            "override": {
                "paths": {
                    "app": "ipadApp"
                }
            }
        },
        {
            "name": "bootstrap",
            "out": "bootstrap-iphone.js",
            "override": {
                "paths": {
                    "app": "iphoneApp"
                }
            }
        }
    ]
}
But this doesn't seem to work. It works with just one module, but not with the same module and different outputs.
The only other solution that came to mind was four build files, which seems a bit odd. So is there a solution where I only need one build file?
AFAIK, the r.js optimizer can only output a module with a given name once; in your case you are attempting to generate the module named bootstrap twice. The author of require.js, @jrburke, made the following comment on a related issue here:
...right now you would need to generate a separate build command for each script being targeted, since the name property would always be "almond.js" for each one.
He also suggests:
...if you wanted just one build file to run, you could create a node program and drive the optimizer multiple times in one script file. This example shows using requirejs as a module and calling requirejs.optimize().
I took a similar approach in one of my projects: I made my build.js file an ERB template and created a Thor task that looped through my modules and ran r.js once for each one. But @jrburke's solution using Node.js is cleaner; a sketch of it follows.
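For illustration, a minimal Node.js sketch of that idea, driving requirejs.optimize() once per target (paths and module names are assumptions based on the build file above):

var requirejs = require('requirejs');

// One entry per target build: same bootstrap module, different "app" mapping.
var targets = [
    { out: 'public/bootstrap-ipad.js', app: 'ipadApp' },
    { out: 'public/bootstrap-iphone.js', app: 'iphoneApp' }
];

targets.forEach(function (target) {
    requirejs.optimize({
        baseUrl: 'develop/javascripts',
        name: 'bootstrap',
        out: target.out,
        paths: { app: target.app }
    }, function (buildResponse) {
        console.log('Built ' + target.out);
    }, function (err) {
        console.error(err);
    });
});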