Multi-stage $ref doesn't work with jsonschema version 3.0.1 - jsonschema

It seems that jsonschema version 3.0.1 does not accept multi-stage schema using $refs (while it works with jsonschema version 2.6.0).
I have to make it work under several module versions because my code will be running on different computers with different environments.
I verified my JSON files on https://www.jsonschemavalidator.net/ (thanks for this link, found in another StackOverflow question).
I tried:
jsonschema -i myjson.json noRefs.schema.json --> 2.6.0 = OK, 3.0.1 = OK
jsonschema -i myjson.json usingRefs.schema.json --> 2.6.0 = OK, 3.0.1 = KO
Note:
Both *.schema.json files worked on https://www.jsonschemavalidator.net/
File myjson.json:
{
    "TopProperty": {
        "LowerProperty": {"toto": "plop"}
    }
}
File noRefs.schema.json:
{
    "type": "object",
    "properties": {
        "TopProperty": {"$ref": "#/schemaTopProperty"}
    },
    "schemaTopProperty": {
        "$id": "schemaTopProperty",
        "type": "object",
        "properties": {
            "LowerProperty": {
                "type": "object",
                "properties": {
                    "toto": {"type": "string"}
                }
            }
        }
    }
}
File usingRefs.schema.json:
{
    "type": "object",
    "properties": {
        "TopProperty": {"$ref": "#/schemaTopProperty"}
    },
    "schemaTopProperty": {
        "$id": "schemaTopProperty",
        "type": "object",
        "properties": {
            "LowerProperty": {
                "type": "object",
                "properties": {
                    "toto": {"$ref": "#/justAString"}
                }
            }
        }
    },
    "justAString": {
        "$id": "justAString",
        "type": "string"
    }
}
Error message received:
Traceback (most recent call last):
File "/usr/bin/jsonschema", line 11, in <module>
sys.exit(main())
File "/usr/lib/python2.7/site-packages/jsonschema/cli.py", line 67, in main
sys.exit(run(arguments=parse_args(args=args)))
File "/usr/lib/python2.7/site-packages/jsonschema/cli.py", line 78, in run
for error in validator.iter_errors(instance):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 274, in properties
schema_path=property,
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 339, in descend
for error in self.iter_errors(instance, schema):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 251, in ref
for error in validator.descend(instance, resolved):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 339, in descend
for error in self.iter_errors(instance, schema):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 274, in properties
schema_path=property,
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 339, in descend
for error in self.iter_errors(instance, schema):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 73, in items
for error in validator.descend(item, items, path=index):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 339, in descend
for error in self.iter_errors(instance, schema):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 274, in properties
schema_path=property,
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 339, in descend
for error in self.iter_errors(instance, schema):
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 323, in iter_errors
for error in errors:
File "/usr/lib/python2.7/site-packages/jsonschema/_validators.py", line 247, in ref
scope, resolved = validator.resolver.resolve(ref)
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 734, in resolve
return url, self._remote_cache(url)
File "/usr/lib/python2.7/site-packages/functools32/functools32.py", line 400, in wrapper
result = user_function(*args, **kwds)
File "/usr/lib/python2.7/site-packages/jsonschema/validators.py", line 744, in resolve_from_url
raise exceptions.RefResolutionError(exc)
jsonschema.exceptions.RefResolutionError: unknown url type: schemaTopProperty

Edit: my previous answer was incorrect.
TL;DR: You have two options:
Remove the $id properties from the definitions
Use #/ in the $id properties (Example: {"$id": "#/justAString"})
Details:
The issue is with the IDs. Up until draft-04, $ref and $id were treated at face value, nothing special, but starting with draft-06 they are URI references. So when the validator descends into {"$id": "schemaTopProperty"}, resolving {"$ref": "#/justAString"} no longer looks for a fragment justAString in the root structure, but for /justAString under the schemaTopProperty host, which is a remote reference.
Hence my two solutions: either remove the $ids, which otherwise turn the definitions into URLs (hosts, in fact), or define the $ids as what they really are, fragments in the current schema.
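For illustration, here is a minimal Python sketch of option 2, assuming jsonschema 3.x is installed. It inlines usingRefs.schema.json with "#/" prefixes added to the $id values, so both references resolve as fragments of the current schema:
# Minimal sketch, assuming jsonschema >= 3.0 is installed.
# This is usingRefs.schema.json with "#/" added to the $ids (option 2 above).
from jsonschema import validate

schema = {
    "type": "object",
    "properties": {
        "TopProperty": {"$ref": "#/schemaTopProperty"}
    },
    "schemaTopProperty": {
        "$id": "#/schemaTopProperty",
        "type": "object",
        "properties": {
            "LowerProperty": {
                "type": "object",
                "properties": {
                    "toto": {"$ref": "#/justAString"}
                }
            }
        }
    },
    "justAString": {
        "$id": "#/justAString",
        "type": "string"
    }
}

instance = {"TopProperty": {"LowerProperty": {"toto": "plop"}}}

# Raises jsonschema.exceptions.ValidationError (or RefResolutionError) on failure.
validate(instance=instance, schema=schema)
print("instance is valid")
The same corrected schema, saved back to usingRefs.schema.json, should then pass the jsonschema CLI under both 2.6.0 and 3.0.1, since the $refs are plain JSON pointers either way.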

Related

403 permission error when tpu-vm writing cloud bucket

When I run the train.py script from https://github.com/tensorflow/models/tree/master/official/nlp, I get a 403 permission error.
python3 official/nlp/train.py --tpu=con-bert1 --experiment=bert/pretraining --mode=train --model_dir=gs://con_bioberturk/general/ --config_file=gs://con_bioberturk/bert_base.yaml --config_file=gs://con_bioberturk/pretrain.yaml --params_override="task.init_checkpoint=gs://con_bioberturk/bert-base-turkish-cased-tf/model.ckpt"
My output is below:
I1115 07:49:02.847452 139877506112576 train_utils.py:368] Saving experiment configuration to gs://con_bioberturk/general/params.yaml
Traceback (most recent call last):
File "/usr/share/tpu/models/official/modeling/hyperparams/params_dict.py", line 349, in save_params_dict_to_yaml
yaml.dump(params.as_dict(), f, default_flow_style=False)
File "/usr/local/lib/python3.8/dist-packages/yaml/__init__.py", line 290, in dump
return dump_all([data], stream, Dumper=Dumper, **kwds)
File "/usr/local/lib/python3.8/dist-packages/yaml/__init__.py", line 278, in dump_all
dumper.represent(data)
File "/usr/local/lib/python3.8/dist-packages/yaml/representer.py", line 28, in represent
self.serialize(node)
File "/usr/local/lib/python3.8/dist-packages/yaml/serializer.py", line 55, in serialize
self.emit(DocumentEndEvent(explicit=self.use_explicit_end))
File "/usr/local/lib/python3.8/dist-packages/yaml/emitter.py", line 115, in emit
self.state()
File "/usr/local/lib/python3.8/dist-packages/yaml/emitter.py", line 220, in expect_document_end
self.flush_stream()
File "/usr/local/lib/python3.8/dist-packages/yaml/emitter.py", line 790, in flush_stream
self.stream.flush()
File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/lib/io/file_io.py", line 219, in flush
self._writable_file.flush()
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
"error": {
"code": 403,
"message": "Access denied.",
"errors": [
{
"message": "Access denied.",
"domain": "global",
"reason": "forbidden"
}
]
}
}
when initiating an upload to gs://con_bioberturk/general/params.yaml
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "official/nlp/train.py", line 82, in <module>
app.run(main)
File "/usr/local/lib/python3.8/dist-packages/absl/app.py", line 308, in run
_run_main(main, args)
File "/usr/local/lib/python3.8/dist-packages/absl/app.py", line 254, in _run_main
sys.exit(main(argv))
File "official/nlp/train.py", line 47, in main
train_utils.serialize_config(params, model_dir)
File "/usr/share/tpu/models/official/core/train_utils.py", line 370, in serialize_config
hyperparams.save_params_dict_to_yaml(params, params_save_path)
File "/usr/share/tpu/models/official/modeling/hyperparams/params_dict.py", line 349, in save_params_dict_to_yaml
yaml.dump(params.as_dict(), f, default_flow_style=False)
File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/lib/io/file_io.py", line 197, in __exit__
self.close()
File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/lib/io/file_io.py", line 239, in close
self._writable_file.close()
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
"error": {
"code": 403,
"message": "Access denied.",
"errors": [
{
"message": "Access denied.",
"domain": "global",
"reason": "forbidden"
}
]
}
}
'
Here are my settings:
TPU VM name: con-bert1
TPU software version: tpu-vm-tf-2.10.0-pod
The Cloud Storage bucket (con_bioberturk) and the TPU VM are in the same location.
Looks like you need to add the service account that is currently active on your TPU VM to the GCS bucket's IAM policy. Instructions are here: https://github.com/google-research/text-to-text-transfer-transformer/issues/1003
If that fails, try running gcloud auth login --update-adc on your TPU VM to add your credentials.
Hope this resolves your issue.
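Once the permissions are granted, a quick sanity check is to write and delete a throwaway object under the same GCS path the trainer uses. This is a minimal sketch, assuming you run it on the TPU VM with TensorFlow installed; the test object name is just a placeholder:
# Minimal write-access check, run on the TPU VM (assumes TensorFlow is installed).
# It exercises the same tf.io.gfile GCS write path that train.py uses for params.yaml.
import tensorflow as tf

test_path = "gs://con_bioberturk/general/_write_check.txt"  # throwaway test object
with tf.io.gfile.GFile(test_path, "w") as f:
    f.write("ok")
tf.io.gfile.remove(test_path)
print("write access to the bucket is OK")
If this snippet raises the same PermissionDeniedError, the bucket IAM (rather than the training configuration) is still the problem.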

"web/session/authenticate" API endpoint returns an error instead of session_info in ODOO V16

I am using Odoo V16. When I send a POST request to the web/session/authenticate endpoint with the correct user credentials in the body, like this:
{
    "params": {
        "db": <DB>,
        "login": <LOGIN>,
        "password": <PASSWORD>
    }
}
I get a bad response with the error message "'NoneType' object has no attribute 'user'".
The expected behavior is a JSON response with session info like:
{
    "jsonrpc": "2.0",
    "id": null,
    "result": {...}
}
I'm not sure whether it is a bug in the new Odoo version or I'm doing something wrong. Any help is appreciated. Thanks!
Full error response:
{
"jsonrpc": "2.0",
"id": null,
"error": {
"code": 200,
"message": "Odoo Server Error",
"data": {
"name": "builtins.AttributeError",
"debug": "Traceback (most recent call last):
File "/odoo-16/odoo/http.py", line 1963, in call
response = request._serve_nodb()
File "/odoo-16/odoo/http.py", line 1516, in _serve_nodb
response = self.dispatcher.dispatch(rule.endpoint, args)
File "/odoo-16/odoo/http.py", line 1775, in dispatch
result = endpoint(**self.request.params)
File "/odoo-16/odoo/http.py", line 673, in route_wrapper
result = endpoint(self, *args, **params_ok)
File "/odoo-16/addons/web/controllers/session.py", line 52, in authenticate
print('session_info', env['ir.http'].session_info())
File "/odoo-16/addons/web_tour/models/ir_http.py", line 12, in session_info
result = super().session_info()
File "/odoo-16/addons/web/models/ir_http.py", line 68, in session_info
user = request.env.user
AttributeError: 'NoneType' object has no attribute 'user'
",
"message": "'NoneType' object has no attribute 'user'",
"arguments": [
"'NoneType' object has no attribute 'user'"
],
"context": {}
}
}
}
I faced the same problem, and I found a workaround.
Problem
The problem happens when you call the Odoo authentication API, web/session/authenticate.
We usually send the login data like this in a POST request to the server:
{"params":{"db":"odoo16","login":"admin","password":"***"}}
I got the error saying:
File "/usr/lib/python3/dist-packages/odoo/addons/mail/models/ir_http.py", line 17, in session_info
user = request.env.user
AttributeError: 'NoneType' object has no attribute 'user'
Solution
This problem happens when you have many databases on the server, so the request handler fails to fetch the user from the env variable.
But if you update your odoo.conf, add dbfilter = odoo16, and restart the server, you will have only one database.
Then, if you call the API, you will get the correct response:
{
    "jsonrpc": "2.0",
    "id": null,
    "result": {
        "uid": 2,
        "is_system": true,
        "is_admin": true,
        "user_context": {
            "lang": "en_US",
            "tz": "Africa/Cairo",
            "uid": 2
        },
        ...
    }
}
Hope this helps you fix your issue until Odoo fixes this bug.
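For completeness, here is a minimal Python sketch of the same call, assuming the requests library, a local Odoo 16 server on port 8069, and placeholder credentials; with dbfilter set (or a db value that matches an existing database), it returns the session info shown above:
# Minimal sketch of calling /web/session/authenticate from Python.
# Assumes the `requests` library, a local Odoo 16 server on port 8069,
# and placeholder database/login/password values.
import requests

payload = {
    "jsonrpc": "2.0",
    "params": {"db": "odoo16", "login": "admin", "password": "admin"},
}
session = requests.Session()
resp = session.post("http://localhost:8069/web/session/authenticate", json=payload)
print(resp.json())
# On success, `session` keeps the session_id cookie for subsequent authenticated calls.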

Error when running Glue Studio job (InvalidInputException: Invalid S3 resource)

I am new to AWS Glue. I have uploaded my CSV to an S3 bucket and crawled it into a Data Catalog database table. It looks fine in AWS Athena.
I then tried to start a Glue Studio job to remap the keys. However, it gives me an error for which I couldn't find a solution. The IAM role I used for the Glue job has access to the S3 bucket. I have no idea about Lake Formation; I tried everything in LF to enable access (LF-tags, S3 location registration). No idea what's going on. Could someone shed some light? Thanks!
py4j.protocol.Py4JJavaError: An error occurred while calling o93.getDynamicFrame.\n: java.lang.RuntimeException: class com.amazonaws.services.gluejobexecutor.model.InvalidInputException:Invalid S3 resource (Service: AWSLakeFormation; Status Code: 400; Error Code: InvalidInputException; Request ID: 208fb8b8-62d9-4ebb-9107-bd4b3b6e8119; Proxy: null) (Service: AWSGlueJobExecutor; Status Code: 400; Error Code: InvalidInputException; Request ID: 0685c719-d128-4596-87db-def2d5837a14; Proxy: null)
Error log:
2022-10-24 15:20:58,739 ERROR [main] glueexceptionanalysis.GlueExceptionAnalysisListener (Logging.scala:logError(9)): [Glue Exception Analysis] {
"Event": "GlueETLJobExceptionEvent",
"Timestamp": 1666624858735,
"Failure Reason": "Traceback (most recent call last):\n File \"/tmp/shipping_remap_1_yxd.py\", line 19, in <module>\n transformation_ctx=\"AmazonS3_node1666619146221\",\n File \"/opt/amazon/lib/python3.6/site-packages/awsglue/dynamicframe.py\", line 629, in from_catalog\n return self._glue_context.create_dynamic_frame_from_catalog(db, table_name, redshift_tmp_dir, transformation_ctx, push_down_predicate, additional_options, catalog_id, **kwargs)\n File \"/opt/amazon/lib/python3.6/site-packages/awsglue/context.py\", line 188, in create_dynamic_frame_from_catalog\n return source.getFrame(**kwargs)\n File \"/opt/amazon/lib/python3.6/site-packages/awsglue/data_source.py\", line 36, in getFrame\n jframe = self._jsource.getDynamicFrame()\n File \"/opt/amazon/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py\", line 1305, in __call__\n answer, self.gateway_client, self.target_id, self.name)\n File \"/opt/amazon/spark/python/lib/pyspark.zip/pyspark/sql/utils.py\", line 111, in deco\n return f(*a, **kw)\n File \"/opt/amazon/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py\", line 328, in get_return_value\n format(target_id, \".\", name), value)\npy4j.protocol.Py4JJavaError: An error occurred while calling o93.getDynamicFrame.\n: java.lang.RuntimeException: class com.amazonaws.services.gluejobexecutor.model.InvalidInputException:Invalid S3 resource (Service: AWSLakeFormation; Status Code: 400; Error Code: InvalidInputException; Request ID: 208fb8b8-62d9-4ebb-9107-bd4b3b6e8119; Proxy: null) (Service: AWSGlueJobExecutor; Status Code: 400; Error Code: InvalidInputException; Request ID: 0685c719-d128-4596-87db-def2d5837a14; Proxy: null)\n\tat com.amazonaws.services.glue.remote.LakeformationCredentialsProvider.refresh(LakeformationCredentialsProvider.scala:50)\n\tat com.amazonaws.services.glue.remote.LakeformationCredentialsProvider.<init>(LakeformationCredentialsProvider.scala:77)\n\tat sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)\n\tat sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)\n\tat sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)\n\tat java.lang.reflect.Constructor.newInstance(Constructor.java:423)\n\tat com.amazonaws.services.glue.remote.MichiganAWSCredentialProviderProxy$.get(MichiganAWSCredentialProviderProxy.scala:14)\n\tat com.amazonaws.services.glue.util.FileSchemeWrapper.setHadoopConfiguration(FileSchemeWrapper.scala:43)\n\tat com.amazonaws.services.glue.util.FileSchemeWrapper.executeWith(FileSchemeWrapper.scala:82)\n\tat com.amazonaws.services.glue.util.FileSchemeWrapper.executeWithQualifiedScheme(FileSchemeWrapper.scala:90)\n\tat com.amazonaws.services.glue.HadoopDataSource.getDynamicFrame(DataSource.scala:556)\n\tat com.amazonaws.services.glue.DataSource.getDynamicFrame(DataSource.scala:101)\n\tat com.amazonaws.services.glue.DataSource.getDynamicFrame$(DataSource.scala:101)\n\tat com.amazonaws.services.glue.HadoopDataSource.getDynamicFrame(DataSource.scala:246)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:498)\n\tat py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\n\tat py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)\n\tat py4j.Gateway.invoke(Gateway.java:282)\n\tat 
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\n\tat py4j.commands.CallCommand.execute(CallCommand.java:79)\n\tat py4j.GatewayConnection.run(GatewayConnection.java:238)\n\tat java.lang.Thread.run(Thread.java:750)\n",
"Stack Trace": [
{
"Declaring Class": "get_return_value",
"Method Name": "format(target_id, \".\", name), value)",
"File Name": "/opt/amazon/spark/python/lib/py4j-0.10.9-src.zip/py4j/protocol.py",
"Line Number": 328
},
{
"Declaring Class": "deco",
"Method Name": "return f(*a, **kw)",
"File Name": "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/sql/utils.py",
"Line Number": 111
},
{
"Declaring Class": "__call__",
"Method Name": "answer, self.gateway_client, self.target_id, self.name)",
"File Name": "/opt/amazon/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_gateway.py",
"Line Number": 1305
},
{
"Declaring Class": "getFrame",
"Method Name": "jframe = self._jsource.getDynamicFrame()",
"File Name": "/opt/amazon/lib/python3.6/site-packages/awsglue/data_source.py",
"Line Number": 36
},
{
"Declaring Class": "create_dynamic_frame_from_catalog",
"Method Name": "return source.getFrame(**kwargs)",
"File Name": "/opt/amazon/lib/python3.6/site-packages/awsglue/context.py",
"Line Number": 188
},
{
"Declaring Class": "from_catalog",
"Method Name": "return self._glue_context.create_dynamic_frame_from_catalog(db, table_name, redshift_tmp_dir, transformation_ctx, push_down_predicate, additional_options, catalog_id, **kwargs)",
"File Name": "/opt/amazon/lib/python3.6/site-packages/awsglue/dynamicframe.py",
"Line Number": 629
},
{
"Declaring Class": "<module>",
"Method Name": "transformation_ctx=\"AmazonS3_node1666619146221\",",
"File Name": "/tmp/shipping_remap_1_yxd.py",
"Line Number": 19
}
],
"Last Executed Line number": 19,
"script": "shipping_remap_1_yxd.py"
}

Postman authentication for Odoo 14

How do I use Postman to test Odoo 14.0 controller methods that require authentication?
I used to have a simple request for authentication:
url: http://localhost:8014/web/session/authenticate
method: GET
headers: Content-Type: application/json
body:
{
    "jsonrpc": "2.0",
    "params": {
        "db": "v14pos",
        "login": "admin",
        "password": "admin"
    }
}
After sending the authentication request, Postman will set the session_id cookie, and it will work.
But in 14.0, even though the session_id cookie is set, I get the following error when trying to call a URL that requires authentication:
{
"jsonrpc": "2.0",
"id": null,
"error": {
"code": 200,
"message": "Odoo Server Error",
"data": {
"name": "odoo.exceptions.AccessDenied",
"debug": "Traceback (most recent call last):\n File \"/home/obi/src/vs/odoo14/addons/http_routing/models/ir_http.py\", line 450, in _dispatch\n cls._authenticate(func)\n File \"/home/obi/src/vs/odoo14/odoo/addons/base/models/ir_http.py\", line 132, in _authenticate\n raise AccessDenied()\nException\n\nThe above exception was the direct cause of the following exception:\n\nTraceback (most recent call last):\n File \"/home/obi/src/vs/odoo14/odoo/http.py\", line 639, in _handle_exception\n return super(JsonRequest, self)._handle_exception(exception)\n File \"/home/obi/src/vs/odoo14/odoo/http.py\", line 315, in _handle_exception\n raise exception.with_traceback(None) from new_cause\nodoo.exceptions.AccessDenied: Access Denied\n",
"message": "Access Denied",
"arguments": [
"Access Denied"
],
"context": {}
}
}
}
This worked for me for version 11.0.
I noticed that the HTTP header in 14.0 includes the cookie in a different way:
Cookie: TWISTED_SESSION=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJ1c2VyX2luZm8iOnsiYW5vbnltb3VzIjp0cnVlfSwiZXhwIjoxNjAzNjM0NDM5fQ.pJs2oOjQYOQrFnolafUlNZ4Bg4OMJ_itRaZPEUoaLeE; frontend_lang=en_US; fileToken=dummy-because-api-expects-one; tz=Africa/Khartoum; session_id=d36df662e749f368c32dcbecc07bf578dd57de8a
What is the TWISTED_SESSION? Is it causing the problem?
I found the solution, or rather the problem.
I had set a wrong value for auth in the controller method; it was:
@http.route('/route/', auth='auth', type='json')
And I changed it to:
@http.route('/route/', auth='user', type='json')
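For reference, here is a minimal sketch of a controller that works with this flow; the class name and route are placeholders, not from the original post:
# Minimal sketch of an Odoo 14 JSON controller that requires an authenticated user.
from odoo import http
from odoo.http import request


class MyController(http.Controller):

    @http.route('/route/', auth='user', type='json')
    def my_route(self, **kwargs):
        # auth='user' guarantees request.env.user is a real logged-in user, so the
        # session_id cookie obtained from /web/session/authenticate is enough.
        return {'uid': request.env.user.id}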

How to solve this Node SCSS problem? Any solutions?

If you have solved this type of issue, please share some solutions.
=> changed: C:\Users\DELL\Desktop\NA-TOURS\sass\main.scss
{
    "status": 1,
    "file": "C:/Users/DELL/Desktop/NA-TOURS/sass/abstracts/_variables.scss",
    "line": 26,
    "column": 1,
    "message": "Invalid CSS after \"$default-font-size\": expected 1 selector or at-rule, was \": 1.6rem;\"",
    "formatted": "Error: Invalid CSS after \"$default-font-size\": expected 1 selector or at-rule, was \": 1.6rem;\"\n on line 26 of sass/abstracts/_variables.scss\n from line 3 of sass/main.scss\n>> $default-font-size: 1.6rem;\r\n ^\n"
}