Why does dbt run in the CLI but throw an error in the Cloud UI for the exact same model?

I am executing dbt run -s model_name on the CLI and the task completes successfully. However, when I run the exact same command in dbt Cloud, I get this error:
Syntax or semantic analysis error thrown in server while executing query.
Error message from server: org.apache.hive.service.cli.HiveSQLException:
Error running query: org.apache.spark.sql.AnalysisException: cannot
resolve '`pv.meta.uuid`' given input columns: []; line 6 pos 4;
\n'Project ['pv.meta.uuid AS page_view_uuid#249595,
'pv.user.visitorCookieId AS (80) (SQLExecDirectW)")
It looks like it fails to recognize the 'pv.meta.uuid' syntax, which extracts data from a JSON format. It is not clear to me what is going on. Any thoughts? Thank you!
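One way to narrow this down is to compare the SQL each environment actually compiles, since the identical command can resolve against different targets or schemas. A minimal sketch, assuming a standard dbt project layout (the my_project path below is illustrative):
# Compile the model locally without running it.
dbt compile -s model_name
# Inspect the generated SQL and compare it with the compiled SQL shown in
# dbt Cloud's run details; a difference points at mismatched targets,
# schemas, or environment variables between the two environments.
cat target/compiled/my_project/models/model_name.sql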

Related

bq show command giving `BigQuery error in show operation: Not found:`

As per the Google Cloud BigQuery documentation, the bq show --job=true myproject:US.bquijob_123x456_123y123z123c command should return the status of a BigQuery job. But in my case, I am getting the following error:
BigQuery error in show operation: Not found: Job my-project:US.bqjob_r4bc4365eb9z97aa8_000001855ca75006_1
What could be the reason behind this error? I also checked that I have all the necessary permissions, namely:
roles/bigquery.admin
roles/bigquery.user
roles/bigquery.jobUser
Thanks!
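One thing worth verifying: BigQuery jobs are regional, and bq show reports Not found when either the job ID or its location is wrong, regardless of permissions. A minimal diagnostic sketch, assuming the Cloud SDK's bq CLI (the project ID below is illustrative):
# List recent jobs to confirm the exact job ID and the project it ran under.
bq ls --jobs=true --all=true --max_results=10 --project_id=myproject
# Pass the job's location explicitly rather than embedding it in the ID.
bq show --location=US --job=true myproject:bqjob_r4bc4365eb9z97aa8_000001855ca75006_1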

DBT: How to fix Database Error Expecting Value?

I ran into trouble today while running Airflow with airflow-dbt-python. I tried to debug a bit using the logs, and the error shown there was this one:
[2022-12-27, 13:53:53 CET] {functions.py:226} ERROR - 12:53:53.642186 [error] [MainThread]: Encountered an error:
Database Error
Expecting value: line 2 column 5 (char 5)
Quite a weird one.
Check the credentials file that allows dbt to run queries on your database (in our case we run dbt with BigQuery); our credentials file was empty. We even tried running dbt directly in the worker instead of through Airflow, and got exactly the same error. Unfortunately, this error is not very explicit.
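A quick way to verify this, assuming the target uses a BigQuery service-account keyfile referenced from profiles.yml (the paths below are illustrative):
# Confirm which keyfile the active target points at.
grep -A2 keyfile ~/.dbt/profiles.yml
# Try to parse the keyfile; an empty or truncated file raises the same
# "Expecting value" JSON error that dbt surfaces as a Database Error.
python -m json.tool /path/to/keyfile.json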

An error occurred when executing Randomwalk2dmobility in NS3

I am a newbie to NS3. I want to understand the execution status of handover in the Randomwalk2d module and visualize it. The default is two UEs and two eNBs, but errors always occur during execution. Can anyone help me solve the problem?
This is my code link: https://drive.google.com/file/d/163NQOyvs0bTh2J4P9_vpS4Y7iqocB3HJ/view?usp=sharing
When I execute the command ./waf --run scratch/lte_handover --visualize, the following error appears:
../scratch/lte_handover.cc: In function 'int main(int, char**)':
../scratch/lte_handover.cc:296:78: error: expected ')' before ';' token
"Bounds",RectangleValue (Rectangle (0,2000,0,2000)));
^
Build failed
->task in 'lte_handover' failed with exit status 1 (run with -v to display more information)
Following the instructions, I entered the command ./waf --run scratch/lte_handover -v, and the following information appeared:
Several tasks use the same identifier. Please check the information on
https://waf.io/apidocs/Task.html?highlight=uid#waflib.Task.Task.uid
object 'SuidBuild_task'(
{task 139759060979784: SuidBuild_task -> }) defined in 'tap-creator'
object 'SuidBuild_task'(
{task 139759060980008: SuidBuild_task -> }) defined in 'tap-creator'
object 'SuidBuild_task'(
{task 139759065638504: SuidBuild_task -> }) defined in 'tap-creator'
It seems you have an extra ) on that line. You are not closing the SetPositionAllocator call, because you commented out all of its remaining lines:
ueMobility.SetPositionAllocator ("ns3::RandomRectanglePositionAllocator"); // <-- close the call here
ueMobility.SetMobilityModel ("ns3::RandomWalk2dMobilityModel",
                             "Bounds", RectangleValue (Rectangle (0, 2000, 0, 2000)));

Apache Pig 0.12.0 on Hue not preprocessing statements as expected

I'm using Hue for Pig scripts on Amazon EMR. I am using the %declare and %default statements as mentioned in the documentation.
I have some %default and %declare statements, and it looks like they are not preprocessed within Hue. Therefore, although the parameters are defined in my script, the editor keeps popping up a parameter configuration window. If I leave the parameter blank, the job fails with an error.
Sample Script
%declare OUTPUT_FOLDER 'testingOutput01';
ts = LOAD 's3://testbucket1/input/testdata-00000.gz' USING PigStorage('\t');
STORE ts INTO 's3://testbucket1/$OUTPUT_FOLDER' USING PigStorage('\t');
Upon execution, it shows the pop-up window asking for a value for OUTPUT_FOLDER. If I leave it blank, it fails with the following error:
2015-06-23 20:15:54,908 [main] ERROR org.apache.pig.Main - ERROR 2997:
Encountered IOException. org.apache.pig.tools.parameters.ParseException:
Encountered "<EOF>" at line 1, column 12.
Was expecting one of:
<IDENTIFIER> ...
<OTHER> ...
<LITERAL> ...
<SHELLCMD> ...
Is that the expected behavior? Is this a known issue or am I missing something?
Configuration details:
AMI version: 3.7.0
Hadoop distribution: Amazon 2.4.0
Applications: Hive 0.13.1, Pig 0.12.0, Impala 1.2.4, Hue
The same behavior is seen with %default instead of %declare.
If you need any clarifications then please do comment on this question. I will update it as needed.
Hue does not yet support the %declare and %default statements. This will be fixed with: https://issues.cloudera.org/browse/HUE-2508
The current temporary workaround is to put any value in the popup.
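Alternatively, running the same script outside Hue (for example, over SSH on the EMR master node) preprocesses the parameters as expected; a minimal sketch, with an illustrative script name:
# The pig CLI honors %declare and %default, so no popup is involved.
pig -f my_script.pig
# Parameters can also be passed explicitly, overriding a %default value.
pig -param OUTPUT_FOLDER=testingOutput01 -f my_script.pig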

FAILED: Error in metadata:

When I try to show tables from Hive databases, the following error displays. I granted permissions on the warehouse and tables, yet the error still appears:
hive> show tables;
FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Thanks in advance.
This error occurs when the Hive CLI is terminated improperly.
Solution:
Exit from Hive, then run the jps command. A process named RunJar will be listed; kill it using kill -9 <pid>.
That's it, you are done.
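In shell form, the fix above looks like this (the PID shown is illustrative):
# jps lists running JVM processes; look for the entry named RunJar.
jps
# Kill the stale RunJar process left behind by the improperly closed CLI.
kill -9 12345   # replace 12345 with the RunJar PID reported by jps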