sql: converting argument $1 type: unsupported type gorm.DB, a struct - go-gorm

I have this:
orm.DB.Table("users", orm.DB.Model(&[]models.User{})).Find(&users)
and I get this error:
sql: converting argument $1 type: unsupported type gorm.DB, a struct

Related

Postgresql - text Array to Json Array error

I am getting an error while trying to convert a string array to a JSON array in PostgreSQL.
SQL:
select * from
jsonb_array_elements_text(to_jsonb('[{\"Apr2021\":\"1.2\",\"Aug2000\":\"1.3\",\"Dec2023\":\"22.5\",\"Feb2023\":\"66.7\",\"Jan2023\":\"99.1\",\"Jul2023\":\"11.0\",\"Jun2021\":\"44.2\",\"Mar2023\":\"55\",\"May2023\":\"10\",\"Nov2023\":\"44\",\"Oct2023\":\"99\",\"Sep2023\":\"33\"}]'::json))
Error:
> Invalid operation: invalid input syntax for type json Details: Token
> "Apr2021" is invalid.

Postgres | V9.4 | Extract value from json

I have a table where one of the columns is of type TEXT and holds a JSON object.
I want to reach a key inside that JSON and check its value.
The column name is json_representation and the JSON looks like this:
{
  "additionalInfo": {
    "dbSources": [{
      "user": "Mike"
    }]
  }
}
I want to get the value of "user" and check whether it is equal to 'Mike'.
I tried the following:
select
json_representation->'additionalInfo'->'dbSources'->>'user' as singleUser
from users
where singleUser = 'Mike';
I keep getting an error:
Query execution failed
Reason:
SQL Error [42883]: ERROR: operator does not exist: text -> unknown
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 31
Please advise.
Thanks
The error message tells you what to do: you might need to add an explicit type cast.
And as you cannot reference a column alias in the WHERE clause, you need to wrap the query in a derived table:
select *
from (
  select json_representation::json -> 'additionalInfo' -> 'dbSources' -> 0 ->> 'user' as single_user
  from users
) as t
where t.single_user = 'Mike';
:: is Postgres' cast operator
But the better solution would be to change the column's data type to json permanently. And once you upgrade to a supported version of Postgres, you should use jsonb.
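As a sketch of that permanent change, assuming every existing row holds valid JSON (table and column names as above):
alter table users
  alter column json_representation type json
  using json_representation::json;  -- rewrites the column from text to json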

Trying to convert long to ToDate format

My input is a long like "20190503143744" and I want to convert it to the format "2019-09-06 11:46:22".
I am trying the code below:
A = LOAD 'stp_master_subscriber_profile' using org.apache.hive.hcatalog.pig.HCatLoader() as (mdn:chararray, imei:chararray, imsi:chararray, subscriber_id:long, manufacturer:chararray, model:chararray, update_time:long, scenario:chararray, vz_customer:chararray, commit_time:long);
B = FOREACH A GENERATE ToString(ToDate((chararray)commit_time,'yyyyMMdd'),'yyyy-MM-dd HH:mm:ss') as event_date_gmt:chararray;
Getting error:
ERROR 1066: Unable to open iterator for alias B. Backend error : org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POUserFunc (Name: POUserFunc(org.apache.pig.builtin.ToDate2ARGS)[datetime] - scope-15 Operator Key: scope-15) children: null at []]: java.lang.IllegalArgumentException: Invalid format: "20190503143744" is malformed at "143744"
The issue is that you're specifying the format as yyyyMMdd but your original input is in yyyyMMddHHmmss format, so you get an error when Pig reaches 143744 instead of the end of your string. Try this:
B = FOREACH A GENERATE ToString(ToDate((chararray)commit_time,'yyyyMMddHHmmss'),
'yyyy-MM-dd HH:mm:ss') as event_date_gmt;

ERROR: column "blob" is of type jsonb but expression is of type character

I am trying to read a parquet file and dump it into Postgres. One of the columns in the Postgres table is of JSONB type, while in parquet it is a string.
val parquetDF = session.read.parquet("s3a://test/ovd").selectExpr("id", "topic", "update_id", "blob")
parquetDF.write.format("jdbc")
.option("driver", "org.postgresql.Driver")
.option("url", "jdbc:postgresql://localhost:5432/db_metamorphosis?binaryTransfer=true&stringtype=unspecified")
.option("dbtable", "entitlements.general")
.option("user", "mdev")
.option("password", "")
.option("stringtype", "unspecified")
.mode(SaveMode.Append)
.save()
And it fails with this error:
Caused by: org.postgresql.util.PSQLException: ERROR: column "blob" is of type jsonb but expression is of type character
Hint: You will need to rewrite or cast the expression.
Position: 85
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998)
... 16 more
Someone on SO suggested setting stringtype=unspecified so that Postgres would decide the data type for strings, but it does not seem to work.
These are my versions:
<scala.major.version>2.12</scala.major.version>
<scala.version>${scala.major.version}.8</scala.version>
<spark.version>2.4.0</spark.version>
<postgres.version>9.4-1200-jdbc41</postgres.version>
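One workaround that is sometimes suggested for this kind of driver-side type mismatch, offered here only as an unverified assumption about this setup, is to add an implicit cast on the Postgres side so that varchar parameters are accepted for jsonb columns:
-- assumption: run once in the target database (needs sufficient privileges);
-- lets Postgres implicitly convert varchar parameters into jsonb
create cast (varchar as jsonb) with inout as implicit;
Note that this changes casting behaviour database-wide, so it is worth dropping the cast again if it causes surprises elsewhere.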

Query a hive table with array<array<string>> type

I have a Hive table and need to apply a filter where the value of the column is []. The type of the column is array<array<string>>. I tried to use array_contains but got the following error:
Error while compiling statement: FAILED: SemanticException [Error
10016]: line 2:41 Argument type mismatch ''[]'': "array"
expected at function ARRAY_CONTAINS, but "string" is found
Sample values of the column could be:
[]
[['a','b', 'c']]
[['a'],['b'], ['c']]
[]
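A minimal sketch of one way to express that filter, assuming the goal is to keep only the rows whose array is empty (the table and column names below are placeholders, not from the original question):
-- hypothetical names; Hive's size() returns the number of elements in an array
select *
from my_table
where size(my_array_col) = 0;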