PostgreSQL DBLink: No function matches the given name and argument types

I was playing around with DBLink and wanted to try it, so I ran this simple query:
CREATE EXTENSION dblink;
SELECT *
FROM dblink(('dbname=genesis_admin')::text,
('SELECT * FROM user_account')::text);
Then, to my surprise:
[WARNING ] CREATE EXTENSION dblink
ERROR: extension "dblink" already exists
[WARNING ] SELECT * FROM dblink(('dbname=genesis_admin')::text, ('SELECT * FROM user_account')::text)
ERROR: function dblink(text, text) does not exist
LINE 1: SELECT * FROM dblink(('dbname=genesis_admin')::text, ('SELE...
^
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
How can it not exist if it already exists?

I encountered the same error, and the reason is that dblink gets installed (by default) into the public schema, while you have probably modified search_path to a list that doesn't include public. The database does contain the function dblink(text, text), but it cannot find it on the search path.
To get this to work, you need to add the schema explicitly to the dblink function call:
SELECT public.dblink(xxx, yyy);
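Applied to the original query, a minimal sketch looks like this. Note that dblink(text, text) returns a generic record set, so a column definition list is required; the id/username columns below are assumptions and must match the real user_account table. Restoring public on the search path also works.
-- Schema-qualified dblink call; the column list is hypothetical and must
-- match the remote user_account table's actual columns.
SELECT *
FROM public.dblink('dbname=genesis_admin',
                   'SELECT id, username FROM user_account')
     AS t(id integer, username text);
-- Alternatively, put public back on the search path for the session:
SET search_path TO "$user", public;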

Extract JSON content in Metabase SQL query

Using: Django==2.2.24, Python==3.6, with PostgreSQL as the underlying DB.
Working with the Django ORM, I can easily make all sorts of queries, but I started using Metabase, and my SQL might be a bit rusty.
The problem:
I am trying to get a count of the items in a list, under a key in a dictionary, stored as a JSONField:
from django.db import models
from jsonfield import JSONField

class MyTable(models.Model):
    data_field = JSONField(blank=True, default=dict)
Example of the dictionary stored in data_field:
{..., "my_list": [{}, {}, ...], ...}
Under "my_list" key, the value stored is a list, which contains a number of other dictionaries.
In Metabase, I am trying to get a count for the number of dictionaries in the list, but even more basic things, none of which work.
Some stuff I tried:
Attempt:
SELECT COUNT(elem->'my_list') as my_list_count
FROM my_table, json_object_keys(data_field:json) AS elem
Error:
ERROR: syntax error at or near ":" Position: 226
Attempt:
SELECT ARRAY_LENGTH(elem->'my_list') as my_list_count
FROM my_table, JSON_OBJECT_KEYS(data_field:json) AS elem
Error:
ERROR: syntax error at or near ":" Position: 233
Attempt:
SELECT JSON_ARRAY_LENGTH(data_field->'my_list'::json)
FROM my_table
Error:
ERROR: invalid input syntax for type json Detail: Token "my_list" is invalid. Position: 162 Where: JSON data, line 1: my_list
Attempt:
SELECT ARRAY_LENGTH(JSON_QUERY_ARRAY(data_field, '$.my_list'))
FROM my_table
Error:
ERROR: function json_query_array(text, unknown) does not exist Hint: No function matches the given name and argument types. You might need to add explicit type casts. Position: 140
Basically, I think the issue is that I am using the wrong signatures (most of the time) for the functions I am trying to call.
I used this query to make sure I can at least get the keys from the dictionary:
SELECT JSON_OBJECT_KEYS(data_field::json)
FROM my_table
I was not able to use JSON_OBJECT_KEYS() without adding the ::json cast; without it, I was getting this error:
ERROR: function json_object_keys(text) does not exist Hint: No function matches the given name and argument types. You might need to add explicit type casts. Position: 127
But with the json cast, I am getting all the keys as intended.
Thank you for taking a look!
EDIT:
I also found this interesting article with a different solution, but none of its solutions worked.
I also saw this SO post, which did not help.
Ok, after some more digging around, I found this article, which had the correct format/syntax.
This code is what I used to fetch the list from the JSON object successfully:
select data_field::json->'my_list' as the_list
from my_table
Then, I used json_array_length() to get the number of elements:
select json_array_length(data_field::json->'my_list') as number_of_elements
from my_table
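If you need a single total across all rows rather than one count per row, the same expression can be aggregated; this is a sketch using the table and key names from the question:
-- Total number of elements in my_list across the whole table.
SELECT SUM(json_array_length(data_field::json -> 'my_list')) AS total_elements
FROM my_table;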
All done! :)
EDIT:
I just found the reason for this whole shenanigan.
In the code (which goes years back) we used this package:
jsonfield==1.0.3
And used this way:
from jsonfield import JSONField
The issue is that, behind the scenes, this field is stored as a string (a text column in Postgres), so it needs to be cast to json.
Later Django introduced its own JSONField, which stores data as you would expect, without a need to cast:
from django.contrib.postgres.fields import JSONField
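With Django's own field the column is stored as jsonb rather than text, so (assuming the same table and key names as above) the cast becomes unnecessary and the jsonb variant of the length function applies:
-- No ::json cast needed once the column is jsonb.
SELECT jsonb_array_length(data_field -> 'my_list') AS number_of_elements
FROM my_table;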

BigQuery fails to save view that uses functions

We're using BigQuery with their new dialect of "standard" SQL.
The new SQL supports inline functions written in SQL instead of JS, so we created a function to handle date conversion.
CREATE TEMPORARY FUNCTION
STR_TO_TIMESTAMP(str STRING)
RETURNS TIMESTAMP AS (PARSE_TIMESTAMP('%Y-%m-%dT%H:%M:%E*SZ', str));
It must be a temporary function, as Google returns Error: Only temporary functions are currently supported; use CREATE TEMPORARY FUNCTION if you try to create a permanent one.
If you try to save a view with a query that uses the function inline, you get the following error: Failed to save view. No support for CREATE TEMPORARY FUNCTION statements inside views.
If you try to outsmart it and remove the function (hoping to add it at query time), you'll receive this error: Failed to save view. Function not found: STR_TO_TIMESTAMP at [4:7].
Any suggestions on how to address this? We have more complex functions than the example shown.
Since the issue was marked as resolved: BigQuery now supports permanent registration of UDFs.
In order to use your UDF in a view, you'll need to first create it.
CREATE OR REPLACE FUNCTION `ACCOUNT-NAME11111.test.STR_TO_TIMESTAMP`
(str STRING)
RETURNS TIMESTAMP AS (PARSE_TIMESTAMP('%Y-%m-%dT%H:%M:%E*SZ', str));
Note that you must use backticks around the function's name.
There's no TEMPORARY in the statement, as the function will be globally registered and persisted.
Due to the way BigQuery handles namespaces, you must include both the project name and the dataset name (test) in the function's name.
Once it's created and working successfully, you can use it in a view.
create view test.test_view as
select `ACCOUNT-NAME11111.test.STR_TO_TIMESTAMP`('2015-02-10T13:00:00Z') as ts
You can then query your view directly without explicitly specifying the UDF anywhere.
select * from test.test_view
As per the documentation (https://cloud.google.com/bigquery/docs/reference/standard-sql/data-definition-language#create_function_statement), the functionality is still in beta, but it is doable. The function is visible in the same dataset where it was created, and the view can be created there.
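To confirm that a function was registered, one option, assuming the INFORMATION_SCHEMA views are enabled for your project, is to list the routines in the dataset:
-- List the UDFs registered in the test dataset.
SELECT routine_name, routine_type
FROM test.INFORMATION_SCHEMA.ROUTINES;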
Please share if that worked fine for you or if you have any findings which would be helpful for others.
Saving a view created with a temp function is still not supported, but what you can do is schedule the SQL query (scheduling has already rolled out for the latest UI) and then save the result as a table. This worked for me, but I guess it depends on the query parameters you want.
#standardSQL
# JS in SQL to extract multiple h.CDs at the same time.
CREATE TEMPORARY FUNCTION getCustomDimension(cd ARRAY<STRUCT<index INT64, value STRING>>,
                                             index INT64)
RETURNS STRING
LANGUAGE js AS """
  // Return the value of the custom dimension with the given index, or ''.
  for (var i = 0; i < cd.length; i++) {
    var item = cd[i];
    if (item.index == index) {
      return item.value;
    }
  }
  return '';
""";
SELECT DISTINCT
  h.page.pagePath,
  getCustomDimension(h.customDimensions, 20),
  fullVisitorId,
  h.page.pagePathLevel1,
  h.page.pagePathLevel2,
  h.page.pagePathLevel3,
  getCustomDimension(h.customDimensions, 3)
FROM
  `XXX.ga_sessions_*`,
  UNNEST(hits) AS h
WHERE
  # rolling timeframe
  _TABLE_SUFFIX = FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL YY DAY))
  AND h.type = 'PAGE'
Credit for the solution goes to https://medium.com/@JustinCarmony/strategies-for-easier-google-analytics-bigquery-analysis-custom-dimensions-cad8afe7a153

SnappyData - Error creating Kafka streaming table

I'm seeing an issue when creating a Spark streaming table using Kafka from the snappy shell.
The exception is: 'Invalid input 'C', expected dmlOperation, insert, withIdentifier, select or put (line 1, column 1)'
Reference: http://snappydatainc.github.io/snappydata/streamingWithSQL/#spark-streaming-overview
Here is my SQL:
CREATE STREAM TABLE if not exists sensor_data_stream
(sensor_id string, metric string)
using kafka_stream
options (
storagelevel 'MEMORY_AND_DISK_SER_2',
rowConverter 'io.snappydata.app.streaming.KafkaStreamToRowsConverter',
zkQuorum 'localhost:2181',
groupId 'streamConsumer',
topics 'test:01');
The shell seems not to like the script, starting at the very first character, 'C'. I'm attempting to execute the script using the following command:
snappy> run '/scripts/my_test_sensor_script.sql';
Any help appreciated!
There is some inconsistency between the documentation and the actual syntax. The correct syntax is:
CREATE STREAM TABLE sensor_data_stream IF NOT EXISTS
  (sensor_id string, metric string)
USING kafka_stream
OPTIONS (
  storagelevel 'MEMORY_AND_DISK_SER_2',
  rowConverter 'io.snappydata.app.streaming.KafkaStreamToRowsConverter',
  zkQuorum 'localhost:2181',
  groupId 'streamConsumer',
  topics 'test:01');
One more thing you need to do is write a row converter for your data.
Mike, you need to create your own rowConverter class by implementing the following trait:
trait StreamToRowsConverter extends Serializable {
  def toRows(message: Any): Seq[Row]
}
and then specify that rowConverter's fully qualified class name in the DDL.
The rowConverter is specific to a schema.
'io.snappydata.app.streaming.KafkaStreamToRowsConverter' is just a placeholder class name, which should be replaced by your own rowConverter class.

PL/SQL function return type integer : Invalid Identifier

I have a function called get_pid in a file called func.sql. The function get_pid takes an input, given_price Parts.price%type, and returns an integer.
To call this function, I created a new file called main.sql. Then I log in to sqlplus and call main this way:
SQL> @Q1_main
select get_pid(400) from dual
*
ERROR at line 1:
ORA-00904: "GET_PID": invalid identifier
Am I calling the function properly? What could possibly be wrong? I looked at the other questions posted on this topic, but I can't figure out what I am doing wrong.
Try specifying the name of the schema where your function is.
For example, if the username (schema) is dev, then you have to try it like this:
select dev.get_pid(400) from dual;
When you are at the prompt, you always have to use the schema name, then ".", then your function/procedure.
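To check whether the function was actually created, and under which schema, a quick look at the data dictionary helps; this sketch assumes the function name GET_PID from the question:
-- Which schema owns GET_PID, and is it valid?
SELECT owner, object_name, status
FROM all_objects
WHERE object_name = 'GET_PID'
  AND object_type = 'FUNCTION';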

Why can't my AS400 select from a newly created member alias?

I have set up the code as described in this question.
Creating an alias works, as well as dropping it.
For members that I have created myself, this is working correctly, but for existing members I get the following error when selecting from the alias:
SQL State: 42704
Vendor Code: -204
Message: [SQL0204] MyMemberName in MyLib type *FILE not found.
Cause . . . . . : MyMemberName in
TPLWHS type *FILE was not found. If the member name is *ALL, the table
is not partitioned. If this is an ALTER TABLE statement and the type
is *N, a constraint or partition was not found. If this is not an
ALTER TABLE statement and the type is *N, a function, procedure,
trigger or sequence object was not found. If a function was not found,
MyMemberName is the service program that contains the function. The
function will not be found unless the external name and usage name
match exactly. Examine the job log for a message that gives more
details on which function name is being searched for and the name that
did not match.
Recovery . . . : Change the name and try the request
again. If the object is a node group, ensure that the DB2 Multisystem
product is installed on your system and create a nodegroup with the
CRTNODGRP CL command. If an external function was not found, be sure
that the case of the EXTERNAL NAME on the CREATE FUNCTION statement
exactly matches the case of the name exported by the service program.
Any help you can offer is much appreciated. Thanks!
EDIT: Here is my code:
create alias MyLib.MyAlias for MyLib.MyLogicalFile(MyMember);
select * from MyLib.MyAlias;
drop alias MyLib.MyAlias;
The format of Lib.Alias has worked for me when I directly created the physical and logical members. Perhaps the logical file is missing? I'll double check...
This error message can indicate that the file/logical file/member does not exist.
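One way to verify, assuming the QSYS2 catalog views are available on your release, is to ask DB2 for i which members (partitions) it sees for the file; the names below follow the question, and for a logical file the check may need to target the underlying physical file instead:
-- List the members DB2 for i knows about for this file.
SELECT table_schema, table_name, table_partition
FROM qsys2.syspartitionstat
WHERE table_schema = 'MYLIB'
  AND table_name = 'MYLOGICALFILE';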