I need to execute a simple query:
SELECT * FROM MyTable WHERE Id IN (:ids)
Obviously, it returns the set of records which have their primary key 'Id' in the given list. How can I pass an array of integer IDs into ADOQuery.Parameters for parameter 'ids'? I have tried VarArray - it does not work. Parameter 'ids' has FieldType = ftInteger by default, if it matters.
There is no parameter type that can be used to pass a list of values to IN. Unfortunately, this is one of the shortcomings of parameterized SQL.
You'll have to build the query from code to either generate the list of values, or generate a list of parameters which can then be filled from code. That way you can pass each value as a different parameter, like this:
SELECT * FROM MyTable WHERE Id IN (:id1, :id2, :id3)
But since the list will probably have a variable size, you'll have to alter the SQL to add the parameters anyway. In that case it is just as easy to generate the list of values, although parameterized queries may be cached better, depending on which DB you use.
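To make the alternative concrete, a sketch of the generated statement when a (hypothetical) three-value list is inlined instead of bound as parameters:
-- values generated straight into the SQL from code:
SELECT * FROM MyTable WHERE Id IN (1, 2, 3)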
The IN param just takes a comma-separated string of values like (1,2,3,4,5), so I assume you set the datatype to ftString and just build the string and pass that...? Not tried it, but it's what I would try...
I have a column that holds the data type of the dataset as text.
So I want to do something like this:
SELECT CAST ('100' AS %INTEGER%);
SELECT CAST (100 AS %TEXT%);
SELECT CAST ('100' AS (SELECT type FROM dataset_types WHERE id = 2));
Is that possible with PostgreSQL?
SQL is strongly typed and static. Postgres demands to know the number of columns and their data types at the time of the call. So you need dynamic SQL in one of the procedural language extensions for this. And then you still face the obstacle that functions (necessarily) have a fixed return type. Related:
Dynamically define returning row types based on a passed given table in plpgsql?
Function to return dynamic set of columns for given table
Or you go with a two-step flow. First concatenate the query string (with another SELECT query). Then execute the generated query string. Two round trips to the server.
SELECT 'SELECT 100::' || type FROM dataset_types WHERE id = 2; -- record the resulting string, e.g.: SELECT 100::integer
Execute the result. (And make sure you didn't open any vectors for SQL injection!)
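For the in-database variant, here is a minimal PL/pgSQL sketch of the same two steps. It assumes the dataset_types table from the question and a trusted type column, since the type name is concatenated into the statement:
DO $$
DECLARE
    target_type text;
    result      text;
BEGIN
    -- step 1: look up the type name
    SELECT type INTO target_type FROM dataset_types WHERE id = 2;
    -- step 2: build and run the cast dynamically;
    -- quote_literal() protects the value, target_type must still be trusted
    EXECUTE 'SELECT CAST(' || quote_literal('100') || ' AS ' || target_type || ')'
    INTO result;
    RAISE NOTICE 'cast result: %', result;
END
$$;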
About the short cast syntax:
Postgres data type cast
I have a form where people can type in a start and end date, as well as a column name prefix.
In the backend, I want to do something along the lines of
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS ({{prefix}} + '_startDate')
Is this possible? Basically, I want to dynamically create the name of the new column. The table is immediately returned to the user, so I don't want to mutate the underlying table itself. Thanks!
You can execute a dynamic query that you have prepared by using the EXECUTE keyword; otherwise it is not possible to have a dynamic SQL structure.
Since you are preparing your SQL outside the database, you can use something like:
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS {{prefix}}_startDate
Assuming that {{prefix}} is replaced with some string by your template engine before the query is sent to the database.
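For instance, with hypothetical inputs of trip for {{prefix}} and 2021-01-01 for {{startDate}}, the rendered fragment sent to the database would be:
SELECT *, CAST('2021-01-01' AS TIMESTAMP) AS trip_startDate
Since the prefix ends up as a plain column alias, it is worth validating it against identifier rules before substituting it into the query.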
It looks like SqlQuery only supports SQL that starts with select *? Doesn't it support other SQL that selects only some columns, like
select id, name from person
and maps the columns to the corresponding POJO?
If I use SqlFieldsQuery to run SQL, the result is a QueryCursor of Lists (each List contains one record of the result). But if the SQL starts with select *, then this List's contents would be different from a fields query like:
select id,name,age from person
For select *, each List is constructed from 3 parts:
the first element is the key of the cache entry
the second element is the POJO object that contains the data
the trailing elements are the values for each column
Why was it designed this way? If I don't know the SQL that SqlFieldsQuery runs, then I need additional effort to figure out what the List contains.
SqlQuery returns key and value objects, while SqlFieldsQuery allows to select specific fields. Which one to use depends on your use case.
Currently select * indeed includes the predefined _key and _val fields, and this will be improved in the future. However, it's generally good practice to list the fields you want to fetch when running SQL queries (this is true for any SQL database, not only Ignite). This way your code is protected from unexpected behavior if the schema changes, for example.
The following function deletes all blanks in a text or varchar column and returns the modified text/varchar as an int:
select condense_and_change_to_int(number_as_text_column) from mytable;
This exact query does work.
Though my goal is to apply this function to all rows of a column in order to consistently change its values. How would I do this? Is it possible with the UPDATE clause, or do I need to do this within a function itself? I tried the following:
UPDATE mytable
SET column_to_be_modiefied = condense_and_change_to_int(column_to_be_modiefied);
Basically I wanted to take the value of the current row, modify it, and save it to the column permanently.
I'd welcome all ideas on how to solve scenarios like these. I'm working with PostgreSQL (but more general solutions are also welcome).
Is it possible with an update? Well, yes and sort-of.
From your description, the input to the function is a string of some sort and the output is a number. In general, numbers should be stored in columns with a number type, and the implication is that the column in question should really be a number.
However, your update should work. The result will be a string representation of the number.
After running the update, you can change the column type, with something like:
alter table mytable alter column column_to_be_modiefied type int using column_to_be_modiefied::int;
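Alternatively, a sketch (not from the answer above, and assuming the function can be called in this context): since ALTER ... TYPE accepts a USING expression, the cleanup and the type change could be combined into a single statement:
-- convert each existing value with the function while changing the column type
alter table mytable
    alter column column_to_be_modiefied type int
    using condense_and_change_to_int(column_to_be_modiefied);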
After running my query I get a one-column result such as:
5
6
98
101
Is there a way to store this result as an array so that I can use it later
in queries like
WHERE NOT IN ('5','6','98','101')
I am aware of storing single-variable results, but is this possible?
I cannot use a #Table variable, as I will be rerunning the query again in the future and it goes out of scope.
There are multiple ways of storing that column data, such as using temporary tables, a view, or a table-valued function, but IMO there is no need to store that column data anywhere. You can use that column directly in any query, as shown below, or perform a JOIN, which would be a much better option than NOT IN:
select * from table2
where some_column not in (select column1 from this_table);
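A sketch of the join-style alternative mentioned above, with the same hypothetical table and column names; rows from table2 are kept only when no match exists in this_table:
select t2.*
from table2 t2
left join this_table t1 on t1.column1 = t2.some_column
where t1.column1 is null;
Note that, unlike NOT IN, this form behaves predictably when column1 contains NULLs.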
While this method is not recommended, storing an array in a single column can be done using CSVs (comma-separated values). Simply create a VARCHAR column of your choice and store a string containing the values in a specific order, with each value separated by a comma. You can later fetch the string and parse it with a string parser, e.g. using the .split() function in Python. Again, I do not recommend doing this; I would instead use multiple columns, one for each value, and access them that way.
Using separate columns would make it easy to use in a Stored Procedure.