Possible to exclude or reorder a column from `*`? [duplicate]

This question already has answers here:
SQL exclude a column using SELECT * [except columnA] FROM tableA?
Closed 12 years ago.
Is it possible to exclude a column from a SELECT * FROM table statement with SQL Server?
I have a need for this, and my only other option is parsing a raw SQL string to get out the required field names (I really don't want to do that).
To be clear: when the query is made I do not have access to the list of fields needed from the table, but I do know which field I do not need. This is part of a complex multi-part query.
Surely there must be some way, even if it's "hackish", such as using table variables or views.
My other option is to reorder the columns. My problem is with ExecuteScalar SQL functions, which return the first column of the first row.
EDIT
I can't add an answer since this is now closed, but the way I ended up doing it was like so:
;with results_cte as (
    -- add the calculated column alongside every original column
    select (calculation) as calculated_column, * from table
)
select * into #temptable from results_cte
where calculated_column < 10 /*or whatever*/

-- drop the helper column so only the original columns remain
alter table #temptable
drop column calculated_column

select * from #temptable

drop table #temptable

Nope. You'll have to build your statement manually or just select *.

No.
Instead, you could check syscolumns to get all of the field names, or (perhaps) SELECT * and ignore that column.
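For example, a quick way to get the field names to paste into your SELECT list (a sketch assuming SQL Server 2005 or later, where sys.columns supersedes syscolumns; dbo.tableA and columnA are placeholder names):
SELECT c.name
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('dbo.tableA')
  AND c.name <> 'columnA'   -- the column you want to leave out
ORDER BY c.column_id;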

If you use dynamic SQL, you can generate the query from metadata about the table or view (INFORMATION_SCHEMA.COLUMNS) and exclude columns that way. I do this a lot to generate triggers or views.
But there is nothing in the SQL language which supports this.
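As a rough sketch of the dynamic SQL approach described above (assuming SQL Server 2017+ for STRING_AGG; older versions would need FOR XML PATH instead, and dbo.tableA / columnA are placeholder names):
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- build a comma-separated list of every column except the one to exclude
SELECT @cols = STRING_AGG(QUOTENAME(COLUMN_NAME), ', ')
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'tableA'
  AND COLUMN_NAME <> 'columnA';

-- assemble and run the final statement
SET @sql = N'SELECT ' + @cols + N' FROM dbo.tableA;';
EXEC sp_executesql @sql;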

The best way to handle this would be to select * and then just not present the excluded column to your users in your frontend. As others have noted, SQL has no direct capability of doing an all-columns-except construct.

Related

Is there any SQL query character limit when executing it using the JDBC driver? [duplicate]

I'm using the following code:
SELECT * FROM table
WHERE Col IN (123,123,222,....)
However, if I put more than ~3000 numbers in the IN clause, SQL throws an error.
Does anyone know if there's a size limit or something similar?
Depending on the database engine you are using, there can be limits on the length of a statement.
SQL Server has a very large limit:
http://msdn.microsoft.com/en-us/library/ms143432.aspx
Oracle, on the other hand, has a limit that is very easy to reach.
So, for large IN clauses, it's better to create a temp table, insert the values and do a JOIN. It's also usually faster.
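A minimal sketch of that pattern in SQL Server (dbo.MyTable and Col are placeholder names):
CREATE TABLE #ids (Id int NOT NULL PRIMARY KEY);

-- insert the values from the application, ideally in batches
INSERT INTO #ids (Id) VALUES (123), (222), (456);

-- join instead of a huge IN list
SELECT t.*
FROM dbo.MyTable AS t
INNER JOIN #ids AS i ON i.Id = t.Col;

DROP TABLE #ids;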
There is a limit, but you can split your values into separate blocks of in()
Select *
From table
Where Col IN (123,123,222,....)
or Col IN (456,878,888,....)
Parameterize the query and pass the ids in using a Table Valued Parameter.
For example, define the following type:
CREATE TYPE IdTable AS TABLE (Id INT NOT NULL PRIMARY KEY)
Along with the following stored procedure:
CREATE PROCEDURE sp__Procedure_Name
    @OrderIDs IdTable READONLY
AS
SELECT *
FROM table
WHERE Col IN (SELECT Id FROM @OrderIDs)
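Calling it from T-SQL could then look something like this (a sketch reusing the type and procedure names from the answer above):
DECLARE @ids IdTable;

INSERT INTO @ids (Id) VALUES (123), (222), (456);

EXEC sp__Procedure_Name @OrderIDs = @ids;
From client code the parameter is passed as a structured/table-valued parameter, so the list of ids never has to be spliced into the SQL text.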
Why not do a WHERE ... IN against a sub-select?
Pre-query into a temp table or something like that:
CREATE TABLE SomeTempTable AS
SELECT YourColumn
FROM SomeTable
WHERE UserPickedMultipleRecordsFromSomeListOrSomething
then...
SELECT * FROM OtherTable
WHERE YourColumn IN ( SELECT YourColumn FROM SomeTempTable )
Depending on your version, use a table valued parameter in 2008, or some approach described here:
Arrays and Lists in SQL Server 2005
For MS SQL 2016, passing ints into the IN clause, it looks like it can handle close to 38,000 values.
select * from user where userId in (1,2,3,etc)
I solved this by simply using ranges:
WHERE Col >= 123 AND Col <= 10000
then removed the unwanted records in that range by looping in the application code. It worked well for me because I was looping over the records anyway, and ignoring a couple of thousand records didn't make any difference.
Of course, this is not a universal solution, but it could work for situations where most values between the min and max are required.
You did not specify the database engine in question; in Oracle, an option is to use tuples like this:
SELECT * FROM table
WHERE (Col, 1) IN ((123,1),(123,1),(222,1),....)
This ugly hack only works in Oracle SQL, see https://asktom.oracle.com/pls/asktom/asktom.search?tag=limit-and-conversion-very-long-in-list-where-x-in#9538075800346844400
However, a much better option is to use stored procedures and pass the values as an array.
You can use tuples like this:
SELECT * FROM table
WHERE (Col, 1) IN ((123,1),(123,1),(222,1),....)
There is no restriction on the number of these, because the 1000-item limit only applies to lists of single values; with tuples, it compares pairs.

Is it possible to avoid specifying a column list in a SQL Server CTE?

Is it possible to avoid specifying a column list in a SQL Server CTE?
I'd like to create a CTE from a table that has many columns so that the structure is identical. There probably is a way to accomplish this without relisting every column name.
I've tried (unsuccessfully):
with pay_cte as
(select * from payments)
select * from pay_cte
I'm encouraged in my quest by this statement in the MSDN documentation:
The list of column names is optional only if distinct names for all resulting columns are supplied in the query definition.
https://msdn.microsoft.com/en-us/library/ms175972.aspx
Yes, assuming you mean that you don't have to name every column in the with cte(Col1, Col2) as section.
You can easily try this yourself with a very simple test query along the lines of:
with cte as
(
select *
from sys.tables
)
select *
from cte

Oracle SQL: How to use more than 1000 items inside an IN clause [duplicate]

This question already has answers here:
SQL IN Clause 1000 item limit
(5 answers)
Closed 8 years ago.
I have a SQL statement where I would like to get data for 1,200 ep_codes by making use of an IN clause. When I include more than 1000 ep_codes inside the IN clause, Oracle says I'm not allowed to do that. To overcome this, I tried to change the SQL code as follows:
SELECT period, ...
FROM my_view
WHERE period = '200912'
...
AND ep_codes IN (...1000 ep_codes...)
OR ep_codes IN (...200 ep_codes...)
The code was executed successfully, but the results are strange (calculation results are fetched for all periods, not just for 200912, which is not what I want). Is it appropriate to do that using OR between IN clauses, or should I execute two separate queries, one with 1000 and the other with 200 ep_codes?
Pascal Martin's solution worked perfectly. Thanks to all who contributed valuable suggestions.
The recommended way to handle this in Oracle is to create a Temporary Table, write the values into this, and then join to this. Using dynamically created IN clauses means the query optimizer does a 'hard parse' of every query.
create global temporary table LOOKUP
(
ID NUMBER
) on commit delete rows;
-- Do a batch insert from your application to populate this table
insert into lookup(id) values (?)
-- join to it
select foo from bar where code in (select id from lookup)
Not sure that using so many values in an IN() is that good an idea, actually -- especially for performance.
When you say "the results are strange", maybe it is because of a problem with operator precedence: AND binds more tightly than OR, so without parentheses the period filter only applies to the first IN list. What if you try this instead of what you proposed:
SELECT ...
FROM ...
WHERE ...
AND (
ep_codes IN (...1000 ep_codes...)
OR ep_codes IN (...200 ep_codes...)
)
Does that make the results less strange?
Actually you can use collections/multisets here. You'll need a number table type to store them.
CREATE TYPE NUMBER_TABLE AS TABLE OF NUMBER;
...
SELECT *
FROM my_view
WHERE ep_codes MEMBER OF NUMBER_TABLE(1,2,3...10000)
Read more about multisets in the Oracle documentation.
Seems like it would be a better idea, both for performance and maintainability, to put the codes in a separate table.
SELECT ...
FROM ...
WHERE ...
AND ep_code in (select code from ep_code_table)
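Populating that lookup table is straightforward (a sketch in Oracle SQL; the table name follows the answer above and the column type is an assumption):
CREATE TABLE ep_code_table (code VARCHAR2(20) PRIMARY KEY);

INSERT INTO ep_code_table (code) VALUES ('EP001');
INSERT INTO ep_code_table (code) VALUES ('EP002');
-- ... one row per ep_code, ideally batched from the application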
Could you insert the 1200 ep_code values into a temporary table and then INNER JOIN to that table to filter rows instead?
SELECT a.*
FROM mytable a
INNER JOIN tmp ON (tmp.ep_code = a.ep_code)
WHERE ...

How to convert the result of a SELECT SQL query into a new table in MS Access

How do I convert the result of a SELECT SQL query into a new table in MS Access?
You can use subqueries:
SELECT a,b,c INTO NewTable
FROM (SELECT a,b,c
FROM TheTable
WHERE a Is Null)
Like so:
SELECT *
INTO NewTable
FROM OldTable
First, create a table with the required keys, constraints, domain checking, references, etc. Then use an INSERT INTO..SELECT construct to populate it.
Do not be tempted by SELECT..INTO..FROM constructs. The resulting table will have no keys, and therefore will not really be a proper table at all. It is better to start with a proper table and then add the data, e.g. it will be easier to trap bad data.
For an example of how things can go wrong with a SELECT..INTO clause: it can result in a column that contains NULL values, and while you can change the column to NOT NULL after the event, the engine will not replace the NULLs, so you can end up with a NOT NULL column containing NULLs!
Also consider creating a 'viewed' table, e.g. using CREATE VIEW SQL DDL, rather than a base table.
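A minimal sketch of the create-then-populate pattern in Access SQL (table and column names are illustrative):
CREATE TABLE NewTable
(
    a LONG NOT NULL CONSTRAINT pk_NewTable PRIMARY KEY,
    b TEXT(50),
    c CURRENCY
);

INSERT INTO NewTable (a, b, c)
SELECT a, b, c
FROM TheTable
WHERE a IS NOT NULL;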
If you want to do it through the user interface, you can also:
A) Create and test the select query. Save it.
B) Create a make table query. When asked what tables to show, select the query tab and your saved query.
C) Tell it the name of the table you want to create.
D) Go make coffee (depending on taste and size of table)
Select *
Into newtable
From somequery