How to split a large query on the command line? - sql

I installed Oracle DB version 19c in my Docker environment and set up a database filled with dummy data. However, when I try to run a very large query I get the error:
SP2-0341: line overflow during variable substitution (>3000 characters at line 1).
I tried splitting it up with line breaks, but depending on how I split it I get all kinds of errors, such as:
ERROR at line 2: ORA-00933: SQL command not properly ended
or
ERROR at line 2:
SP2-0341: line overflow during variable substitution (>3000 characters at line 3)
The query is formatted as
SELECT AA.n_name AS AA_n_name, AA.n_nationkey AS ...
FROM nation AS AA FULL OUTER JOIN supplier...
WHERE (AC.p_partkey = ... AND...) OR((AC.p_partkey = ...)); -- the WHERE part alone is over 5000 characters long
Is there an alternative or a way to tackle this on the command line? I tried running the query from a .sql file as well and hit a 4999-character limit. I am on an Ubuntu server, if that helps; any assistance would be appreciated.

It depends on the environment you're working in, but generally you can continue a command onto the next line by ending the line with a backslash (\).
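For illustration, here is what that looks like when launching the query from a shell; this is only a sketch, and the credentials, connect string, and script name are placeholders rather than the asker's real values:
# A trailing backslash continues the same shell command on the next line.
sqlplus -s scott/tiger@//localhost:1521/ORCLPDB1 \
    @long_query.sql
Note that the backslash joins lines of the shell command; inside the .sql script itself a statement can simply continue onto the next physical line, as long as no blank line interrupts it, since SQL*Plus treats a blank line as the end of statement entry by default.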

Related

Running Query from text file

I'm trying to run a BigQuery query from the command line, but because my query is very long I've written it in a text file. The query works from the GUI, and I'm overwriting a table that already exists:
bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable '`cat query.txt`'
However, I'm getting error results:
Error in query string: Error processing job
'dev:bqjob_r_00000123456789456123_1': Encountered "
"\'cat query.txt\' "" at line 1, column 1.
Was expecting: EOF
Do I need to put the entire file path in the .txt filename? (this doesn't seem to make a difference)
Are there any characters I need to be careful with in the text file (e.g. "\" or quotation marks)?
I'm using WHERE clauses and GROUP BY clauses - is that an issue?
Instead of using cat, just redirect the input from the file. The command would be:
bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable < query.txt
This will send the contents of query.txt to the bq tool.
Elliot is right. If you do want to use cat, sed, or anything else, pipe it:
cat query.txt | bq query
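For example, a preprocessing step before the pipe could look like this (the sed substitution and the {{RUN_DATE}} token are purely illustrative, not something from the original query file):
# Rewrite a placeholder in the query file, then feed the result to bq.
sed 's/{{RUN_DATE}}/2014-01-01/' query.txt | bq query --allow_large_results --replace --destination_table=me.Tbl_MyTable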

Running db2 from bash script not working?

I'm currently using bash on CentOS. DB2 is installed and db2 is on my path.
I have a few lines in a script which are supposed to update my db2 database, but they aren't working. As a minimal reproduction, I can do the exact same thing right in the bash command line and get the same error. Here's that reproduction:
$ db2 connect to PLT02345 user uni using uni; db2 update USM_USER set STATUS = 1 where NAME = 'asm_admin'
I expect this to set STATUS to 1 for everything in PLT02345.USM_USER where the NAME is currently asm_admin.
Instead, I get an error about "ASM_ADMIN" not being valid in the context where it's used. Here's the full output:
Database Connection Information
Database server = DB2/LINUXX8664 10.1.2
SQL authorization ID = UNI
Local database alias = PLT02345
DB21034E The command was processed as an SQL statement because it was not a
valid Command Line Processor command. During SQL processing it returned:
SQL0206N "ASM_ADMIN" is not valid in the context where it is used.
SQLSTATE=42703
I'm confused - what about this makes it not valid? Is bash somehow mutilating the command and not passing everything as it should to db2?
If you're running this from the command line, Bash strips the single quotes from 'asm_admin' before db2 ever sees them, because it treats them as shell quoting. The end result is the SQL becoming WHERE NAME = asm_admin, which is invalid.
To correct this, you need to quote your whole command:
db2 "update USM_USER set STATUS = 1 where NAME = 'asm_admin'"

Using sql within shell script

I am currently trying to integrate an SQL statement into a shell script, but I am facing a major syntax issue.
My statement in the script:
su - <sid>adm -c 'hdbsql -U SYSTEM export "'SCHEMA'"."'*'" as binary into "'Export Location'" with reconfigure'
I get the following error:
* 257: sql syntax error: incorrect syntax near "*": line 1 col 16 (at pos 16) SQLSTATE: HY000
Would really appreciate if anyone could help me with this.
Thanks and Regards,
AK
Your command line doesn't make much sense to me. It starts with
su - <sid>adm
which means that the shell redirects the contents of a file named "sid" into su's standard input and then redirects su's output into a file named "adm".
The second problem is that, in the command you are passing, the single quotes end right before the "*", which means that the "*" will get interpreted by the shell as a file glob:
-c 'hdbsql -U SYSTEM export "'SCHEMA'"."'*'" as binary into "'Export Location'" with reconfigure'
You'll need to escape those single quotes like this: "\'".
But I think your problem-solving approach is not ideal. Reduce the problem first, and only then start adding things back on top of it. So first try to execute the SQL statement from the hdbsql shell. Does it work?
$ hdbsql
> YOUR SQL STATEMENT HERE
Once that works, try to execute the SQL statement from the unix shell as a user:
$ hdbsql -U SYSTEM export ...
Once that works, try to execute it via su
$ su - ...
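As a rough sketch of that last step, assuming a hypothetical SID user hxeadm, a schema called MYSCHEMA, and an export path of /tmp/hana_export (none of which are the asker's real values, and assuming the EXPORT statement itself is valid for your HANA version), note that <sid>adm must be spelled out as an actual username and the whole hdbsql command quoted:
# Outer single quotes protect the command from the calling shell; the inner
# double quotes and escaped quotes are unwrapped by the shell started by su.
su - hxeadm -c 'hdbsql -U SYSTEM "EXPORT \"MYSCHEMA\".\"*\" AS BINARY INTO '\''/tmp/hana_export'\''"'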

MySQL Dump Error 1064 (42000)

I have a MySQL dump from version 4.0.21. I converted it to UTF-8 to handle special characters such as Ü, ü, Ä, ä, Ö, ö, and ß. Now I have to import it into the latest MySQL version, 5.5.36. All the data was imported, but an error occurred at the end.
ERROR 1064 (42000) at line 80769: You have an error in your SQL syntax...use near '' at line 1
The empty string and the line numbers are confusing me. Importing with phpMyAdmin gives the same result as the command line does, with the command:
mysql -u root -p bugtracker < E:\mantisUTF.dump
The import with the original dump from version 4.0.21 works perfectly, but without the above-mentioned special characters.
First Lines of the dump file:
-- MySQL dump 9.11
--
-- Host: localhost Database: Mantis
-- ------------------------------------------------------
-- Server version 4.0.21-debug
--
-- Table structure for table `mantis_bug_file_table`
--
Last Lines (80768 & 80769):
INSERT INTO mantis_user_table VALUES (57,'fullName','firstName lastName','emailAdress','dd1875c93e8f17a24ebaf9c902b7165a','2014-01-29 13:43:21','2014-03-26 13:22:47',1,0,55,14,0,0,'1b886436b0c62598ab66e40ae89f0c016dc5777ebb601a73f2a07536281113ae'
Thanks in advance.
Relax
By rechecking my question, I found the problem: a missing ')' at the end of the dump file.
Last line:
INSERT INTO mantis_user_table VALUES (57,'fullName','firstName lastName','emailAdress','dd1875c93e8f17a24ebaf9c902b7165a','2014-01-29 13:43:21','2014-03-26 13:22:47',1,0,55,14,0,0,'1b886436b0c62598ab66e40ae89f0c016dc5777ebb601a73f2a07536281113ae')
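For anyone hitting the same ERROR 1064 at the very last line of a dump, a quick sanity check before importing is to look at the tail of the file (shown here with tail; on the asker's Windows box the path would be E:\mantisUTF.dump):
# An INSERT that does not end with a closing parenthesis (and ideally ";")
# is a strong hint that the dump was truncated.
tail -n 2 mantisUTF.dump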

sqlcmd - How to get around column length limit without empty spaces?

I'm trying to use sqlcmd on a Windows machine running SQL Server 2005 to write a query to a CSV file. The command-line options we typically use are:
-l 60 -t 300 -r 1 -b -W -h -1
However, the columns are getting truncated at 256 bytes. In an attempt to circumvent this, I tried using this command line option in place of -W:
-y 8000
This captures the entire fields, but the problem with this method is that the file balloons from just over 1 MB to about 200 MB due to all the extra space (I realize 8000 is probably overkill, but it will probably have to be at least 4000, and I'm currently only working with a small subset of data). The -W option typically eliminates all this extra space, but when I try to use the two together it tells me they're mutually exclusive.
Is there a way to get sqlcmd around this limit, or does anyone know if another program (such as bcp or osql) would make this easier?
Edit:
Here are the code snippets we're using to get the field that's being truncated (similar code is used for a bunch of fields):
SELECT ALIASES.AliasList as complianceAliases,
...
LEFT OUTER JOIN (Select M1.ID, M1.LIST_ID,stuff((SELECT '{|}' + isnull(Content2,'')+' '+isnull(Content3,'')+' '+isnull(Content4,'')+' '+isnull(Content5,'')+' '+isnull(Content6,'')+' '+isnull(Content7,'')
FROM fs_HOST3_TEST_web.ISI_APP_COMP_MULTI M2 with (nolock)
WHERE M1.LIST_ID = M2.LIST_ID and M1.ID = M2.ID and M1.TYPE = M2.TYPE
FOR XML PATH('')
),1,1,'') as AliasList
FROM fs_HOST3_TEST_web.ISI_APP_COMP_MULTI M1 with (nolock)
WHERE M1.LIST_ID = 2001 AND M1.TYPE = 'Aliases'
GROUP BY m1.list_id,m1.ID,m1.Type) as ALIASES
ON ALIASES.LIST_ID = PAIR.COMP_LIST_ID AND ALIASES.ID = PAIR.COMP_ID
I ended up solving this by using the "-y0" argument. It still left a bunch of whitespace, but it looks like the padding only went out to the longest piece of data in each field.
I then ran the output through a program that removed repeating spaces, and that solved all of the problems.
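That cleanup step can be as simple as collapsing runs of spaces; for example, with sed available (the file names are illustrative, and this is not necessarily the program the poster used):
# Squeeze runs of two or more spaces left by the fixed-width -y0 output
# down to a single space.
sed -E 's/ {2,}/ /g' raw_output.csv > cleaned_output.csv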