Invoke-Sqlcmd : Could not find stored procedure - sql-server-2016

I have a dump of a SQL database table which contains only data: one long list of INSERT statements. The file is about 10 GB, and when I try to import it with Invoke-Sqlcmd or SQL Server Management Studio it fails with the message "Not enough memory". Therefore I split the file into several smaller files of 250 MB each. All the lines are complete, so splitting the files left no half INSERT statements at the end or beginning of any file.
When I use PowerShell to import the data, the first file imports without problems.
Invoke-Sqlcmd -ServerInstance myserver\instance -Database mydatabase -InputFile "C:\temp\files\dbo.Data.00.sql"
Whenever I try to import the next file I get the following error message.
Invoke-Sqlcmd : Could not find stored procedure 'I'.
At line:1 char:1
+ Invoke-Sqlcmd -ServerInstance myserver\instance -Database mydatabase -I ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [Invoke-Sqlcmd], SqlPowerShellSqlExecutionExceptio
+ FullyQualifiedErrorId : SqlError,Microsoft.SqlServer.Management.PowerShell.GetScriptCommand
It says a stored procedure could not be found, but the file contains only INSERT statements. I also tried specifying the database name before the first INSERT statement, but that does not change the result.
USE [mydatabase]
Any ideas what is going wrong here?

I managed to work around this issue by copying the dump file to a Linux host and using the following command to split the file into files of 250,000 lines each.
split -l 250000 dbo.Data.sql
There was still a problem with the split files: all files except the first one contained NUL characters between each character. (NULs between every character suggest the dump was saved in a UTF-16 encoding, where each ASCII character is followed by a zero byte; splitting leaves every chunk but the first without its byte-order mark, so the bytes are misread.) I used the following solution to remove the NUL characters:
Removing "NUL" characters
by executing the following command for all split files except the first one:
tr < xab -d '\000' > xab.dbo.Data.sql
tr < xac -d '\000' > xac.dbo.Data.sql
etc...
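For reference, the same workaround can be scripted in PowerShell. A minimal sketch, assuming the chunks use split's default names (xaa, xab, xac, ...) under C:\temp\files; the paths are illustrative:
# Strip NUL characters from every split file except the first, then import it
Get-ChildItem 'C:\temp\files\xa*' | Where-Object { $_.Name -ne 'xaa' } | ForEach-Object {
    $clean = "$($_.FullName).dbo.Data.sql"
    # -Raw reads the file as a single string; "`0" is the NUL character
    (Get-Content $_.FullName -Raw) -replace "`0", '' | Set-Content $clean -Encoding UTF8
    Invoke-Sqlcmd -ServerInstance myserver\instance -Database mydatabase -InputFile $clean
}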

Related

BCP update database table based on output from PowerShell

I have 4 files with the same CSV header, as follows:
Column1,Column2,Column3,Column4
But I only need the data from Column2, Column3 and Column4 when importing into the SQL database using BCP. I am using PowerShell to select the columns I want and then importing the required data with BCP, but my PowerShell script executes with no errors and no data appears in my database table. How can I set up BCP to import the output from PowerShell into the database table? Here is my PowerShell script:
$filePath = Get-ChildItem -Path 'D:\test\*' -Include $filename
$desiredColumn = 'Column2','Column3','Column4'
foreach($file in $filePath)
{
    write-host $file
    $test = import-csv $file | select $desiredColumn
    write-host $test
    $action = bcp <myDatabaseTableName> in $test -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
These are the output from the powershell script
D:\test\sample1.csv
#{column2=111;column3=222;column4=333} #{column2=444;column3=555;column4=666}
D:\test\sample2.csv
#{column2=777;column3=888;column4=999} #{column2=aaa;column3=bbb;column4=ccc}
First off, you can't update a table with bcp. It is used to bulk load data: it will either insert new rows or export existing data into a flat file. Changing existing rows, usually called updating, is out of scope for bcp. If that's what you need, you need another tool. Sqlcmd works fine, and PowerShell's got Invoke-Sqlcmd for running arbitrary T-SQL statements.
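For example, a minimal sketch of an update through Invoke-Sqlcmd (the table and column names here are made up for illustration):
# Changing existing rows is a T-SQL job, not a bcp job
Invoke-Sqlcmd -ServerInstance <MyServer> -Database <MyDatabase> `
    -Query "UPDATE dbo.MyTable SET Column3 = 'new value' WHERE Column2 = '111';"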
Anyway, the BCP utility has notoriously tricky syntax. As far as I know, one cannot bulk load data by passing it as a parameter to bcp; a source file must be used. Thus you need to save the filtered data to a file and pass that file's name to bcp.
Exporting a filtered CSV is easy enough; just remember to use the -NoTypeInformation switch, lest you get #TYPE Selected.System.Management.Automation.PSCustomObject as your first row of data. This assumes the bcp arguments are well and good (why -F2, though? And Unix newlines?).
Stripping the double quotes requires another edit to the file. The Scripting Guy has a solution.
foreach($file in $filePath){
    write-host $file
    $test = import-csv $file | select $desiredColumn
    # Overwrite filtereddata.csv, should one exist, with filtered data
    $test | export-csv -path .\filtereddata.csv -NoTypeInformation
    # Remove double quotes
    (gc filtereddata.csv) | % {$_ -replace '"', ''} | out-file filtereddata.csv -Fo -En ascii
    $action = bcp <myDatabaseTableName> in filtereddata.csv -T -c -t";" -r"\n" -F2 -S <MyDatabase>
}
Depending on your locale, the column separator might be a semicolon, comma or something else. Use the -Delimiter '<character>' switch on Export-Csv to produce whatever you need, or change bcp's -t argument to match.
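For instance, to make the export match the -t";" already passed to bcp above, the Export-Csv line could become:
$test | export-csv -path .\filtereddata.csv -NoTypeInformation -Delimiter ';'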
Erland's got a helpful page about bulk operations. Also, see Redgate's advice.
If you'd rather not modify the file first, there is an answer here about how bcp can handle quoted data:
BCP in with quoted fields in source file
Essentially, you need to use the -f option and create/use a format file to tell SQL Server your custom field delimiter: in short, it is no longer a lone comma (,) but now (","), a comma between two double quotes. You need to escape the double quotes, and a small trick handles the first double quote on each line. But it works like a charm.
The format file can also ignore columns: just set the destination column number to zero. All with no need to modify the file before the load. Good luck!
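As a hedged illustration only (the version number, column names, field lengths and CRLF row terminator are assumptions, and the destination table is assumed to contain only Column2, Column3 and Column4), such a non-XML format file might look like this. The zero-length FIRST_QUOTE field swallows the leading double quote on each line, Column1 is skipped by setting its server column to 0, and the remaining fields are terminated by "," or by the closing quote plus newline:
13.0
5
1   SQLCHAR   0   0     "\""       0   FIRST_QUOTE   ""
2   SQLCHAR   0   100   "\",\""    0   Column1       ""
3   SQLCHAR   0   100   "\",\""    1   Column2       SQL_Latin1_General_CP1_CI_AS
4   SQLCHAR   0   100   "\",\""    2   Column3       SQL_Latin1_General_CP1_CI_AS
5   SQLCHAR   0   100   "\"\r\n"   3   Column4       SQL_Latin1_General_CP1_CI_AS
It would then be passed as bcp <myDatabaseTableName> in .\sample1.csv -T -S <MyDatabase> -f .\quoted.fmt -F2; note that -c and -t are dropped, since the format file replaces them.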

SQL Server: query for exporting to file

I'm trying to learn the basics of SQL programming; I am working with SQL Server 2014. I have managed to import a file into a table with the command:
BULK INSERT Db.dbo.Co2_table
FROM 'd:\dataset_co2.txt'
with
(
FIRSTROW =2,
ROWTERMINATOR ='\n'
)
GO
I would like to do the reverse operation, that is, export the content of a table to a file. I have tried:
SELECT *
INTO OUTFILE 'C:\datadump\sqldbdump.txt"
FROM dbo.alarms_2_2014
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
bcp Db.dbo.Co2_table out "C:\users\ws5.en-cre\desktop\prova.txt" -T –c
sqlcmd -S . -d Db -E -s, -W -Q "SELECT * FROM dbo.Co2_table" > ExcelTest.csv
But none of these seem to work (I get error messages). Any idea?
I suspect you are running those commands from Management Studio. You should use the console (command prompt) for this command. This works for me. Also check that you have permissions on that folder.
bcp "select * from Db.dbo.Co2_table" queryout C:\users\ws5.en-cre\desktop\prova.txt -c -T
or
bcp Db.dbo.Co2_table out C:\users\ws5.en-cre\desktop\prova.txt -c -T
Also, you have a suspicious symbol in the -c parameter: -T –c. That – is not a regular dash (-).
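Either bcp command can also be wrapped in a script. A minimal PowerShell sketch (the server name is illustrative); bcp sets an exit code that the script can check:
# Run bcp from a script rather than from an SSMS query window
& bcp "Db.dbo.Co2_table" out "C:\users\ws5.en-cre\desktop\prova.txt" -c -T -S "myserver\instance"
if ($LASTEXITCODE -ne 0) { throw "bcp failed with exit code $LASTEXITCODE" }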
Thank you for your answers and suggestions, and apologies for my lack of precision and my late reply (in this case I missed the notifications from Stack Overflow).
Regarding the question of whether I use Management Studio or the console: what I do is click "New Query" in Management Studio, write the code and press Execute. So I guess the answer is that I use Management Studio.
If I try:
bcp "select * from Db.dbo.Co2_table" queryout
C:\users\ws5.en-cre\desktop\prova.txt -c –T
it says
Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'queryout'.
I guess in this case one of the problems is that the quotes are missing, but even adding them doesn't solve the problem.
I am looking for a solution that can be implemented as a script. I am familiar with Excel VBA macros; I would like to implement something like that.
Thanks,
Alex

SQL Server: export query as a .txt file

I am trying to export my SQL Server query results to a .txt file in a folder (this is for an automated job).
I know the equivalent in MySQL works with INTO OUTFILE. Does anyone know the best way to do this in SQL Server 2008 Management Studio?
SELECT DISTINCT RTRIM (s1.SGMNTID) AS 'AccCode',RTRIM (s1.DSCRIPTN) AS 'CodeDesc', CASE
WHEN s1.SGMTNUMB = '1' THEN '1'
WHEN s1.SGMTNUMB = '2' THEN '2'
WHEN s1.SGMTNUMB = '3' THEN '110'
WHEN s1.SGMTNUMB = '4' THEN '4'
WHEN s1.SGMTNUMB = '5' THEN '120'
END AS 'AccountType_id',
CASE WHEN s1.SGMTNUMB = '2'
THEN LEFT(s1.SGMNTID, 2)
ELSE 'DEFAULT'
END AS 'AccGroupName'
FROM GL40200 s1
UNION
SELECT REPLACE ([ACTNUMBR_1]+'-'+ [ACTNUMBR_2]+'-'+ [ACTNUMBR_3]+'-'+[ACTNUMBR_4]+'-'+ [ACTNUMBR_5],' ', '') AS 'AccCode',
'' AS 'CodeDesc',
'0' AS 'AccountType_id',
'Default' AS 'AccGroupName'
FROM GL00100 a
INTO OUTFILE 'C:\Users\srahmani\verian/myfilename.txt'
You do this in the SSMS application, not in the SQL.
In the toolbar select:
Query --> Results To --> Results To File
Then execute the SQL statements and it will prompt you to save the output to a text file with an .rpt extension. Open the results in a text editor.
Another way is from the command line, using osql:
OSQL -S SERVERNAME -E -i thequeryfile.sql -o youroutputfile.txt
This can be used from a BAT file and scheduled to run as a Windows user that can authenticate to the server.
You can use bcp utility.
To copy the result set from a Transact-SQL statement to a data file, use the queryout option. The following example copies the result of a query into the Contacts.txt data file. The example assumes that you are using Windows Authentication and have a trusted connection to the server instance on which you are running the bcp command. At the Windows command prompt, enter:
bcp "<your query here>" queryout Contacts.txt -c -T
You can use BCP by calling it directly as an operating system command in a SQL Agent job.
You can use Windows PowerShell to execute a query and output it to a text file:
Invoke-Sqlcmd -Query "Select * from database" -ServerInstance "Servername\SQL2008" -Database "DbName" > c:\Users\outputFileName.txt
The BCP utility can also be used in the form of a .bat file, but be cautious of escape sequences (i.e. quotes "" must be used in conjunction with the -q flag) and the appropriate tags.
.bat Example:
C:
bcp "\"YOUR_SERVER\".dbo.Proc" queryout C:\FilePath.txt -T -c -q
REM Add PAUSE here if you'd like to see the completed batch
-q MUST be used in the presence of quotations within the query itself.
BCP can also run Stored Procedures if necessary. Again, be cautious: Temporary Tables must be created prior to execution or else you should consider using Table Variables.
This is quite simple to do, and the answer is available in other answers. For those of you who are viewing this:
select entries from my_entries where id='42' INTO OUTFILE 'bishwas.txt';

How to route SQL print output to log file?

Does anyone know how to route PRINT ' ' output in an SQL script to a log file when using Invoke-Sqlcmd?
I tried using sqlcmd -o someoutfile.txt, but it overwrites the existing file rather than appending to it. And if an SQL error occurs, only the error message is sent to the file, not the PRINT output.
When using Invoke-Sqlcmd | Out-File someoutfile.txt -Append, it appends only Write-Output and any SQL errors, but not the PRINT ' ' output from the executed SQL script.
Has anyone found a solution for this?
Invoke-Sqlcmd surfaces T-SQL PRINT statements and RAISERROR output through the verbose stream. To capture verbose output, first include the parameter in your call to Invoke-Sqlcmd, i.e. invoke-sqlcmd -verbose, and then do one of two things:
If you're using Powershell V3 or higher you can redirect verbose output:
invoke-sqlcmd -verbose 4>&1 | out-file someoutfile.txt -Append
If you're using PowerShell V2 you can't redirect verbose output to a file; however, you can use Start-Transcript to send all screen output to a file. One gotcha with this approach: it will not work in a SQL Agent PowerShell job step. It will, however, work in a CmdExec job step that calls powershell.exe.
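A minimal sketch of the transcript approach (the log path and script name are illustrative):
Start-Transcript -Path C:\logs\sqlrun.log -Append
Invoke-Sqlcmd -ServerInstance myserver\instance -InputFile .\script.sql -Verbose
Stop-Transcript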
One more thing...
The Invoke-Sqlcmd cmdlet has a -SeverityLevel parameter.
-SeverityLevel specifies the lower limit for the error message severity that Invoke-Sqlcmd returns to the PowerShell ERRORLEVEL variable.
Invoke-Sqlcmd does not report severities for informational messages that have a severity of 10!
Severity Level 10: Status Information
This is an informational message that indicates a problem caused by mistakes in the information the user has entered. Severity level 0 is not visible in SQL Server.
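Since PRINT output arrives as a severity-10 informational message, it will never trip -SeverityLevel; it only shows up on the verbose stream. A small sketch (server name illustrative):
Invoke-Sqlcmd -ServerInstance myserver\instance -Query "PRINT 'status message'" -Verbose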

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute it. The SQL file needs to stay versatile, meaning it cannot be altered just to make it easier to run from bash (e.g. by escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error, because the unquoted $ab expansion makes the shell replace the * with the names of all the files in the current directory.
An easy solution would be to replace the * with \*. But I cannot do this, because the file needs to stay usable in programs like DB Visualizer etc.
Could someone give me hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
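For example, to execute a semicolon-terminated script while echoing each statement and keeping a log (the log file name is illustrative):
db2 -t -v -z clp_output.log -f sample.sql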
Redirect stdin from the file.
db2 < sample.sql
In case you have variables in your script and want the shell to replace them before the SQL is executed in DB2, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values (1, 2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In the shell, do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all good to get it executed in DB2:
sh File.sql | db2 +p -t
The shell will substitute the variables and DB2 will then execute the result.
Hope it helps.