bcp utility asking me to enter different parameters I do not understand - bcp

I am using BCP to export data from a SQL Server 2008 R2 database named Health and a table named Patient. The output of the query should be saved in a text file, ApplicantsName.txt, located at:
C:\Users\meuser\Desktop ApplicantsName.txt -C -T
After running the following command at the command prompt:
bcp "Select FirstName,LastName,PatientNumber from Health.dbo.Patient order by FirstName" queryout "C:\Users\meuser\Desktop ApplicantsName.txt" -C -T
It prompted me this:
Enter the file storage type of field FirstName [char]: varchar
and then this:
Enter prefix-length of field FirstName[2]:FirstName
I have been entering some values, but I think it would be best to understand how this works. After some research on the internet, I know that the bcp utility is one of the fastest ways to export or import data between an instance and a file. I followed exactly the samples provided by MS here, but I think I need some practical explanation. Can someone guide me on how to go about this? A little explanation or a relevant reference would be appreciated too.

One angry researcher's solution of adding '-C RAW' did not work in my particular case, but adding the lower-case '-c' did. It performs the operation using a character data type.
For instance:
bcp mydb.mytable out c:/temp/data.txt -T -c
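Applied to the queryout command from the question, that fix would look something like this (an untested sketch; note the backslash added before the file name, which a later answer also points out):
bcp "Select FirstName,LastName,PatientNumber from Health.dbo.Patient order by FirstName" queryout "C:\Users\meuser\Desktop\ApplicantsName.txt" -c -T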

You need to add a value for the -C parameter (capital C!). If you do not know what you're using it for, you probably won't be needing it and can omit it.
Refer to the official documentation: http://msdn.microsoft.com/en-us/library/ms162802.aspx
edit: you could, for example, use
bcp "Select FirstName,LastName,PatientNumber from Health.dbo.Patient order by FirstName" queryout "C:\Users\meuser\Desktop\ApplicantsName.txt" -C RAW -T
You will need to fix your output directory too (seems you forgot a backslash there).

Here's a sample bcp command with a query and credentials (parameters):
bcp "SELECT * from yourtable" queryout c:\StockItemTransactionID_c.txt -c -Uusername -Pdbpassword -Sinstance -dYourDBName
Note: -U -P -S are case sensitive.
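If you want the output comma separated, you can also add the -t flag to set the field terminator; a sketch, with the file name and credentials still placeholders:
bcp "SELECT * from yourtable" queryout c:\StockItemTransactionID_c.csv -c -t, -Uusername -Pdbpassword -Sinstance -dYourDBName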

Related

Exporting SQL Table into a CSV file using Windows Batch Script

I am trying to create a Windows batch file to export data from an SQL file to a CSV file.
I have an SQL file at %MYHOME%\database\NET-DB.sql which contains data like this:
NET-DB.sql
insert into net_network (id, a_id, alias, address, domain, mask) values('NET_10.10.1.0_10', 1, 'Local Network', '10.10.1.0', '', '255.255.252.0');
What I have tried so far to export the data from the net_network table into a CSV file in my .bat file is this command:
export.bat
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
COPY net_network TO '%MYHOME%\net\CSV-EXPORT_FILE.csv' DELIMITER ',' CSV HEADER;
pause
Since that does not work for me, what should be the correct approach for this implementation? Any help will be much appreciated.
Use SQLCMD
You need to modify the code to make it work in your environment, but here goes.
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
cd "C:\your path\to\sqlcmd"
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "CSV-EXPORT-FILE.csv" -s"," -w 255
Some explanations:
-S The database server to connect to.
-d Name of the database to connect to.
-Q Query to run, can also be insert, delete, update etc.
-o The output file to write the results to.
-s"," Separate fields with a comma.
-w Output width; this has to be at least as wide as your widest column, in characters.
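Put together in the batch file from the question, it could look roughly like this (a sketch only; the server and database names are placeholders, and the output path reuses %MYHOME% from the question):
if not exist "%MYHOME%\net\NUL" mkdir "%MYHOME%\net"
sqlcmd -S YourDBServer -d DB_NAME -E -Q "select id, a_id, alias, address, domain, mask from net_network" -o "%MYHOME%\net\CSV-EXPORT_FILE.csv" -s"," -w 255
pause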

Using documentParser function in Teradata Aster

I'm working with Teradata's Aster and am trying to parse a PDF (or HTML) file so that it is inserted into a table in the Beehive database in Aster. The entire PDF should correspond to a single row of data in the table.
This is to be done using one of Aster's SQL-MR functions, called documentParser. It produces a text file (.rtf) containing a single row, produced by parsing all the chapters from the PDF file, which would then be loaded into the table in Beehive.
I have been given this script that shows the use of documentParser and the other steps involved in this parsing process:
# SHELL INSTRUCTIONS
# transform the file to base64 (change the file names to your relevant file)
base64 pp.pdf > pp.b64
# prepare a load file
rm my_load_file.txt
# get the content of the file
var=$(cat pp.b64)
# put it in the load file
echo \""pp.b64"\"","\""$var"\" >> "my_load_file.txt"
# create the staging table
act -U db_superuser -w db_superuser -d beehive -c "drop table if exists public.cf_load_file;"
act -U db_superuser -w db_superuser -d beehive -c "create dimension table public.cf_load_file(file_name varchar, content varchar);"
# load into the staging table
ncluster_loader -U db_superuser -w db_superuser -d beehive --csv --verbose public.cf_load_file my_load_file.txt
# use documentParser to load the clean text (you will need to create the target table beforehand)
act -U db_superuser -w db_superuser -d beehive -c "INSERT INTO got_data.cf_got_text_data (file_name, content) SELECT * FROM documentParser (ON public.cf_load_file documentCol ('content') mode ('text'));"
# done
However, I am stuck on the last step of the script because it looks like there is no function called documentParser in the list of functions that are available in Aster. This is the error I get -
ERROR: function "documentparser" does not exist
I tried to search for this function several times with the command \dF, but did not get any match.
I've attached a picture which presents the gist of what I'm trying to do.
SQL-MR Document Parser
I would appreciate any help if anyone has any experience with this.
What happened is that someone told you about this function, documentParser, but never gave you the function archive file (documentParser.zip) to install in Aster. This function does exist, but it's not part of the official Aster Analytics Foundation (AAF). Please contact the person who gave you this info for help.
documentParser belongs to the so-called field functions that are developed and used by the Aster field team only. Not that you can't use it, but don't expect support to help you; only whoever gave you access to it can.
If you don't have any contacts, then the next course of action I'd suggest is to go to the Aster Community Network and ask about it there.

SQL Server: query for exporting to file

I'm trying to learn the basics of SQL programming; I am working with SQL Server 2014. I have managed to import a file into a table with the command:
BULK INSERT Db.dbo.Co2_table
FROM 'd:\dataset_co2.txt'
with
(
FIRSTROW =2,
ROWTERMINATOR ='\n'
)
GO
I would like to do the reverse operation, that is, export the contents of a table to a file. I have tried:
SELECT *
INTO OUTFILE 'C:\datadump\sqldbdump.txt"
FROM dbo.alarms_2_2014
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
bcp Db.dbo.Co2_table out "C:\users\ws5.en-cre\desktop\prova.txt" -T –c
sqlcmd -S . -d Db -E -s, -W -Q "SELECT * FROM dbo.Co2_table" > ExcelTest.csv
But none of these seem to work (I get error messages). Any idea?
I suspect you are running those commands from Management Studio. You should use the console (command prompt) for these commands. This works for me. Also check whether you have permissions on that folder.
bcp "select * from Db.dbo.Co2_table" queryout C:\users\ws5.en-cre\desktop\prova.txt -c -T
or
bcp Db.dbo.Co2_table out C:\users\ws5.en-cre\desktop\prova.txt -c -T
Also, you have a suspicious symbol in the -c parameter: -T –c. That – is not a regular dash (-).
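If you really want to launch it from a Management Studio query window instead of the console, one option (assuming xp_cmdshell is enabled on the server, which it usually is not by default) is to wrap the same bcp call, roughly like this:
EXEC xp_cmdshell 'bcp "select * from Db.dbo.Co2_table" queryout "C:\users\ws5.en-cre\desktop\prova.txt" -c -T';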
Thank you for your answers and suggestions, and apologies for my lack of precision and my late reply (I missed the notifications from Stack Overflow).
Regarding the question of whether I use Management Studio or the console: what I do is click "New Query" in Management Studio, write the code, and press Execute. So I guess the answer is that I use Management Studio.
If I try:
bcp "select * from Db.dbo.Co2_table" queryout
C:\users\ws5.en-cre\desktop\prova.txt -c –T
it says
Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'queryout'.
I guess in this case one of the problems is that the quotes are missing, but even adding them doesn't solve the problem.
I am looking for a solution that can be implemented as a script. I am familiar with Excel VBA macros; I would like to implement something like that.
Thanks,
Alex

SQL Server BCP Empty File

I'm trying to use bcp to query out a comma-separated value file, but each time I get an empty file.
Here's my bcp command:
bcp "SELECT * FROM ##OutAK " QUERYOUT D:\Outbound\raw\li14090413.raw -c -T -t -S DB1
I have verified that ##OutAK is NOT empty, because select count(*) from ##OutAK is not 0. When I open the file using a hex editor, I see the following:
0D 0A
I found the problem. It seems BCP is "allergic" to NULLs. So I just applied ISNULL() to all the nullable fields, and the output file is back to normal now.
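For illustration, a minimal sketch of that kind of fix; the column names here are made up, not the real ones from ##OutAK:
bcp "SELECT ISNULL(Col1, ''), ISNULL(CONVERT(varchar(30), Col2), '') FROM ##OutAK" QUERYOUT D:\Outbound\raw\li14090413.raw -c -T -S DB1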

Exporting query result to Excel

I am trying to execute the SQL below, but I am getting "Invalid object name '.Sheet1$'."
INSERT INTO OPENDATASOURCE
('Microsoft.Jet.OLEDB.4.0',
'Database=c:\test.xls;Extended Properties=Excel 8.0')..[Sheet1$])
SELECT col1 FROM table;
It's in MS SQL Server 2005.
Any help is appreciated.
If you have xp_cmdshell enabled, you can do this to export to a delimited text file, which will open perfectly in Excel.
EXEC xp_cmdshell 'SQLCMD -S [SERVERNAME] -d [DBNAME] -o "C:\Output.txt" -s "," -U "[USERNAME]" -P "[PASSWORD]" -Q "SELECT TOP 10 * FROM table"';
According to this posting (and a few other samples Google found for me), you need three dots before the table: 8.0)...[Sheet1$]. (Don't ask me why).
Added: The German translation of this provides a complete example of Excel access:
SELECT * FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=C:\DataFolder\Documents\TestExcel.xls;Extended Properties=EXCEL 5.0')...[Sheet1$] ;
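Applied to the statement from the question, the corrected insert would then read roughly like this (an untested sketch; note the three dots and the Data Source key, as in the example above):
INSERT INTO OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=c:\test.xls;Extended Properties=Excel 8.0')...[Sheet1$]
SELECT col1 FROM table;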