How to export the sub-object value when using SequoiaDB export tool?

How can I export the value of a sub-object when using the SequoiaDB export tool? For example, in the following situation I only want to export the value of expression.park, but the statement below reports an error:
sdbexprt -s "localhost" -p 11810 --type csv --file foo.bar.csv --fields expression.park,startTime,endTime -c foo -l bar
How can I solve this problem?

You can use --select instead of --fields to filter.
For example: sdbexprt -s "localhost" -p 11810 --type csv --file test.csv --select '{"expression.park":1}' -c foo -l bar
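If you also need startTime and endTime in the export, the selector should accept several keys at once in the same JSON-projection style; a hedged variant of the command above, not verified against the sdbexprt documentation:
sdbexprt -s "localhost" -p 11810 --type csv --file test.csv --select '{"expression.park":1,"startTime":1,"endTime":1}' -c foo -l bar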
For more information, see the sdbexprt documentation: http://doc.sequoiadb.com/cn/SequoiaDB-cat_id-1479195621-edition_id-0

Related

Export Query Result as CSV file from Docker PostgreSQL container to local machine

I'm not sure if this is possible or if I'm doing something wrong, since I'm still pretty new to Docker. Basically, I want to export a query result from inside a PostgreSQL docker container as a csv file on my local machine.
This is what I have so far. First, I run my PostgreSQL docker container with this command:
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data postgres
Then I access the docker container with docker exec to run a PostgreSQL command that copies the query result to a csv file at a specified location, like this:
\copy (select id,value from test) to 'test_1.csv' with csv;
I thought that would export the query result as a csv file named test_1.csv on my local machine, but I couldn't find the file anywhere. I also checked both of these directories: $HOME/docker/volumes/postgres and /var/lib/postgresql/data.
You can export the data to STDOUT and pipe the result to a file on the client machine:
docker exec -it -u database_user_name container_name \
psql -d database_name -c "COPY (SELECT * FROM table) TO STDOUT CSV" > output.csv
-c tells psql to execute the given SQL statement once the connection is established.
So your command should look like this:
docker exec -it -u postgres pg-docker \
psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT CSV" > test_1.csv
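If you also want a header row in the CSV, PostgreSQL's COPY accepts a HEADER option in CSV format; a hedged variant of the command above, using the same placeholder database and table names:
docker exec -u postgres pg-docker psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT WITH (FORMAT csv, HEADER)" > test_1.csv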
The /var/lib/postgresql/data directory is where the database server stores its data files. It isn't a directory that users need to manipulate directly, and it isn't where your exported file will end up.
Paths like test_1.csv are relative to the working directory. The default directory when you enter the postgres container with docker exec is /, so that's where your file should be. You can also switch to another directory with cd before running psql:
root@b9e5a0572207:/# cd /some/other/path/
root@b9e5a0572207:/some/other/path# psql -U postgres
... or you can provide an absolute path:
\copy (select id,value from test) to '/some/other/path/test_1.csv' with csv;
You can use docker cp to transfer a file from the container to the host:
docker cp pg-docker:/some/other/path/test_1.csv /tmp
... or you can create a volume if this is something you do often.
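For example, a minimal sketch of the volume approach, assuming you re-run the container with an extra bind mount (the $HOME/exports and /exports paths are just examples):
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data -v $HOME/exports:/exports postgres
Then, inside psql in the container:
\copy (select id,value from test) to '/exports/test_1.csv' with csv;
The file appears in $HOME/exports on the host, because \copy writes relative to the psql client, which is running inside the container.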

Import CSV from Linux to Azure SQL Server

I have an Azure SQL Server database and a linux box. I have a csv file on the linux machine that I want to import into SQL Server. I have a table already created where I am going to import this file.
Why does this command return an Unknown argument: -U?
bcp table in ~/test.csv -U myUsername -S databaseServerName -d dbName -q -c -t
When I rearrange the arguments passed to bcp like below, it returns an Unknown argument: -S
bcp table in ~/test.csv -S databaseServerName -d dbName -U myUsername -q -c -t
So, contrary to the documentation (https://learn.microsoft.com/en-us/sql/tools/bcp-utility?redirectedfrom=MSDN&view=sql-server-2017#U), I hit issues where bcp does not like spaces after the argument names.
https://granadacoder.wordpress.com/2009/12/22/bcp-export/
quote from the article above:
The other syntax sugar is that there is no space after the -S argument. As seen below:
-SMyServerName\MyInstanceName
bcp.exe "SELECT cast(LastName as char(50)), cast(FirstName as char(50)), cast(MiddleName as char(50)), cast(Suffix as char(50)) FROM MyAdventureWorksDB.Person.Person ORDER BY NEWID()" queryout PeopleRock.txt -c -t -T -SMyServerName\MyInstanceName
Also see: https://www.easysoft.com/products/data_access/odbc-sql-server-driver/bulk-copy.html#importing-data-table
Check your syntax on Linux (the example below is from the Easysoft link above):
./bcp AdventureWorks.HumanResources.myTeam in ~/myTeam.csv \
-f ~/myTeam.Fmt -U mydomain\myuser -S mymachine\sqlexpress
Note that the command above uses the three-part name dbname.schemaname.tablename (before the word "in").
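Putting the two notes together, a hedged rewrite of the question's original command with no spaces after the argument names and a three-part table name (dbo is an assumed schema, and the comma terminator is a guess for a .csv file):
bcp dbName.dbo.table in ~/test.csv -UmyUsername -SdatabaseServerName -q -c -t','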

Unexpected argument executing cmdexec on a SQL job to export to CSV

I'm trying to run this in a SQL job:
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" > D:\Test.csv
How can I fix this error?
Sqlcmd: '> D:\Test.csv': Unexpected argument.
Have you tried it like this?
sqlcmd -S . -d CI_Reports -E -s"," -W -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" -o D:\Test.csv
where -o output_file identifies the file that receives output from sqlcmd.
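If you also want to suppress the column-name and separator rows in the CSV, sqlcmd's -h -1 option turns headers off; a hedged variant of the command above:
sqlcmd -S . -d CI_Reports -E -s"," -W -h -1 -Q "SET NOCOUNT ON SELECT * FROM [dbo].[Table]" -o D:\Test.csv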
Additionally, you could try BCP, which is best suited for bulk copying data between an instance of Microsoft SQL Server and a data file in a user-specified format.
Read more here.
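For completeness, a hedged BCP equivalent of the same export (queryout mode; -T assumes a trusted connection, matching the -E used with sqlcmd above):
bcp "SELECT * FROM CI_Reports.dbo.[Table]" queryout D:\Test.csv -c -t"," -S . -T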

BCP import all files from a folder to database

How do I do a BCP import with all the files in a folder?
folder
  file1.csv
  file2.csv
I need to import both files.
bcp <tableName> in <filename> -t "^" -r "\n" -c -C 28591 -S <databaseinstance> -U <username> -P <password>
Using the above BCP cmd, we can import only one file at a time.
A simple BCP command imports only a single file. To import every file in the folder, wrap the command in a loop. I used the following simple command:
for /r %i in (*) do bcp <tablename> in %i -t "^" -r "\n" -c -C 28591 -S <databaseinstance> -U <username> -P <password>
It works.
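Note that the single-percent loop variable (%i) only works when the command is typed directly at the prompt; inside a .bat file the percent signs must be doubled (same placeholders as above, with *.csv restricting the loop to the CSV files from the question):
for /r %%i in (*.csv) do bcp <tablename> in "%%i" -t "^" -r "\n" -c -C 28591 -S <databaseinstance> -U <username> -P <password>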

How to execute Postgres SQL queries from a batch file?

I need to execute SQL from a batch file.
I am executing the following to connect to Postgres and select data from a table:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
select * from test;
I am able to connect to the database; however, I'm getting the error
'select' is not recognized as an internal or external command,
operable program or batch file.
Has anyone faced such issue?
This is one of the queries I am trying; something similar works in a shell script (please ignore syntax errors in the query, if there are any):
copy testdata (col1,col2,col3) from '%filepath%/%csv_file%' with csv;
You could pipe it into psql
(
echo select * from test;
) | C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
When closing parentheses are part of the SQL query, they have to be escaped with three carets.
(
echo insert into testconfig(testid,scenarioid,testname ^^^) values( 1,1,'asdf'^^^);
) | psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME%
Use the -f parameter to pass the SQL script file name:
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -f 'sql_batch_file.sql'
http://www.postgresql.org/docs/current/static/app-psql.html
-f filename
--file=filename
Use the file filename as the source of commands instead of reading commands interactively. After the file is processed, psql terminates. This is in many ways equivalent to the meta-command \i.
If filename is - (hyphen), then standard input is read until an EOF indication or \q meta-command. Note however that Readline is not used in this case (much as if -n had been specified).
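Putting that together, a minimal sketch of a Windows batch file that runs a SQL script without prompting for a password (the %DB_PASS% variable is an assumption; PGPASSWORD is the standard libpq environment variable that psql reads):
@echo off
set PGPASSWORD=%DB_PASS%
C:/pgsql/bin/psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -f sql_batch_file.sql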
If running on Linux, this is what worked for me (update the values below with your user, database name, etc.):
psql "host=YOUR_HOST port=YOUR_PORT dbname=YOUR_DB_NAME user=YOUR_USER_NAME password=YOUR_PASSWORD" -f "fully_qualified_path_to_your_script.sql"
You cannot put the query on a separate line; the batch interpreter will assume it's another command instead of a query for psql. I believe you will need to quote it as well.
I agree with Spidey: if you are passing a file with the SQL, use the -f or --file parameter.
When you want to execute several commands, the best way is to add the -f parameter and then just type the path to your file, without any " or ' marks (relative paths work as well):
psql -h %host% -p 5432 -U %user% -d %dbname% -f ..\..\folder\Data.txt
It also works in .NET Core; I needed it to add basic data to my database after migrations.
Kindly refer to the documentation:
1] if you are passing a file with the SQL, use the -f or --file parameter
2] if you are passing an individual command, use the -c or --command parameter
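For the single statement from the question, a hedged -c example using the question's own variables (\copy is the client-side form, so %filepath% is resolved on the machine running psql):
psql -h %DB_HOST% -p 5432 -U %DB_USER% -d %DB_NAME% -c "\copy testdata (col1,col2,col3) from '%filepath%/%csv_file%' with csv"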
If you are using a shell script:
psql postgresql://$username:$password@$host/$database < /app/sql_script/script.sql