Incorrect csv file from impala output - sql

Snapshot of the "csv" file
So I have a table created by Impala that I saved as a CSV file using the following command:
impala-shell -B -o output.csv --output_delimiter=',' -q "select * from foo"
This is supposed to return a CSV file, but it didn't.
Any help will be appreciated!

The output file you have is close to a CSV file, except that it uses pipes instead of commas. To import it into Excel, just use the import wizard and select the pipe (|) as the delimiter.
I'm not sure what went wrong with your Impala export, but you should be able to work with what you already have.
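If you do want real commas, a quick sketch of the conversion from the shell (the sample data below is hypothetical, standing in for the impala-shell output; a plain substitution like this is only safe if no field itself contains a pipe or a comma):

```shell
# Hypothetical sample of the pipe-delimited impala-shell output
printf '1|alice|2020\n2|bob|2021\n' > output.csv

# Replace the pipe delimiter with commas
# (safe only if no field contains '|' or ',')
sed 's/|/,/g' output.csv > output_comma.csv
cat output_comma.csv
```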

Related

How to write match results from .cypher into textfile via cypher shell (Windows)?

I want to write match results from Cypher code in a .cypher file, via cypher-shell, into a text file (I am trying to do this on Windows). The cypher file contains:
:begin
match(n) return n;
:commit
I tried to execute:
type x.cypher | cypher-shell.bat -u user -p secret > output.txt
I get no error, but at the end there is just an empty text file "output.txt" inside the bin folder. Testing the Cypher code directly in cypher-shell (without piping) works. Can anyone help, please?
Consider using the APOC library, which can export to different formats; maybe it can help you.
Export to CSV
Export to JSON
Export to Cypher Script
Export to GraphML
Export to Gephi
https://neo4j.com/labs/apoc/xx/export/ (replace xx with your Neo4j version, e.g. 4.0)
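For example, the CSV export can be triggered from a single Cypher call (a sketch only: the file name is hypothetical, and file export has to be enabled in the server's APOC settings):

```cypher
// Export the whole database to CSV via APOC
// (requires apoc.export.file.enabled=true in the server configuration)
CALL apoc.export.csv.all("export.csv", {});
```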

BCP import issue

I am using the BCP command to import a text file into a SQL Server database.
Below is the syntax:
exec xp_cmdshell 'bcp "Database.dbo.TableName" in "\\devm11\RND\2.txt" -c -t"þ|þ" -r -S "SQLservername" -T'
Since I am not able to attach the text file, please see the image below for the input file format.
Problem:
This file gives an EOF error while importing.
It does not work with the provided field terminators.
I tried different terminators such as "|", "^|" and "|^", but nothing works.
Please suggest how to resolve this issue.
Thanks
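One thing worth checking in the command above: the -r switch is given with no row terminator at all, which can leave bcp unable to find the end of a row and produce an unexpected-EOF error. A hedged sketch of what the corrected call might look like, assuming (this is an assumption, not something the question confirms) that rows end with the same þ|þ sequence followed by a newline:

```sql
-- Sketch only: the row terminator "þ|þ\n" is an assumption about the file;
-- adjust it to whatever terminator the file actually uses.
exec xp_cmdshell 'bcp "Database.dbo.TableName" in "\\devm11\RND\2.txt" -c -t"þ|þ" -r"þ|þ\n" -S "SQLservername" -T'
```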

Export PSQL table to CSV file using cmd line args

I am attempting to export my PSQL table to a CSV file. I have read many tips online, such as Save PL/pgSQL output from PostgreSQL to a CSV file
From these posts I have been able to figure out
\copy (Select * From foo) To '/tmp/test.csv' With CSV
However I do not want to specify the path, I would like the user to be able to enter it via the export shell script. Is it possible to pass args to a .sql script from a .sh script?
I was able to figure out the solution.
In my bash script I used something similar to the following
psql $1 -c "\copy (SELECT * FROM users) TO '$2' DELIMITER ',' CSV HEADER"
This copies the users table from the database given by the first argument and exports it to a CSV file at the path given by the second argument.
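Since running psql needs a live server, here is a sketch that only builds and prints the command for hypothetical arguments, to show how the quoting works when the path comes in as a positional parameter:

```shell
# Stand-in values for the script's $1 (database) and $2 (output path);
# in the real export script these come from the command line.
set -- mydb /tmp/users.csv

# Build the psql command: the \copy meta-command is double-quoted, and the
# user-supplied path is expanded inside single quotes for the TO clause.
psql_cmd="psql $1 -c \"\\copy (SELECT * FROM users) TO '$2' DELIMITER ',' CSV HEADER\""
echo "$psql_cmd"
```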

Can I execute .sql file from SQLite command line when I don't have a .db file?

I've been writing SQL in environments where the databases and tables are all easy to pull in using simple 'FROM db.table'. Now I'm trying to do my own project on .csv files. I want to be able to write all of my queries in .sql files and execute them using command line.
I'm uncertain about the following:
What the best program to use is.
How to execute a .sql file from the command line.
How to import a .csv file.
Yipes! I'm new to using command line and I'm new to pulling in my own tables.
I'm currently trying out SQLite3, but from the documentation* it doesn't look like I can simply execute a .sql file using SQLite3 on the command line.
I've tried running "sqlite3 HelloWorld.sql" in command line for a file that just has "SELECT 'Hello World';" in it and this is what I get:
SQLite version 3.9.2 2015-11-02 18:31:45
Enter ".help" for usage hints.
sqlite>
Any advice would be greatly appreciated!
https://www.sqlite.org/cli.html
On Windows you can execute SQL (files) via the command line:
>sqlite3 "" "SELECT 'Hello World!';"
Hello World!
>sqlite3 "" ".read HelloWorld.sql"
Hello World!
This won't create a database file, because the first parameter is empty (""); it would normally be the name of a database file.
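For the CSV-import part of the question, the SQLite CLI's .mode csv and .import dot-commands handle it. A minimal sketch (file names are hypothetical; the sample CSV has no header row because the table is created explicitly, though recent sqlite3 versions can also auto-create a table from a header row):

```shell
# Build a small sample CSV and start from a fresh database file
printf '1,alice\n2,bob\n' > /tmp/people.csv
rm -f /tmp/people.db

# Create the table, import the CSV, and query it in one pass
printf '%s\n' \
  'CREATE TABLE people(id INTEGER, name TEXT);' \
  '.mode csv' \
  '.import /tmp/people.csv people' \
  'SELECT count(*) FROM people;' | sqlite3 /tmp/people.db
```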

Hive output to xlsx

I am not able to open an .xlsx file. Is this the correct way to output the result to an .xlsx file?
hive -f hiveScript.hql > output.xlsx
hive -S -f hiveScript.hql > output.xls
This will work
There is no easy way to create an Excel (.xlsx) file directly from Hive. You could output your query's content to an older Excel format (.xls) as in the answers above, and it would open in Excel properly (with an initial warning in recent versions of Office), but in essence it is just a text file with an .xls extension. If you open this file with any text editor you will see the contents of the query output.
Take any .xlsx file on your system and open it with a text editor and see what you get. It will be all junk characters since that is not a simple text file.
Having said that, many programming languages allow you to read a text file and create an .xlsx from it. Since no information is provided/requested on this, I will not go into details. However, you may use pandas in Python to create Excel files.
Output a CSV or TSV file; I used Python (the pandas library) to do the conversion.
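The pandas conversion mentioned above can be sketched as follows (file names are hypothetical, and the sample data stands in for Hive's tab-separated output; writing .xlsx requires an engine such as openpyxl to be installed):

```python
import pandas as pd

# Hypothetical sample of Hive's tab-separated query output
with open("output.tsv", "w") as f:
    f.write("id\tname\n1\talice\n2\tbob\n")

# Read the TSV and write a real Excel workbook
df = pd.read_csv("output.tsv", sep="\t")
df.to_excel("output.xlsx", index=False)  # uses openpyxl for .xlsx
```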
I am away from my setup right now so really cannot test this. But you can give this a try in your hive shell:
hive -f hiveScript.hql >> output.xls