How to write Impala query results to a CSV file with the JDBC driver

Using Impala JDBC driver, is it possible to write query results to a CSV file?
OR is impala-shell the only way to achieve this?
What are the security/performance side-effects of calling impala-shell from a web application?
Thanks.

I don't believe Impala's SQL syntax currently supports this feature. I've filed an issue to request it.
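One client-side workaround is to read the result set over JDBC and write the CSV in your own code. Below is a minimal sketch, assuming the Impala JDBC driver is on the classpath; the connection URL, query, and output file are placeholders:

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class ImpalaToCsv {
    public static void main(String[] args) throws Exception {
        // Placeholder connection URL; adjust host, port, and auth settings for your cluster.
        String url = "jdbc:impala://impala-host:21050/default";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_table");
             PrintWriter out = new PrintWriter("results.csv", "UTF-8")) {

            ResultSetMetaData meta = rs.getMetaData();
            int cols = meta.getColumnCount();

            // Header row from the result set metadata
            StringBuilder header = new StringBuilder();
            for (int i = 1; i <= cols; i++) {
                if (i > 1) header.append(',');
                header.append(meta.getColumnLabel(i));
            }
            out.println(header);

            // Data rows; escaping here is deliberately naive
            while (rs.next()) {
                StringBuilder row = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) row.append(',');
                    String val = rs.getString(i);
                    row.append(val == null ? "" : val);
                }
                out.println(row);
            }
        }
    }
}
```

For values that may contain commas, quotes, or newlines, use a proper CSV library rather than the naive string building above.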

Related

Are there any VSCode extensions that will connect to BigQuery?

I'm using VSCode to query into a Snowflake DWH using an extension (SQLTools and the Snowflake driver).
Is there a similar VSCode extension I can use to run SQL queries in BigQuery?
Try BigQuery Runner.
It is an extension that queries BigQuery and displays the results.
I'm not sure if it fits your requirements perfectly.
I did a quick Google search and here is the first result:
https://github.com/google/vscode-bigquery
I think you should do some searching before asking.

Connect to Vertica from DataGrip

I'm using DataGrip and I'm kind of new to it. I have a case where I need to connect to a Vertica DB. As far as I know, there is no natively provided driver for connecting to that type of database. What steps should I take to connect to it? Is there some driver to deal with this?
Thanks!
You should be able to add the Vertica jdbc jar as a driver. Download it from the Vertica site, then:
Go to File / Data Sources
Right click somewhere and click Add / Driver
Give it a name
Select the jdbc jar file you downloaded
Set the class to com.vertica.jdbc.Driver
Dialect: PostgreSQL
As for how well this works, I'm not sure; it really depends on how DataGrip uses JDBC. But this is how you would add it.
In addition to woot's answer,
I would add that when you set up the data source connection to Vertica, set the URL in the format below.
URL: jdbc:vertica://{HOST}:{PORT}/{DB}
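For a quick sanity check of the driver class and URL outside of DataGrip, the same settings can be exercised from plain JDBC. A minimal sketch; the host, port, database, and credentials below are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class VerticaJdbcCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own host, port, database, and credentials.
        String url = "jdbc:vertica://vertica-host:5433/mydb";
        Properties props = new Properties();
        props.setProperty("user", "dbadmin");
        props.setProperty("password", "secret");

        // Uses the same driver class DataGrip is configured with: com.vertica.jdbc.Driver
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT version()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```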

Can Apache Drill parse a SQL statement for table names?

I have an existing Java project in Eclipse and I want to be able to parse a SQL statement that a user will input in order to find the table names in the statement. Is it possible for Apache Drill to accomplish this task, and if so how do I go about doing it?
I have been looking at the documentation for Drill but all I can find is a way to create functions in Eclipse that can later be used in Command Prompt. However, what I want is a way to use some sort of parsing function inside the Java project to find the table names in the user input.
You can use JSqlParser to get table names from a SQL statement. Find the documentation here.
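A minimal sketch with JSqlParser, assuming the jsqlparser dependency is on the classpath (the exact API differs slightly between versions):

```java
import java.util.List;

import net.sf.jsqlparser.parser.CCJSqlParserUtil;
import net.sf.jsqlparser.statement.Statement;
import net.sf.jsqlparser.util.TablesNamesFinder;

public class TableNameExtractor {
    public static void main(String[] args) throws Exception {
        String sql = "SELECT o.id, c.name FROM orders o JOIN customers c ON o.customer_id = c.id";

        // Parse the statement and collect every table referenced in it.
        Statement stmt = CCJSqlParserUtil.parse(sql);
        List<String> tables = new TablesNamesFinder().getTableList(stmt);

        System.out.println(tables); // e.g. [orders, customers]
    }
}
```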
This is not a task for Apache Drill, but you can use the rich SQL parser from the Apache Calcite toolkit (http://calcite.apache.org/) in isolation.
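If you go the Calcite route, the standalone parser hands back a SqlNode tree; table names can then be collected by walking the identifiers under the FROM clauses. A parse-only sketch, assuming the calcite-core dependency:

```java
import org.apache.calcite.sql.SqlNode;
import org.apache.calcite.sql.parser.SqlParseException;
import org.apache.calcite.sql.parser.SqlParser;

public class CalciteParseDemo {
    public static void main(String[] args) throws SqlParseException {
        String sql = "SELECT name FROM customers WHERE id = 1";

        // Parse the statement into Calcite's SqlNode AST; extracting table names
        // from here means visiting the identifiers in the FROM clauses.
        SqlParser parser = SqlParser.create(sql);
        SqlNode ast = parser.parseQuery();

        System.out.println(ast); // re-rendered SQL, useful to confirm the parse succeeded
    }
}
```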

Can I append data to an existing table in Google BigQuery using the browser tool?

I am trying to append data to an existing table in Google Bigquery. Can I use the browser tool or is it only possible using the Python command line with the API?
Please let me know.
Thanks in advance.
Unfortunately, this is not supported in the UI. I've filed a feature request to add this support. You should be able to do an append by using the bq command-line client or by issuing API requests directly.
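If you go the API route, the append behaviour comes from the write disposition on the load job. Here is a sketch using the Java client library (the library choice and names are an assumption, not part of the answer above; dataset, table, and source URI are placeholders):

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.LoadJobConfiguration;
import com.google.cloud.bigquery.TableId;

public class AppendToTable {
    public static void main(String[] args) throws Exception {
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Placeholder dataset, table, and Cloud Storage source file.
        TableId table = TableId.of("my_dataset", "my_table");
        LoadJobConfiguration config =
            LoadJobConfiguration.newBuilder(table, "gs://my-bucket/new_rows.csv")
                .setFormatOptions(FormatOptions.csv())
                .setWriteDisposition(JobInfo.WriteDisposition.WRITE_APPEND) // append instead of overwriting
                .build();

        Job job = bigquery.create(JobInfo.of(config)).waitFor();
        System.out.println(job.getStatus().getError() == null
                ? "Append succeeded"
                : job.getStatus().getError().toString());
    }
}
```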

Importing Multivalue DB with SSIS into SQL

I would like to know if it is possible to transfer data into SQL Server from a multivalue database file using SSIS.
The only thing that I could find online was using a bluefinity tool to achieve this.
Thanks
Simona
I have done this from Universe, but Universe has an ODBC driver that allows the database to be viewed as if it were tables. Almost like SQL views.
SSIS can import from almost anything that you can get either a .NET, ODBC, or OLE DB driver for. There has to be some way to talk to the DB from an external program, though.
I suspect that you have no driver (ODBC, OLE DB, .NET). In that case you can use a C# script (even from SSIS) to extract the data to a flat file and then load it into SQL Server, or load it into SQL Server directly.