How can I insert a field from storage into my SQL query in BigQuery?

I have made an SQL statement like the following example:
SELECT ip
FROM ip_table
LIMIT 500
Then I saved the result to a Google Cloud Storage table in CSV format. Now I find that I want more data about the IPs I queried previously. Can I read the IPs that I saved in the previous query and use them in a new query like this:
SELECT more_info
FROM ip_table
WHERE ip = ip_from_my_csv_file
where ip_from_my_csv_file should iterate over the IPs in my CSV file.
Can you help me achieve this?

You can create an external table (for example, named my_csv_file) on top of your CSV file (see Using External Data Sources) and then use it in your query:
SELECT more_info
FROM `project.dataset.ip_table`
WHERE ip in (SELECT DISTINCT ip FROM `project.dataset.my_csv_file`)
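A minimal sketch of the external-table definition itself, assuming a bucket path of gs://my-bucket/ips.csv, a single ip column, and a header row (all of these are assumptions; adjust them to match your file):

```sql
-- Sketch: define an external table over the CSV file in Cloud Storage.
-- The bucket URI and one-column schema below are assumptions.
CREATE EXTERNAL TABLE `project.dataset.my_csv_file`
(
  ip STRING
)
OPTIONS (
  format = 'CSV',
  uris = ['gs://my-bucket/ips.csv'],
  skip_leading_rows = 1  -- drop this if your CSV has no header row
);
```

Once defined, the external table can be queried like any other table, as in the query above.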

Related

Sql script to search value in column of database by taking value from a file

I have a CSV file with two columns. The file has over 200,000 rows. In the database I have the same table with the same values.
How can I write a script so that I can search for the values that are present in file but not in database?
I am using SQL Developer for this
Creating an external table is the best option when you want to read the contents of a flat file using a SELECT query.
See the Oracle documentation for more about how to create an external table.
After creating the external table, you can use a query similar to the one below to identify the records that exist only in the external table (i.e. the flat file).
select *
from new_external_table et
where not exists (select 1 from source_table st where et.column_name=st.column_name);
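A sketch of the external-table DDL, assuming a directory object my_dir already points at the folder holding the file (the directory name, column names, and file name below are assumptions):

```sql
-- Sketch: an external table over the two-column CSV file.
-- Requires an existing directory object, created by a DBA, e.g.:
--   CREATE DIRECTORY my_dir AS '/path/to/files';
CREATE TABLE new_external_table (
  column_name   VARCHAR2(100),
  second_column VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('data.csv')
);
```

After this, the NOT EXISTS query above can be run against new_external_table directly.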

Compress Oracle Query Column

I have read access to an Oracle database and am using cx_Oracle to make queries. One of the table's columns is a CLOB containing XML strings. The network link is very slow, so to speed up access I would like the database to compress the XML data before sending it; I would decompress the data on my end. Is there any way to do this? I am looking for something like:
SELECT COMPRESS(clob_column) AS comp_data
FROM table1
WHERE id=1
Thanks
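One possibility, offered as a sketch rather than a tested solution: Oracle's UTL_COMPRESS package compresses RAW/BLOB data, so the CLOB would first have to be converted to a BLOB. The clob_to_blob() helper below is hypothetical; you would have to write it (typically with DBMS_LOB.CONVERTTOBLOB), and with read-only access that may not be an option.

```sql
-- Sketch (untested): UTL_COMPRESS works on BLOB/RAW, not CLOB, so the
-- value must be converted first. clob_to_blob() is a hypothetical helper
-- built on DBMS_LOB.CONVERTTOBLOB; it does not exist out of the box.
SELECT UTL_COMPRESS.LZ_COMPRESS(clob_to_blob(clob_column)) AS comp_data
FROM table1
WHERE id = 1;
```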

Oracle SQL: How to Query A CSV With No Header/Column Names?

I have a third-party tool that uses a CSV text driver, which allows SQL queries to be executed against CSV data imported into the tool. Most Oracle SQL queries work with it, though some do not.
I need to read and import data into the tool from a CSV file that has no column names or header fields. How can I execute SQL queries against a table that has no column names or headers defined?
Sample Table:
AB 100 GPAA 9876
AC 101 GPAB 9877
AD 102 GPAC 9878
You would likely need to add the headers before running the queries. Is there a table in which the data will eventually end up? If so, you could export the column names from there first, then append the CSV info to the newly created file afterward.
So apparently, you can specify whether or not your CSV file has a header when using the CSV SQL text driver to interact with CSV files:
jdbc:csv:////<path_to_file>/?_CSV_Header=false;
Then, we can have a query like
select distinct (column1) as accountID, (column2) as groupID from csv_file_name
The identifiers (column1), (column2), and so on represent the actual columns in the file from left to right, and they have to be written in this form for the query to work.

postgresql/psql query where clause multiple "or" read from file

Sorry if a similar question has been asked before (I searched but couldn't find anything useful).
I have a table that holds details of data transfers. One of the fields is the IP associated with the transfer. I need a query that returns the subset of records matching one of 79 IPs (there are 608 distinct IPs in the table). I have a file with the required IPs separated by newlines. Is there a way to write a query that reads this file of IPs to get the required records, instead of manually entering each IP separated by an "or"?
If you have the text with IPs separated by newlines available in the database or your client, this query does the job: transform the list to an array, unnest it, and join to the main table:
SELECT *
FROM (SELECT unnest(string_to_array(your_list_of_ips, E'\n')) AS ip) sub
JOIN data_transfers d USING (ip);
See the PostgreSQL manual for more about string_to_array() and unnest().
SQL COPY
To import from a file directly, you could use COPY. The data file has to be on the same machine as Postgres and you need to be a database superuser for this.
This time we already have a single IP per row:
CREATE TEMP TABLE tmp(ip text);
COPY tmp FROM '/path/to/file';
SELECT *
FROM tmp
JOIN data_transfers d USING (ip);
psql \copy
If your file is on a different machine or if you do not have superuser privileges, use psql's (mostly) equivalent \copy instead. To do it from bash (as requested in the comment):
psql dbname
dbname=# \set ips `cat ips.txt`
dbname=# SELECT *
dbname-# FROM (SELECT unnest(string_to_array(:'ips', E'\n')) AS ip) sub
dbname-# JOIN data_transfers d USING (ip);
\set is the psql meta-command to set a variable, in this case to the contents of a file (ips.txt being your file with IPs).
:'ips' is the syntax for single-quoted SQL interpolation.
More details in the psql manual.
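For completeness, the \copy meta-command itself mirrors the server-side COPY example above, but runs client-side and needs no superuser rights (a sketch assuming the same temp table and that ips.txt sits in psql's working directory):

```sql
-- Sketch: run inside a psql session; \copy reads the file on the client.
CREATE TEMP TABLE tmp(ip text);
\copy tmp FROM 'ips.txt'
SELECT *
FROM tmp
JOIN data_transfers d USING (ip);
```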
Here is a related case on pgsql-general.

Read a Text/CSV file from an SQL statement

I wish to read data from an external text/CSV file and run an SQL query against it. Is this possible without using the external table concept? I do not have write permissions on the DB, so I cannot create a temporary table there.
Basically, I have a list of employee numbers (around 100) in a text file, using which I wish to run the following query each time:
SELECT emp_record FROM emp_data WHERE emp_no = "#file-containing-number"
I have to run a series of tasks on these and they are in no particular order or sequence, but have been provided in that text file as a list.
I am using the TOAD client and have only read-only permissions on the DB I connect to.
When I do this sort of thing, I open the file in Notepad, add a comma to the end of each line, and use the following SQL query:
select emp_record FROM emp_data WHERE emp_no IN (
... Paste contents of file here.
)
No - based on the limitations you mention in your question.
Are you saying you cannot even insert these records into a table in the database? Who is imposing these restrictions? You have a job to do. Other support staff should help in providing a means to accomplish the job.