I'm trying to insert binary data into a blob using SQLite3's shell, which means regular SQL statements. Here's my table:
CREATE TABLE MYTABLE
(ID INTEGER,
BINDATA BLOB NOT NULL,
SOMEFK INTEGER REFERENCES OTHERTABLE(ID) NOT NULL,
PRIMARY KEY(ID)
);
And this is the kind of insert statement I'm trying:
INSERT INTO MYTABLE (BINDATA, SOMEFK)
VALUES (__READBINDATA('/tmp/somefile'), 1);
With __READBINDATA(file) being the function I am looking for. Is that possible?
There is no built-in SQL function to read a file into a blob (and older versions of the sqlite3 shell do not offer one either).
However, with the help of the hexdump tool, it's possible to transform a file's contents into a blob literal:
echo "insert into mytable(bindata, somefk) " \
"values(x'"$(hexdump -v -e '1/1 "%02x"' /tmp/somefile)"', 1);"
This command can then be piped into the sqlite3 shell.
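For example (mydb.sqlite is a placeholder for your actual database file):
echo "insert into mytable(bindata, somefk) " \
"values(x'"$(hexdump -v -e '1/1 "%02x"' /tmp/somefile)"', 1);" \
| sqlite3 mydb.sqlite
Note that newer versions of the sqlite3 shell (3.8.6 and later, if I remember correctly) do ship a readfile() function, so on a recent shell you can skip the hexdump detour entirely:
INSERT INTO MYTABLE (BINDATA, SOMEFK) VALUES (readfile('/tmp/somefile'), 1);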
I have a bigint column named mycolumn. I execute SQL scripts using the psql command.
Using COPY command:
COPY public.mytable (myothercol, mycolumn) FROM stdin;
1 \N
\.
This works. But the following does not work:
EXECUTE 'insert into public.mytable (myothercol, mycolumn) values ($1,$2);' USING
1,NULL;
This gives me error:
column "mycolumn" is of type bigint but expression is of type text
Why does insert not work for null value, whereas COPY works?
Your best bet is to tell PostgreSQL explicitly to convert the parameter to bigint:
EXECUTE 'insert into public.mytable (myothercol, mycolumn) values ($1,$2::bigint);'
USING 1,NULL;
The problem is that PostgreSQL cannot infer the data type of a bare NULL, so it guesses text, which then clashes with the bigint column. COPY does not have to guess, because it already knows the type of each target column.
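Alternatively, you can pass a NULL that already carries a type, which avoids the cast inside the SQL string (same hypothetical table as above):
EXECUTE 'insert into public.mytable (myothercol, mycolumn) values ($1,$2);'
USING 1, NULL::bigint;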
In PostgreSQL I previously created a table like so:
CREATE TABLE IF NOT EXISTS stock_data (
code varchar,
date date,
open decimal,
high decimal,
low decimal,
close decimal,
volume decimal,
UNIQUE (code, date)
);
The idea is to import multiple csv files into this table. My approach is to use COPY ... FROM STDIN instead of COPY ... FROM '/path/to/file', because I want to be able to cat multiple csv files from the shell and pipe them into the sql script. The sql script currently looks like this:
CREATE TEMPORARY TABLE IF NOT EXISTS stock_data_tmp (
code varchar,
ddate varchar,
open decimal,
high decimal,
low decimal,
close decimal,
volume decimal,
UNIQUE (code, ddate)
);
\copy stock_data_tmp FROM STDIN WITH CSV;
INSERT INTO stock_data
SELECT code, to_date(ddate, 'YYYYMMDD'), open, high, low, close, volume
FROM stock_data_tmp;
DROP TABLE stock_data_tmp;
An example csv file looks like this
AAA,20140102,21.195,21.24,1.16,1.215,607639
BBB,20140102,23.29,2.29,2.17,2.26,1863
CCC,20140102,61.34,0.345,0.34,0.34,112700
DDD,20140102,509.1,50.11,50.09,50.11,409863
From the shell I try:
cat /path/to/20140102.txt | psql -d my_db_name -f ~/path/to/script/update_stock_data.sql
But it gives me this error:
psql:/path/to/script/update_stock_data.sql:22: ERROR: missing data for column "date"
CONTEXT: COPY stock_data_tmp, line 1: ""
However, if in my script I change the COPY command to:
\copy stock_data_tmp FROM '/path/to/20140102.txt' WITH csv;
... and simply call
psql -d my_db_name -f ~/path/to/script/update_stock_data.sql
it succeeds.
Why am I getting this error when using cat and STDIN, and not when using the file PATH?
Because when you run the script with -f, \copy ... FROM STDIN reads its data rows from the same source that issued the command, i.e. from the script file itself rather than from psql's standard input. It therefore tries to parse the lines following the \copy command in your script as CSV data, and fails on the first one.
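If you want to keep piping the data in while running the script with -f, \copy also accepts pstdin, which always means psql's own standard input, regardless of where the command itself came from:
\copy stock_data_tmp FROM pstdin WITH CSV;
With that one change, the original invocation should work as intended:
cat /path/to/20140102.txt | psql -d my_db_name -f ~/path/to/script/update_stock_data.sql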
I am going through a tutorial on how to use DB2 in a Linux environment.
I am supposed to connect to a database, create a table, and insert some data in the db2 shell:
db2 connect to c3421m
db2
db2 => update command options using z ON Assignment_0.txt
db2 => update command options using v ON
db2 => CREATE TABLE BAND_2015    <-- this is where I get stuck; it gives an error
I am supposed to create the table by executing the above statement in the DB2 shell. The code given is:
create table band_2015 ( \
band_no integer not null primary key, \
band_name varchar(25) not null, \
band_home varchar(25) not null, \
band_type varchar(10) check (band_type in ('concert','rock','jazz','military')), \
b_start_date date not null, \
band_contact varchar(10) not null )
So how do I create this table? I was told to copy it into a text editor (do I save it as band_2015.sql?). I am completely new to this, but I have a lot of experience in other programming languages...
The problem is the statement terminator. By default it is the carriage return (Enter), so every line is treated as a complete statement. For your tutorial, however, you need to type multi-line statements, so you have to change the terminator by defining another character.
For the semicolon:
db2 -t
select *
from table;
For the hash sign, or any other character:
db2 -td#
select *
from table #
For no terminator character (the default; each statement must fit on one line):
db2
select * from table
By default, commands and statements in the DB2 command-line processor cannot span multiple lines, so the CLP treats CREATE TABLE BAND_2015 as a complete statement, which it is of course not. The backslashes in the code given to you are there for a reason -- they tell the CLP that the statement continues on the next line.
Alternatively, you can start the CLP with the command line option -t, which will designate the semicolon, instead of the new line, as the statement terminator. You can then type the statement as you did, without the backslashes, and terminate it with the ";".
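And to answer the side question: yes, you can save the statement as band_2015.sql (without the backslashes, terminated by a semicolon) and, after connecting with db2 connect to c3421m, run the file with the -f option; -t sets the semicolon terminator and -v echoes each statement:
db2 -tvf band_2015.sql
where band_2015.sql contains:
create table band_2015 (
band_no integer not null primary key,
band_name varchar(25) not null,
band_home varchar(25) not null,
band_type varchar(10) check (band_type in ('concert','rock','jazz','military')),
b_start_date date not null,
band_contact varchar(10) not null
);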
CREATE TABLE updater
(
nzp_up SERIAL PRIMARY KEY,
version VARCHAR(50),
status INT,
report TEXT
);
INSERT INTO updater (version, status,report) values ('TestVersion' , 0,"123123123");
This fails with:
-617 SQL error: A blob data type must be supplied within this context.
Using a | (pipe) delimited file, you can use the LOAD command to insert values into BLOB and TEXT data types. I had the same problem in the past; see the link in my comment.
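A minimal sketch of that approach in dbaccess, assuming a pipe-delimited file updater.unl containing a line like TestVersion|0|123123123|:
LOAD FROM "updater.unl" DELIMITER "|"
INSERT INTO updater (version, status, report);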
See my question: Consistent method of inserting TEXT column to Informix database using JDBC and ODBC
It seems that some tools, like ODBC drivers, can insert text into a TEXT column directly, while others, like JDBC drivers, must use PreparedStatement or other techniques.
INSERT INTO updater (version, status,report)
values ('TestVersion' , 0,"123123123");
and
INSERT INTO updater (version, status,report)
values ('TestVersion' , 0,'123123123');
have the same effect in MySQL, so let's try it without the double quotes in the SQL.
I have a big SQL file (~ 200MB) with lots of INSERT instructions:
insert into `films_genres`
(`id`,`film_id`,`genre_id`,`num`)
values
(1,1,1,1),
(2,1,17,2),
(3,2,1,1),
...
How could I remove or ignore columns id, num in the script?
The easiest way might be to do the full insert into a temporary holding table, then insert only the desired columns into the real table from there.
insert into `films_genres_temp`
(`id`,`film_id`,`genre_id`,`num`)
values
(1,1,1,1),
(2,1,17,2),
(3,2,1,1),
...
insert into `films_genres`
(`film_id`,`genre_id`)
select `film_id`,`genre_id`
from `films_genres_temp`
CREATE TABLE #MyTempTable (id int,film_id smallint, genre_id int, num int)
INSERT INTO #MyTempTable (id,film_id,genre_id,num)
[Data goes here]
insert into films_genres (film_id,genre_id) select film_id,genre_id from #MyTempTable
drop table #MyTempTable
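Note that the #MyTempTable variant above uses SQL Server-style temporary tables; since the dump uses MySQL backticks, the same idea in MySQL syntax might look like this (a sketch, with column types guessed from the data):
CREATE TEMPORARY TABLE films_genres_temp (id INT, film_id INT, genre_id INT, num INT);
-- run the big INSERT against films_genres_temp here
INSERT INTO films_genres (film_id, genre_id)
SELECT film_id, genre_id FROM films_genres_temp;
DROP TEMPORARY TABLE films_genres_temp;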
This Perl one-liner should do it:
perl -p -i.bak -e 's/\([^,]+,/\(/g; s/,[^,]+\)/\)/g' sqlfile
It edits the file in place, but creates a backup copy with the extension .bak.
Or, if you prefer Ruby:
ruby -p -i.bak -e 'gsub(/\([^,]+,/, "("); gsub(/,[^,]+\)/, ")")' sqlfile
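Applied to the sample above, both one-liners strip the first and the last value from every parenthesized group, including the column list, leaving:
insert into `films_genres`
(`film_id`,`genre_id`)
values
(1,1),
(1,17),
(2,1),
...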