How can I update a single field in sqlite3 with the contents of a file? - sql

This is equivalent to my earlier question here, but for sqlite.
As before, I am trying to do the following using the sqlite3 command line client.
UPDATE my_table set my_column=CONTENT_FROM_FILE where id=1;
I have looked at the documentation on .import, but that seems to be a little heavyweight for what I am trying to do.
What is the correct way to set the value of one field from a file?
The method I seek should not impose constraints on the contents of the file.

Assuming the file content is all UTF-8 text and doesn't contain any quote characters that would be misinterpreted, you could do this (assuming a POSIX shell; on Windows try Cygwin):
$ echo "UPDATE my_table set my_column='" > temp.sql
$ cat YourContentFile >> temp.sql
$ echo "' where id=1;" >> temp.sql
$ sqlite3
SQLite version 3.7.13 2012-07-17 17:46:21
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> .read temp.sql
If the content does have single quotes, escape them first with a simple find-and-replace (you'd need to do that anyway).
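That find-and-replace can be scripted with sed, doubling each single quote (doubling is the SQL-standard escape). A minimal sketch, reusing the placeholder file names from above:

```shell
# Build temp.sql with the file content embedded as a quoted SQL string literal.
# Single quotes inside the content are doubled ('') so the literal stays valid.
printf "UPDATE my_table set my_column='" > temp.sql
sed "s/'/''/g" YourContentFile >> temp.sql
printf "' where id=1;\n" >> temp.sql
```

Then run `.read temp.sql` as shown above.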
hth!

See: http://www.sqlite.org/cli.html#fileio
sqlite> INSERT INTO images(name,type,img)
...> VALUES('icon','jpeg',readfile('icon.jpg'));
In your case:
UPDATE my_table set my_column=readfile('yourfile') where id=1;
If you don't have readfile, you need to .load the module first.
Note
I found that the provided fileio module: http://www.sqlite.org/src/artifact?ci=trunk&filename=ext/misc/fileio.c uses sqlite3_result_blob. When I use it in my project with text columns, it results in Chinese characters being inserted into the table rather than the bytes read from file. This can be fixed by changing it to sqlite3_result_text. See http://www.sqlite.org/loadext.html for instructions on building and loading run-time extensions.
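For what it's worth, recent builds of the sqlite3 shell bundle readfile()/writefile() directly, so no .load step is needed there. A sketch of the non-interactive form (my.db, my_table, my_column, and yourfile are the placeholder names from the question):

```shell
# readfile() is built into newer sqlite3 shells; older ones need .load ./fileio
sqlite3 my.db "UPDATE my_table SET my_column = readfile('yourfile') WHERE id = 1;"
```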

Related

Using Bash-variables in sqlite

thanks for taking your time to take a look at this.
I recently started scripting in Bash and wanted to write a small script where the user input, depending on the chosen parameter, gets written into a sqlite database. I'm completely stuck; if you have a minute and an idea, I'd be very grateful if you answered this.
My code currently looks something like this:
#!/bin/bash
### checking if database is availibe etc.
## ...
if [ $# -gt 0 ]
case $1 in
"--add")
case $2 in
"-t")
sqlite3 DatabaseFile <<'END_SQL'
INSERT INTO databasenaem (tablename) ($3);
END_SQL
# ....
esac
"--change")
sqlite3 DatabaseFile <<'END_SQL'
UPDATE tablename SET tablename=$3 where ID=3;
END_SQL
esac
Thank you very much and have a great day.
It's hard to provide a complete answer, as the structure of the tables in the database is not listed, and there is no sample data to reproduce the problem.
With the little information that is available, there are two likely problems:
From the SQL statement, it looks like the injected parameters are strings. You have to quote them to create valid SQL.
The code quotes the here-document delimiter ('END_SQL'). This disables the parameter substitution that is needed to expand $3 and friends.
Also, check the spelling (e.g., 'databasename').
Consider the following
sqlite3 DatabaseFile <<END_SQL
INSERT INTO databasename (tablename) VALUES ('$3');
END_SQL
...
sqlite3 DatabaseFile <<END_SQL
UPDATE tablename SET tablename='$3' where ID=3;
END_SQL
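Note that interpolating $3 directly still breaks the statement if the argument itself contains a single quote. A sketch of a safer variant, doubling single quotes first (the table and column names are the hypothetical ones from the question):

```shell
# Double any single quotes in the user input, then interpolate the escaped copy
escaped=$(printf '%s' "$3" | sed "s/'/''/g")
sqlite3 DatabaseFile <<END_SQL
INSERT INTO databasename (tablename) VALUES ('$escaped');
END_SQL
```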

How to insert a Line into a file in AIX using sed preferably?

I want to insert a line "new line" into a file "Textfile.txt" at line number 3 in AIX.
Before insertion Textfile.txt looks like
one
two
four
After Insertion Textfile.txt looks like
one
two
new line
four
I have already done it on Linux; however, on AIX the Linux solution is not working.
Surprisingly, I couldn't find a simple solution to this problem anywhere.
I am using this command in Linux and is working
echo "target_node = ${arr[0]}"
echo "target_file = ${arr[1]}"
echo "target_line = ${arr[2]}"
echo "target_text = ${arr[3]}"
escape "$(ssh -f ${arr[0]} "sed -i "${arr[2]}i$(escape ${arr[3]})" ${arr[1]}; exit")"
To sum the previous bits of information written as comments:
Option -i doesn't exist in AIX sed, so use a temporary file; the command syntax is also stricter than on Linux.
sed '2a\
Insert this after the 2nd line' "$target_file" >"$target_file.tmp"
mv -- "$target_file.tmp" "$target_file"
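Applied to the Textfile.txt example from the question, the same pattern looks like this (a sketch; the a\ form works with both AIX and GNU sed):

```shell
# Insert "new line" after line 2 (so it becomes line 3), without -i
sed '2a\
new line' Textfile.txt > Textfile.txt.tmp
mv -- Textfile.txt.tmp Textfile.txt
```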
Hi, thanks for the help.
I created the script in such a way that it copies the file to Linux, applies the changes, and moves it back to AIX.

How can I update a single field in psql with the contents of a file?

I am trying to do the following:
UPDATE bodycontent set body=CONTENT_FROM_FILE where contentid=12943363;
I tried the following based on the highest voted answer to this question.
\set contentfill `cat Foo.txt`
UPDATE bodycontent set body=:'contentfill' where contentid=12943363;
This results in the following error.
ERROR: syntax error at or near ":"
LINE 1: UPDATE bodycontent set body=:'contentfill' where contentid=1...
Is there a clean, simple and effective way to achieve this on the psql command line?
Here is the output of psql --version:
psql (PostgreSQL) 8.4.17
After much searching, I discovered this answer, which does not directly discuss the problem of reading files, but gave me the necessary ingredient to make this work for my ancient Postgres.
\set contentFill `cat Foo.txt`
\set quoted_contentFill '\'' :contentFill '\''
UPDATE bodycontent SET body=:quoted_contentFill WHERE contentid=12943363;
Naturally, this will fail if there are un-escaped quotes inside Foo.txt, but I can easily preprocess to ensure there are none.
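For what it's worth, the :'contentfill' syntax from the question works as-is on newer psql releases; the two-step quoting dance is only needed on old versions like 8.4. A sketch for newer clients (mydb is a placeholder database name):

```shell
psql -d mydb <<'EOF'
\set contentfill `cat Foo.txt`
UPDATE bodycontent SET body = :'contentfill' WHERE contentid = 12943363;
EOF
```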

sqlite3 echo and '<' piping command not working as expected

I am reading a SQL book and the author is using Sqlite3, which is awesome because there is not a server to mess with.
In the book the author says to type:
sqlite3 -echo something.db < some.sql
The problem is nothing ever echos out to the terminal nor is there even a database created from the '<' redirection command.
Does anyone know what is going on...with this?
Actually, the command you show IS a proper way of creating a new database from an SQL dump.
Can you please show your SQL file contents (cat some.sql)? The only way I can reproduce the behavior described is by feeding sqlite an empty SQL file.
Try these commands and see if you get the same result:
$ cat <<EOF > test.sql
> create table test1 (f1, f2, f3);
> insert into test1(f1, f2, f3) values ("foo", "bar", "baz");
> EOF
$ sqlite3 -echo test.db < test.sql
create table test1 (f1, f2, f3);
insert into test1(f1, f2, f3) values ("foo", "bar", "baz");
$ file test.db
test.db: SQLite 3.x database
something.db is an existing database; < some.sql means the commands are read from that file and fed to the sqlite console, so both files have to exist.
something.db must be a valid sqlite3 database file (or an empty or nonexistent file); some.sql must be a text file with your commands in it.
The -echo option makes it print each command before execution.
According to your comments and your terminal screen shot, I get the impression that your "some.sql" file may be in the wrong encoding and/or starts with a BOM which confuses sqlite.
You can use the file command to find out, and if this is the problem use iconv, recode or your favorite text editor to convert the file into the encoding your terminal expects, and the correct line endings (\n aka "LF" in Linux, \r\n aka "CRLF" in Windows).
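A sketch of that check, plus a BOM strip, assuming GNU tools (the \xNN escapes in the pattern are a GNU sed extension):

```shell
file some.sql                           # e.g. "UTF-8 Unicode (with BOM) text"
# Strip a leading UTF-8 BOM (bytes EF BB BF) into a clean copy
sed '1s/^\xEF\xBB\xBF//' some.sql > some.clean.sql
```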

Execute SQL from file in bash

I'm trying to load SQL from a file in bash and execute it. The SQL file needs to stay versatile, meaning it cannot be altered just to make it easy to run from bash (e.g., escaping special characters like *).
So I have run into some problems:
If I read my sample.sql
SELECT * FROM SAMPLETABLE
to a variable with
ab=`cat sample.sql`
and execute it
db2 `echo $ab`
I receive an SQL error because the * in the unquoted variable is expanded by the shell into all the file names in the current directory.
An easy solution would be to replace * with \* in the file. But I cannot do this, because the file needs to stay runnable in programs like DB Visualizer etc.
Could someone give me hint in the right direction?
The DB2 command line processor has options that accept a filename as input, so you shouldn't need to load statements from a text file into a shell variable.
This command will execute all SQL statements in the file, with newline treated as the statement terminator:
db2 -f sample.sql
This command will execute all SQL statements in the file, with semicolon treated as the statement terminator:
db2 -t -f sample.sql
Other useful CLP flags are:
-x : Suppress the column headings
-v : Echo the statement text immediately before execution
-z : Tee a copy of all CLP output to the filename immediately following this flag
Redirect stdin from the file.
db2 < sample.sql
In case you have a variable in your script that you want the shell to replace before the SQL is executed in DB2, use this approach:
Contents of File.sql:
cat <<xEOF
insert into ${MY_SCHEMA}.${MY_TABLE} values (1, 2);
select * from ${MY_SCHEMA}.${MY_TABLE};
xEOF
In the command prompt do:
export MY_SCHEMA='STAR'
export MY_TABLE='DIMENSION'
Then you are all set to execute it in DB2:
sh File.sql | db2 +p -t
The shell will replace the variables and then DB2 will execute the result.
Hope it helps.
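The variable-expansion step above can be checked on its own before db2 is involved at all; a minimal sketch (schema and table names are the example values from the answer):

```shell
# Expand the variables into a string first; pipe to db2 only once it looks right
export MY_SCHEMA='STAR' MY_TABLE='DIMENSION'
sql=$(cat <<EOF
insert into ${MY_SCHEMA}.${MY_TABLE} values (1, 2);
select * from ${MY_SCHEMA}.${MY_TABLE};
EOF
)
printf '%s\n' "$sql"      # when happy: printf '%s\n' "$sql" | db2 +p -t
```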