Does blobstor have an opposite command (Ingres)?

I am using the blobstor command to load JPEG images into an Ingres database, which is fine. But at some point I need to develop a manual way to copy them back out again.
I can find some examples of this that use BCP, but those are for SQL Server databases. So my question is: does blobstor have an equal and opposite command to extract blobs that can be used when selecting from an Ingres database? Pointers to any examples would be much appreciated.

I don't believe there is a blobstor-opposite tool which ships with Ingres; when I've had need for such a thing before now, the solution was to write a short program.
As an example, here's a Perl script. It uses DBI and the DBD::IngresII module. Hope it's of some use.
# Required: db=, table=, col=. Optional: user=.
# Anything else is treated as a where clause.
use strict;
use warnings;
use DBI;
my %p = ();
my $where = "";
foreach my $arg (@ARGV) {
    if ($arg =~ /^(db|table|col|user)=(\S+)$/) { $p{$1} = $2; next; }
    $where .= " " . $arg if ($p{db} and $p{table} and $p{col});
}
die "db, table and col required.\n" if (!$p{db} or !$p{table} or !$p{col});
my $user = defined($p{user}) ? $p{user} : "";
my $dbh = DBI->connect("dbi:IngresII:" . $p{db}, $user, "");
my $stm = "select " . $p{col} . " from " . $p{table};
$stm .= " where" . $where if ($where ne "");
my $sth = $dbh->prepare($stm);
$sth->execute;
my @row = $sth->fetchrow_array;
binmode STDOUT;    # the blob is binary data
print $row[0];
$sth->finish;
$dbh->disconnect;
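Invocation might look something like this (the script name, table and column are made up for illustration; the shell redirect captures the raw blob bytes):
perl getblob.pl db=mydb table=images col=jpeg_data "id = 42" > image42.jpg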

Related

Cronjob does not execute command line in perl script

I am unfamiliar with the Linux environment, so do pardon me if I make any mistakes; do comment to clarify.
I have created a simple Perl script. This script creates a sql file and, as shown, executes the statements in that file to insert rows into the database.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX 'strftime';
my $SQL_COMMAND;
my $HOST = "i";
my $USERNAME = "need";
my $PASSWORD = "help";
my $NOW_TIMESTAMP = strftime '%Y-%m-%d_%H-%M-%S', localtime;
open my $out_fh, '>>', "$NOW_TIMESTAMP.sql" or die 'Unable to create sql file';
printf {$out_fh} "INSERT INTO BOL_LOCK.test(name) VALUES ('wow');";
sub insert()
{
my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
while( my $sql_file = glob '*.sql' )
{
my $status = system ( "$SQL_COMMAND < $sql_file" );
if ( $status == 0 )
{
print "pass";
}
else
{
print "fail";
}
}
}
insert();
This works if I execute it while I am logged in as a user (I do not have access to Admin). However, when I set a cronjob to run this file, let's say at 10.08am, by using the line (in crontab -e):
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl > /dev/null 2>&1
I know the script is being executed, as the sql file is created. However, no new rows are inserted into the database after 10.08am. I've searched for solutions and some have suggested using the DBI module, but it's not available on the server.
EDIT: Didn't manage to solve it in the end. A root/admin account was used to execute the script, so that "solved" the problem.
First things first, get rid of the > /dev/null 2>&1 at the end of your crontab entry (at least temporarily) so you can actually see any errors that may be occurring.
In other words, change it temporarily to something like:
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1
Then you can examine the /tmp/myfile file to see what's being output.
The most likely case is that mysql is not actually on the path in your cron job, because cron itself gives a rather minimal environment.
To fix that problem (assuming that's what it is), see this answer, which gives some guidelines on how best to expand the cron environment to give you what you need. That will probably just involve adding the MySQL executable directory to your PATH variable.
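For example, the top of the crontab could set PATH explicitly before the entry (the /opt/lampp/bin directory is an assumption based on the XAMPP paths in your script; point it at wherever mysql actually lives):
PATH=/usr/bin:/bin:/opt/lampp/bin
08 10 * * * perl /opt/lampp/htdocs/otpms/Data_Tsunami/scripts/test.pl >/tmp/myfile 2>&1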
The other thing you may want to consider is closing the out_fh file before trying to pass it to mysql - if the buffers haven't been flushed, it may still be an empty file as far as other processes are concerned.
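In your script that is a one-line addition just before the insert() call:
close $out_fh or die "Unable to close sql file: $!";
insert();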
The expression glob(".* *") matches all files in the current working directory.
- http://perldoc.perl.org/functions/glob.html
You should not rely on the working directory in a cron job. If you want to use a glob (or any file operation) with a relative path, set the working directory with chdir first.
- source: http://www.perlmonks.org/bare/?node_id=395387
So if your working directory is, for example, /home/user, you should insert
chdir('/home/user/');
before the while loop, i.e.:
sub insert()
{
my $SQL_COMMAND = "mysql -u $USERNAME -p'$PASSWORD' ";
chdir('/home/user/');
while( my $sql_file = glob '*.sql' )
{
...
replace /home/user with wherever your sql files are being created.
It's better to do as much processing within Perl as possible. It avoids the overhead of spawning a separate shell process and leaves everything under the control of the program, so that you can handle any errors much more simply.
Database access from Perl is done using the DBI module. This program demonstrates how to achieve with DBI what you have written using the mysql utility. As you can see, it's also much more concise.
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $host = "i";
my $username = "need";
my $password = "help";
my $dbh = DBI->connect("DBI:mysql:database=test;host=$host", $username, $password);
my $insert = $dbh->prepare('INSERT INTO BOL_LOCK.test(name) VALUES (?)');
my $rv = $insert->execute('wow');
print $rv ? "pass\n" : "fail\n";
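One refinement worth considering (my addition, not part of the answer above): pass RaiseError in the connect attributes so DBI dies with a useful message instead of quietly returning undef when something fails:
my $dbh = DBI->connect("DBI:mysql:database=test;host=$host",
    $username, $password, { RaiseError => 1 });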

How to invoke SQL inside a Perl script

I am trying to connect to a database and perform some SQL queries using this code, but every time it hangs.
my $connect_str = `/osp/local/etc/.oralgn $srv_name PSMF`;
my $sqlFile = "/osp/local/home/linus/amit/mytest.sql";
my ($abc, $cde) = split (/\@/ , $connect_str );
print "$abc";
$ORACLE_SID=SDDG00;
`export $ORACLE_SID`;
#chomp($abc);
#$abc=~ s/\s+$//;
`sqlplus $abc`;
open (SQL, "$sqlFile");
while (my $sqlStatement = <SQL>) {
$sth = dbi->prepare($sqlStatement)
or die (qq(Can't prepare $sqlStatement));
$sth->execute()
or die qq(Can't execute $sqlStatement);
}
How do I invoke a SQL command inside Perl?
Reading the documentation for the DBI module would be a good start.
Your problem seems to be this line.
$sth = dbi->prepare($sqlStatement)
You're trying to call the prepare method on the class "dbi". But you don't have a class called "dbi" in your program (or, at least, I can't see one in the code you've shown us).
To use a database from Perl you need to do these things:
1/ Load the DBI module (note, "DBI", not "dbi" - Perl is case sensitive).
use DBI;
2/ Connect to the database and get a database handle (Read the DBD::Oracle documentation for more details on the arguments to the connect() method).
my $dbh = DBI->connect('dbi:Oracle:dbname', $user, $password);
3/ You can then use this database handle to prepare SQL statements.
my $sth = $dbh->prepare($sqlStatement);
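Putting the three steps together, a minimal end-to-end sketch might look like this (the DSN, credentials, table and column names are all placeholders, and it assumes the DBD::Oracle driver is installed):
use strict;
use warnings;
use DBI;
# Placeholder credentials and DSN - see the DBD::Oracle docs for real values.
my ($user, $password) = ('scott', 'tiger');
my $dbh = DBI->connect('dbi:Oracle:dbname', $user, $password,
    { RaiseError => 1, AutoCommit => 1 });
my $sth = $dbh->prepare('SELECT id, name FROM some_table');
$sth->execute();
# fetchrow_array returns one row per call, and an empty list when done.
while (my @row = $sth->fetchrow_array) {
    print join("\t", @row), "\n";
}
$dbh->disconnect;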

Inserting a file into a Postgres bytea column using perl/SQL

I'm working with a legacy system and need to find a way to insert files into a pre-existing Postgres 8.2 bytea column using Perl.
So far my searching has led me to believe the following:
there is no consensus best approach for this.
lo_import looks promising, but I'm apparently too perl-tarded to get it to work.
I was hoping to do something like the following
my $bind1 = "foo";
my $bind2 = "123";
my $file = "/path/to/file.ext";
my $q = q{
INSERT INTO generic_file_table
(column_1,
column_2,
bytea_column
)
VALUES
(?, ?, lo_import(?))
};
my $sth = $dbh->prepare($q);
$sth->execute($bind1, $bind2, $file);
$sth->finish();
My script works w/o the lo_import/bytea part. But with it I get this error:
DBD::Pg::st execute failed: ERROR: column "contents" is of type bytea but expression is of type oid at character 176
HINT: You will need to rewrite or cast the expression.
What I think I'm doing wrong is that I'm not passing the actual binary file to the DB properly. I think I'm passing the file path, but not the file itself. If that's true then what I need to figure out is how to open/read the file into a tmp buffer, and then use the buffer for the import.
Or am I way off base here? I'm open to any pointers, or alternative solutions as long as they work with Perl 5.8/DBI/PG 8.2.
Pg offers two ways to store binary files:
large objects, in the pg_largeobject table, which are referred to by an oid. Often used via the lo extension. May be loaded with lo_import.
bytea columns in regular tables. Represented as octal escapes like \000\001\002fred\004 in PostgreSQL 9.0 and below, or as hex escapes by default in Pg 9.1 and above eg \x0102. The bytea_output setting lets you select between escape (octal) and hex format in versions that have hex format.
You're trying to use lo_import to load data into a bytea column. That won't work.
What you need to do is send PostgreSQL correctly escaped bytea data. In a supported, current PostgreSQL version you'd just format it as hex, bang a \x in front, and you'd be done. In your version you'll have to escape it as octal backslash-sequences and (because you're on an old PostgreSQL that doesn't use standard_conforming_strings) probably have to double the backslashes too.
This mailing list post provides a nice example that will work on your version, and the follow-up message even explains how to fix it to work on less prehistoric PostgreSQL versions too. It shows how to use parameter binding to force bytea quoting.
Basically, you need to read the file data in. You can't just pass the file name as a parameter - how would the database server access the local file and read it? It'd be looking for a path on the server.
Once you've read the data in, you need to escape it as bytea and send that to the server as a parameter.
Update: Like this:
use strict;
use warnings;
use 5.16.3;
use DBI;
use DBD::Pg qw(:pg_types);
use File::Slurp;
die("Usage: $0 filename") unless defined($ARGV[0]);
die("File $ARGV[0] doesn't exist") unless (-e $ARGV[0]);
my $filename = $ARGV[0];
my $dbh = DBI->connect("dbi:Pg:dbname=regress","","", {AutoCommit=>0});
$dbh->do(q{
DROP TABLE IF EXISTS byteatest;
CREATE TABLE byteatest( blah bytea not null );
});
$dbh->commit();
my $filedata = read_file($filename);
my $sth = $dbh->prepare("INSERT INTO byteatest(blah) VALUES (?)");
# Note the need to specify bytea type. Otherwise the text won't be escaped,
# it'll be sent assuming it's text in client_encoding, so NULLs will cause the
# string to be truncated. If it isn't valid utf-8 you'll get an error. If it
# is, it might not be stored how you want.
#
# So specify {pg_type => DBD::Pg::PG_BYTEA} .
#
$sth->bind_param(1, $filedata, { pg_type => DBD::Pg::PG_BYTEA });
$sth->execute();
undef $filedata;
$dbh->commit();
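Run from the shell with the file to load as the only argument (the script name here is made up):
perl load_bytea.pl /path/to/file.ext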
Thank you to those who helped me out. It took a while to nail this one down. The solution was to open the file and store its contents, then specifically call out the bind variable that is of type bytea. Here is the detailed solution:
.....
##some variables
my $datum1 = "foo";
my $datum2 = "123";
my $file = "/path/to/file.dat";
my $contents;
##open the file and store it
open my $FH, '<', $file or die "Could not open file: $!";
binmode $FH;    ## the file is binary data, so read it raw
{
local $/ = undef;
$contents = <$FH>;
};
close $FH;
print "$contents\n";
##prepare SQL
my $q = q{
INSERT INTO generic_file_table
(column_1,
column_2,
bytea_column
)
VALUES
(?, ?, ?)
};
my $sth = $dbh->prepare($q);
##bind variables and specifically set #3 to bytea; then execute.
$sth->bind_param(1,$datum1);
$sth->bind_param(2,$datum2);
$sth->bind_param(3,$contents, { pg_type => DBD::Pg::PG_BYTEA });
$sth->execute();
$sth->finish();

Open a piped process (sqlplus) in perl and get information from the query?

Basically, I'd like to open a pipe to sqlplus using Perl, sending a query and then getting back the information from the query.
Current code:
open(PIPE, '-|', "sqlplus user/password#server_details");
while (<PIPE>) {
print $_;
}
This allows me to jump into sqlplus and run my query.
What I'm having trouble figuring out is how to let Perl send sqlplus the query (since it's always the same query), and once that's done, how can I get the information written back to a variable in my Perl script?
PS - I know about DBI... but I'd like to know how to do it using the above method, as inelegant as it is :)
Made some changes to the code, and I can now send my query to sqlplus but it disconnects... and I don't know how to get the results back from it.
my $squery = "select column from table where rownum <= 10;";
# Open pipe to sqlplus, connect to server...
open(PIPE, '|-', "sqlplus user/password@server_details") or die "I cannot fork: $!";
# Print the query to PIPE?
print PIPE $squery;
Would it be a case of grabbing the STDOUT from sqlplus and then storing it using the Perl (parent) script?
I'd like to store it in an array for parsing later, basically.
Flow diagram:
Perl script (parent) -> open pipe into sqlplus (child) -> print query on pipe -> sqlplus outputs results on screen (STDOUT?) -> read the STDOUT into an array in the Perl script (parent)
Edit: It could be that forking the process into sqlplus might not be viable using this method and I will have to use DBI. Just waiting to see if anyone else answers...
Forget screen scraping, Perl has a perfectly cromulent database interface.
I think you probably want IPC::Run. You'll be using the start function to get things going:
my $h = start \@cat, \$in, \$out;
You would assign your query to the $in variable and pump until you got the expected output in the $out variable.
$in = "first input\n";
## Now do I/O. start() does no I/O.
pump $h while length $in; ## Wait for all input to go
## Now do some more I/O.
$in = "second input\n";
pump $h until $out =~ /second input/;
## Clean up
finish $h or die "cat returned $?";
This example is stolen from the CPAN page, which you should visit if you want more examples.
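Adapted to your sqlplus case, a minimal sketch might look like this (connect string reused from your question; -S suppresses the sqlplus banner, and the set commands turn off headings and feedback so only data rows come back):
use strict;
use warnings;
use IPC::Run qw(run);

my @cmd = ('sqlplus', '-S', 'user/password@server_details');
my $query = "set heading off feedback off\n"
          . "select column from table where rownum <= 10;\n"
          . "exit;\n";
my $out;
# run() feeds $query to sqlplus on stdin and collects stdout into $out.
run \@cmd, \$query, \$out or die "sqlplus failed: $?";
# Keep the non-blank output lines for later parsing.
my @rows = grep { /\S/ } split /\n/, $out;
print "$_\n" for @rows;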
If your query is static, consider moving it into its own file and having sqlplus load and execute it.
open(my $pipe, '-|', 'sqlplus', 'user/password@server_details', '@/path/to/sql-lib/your-query.sql', 'query_param_1', 'query_param_2') or die $!;
while (<$pipe>) {
print $_;
}

Execute SQL file in Perl

We have a Perl script which runs a SQL statement and puts data in the table.
Now, instead of supplying a single SQL statement, we want to pass a bunch of them together in a .sql file. We know that our program will fail because it expects a single SQL statement, not a bunch of them (and from a .sql file at that). How do we make it work with a .sql file (having multiple INSERT statements)? We are using the DBI package.
A small snippet of code:
$sth = $dbh->prepare("/home/user1/tools/mytest.sql");
$sth->execute || warn "Couldn't execute statement";
$sth->finish();
There is a sort of workaround for DDL. You need to slurp the SQL file first and then enclose its contents in BEGIN ... END; keywords. Like:
sub exec_sql_file {
my ($dbh, $file) = #_;
my $sql = do {
open my $fh, '<', $file or die "Can't open $file: $!";
local $/;
<$fh>
};
$dbh->do("BEGIN $sql END;");
}
This subroutine allows you to run DDL (SQL) scripts with multiple statements inside (e.g. database dumps).
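A call would look something like this (the DSN is a placeholder; note that the BEGIN ... END; wrapping assumes a database, such as Oracle, that supports anonymous blocks):
my $dbh = DBI->connect('dbi:Oracle:dbname', $user, $password, { RaiseError => 1 });
exec_sql_file($dbh, '/home/user1/tools/mytest.sql');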
Not exactly sure what you want...
Once you create a DBI object, you can use it over and over again. Here I'm reading SQL statement after SQL statement from a file and processing each and every one in order:
use DBI;
my $sqlFile = "/home/user1/tools/mytest.sql";
my $dbh = DBI->connect($connect, $user, $password)
or die("Can't access db");
# Open the file that contains the various SQL statements
# Assuming one SQL statement per line
open (SQL, "$sqlFile")
or die("Can't open file $sqlFile for reading");
# Loop though the SQL file and execute each and every one.
while (my $sqlStatement = <SQL>) {
my $sth = $dbh->prepare($sqlStatement)
or die("Can't prepare $sqlStatement");
$sth->execute()
or die("Can't execute $sqlStatement");
}
Notice that I'm putting the SQL statement in the prepare and not the file name that contains the SQL statement. Could that be your problem?
You don't need perl for this at all. Just use the mysql command line client:
mysql -h [hostname] -u[username] -p[password] [database name] < /home/user1/tools/mytest.sql
replace the [variables] with your information.
Note there is no space after -u or -p. If your MySQL server is running on the same machine, you can omit -h[hostname] (it defaults to localhost).
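For instance, with made-up credentials:
mysql -h localhost -uroot -psecret test < /home/user1/tools/mytest.sql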
Here is how I've done it. In my case I don't assume one SQL statement per line, and I think my example is a bit better :)
sub get_sql_from_file {
open my $fh, '<', shift or die "Can't open SQL File for reading: $!";
local $/;
return <$fh>;
};
my $SQL = get_sql_from_file("SQL/file_which_holds_sql_statements.sql");
my $sth1 = $dbh1->prepare($SQL);
$sth1->execute();