How to Insert BLOB Values - sql

I have the following table, FILES:
create table files(
id number,
file_name varchar2(25),
file_data blob);
I would like to store the data of binary files located on my computer in this table. However, when I convert a file on my computer to hex, the resulting string is too long to insert, as Oracle will not accept string literals longer than 4,000 characters. How can I insert a record into this table?

Usually what you do is:
You create an empty "Blob" object in your application.
You insert the empty Blob into the database as one of the columns of the row.
Then, in the same transaction, you retrieve an "output stream" from the Blob object you just inserted.
You send data to the output stream until all bytes are sent.
You close the output stream.
You commit the transaction.
It's a really bad practice to load entire files into memory and then insert them into the database. Use streaming instead.
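The steps above describe a client application streaming the bytes (for example over JDBC). If the file happens to be visible to the database server instead, a minimal PL/SQL sketch of the same pattern (insert an empty BLOB, then fill it) could look like this; the directory object, file name, and id are assumptions:
declare
  l_blob  blob;
  l_bfile bfile := bfilename('FILE_DIR', 'report.pdf'); -- hypothetical directory object and file
begin
  -- insert the row with an empty BLOB and get back its locator
  insert into files (id, file_name, file_data)
  values (1, 'report.pdf', empty_blob())
  returning file_data into l_blob;

  -- copy the file contents into the BLOB, then commit
  dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
  dbms_lob.loadfromfile(l_blob, l_bfile, dbms_lob.getlength(l_bfile));
  dbms_lob.close(l_bfile);

  commit;
end;
/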

Related

ORACLE APEX - How can I insert form data into multiple tables, when one table stores text data and the other is for BLOBs?

I have a form that targets 2 tables - one for text data, the second for documents (BLOB).
For storing the text data (e.g. name, address details, etc.) I am using a process that executes a PL/SQL procedure on the passed page item values (in the procedure I call NEXTVAL on a sequence and insert).
I need help determining whether I can use a different process on the same form item (BLOB type, File Browse) to insert the BLOB into its table.
Or, if I can use the same process, how would I pass the BLOB to the procedure?
I am able to insert the BLOB if I base the form on the BLOB table only and use the Form - Automatic Row Processing (DML) option (Identification section). But for the other form elements I am using the "SQL Code" option.
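For reference, a rough sketch of the kind of text-data procedure described above; the table, sequence, and parameter names are made up:
create or replace procedure insert_person_details (
  p_name    in varchar2,
  p_address in varchar2
) as
begin
  -- the page process passes the item values in; the key comes from the sequence
  insert into person_details (id, name, address)
  values (person_details_seq.nextval, p_name, p_address);
end;
/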

Data is converted to binary format while loading data into MonetDB using Apache Pig

I am using the MonetDB-Pig layer to load CSV data into MonetDB. Internally it uses binary bulk-load commands to load the data, but after loading the data into the table, the CSV file values do not match the MonetDB table values (INT, DOUBLE). The data seems to have been converted to a binary format.
How can we get back the actual values in MonetDB?
The table structure I am using:
CREATE TABLE "test" (
"s_suppkey" INT,
"s_name" CLOB,
"s_address" CLOB,
"s_nationkey" INT,
"s_phone" CLOB,
"s_acctbal" DOUBLE,
"s_comment" CLOB
);
The load command I am using:
COPY BINARY INTO "test" FROM (
'$PATH/part-1/col-0.bulkload',
'$PATH/part-1/col-1.bulkload',
'$PATH/part-1/col-2.bulkload',
'$PATH/part-1/col-3.bulkload',
'$PATH/part-1/col-4.bulkload',
'$PATH/part-1/col-5.bulkload',
'$PATH/part-1/col-6.bulkload'
);
Please convert the byte buffer from big-endian to little-endian, and check again.
The information provided is insufficient to isolate the issue. The most probable cause is a misalignment of the number of values in each of the binary columns.
Check the size of the elements in the 's_acctbal' input file to see whether it produced float instead of double binary values.
By the way, the MonetDB-Pig project is not actively maintained, but we welcome patches.

BULK INSERT from VARBINARY containing CSV file

I have a CSV file that needs to be BULK INSERTed in my database.
The actual scheme is:
Client generates file01.csv
Client moves file01.csv into the shared folder \\SERVERNAME\Sharing, which points to C:\Data on the server
Client tells database the file is called file01.csv
Server BULK INSERTs C:\Data\file01.csv into the final table
Server removes the file01.csv from its queue
(It'll be deleted later)
The Windows shared folders are a bit buggy and unstable, so I want to do this a bit differently:
Client generates file01.csv
Client inserts file01.csv in VARBINARY(MAX) column
Server simulates the CSV from the VARBINARY and BULK INSERTs it into the final table
(without generating any file in the server side)
The only way I found to make the second option happen is:
Server generates temp.csv from the VARBINARY data
Server BULK INSERTs temp.csv into the final table
(It'll be deleted later)
Is there a way to use a VARBINARY variable instead of a file in the BULK INSERT?
Or if it isn't possible, is there a better way to do this?
(I searched Google for an answer and found only how to read a VARBINARY value from a CSV file, so my question may be a duplicate.)
One way to do what you describe is to create an SSIS package (or a console app, for that matter) with a script task that reads the varbinary column into a single-column DataTable, parses it into a "final-table-formatted" DataTable, and then does the bulk insert. The whole process would be in-memory.
You could insert the varbinary(max) data into a FileTable. SQL Server would actually write the data to a local file, which you could then process with BULK INSERT.
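A rough T-SQL sketch of that FileTable route; the FileTable, target table, and UNC path are assumptions (in real code the path of a FileTable row can be obtained from file_stream.GetFileNamespacePath()):
-- Write the uploaded CSV bytes into a FileTable; SQL Server materialises them as a real file.
-- dbo.CsvDropbox is a hypothetical FileTable, dbo.FinalTable a hypothetical target table.
DECLARE @csv VARBINARY(MAX) = CONVERT(VARBINARY(MAX), 'id,name' + CHAR(13) + CHAR(10) + '1,abc');

INSERT INTO dbo.CsvDropbox (name, file_stream)
VALUES ('file01.csv', @csv);

-- The file now exists under the instance's FILESTREAM share and can be BULK INSERTed.
-- The path below is an assumption; build it from file_stream.GetFileNamespacePath() in real code.
BULK INSERT dbo.FinalTable
FROM '\\SERVERNAME\MSSQLSERVER\FileTableData\CsvDropbox\file01.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n', FIRSTROW = 2);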

Re-create a dropped table with schema and content

I had a table named XXXX. Suddenly the table was dropped (DROP TABLE). I want to recreate it; I know its schema, and I have a text file with the table's contents from before it was dropped.
Can I recreate my table without inserting each row individually, since that would take a long time?
Is there an easy way to transfer the contents of the text file into the table, independent of which query language I'm using (MSSQL, PostgreSQL, etc.)?
On PostgreSQL the procedure would be:
CREATE TABLE tablename (your fields);
Then, assuming your file has proper field separators (the default is a tab character in text format and a comma in CSV format), do:
COPY tablename FROM 'path-to-your-file/filename.txt' WITH DELIMITER ',';
The path to your file must be accessible by the PostgreSQL server; putting the file in /tmp is usually the simplest way to do that.
Then you will have your table back.
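If the file lives on the client machine rather than on the database server, psql's \copy meta-command does the same thing but reads the file client-side, for example:
\copy tablename FROM 'path-to-your-file/filename.txt' WITH DELIMITER ','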

Reading the value from text file and updating it to a field in sql table

I have a text file with data like
Patient name: Patient1 Medical rec #: A1Admit date: 04/26/2009 Discharge date: 04/26/2009
DRG: 982 and so on.
In the format given above, I have several records in the text file; each field is separated by a colon.
I have to read this file, find the values, and update the corresponding fields in my SQL table (say, the DRG value 982 has to be written to the DRG column of the SQL table).
Please help me do this with a SQL query or an SSIS package.
If I got this task, I would use SSIS:
Create 2 data sources: a flat file (for the text file) and a SQL Server connection
Use a Lookup task to look up the value from the text file for each record in the DB table
Use an Execute SQL Task to update the records with the looked-up value
You MIGHT try doing this by means of BULK INSERT.
Create a temp table to hold the new values
BULK INSERT the file into said table (**)
[optionally do some data enrichment/cleaning here]
Merge the information from the temp table into the actual table
The only problems with this MIGHT be that:
the server cannot access the file directly (e.g. when the file is on a network share)
the file is in a format that can't be handled by BULK INSERT
Given the example data above, you might need to load the data into one big column and then split it into different columns by means of creative SQL (PATINDEX, SUBSTRING, the works...), as sketched below. You might try giving the colon as a field separator, but you'll still end up with data that needs (quite a bit of) cleaning.
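A minimal sketch of that load-then-split approach; the path, table names, and the exact PATINDEX/SUBSTRING offsets are assumptions based on the sample line in the question:
-- Load every line of the text file into a single wide column.
CREATE TABLE #raw (line VARCHAR(4000));

BULK INSERT #raw
FROM 'C:\Data\patients.txt'          -- must be a path the server can read (assumption)
WITH (ROWTERMINATOR = '\n');

-- Pull out the value that follows 'DRG:' (everything up to the next space);
-- the same PATINDEX/SUBSTRING pattern applies to the other fields.
SELECT SUBSTRING(r.line, d.pos, CHARINDEX(' ', r.line + ' ', d.pos) - d.pos) AS drg
FROM   #raw AS r
CROSS APPLY (SELECT PATINDEX('%DRG:%', r.line) + 5 AS pos) AS d
WHERE  r.line LIKE '%DRG:%';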