I want to export data from my database to a CSV file using IBEScript for a Firebird database. Everything works fine, except the columns with BLOB data: they are simply missing from the CSV file.
When I run the same query in IBExpert and export the data to CSV, I can check the box "export text blob values" and the data is included. How can I use this option with the script as well?
I need the script since I want an automated export using the Task Scheduler.
Thanks!!!
CSV format with BLOBs makes no sense, since any delimiter (tab, semicolon, etc.) can be part of the BLOB and will destroy your structure.
But automated import/export, including BLOBs, is no real problem in scripts (note that IBEBlock is not available in the Personal Edition):
http://ibexpert.net/ibe/index.php?n=Doc.InsertingFileDataIntoADatabase
http://ibexpert.net/ibe/index.php?n=Doc.IbecLoadFromFile
http://ibexpert.net/ibe/index.php?n=Doc.IbecSaveToFile
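For example, a rough IBEBlock sketch that writes each BLOB to its own file instead of into the CSV (the table, column, and path are placeholders, and the exact conversion and mode-flag details are in the linked docs, so treat this as a sketch rather than a ready-made script):

execute ibeblock
as
begin
  for select ID, BLOB_COL from MY_TABLE
      into :id, :blob
  do
  begin
    -- one file per row; the third argument of ibec_SaveToFile is a mode
    -- flag, see the IbecSaveToFile page linked above
    sFile = 'C:\Export\row_' || :id || '.txt';
    ibec_SaveToFile(sFile, :blob, 0);
  end
end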
Holger
www.ibexpert.com
I have a pipeline that creates a dataset from a stored procedure on Azure SQL Server.
I then want to manipulate it in a Power Query step within the factory, but it fails to load in the Power Query editor with this error.
It opens up the JSON file (to correct it, I assume) but I can't see anything wrong with it.
If I download the extract from blob and upload it again as a .csv then it works fine.
The only difference I can find is that if I upload a blob direct to storage then the file information for the blob looks like this:
If I just let ADF create the .csv in blob storage the file info looks like this:
So my assumption is that somewhere in the ADF process that creates the .csv file, something goes wrong, and the Power Query module can't recognise it as a valid file.
All the other parts of the pipeline (Data Flows, other datasets) recognise it fine, and the 'preview data' brings it up correctly. It's just PQ that won't read it.
Any thoughts? TIA
I reproduced the same issue. When data is copied from the SQL database to Blob as a CSV file, Power Query is unable to read it. (Also, Power Query doesn't support JSON files.) But when I downloaded the CSV file and re-uploaded it, it worked.
Below are the steps to overcome this issue.
When I uploaded the file to Blob myself and created a dataset for that file for Power Query, the schema was imported from the connection/store. The dataset forces us to import the schema either from the connection/store or from a sample file; there is no "None" option here.
When data is copied from the SQL DB to Azure Blob, the dataset that points at the blob storage doesn't have a schema imported by default.
Once the schema was imported, the Power Query activity ran successfully.
Output before importing the schema in the dataset:
After importing the schema in the dataset:
I have a table with 60,000 images stored as hex, like:
0xFFD8FFE000104A4649460001010100480...
How can I export all of them as real image files into a folder? Is it possible to do with a query?
You can use BCP:
The bulk copy program utility (bcp) bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files.
Here you have the definition: LINK1
And here you have an example: LINK2
Hope this helps!
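A hedged sketch of the export itself, since bcp is a command-line tool: this drives it from T-SQL via xp_cmdshell. The table, column, paths, and format file are mine, not from the question; it assumes xp_cmdshell is enabled and a format file describing a single SQLBINARY field with prefix length 0, so bcp writes only the raw image bytes:

DECLARE @id INT, @cmd VARCHAR(4000);

DECLARE img_cur CURSOR FOR
    SELECT Id FROM dbo.Images;          -- one image per row

OPEN img_cur;
FETCH NEXT FROM img_cur INTO @id;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- bcp queryout writes the binary column of one row to one .jpg file
    -- (0xFFD8FF at the start of the hex means the images are JPEGs)
    SET @cmd = 'bcp "SELECT ImageData FROM MyDb.dbo.Images WHERE Id = '
             + CAST(@id AS VARCHAR(10)) + '" queryout "C:\Export\'
             + CAST(@id AS VARCHAR(10)) + '.jpg" -T -f "C:\Export\image.fmt"';
    EXEC master..xp_cmdshell @cmd;

    FETCH NEXT FROM img_cur INTO @id;
END

CLOSE img_cur;
DEALLOCATE img_cur;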
Any ideas how I can extract data from a SQL database, put it into a CSV file in a specific format, and push it to an external URL?
Most SQL databases have some sort of export utility that can produce a CSV file. Google "Export <your database product>" and you should find it. It is not part of the SQL standard, so every product does it differently.
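For example, PostgreSQL's flavour is a single COPY statement (the table, columns, and path here are placeholders), while SQL Server uses the bcp utility and MySQL uses SELECT ... INTO OUTFILE:

-- PostgreSQL: write a query result to a server-side CSV file with a header row
COPY (SELECT id, name, created_at FROM customers)
TO '/tmp/customers.csv'
WITH (FORMAT csv, HEADER true);

Pushing the file to an external URL would then be a separate step in whatever script or scheduler runs the export.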
My problem statement is that I have a CSV blob and I need to import that blob into a SQL table. Is there a utility to do that?
I was thinking of one approach: first copy the blob to an on-premise SQL Server using the AzCopy utility, and then import that file into the SQL table using the bcp utility. Is this the right approach? I am also looking for a one-step solution to copy a blob to a SQL table.
Regarding your question about the availability of a utility which will import data from blob storage to a SQL Server, AFAIK there's none. You would need to write one.
Your approach seems OK to me, though you may want to write a batch file or something like that to automate the whole process. In this batch file, you would first download the file to your computer and then run the BCP utility to import the CSV into SQL Server. Other alternatives to writing a batch file are:
Do the whole thing in PowerShell.
Write some C# code which makes use of the storage client library to download the blob and, once the blob is downloaded, start the BCP process from your code.
To pull a blob file into an Azure SQL Server, you can use this example syntax (this actually works, I use it):
BULK INSERT MyTable
FROM 'container/folder/folder/file'
WITH (DATA_SOURCE = 'ds_blob', BATCHSIZE = 10000, FIRSTROW = 2); -- FIRSTROW = 2 skips the first (header) row
MyTable has to have columns identical to the file's (or it can be a view against a table that yields identical columns).
In this example, ds_blob is an external data source which needs to be created beforehand (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql).
The external data source needs to use a database scoped credential, which uses a SAS token that you need to generate beforehand in blob storage (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql).
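A minimal sketch of that one-time setup (the account, credential, and data source names are placeholders; paste your own SAS token):

-- a master key must exist before a database scoped credential can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL cred_blob
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading ?>';

-- LOCATION is the account root, so the BULK INSERT path above starts
-- with the container name
CREATE EXTERNAL DATA SOURCE ds_blob
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://myaccount.blob.core.windows.net',
      CREDENTIAL = cred_blob);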
The only downside to this method is that you have to know the filename beforehand - there's no way to enumerate the files from inside SQL Server.
I get around this by running PowerShell inside Azure Automation that enumerates the blobs and writes their names into a queue table beforehand.
I want to create an update script for a BLOB column in a table which stores XSL data in Oracle. Can anybody help me in a simple way, without creating any directory? The number of characters involved is also more than 4,000.
I have modified the data in TOAD via 'Save to File' and then 'Load from File'. Now I want to transfer it to some other database using a SQL script.
Using the Oracle IMP and EXP utilities you can export a table into a file and import it into another database. Here is some information on how to use them:
http://www.orafaq.com/wiki/Import_Export_FAQ
It is not SQL but it also doesn't involve creating directories.
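If you do need a plain SQL script after all (no directory object, and data longer than 4,000 characters), one hedged alternative to IMP/EXP is a PL/SQL block that rebuilds the BLOB from hex chunks, one WRITEAPPEND per chunk; the table and column names here are placeholders:

DECLARE
  v_blob BLOB;
BEGIN
  -- lock the row and get the LOB locator
  SELECT xsl_data INTO v_blob
    FROM my_table
   WHERE id = 1
     FOR UPDATE;

  DBMS_LOB.TRIM(v_blob, 0);  -- discard the old contents

  -- repeat one WRITEAPPEND per chunk; '3C3F786D6C' is hex for '<?xml' (5 bytes)
  DBMS_LOB.WRITEAPPEND(v_blob, 5, HEXTORAW('3C3F786D6C'));

  COMMIT;
END;
/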