How to convert a bunch of .btr and .lck files to readable SQL?

I have a bunch of .btr and .lck files and I need to import them into a SQL Server database.
How can I do that?

.LCK files are lock files. You can't (and don't need to) read those directly. The .BTR files are the data files. Do you have DDF files (FILE.DDF, FIELD.DDF, INDEX.DDF)? If so, you should be able to download a trial version of Pervasive PSQL v11 from www.pervasivedb.com. Once you've installed the trial version, you can create an ODBC DSN pointing to your data and then use SSIS or DTS or any number of programs to export the data from PSQL and import it to MS SQL.
If you don't have DDFs, you would need to either get them or create them. The DDFs describe the record structure of each data file.
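To illustrate the ODBC DSN route, here is a minimal Python sketch, assuming pyodbc and two DSNs; the DSN names, table, and columns are placeholders you would replace with your own:
import pyodbc

# source: the ODBC DSN created against the Pervasive data (placeholder name)
src = pyodbc.connect('DSN=PERVASIVE_DSN')
# destination: SQL Server (placeholder DSN; a full connection string works too)
dst = pyodbc.connect('DSN=MSSQL_DSN')

rows = src.cursor().execute('SELECT col1, col2 FROM SomeTable').fetchall()
cur = dst.cursor()
cur.executemany('INSERT INTO SomeTable (col1, col2) VALUES (?, ?)', rows)
dst.commit()
SSIS or DTS will be faster for large tables, but a script like this is often enough for a one-off migration.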

Related

Is there an alternative way to import data into Postgres other than using psql?

I am in a strict corporate environment and don't have access to Postgres' psql. Therefore I can't do what's shown, e.g., in the SO question Convert SQLITE SQL dump file to POSTGRESQL. However, I can generate the sqlite dump file (.sql). The resulting dump.sql file is 1.3 GB.
What would be the best way to import this data into Postgres? I also have DBeaver and can connect to both databases simultaneously but unfortunately can't do INSERT from SELECT.
I think the term for that is 'absurd', not 'strict'.
DBeaver has an 'execute script' feature. But who knows, maybe it will be blocked.
EnterpriseDB offers binary downloads. If you unzip those to a local drive you might be able to execute psql from the bin subdirectory.
If you can install psycopg2 or pg8000 for Python, you should be able to connect to the database and then loop over the dump file, sending each line to the database with cur.execute(line). It might take some fiddling if the dump file has any multi-line commands, but the example you linked to doesn't show any of those.
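A minimal sketch of that loop, assuming psycopg2, one statement per line in the dump, and placeholder connection parameters:
import psycopg2

conn = psycopg2.connect(host='localhost', dbname='target', user='me', password='secret')
cur = conn.cursor()

with open('dump.sql') as f:
    for line in f:
        line = line.strip()
        # skip blank lines and SQL comments
        if not line or line.startswith('--'):
            continue
        cur.execute(line)

conn.commit()
conn.close()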

Modify my existing SSIS package to perform this specific operation

I have an SSIS package created using SSDT and running as a job on SQL Server 2014.
This SSIS package retrieves an Excel file (.xlsx) from a specific folder and exports its content into a specific table on my SQL Server database. The package runs fine.
My Data Flow is in the following sequence:
Import Excel file from folder
Apply a Conditional Split to split data with today's date
Export the data into the SQL Server table in the database
Here is my problem:
I will now have 4 additional Excel files in that folder and they will need to be exported into that same SQL Server table.
So what is the best way forward to achieve this (assuming all of them are possible solutions):
Write 4 additional SSIS packages from scratch?
Use “Save As” on the existing package with a new name (4 times) and modify the file name to be retrieved?
Modify my existing SSIS package to accommodate for the additional 4 Excel files?
Any help would be appreciated.
Assuming the 4 Excel files have the same structure and are going to the same table, you'll want to use a ForEach Loop container to iterate over each file in the folder.
SentryOne has a good example of looping through each file in a folder and archiving. I imagine it can be adjusted for your use case.

copy blob data into on-premise sql table

My problem statement is that I have a CSV blob and I need to import that blob into a SQL table. Is there a utility to do that?
I was thinking of one approach: first copy the blob to the on-premise SQL Server using the AzCopy utility, and then import that file into the SQL table using the bcp utility. Is this the right approach? I am also looking for a 1-step solution to copy the blob into a SQL table.
Regarding your question about the availability of a utility which will import data from blob storage to a SQL Server, AFAIK there's none. You would need to write one.
Your approach seems OK to me, though you may want to write a batch file or something like that to automate the whole process. In this batch file, you would first download the file to your computer and then run the BCP utility to import the CSV into SQL Server. Other alternatives to writing a batch file are:
Do this thing completely in PowerShell.
Write some C# code which makes use of the storage client library to download the blob and, once the blob is downloaded, start the BCP process in your code (a rough sketch of this download-then-import flow follows below).
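For instance, a rough Python sketch of the same download-then-import flow, assuming the azure-storage-blob package and a local bcp installation (the connection string, container, server, and table names are all placeholders):
import subprocess
from azure.storage.blob import BlobClient

# download the CSV blob to a local file
blob = BlobClient.from_connection_string(
    'DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...',
    container_name='mycontainer', blob_name='data.csv')
with open('data.csv', 'wb') as f:
    f.write(blob.download_blob().readall())

# hand the local file to bcp for the actual import (-c = character mode, -t, = comma delimiter)
subprocess.run(['bcp', 'MyDb.dbo.MyTable', 'in', 'data.csv',
                '-S', 'myserver', '-T', '-c', '-t,'], check=True)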
To pull a blob file into an Azure SQL Server, you can use this example syntax (this actually works, I use it):
BULK INSERT MyTable
FROM 'container/folder/folder/file'
WITH ( DATA_SOURCE = 'ds_blob', BATCHSIZE = 10000, FIRSTROW = 2 );
MyTable has to have columns identical to the file's (or it can be a view against a table that yields identical columns).
In this example, ds_blob is an external data source which needs to be created beforehand (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql)
The external data source needs to use a database scoped credential, which uses an SAS key that you need to generate beforehand from blob storage (https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-scoped-credential-transact-sql). A sketch of that one-time setup follows below.
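As a hedged sketch of that one-time setup, run here through Python and pyodbc (the credential name, SAS token, and storage account URL are placeholders; the same T-SQL can be run directly in SSMS instead):
import pyodbc

conn = pyodbc.connect('DSN=AZURESQL_DSN')  # placeholder DSN for the Azure SQL database
cur = conn.cursor()

# credential holding the SAS token (omit the token's leading '?');
# note: the database must already have a master key before creating a scoped credential
cur.execute("CREATE DATABASE SCOPED CREDENTIAL BlobCred "
            "WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = 'sv=...';")
# external data source pointing at the storage account;
# BULK INSERT paths like 'container/folder/file' are resolved relative to it
cur.execute("CREATE EXTERNAL DATA SOURCE ds_blob "
            "WITH (TYPE = BLOB_STORAGE, "
            "LOCATION = 'https://myaccount.blob.core.windows.net', "
            "CREDENTIAL = BlobCred);")
conn.commit()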
The only downside to this method is that you have to know the filename beforehand - there's no way to enumerate the blobs from inside SQL Server.
I get around this by running PowerShell inside Azure Automation that enumerates the blobs and writes them into a queue table beforehand (a rough equivalent is sketched below).
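A rough Python equivalent of that enumeration step (the original is PowerShell; this sketch assumes azure-storage-blob and pyodbc, with placeholder names throughout):
import pyodbc
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    'DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...',
    container_name='mycontainer')

conn = pyodbc.connect('DSN=AZURESQL_DSN')  # placeholder DSN
cur = conn.cursor()
# write each blob name into a queue table for later BULK INSERT statements
for blob in container.list_blobs():
    cur.execute('INSERT INTO dbo.BlobQueue (BlobName) VALUES (?)', blob.name)
conn.commit()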

Foxbase to PostgreSQL data transfer (.dbf file reader)

I am rewriting a program based on an old Foxbase database consisting of .dbf files. I need a tool that can read these files and help transfer the data to PostgreSQL. Do you know of any tool of this type?
pgdbf.sourceforge.net has worked for all the DBF files I've fed it. Quoting the site description:
PgDBF is a program for converting XBase databases - particularly FoxPro tables with memo files - into a format that PostgreSQL can directly import. It's a compact C project with no dependencies other than standard Unix libraries.
If you are looking for something to run on Windows, and this doesn't compile directly, you could use cygwin (www.cygwin.com) to build and run pgdbf.
As part of the migration path you could use Python and my dbf module. A very simple script to convert the dbf files to csv would be:
import sys
import dbf

# create a .csv file with the same base name as the .dbf file given on the command line
dbf.export(sys.argv[1])
which will create a .csv file with the same name as the dbf file. If you put that code into a script named dbf2csv.py, you could then call it as
python dbf2csv.py dbfname
Hopefully there are some handy tools to get the CSV file into PostgreSQL; one option is sketched below.
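One such option is PostgreSQL's COPY command, sketched here with psycopg2 (connection parameters, table, and file names are placeholders; drop HEADER if the CSV has no header row):
import psycopg2

conn = psycopg2.connect(host='localhost', dbname='target', user='me', password='secret')
cur = conn.cursor()

# stream the CSV straight into an existing table with COPY
with open('dbfname.csv') as f:
    cur.copy_expert('COPY mytable FROM STDIN WITH (FORMAT csv, HEADER)', f)

conn.commit()
conn.close()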

import csv to sql

I have to import a CSV file into a SQL database table which has already been created (it is empty and has the same number of named columns). It would be great if you could suggest any tutorials or give some tips.
I assume you are using Microsoft SQL Server. Do you need to do this in a program or manually? There is a tutorial on using the bcp command for that, or alternatively a SQL command. If you need to parse the CSV file in your own code, see this previous SO question; a minimal sketch of the programmatic route also follows below.
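If you do end up loading it from your own code, here is a minimal Python sketch using the standard csv module and pyodbc (driver, server, table, and column names are placeholders):
import csv
import pyodbc

conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;')
cur = conn.cursor()

with open('data.csv', newline='') as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    cur.executemany('INSERT INTO MyTable (col1, col2) VALUES (?, ?)', list(reader))

conn.commit()
conn.close()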