Foxbase to PostgreSQL data transfer (dbf files reader) - sql

I am rewriting a program based on an old Foxbase database consisting of .dbf files. I need a tool that can read these files and help transfer the data to PostgreSQL. Do you know of any tools of this kind?

pgdbf.sourceforge.net - has worked for all the DBF files I've fed it. Quoting the site description:
PgDBF is a program for converting XBase databases - particularly
FoxPro tables with memo files - into a format that PostgreSQL can
directly import. It's a compact C project with no dependencies other
than standard Unix libraries.
If you are looking for something to run on Windows, and this doesn't compile directly, you could use cygwin (www.cygwin.com) to build and run pgdbf.

As part of the migration path you could use Python and my dbf module. A very simple script to convert the dbf files to csv would be:
import sys
import dbf

# take the .dbf file name from the command line and write a .csv alongside it
dbf.export(sys.argv[1])
which will create a .csv file with the same name as the dbf file. If you put that code into a script named dbf2csv.py, you can then call it as
python dbf2csv.py dbfname
Hopefully there are some handy tools to get the csv file into PostgreSQL.
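For that last step, PostgreSQL can load CSV directly with COPY (or psql's \copy). A minimal sketch using psycopg2, where the connection details, CSV file name, and target table holdings are all placeholders and the table is assumed to already match the CSV columns:
import psycopg2

# connection details, file name, and table name are placeholders -- adjust to your setup
conn = psycopg2.connect(dbname="mydb", user="postgres", password="secret", host="localhost")
with conn, conn.cursor() as cur, open("dbfname.csv", newline="") as f:
    # stream the CSV straight into the existing table
    cur.copy_expert("COPY holdings FROM STDIN WITH (FORMAT csv, HEADER true)", f)
conn.close()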

Related

Is there an alternative way to import data into Postgres than using psql?

I am in a strict corporate environment and don't have access to Postgres' psql. Therefore I can't do what's shown e.g. in the SO question Convert SQLITE SQL dump file to POSTGRESQL. However, I can generate the SQLite dump file (.sql). The resulting dump.sql file is 1.3 GB.
What would be the best way to import this data into Postgres? I also have DBeaver and can connect to both databases simultaneously but unfortunately can't do INSERT from SELECT.
I think the term for that is 'absurd', not 'strict'.
DBeaver has an 'execute script' feature. But who knows, maybe it will be blocked.
EnterpriseDB offers binary downloads. If you unzip those to a local drive you might be able to execute psql from the bin subdirectory.
If you can install psycopg2 or pg8000 for Python, you should be able to connect to the database and then loop over the dump file, sending each line to the database with cur.execute(line). It might take some fiddling if the dump file has any multi-line commands, but the example you linked to doesn't show any of those.
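A rough sketch of that loop with psycopg2, assuming the dump holds one complete SQL statement per line and using placeholder connection details:
import psycopg2

# connection parameters are placeholders -- adjust to your environment
conn = psycopg2.connect(dbname="mydb", user="postgres", password="secret", host="localhost")
cur = conn.cursor()
with open("dump.sql", encoding="utf-8") as f:
    for line in f:
        line = line.strip()
        # skip blank lines and SQL comments; assumes each statement fits on one line
        if not line or line.startswith("--"):
            continue
        cur.execute(line)
conn.commit()
cur.close()
conn.close()
Reading line by line keeps memory use low even for a 1.3 GB dump.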

Converting CSV to SQLite (db file)

I have downloaded a CSV file and am trying to use it for a SQL project (I am using Jupyter notebooks). Do I even need the CSV file or is there a way to use it without downloading it? I'm very new to all of this!
This is the link to the data that I downloaded:
https://github.com/new-york-civil-liberties-union/NYPD-Misconduct-Complaint-Database
What's your goal? Are you trying to learn SQL, or do you just want to work with the data?
If all you want is to load that csv into a table in a SQLite database, it would be easiest to do using the sqlite command line shell.
I'm on Windows, so forgive me if you aren't...
1. Open the Command Prompt
2. Navigate to the folder in which you want the new sqlite db file (e.g. cd C:\Users\User\Data)
3. sqlite3 NewDBName.db (e.g. sqlite3 MyNewDb.db)
4. .mode csv
5. .import path/to/downloaded/csv.csv NewTableName (e.g. .import C:\Users\User\Downloads\CCRB_database_raw.csv CCRB)
That should be it. You can check that it worked by running .schema - you should see the structure of your new table.
Now you can try out some sql statements:
SELECT * FROM CCRB LIMIT 10;
Here are some more detailed instructions.
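Since you're already in a Jupyter notebook, you can also do the whole thing in Python with just the standard library. A minimal sketch, assuming the downloaded file is named CCRB_database_raw.csv and storing every column as text:
import csv
import sqlite3

conn = sqlite3.connect("MyNewDb.db")  # creates the db file if it doesn't exist
with open("CCRB_database_raw.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)
    # build the table from the CSV header; every column is stored as TEXT
    cols = ", ".join(f'"{name}" TEXT' for name in header)
    conn.execute(f"CREATE TABLE IF NOT EXISTS CCRB ({cols})")
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f"INSERT INTO CCRB VALUES ({placeholders})", reader)
conn.commit()
conn.close()
You can then run the same SELECT shown above from Python with sqlite3 as well.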

New to SQL, how can I import multiple .CSV files?

I'm new to SQL, with a background in object-oriented programming.
I've created the following query, which successfully imports a .CSV file into a table in SQL Server Management Studio. How would I go about importing multiple files? This would be quite straightforward in an object-oriented language, although I've heard you have to read directories using cmd?
The working code is as follows:
--Cihans Import for Holdings--
INSERT INTO Holdings1
SELECT * FROM OPENROWSET
    ('MSDASQL',
     'Driver={Microsoft Access Text Driver (*.txt, *.csv)};DBQ=C:\Share\DataUploads\FundHoldings;',
     'SELECT * FROM holdings.csv')
How would I go about looping over/reading all the files in a directory and importing the data? We are given over 120 files a month, and I would like to import them using the above. Or can anyone recommend an alternative to this approach?
Do you have access to SSIS (SQL Server Integration Services)?
If so, you can set up an ETL task to import all files in a folder using a FOREACH LOOP CONTAINER that contains your data flow task and file system task.
Edit - Response to comment - The solution in this thread should enable you to do what you need using just T-SQL: loop through files in folder
Basically: load all the file names into a temp table, create a WHILE loop that passes in a new filename from the folder on each iteration, perform your data manipulation, then pass in the next filename until all are complete.
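If a scripted alternative is acceptable, the same loop-over-a-folder idea can also be done outside T-SQL. A rough Python sketch using pyodbc, where the connection string, folder path, and table name are all placeholders and each file is assumed to have a header row and columns matching Holdings1:
import csv
import glob
import pyodbc

# connection string, folder, and table name are placeholders -- adjust to your setup
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};Server=localhost;Database=Funds;Trusted_Connection=yes;"
)
cur = conn.cursor()
for path in glob.glob(r"C:\Share\DataUploads\FundHoldings\*.csv"):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)  # consume the header row
        placeholders = ", ".join("?" for _ in header)
        # assumes each file's columns line up with the Holdings1 table
        cur.executemany(f"INSERT INTO Holdings1 VALUES ({placeholders})", list(reader))
    conn.commit()
cur.close()
conn.close()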

How to convert a bunch of .btr and .lck files to a readable SQL?

I have a bunch of .btr and .lck files and I need to import those into a SQL Server database.
How can I do that?
.LCK files are lock files. You can't (and don't need to) read those directly. The .BTR files are the data files. Do you have DDF files (FILE.DDF, FIELD.DDF, INDEX.DDF)? If so, you should be able to download a trial version of Pervasive PSQL v11 from www.pervasivedb.com. Once you've installed the trial version, you can create an ODBC DSN pointing to your data and then use SSIS or DTS or any number of programs to export the data from PSQL and import it to MS SQL.
If you don't have DDFs, you would need to either get them or create them. The DDFs describe the record structure of each data file.
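Once the ODBC DSN exists, you could also pull the data out with a short script instead of (or before) using SSIS/DTS. A minimal Python sketch using pyodbc, where the DSN name PervasiveData, the table name MYTABLE, and the output file are all assumptions:
import csv
import pyodbc

# DSN and table name are assumptions -- use whatever you configured for the Pervasive data
conn = pyodbc.connect("DSN=PervasiveData")
cur = conn.cursor()
cur.execute("SELECT * FROM MYTABLE")
with open("mytable.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row from column metadata
    writer.writerows(cur.fetchall())
conn.close()
The resulting CSV can then be loaded into MS SQL with, for example, bcp or BULK INSERT.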

import csv to sql

I have to import a csv file into a SQL database table which has already been created (it is empty and has the same set of named columns). It would be great if you could suggest any tutorials or give some tips.
I assume you are using Microsoft SQL Server. Do you need to do this in a program or manually? There is a tutorial on using the bcp command for that, or alternatively a SQL command. If you need to parse the CSV file for your own code, see this previous SO question.