Access: read CSV file into table - VBA

Is it possible to programmatically have Access open a certain CSV file and read it into a datasheet, setting parameters like the type of delimiter, text qualifier, etc., covering all the steps one takes to manually import a CSV file into a table?

You can create a Scripting.FileSystemObject, then stream in the file line by line, splitting fields on your delimiter and adding each parsed record to the underlying recordset or another table.
You can also experiment with DoCmd.TransferText, as that seems the most promising built-in method.
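For instance, here is a minimal sketch of that built-in route. The specification name, table name, and path are assumptions; the saved import specification (created once through the manual import wizard, via Advanced... > Save As) is what stores the delimiter and text qualifier:
' Hedged sketch: "MyImportSpec", "MyTable" and the file path are placeholders.
DoCmd.TransferText TransferType:=acImportDelim, _
                   SpecificationName:="MyImportSpec", _
                   TableName:="MyTable", _
                   FileName:="C:\data\csvfile.csv", _
                   HasFieldNames:=True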
Edit: for a more complete (and complex, and arguably less efficient) solution that gives you more control over the schema and treats any CSV file like any other data source, look at ODBC: Microsoft Text Driver - http://support.microsoft.com/kb/187670 for script examples. I believe you can already just link to any CSV file via the standard MS Access front-end, thereby allowing you to do just about any table read/copy operation on it. Otherwise, set up an ODBC file DSN entry to plug in your CSV file (via Start->Program Files->Administrative Tools->Data Sources), and then you can select/link to it via your Access file.
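As a rough illustration of the Text Driver route (the folder and file names below are assumptions; the driver treats the folder as the database and each CSV file in it as a table):
' Hedged sketch: query a CSV file through the Microsoft Text Driver via ADO.
Dim cn As Object, rs As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Driver={Microsoft Text Driver (*.txt; *.csv)};" & _
        "Dbq=C:\data\;Extensions=asc,csv,tab,txt;"
Set rs = cn.Execute("SELECT * FROM [csvfile.csv]")
Do While Not rs.EOF
    Debug.Print rs.Fields(0).Value   ' first column of each row
    rs.MoveNext
Loop
rs.Close
cn.Close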

It greatly depends on whether the Access table already exists or whether you want to create it on the fly. Let's assume (for the sake of simplicity) that the table already exists (you can always go back and create it). As suggested before, you need some scripting tools:
' Requires a reference to Microsoft Scripting Runtime (or declare these As Object)
Dim FSO As FileSystemObject, Fo As TextStream, line As String
Set FSO = CreateObject("Scripting.FileSystemObject")
Set Fo = FSO.OpenTextFile("csvfile.csv")
This will allow you to read your CSV file, which is a text file. You control here which delimiter is used and which date format will be employed, etc. You also need to bring the database engine into play:
Dim db As Database
Set db = DBEngine.OpenDatabase("mydatabase.accdb")  ' or CurrentDb when running inside Access
And this is basically all you need. You read your csv file line by line:
While Not Fo.AtEndOfStream
line = Fo.ReadLine
Now you need to format each field appropriately for the table, meaning: text fields must be surrounded by quote marks ("), date fields must be surrounded by #, etc. Otherwise the database engine will noisily complain. But again, you are in charge here, and you can perform whatever cosmetic surgery your input line needs. Another assumption I am making (for the sake of simplicity) is that you are comfortable programming in VBA.
So let's get to the crux of the problem:
db.Execute ("INSERT INTO myTable VALUES (" & line & ")")
Wend
At this point, line has been changed into something edible for the database engine. For example, if the original line read
33,04/27/2019,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,Cashew,0
you changed it into
33,#04/27/2019#,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,"Cashew",0
One last note: it is important that each field of the CSV file matches the corresponding field in the table. This includes, at the very least, being in the right order. You must ensure that in your preprocessing stage. Hope this puts you on the right track.
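Pulling the pieces above together, here is a minimal end-to-end sketch. The file path and table name are assumptions, and the column positions being reformatted (the second field as a date, the nineteenth as text) follow the sample line above:
Sub ImportCsv()
    ' Hedged sketch: stream a CSV into myTable, qualifying the date and text fields.
    Dim FSO As Object, Fo As Object, db As DAO.Database
    Dim line As String, parts() As String

    Set FSO = CreateObject("Scripting.FileSystemObject")
    Set Fo = FSO.OpenTextFile("C:\data\csvfile.csv")   ' path is an assumption
    Set db = CurrentDb

    While Not Fo.AtEndOfStream
        line = Fo.ReadLine
        parts = Split(line, ",")
        parts(1) = "#" & parts(1) & "#"           ' date field -> #04/27/2019#
        parts(18) = """" & parts(18) & """"       ' text field -> "Cashew"
        db.Execute "INSERT INTO myTable VALUES (" & Join(parts, ",") & ")", dbFailOnError
    Wend

    Fo.Close
End Sub
Note that a plain Split like this will misparse text fields that themselves contain commas; such fields need quote-aware parsing.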

Related

Query for finding all occurrences of a string in a database

I'm trying to find a specific string in my database. I'm currently using FlameRobin to open the FDB file, but this software doesn't seem to have a proper feature for this task.
I tried the following SQL query, but it didn't work:
SELECT
*
FROM
*
WHERE
* LIKE '126278'
After all, what is the best solution to do that? Thanks in advance.
You can't do such a thing. But you can convert your FDB file to text files such as CSV, so you can search for your string in all the tables/files at the same time.
1. Download a database converter
First, you need software to convert your database file. I recommend using Full Convert for this; just get the free trial and download it. It is really easy to use, and it will export each table to a different CSV file.
2. Find your string in multiple files at the same time
For that task you can use the Find in Files feature of Notepad++ to search for the string in all the CSV files located in the same folder.
3. Open the desired table on FlameRobin
When Notepad++ highlights the string, it shows which file it is in and the line number. Full Convert saves each CSV with the same name as the original table, so you can find the table easily in whatever database manager you are using.
Here is Firebird documentation: https://www.firebirdsql.org/file/documentation/reference_manuals/fblangref25-en/html/fblangref25.html
You need to read about:
stored procedures of the "selectable" kind,
the EXECUTE STATEMENT command, including its FOR EXECUTE STATEMENT variant,
the system tables having "RELATION" in their names.
Then, in your SP, you enumerate all the tables, then enumerate all the columns in those tables, and then for each of them you run the usual
select 'tablename', 'columnname', columnname
from tablename
where columnname containing '12345'
over every field of every table.
But practically speaking, it would most probably be better to avoid SQL commands and just extract ALL the database into a long SQL script, then open that script in Notepad (or any other text editor) and search there for the string you need.

How can I move data from spreadsheet to a database through SQL

I want to move the data from a spreadsheet into a database. The program I am using is SQL Workbench/J. I am kind of lost and don't really know where to start. Are there any tips or pointers that might send me in the right direction?
SQL Workbench/J provides the WbImport command for loading a text file into a DB table. So if you save your spreadsheet file in the CSV (comma-separated values) format, you can then load it into a table using this command.
Here is an example that loads the text file CLASSIFICATION_CODE.csv, having , as field delimiter and ^ as quoting character, into the CLASSIFICATION_CODE DB table.
WbImport -type=text
-file='C:\dev\CLASSIFICATION_CODE.csv'
-delimiter=,
-table=CLASSIFICATION_CODE
-quoteChar=^
-badfile='C:\dev\rejected'
-continueOnError=true
-multiLine=true
-emptyStringIsNull=false;
You might not need all the parameters of the example. Refer to the documentation to find the ones you need.
If the data in your spreadsheet are heterogeneous (e.g. your spreadsheet has two different sheets), split them into two files in order to store them in separate DB tables.

Iterating through folder - handling files that don't fit schema

I have a directory containing multiple xlsx files, and what I want to do is insert the data from those files into a database.
So far I have solved this by using tFileList -> tFileInputExcel -> tPostgresOutput
My problem begins when one of these files doesn't match the defined schema and returns an error, resulting in an interruption of the workflow.
What I need to figure out is whether it's possible to skip that file (moving it to another folder, for instance) and continue iterating over the rest of the existing files.
If I check the option "Die on error" the process ends and doesn't process the rest of the files.
I would approach this by making your initial input schema on the tFileInputExcel be all strings.
After reading the file I would then validate the schema using a tSchemaComplianceCheck set to "Use another schema for compliance check".
You should be able to then connect a reject link from the tSchemaComplianceCheck to a tFileCopy configured to move the file to a new directory (if you want it to move it then just tick "Remove source file").
(The screenshots from the original answer showed a quick example job, the compliance-check schema with id and age now declared as Integer rather than String, and the tFileCopy settings used to move the file.)
Your main flow from the tSchemaComplianceCheck can carry on using just strings if you are inserting into a database. You might want to use a tConvertType to change things back to the correct data types afterwards if you are doing any processing that requires proper data types, or if you are using your tPostgresOutput component to create the table as well.

Insert file into access table

I have a table named Reports which has 3 fields: ID (auto number), filename (string field), and theFile (attachment field).
What I want is to run a SQL query and insert a PDF file into the attachment field (theFile).
Let's say the PDF file is located on the C: drive (C:\report1.pdf). I have tried the SQL query below, but it is not working. I know it's not good practice to store files in a database, but I just want to try it out:
CurrentDb.Execute "INSERT INTO Reports (filename,theFile) VALUES ('report1'," & C:\report1.pdf & ")"
It's standard practice to store files in a database. Access certainly supports it, but not through SQL. You'll have to use DAO, as detailed at http://msdn.microsoft.com/en-us/library/office/bb258184%28v=office.12%29.aspx
"File" is not appropriate SQL data type supported in Access, available data types.
That is correct, Derek; if you try to run a SQL statement like that, you will get an error message of one type or another every time. I spent a fair amount of time researching this subject for my own DB, and from what I understand there are a number of options/alternatives; however, having an attachment column type and using SQL to insert a file is not an option with Access's current capabilities.
It is not bad practice to store files in a database; it is actually standard practice. However, it IS best practice not to store files in an Access DB. There are a few reasons for this which you can research on your own, but perhaps most notably, Access has a file size limit of 2GB, so if you store files in it you can run out of space quickly, and then things get even more complicated.
Here are your options:
1. Change your column data type to OLE Object and use some kind of stream reader to convert the files to binary data, then use a SQL statement to load them into your DB.
2. Use the built-in Access user interface for working directly with tables/attachments.
3. Establish a DAO db connection and use Access's recordset LoadFromFile method.
4. Just store links to the files in the Access DB.
The 4th option is the preferred method: it's very simple, and you won't have to worry about complex code or the 2GB storage limit.
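For completeness, here is a minimal sketch of option 3 against the question's Reports table, following the DAO pattern from the article linked above (field names are the question's; error handling omitted):
' Hedged sketch: load C:\report1.pdf into the theFile attachment field via DAO.
Dim db As DAO.Database
Dim rsParent As DAO.Recordset2
Dim rsAttach As DAO.Recordset2

Set db = CurrentDb
Set rsParent = db.OpenRecordset("Reports")

rsParent.AddNew
rsParent.Fields("filename").Value = "report1"
Set rsAttach = rsParent!theFile.Value    ' the attachment field exposes a child recordset
rsAttach.AddNew
rsAttach!FileData.LoadFromFile "C:\report1.pdf"
rsAttach.Update
rsParent.Update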

Read text file line by line and insert/update values in table

I am exploring whether DoCmd.TransferText will do what I need, and it seems like it won't. I need to insert data if it does not exist and update it if it does exist.
I am planning to read a text file line by line like this:
Dim intFile As Integer
Dim strLine As String
intFile = FreeFile()
Open myFile For Input As #intFile
Do While Not EOF(intFile)
    Line Input #intFile, strLine   ' each strLine holds one record
Loop
Close #intFile
I guess each individual line will be a record. The file will probably be comma separated, and some fields will have a " text qualifier, because within the field itself I will have commas.
My question is: how would I read a comma delimited text file that sometimes has double quotes as text qualifiers into a table in Access?
I've been following your related questions, but am unsure where you're running into trouble. It seems like you're trying to accomplish two things with this text file import:
import the text file
append or update records in a table based on the imported data
Perhaps it would be more productive to tackle those steps as 2 separate operations, rather than attempting to combine them in one operation.
I thought you had step #1 working. If so, import the text into a separate table, tblImport, then use that table to drive the changes to your master table. If you're worried about database bloat due to repeatedly importing then deleting records in tblImport, you could make tblImport a link to an external database ... so as to isolate the bloat to that one.
If you want more detailed help for step #2, I think you will have to tell us how to identify which imported records don't exist in the master table, and how to match up imported and master records for the updates. Perhaps you could delete existing master records which have counterparts in tblImport, then append all of tblImport to your master table.
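As a rough sketch of that delete-then-append approach (the table names and the ID key are assumptions):
' Hedged sketch: refresh master rows from the imported batch.
Dim db As DAO.Database
Set db = CurrentDb

' Delete master records that have counterparts in tblImport...
db.Execute "DELETE FROM tblMaster " & _
           "WHERE ID IN (SELECT ID FROM tblImport)", dbFailOnError

' ...then append all of tblImport to the master table.
db.Execute "INSERT INTO tblMaster SELECT * FROM tblImport", dbFailOnError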
I'm assuming here that your text file is well formatted/delimited. As such, try recording a macro, then use the import wizard. [The import wizard can be activated from the command line and have all the necessary values passed into it by the function call.] Then copy/move the code generated by the macro into your method, providing for whatever variables you like (this is a method I've used in the past).
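On the text-qualifier question itself: a plain Split on commas will break fields that contain embedded commas, but a short quote-aware parser handles them. A minimal sketch (the function name is hypothetical, and doubled "" escapes inside fields are not handled):
Function SplitCsvLine(ByVal strLine As String) As Collection
    ' Hedged sketch: split one CSV line, honoring " as a text qualifier.
    Dim fields As New Collection
    Dim buf As String, ch As String
    Dim inQuotes As Boolean
    Dim i As Long

    For i = 1 To Len(strLine)
        ch = Mid$(strLine, i, 1)
        If ch = """" Then
            inQuotes = Not inQuotes          ' toggle qualifier state
        ElseIf ch = "," And Not inQuotes Then
            fields.Add buf                   ' comma outside quotes ends the field
            buf = ""
        Else
            buf = buf & ch
        End If
    Next i
    fields.Add buf                           ' add the final field

    Set SplitCsvLine = fields
End Function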