Read text file line by line and insert/update values in table - SQL

I am exploring whether DoCmd.TransferText will do what I need, and it seems like it won't. I need to insert data if it does not exist and update it if it does exist.
I am planning to read the text file line by line like this:
Dim intFile As Integer
Dim strLine As String
intFile = FreeFile()
Open myFile For Input As #intFile
Do While Not EOF(intFile)
    Line Input #intFile, strLine   ' one record per line
    ' ... process strLine here ...
Loop
Close #intFile
I guess each individual line will be a record. It will probably be comma separated, and some fields will have a " text qualifier because the field itself will contain commas.
My question is: how would I read a comma-delimited text file, which sometimes has double quotes as text qualifiers, into a table in Access?

I've been following your related questions, but am unsure where you're running into trouble. It seems like you're trying to accomplish two things with this text file import:
import the text file
append or update records in a table based on the imported data
Perhaps it would be more productive to tackle those steps as 2 separate operations, rather than attempting to combine them in one operation.
I thought you had step #1 working. If so, import the text into a separate table, tblImport, then use that table to drive the changes to your master table. If you're worried about database bloat due to repeatedly importing then deleting records in tblImport, you could make tblImport a link to an external database ... so as to isolate the bloat to that one.
If you want more detailed help for step #2, I think you will have to tell us how to identify which imported records don't exist in the master table, and how to match up imported and master records for the updates. Perhaps you could delete existing master records which have counterparts in tblImport, then append all of tblImport to your master table.
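For example, a sketch of the update-then-append approach in Access SQL (tblMaster, KeyField and Field1 are placeholders for your real master table, key column and data columns):
UPDATE tblMaster INNER JOIN tblImport
    ON tblMaster.KeyField = tblImport.KeyField
SET tblMaster.Field1 = tblImport.Field1;

INSERT INTO tblMaster (KeyField, Field1)
SELECT i.KeyField, i.Field1
FROM tblImport AS i LEFT JOIN tblMaster AS m
    ON i.KeyField = m.KeyField
WHERE m.KeyField IS NULL;
The second statement appends only the imported rows that have no counterpart in the master table.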

I'm assuming here that your text file is well formatted/delimited. As such, try recording a macro, then use the import wizard. [The import wizard can be driven from code, with all the necessary values passed in by the function call.] Then copy/move the code generated by the macro into your method, providing for whatever variables you like (this is a method I've used in the past).
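In practice that generated code usually boils down to a single DoCmd.TransferText call; a minimal sketch (the specification name, table name and file path are placeholders, and the spec is assumed to have been saved from the wizard's Advanced dialog):
DoCmd.TransferText acImportDelim, "MyImportSpec", "tblImport", "C:\data\myfile.csv", True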

Related

How to check if filenames in specific folder are inside database table?

Start situation:
A folder with lots of files (mostly PNG images).
Two tables in a database (MariaDB) which contain the expected image filenames. I query the filenames like this:
select filename from table1
UNION
select filename from table2;
I want to know if I have files that are not registered in the database tables.
My first approach was to put the list of filenames into a text file (using the Linux command line, one filename per line), but I don't know how to continue from there.
I couldn't write to the database at first. UPDATE: I have since been given enough permissions for my job tasks, so I solved this with the suggestion below.
You need to do a couple of things:
Import the file with the list of filenames into the database.
Use a cursor (or a join) to go through that list of file names and match it against the file names already stored in your tables.
To import the file names you can use a regular import method (such as LOAD DATA INFILE) or read the file directly as a virtual table; a sketch of that approach follows below.
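For example (a rough sketch; the staging table name and file path are placeholders, and LOCAL requires local_infile to be enabled):
CREATE TABLE file_list (filename VARCHAR(255) PRIMARY KEY);

LOAD DATA LOCAL INFILE '/tmp/filenames.txt'
INTO TABLE file_list
LINES TERMINATED BY '\n'
(filename);

-- files on disk that are not registered in either table
SELECT f.filename
FROM file_list AS f
LEFT JOIN (
    SELECT filename FROM table1
    UNION
    SELECT filename FROM table2
) AS t ON t.filename = f.filename
WHERE t.filename IS NULL;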
Thanks

Query for finding all occurrences of a string in a database

I'm trying to find a specific string in my database. I'm currently using FlameRobin to open the FDB file, but this software doesn't seem to have a proper feature for this task.
I tried the following SQL query, but it didn't work:
SELECT
*
FROM
*
WHERE
* LIKE '126278'
So, what is the best way to do this? Thanks in advance.
You can't do such a thing. But you can convert your FDB file to a text format like CSV, so you can search for your string in all the tables/files at the same time.
1. Download a database converter
First you need software to convert your database file. I recommend using Full Convert for this. Just get the free trial and download it. It is really easy to use, and it will export each table to a separate CSV file.
2. Find your string in multiple files at the same time
For that task you can use the Find in Files feature of Notepad++ to search for the string in all CSV files located in the same folder.
3. Open the desired table in FlameRobin
When Notepad++ highlights the string, it shows which file it is in and the line number. Full Convert saves each CSV with the same name as the original table, so you can easily find the table in whatever database manager software you are using.
Here is the Firebird documentation: https://www.firebirdsql.org/file/documentation/reference_manuals/fblangref25-en/html/fblangref25.html
You need to read about:
stored procedures of the "selectable" kind,
the EXECUTE STATEMENT command, including the FOR EXECUTE STATEMENT variant,
the system tables that have "relation" in their names.
Then in your SP you enumerate all the tables, then you enumerate all the columns in those tables, and then for each of them you run a usual
select 'tablename', 'columnname', columnname
from tablename
where columnname containing '12345'
over every field of every table.
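A rough sketch of such a selectable procedure (the procedure and parameter names are my own; the dynamic SQL is built by naive concatenation, and CONTAINING may fail on column types it cannot search, such as BLOBs, which the WHEN handler simply skips):
SET TERM ^ ;
CREATE PROCEDURE SEARCH_ALL (SEARCH_FOR VARCHAR(100))
RETURNS (TABLE_NAME VARCHAR(63), COLUMN_NAME VARCHAR(63), HITS BIGINT)
AS
BEGIN
  /* enumerate every column of every user table (views and system tables excluded) */
  FOR SELECT TRIM(rf.rdb$relation_name), TRIM(rf.rdb$field_name)
      FROM rdb$relation_fields rf
      JOIN rdb$relations r ON r.rdb$relation_name = rf.rdb$relation_name
      WHERE COALESCE(r.rdb$system_flag, 0) = 0 AND r.rdb$view_blr IS NULL
      INTO :TABLE_NAME, :COLUMN_NAME
  DO
  BEGIN
    EXECUTE STATEMENT
      'select count(*) from "' || TABLE_NAME || '" where "' || COLUMN_NAME ||
      '" containing ''' || SEARCH_FOR || ''''
      INTO :HITS;
    IF (HITS > 0) THEN SUSPEND;   /* return one row per matching column */
    WHEN ANY DO
      HITS = 0;                   /* skip columns that cannot be searched */
  END
END^
SET TERM ; ^
Then SELECT * FROM SEARCH_ALL('126278') lists every table/column pair containing the string.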
But practically speaking, it would most probably be better to avoid SQL commands altogether and just extract ALL of the database into a long SQL script, open that script in Notepad (or any other text editor), and search there for the string you need.

"UNLOAD" data tables from AWS Redshift and make them readable as CSV

I am currently trying to move several data tables in my current AWS instance's Redshift database to a new database in a different AWS instance (for background, my company has acquired a new one and we need to consolidate to one instance of AWS).
I am using the UNLOAD command below on a table. I plan on turning that output into a CSV, uploading that file to the destination AWS account's S3, and using the COPY command to finish moving the table.
unload ('select * from table1')
to 's3://destination_folder'
CREDENTIALS 'aws_access_key_id=XXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXX'
ADDQUOTES
DELIMITER AS ','
PARALLEL OFF;
My issue is that when I change the file type to .csv and open the file, I get inconsistencies in the data. There are areas where many rows are skipped, and on some rows, after the expected columns end, I get additional columns with the value "f" for unknown reasons. Any help on how I could achieve this transfer would be greatly appreciated.
EDIT 1: It looks like fields with quotes are having the quotes removed. Additionally, fields containing commas are being split at those commas. I've identified some fields with quotes and commas, and they are throwing everything off. Does the ADDQUOTES clause I have apply to the entire field regardless of whether there are quotes and commas within the field?
By default the unloaded file will have a .txt extension and will include the quotes. Try opening it in Excel and then saving it as a CSV file.
Refer to https://help.xero.com/Q_ConvertTXT
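If the embedded quotes and commas are indeed the problem, a hedged alternative is to add the ESCAPE option to the UNLOAD and reverse the quoting/escaping on the destination COPY; a rough sketch (same placeholder bucket and credentials as above):
unload ('select * from table1')
to 's3://destination_folder'
CREDENTIALS 'aws_access_key_id=XXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXX'
ADDQUOTES
ESCAPE
DELIMITER AS ','
PARALLEL OFF;

-- on the destination cluster, strip the quotes and escapes while loading
copy table1
from 's3://destination_folder'
CREDENTIALS 'aws_access_key_id=XXXXXXXXXXXXX;aws_secret_access_key=XXXXXXXXX'
REMOVEQUOTES
ESCAPE
DELIMITER ',';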

How do I import data from a csv when the records are not separated by line breaks but with brackets

I'm looking at the AM data, just for a data analysis project, and I'm having trouble importing the data into my DBMS (PostgreSQL).
My SQL code is this:
DROP TABLE IF EXISTS member_details;
CREATE TABLE member_details(
pnum varchar(255),
.....
updatedon timestamp);
COPY member_details
FROM '/Users/etc/data/sample_dump.csv'
WITH DELIMITER ','
CSV;
The problem is that the CSV file has no line breaks to separate the records; instead, each record is enclosed in brackets, which my code above does not recognise, so it just imports all the data as a single line and no records are created.
How the data is structured:
(dataA1, ....,dataAx),(dataB1,...,dataBx)
How can I alter my code so that PostgreSQL imports the data record by record by recognising the brackets?
Based on the PostgreSQL COPY documentation, I don't believe it allows row delimiters other than carriage returns and/or line feeds. I believe you'll need to process your file before importing. You can simply replace all ,( with \n(, then remove the parentheses, to make it a standard CSV format that COPY will happily consume.
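For example, with GNU sed on the command line (the file names are placeholders; note that this also strips any parentheses that happen to occur inside data fields):
sed -e 's/),(/)\n(/g' -e 's/[()]//g' sample_dump.csv > sample_dump_clean.csv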
Perhaps there's another method for PostgreSQL that would work too, but I haven't come across anything yet.

access: read CSV file into table

Is it possible to programmatically have Access open a certain CSV file and read it into a datasheet, setting parameters like what type of delimiter, text qualifier, etc., i.e. covering all the steps that one takes to manually import a CSV file into a table?
You can create a Scripting.FileSystemObject, then stream in the file line by line, separating fields by your delimiter, and adding each parsed record to the underlying recordset or another table.
You can also experiment with DoCmd.TransferText, as that seems the most promising built-in method.
Edit: for the more complete (and complex, and arguably less efficient) solution, which would give you more control over the schema and treat any CSV file like any other data source, look at ODBC: Microsoft Text Driver - http://support.microsoft.com/kb/187670 for script examples. I believe you can already just link to any CSV file via the standard MS Access front-end, thereby allowing you to do just about any table read/copy operation on it. Otherwise, just set up an ODBC File DSN entry to plug in your CSV file (via Start->Program Files->Administrative Tools->Data Sources), and then you can select/link to it via your Access file.
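A rough sketch of that text-driver route from VBA (the folder and file name are placeholders; the driver treats the folder as the database and each CSV in it as a table):
Dim cn As Object, rs As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Driver={Microsoft Text Driver (*.txt; *.csv)};" & _
        "Dbq=C:\data\;Extensions=asc,csv,tab,txt;"
Set rs = cn.Execute("SELECT * FROM [myfile.csv]")
Do While Not rs.EOF
    Debug.Print rs.Fields(0).Value   ' do something with each row
    rs.MoveNext
Loop
rs.Close
cn.Close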
It greatly depends on whether the Access table is already created or whether you want to create it on the fly. Let's assume (for the sake of simplicity) that the table already exists (you can always go back and create the table). As suggested before, you need some scripting tools:
' Requires a reference to Microsoft Scripting Runtime for these early-bound
' declarations (or declare the variables As Object for late binding)
Dim FSO As FileSystemObject, Fo As TextStream, line As String
Set FSO = CreateObject("Scripting.FileSystemObject")
Set Fo = FSO.OpenTextFile("csvfile.csv")
This will allow you to read your CSV file, which is a text file. You control here which delimiter you use, which date format will be employed, etc. You also need to bring the database engine into play:
Dim db As Database
Set db = DBEngine.OpenDatabase("mydatabase.accdb")
And this is basically all that you need. You read your CSV file line by line:
While Not Fo.AtEndOfStream
line = Fo.ReadLine
Now you need to format each field adequately for the table, meaning: text fields must be surrounded by quote marks ("), date fields must be surrounded by #, etc. Otherwise the database engine will noisily complain. But again, you are in charge here, and you can do whatever cosmetic surgery you need on your input line. Another assumption I am making (for the sake of simplicity) is that you are comfortable programming in VBA.
So let's get to the crux of the problem:
db.Execute ("INSERT INTO myTable VALUES (" & line & ")")
Wend
At this point, line has been changed into something edible for the database engine. For example, if the original line read was
33,04/27/2019,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,Cashew,0
you changed it into
33,#04/27/2019#,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,"Cashew",0
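One way to do that transformation, as a minimal sketch assuming this exact layout (the date in the 2nd field, the text in the next-to-last field) and no commas embedded inside fields:
Dim parts() As String
parts = Split(line, ",")                    ' naive split: assumes no commas inside fields
parts(1) = "#" & parts(1) & "#"             ' 2nd field is the date
parts(18) = Chr(34) & parts(18) & Chr(34)   ' 19th field is the text ("Cashew")
line = Join(parts, ",")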
One last note: it is important that each field of the CSV file matches the corresponding field in the table. This includes, at the very least, the fields being in the right order. You must ensure that in your preprocessing stage. Hope this puts you on the right track.