Access SQL, importing txt files, delimitation - sql

I have a question regarding Access queries. The below query will import data from a txt file into a new table. I have tested it, and it delimits by comma. Is there any way to change the delimitation character?
SELECT * INTO NewTable
FROM [Text;HDR=Yes;FMT=Delimited;Database=C:\Docs].Test.csv

Is there any way to change the delimitation character?
Yes. You can use a schema.ini file:
"When the Text driver is used, the format of the text file is determined by using a schema information file. The schema information file is always named Schema.ini and always kept in the same directory as the text data source. The schema information file provides the IISAM with information about the general format of the file, the column name and data type information, and several other data characteristics."
For the complete story see
Schema.ini File (Text File Driver)
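For example, to read a semicolon-delimited file instead of a comma-delimited one, a Schema.ini placed next to the file could look like this (the file name Test.csv matches the query above; adjust it to your own file):

```ini
[Test.csv]
ColNameHeader=True
Format=Delimited(;)
```

Format=TabDelimited, Format=CSVDelimited, and Format=FixedLength are the other accepted values; the section name must exactly match the file name the query references.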

Related

SQL How can I copy a csv file into a table with this delimiter problem?

I'm trying to copy a csv file into a table. The delimiter is ',', but the csv file has a field named 'Description' that also contains ',', not as a delimiter but as part of the text.
How could I copy the csv file into the Import table?
If the comma is always within the double quotes then it shouldn't be a problem.
If not, you have a corrupt CSV file. The simplest way is probably to parse your file prior to importing to fix the corruption.
The details of how exactly to parse will depend on the dataset. Which fields are optional? Which are compulsory? How many commas can occur at most? That kind of information is crucial for writing a parsing script.
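As a sketch of such a pre-import parse (the sample data and the pipe delimiter are illustrative): Python's csv module honors the double quotes, so the embedded comma survives, and re-writing with a delimiter that never occurs in the data makes the file safe for a naive importer:

```python
import csv
import io

# Sample data (illustrative): the description field contains a comma
# inside double quotes.
raw = 'id,description,price\n1,"small, red widget",9.99\n'

# csv.reader honors the quotes, so the embedded comma is not a split point.
rows = list(csv.reader(io.StringIO(raw)))

# Re-emit with a pipe delimiter that never occurs in the data, so even a
# naive importer cannot misparse the file.
out = io.StringIO()
csv.writer(out, delimiter="|").writerows(rows)
print(out.getvalue())
```

The same read-then-rewrite pass works against real files by swapping the StringIO objects for open file handles.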

Query for finding all occurrences of a string in a database

I'm trying to find a specific string in my database. I'm currently using FlameRobin to open the FDB file, but this software doesn't seem to have a proper feature for this task.
I tried the following SQL query, but it didn't work:
SELECT
*
FROM
*
WHERE
* LIKE '126278'
After all, what is the best solution to do that? Thanks in advance.
You can't do that directly; SQL doesn't accept wildcards in place of table or column names. But you can convert your FDB file to text files, such as CSV, so you can search for your string in all the tables/files at the same time.
1. Download a database converter
First, you need software to convert your database file. I recommend using Full Convert: just get the free trial and download it. It is really easy to use, and it will export each table to a separate CSV file.
2. Find your string in multiple files at the same time
For that task you can use the Find in files feature of Notepad++ to search the string in all CSV files located at the same folder.
3. Open the desired table on FlameRobin
When Notepad++ highlights the string, it shows which file it is in and the line number. Full Convert saves each CSV with the same name as the original table, so you can find the table easily in whatever database manager software you are using.
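Step 2 can also be scripted. A minimal Python sketch (the folder name and search string are placeholders) that reports file name and line number for every hit, much like Notepad++'s Find in Files:

```python
from pathlib import Path

def find_in_files(folder, needle, pattern="*.csv"):
    """Yield (file name, line number, line) for every line containing needle."""
    for path in sorted(Path(folder).glob(pattern)):
        for lineno, line in enumerate(path.read_text().splitlines(), start=1):
            if needle in line:
                yield path.name, lineno, line

# Folder name and search string are placeholders:
for name, lineno, line in find_in_files("C:/exported_tables", "126278"):
    print(f"{name}:{lineno}: {line}")
```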
Here is Firebird documentation: https://www.firebirdsql.org/file/documentation/reference_manuals/fblangref25-en/html/fblangref25.html
You need to read about:
stored procedures of the "selectable" kind,
the EXECUTE STATEMENT command, including the FOR EXECUTE STATEMENT variant,
and the system tables with "RELATION" in their names.
Then, in your SP, you enumerate all the tables, then all the columns in those tables, and for each of them you run the usual
select 'tablename', 'columnname', columnname
from tablename
where columnname containing '12345'
over every field of every table.
But practically speaking, it would most probably be better to avoid SQL commands and simply extract ALL of the database into a long SQL script, open that script in Notepad (or any other text editor), and search there for the string you need.

How to create format files using bcp from flat files

I want to use a format file to help import a comma-delimited file using bulk insert. I want to know how you generate format files from a flat-file source. The Microsoft guidance on this subject makes it seem as though you can only generate a format file from a SQL table. But I want it to look at a text file and tell me what the delimiters in that file are.
Surely this is possible.
Thanks
The format file can, and usually does, include more than just delimiters. It also frequently includes column data types, which is why it can only be automatically generated from the table or view the data is being retrieved from.
If you need to find the delimiters in a flat file, there are a number of ways to write a script that can work them out, and then build a format file from them.
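For the "look at the file and tell me the delimiters" part, a script can guess the delimiter from a sample of the file; Python's standard csv.Sniffer does exactly that (the sample data here is illustrative):

```python
import csv

# Illustrative sample: the first few lines of the flat file.
sample = "name;age;city\nAlice;30;Oslo\nBob;25;Bergen\n"

# Sniffer guesses the dialect; restricting the candidate delimiters
# makes the guess more reliable.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print(repr(dialect.delimiter))
```

This only answers the delimiter question; the column data types that a format file also carries still have to come from the target table or view, as noted above.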

Bulk Import of CSV into SQL Server

I have a .CSV file that contains more than 100,000 rows.
I have tried the following method to import the CSV into the table "Root":
BULK INSERT [dbo].[Root]
FROM 'C:\Original.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
But I get many errors, such as being told to check my terminators.
I opened the CSV with Notepad.
There is no ',' or '\n' terminator that I can see; instead, at the end of each row there is a square box.
Please help me import this CSV into the table.
http://msdn.microsoft.com/en-us/library/ms188609.aspx
Comma-separated value (CSV) files are not supported by SQL Server bulk-import operations. However, in some cases, a CSV file can be used as the data file for a bulk import of data into SQL Server. Note that the field terminator of a CSV file does not have to be a comma. To be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
Data fields never contain the field terminator.
Either none or all of the values in a data field are enclosed in quotation marks ("").
Note: There may be other unseen characters that need to be stripped from the source file. VIM (command ":set list") or Notepad++ (View > Show Symbol > Show All Characters) are two methods to check.
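A script can do the same check. This Python sketch (file contents are illustrative) dumps the first bytes of a file verbatim, so the real row terminator shows up instead of a "square box":

```python
def raw_head(path, limit=200):
    """Return the first bytes of the file as a repr string, so the real
    row terminator (\\n, \\r\\n, or something odd) becomes visible."""
    with open(path, "rb") as f:
        return repr(f.read(limit))

# A file with \r-only row endings (which Notepad renders as boxes)
# would show up as something like: b'33,data\r34,data\r'
```

Once the real terminator is known, BULK INSERT can be pointed at it; hex notation such as ROWTERMINATOR = '0x0d' is accepted for a bare carriage return.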
If you are comfortable with Java, I have written a set of tools for CSV manipulation, including an importer and exporter. The project is up on Github.com:
https://github.com/carlspring/csv-db-tools
The importer is here:
https://github.com/carlspring/csv-db-tools/tree/master/csv-db-importer
For instructions on how to use the importer, check:
https://github.com/carlspring/csv-db-tools/blob/master/csv-db-importer/USAGE
You will need to make a simple mapping file. An example can be seen here:
https://github.com/carlspring/csv-db-tools/blob/master/csv-db-importer/src/test/resources/configuration-large.xml

access: read CSV file into table

Is it possible to programmatically have Access open a certain csv file and read it into a datasheet, setting parameters like the type of delimiter, text qualifier, etc., i.e. all the steps that one takes to manually import a csv file into a table?
You can create a Scripting.FileSystemObject, then stream in the file line by line, separating fields by your delimiter, and adding each parsed record to the underlying recordset or another table.
You can also experiment with DoCmd.TransferText, as that seems the most promising built-in method.
Edit: for the more complete (and complex, and arguably less efficient) solution, which would give you more control over the schema and treat any csv file like any other data source, look at ODBC: Microsoft Text Driver - http://support.microsoft.com/kb/187670 for script examples. I believe you can already link to any csv file via the standard MS Access front-end, thereby allowing you to do just about any table read/copy operation on it. Otherwise, just set up an ODBC file DSN entry for your csv file (via Start->Program Files->Administrative Tools->Data Sources), and then you can select/link to it via your Access file.
It greatly depends on whether the Access table is already created or if you want to create it on the fly. Let's assume (for the sake of simplicity) that the table already exists (you can always go back and create the table). As suggested before you need some scripting tools:
Dim FSO As Object, Fo As Object, line As String   ' late binding: no reference to the Scripting runtime needed
Set FSO = CreateObject("Scripting.FileSystemObject")
Set Fo = FSO.OpenTextFile("csvfile.csv")
This will allow you to read your csv file, which is a text file. You control here which delimiter is used, which date format will be employed, etc. You also need to bring the database engine into play:
Dim db As Database
Set db = DBEngine.OpenDatabase("mydatabase.accdb")
And this is basically all what you need. You read your csv file line by line:
While Not Fo.AtEndOfStream
line = Fo.ReadLine
Now you need to be able to format each field adequately for the table, meaning: text fields must be surrounded by quote marks ("), date fields must be surrounded by #, etc. Otherwise the database engine will noisily complain. But again, you are in charge here, and you can do whatever cosmetic surgery you need on your input line. Another assumption I am making (for the sake of simplicity) is that you are comfortable programming in VBA.
So let's get to the crux of the problem:
db.Execute ("INSERT INTO myTable VALUES (" & line & ")")
Wend
At this point, line has been changed into something edible for the database engine. For example, if the original line read was
33,04/27/2019,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,Cashew,0
you changed it into
33,#04/27/2019#,1,1,0,9,23.1,10,72.3,77,85,96,95,98,10,5.4,5.5,5.1,"Cashew",0
One last note: it is important that each field of the csv file matches the corresponding one in the table. This includes at least, but not necessarily only, being in the right order. You must ensure that in your preprocessing stage. I hope this puts you on the right track.
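The same cosmetic surgery can be sketched in Python (the MM/DD/YYYY date pattern and the three rules below are simplifying assumptions, and the naive split assumes the raw line has no embedded commas):

```python
import re

# Assumed date shape: MM/DD/YYYY, as in the example line above.
DATE_RE = re.compile(r"^\d{2}/\d{2}/\d{4}$")

def format_field(field):
    """Make one field edible for the database engine: dates in #...#,
    numbers unchanged, everything else in double quotes."""
    if DATE_RE.match(field):
        return "#" + field + "#"
    try:
        float(field)              # integers and decimals pass through as-is
        return field
    except ValueError:
        return '"' + field + '"'

line = "33,04/27/2019,1,23.1,Cashew,0"
print(",".join(format_field(f) for f in line.split(",")))
# → 33,#04/27/2019#,1,23.1,"Cashew",0
```

If the csv may contain quoted commas, replace the split(",") with the csv module's reader, as in the parsing example earlier in this page.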