How to load a .csv file into an SQL database using C - sql

I have a table created in myFile.csv. I want to load this table into an SQL database. I am working in C under a Unix environment. I went through some links but didn't find any useful direction. Thanks.

I think you are referring to a CSV file instead of a CVS file. CSV stands for Comma-Separated Values. To load data from C into a database you will need C libraries for the database that allow you to run SQL INSERT statements. C isn't really well suited to this task in this day and age. Java would likely be a better bet because nearly all vendors provide JDBC drivers for this purpose. If you insist on doing this in C, you will likely be using ODBC drivers or a native library for your database on non-Windows platforms. Some information about ODBC can be found at this link.
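To give a rough idea of what the ODBC route looks like, here is a minimal sketch in C. It assumes a unixODBC setup with an already-configured DSN; the DSN name, credentials, table and column names are placeholders, and most error handling is omitted:

/* Minimal ODBC sketch: connect through a configured DSN and run one INSERT.
   "mydsn", the credentials and the table/column names are placeholders. */
#include <stdio.h>
#include <sql.h>
#include <sqlext.h>

int main(void)
{
    SQLHENV env;
    SQLHDBC dbc;
    SQLHSTMT stmt;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);

    if (!SQL_SUCCEEDED(SQLConnect(dbc, (SQLCHAR *)"mydsn", SQL_NTS,
                                  (SQLCHAR *)"user", SQL_NTS,
                                  (SQLCHAR *)"password", SQL_NTS))) {
        fprintf(stderr, "connect failed\n");
        return 1;
    }

    SQLAllocHandle(SQL_HANDLE_STMT, dbc, &stmt);
    SQLExecDirect(stmt,
        (SQLCHAR *)"INSERT INTO mytable (col1, col2) VALUES ('a', 'b')",
        SQL_NTS);

    SQLFreeHandle(SQL_HANDLE_STMT, stmt);
    SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}

With unixODBC you would typically link against the driver manager with -lodbc.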

This is not a direct answer to your question.
If you want to load a text file into an SQL database, you can usually do this with some helper program or statement from the database in question. For MySQL, this could be LOAD DATA INFILE or mysqlimport.
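If you want to trigger that from your C program rather than from the mysql command line, the same statement can be sent through the MySQL C API. This is only a sketch: the connection details, table and file name are placeholders, and LOCAL loading has to be enabled on both client and server:

/* Sketch: let the MySQL server parse the CSV via LOAD DATA LOCAL INFILE.
   Host/user/password/database, table and file name are placeholders. */
#include <stdio.h>
#include <mysql.h>

int main(void)
{
    MYSQL *conn = mysql_init(NULL);
    unsigned int enable_local = 1;

    /* allow LOAD DATA LOCAL INFILE from the client side */
    mysql_options(conn, MYSQL_OPT_LOCAL_INFILE, &enable_local);

    if (!mysql_real_connect(conn, "localhost", "user", "password",
                            "mydb", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }

    if (mysql_query(conn,
            "LOAD DATA LOCAL INFILE 'myFile.csv' INTO TABLE mytable "
            "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'"))
        fprintf(stderr, "load failed: %s\n", mysql_error(conn));

    mysql_close(conn);
    return 0;
}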

Not a real code snippet, just guidelines...
Read the file with fgets(); this will give you line-by-line output.
For each line, tokenize it using strtok_r():
char *item, *brkt;
for (item = strtok_r(line, ",", &brkt); item; item = strtok_r(NULL, ",", &brkt)) {
    /* item now points to the next comma-separated field */
}
Connect to the database and send your query, e.g. with mysql_real_connect() and mysql_query() for MySQL.
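Putting those guidelines together, here is a rough sketch using the MySQL C API. The connection details, table name and buffer sizes are placeholders, and real code would escape the values (e.g. with mysql_real_escape_string()) and handle quoted fields, which strtok_r() does not:

/* Sketch: read myFile.csv line by line and INSERT each row via the
   MySQL C API. Connection details and table name are placeholders;
   values are not escaped, and quoted/empty CSV fields are not handled. */
#include <stdio.h>
#include <string.h>
#include <mysql.h>

int main(void)
{
    FILE *fp = fopen("myFile.csv", "r");
    if (!fp) { perror("myFile.csv"); return 1; }

    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "localhost", "user", "password",
                            "mydb", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }

    char line[1024];
    while (fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\r\n")] = '\0';   /* strip the newline */
        if (line[0] == '\0') continue;        /* skip blank lines */

        /* build INSERT INTO mytable VALUES ('f1','f2',...) from the fields */
        char query[4096] = "INSERT INTO mytable VALUES (";
        char *brkt, *item;
        int first = 1;
        for (item = strtok_r(line, ",", &brkt); item;
             item = strtok_r(NULL, ",", &brkt)) {
            if (!first) strcat(query, ",");
            strcat(query, "'");
            strcat(query, item);   /* should really be escaped first */
            strcat(query, "'");
            first = 0;
        }
        strcat(query, ")");

        if (mysql_query(conn, query))
            fprintf(stderr, "insert failed: %s\n", mysql_error(conn));
    }

    fclose(fp);
    mysql_close(conn);
    return 0;
}

Note that strtok_r() collapses consecutive delimiters, so empty fields and commas inside quoted values need a more careful parser.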

Related

SQL Server - Copying data between tables where the Servers cannot be connected

We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I then run a select * from myspecialstoragetable and get a .sql file they can ship to me, which we can then import into our copy of the db with some job/SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
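If you do wrap bcp from code, the wrapper can be as simple as building the command line and shelling out to it. Here is a sketch in C (whatever language your front-end uses would work equally well); the server, database, table and output file names are placeholders, and the switches should be checked against your bcp version:

/* Sketch of wrapping bcp: export one table to a comma-delimited file.
   Server, database, table, output file and the auth switch are placeholders. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char cmd[512];

    /* -c = character mode, -t, = comma field terminator, -T = trusted auth */
    snprintf(cmd, sizeof cmd, "bcp %s out %s -S %s -T -c -t,",
             "mydb.dbo.source_table", "source_table.csv", "myserver");

    int rc = system(cmd);
    if (rc != 0)
        fprintf(stderr, "bcp failed with code %d\n", rc);
    return rc == 0 ? 0 : 1;
}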
Since it is a function within your application, in what language is the application front-end written? If it is .NET, you can use Data Transformation Services in SQL Server to do a sample export. In the last step, you could save the steps into a VB/.NET module. If necessary, modify this file to change table names etc. Integrate this DTS module into your application. While doing the sample export, export it to a suitable format such as CSV or Excel, whichever format you will be able to import into a blank database from.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data to the desired format. He can then mail that file to you.
If your application is not written in .NET, whichever language it is written in, it will have options to read data from SQL Server and dump it to a CSV or delimited text file. If it is a primitive language, you may have to do it by looping through the records, concatenating the fields of each record, and writing them to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suited if the number of records is very large.
You are probably thinking of a .sql file as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of SQL Server's interface are used for table structures/DDL rather than for generating INSERT statements with each record's values.

Import MySQL dump into R (without requiring MySQL server)

Packages like RMySQL and sqldf allow one to interface with local or remote database servers. I'm creating a portable project which involves importing sql data in cases (or on devices) which do not always have access to a running server, but which do always have access to the latest .sql dump of the database.
The goal seems simple enough: import an .sql dump into R without the involvement of a MySQL server. More specifically, I'd like to create a list of lists in which the elements correspond to any databases defined in the .sql dump (there may be multiple), and those elements in turn consist of the tables in those databases.
To make this reproducible, let's take the sample sportsdb SQL file here; if you unzip it, it's called sportsdb_sample_mysql_20080303.sql.
One would think sqldf might be able to do it:
read.csv.sql('sportsdb_sample_mysql_20080303.sql', sql="SELECT * FROM addresses")
Error in sqliteSendQuery(con, statement, bind.data) :
error in statement: no such table: addresses
This even though there certainly is a table addresses in the dump. This post on the sqldf list mentions the same error, but no solution.
Then there is an sql.reader function in the package ProjectTemplate, which looks promising. Poking around, the source for the function can be found here, and it assumes a running database server and relies on RMySQL — not what I need.
So... we seem to be running out of options. Any help from the hivemind appreciated!
(To reiterate, I am not looking for a solution that relies on access to an SQL server; that's easy with dbReadTable from the RMySQL package. I would very much like to bypass the server and get the data straight from the .sql dump file.)
Depending on what you want to extract from the table, here is how you can play around with the data:
numLines <- R.utils::countLines("sportsdb_sample_mysql_20080303.sql")
# [1] 81266
linesInDB <- readLines("sportsdb_sample_mysql_20080303.sql",n=60)
Then you can do some regex to get the table names (after CREATE TABLE), the column names (between the first pair of brackets) and the VALUES (lines after CREATE TABLE and between the second pair of brackets).
Reference:
Reverse engineering a mysqldump output with MySQL Workbench gives "statement starting from pointed line contains non UTF8 characters" error
EDIT: in response to the OP's answer, if I interpret the Python script correctly, it also reads the file line by line, filters for INSERT INTO lines, parses them as CSV, then writes them to file. This is very similar to my original suggestion. My version in R is below. If the file size is too large, it would be better to read the file in chunks using some other R package.
options(stringsAsFactors=F)
library(utils)
library(stringi)
library(plyr)
mysqldumpfile <- "sportsdb_sample_mysql_20080303.sql"
#read the whole dump and keep only the INSERT INTO lines
allLines <- readLines(mysqldumpfile)
insertLines <- allLines[which(stri_detect_fixed(allLines, "INSERT INTO"))]
#split each INSERT line into words; shorter rows are padded with NA
allwords <- data.frame(stri_extract_all_words(insertLines, simplify=TRUE))
d_ply(allwords, .(X3), function(x) {
#x <- split(allwords, allwords$X3)[["baseball_offensive_stats"]]
print(x[1,3])
#find where the header/data columns start and end
valuesCol <- which(x[1,]=="VALUES")
lastCols <- which(apply(x, 2, function(y) all(is.na(y))))
datLastCol <- head(c(lastCols, ncol(x)+1), 1) - 1
#format and prepare for write to file
df <- data.frame(x[,(valuesCol+1):datLastCol])
df <- setNames(df, x[1,4:(valuesCol-1)])
#type convert before writing to file otherwise its all strings
df[] <- apply(df, 2, type.convert)
#write to file
write.csv(df, paste0(x[1,3],".csv"), row.names=F)
})
I don't think you will find a way to import a sql dump (which contains multiple tables with references) and then perform arbitrary sql queries on them within R. This would basically require the R package to run a complete database server (compatible with the one creating the dump) within R.
I would suggest exporting the tables/select statements you need as CSV from your database (see here). If you can only work from the dump and don't want to set up a server for the conversion, you could use some simple regular expressions to turn the insert statements in your dump into a bunch of CSV files for the tables, using a tool of your choosing like sed or awk (or even R, as suggested by the other answer, but that might be rather slow for this file size).
I'll reluctantly answer my own question, using the input from +bnord and +chinsoon12 (who both contributed pieces of the puzzle).
Short answer: there is no out-of-the-box solution. As +bnord notes, it would be preferable to fix this server-side (e.g., by exporting to CSV format with mysqldump). However, as my question indicated, I'm looking for a solution that allows me to work with the sql dump, bypassing the server.
So if we have to work with the dump, how? The hardcore, manual way is to use regular expressions to convert INSERT statements to CSV, either (1) outside R using sed and awk on the .sql text file (+bnord), or (2) inside R with grep and gsub on strings loaded with readLines (+chinsoon12).
Some good soul wrote a python script that can convert sql dumps to CSV. This requires yet another piece of (potentially non-trivial to install/maintain) software, so it's not the answer I was hoping for, but it does look like a good model in case anyone wants to reinvent the wheel in R.
For now I'll stick with my modus operandi of (on Windows) running MySQL Community Server and using Workbench to import the dump, then talking to the local server from R. A very indirect method that is a pain in the ass because of the inscrutable access rights system of MySQL (especially annoying since it's all just there in an ASCII text file), but the only way for now, it seems. Thanks all for your input!
(If a better, more complete answer comes along I'll gladly accept that, turning this into a comment if possible.)

Generate DDL SQL create table statement after scanning CSV file

Are there any command line tools (Linux, Mac, and/or Windows) that I could use to scan a delimited file and output a DDL create table statement with the data type determined for me?
Did some googling, but couldn't find anything. Was wondering if others might know, thanks!
DDL-generator can do this. It can generate DDLs for YAML, JSON, CSV, Pickle and HTML (although I don't know how the last one works). I just tried it on some data exported from Salesforce and it worked pretty well. Note that you need to use it with Python 3; I could not get it to work with Python 2.7.
You can also try https://github.com/mshanu/idli. It can take a CSV file as input and generate a create statement with appropriate types. It can generate for MySQL, Oracle and Postgres. I am actively working on this and am happy to receive feedback for future improvements.
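If you would rather roll your own, the core of what such tools do is simple type sniffing: take the column names from the header row, try to parse every value in a column as a number, and fall back to a text type otherwise. Here is a rough sketch in C; the emitted table name and SQL type names are arbitrary choices, and it assumes a headered, comma-delimited file with no quoted fields:

/* Sketch: naive DDL generation from a headered, comma-delimited CSV with
   no quoted fields. Table name and SQL type names are arbitrary choices. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAXCOLS 64

static int is_int(const char *s)    { char *e; strtol(s, &e, 10); return *s && *e == '\0'; }
static int is_double(const char *s) { char *e; strtod(s, &e);     return *s && *e == '\0'; }

int main(int argc, char **argv)
{
    if (argc < 2) { fprintf(stderr, "usage: %s file.csv\n", argv[0]); return 1; }
    FILE *fp = fopen(argv[1], "r");
    if (!fp) { perror(argv[1]); return 1; }

    char line[4096], names[MAXCOLS][128];
    int numeric[MAXCOLS], integer[MAXCOLS], ncols = 0;

    /* header row gives the column names; start by assuming every column is numeric */
    if (!fgets(line, sizeof line, fp)) return 1;
    line[strcspn(line, "\r\n")] = '\0';
    for (char *brkt, *tok = strtok_r(line, ",", &brkt); tok && ncols < MAXCOLS;
         tok = strtok_r(NULL, ",", &brkt)) {
        snprintf(names[ncols], sizeof names[ncols], "%s", tok);
        numeric[ncols] = integer[ncols] = 1;
        ncols++;
    }

    /* scan the data rows and demote a column's type when a value doesn't parse */
    while (fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\r\n")] = '\0';
        int col = 0;
        for (char *brkt, *tok = strtok_r(line, ",", &brkt); tok && col < ncols;
             tok = strtok_r(NULL, ",", &brkt), col++) {
            if (!is_int(tok))    integer[col] = 0;
            if (!is_double(tok)) numeric[col] = 0;
        }
    }

    printf("CREATE TABLE my_table (\n");
    for (int i = 0; i < ncols; i++)
        printf("  %s %s%s\n", names[i],
               integer[i] ? "INT" : numeric[i] ? "DOUBLE" : "VARCHAR(255)",
               i < ncols - 1 ? "," : "");
    printf(");\n");

    fclose(fp);
    return 0;
}

Compile it and pass the CSV file as the first argument; it prints the CREATE TABLE statement to standard output.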

Insert file into access table

I have a table named Reports which has 3 fields ID (auto number), filename (string field), theFile (attachment field).
What I want to is to run a SQL query and insert a PDF file into the attachments field (theFile).
Let's say the PDF file is located on the C: drive (C:\report1.pdf). I have tried the SQL query below but it is not working. I know it's not good practice to store files in a database, but I just want to try it out:
CurrentDb.Execute "INSERT INTO Reports (filename,theFile) VALUES ('report1'," & C:\report1.pdf & ")"
It's standard practice to store files in a database. Access certainly supports it, but not through SQL. You'll have to use DAO, as detailed at http://msdn.microsoft.com/en-us/library/office/bb258184%28v=office.12%29.aspx
"File" is not an SQL data type supported in Access; see the list of available data types.
That is correct, Derek; if you try to run an SQL statement like that you will get an error message of one type or another every time. I spent a fair amount of time researching this subject for my own DB, and from what I understand there are a number of options/alternatives; however, having an attachment column type and using SQL to insert a file is not an option with Access' current capabilities.
It is not bad practice to store files in a database; it is actually standard practice. However, it IS best practice not to store files in an Access DB. There are a few reasons for this which you can research on your own, but perhaps most notably, Access has a file size limit of 2GB, so if you store files in it you can run out of space quickly, and then things get even more complicated.
Here are your options:
Change your column data type to OLE object and use some kind of stream reader to convert the files to binary data, then use a SQL statement to load them into your DB
Use the built in Access user interface for working directly with tables/attachments
Establish a DAO db connection and use Access' recordset.LoadFromFile function
Just store links to the files in the Access DB
The 4th option is the preferred method. It's very simple and you won't have to worry about complex code or the 2GB storage limit.

What is the easiest way to add a bunch of content to a SQL database?

Nothing technical here. Suppose I have a lot of different categorized data, and I would like to create a database out of it. Would someone literally hand plug in all that info with SQL code itself? Or do some people make a mock website just to input data? What are some of your strategies?
If there were no way to do it automatically, then a mock website would be the way to go: you could even use it with several people at once, actually multiplying the input speed (as long as you manage to assign each of them a different part of the data).
What format is your data in, and how much of it is there? If it's Excel, then SQL Server has tools to import it. I'm not sure if MySQL has anything similar. Even if it doesn't, one other technique I have used with Excel data is to use a formula to concatenate the values as required to generate the INSERT statements. Then just paste those into a query window and run them.
I wouldn't do a website for it unless I was building an admin site for it already and wanted to test that with the initial load.
Most databases have a way to do bulk inserts or have tools for data import.
My strategies normally involve such tools.
Here is an example of importing a CSV file to SQL Server.
Most database servers provide a way to import data from a variety of formats; you could look into that first.
If not, you could write a simple script or console application to parse your input data and write out a SQL script that inserts the data into the appropriate tables.
For example, if your data were in a CSV file, you would parse each line of the file and generate an insert statement to write out to a .sql file (see the C sketch after the example below).
MyData.csv
1,2,3,'Test',4
2,3,4,'Test2',6
GeneratedInsert.sql
insert into table (col1,col2,col3,col4,col5) values (1,2,3,'Test',4)
insert into table (col1,col2,col3,col4,col5) values (2,3,4,'Test2',6)
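A bare-bones version of such a console program in C might look like the sketch below. The table and column names are hard-coded to match the example, and the fields are assumed to already be valid SQL literals (quoted strings, no embedded commas), as in MyData.csv above:

/* Sketch: turn each line of MyData.csv into an INSERT statement in
   GeneratedInsert.sql. Table/column names are hard-coded and the fields
   are assumed to already be valid SQL literals, as in the example above. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *in  = fopen("MyData.csv", "r");
    FILE *out = fopen("GeneratedInsert.sql", "w");
    if (!in || !out) { perror("open"); return 1; }

    char line[1024];
    while (fgets(line, sizeof line, in)) {
        line[strcspn(line, "\r\n")] = '\0';
        if (line[0] == '\0') continue;          /* skip blank lines */
        fprintf(out, "insert into table (col1,col2,col3,col4,col5) values (%s)\n",
                line);
    }

    fclose(in);
    fclose(out);
    return 0;
}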
MySQL has a statement LOAD DATA INFILE that is intended for loading bulk data from flat files. It's easy to use and much faster than alternative methods.
But first you do have to use SQL to design tables with fields that match the fields of your import data. That is, if you have some file with delimited data (semicolon-separated in this example):
Titanic;1997;4 stars
Batman Begins;2005;5 stars
"Harry Potter and the Sorcerer's Stone";2001;3 stars
...
You would create a table:
CREATE TABLE Movies (
title VARCHAR(100) NOT NULL,
year YEAR NOT NULL,
rating VARCHAR(10)
);
Then load data:
LOAD DATA INFILE 'movies.txt' INTO TABLE Movies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"';
Most web languages have some sort of auto-scaffolding that you can quickly set up. Useful for admin work as well, if your site is hosted without direct access to the DB.
Otherwise, yeah - write the SQL statements. Useful to bring a database up as part of your build process.