Generate SQL insert script from Excel worksheet - sql

I have a large excel worksheet that I want to add to my database.
Can I generate an SQL insert script from this excel worksheet?

I think importing using one of the methods mentioned is ideal if it truly is a large file, but you can use Excel to create insert statements:
="INSERT INTO table_name VALUES('"&A1&"','"&B1&"','"&C1&"')"
In MS SQL you can use:
SET NOCOUNT ON
To forgo showing all the '1 row affected' messages. And if you are doing a lot of rows and it errors out, put a GO between statements every once in a while.
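A generated script might then look like this (table and values are illustrative):
SET NOCOUNT ON
INSERT INTO table_name VALUES('a1','b1','c1')
INSERT INTO table_name VALUES('a2','b2','c2')
GO
INSERT INTO table_name VALUES('a3','b3','c3')
INSERT INTO table_name VALUES('a4','b4','c4')
GO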

You can create an appropriate table through the Management Studio interface and insert the data into the table directly (the edit-rows grid lets you paste straight from Excel). It may take some time depending on the amount of data, but it is very handy.

There is a handy tool which saves a lot of time at
http://tools.perceptus.ca/text-wiz.php?ops=7
You just have to feed in the table name, field names and the data - tab separated - and hit Go!

You can use the following Excel formula:
="INSERT INTO table_name(`"&$A$1&"`,`"&$B$1&"`,`"&$C$1&"`, `"&$D$1&"`) VALUES('"&SUBSTITUTE(A2, "'", "\'")&"','"&SUBSTITUTE(B2, "'", "\'")&"','"&SUBSTITUTE(C2, "'", "\'")&"', "&D2&");"
This improves upon Hart CO's answer as it takes the column names into account and avoids syntax errors caused by single quotes in the data (note that the backtick identifiers and backslash escaping are MySQL syntax). The final column is an example of a numeric value column, written without quotes.
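For a hypothetical row John | O'Brien | NY | 42, with matching headers in row 1, the formula would emit something like:
INSERT INTO table_name(`first_name`,`last_name`,`state`, `qty`) VALUES('John','O\'Brien','NY', 42);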

Depending on the database, you can export to CSV and then use an import method.
MySQL - http://dev.mysql.com/doc/refman/5.1/en/load-data.html
PostgreSQL - http://www.postgresql.org/docs/8.2/static/sql-copy.html
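For example (file paths and table names are placeholders):
-- MySQL
LOAD DATA INFILE '/tmp/data.csv' INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n';
-- PostgreSQL
COPY my_table FROM '/tmp/data.csv' WITH CSV;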

Use the ConvertFrom-ExcelToSQLInsert cmdlet from the ImportExcel module in the PowerShell Gallery:
NAME
ConvertFrom-ExcelToSQLInsert
SYNTAX
ConvertFrom-ExcelToSQLInsert [-TableName] <Object> [-Path] <Object>
[[-WorkSheetname] <Object>] [[-HeaderRow] <int>]
[[-Header] <string[]>] [-NoHeader] [-DataOnly] [<CommonParameters>]
PARAMETERS
-DataOnly
-Header <string[]>
-HeaderRow <int>
-NoHeader
-Path <Object>
-TableName <Object>
-WorkSheetname <Object>
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
ALIASES
None
REMARKS
None
EXAMPLE
ConvertFrom-ExcelToSQLInsert MyTable .\testSQLGen.xlsx

You could use VB to write something that outputs to a file row by row, adding the appropriate SQL statements around your data. I have done this before.

Here is another tool that works very well...
http://www.convertcsv.com/csv-to-sql.htm
It can take tab-separated values and generate an INSERT script. Just copy and paste, and in the options under step 2 check the box "First row is column names".
Then scroll down and under step 3, enter your table name in the box "Schema.Table or View Name:"
Pay attention to the delete and create table check boxes as well, and make sure you examine the generated script before running it.
This is the quickest and most reliable way I've found.

You can use the C# method below to generate insert scripts from an Excel sheet; you just need to install the OfficeOpenXml (EPPlus) package from the NuGet Package Manager before running it.
// Requires the EPPlus (OfficeOpenXml) NuGet package.
using System;
using System.IO;
using System.Linq;
using System.Text;
using OfficeOpenXml;

public string GenerateSQLInsertScripts()
{
    var outputQuery = new StringBuilder();
    var tableName = "Your Table Name";
    var filePath = @"D:\FileName.xlsx";
    if (File.Exists(filePath))
    {
        using (ExcelPackage xlPackage = new ExcelPackage(new FileInfo(filePath)))
        {
            var myWorksheet = xlPackage.Workbook.Worksheets.First(); // select the first sheet here
            var totalRows = myWorksheet.Dimension.End.Row;
            var totalColumns = myWorksheet.Dimension.End.Column;

            // Build the column list from the header row (row 1).
            var columns = new StringBuilder();
            var columnRows = myWorksheet.Cells[1, 1, 1, totalColumns]
                .Select(c => c.Value == null ? string.Empty : c.Value.ToString());
            columns.Append("INSERT INTO [" + tableName + "] (");
            foreach (var colrow in columnRows)
            {
                columns.Append("[").Append(colrow).Append("],");
            }
            columns.Length--; // drop the trailing comma
            columns.Append(") VALUES (");

            for (int rowNum = 2; rowNum <= totalRows; rowNum++) // data starts on row 2
            {
                var dataRows = myWorksheet.Cells[rowNum, 1, rowNum, totalColumns]
                    .Select(c => c.Value == null ? string.Empty : c.Value.ToString());
                var finalQuery = new StringBuilder();
                finalQuery.Append(columns.ToString());
                foreach (var dataRow in dataRows)
                {
                    // Quote every value as a string and escape embedded single quotes.
                    finalQuery.Append("'").Append(dataRow.Replace("'", "''")).Append("',");
                }
                finalQuery.Length--; // drop the trailing comma
                finalQuery.AppendLine(");");
                outputQuery.Append(finalQuery);
            }
        }
    }
    return outputQuery.ToString();
}
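Against a hypothetical sheet with header columns Id and Name, the method emits one statement per data row, along the lines of:
INSERT INTO [Your Table Name] ([Id],[Name]) VALUES ('1','Alpha');
INSERT INTO [Your Table Name] ([Id],[Name]) VALUES ('2','Beta');
Note that it quotes every value as a string; adjust it if your table has strictly numeric columns.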

Here is a link to an online converter that turns CSV files into SQL INSERT INTO statements:
CSV-to-SQL

This is the formula I generated for inserting Excel data into the database. Here id and price are numeric values and date is a date field; it covers all the types I required, so it may be useful to you as well:
="insert into product (product_id,name,date,price) values("&A1&",'" &B1& "','" &C1& "'," &D1& ");"
Id Name Date price
7 Product 7 2017-01-05 15:28:37 200
8 Product 8 2017-01-05 15:28:37 40
9 Product 9 2017-01-05 15:32:31 500
10 Product 10 2017-01-05 15:32:31 30
11 Product 11 2017-01-05 15:32:31 99
12 Product 12 2017-01-05 15:32:31 25
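Dragged down the sheet, the formula yields one statement per row; for the first product it produces:
insert into product (product_id,name,date,price) values(7,'Product 7','2017-01-05 15:28:37',200);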

I often had to make SQL scripts, add them to source control, and send them to the DBA.
I used this ExcelIntoSQL app from the Windows Store: https://www.microsoft.com/store/apps/9NH0W51XXQRM
It creates a complete script with "CREATE TABLE" and INSERTs.

I have a reliable way to generate SQL inserts in batches, and you can modify some of the values during processing. It has helped me a lot at work, for example when copying hundreds of rows to a database with an incompatible structure and column count.
I use IntelliJ DataGrip, a powerful tool.
DataGrip can receive data copied from WPS Office or MS Excel by column or by row.
After copying, DataGrip can export the data as SQL inserts.

Related

Writing an existing Excel file from db but with different headers

I have a problem that I can't solve even though I've looked everywhere for a solution. I have an Excel file that has to be written from a table in my db; the problem is that the headers in my db are named differently from the headers in my Excel file, and I need to keep them like that. Another problem is that the indexes of the columns are also different... It would be great if I could make a query for each column and write the result to the Excel file by indicating the column's index, something like this:
Dim column1 As String = "SELECT [columnName] from [tableName] "
and then
Sheets("SheetName").Columns("ColumnName").value = column1
Obviously it doesn't work; it's just to let you know my idea...
Build your SQL query to match the order of the Excel file and then fill in your result like
For i = 0 To table.Columns.Count - 1
    Sheets("SheetName").Cells(1, i + 1).Value = table.Columns(i).ColumnName
Next i
and add another loop over table.Rows to fill in the values.

Insert Into SQL not working

I have a table named cpns with the fields C_Bk_No (coupon book number), St_No (starting coupon number) and End_No (number of the last coupon), all integers.
I have initialized the table with the first record as 1, 1, 25.
I am trying to get the system to insert new rows with C_Bk_No + 1, St_No + 25 and End_No + 25 as the new record on clicking a button (Command13) on the form.
Thus, the expected second record should have cpn_bk_no = 2, St_No = 26, End_No = 50.
I am not sure why the following SQL is not working:
Private Sub Command13_Click()
Dim Sql As String
Dim CbkNo As Long
Dim StNo As Long
Dim EndNo As Long
CbkNo = Me![C_bk_No].Value + 1
StNo = Me![St_No].Value + 25
EndNo = Me![End_No].Value + 25
Sql = "Insert Into cpns ([C_bk_No], [St_No], [End_No]); Values (CBkNo, StNo, EndNo))"
CurrentDb.Execute Sql
End Sub
Every time I click the button, it says "Run time error 3061, Too few parameters: Expected 3." and the line "CurrentDb.Execute Sql" is highlighted in yellow.
Please can anyone help?
It's difficult to know where to begin with this, but the basic problem you're facing is that you've typed the names of your variables into the string. Unlike some other languages, VB doesn't look at your string contents and think "oh, he's typed a variable name into that string, I'll just swap it for the value currently in the variable".
There are other problems with this code too, but not quite so fundamental as that. I'd genuinely recommend you throw all that code away and follow this tutorial instead, about how to access a database in one of the ways Microsoft recommends:
https://msdn.microsoft.com/en-us/library/ms171884.aspx
Even if you're not making a WinForms app, the concepts inside can be applied to all kinds of apps.
This semicolon is wrong. Semicolons are used to separate SQL statements from each other:
Insert Into cpns ([C_bk_No], [St_No], [End_No]) **;** Values (CBkNo, StNo, EndNo)
Remove it; a single statement takes at most one semicolon, at the very end. Note that, as the other answer explains, you must also concatenate the values of your VBA variables into the string, otherwise CurrentDb.Execute still treats CBkNo, StNo and EndNo as unresolved parameters:
Sql = "Insert Into cpns ([C_bk_No], [St_No], [End_No]) Values (" & CbkNo & ", " & StNo & ", " & EndNo & ");"
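With the values from the question substituted in, the string handed to CurrentDb.Execute then reads as plain SQL:
Insert Into cpns ([C_bk_No], [St_No], [End_No]) Values (2, 26, 50);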

How to delete large amounts of data from Foxpro

The problem
We have a legacy Visual FoxPro reservation system with many tables. I have been asked to do some housekeeping on the tables to reduce their size.
The tables are badly designed with no auto incrementing primary key.
The largest table is 3 million rows.
I am attempting to delete 380,000 rows.
Due to the volume of data in the tables, I am trying to develop a solution which batch deletes.
What I've got so far
I have created a C# application which accesses the database files via the vfpoledb.1 driver. This application uses recno() to batch the deletion. This is an example of the query I'm using:
delete from TableA
where TableA.Key in (
select Key from TableB
where Departure < date(2010,01,01) and Key <> ""
) and recno() between 1 and 10000
Executing this via vfpoledb.1 does not delete anything. Executing a select statement with the same where clause does not return anything.
It seems that the combination of the recno() function and an in() function is causing the issue. Testing the query with each clause in turn returns results.
Questions
Is there another way of batch deleting data from Visual FoxPro?
Why are recno() and in() not compatible?
Is there anything else I'm missing?
Additional information
ANSI is set to TRUE
DELETED is set to TRUE
EXCLUSIVE is set to TRUE
Instead of deleting in batches of so many record numbers, why not a simpler approach? You are looking to kill off everything prior to some date (2010-01-01).
Why not start with 2009-12-31 and keep working backwards to the earliest date on file you are trying to purge? Also note, I don't know if Departure is a date vs a datetime, so I changed the condition to
TTOD( Departure ) (meaning: convert the datetime to just its date component)
DateTime purgeDate = new DateTime(2009, 12, 31);
// the "?" is a parameter place-holder in the query
string SQLtxt = "delete from TableA "
+ " where TableA.Key in ( "
+ " select Key from TableB "
+ " where TTOD( Departure ) = ? and Key <> \"\" )"; // '=' so each pass only deletes one day
OleDbCommand oSQL = new OleDbCommand( SQLtxt, YourOleDbConnectionHandle );
// default the "?" parameter place-holder
oSQL.Parameters.AddWithValue( "parmDate", purgeDate );
int RecordsDeleted = 0;
while( purgeDate > new DateTime(2000,1,1) )
{
// always re-apply the updated purge date for deletion
oSQL.Parameters[0].Value = purgeDate;
RecordsDeleted += oSQL.ExecuteNonQuery();
// keep going back one day at a time...
purgeDate = purgeDate.AddDays(-1);
}
This way, it does not matter what RECNO() you are dealing with; each pass only touches the keys for that particular day. If you have more than 10,000 entries for a single day, then I might approach it differently, but since this is more of a one-time cleanup, I would not be too concerned with doing 1000+ iterations (365 days per year for however many years) through the data. Alternatively, you could work with a date range and delete maybe a week per pass: just change the WHERE clause and adjust the parameters, something like the following. (The date of 1/1/2000 is just a guess for how far back the data goes.) Also, since this version spans entire days, there is no need to convert the Departure field with TTOD().
DateTime purgeDate = new DateTime(2009, 12, 31);
DateTime lessThanDate = new DateTime( 2010, 1, 1 );
// the "?" is a parameter place-holder in the query
string SQLtxt = "delete from TableA "
+ " where TableA.Key in ( "
+ " select Key from TableB "
+ " where Departure >= ? "
+ " and Departure < ? "
+ " and Key <> \"\" )";
OleDbCommand oSQL = new OleDbCommand( SQLtxt, YourOleDbConnectionHandle );
// default the "?" parameter place-holder
oSQL.Parameters.AddWithValue( "parmDate", purgeDate );
oSQL.Parameters.AddWithValue( "parmLessThanDate", LessThanDate );
int RecordsDeleted = 0;
while( purgeDate > new DateTime(2000,1,1) )
{
// always re-apply the updated purge date for deletion
oSQL.Parameters[0].Value = purgeDate;
oSQL.Parameters[1].Value = lessThanDate;
RecordsDeleted += oSQL.ExecuteNonQuery();
// keep going back one WEEK at a time for both the starting and less than end date of each pass
purgeDate = purgeDate.AddDays(-7);
lessThanDate = lessThanDate.AddDays( -7);
}
I'm interested in the best way to accomplish this too. We use a lot of poorly designed dBase III files that are sometimes quite large.
We do this a lot, but it's a nasty, manual process:
1. Import the DBF files into a temp database using DTS (the Management Studio Import/Export Wizard for version 2005+).
2. Run the cleanup scripts using SSMS.
3. Export the DBF files and replace the original ones (backing them up) with the newly modified files.
It works for us.
It looks like your condition on the date isn't working. Try executing the SELECT statement using the CTOD() function instead of the DATE() you've used.
Once your condition works, you'll be able to run the DELETE statement. But remember that as a result of the DELETE execution the rows will only be marked as deleted; to remove them completely you should run the PACK statement after DELETE.
Alternatively, you can also try our DBF editor, DBF Commander Professional. It allows you to execute SQL queries, including in command-line (batch) mode. E.g.:
dbfcommander. exe -q "DELETE FROM 'D:\table_name.dbf' WHERE RECNO()<10000"
dbfcommander. exe -q "PACK 'D:\table_name.dbf'"
You can use it for free within 20 days full-featured trial period.

Using a VBA array in a SQL statement

I am trying to write some code that uses SQL to delete rows from several tables.
A user would type numbers separated by commas into a textbox, and that list is used in the WHERE clause of a SQL DELETE statement.
I have managed to split the string into a variant array and now I want to insert it into my SQL statement.
How do I insert the variable into the SQL statement and have it run through every element of the array?
EDIT: A bit more digging has taught me about For Each...Next statements. This is probably what I'm looking for.
I suggest you build your query in VBA, then your list of numbers can be an IN statement:
sSQL = "DELETE FROM table WHERE ID In (" & MyList & ")"
Where MyList = "1,2,3,4" or such like.
As you can see, you do not need an array, and a textbox would be more suitable than a combobox. If you wish to allow your users to select by, say, a name, then a listbox is useful: you can iterate through the selected items in the listbox and build a string of IDs to be used in the Delete statement. (MS Access 2007 - Cycling through values in a list box to grab id's for a SQL statement)
You can then execute the sql against an instance of a database. For example:
Dim db As Database
Set db = CurrentDB
db.Execute sSQL, dbFailOnError
MsgBox "You deleted " & db.RecordsAffected & " records."
A generic approach:
WHERE ',' + Array + ',' LIKE '%,' + col + ',%'
Wrapping both the comma-separated list and the column value in commas matches every number available in your Array.
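For instance, with the array flattened to the string '1,2,3' (hypothetical values) and an integer column id, the T-SQL form would be:
SELECT * FROM my_table
WHERE ',' + '1,2,3' + ',' LIKE '%,' + CAST(id AS VARCHAR(10)) + ',%';
The haystack ',1,2,3,' contains ',2,' but not ',22,', so partial matches are avoided.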
You could make it simple and build up a string, something like:
StringBuilder sb = new StringBuilder("DELETE FROM YOURTABLE WHERE ");
foreach (string st in stringArray)
{
    sb.Append("YOURFIELD='" + st + "'");
    // If it is not the last element, add an "OR"
    if (st != stringArray[stringArray.Length - 1]) sb.Append(" OR ");
}
// Now you have a string like
// DELETE FROM YOURTABLE WHERE YOURFIELD='hello' OR YOURFIELD='YES'
// ... do something with the command
This method will fail if you want to run a SQL query on two (or more) columns using array values from two different arrays, e.g.
where col1=array1(i) and col2=array2(i)

Generate insert SQL statements from a CSV file

I need to import a csv file into Firebird and I've spent a couple of hours trying out some tools and none fit my needs.
The main problem is that all the tools I've been trying like EMS Data Import and Firebird Data Wizard expect that my CSV file contains all the information needed by my Table.
I need to write some custom SQL in the insert statement: for example, I have a CSV file with the city name, but as my database already has all the cities in another table (normalized), I need to write a subselect in the insert statement to look up the city and write its ID. I also have a stored procedure to create GUIDs.
My insert statement would be something like this:
INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES((SELECT NEW_GUID FROM CREATE_GUID), :NAME, (SELECT CITY_ID FROM CITY WHERE NAME = :CITY_NAME))
How can I approach this?
It's a bit crude - but for one-off jobs, I sometimes use Excel.
If you import the CSV file into Excel, you can create a formula which creates an INSERT statement by using string concatenation in the formula. So - if your CSV file has 3 columns that appear in columns A, B, and C in Excel, you could write a formula like...
="INSERT INTO MyTable (Col1, Col2, Col3) VALUES (" & A1 & ", " & B1 & ", " & C1 & ")"
Then you can replicate the formula down all of your rows, then copy and paste the results into a text file to run against your database.
Like I say - it's crude - but it can be quite a 'quick and dirty' way of getting a job done!
Well, if it's a CSV and this is a one-time process, open up the file in Excel, write formulas to populate your data in any way you desire, and then write a simple concatenation formula to construct your SQL. Copy that formula for every row and you will get a large number of SQL statements which you can execute anywhere you want.
Fabio,
I've done what Vaibhav has done many times, and it's a good "quick and dirty" way to get data into a database.
If you need to do this a few times, or on some type of schedule, then a more reliable way is to load the CSV data "as-is" into a work table (e.g. customer_dataload) and then use standard SQL statements to populate the missing fields.
(I don't know Firebird syntax - but something like...)
UPDATE person
SET id = (SELECT newguid() FROM createguid)
UPDATE person
SET cityid = (SELECT cityid FROM cities WHERE person.cityname = cities.cityname)
etc.
Usually, it's much faster (and more reliable) to get the data INTO the database and then fix the data than to try to fix the data during the upload. You also get the benefit of transactions to allow you to ROLLBACK if it does not work!!
I'd do this with awk.
For example, if you had this information in a CSV file:
Bob,New York
Jane,San Francisco
Steven,Boston
Marie,Los Angeles
The following command will give you what you want, run in the same directory as your CSV file (named name-city.csv in this example).
$ awk -F, '{ print "INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES ((SELECT NEW_GUID FROM CREATE_GUID), '\''"$1"'\'', (SELECT CITY_ID FROM CITY WHERE NAME = '\''"$2"'\''))" }' name-city.csv
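The first line of the sample file, for instance, comes out as:
INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES ((SELECT NEW_GUID FROM CREATE_GUID), 'Bob', (SELECT CITY_ID FROM CITY WHERE NAME = 'New York'))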
Type awk --help for more information.
Two online tools which helped me in 2020:
https://numidian.io/convert/csv/to/sql
https://www.convertcsv.com/csv-to-sql.htm
The second one is based on JS and does not upload your data (at least not at the time I am writing this)
You could import the CSV file into a database table as is, then run an SQL query that does all the required transformations on the imported table and inserts the result into the target table.
Assuming the CSV file is imported into temp_table with columns n, city_name:
insert into target_table
select t.n, c.city_id as city
from temp_table t, cities c
where t.city_name = c.city_name
Nice tip about using Excel, but I also suggest getting comfortable with a scripting language like Python, because for some tasks it's easier to write a quick Python script that does the job than to hunt for the function you need in Excel or a pre-made tool.
You can use the free csvsql to do this.
Install it (it's part of the csvkit package).
Now run a command like so to import your data into your database. More details at the links above, but it'd be something like:
csvsql --db firebird:///d=mydb --insert mydata.csv
The following works with sqlite, and is what I use to convert data into an easy to query format
csvsql --db sqlite:///dump.db --insert mydata.csv
Use the CSV file as an external table. Then you can use SQL to copy the data from the external table to your destination table - with all the possibilities of SQL.
See http://www.firebirdsql.org/index.php?op=useful&id=netzka
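A minimal sketch, assuming a fixed-width layout (Firebird external files are fixed-length records, so each field must be padded; the table and procedure names come from the question, the widths are made up):
CREATE TABLE ext_person EXTERNAL FILE 'C:\data\person.txt' (
  NAME      CHAR(30),
  CITY_NAME CHAR(30),
  CRLF      CHAR(2)  /* consumes the line break */
);

INSERT INTO PERSON (ID, NAME, CITY_ID)
SELECT (SELECT NEW_GUID FROM CREATE_GUID),
       TRIM(e.NAME),
       (SELECT CITY_ID FROM CITY WHERE NAME = TRIM(e.CITY_NAME))
FROM ext_person e;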
I just finished this VBA script which might be handy for this purpose. All you should need to do is change the Insert statement to include the table in question and the list of columns (obviously in the same sequence they appear in the Excel file).
Function CreateInsertStatement()
'Output file location and start of the insert statement
SQLScript = "C:\Inserts.sql"
cStart = "Insert Into Holidays (HOLIDAY_ID, NAT_HOLDAY_DESC, NAT_HOLDAY_DTE) Values ("
'Open file for output
Open SQLScript For Output As #1
Dim LoopThruRows As Boolean
Dim LoopThruCols As Boolean
nCommit = 1 'Commit Count
nCommitCount = 100 'The number of rows after which a commit is performed
LoopThruRows = True
nRow = 1 'Current row
While LoopThruRows
nRow = nRow + 1 'Start at second row - presuming there are headers
nCol = 1 'Reset the columns
If Cells(nRow, nCol).Value = Empty Then
Print #1, "Commit;"
LoopThruRows = False
Else
If nCommit = nCommitCount Then
Print #1, "Commit;"
nCommit = 1
Else
nCommit = nCommit + 1
End If
cLine = cStart
LoopThruCols = True
While LoopThruCols
If Cells(nRow, nCol).Value = Empty Then
cLine = cLine & ");" 'Close the SQL statement
Print #1, cLine 'Write the line
LoopThruCols = False 'Exit the cols loop
Else
If nCol > 1 Then 'add a preceding comma for all bar the first column
cLine = cLine & ", "
End If
If Right(Left(Cells(nRow, nCol).Value, 3), 1) = "/" Then 'Format for dates
cLine = cLine & "TO_DATE('" & Cells(nRow, nCol).Value & "', 'dd/mm/yyyy')"
ElseIf IsNumeric(Left(Cells(nRow, nCol).Value, 1)) Then 'Format for numbers
cLine = cLine & Cells(nRow, nCol).Value
Else 'Format for text, including apostrophes
cLine = cLine & "'" & Replace(Cells(nRow, nCol).Value, "'", "''") & "'"
End If
nCol = nCol + 1
End If
Wend
End If
Wend
Close #1
End Function
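With its hard-coded Holidays columns, the script writes lines like the following (illustrative values):
Insert Into Holidays (HOLIDAY_ID, NAT_HOLDAY_DESC, NAT_HOLDAY_DTE) Values (1, 'New Year''s Day', TO_DATE('01/01/2017', 'dd/mm/yyyy'));
Commit;
Numbers are left unquoted, dates are wrapped in TO_DATE, and apostrophes in text are doubled.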
Option 1:
Have you tried IBExpert? IBExpert \ Tools \ Import Data (Trial or Customer Version).
Option 2:
1. Upload your CSV file to a temporary table with F_BLOBLOAD.
2. Create a stored procedure which uses 3 functions (f_stringlength, f_strcopy, f_MID) to walk the whole string, pulling out your fields to build your INSERT INTO.
Links:
F_BLOBLOAD: http://freeadhocudf.org/documentation_english/dok_eng_file.html
String functions: http://freeadhocudf.org/documentation_english/dok_eng_string.html
A tool I recently tried that worked outstandingly well is FSQL.
You write an IMPORT command, paste it into FSQL and it imports the CSV file into the Firebird table.
You can use the shell:
sed "s/,/','/g" file.csv > tmp
sed "s/$/'),(/g" tmp > tmp2
sed "s/^./'&/g" tmp2 > insert.sql
and then add
INSERT INTO PERSON (ID, NAME, CITY_ID) VALUES(
...
);
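Tracing the three sed passes on a line like Bob,New York (borrowing the sample data above) gives:
'Bob','New York'),(
so after prepending the INSERT header and fixing up the trailing ,( on the last line, you end up with one multi-row statement.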