I would like to create a script to insert data into an SQL database.
My project is to take an Access database that isn't well structured and put it into SQL. The Access data can be imported, but there is a table that I have created that the Access DB doesn't have, which is what I want the script for. It's a physical paper box archive database. I need to create the "Locations" data.
To be more specific the data is:
ID (auto num)
Rack - These are the shelving units
Row - This is the same as a shelf
Column - This is the number of boxes positioned horizontally on a shelf
Position - This is the depth (there can be two boxes in the same column on each shelf)
INSERT INTO
In terms of script, there are a few methods of inserting data into MySQL. Firstly, if you have an existing table, you can insert values into specific columns like so:
INSERT INTO TableName (Column1, Column2)
VALUES ('Column1 value', 420)
This can be added to a WHILE loop in order to fill multiple rows quickly, as sketched below.
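For your Locations table, a minimal sketch of that might look like the following; the loop bounds (5 racks, 4 rows per rack, 10 columns per shelf, 2 positions deep) are assumptions to adjust to your actual shelving:

--Generate every Rack/Row/Column/Position combination
--(ID is assumed to be an identity column, so it is not listed)
DECLARE @Rack INT = 1, @Row INT, @Col INT;
WHILE @Rack <= 5 --assumed number of racks
BEGIN
    SET @Row = 1;
    WHILE @Row <= 4 --assumed shelves per rack
    BEGIN
        SET @Col = 1;
        WHILE @Col <= 10 --assumed boxes across each shelf
        BEGIN
            INSERT INTO Locations (Rack, [Row], [Column], Position)
            VALUES (@Rack, @Row, @Col, 1),
                   (@Rack, @Row, @Col, 2); --two boxes deep per column
            SET @Col += 1;
        END;
        SET @Row += 1;
    END;
    SET @Rack += 1;
END;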
IMPORT FILE
Another method you can use to create a table with existing data and columns is to import an Excel sheet, for example. This can be done by right-clicking on the database you wish to add the new table to, heading to Tasks, then Import Data.
Database (Right Click) > Tasks > Import Data...
You will then need to select data source, presumably excel, then specify the file path. Next select destination; probably SQL Server Native Client for you. The rest from there should be pretty easy to follow.
BULK IMPORT
I've not had a lot of practice with bulk importing to SQL; however, from what I'm aware, you can use this method to import data from an external file into a SQL table programmatically.
An example I have is as follows:
--Define the data you are importing in a temp table
CREATE TABLE #ClickData (
    ID INT IDENTITY(1,1)
    ,Dated VARCHAR(255) COLLATE Latin1_General_CI_AS
    ,PageViews VARCHAR(255) COLLATE Latin1_General_CI_AS
)

insert into #ClickData
--Select the data from the file
select Dated, PageViews
from openrowset( --Openrowset is the method of doing this
    bulk 'FilePath\ImportToSqlTest.csv', --The file you wish to import data from
    formatfile = 'FilePath\Main.XML', --The XML format file for the data you are gathering (I believe this part is for reading the file)
    firstrow = 2 --Specify the starting row (mine is 2 to ignore headers)
) as data
Apologies if this answer isn't overly helpful, I had to write this in a rush. I'm not entirely sure what you're looking for, as others stated your question is very vague. Hopefully this might help somewhat.
I have several hundred Excel files that are not normalized enough for me to efficiently import into the tables of my database. It is difficult to find information, but from what I have seen it is possible to index xlsx files with FTS. I am not really looking to implement an alternate database for this, as it is a one-time thing that will not receive new data.
Would it be possible to do this with FTS, and if so, could someone point me in the right direction, as the info I've found on MSDN is quite vague?
Thanks.
I have done something similar using BULK. I would suggest taking a look at it
http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file
How it works is that Excel data can be saved as a text file, with each column separated by a ";" and each row by "\n". You can then use BULK to crawl through your Excel sheet and insert it into a table.
Note that all the values coming from BULK are text values, so if your table contains INT values, for example, you will need a temporary table.
CREATE TABLE #TEMPORARYTABLE (
    Col1 NVARCHAR(255), --example columns; all values coming from BULK are text
    Col2 NVARCHAR(255)
)
The # creates a table that only exists until you disconnect from SQL Server.
All values in that table should be NVARCHAR.
You can then insert the contents of #TEMPORARYTABLE into your real table and CAST the NVARCHAR values to INT values or whatever else you need.
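For example, casting back to the real types (RealTable and the column types are assumptions for illustration):

--Move staged text values into the real table, converting as needed
INSERT INTO RealTable (Col1, Col2)
SELECT CAST(Col1 AS INT), CAST(Col2 AS INT)
FROM #TEMPORARYTABLE;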
FTS is a feature in SQL Server; the data you want to create FTS for needs to be in a SQL Server database.
With the data being in Excel and not in SQL Server, you will not be able to create FTS for those Excel sheets.
But if you import that data into SQL Server, only then will you be able to make use of FTS features; until then, unfortunately, FTS is not an option for you.
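Once the data has been imported into a table, creating a full-text index looks roughly like this; the catalog, table, column, and key index names here are all assumptions, and the table needs a unique, single-column key index:

--One-time setup: a catalog to hold the full-text index
CREATE FULLTEXT CATALOG DocsCatalog;

--Index the imported text column for full-text search
CREATE FULLTEXT INDEX ON dbo.ImportedSheets (Contents)
    KEY INDEX PK_ImportedSheets
    ON DocsCatalog;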
I want to import CSV data into SQL Server. I searched around and found answers about BULK INSERT ... FROM.
The problems I have are:
I want to select just one column of my results
The table already exists with bad data and I just want to update these fields
The CSV I have contains towns and their parameters (correct data):
Town,Id,ZipCode,...
T1,1,12000
T2,2,12100
T3,3,12200
And the 'town' table in SQL Server contains, for example:
T1,1,30456
T2,2,36327
T3,3,85621
I just want to take the ZipCode from the CSV and update the ZipCode in the table based on the ID.
Is there an easy way to do it?
I normally prefer bcp over BULK INSERT because with that you can easily import the files over the network and there are fewer issues with access rights. Otherwise I would just load the data into tempdb and update the original table from there.
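A rough sketch of that staging approach, shown here with BULK INSERT rather than bcp; the staging table name, file path, and column types are assumptions based on the Town/Id/ZipCode layout in the question:

--Stage the CSV into a temp table first
CREATE TABLE #stage_town (
    Town NVARCHAR(100),
    Id INT,
    ZipCode NVARCHAR(10)
);

BULK INSERT #stage_town
FROM 'C:\data\towns.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

--Update only the ZipCode column, matching rows on Id
UPDATE t
SET t.ZipCode = s.ZipCode
FROM town AS t
JOIN #stage_town AS s ON s.Id = t.Id;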
I'm using Microsoft Sql Server Management Studio.
I currently have an existing database with data in it, which I will call DatabaseProd
And I have a second database with data used for testing, so the data isn't exactly correct nor up to date. I will call this database DatabaseDev.
However, DatabaseDev now contains newly added tables, newly added columns, etc.
I would like to copy this new schema from DatabaseDev to DatabaseProd while keeping the DatabaseProd's Data.
Ex.
DatabaseProd contains 2 tables:
TableA with columns ID and Name
TableB with columns ID and jobName
and these tables contain data that I would like to keep.
DatabaseDev contains 3 tables:
TableA with columns ID, Name and phoneNum
TableB with columns ID and jobName
TableC with columns ID and document
and these tables contain data that I don't need.
Copy DatabaseDev Schema to DatabaseProd but keep the data from DatabaseProd
So DatabaseProd after the copy would look like this
TableA with columns ID, Name and phoneNum
TableB with columns ID and jobName
TableC with columns ID and document
But the tables would contain their original data.
Is that possible?
Thank you
You can use Red Gate SQL Compare; this will allow you to compare both DBs and generate a script to run on the source DB. You have to pay for a license, but you get a 14-day trial period.
This tool, along with Data Compare, are the two tools I always insist on in new roles, as they speed up development time and minimise human error.
Also, a good tip when using SQL Compare: if you need to generate a rollback script, you can edit the project (after creating your rollout script) and switch the source and destination around; this will create a script which will return the schema to its original state if the rollout script fails. However, be very careful when doing this: don't select Synchronize using SQL Compare; rather, generate a script. You will see the two options, Generate Script / Sync using SQL Compare, when you get to that step.
Yes, you can generate a database script for the schema only; no data will be added to that script.
You only need to select the third table when generating the database script, then run that script against your production server database; it will create the new table (TableC in your case) without any data.
For more information about how to create a database script, please follow the link below:
http://blog.sqlauthority.com/2011/05/07/sql-server-2008-2008-r2-create-script-to-copy-database-schema-and-all-the-objects-data-schema-stored-procedure-functions-triggers-tables-views-constraints-and-all-other-database-objects/
You need an ALTER TABLE statement:
ALTER TABLE TableA ADD PhoneNum VARCHAR(10) --choose a data type and size to suit
It looks like there are no changes to TableB.
Add TableC:
CREATE TABLE TableC (ColumnID INT, Document VARCHAR(50))
Do you need to copy constraints, indexes or triggers over?
I have one Excel file that I want to import into two different tables, tblUni and tblUser.
I have a third table which contains the IDs from the other two tables:
tblUni_Students
Id
UniId
StudentId
What I need is that when I import the Excel data into the first two tables, for each record, the newly created IDs are inserted into the tblUni_Students table as well.
Using SSIS, I have managed to import the data into two SQL destinations, but I cannot seem to take the new IDs from these destinations and insert them into the lookup table.
Can anyone advise, please? Thanks.
It's a bit difficult to answer without knowing the target database or the structure of the data, but generally speaking this would be much better done by adding the data into a "load" table, i.e. one whose sole reason is to temporarily hold data while you process it. You would then update the tblUser, tblUni and tblUni_Students tables from the load area using SQL statements, either via a procedure or via an Execute SQL Task component.
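A rough sketch of those SQL statements, assuming the spreadsheet has already been loaded into a load table; the load table name (LoadStudents), the natural-key columns (UniName, StudentName), and the Id column names are assumptions for illustration:

--Populate the two main tables from the load table
INSERT INTO tblUni (UniName)
SELECT DISTINCT UniName FROM LoadStudents;

INSERT INTO tblUser (StudentName)
SELECT DISTINCT StudentName FROM LoadStudents;

--Resolve the newly generated IDs by joining back on the natural keys
INSERT INTO tblUni_Students (UniId, StudentId)
SELECT u.Id, s.Id
FROM LoadStudents AS l
JOIN tblUni AS u ON u.UniName = l.UniName
JOIN tblUser AS s ON s.StudentName = l.StudentName;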
You'd do it with an OLE DB Command component, where the command is an INSERT into the table. Then in the same component you'd output the generated identity: assign the generated identity to a new column in the output, and now you have all your data plus the generated identity in the data flow.
This will be processed one row at a time, so it will be slow. Personally I'd put it in a staging table and do it as Ciarán described.
Nothing technical here. Suppose I have a lot of different categorized data, and I would like to create a database out of it. Would someone literally hand plug in all that info with SQL code itself? Or do some people make a mock website just to input data? What are some of your strategies?
If there is no way to do it automatically, then a mock website would be the way to go: you can even use it with several people at once, multiplying the input speed (as long as you assign each of them a different part of the data).
What format is your data in? And how much of it is there? If it's Excel, then SQL Server has tools to import it. I'm not sure if MySQL has anything similar. Even if it doesn't, one other technique I have used with Excel data is a formula that concatenates cell values as required to generate the INSERT statements. Then just paste those into a query window and run them.
I wouldn't build a website for it unless I was already building an admin site for it and wanted to test that with the initial load.
Most databases have a way to do bulk inserts or have tools for data import.
My strategies normally involve such tools.
Here is an example of importing a CSV file to SQL Server.
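A minimal sketch with BULK INSERT; the table name and file path are placeholders, and the file is assumed to have a header row:

--Load a comma-separated file straight into an existing table
BULK INSERT dbo.MyTable
FROM 'C:\data\mydata.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);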
Most database servers provide a way to import data from a variety of formats, you could look into that first.
If not, you could write a simple script or console application to parse your input data, and write out a SQL script to insert the data into appropriate tables.
For example, if your data was in a CSV file, you would parse each line in the file and generate an INSERT statement to write out to a .sql file.
MyData.csv
1,2,3,'Test',4
2,3,4,'Test2',6
GeneratedInsert.sql
insert into table (col1,col2,col3,col4,col5) values (1,2,3,'Test',4)
insert into table (col1,col2,col3,col4,col5) values (2,3,4,'Test2',6)
MySQL has a statement LOAD DATA INFILE that is intended for loading bulk data from flat files. It's easy to use and much faster than alternative methods.
But first you do have to use SQL to design tables with fields that match the fields of your import data. That is, if you have some file with semicolon-separated data:
Titanic;1997;4 stars
Batman Begins;2005;5 stars
"Harry Potter and the Sorcerer's Stone";2001;3 stars
...
You would create a table:
CREATE TABLE Movies (
title VARCHAR(100) NOT NULL,
year YEAR NOT NULL,
rating VARCHAR(10)
);
Then load data:
LOAD DATA INFILE 'movies.txt' INTO TABLE Movies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"';
Most web languages have some sort of auto-scaffolding that you can quickly set up. Useful for admin work as well, if your site is hosted without direct access to DB.
Otherwise, yeah - write the SQL statements. Useful to bring a database up as part of your build process.