Full Text Search For Excel Files - sql

I have several hundred Excel files that are not normalized enough for me to efficiently import into the tables of my database. Information on this is hard to find, but from what I have seen it is possible to index xlsx files with FTS. I am not really looking to implement an alternate database for this, as it is a one-time load that will not receive new data.
Would it be possible to do this with FTS, and if so, could someone point me in the right direction? The information I've found on MSDN is quite vague.
Thanks.

I have done something similar using BULK INSERT. I would suggest taking a look at this article:
http://www.sqlteam.com/article/using-bulk-insert-to-load-a-text-file
How it works: the Excel data is saved as a delimited text file, with each column separated by ";" and each row ending in a newline ("\n"). You can then use BULK INSERT to crawl through the exported sheet and insert it into a table.
Note that all the values coming from BULK INSERT arrive as text, so if your table contains int columns, for example, you will need a temporary table.
CREATE TABLE #TEMPORARYTABLE (
    Column1 NVARCHAR(255),   -- example columns; stage every value as NVARCHAR
    Column2 NVARCHAR(255)
)
The # creates a temporary table that only exists until you disconnect from SQL Server.
All columns in that table should be NVARCHAR.
You can then insert the contents of #TEMPORARYTABLE into your real table, using CAST to turn the NVARCHAR values into int values or whatever else you need.
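A minimal sketch of that flow, assuming the sheet was exported to C:\Import\Data.txt with ";" separators (the file path, target table, and column names here are illustrative):

-- Load the exported text file into the staging table
BULK INSERT #TEMPORARYTABLE
FROM 'C:\Import\Data.txt'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Copy into the real table, casting the staged text to the proper types
INSERT INTO dbo.RealTable (Column1, Column2)
SELECT Column1, CAST(Column2 AS INT)
FROM #TEMPORARYTABLE;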

FTS is a feature of SQL Server; the data you want to create a full-text index on needs to be in a SQL Server database.
Since your data is in Excel and not in SQL Server, you will not be able to create full-text indexes on those Excel sheets.
But if you import that data into SQL Server, you will then be able to make use of the FTS features; until then, unfortunately, FTS is not an option for you.
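For completeness, once the data has been imported into a table, enabling full-text search is roughly this (the catalog, table, column, and key-index names below are illustrative):

-- One-time setup: a full-text catalog, then a full-text index on the imported text column
CREATE FULLTEXT CATALOG ImportCatalog;

CREATE FULLTEXT INDEX ON dbo.ImportedSheets (SheetText)
    KEY INDEX PK_ImportedSheets   -- must be a unique, single-column, non-nullable index
    ON ImportCatalog;

-- Example full-text query against the indexed column
SELECT * FROM dbo.ImportedSheets WHERE CONTAINS(SheetText, 'invoice');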

Related

Updating/Inserting Oracle table from csv data file

I am still learning Oracle SQL, and I've been trying to find the best way to update/insert records in an Oracle table with the data from a CSV file.
So far, I've figured out how to load the CSV into a temporary table using external tables in Oracle, but I'm having difficulty finding a detailed guide on how to update/insert (UPSERT) the loaded data into an existing table.
What is the best way to do this when I have 30+ fields in the table? For example, is it better to read the CSV line by line with something like pandas and update each record one by one, or to do it with a SQL script using something like a MERGE statement? Not all records in the CSV have a value for the primary key, in which case I need to insert rather than update. Thanks for the help!
That looks like a MERGE, indeed.
Data from the external table would then be used to:
update values in existing rows
create new rows in the target table
Pandas and row-by-row processing? I wouldn't do that. If you already have a powerful database, use its capabilities. Row-by-row is usually slow-by-slow, and there's rarely any benefit in doing it that way.
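A minimal sketch of such a MERGE, assuming the external table is csv_ext and the target is target_table with primary key id; only two of the 30+ columns are shown, and all names are illustrative:

MERGE INTO target_table t
USING csv_ext e                 -- the external table over the CSV file
   ON (t.id = e.id)
WHEN MATCHED THEN
    UPDATE SET t.col1 = e.col1,
               t.col2 = e.col2
WHEN NOT MATCHED THEN
    INSERT (id, col1, col2)
    VALUES (e.id, e.col1, e.col2);

Rows whose id is NULL in the CSV never satisfy the ON clause, so they fall into the NOT MATCHED branch; if the target key cannot be NULL, supply it there from a sequence or identity column instead of e.id.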

SQL Create Data for Database

I would like to create a script to insert data into a SQL database.
My project is to take an Access database that isn't well structured and put it into SQL. The Access data can be imported, but there is a table that I have created that the Access DB doesn't have, which is what I want the script for. It's a physical paper-box archive database, and I need to create the "Locations" data.
To be more specific the data is:
ID (auto num)
Rack - These are the shelving units
Row - This is the same as shelf
Column - This is the number of boxes horizontally on a shelf
Position - This is the depth (there can be two boxes in the same column on each shelf)
INSERT INTO
In terms of a script, there are a few methods of inserting data into SQL Server. Firstly, if you have an existing table, you can insert values into specific columns like so:
INSERT INTO TableName (Column1, Column2)
VALUES ('Column1 value', 420)
This can be placed inside a WHILE loop in order to fill multiple rows quickly, as sketched below.
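For the Locations table in the question, a WHILE-loop sketch along these lines would generate every combination (the table name and the range limits are assumptions, not from the question):

-- Fill dbo.Locations with every Rack/Row/Column/Position combination
DECLARE @Rack INT = 1, @Row INT, @Col INT, @Pos INT;
WHILE @Rack <= 10                         -- 10 shelving units
BEGIN
    SET @Row = 1;
    WHILE @Row <= 5                       -- 5 shelves per rack
    BEGIN
        SET @Col = 1;
        WHILE @Col <= 8                   -- 8 boxes across each shelf
        BEGIN
            SET @Pos = 1;
            WHILE @Pos <= 2               -- 2 boxes deep
            BEGIN
                INSERT INTO dbo.Locations (Rack, [Row], [Column], Position)
                VALUES (@Rack, @Row, @Col, @Pos);
                SET @Pos += 1;
            END
            SET @Col += 1;
        END
        SET @Row += 1;
    END
    SET @Rack += 1;
END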
IMPORT FILE
Another method you can use to create a table with existing data and columns is to import an Excel sheet, for example. This can be done by right-clicking on the database you wish to add the new table to, then going to Tasks and Import Data:
Database (Right Click) > Tasks > Import Data...
You will then need to select the data source, presumably Excel, and specify the file path. Next, select the destination, probably SQL Server Native Client for you. The rest from there should be pretty easy to follow.
BULK IMPORT
I've not had a lot of practice with bulk importing to SQL, but from what I'm aware, you can use this method to import data from an external file into a SQL table programmatically.
An example I have is as follows:
--Define the data you are importing in a temp table
CREATE TABLE #ClickData
(
ID INT IDENTITY(1,1)
,Dated VARCHAR(255) COLLATE Latin1_General_CI_AS
,PageViews VARCHAR(255) COLLATE Latin1_General_CI_AS
)
insert into #ClickData
--Select the data from the file
select Dated, PageViews
from openrowset(--Openrowset is the method of doing this
bulk 'FilePath\ImportToSqlTest.csv',--The file you wish to import data from
formatfile = 'FilePath\Main.XML',--The XML format file describing the columns of the file being read
firstrow = 2--Specify the starting row (mine is 2 to ignore headers)
) as data
Apologies if this answer isn't overly helpful; I had to write it in a rush. I'm not entirely sure what you're looking for, as others stated your question is very vague. Hopefully this helps somewhat.

Updating a table based on an Excel file without excel functionality?

Okay, so I basically have a huge Excel list with two columns, ID and Value. I'd like to update an Employee table in my SQL database and set a column in that table equal to the Value from the Excel file where the ID in the Excel file matches the ID in the Employee table.
I've looked at other similar questions and they use OpenRowSet and other SQL/Excel functionality, but unfortunately it looks like our DBA hasn't installed any of that. What can I do as an alternative to update that table?
As for what I've tried: I created two XML lists and loaded each column of the spreadsheet into its own temporary table, so now I have two temporary tables, one for each Excel column. But now I'm reading that combining two tables without a common identifier is really bad practice, so I'm back at square one.
You can do that easily using SSIS. You can use an Excel source, a Lookup transformation or Merge Join, and an OLE DB destination.
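However the spreadsheet gets staged (SSIS or otherwise), the final step is a single joined UPDATE; a sketch assuming the staged data lands in a table dbo.ExcelStaging(ID, Value) and the column to set is called SomeColumn (both names are illustrative):

UPDATE e
SET e.SomeColumn = s.Value
FROM dbo.Employee AS e
JOIN dbo.ExcelStaging AS s
    ON s.ID = e.ID;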

How to create a dynamic table structure in SQL from multiple delimited text files?

I have about 100,000+ delimited text files (they don't have the same number of columns in each file; e.g. some files have 10 columns, some have 20, and so on). I need to upload all of them to SQL Server. Please suggest how I can do it.
I also have an Excel spreadsheet listing the names/paths where the files are stored and the number of columns in each text file. I am clueless how to go about it.
Thanks in advance
I assume you are able to use C# (or another programming language) to create an app which will help you complete the task. The program should do the following:
Run through all the files and determine all the columns you need.
Create a table on SQL Server with the columns the program found. Set the datatype to varchar(max) for each column (a sketch of this staging step follows the links below).
Run through all the files again and insert the data into the table. There are two ways you can go:
a) Insert data row by row. Pretty slow, but simple.
b) If you use C#, you can implement your own DataReader and use the SqlBulkCopy class to bulk insert data into the table.
Here are some links which may help you:
http://www.sqlteam.com/article/use-sqlbulkcopy-to-quickly-load-data-from-your-client-to-sql-server
http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/
http://daniel.wertheim.se/2010/11/10/c-custom-datareader-for-sqlbulkcopy/
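On the SQL side, steps 2 and 3a amount to something like this (the table and column names are illustrative; the real column list would come from scanning the files):

-- Step 2: one wide staging table, every column typed as varchar(max)
CREATE TABLE dbo.TextFileStaging
(
    SourceFile VARCHAR(MAX),
    Col01 VARCHAR(MAX),
    Col02 VARCHAR(MAX),
    Col03 VARCHAR(MAX)   -- repeat up to the widest file found
);

-- Step 3a: the statement the loader would issue once per row (slow but simple)
INSERT INTO dbo.TextFileStaging (SourceFile, Col01, Col02, Col03)
VALUES ('C:\files\sample1.txt', 'a', 'b', 'c');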

What is the easiest way to add a bunch of content to a SQL database?

Nothing technical here. Suppose I have a lot of different categorized data and I would like to create a database out of it. Would someone literally hand-enter all that info with SQL code itself? Or do some people make a mock website just to input data? What are some of your strategies?
If there is no way to do it automatically, then a mock website would be the way to go: you can even use it with several people at once, effectively multiplying the input speed (as long as you don't mess up assigning each of them a different part of the data).
What format is your data in, and how much of it is there? If it's Excel, then SQL Server has tools to import it. I'm not sure if MySQL has anything similar. Even if it doesn't, one other technique I have used with Excel data is a formula that concatenates the cell values into the INSERT statements. Then just paste those into a query window and run them.
I wouldn't build a website for it unless I was already building an admin site and wanted to test that with the initial load.
Most databases have a way to do bulk inserts or have tools for data import.
My strategies normally involve such tools.
Here is an example of importing a CSV file to SQL Server.
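For instance, a minimal BULK INSERT along those lines, assuming a simple comma-delimited file with a header row (the path and table name are illustrative):

BULK INSERT dbo.TargetTable
FROM 'C:\Import\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);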
Most database servers provide a way to import data from a variety of formats; you could look into that first.
If not, you could write a simple script or console application to parse your input data, and write out a SQL script to insert the data into appropriate tables.
For example, if your data was in a CSV file, you would parse each line in the file and generate an insert statement to write out to a .sql file.
MyData.csv
1,2,3,'Test',4
2,3,4,'Test2',6
GeneratedInsert.sql
insert into MyTable (col1,col2,col3,col4,col5) values (1,2,3,'Test',4)
insert into MyTable (col1,col2,col3,col4,col5) values (2,3,4,'Test2',6)
MySQL has a statement LOAD DATA INFILE that is intended for loading bulk data from flat files. It's easy to use and much faster than alternative methods.
But first you do have to use SQL to design tables with fields that match the fields of your import data. That is, if you have a file with semicolon-separated data:
Titanic;1997;4 stars
Batman Begins;2005;5 stars
"Harry Potter and the Sorcerer's Stone";2001;3 stars
...
You would create a table:
CREATE TABLE Movies (
title VARCHAR(100) NOT NULL,
year YEAR NOT NULL,
rating VARCHAR(10)
);
Then load data:
LOAD DATA INFILE 'movies.txt' INTO TABLE Movies
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"';
Most web languages have some sort of auto-scaffolding that you can set up quickly. It's useful for admin work as well, if your site is hosted without direct access to the DB.
Otherwise, yeah - write the SQL statements. That's also useful for bringing a database up as part of your build process.