I know there is a way to import an Excel spreadsheet into Oracle using SQL Developer. However, I am trying to find out whether it is possible to import an Excel spreadsheet into an Oracle table using a SQL query statement. I have done this type of SQL query before going from Excel to MS Access, and am now trying to do the same thing going from Excel to Oracle.
The query I have used for going from Excel to Access is as follows:
SELECT * INTO TABLENAME
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 12.0;IMEX=1;HDR=NO;DATABASE=EXCELPATH.xlsx',
                'SELECT * FROM [EXCELSPREADSHEET$]');
There is a way, but it isn't an easy one.
You can check this: Read excel.
New Excel documents are saved in the Open XML standard. That means an .xlsx file is a set of zipped XML files; you can change the extension to .zip and look at what is inside.
You can load the individual XML documents into the database as XMLType and use XMLQuery (or XMLTABLE) to extract the data.
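For example, a rough sketch of that approach (the directory object XLSX_DIR, the file name sheet1.xml, and the table name are hypothetical; the namespace is the standard SpreadsheetML one, and note that string cells hold an index into sharedStrings.xml rather than the text itself):
-- Load one extracted worksheet XML file as XMLType
CREATE TABLE sheet_xml (doc XMLTYPE);

INSERT INTO sheet_xml (doc)
VALUES (XMLTYPE(BFILENAME('XLSX_DIR', 'sheet1.xml'), NLS_CHARSET_ID('AL32UTF8')));

-- Extract the cell references and raw values with an XQuery expression
SELECT x.cell_ref, x.cell_value
FROM sheet_xml s,
     XMLTABLE(
       XMLNAMESPACES(DEFAULT 'http://schemas.openxmlformats.org/spreadsheetml/2006/main'),
       '/worksheet/sheetData/row/c'
       PASSING s.doc
       COLUMNS cell_ref   VARCHAR2(10)   PATH '@r',
               cell_value VARCHAR2(4000) PATH 'v'
     ) x;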
I'm using SQL code (which I don't really know; I just copied it from a sample tutorial) to merge documents to use in Excel PowerPivot. I have an Excel file called 2014 and another file called 2015. This was the syntax shown in the example to merge the two, but it keeps telling me the columns are not identified. How can I fix this?
SELECT ['2015$'].* FROM ['2015$']
UNION ALL
SELECT * FROM 'C:\Alex\2014.xlsx'.['2014$'];
Using MS Access 2007 I would like to retrieve only part of an ODBC table.
I can import the whole table in Access but I don't need all of it and it would be a waste of space and performance to store the whole table when I only need certain columns.
In Excel I wrote a SQL query that let me retrieve only the part I'm interested in. What I'd like to know is: is it possible to import only the result of a SQL query in Access or do I have to retrieve the whole table and then run the query on it?
Is it possible using built-in Access features, or should I turn to VBA?
Edit: Basically I would like to run the ODBC data connection below (currently used in Excel) in Access.
Connection string:
DSN=BLA1;
UID=BLA2;
DBQ=BLA3;
PWD=BLA4;
DBA=W;
APA=T;
EXC=F;
FEN=T;
QTO=T;
FRC=10;
FDL=10;
LOB=T;
RST=T;
GDE=F;
FRL=F;
BAM=IfAllSuccessful;
MTS=F;
MDI=F;
CSR=F;
FWC=F;
PFC=10;
TLO=0;
Command string:
SELECT *
FROM TEST TEST
WHERE (TEST.DATE_STAMP=?)
When I try to link the database I get the error "The database engine can't find 'WTD.DATAPOINT_5/1000'. Make sure it is a valid parameter or alias name, that it doesn't include characters or punctuation, and that the name isn't too long." However, when I use the Excel database connection I get no error and everything is updated.
You don't need to import the whole table. You could link to the ODBC table and then run a make-table query against that linked table to copy in only the rows and columns that you need.
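For example, a make-table query against the linked table might look like this (the column and target table names are hypothetical, and the date criterion is just a placeholder):
SELECT TEST.DATE_STAMP, TEST.DATAPOINT_5
INTO LOCAL_TEST_SUBSET
FROM TEST
WHERE TEST.DATE_STAMP = #1/1/2015#;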
I often receive requests to query a SQL Server database based on data that is sent to me in an Excel spreadsheet.
I am looking for a more efficient way of completing these types of requests than my current setup.
Currently, in order to complete the request, I do the following:
Copy the Excel column containing the data that will eventually be placed in a WHERE clause.
Paste the data as text only into Microsoft Word.
Do a find-and-replace, replacing each paragraph marker with ', '.
Then surround the entire list with parentheses to use it in an IN clause.
Does anyone have a suggestion for a more efficient way of accomplishing the same task?
Here are a couple of ways:
Query the Excel spreadsheet directly:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\excelfile.xls',
                'SELECT * FROM [Sheet1$]')
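That rowset can also feed the original WHERE ... IN filter directly as a subquery (OPENROWSET needs the 'Ad Hoc Distributed Queries' server option enabled; the target table and column names below are hypothetical, and [ID] stands in for whatever header the sheet column actually has):
SELECT t.*
FROM dbo.MyTable AS t
WHERE t.KeyColumn IN (SELECT x.[ID]
                      FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                                      'Excel 12.0;Database=C:\excelfile.xls',
                                      'SELECT * FROM [Sheet1$]') AS x);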
Use Excel to format the data:
In the next empty column enter =A1 & "," and copy it down, or ="'" & A1 & "'," if the values need to be quoted.
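The filled-down column can then be pasted straight into the query, for example (values made up):
SELECT *
FROM dbo.MyTable
WHERE KeyColumn IN ('1001', '1002', '1003');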
You could save the Excel file as a comma-delimited CSV and go from there, but if this is a regular thing I would probably set up an SSIS process to do it all for you.
I have some Excel spreadsheets that I cannot change, as they are used by another department and they will not change them in the future. They are .xlsm files with over 500 columns (A:TH). I'm trying to import them into SQL Server 2008 on a 64-bit machine, but I'm having huge problems: every form of Excel import appears to truncate the columns I select to the first 255.
Ultimately there will be 5 separate tables to store this data with 1 common key. I could write a short VBA script to sort the data in Excel into arranged columns of tables at source, but I wanted to ask if the following was possible first...
This works fine and selects the columns A:IV
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=C:\NEW.xlsm',
'SELECT * FROM [Details Sheet$A:IV]')
Is there a clever way to do something similar with a non-contiguous range such as
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0;Database=C:\NEW.xlsm',
'SELECT * FROM [Details Sheet$C:C,IW:LZ]')
i.e., pick up the key in column C and the additional columns IW:LZ? The problem for me is that using the full range C:LZ and SELECT [ID],[THIS],[THAT] FROM etc won't work for fields beyond the first 255 columns in the range, which is very annoying!
Have you tried using SSIS to import the Excel files? It can be very picky about data types, but I've never run into a limitation that I couldn't work around with a bit of a Script Component.
It's designed to be a high-performance ETL tool for jobs like what you're trying to accomplish. If you're new to it, check out this article on importing the entirety of Wikipedia as XML into multiple tables.
A quick note: you may need to install additional Office drivers to read the Excel 2007 format, especially on a 64-bit machine.
I have a tab-separated .txt file (a very small file with just 10 to 15 records) that has columns such as PrdName, PrdSize, PrdWeight, PrdCode and so on.
Now I want to take two of those columns, PrdSize and PrdCode, and import them into the corresponding columns of my database table.
I have created the columns, but how do I write the import statement and transfer the data from the .txt file to SQL Server? Thanks.
Take a look at this post: Import/Export data with SQL Server 2005 Express; there are multiple options that you can use.
Since you have the Express edition, you'll need to either use BCP or write a program with something else.
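If you'd rather stay in a query window, BULK INSERT (the T-SQL counterpart of bcp, also available in the Express edition) can load the tab-separated file into a staging table first; a rough sketch, with a hypothetical file path and table names:
CREATE TABLE dbo.PrdStaging (
    PrdName   VARCHAR(100),
    PrdSize   VARCHAR(50),
    PrdWeight VARCHAR(50),
    PrdCode   VARCHAR(50)   -- add the remaining columns from the file here
);

BULK INSERT dbo.PrdStaging
FROM 'C:\data\products.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);  -- FIRSTROW = 2 skips a header row

-- Copy only the two columns that are needed into the real table
INSERT INTO dbo.MyProducts (PrdSize, PrdCode)
SELECT PrdSize, PrdCode
FROM dbo.PrdStaging;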
If you have a large amount of data, or need to automate the process, definitely look into BCP as mentioned already. However, I often use Excel to load one-time data sources (a few hundred to a few thousand rows of data) from odd sources into SQL Server by doing the following:
Get the data into Excel (that's usually easy). Assuming you have PrdSize in column A and PrdCode in column B, put the following formula in column C:
="INSERT INTO MYTABLE(PRDSIZE, PRODCODE) VALUES (" & a1 & "," & B1 & ")"
(In other words, create syntactically correct SQL using an Excel formula; you may need to add quotes around string values, etc.)
Paste that formula all the way down column C, then copy/paste the resulting SQL INSERT statements into SQL Server Management Studio (or any other tool that can execute SQL) and run them.
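The generated rows end up looking something like this (the values are made up; note the quotes a character column would need):
INSERT INTO MYTABLE(PRDSIZE, PRDCODE) VALUES (42, 'A-100')
INSERT INTO MYTABLE(PRDSIZE, PRDCODE) VALUES (17, 'B-205')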
Definitely a 'manual' effort, but for one-time data loads it works great.
PS: You'll need to verify the Excel formula and the resulting SQL syntax; my example is close, but I didn't test it.