Excel query using Transfer Data from System i - sql

I am looking for information on how I can use the Transfer Data from System i Add-In for Excel to only get data that equals the data in one of my columns (both sets of data are strings). Unfortunately, I cannot just get the entire contents of the System i table since it contains more than the maximum allowed in Excel. Thank you!
Additional information for clarification purposes:
I'm trying to get specific data from the iSeries table. If the field in the iSeries equals the value in Column A, I want that data placed in Column C.
COLUMN A   COLUMN B   COLUMN C
100        xxxxx

On the iSeries table:

FIELD 1 = 40    FIELD 2 = ITEMDESC1
FIELD 1 = 100   FIELD 2 = ITEMDESC2
FIELD 1 = 500   FIELD 2 = ITEMDESC3

In this case it would place ITEMDESC2 into Column C, since FIELD 1 = 100.
I just need to know if there is a way, within the WHERE clause of Data Transfer from System i, to do this.
I hope this is clearer

Data Transfer is a simple file transfer. It isn't intended to make decisions about which DB2 rows to select based on the contents of cells in an existing spreadsheet. As a workaround, you can upload the existing spreadsheet to DB2, then use IBM i Navigator's Run SQL Scripts function to join the uploaded Excel table to your DB2 table, using CASE to put the proper field into the proper column:
select columna, columnb, case when field1 = columna then field2 else ' ' end
from excelupload join db2table on some_join_criteria
where some_record_selection_criteria
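A minimal sketch of that join-plus-CASE query, using SQLite in place of DB2 so it can be run anywhere (the table and column names mirror the answer's placeholders, not a real system):

```python
import sqlite3

# Stand-in tables: "excelupload" is the uploaded spreadsheet,
# "db2table" is the existing DB2 file from the question.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE excelupload (columna INTEGER, columnb TEXT);
    CREATE TABLE db2table   (field1 INTEGER, field2 TEXT);
    INSERT INTO excelupload VALUES (100, 'xxxxx');
    INSERT INTO db2table VALUES (40, 'ITEMDESC1'), (100, 'ITEMDESC2'), (500, 'ITEMDESC3');
""")

# Join the upload to the data table; CASE fills column C with field2
# whenever field1 matches column A.
rows = conn.execute("""
    SELECT e.columna, e.columnb,
           CASE WHEN d.field1 = e.columna THEN d.field2 ELSE ' ' END AS columnc
    FROM excelupload e
    JOIN db2table d ON d.field1 = e.columna
""").fetchall()
print(rows)  # → [(100, 'xxxxx', 'ITEMDESC2')]
```

The join itself does the record selection, so only matching rows ever leave the server, which also works around the Excel row limit.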
It seems easier to transfer the file to the PC and do the column manipulation in Excel.

Alternate Approach:
You could write whatever query you like in an iNavigator Run SQL Scripts window. Set the option to save results before you run the query. Once you run it, you can save the results in several formats, including Excel. If there are too many rows for your version of Excel, you can save it as a .CSV file instead.

Create a new request
Fill in system name (Next)
Fill in table name (Next)
Click "Data Options..."
This panel has both a SELECT and a WHERE clause field. Put the cursor in the Where box, click "Details..." and build your selection criteria. Click "Apply" and then "OK".
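For the question's example, the Where box would hold an ordinary SQL predicate. Note that Data Transfer cannot reference spreadsheet cells, so the values from Column A have to be typed in as literals (FIELD1 and the values here come from the question's sample data):

```sql
FIELD1 IN (100)
```

With several lookup values in Column A, you would list them all inside the IN clause.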

Related

Converting null value to 0 for all fields in MS access

I have a monthly sales table for different customers in ACCESS. The field names are in order Sales_201601, Sales_201602 etc. which changes dynamically with every data refresh.
I am looking for a SQL query which can automatically pick all columns whose names start with Sales_ and change null values to 0 in Access.
I cannot list the field names individually, because the table has many columns and the field names change over time. So I need code that adapts dynamically to the field names.
I am new to MS access. Please help me.
Thanks
You can update the fields individually:
update t
set Sales_201601 = nz(Sales_201601, 0),
Sales_201602 = nz(Sales_201602, 0)
. . . ;
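Since the column list changes with every refresh, the per-column update really wants to be generated from the table's metadata. A sketch of that idea, with SQLite standing in for Access (COALESCE plays the role of Access's Nz(); in Access itself you would build the same statement in VBA and run it with CurrentDb.Execute — the table and column names are the question's):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, Sales_201601 REAL, Sales_201602 REAL)")
conn.execute("INSERT INTO t VALUES (1, NULL, 5.0)")

# Discover the Sales_ columns from the table metadata rather than
# hard-coding them.
cols = [row[1] for row in conn.execute("PRAGMA table_info(t)")
        if row[1].startswith("Sales_")]

# Build one UPDATE that nulls-to-zero every discovered column.
assignments = ", ".join(f"{c} = COALESCE({c}, 0)" for c in cols)
conn.execute(f"UPDATE t SET {assignments}")
print(conn.execute("SELECT * FROM t").fetchone())  # → (1, 0.0, 5.0)
```

The same pattern works in Access by looping over TableDef.Fields instead of PRAGMA table_info.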
More importantly, you want to prevent this in the future. The idea is to set the column to not null and set a default value. I think the following works in MS Access:
alter table t alter column Sales_201601 double not null default 0;
You should do this when new columns are added into the table.
By the way, this would be much simpler if each month's sales were stored on a separate row rather than in a separate column.

Quick way to update the value of column in 100+ rows?

I have a table with 3 fields
Example_Table
-------------
ID (identity)
SomeKey
SomeValue
There are probably 150 values saved in this table. Apparently, all but 6 of those values have changed -_-' I have an Excel document containing the new values, and I am not looking forward to updating them all by hand. As a programmer, I can't help but feel there's a better method than doing it manually, or worse yet dropping the table and rebuilding it with the new values.
Does anyone know of a quick way to do a mass update like that? The new values in the spreadsheet are logically sorted / ordered by the key (desc) that they are paired with.
If you are using SQL Server 2012 Management Studio
Make sure the columns in Excel are lined up with SomeValue first and SomeKey (or ID) second.
Highlight the entire range, press Ctrl+C, switch to a new query window in Management Studio, and press Ctrl+V.
Place your cursor at the beginning of the first line, hold Shift+Alt, and use the down arrow to extend the selection to the last line.
Type:
UPDATE dbo.Example_Table SET SomeValue = '
Repeat step 3, placing your cursor after the value, and type:
' WHERE SomeKey =
Now you've got a series of UPDATE statements you can run individually or altogether.
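The block-edit steps above just wrap each pasted row in an UPDATE statement; a short script can do the same from the tab-separated text Excel puts on the clipboard (the table and column names are the question's; the sample values are made up):

```python
# Each pasted line is SomeValue<TAB>SomeKey, as in the copied Excel range.
pasted = "new value 1\t10\nnew value 2\t20"

statements = []
for line in pasted.splitlines():
    value, key = line.split("\t")
    value = value.replace("'", "''")  # escape embedded single quotes
    statements.append(
        f"UPDATE dbo.Example_Table SET SomeValue = '{value}' WHERE SomeKey = {key};"
    )
print("\n".join(statements))
```

Unlike the manual column edit, the replace() call also handles values that contain apostrophes.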
If you are using a previous version
Make sure the columns in Excel are lined up with SomeValue first and SomeKey (or ID) second.
Insert a new column before SomeValue.
In the first row of the new column, type:
UPDATE dbo.Example_Table SET SomeValue = '
In the lower-right corner of that cell, drag with a cross to repeat the value across all applicable rows.
Repeat steps 2 and 3 in between SomeValue and SomeKey, this time typing:
' WHERE SomeKey =
Repeat step 4 for the new column.
Highlight the entire range, click Ctrl+C, switch to a new query window in Management Studio, and hit Ctrl+V.
You may need to search and replace any Tab characters: highlight one, press Ctrl+C, then Ctrl+H, paste it into Find what:, make sure Replace with: is an empty string, and click Replace All (unless your data might naturally contain tabs).
Copy the spreadsheet data to a new workbook. Delete everything except the column with the keys and the column with the values.
Insert new columns as needed between them, and add UPDATE sql code between them.
You'll wind up with something like:

UPDATE Example_Table SET SomeValue = '<value from column B>' WHERE SomeKey = '<value from column D>'

with the fixed SQL fragments in columns A and C and the data in columns B and D.
With a little cut & paste you'll have 150 update statements. Copy them into SQL Server and execute. You may want to paste them into notepad or equivalent first, to check for tabs etc.
It should be a 5 minute job.
There are a few ways to import your data in the Excel document into SQL Server. Microsoft lists a few ways here. Personally, I have always used the import wizard that comes with management studio.
Either way you should end up with a table with the keys and values in it. Then you can use a query like this:
UPDATE mt
SET Value = temp.Value
FROM myTable mt
INNER JOIN importedTable temp on mt.Key = temp.Key
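A sketch of that import-then-join update, with SQLite standing in for SQL Server so it runs anywhere. SQL Server's UPDATE ... FROM join is written here as a correlated subquery, which both engines accept (table and column names come from the answer above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE myTable       ("Key" INTEGER, "Value" TEXT);
    CREATE TABLE importedTable ("Key" INTEGER, "Value" TEXT);
    INSERT INTO myTable VALUES (1, 'old'), (2, 'unchanged');
    INSERT INTO importedTable VALUES (1, 'new');
""")

# Pull the replacement value from the staging table for every key it
# contains; rows without a match are left alone.
conn.execute("""
    UPDATE myTable
    SET "Value" = (SELECT i."Value" FROM importedTable i
                   WHERE i."Key" = myTable."Key")
    WHERE "Key" IN (SELECT "Key" FROM importedTable)
""")
print(conn.execute('SELECT * FROM myTable ORDER BY "Key"').fetchall())
# → [(1, 'new'), (2, 'unchanged')]
```

The WHERE ... IN guard matters: without it, keys missing from the import would be overwritten with NULL.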

.NET Convert the contents of a DataTable to another DataTable with a different schema

I have a program where the user will have the option to map a variety of data source with an unpredictable column schema. So they might have a SQL Server database with 10 fields or an excel file with 20 - the names can be anything and data types of the fields can be a mixture of text and numeric, dates, etc.
The user then has to provide a mapping of what each field means. So column 4 is a "LocName", column 2 is a "LocX", column 1 is a "LocDate", etc. The names and data types that the user is presented as options to map to are well defined by a DataSet DataTable (XSD schema file).
For example, if the source contains data formatted like this:
User Column 1: "LocationDate" of type string
User Column 2: "XCoord" of type string
User Column 3: "YCoord" of type string
User Column 4: "LocationName" of type int
and the user provides a mapping that translates to this for the application's required DataTable:
Application Column "LocName" of type string = Column **4** of user table
Application Column "LocX" of type double = Column **2** of user table
Application Column "LocY" of type double = Column **3** of user table
Application Column "LocDate" of type datetime = Column **1** of user table
I have routines that connect to the source and pull out the data for a user query in "raw" format as a DataTable - so it takes the schema of the source.
My question is, what is the best way to then "transform" the data from the raw DataTable to the required application DataTable, bearing in mind that this projection has to account for type conversions?
A foreach would obviously work, but that seems like brute force since it will have to account for the data types on every row. Is there a "slick" way to do it with LINQ or ADO.NET?
I would normally do this in a select that "looks like" the destination table, but with data from the source table, applying the data conversions as required:
Select
Cast(LocationName As VarChar(...)) As LocName
, Cast(XCoord As Double) As LocX
, ...
From SourceTable
Hard to describe in a simple answer. What I've done in the past is issue an "empty" query like "Select * From sourcetable Where 1=0", which returns no rows but makes all the columns and their types available in the result set. You can cycle through the ADO column objects to get the type of each, then use that info to dynamically build the real SQL statement with the conversions.
You still have a lot of logic to decide the conversions, but they all happen as you're building the statement, not as the table is being read. You still have to say in code, "if the source column is Integer and the destination column is character, then generate Cast(column As VarChar) into the select".
When you finish building the text of the select, you have a select you can run in ADO to get the rows and the actual move becomes as simple as read/write with the field coming in just as you want them. You can also use that select for an "Insert into Select ...".
Hope this makes sense. The concept is easier to do than to describe.
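The statement-building part of that idea can be sketched in a few lines. Here the mapping table is the question's example; the cast target types are illustrative, and in the real program they would come from the XSD-defined DataTable and the ADO column metadata:

```python
# (app column, app type, source column) — the user-supplied mapping.
mapping = [
    ("LocName", "varchar(50)", "LocationName"),
    ("LocX",    "double",      "XCoord"),
    ("LocY",    "double",      "YCoord"),
    ("LocDate", "datetime",    "LocationDate"),
]

# Build the SELECT that "looks like" the destination table: one
# Cast(source As type) As destination term per mapped column.
select_list = ",\n  ".join(
    f"Cast({src} As {typ}) As {dst}" for dst, typ, src in mapping
)
sql = f"SELECT\n  {select_list}\nFROM SourceTable"
print(sql)
```

The conversions then happen inside the database engine as the statement runs, rather than row by row in a foreach.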

SQL Statement that Updates an Oracle Database Table from an Excel Spreadsheet

I would like to write a SQL Update command that could be run once a year to update a record for every account in an Oracle Database based on external values that are received in an excel spreadsheet.
My research thus far is indicating I may be able to use a OPENROWSET command, but most references are showing this used from Excel to MS SQL Server:
INNER JOIN
OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\foldername\spreadsheetname.xls;',
'SELECT column1name, column2name, column3name, column4name
FROM [worksheetname$]') EXT
Can someone verify I am on the right path or even better provide a basic example?
The basic pseudo-logic is as follows:
For every record in the Oracle USER_DEFINED table where the CODE_FIELD is equal to "CRS" AND where I have a Value on the spreadsheet with a matching account number, Update the VALUE field for that record in the Oracle USER_DEFINED table with the contents of the "Value" column in the Spreadsheet.
Not exactly what you're requesting, but if I were you (and since this is once a year), I would create update statements in Excel using concatenation formulas.
If the first rows/columns of Excel look like this:
ACCT_NBR | NEW_VALUE | CONSTRUCTED_SQL_STMT
123 | Hello | ="Update USER_DEFINED Set VALUE = '"&B2&"' Where CODE_FIELD='CRS' And Account_Num='"&A2&"';"
456 | World | ="Update USER_DEFINED Set VALUE = '"&B3&"' Where CODE_FIELD='CRS' And Account_Num='"&A3&"';"
Then just copy/paste the resulting series of update statements into SQL*Plus. Any that don't have a match in your DB will not trigger an update, and any that do match will get updated.
Do a commit at the end and you're done!
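The same statements the Excel formula builds can be generated by a short script, which additionally doubles any embedded apostrophes in the values (something the bare concatenation formula does not handle; the statement text and sample rows are from the answer above):

```python
# (ACCT_NBR, NEW_VALUE) pairs, as read from the spreadsheet.
rows = [("123", "Hello"), ("456", "World")]

stmts = [
    "Update USER_DEFINED Set VALUE = '{}' "
    "Where CODE_FIELD='CRS' And Account_Num='{}';".format(
        val.replace("'", "''"),  # Oracle escapes ' by doubling it
        acct)
    for acct, val in rows
]
print("\n".join(stmts))
```

Paste the printed statements into SQL*Plus and commit, exactly as with the formula-built version.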

db2 sql query in excel

I have a number of very simple queries that I run for others on my team and am placing them in Excel so that the end user can just execute the query themselves by opening the spreadsheet.
I'm currently using an ODBC driver to connect to the DB2 server.
All queries work fine but one is giving me a headache in getting it to work correct in Excel.
One of the queries has a WHERE clause whose value differs depending on the situation.
ex.
SELECT *
FROM TABLE1 T1
WHERE T1.T1_ID = 859745
What I would like is to set it up so that the query runs like the following. Is it possible to do this through a variable somehow?
SELECT *
FROM TABLE1 T1
WHERE T1.T1_ID = "USER ENTERED VALUE FROM COLUMN A ROW 1 IN THE EXCEL SHEET"
I'm assuming you're using Microsoft Query. First add a parameter to the query. Then you can customize the parameter to use an Excel cell as the input for the parameter.
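Under the covers, the Microsoft Query parameter is just a placeholder in the WHERE clause that gets bound at refresh time. The same idea, sketched with SQLite so it is runnable without a DB2 connection (the table, column, and value are from the question; cell_a1 stands in for the value read from the worksheet):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TABLE1 (T1_ID INTEGER, NAME TEXT)")
conn.execute("INSERT INTO TABLE1 VALUES (859745, 'match'), (1, 'other')")

cell_a1 = 859745  # the "user entered value from column A row 1"
rows = conn.execute(
    "SELECT * FROM TABLE1 T1 WHERE T1.T1_ID = ?",  # ? is the bound parameter
    (cell_a1,)
).fetchall()
print(rows)  # → [(859745, 'match')]
```

In Microsoft Query the `?` placeholder plays the same role; the cell reference you configure supplies the bound value on every refresh.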