I have an Excel 2010 workbook. In its VBA code, one procedure fetches data from SQL Server 2008 Express.
There are repeated calls, and the same (or filtered) data is often fetched even though it is already present in the Excel document.
That makes it a good use case for caching (to improve performance).
Is it possible? If yes, how?
SQL Server Native Client is used to connect to the database.
When you update the underlying data that you wish to cache, also store the date updated. This could be done manually or via a trigger.
When you perform your query for the main data in Excel, also query for and store in the Excel spreadsheet the date last updated.
Finally, when you perform your data-refresh operation from Excel, first query to see if the current last-updated date is the same as the stored last-updated date. If so, there is no need to refresh the data.
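For the trigger route, here is a minimal T-SQL sketch; dbo.CacheInfo, dbo.YourTable, and the trigger name are hypothetical:

-- One row per cached table, holding its last-updated stamp.
CREATE TABLE dbo.CacheInfo (
    TableName   sysname PRIMARY KEY,
    LastUpdated datetime NOT NULL
);
INSERT INTO dbo.CacheInfo VALUES ('YourTable', GETDATE());
GO
-- Keep the stamp current whenever the cached table changes.
CREATE TRIGGER dbo.trg_YourTable_Stamp
ON dbo.YourTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE dbo.CacheInfo
    SET LastUpdated = GETDATE()
    WHERE TableName = 'YourTable';
END;
GO
-- The cheap check Excel runs before deciding whether to refresh:
SELECT LastUpdated FROM dbo.CacheInfo WHERE TableName = 'YourTable';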
If your data has an inherent "last updated" date and there is an index of any kind with this value as the first column within it, then you already have your "last updated date" stored just fine--it will take only one seek to read the most recent updated date, from which you can derive the same optimization.
SELECT TOP 1 DateChanged FROM dbo.YourTable ORDER BY DateChanged DESC;
Assuming the index I mentioned exists on DateChanged, you've got your "table last updated" date. This also assumes, of course, that every operation on the table faithfully updates this date when inserting or updating, and that rows are never deleted, only marked inactive (otherwise you would not know that a row had been removed).
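For completeness, the kind of supporting index that makes that query a single seek (the index name is illustrative):

-- With the date as the leading descending key, TOP 1 ... ORDER BY
-- DateChanged DESC reads exactly one row.
CREATE INDEX IX_YourTable_DateChanged ON dbo.YourTable (DateChanged DESC);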
Either way -- explicitly saving a separate last-updated date or implicitly using an existing column -- you now have a way to cache your data.
It may help you to think about how browsers and web servers perform this task, which is pretty much exactly how I outlined it: the file being requested has a modified date, and this date is exchanged with the client browser first. Only if the file has a newer date than the browser's cached copy does the browser request the actual file contents.
Using Excel 2013, I need to create a search of a SQL table on a networked SQL Server for a specific tag within a given date range. For instance: the table has every make of Chevy made since 1990 and the units sold each day. As an example, search for Corvette with a date range of 1 Jul 2017 to 31 Jul 2017; the result would show how many Corvettes were sold on each day in July 2017, and I also need the total sold for the entire month of July.
This is important: I only want them to access the data, NOT be able to make changes to the database.
I can do this in SQL no problem, but I have to set it up so that someone who does not have direct access to the SQL database can run this query and get the information they need. The eventual goal is to have an Access frontend for this.
Edit: I am no expert in either Excel or SQL, but I know enough to get some things done. One of the many hats I wear...
Thanks,
In Excel, Data tab, New Query, From Database, pick the type of database, add connection info, click Advanced options, put your query in the SQL statement window, hit OK, hit Load. Now you have SQL results in a spreadsheet. If you want them to update regularly, go to Data tab again, Connections, select the connection you just made from the list, click Properties, and set it to refresh every hour, or when opening the file, or whatever makes sense to you.
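As a sketch of what you might paste into that SQL statement window, based on the Corvette example (the table and column names are assumptions):

-- Daily Corvette sales for July 2017, plus a grand-total row
-- (SaleDate = NULL) produced by ROLLUP.
SELECT SaleDate, SUM(UnitsSold) AS UnitsSold
FROM dbo.ChevySales
WHERE Model = 'Corvette'
  AND SaleDate >= '20170701' AND SaleDate < '20170801'
GROUP BY ROLLUP(SaleDate)
ORDER BY SaleDate;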
This will only work for users that have permissions to the database. If you need it to be more secure (not hold data in the spreadsheet at all, but only access it from the database), in the connections properties click Refresh data when opening the file, then click Remove data from the external data range before saving the workbook.
I am working with Office 2016 Excel and connecting to an Oracle DB.
I am creating a file for fetching orders: the part number, the desired delivery dates, the actual delivery dates, the average lead time, the average consumption, and finally the current stock level.
I have created a SQL query to fetch the orders, the dates, and the lead time. However, in the database, the current stock level is obtained through a procedure that takes as input the part number and the location (which can be '%' for the overall stock level). Now, I could build a query that takes all orders and all part numbers, runs the procedure for every single part number, and then build my Power Query on top of that, but that seems like a hideous waste of processing power.
The question is: is there a way to append or merge into the Power Query by calling the procedure after I have filtered the initial source results, thus running the procedure only for orders made in the last month, or only on the filtered part numbers?
I have tried looking in the usual places (support.office, Google, and here), but my problem is that I (overwhelmingly) get only results about how to append or merge queries, which is trivial and basically a version of the unwanted situation.
Stian,
If your procedure is not a stored procedure but a function, then you can create a separate query for this function and then add a new column to the filtered table. This column gets its value by executing the function with parameters taken from other columns.
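As an illustration of what such a function might look like on the Oracle side (the function name, the stock table, and its columns are assumptions):

-- Hypothetical function form of the stock-level procedure, callable per row.
CREATE OR REPLACE FUNCTION stock_level_fn(p_part IN VARCHAR2,
                                          p_loc  IN VARCHAR2)
RETURN NUMBER IS
    v_stock NUMBER;
BEGIN
    SELECT NVL(SUM(qty), 0) INTO v_stock
    FROM stock
    WHERE part_number = p_part
      AND location LIKE p_loc;
    RETURN v_stock;
END;
/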
The other way of doing this is to create a view containing all the columns you'd like, including the one generated by the procedure. Then you query this view in Power Query and apply your filters to the result. This should trigger query folding, which passes your filters to the server, so it optimizes the query and doesn't fetch unneeded rows.
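A sketch of such a view, reusing the hypothetical stock_level_fn from above (the view name and order columns are placeholders):

-- Filters applied to this view in Power Query can fold back to Oracle.
CREATE OR REPLACE VIEW orders_with_stock AS
SELECT o.order_no,
       o.part_number,
       o.desired_delivery_date,
       o.actual_delivery_date,
       stock_level_fn(o.part_number, '%') AS current_stock
FROM orders o;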
As far as I know, native queries (where you directly write an SQL query to execute) are not subject to query folding. They are also rather insecure and generally an ad-hoc solution; keep this in mind.
Google for "Query folding in Power Query" if you'd like more info.
Still, for the most optimal scenario, I would consider using a function instead of a procedure (if that's possible, of course) and combining it with a view to get the data you want.
I want to delete all the rows from a table that haven't been read or created within the last year.
There is no column that indicates the last access date of a row.
Is there a way to accomplish this anyway, using some internal logs or anything like that in MS Access?
There is no way to do this. Access doesn't keep this sort of log (nor does any DBMS I know of).
You would need a "date_created" or "date_accessed" column to achieve this.
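Once such a column exists and is maintained, the yearly cleanup is a one-liner in Access SQL (MyTable and date_accessed are placeholder names):

-- Delete rows not accessed within the last year.
DELETE FROM MyTable
WHERE date_accessed < DateAdd("yyyy", -1, Date());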
BTW, in a continuous form or datasheet view, all rows are read (or all the filtered ones), so tracking a "last read" date would not be very meaningful anyway.
In my current database I have a table whose data is either entered manually or comes in from an Excel sheet every week. Before we had the "manual entry" option, the table would simply be dropped and replaced by the Excel version.
Now, because there is data that only exists in the original table, this cannot be done.
I'm trying to find a way to update the original table with the changes and additions from the Excel sheet while preserving all rows that are not in the new sheet.
I've been attempting to simply use an insert query and an update query, but I can't find a way to detect changes in a record.
Any suggestions? I can provide the current sql if you'd find that helpful.
Based on what I have read so far, I think I can offer some suggestions:
It appears you have control of the MS Access database. I would suggest adding a field called "source" to your data table. Modify the form in your Access database to store something like "m" (for manual entry) in the source field. When you import the Excel sheet, store an "e" (for Excel) in the field.
You would need to do a one time scrub of the data to mark existing records as manual entries or excel entries. There are a couple of ways you can do it through automation/queries that I can explain in detail if you want.
Once past these steps, your Excel process is fairly simple: delete all records with source = "e" and then do a full Excel import. Manual records remain unchanged.
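A sketch of that weekly refresh in Access SQL; DataTable, ExcelImport (e.g., a linked Excel table), and the columns are placeholders:

-- Remove only the Excel-sourced rows; manual rows survive.
DELETE FROM DataTable WHERE source = 'e';

-- Re-import the sheet, tagging every row as Excel-sourced.
INSERT INTO DataTable (part_no, qty, source)
SELECT part_no, qty, 'e' FROM ExcelImport;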
This concept will let you add new sources and codes and handle each one differently if needed. You just need to spend some time cleaning up your old data; I think you will find it worth it in the end.
Good Luck.
Problem
How best to parse/access/extract Excel file data stored as binary data in a SQL 2005 field?
(So that all the data can ultimately be stored in other fields of other tables.)
Background
Basically, our customer requires a large volume of verbose data from their users. Unfortunately, our customer cannot require any kind of DB export from their users, so our customer must supply some sort of UI for the users to enter the data. The UI our customer decided would be acceptable to all of their users was Excel, as it has a reasonably robust UI. So, given all that, our customer needs this data parsed and stored in their DB automatically.
We've tried to convince our customer that the users will do this exactly once and then insist on DB export! But the customer cannot require DB export of their users.
Our customer requires us to parse an Excel file.
The customer's users use Excel as the "best" user interface to enter all the required data.
The users are given blank Excel templates that they must fill out.
These templates have a fixed number of uniquely named tabs.
These templates have a number of fixed areas (cells) that must be completed.
These templates also have areas where the user will insert up to thousands of identically formatted rows.
When complete, the Excel file is submitted by the user via a standard HTML file upload.
Our customer stores this file raw in their SQL database.
Given
A standard Excel (".xls") file (native format, not comma- or tab-separated).
The file is stored raw in a varbinary(max) SQL 2005 field.
The Excel file data may not necessarily be "uniform" between rows -- i.e., we can't just assume one column is all the same data type (e.g., there may be row headers, column headers, empty cells, different "formats", ...).
Requirements
Code completely within SQL 2005 (stored procedures, SSIS?).
Be able to access values on any worksheet (tab).
Be able to access values in any cell (no formula data or dereferencing needed).
Cell values must not be assumed to be "uniform" between rows -- i.e., we can't just assume one column is all the same data type (e.g., there may be row headers, column headers, empty cells, formulas, different "formats", ...).
Preferences
No filesystem access (no writing temporary .xls files).
Retrieve values in a defined format (e.g., an actual date value instead of a raw serial number like 39876).
My thought is that anything can be done, but there is a price to pay. In this particular case, the price seems to be too high.
I don't have a tested solution for you, but I can share how I would give my first try on a problem like that.
My first approach would be to install Excel on the SQL Server machine and code some assemblies that consume the file in your rows using the Excel API, then load them into SQL Server as assembly procedures.
As I said, this is just an idea; I don't have the details, but I'm sure others here can complement or criticize it.
But my real advice is to rethink the whole project. It makes no sense to read tabular data from binary files stored in a cell of a row of a table in a database.
This looks like an "I wouldn't start from here" kind of a question.
The "install Excel on the server and start coding" answer looks like the only route, but it simply has to be worth exploring alternatives first: it's going to be painful, expensive and time-consuming.
I strongly feel that we're looking at a "requirement" that is the answer to the wrong problem.
What business problem is creating this need? What's driving that? Try the Five Whys as a possible way to explore the history.
It sounds like you're trying to store an entire database table inside a spreadsheet and then inside a single table's field. Wouldn't it be simpler to store the data in a database table to begin with and then export it as an XLS when required?
Without opening up an instance of Excel and having Excel resolve worksheet references, I'm not sure it's doable at all.
Could you write the varbinary to a Raw File Destination and then use an Excel Source as the input to whatever step is next in your precedence constraints?
I haven't tried it, but that's what I would try.
Well, the whole setup seems a bit twisted :-) as others have already pointed out.
If you really cannot change the requirements and the whole setup: why don't you explore components such as Aspose.Cells or Syncfusion XlsIO, native .NET components that allow you to read and interpret native Excel (XLS) files? I'm pretty sure that with either of the two you should be able to read your binary Excel into a MemoryStream, feed that into one of those Excel-reading components, and off you go.
So with a bit of .NET development and SQL CLR, I guess this should be doable - not sure if it's the best way to do it, but it should work.
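For reference, the T-SQL half of that SQL CLR route would look roughly like this; the assembly name, DLL path, class, and procedure are all placeholders, and the actual XLS parsing would live in the .NET method:

-- Allow CLR code to run in this SQL 2005 instance.
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO
-- Register the .NET assembly that wraps the Excel-reading component.
CREATE ASSEMBLY XlsParser
FROM 'C:\libs\XlsParser.dll'
WITH PERMISSION_SET = UNSAFE;  -- third-party readers often need this
GO
-- Expose its entry point as a stored procedure taking a document id.
CREATE PROCEDURE dbo.ParseStoredXls @DocId int
AS EXTERNAL NAME XlsParser.[XlsParser.StoredProcedures].ParseStoredXls;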