Access VBA: Copy single record to memory

I am converting an amalgamation of Excel VBA worksheets to an Access database. As happens all too often, Excel was being used as a database, so converting all the data to Access tables was very easy.
Now, however, comes getting the VBA forms and (object) modules to work again. I think I need to do this using the Recordset2 object, but I want to be sure I am doing it the right way. The steps I have in mind:
1) Always open the database read-only.
2) Copy, not link, one complete record into the VBA application's memory.
3) Use input data from the user, together with data from the record, to change some of the data in that in-memory record (or change nothing if the user supplies no data).
4) Pass the record in memory, changed or unchanged, to the next step in the process.
Am I on the right track with Recordset2?
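For illustration, a minimal DAO sketch of that flow; the table Orders, the key field ID, the field Note, the file path, and the NextStep routine are all hypothetical placeholders, and Recordset2 (which simply extends Recordset) fits this pattern:

Sub ProcessOneRecord(ByVal recordId As Long, ByVal userInput As String)
    Dim db As DAO.Database
    Dim rs As DAO.Recordset2
    Dim rec As Object ' Scripting.Dictionary holding the in-memory copy
    Dim f As DAO.Field

    ' 1) Open the database read-only; dbOpenSnapshot also makes the
    '    recordset itself read-only.
    Set db = DBEngine.OpenDatabase("C:\data\shipments.accdb", False, True)
    Set rs = db.OpenRecordset( _
        "SELECT * FROM Orders WHERE ID = " & recordId, dbOpenSnapshot)

    ' 2) Copy, not link: pull every field value into plain VBA memory.
    Set rec = CreateObject("Scripting.Dictionary")
    For Each f In rs.Fields
        rec(f.Name) = f.Value
    Next f
    rs.Close: db.Close ' the copy now lives independently of the database

    ' 3) Change the in-memory record from user input, or leave it untouched.
    If Len(userInput) > 0 Then rec("Note") = userInput

    ' 4) Hand the record, changed or unchanged, to the next step.
    NextStep rec
End Sub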

Related

Excel data validation list from closed workbook

I have two Excel 2016 workbooks in the same folder. One is a shipment form and the other is the master database for shipment history. The latter file also holds the list of "active customers", which obviously changes over time.
When a user creates a new shipment from the shipment form template, I want the customer to be selected from a drop-down list rather than filled in manually. While I'm able to set up standard data validation for a cell, the problem is that the list is maintained in a different Excel file (which, as stated, is in the same folder).
I was also able to set up a name reference and make the list from the other workbook available; however, I still have one major usability obstacle: it only works while the user has the other file open. I want to avoid that and just have the user work with the shipment form template.
In other words, I'm looking for Excel to fetch the data validation list from another file without forcing the user to keep that file open. What are my chances here?
Thanks!
You can use Power Query to get the data from the other file, even while it is closed. Power Query can pull the list into the current workbook, and you can build the data validation on the Power Query result. You can configure the query to refresh when the file is opened, or the user can refresh it manually.
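If you want the refresh to be automatic, here is a minimal sketch for the ThisWorkbook module; the connection name "Query - ActiveCustomers" is a placeholder (Power Query connections are usually named "Query - <query name>"):

Private Sub Workbook_Open()
    ' Refresh the Power Query connection so the validation list is current.
    ' Deliberately crude error handling: skip the refresh if the source
    ' file is missing or locked.
    On Error Resume Next
    ThisWorkbook.Connections("Query - ActiveCustomers").Refresh
    On Error GoTo 0
End Sub

Alternatively, no VBA is needed at all if you tick "Refresh data when opening the file" in the connection's properties.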

How to temporarily store a recordset result on the client to reduce DB calls

I have been doing Excel VBA development recently, and I found that a recordset retrieved from the database is no longer available after Connection.Close. Thus, every time I need some master data, I have to call the DB to get it again, which may hurt performance badly. I wonder if there is some way to store a recordset temporarily on the client before the workbook closes, so I can use it without further DB calls. BTW, I was a C# programmer before, and I used ADO.NET to handle interaction with the database. One of the biggest differences between ADO and ADO.NET is that a DataSet supports offline querying, whereas a Recordset needs the connection to stay available, I think.
1) Can I make a recordset support offline querying? (Even after the connection is closed, the data is still reachable.)
2) Are there solutions for temporarily storing a DB query result before the workbook closes?
3) C# has collection types such as Dictionary, List, and HashTable; does VBA have similar data structures?
Recordsets can be disconnected (and later reconnected) and kept in memory, or saved to a file in XML or a binary format. See the various methods and properties of a recordset. From the documentation for Recordset.Save:
Saves the Recordset in a file or Stream object.
recordset.Save Destination, PersistFormat
Parameters
Destination
Optional. A Variant that represents the complete path name of the file where the Recordset is to be saved, or a reference to a Stream object.
PersistFormat
Optional. A PersistFormatEnum value that specifies the format in which the Recordset is to be saved (XML or ADTG). The default value is adPersistADTG.
And from the Help:
One of the most powerful features found in ADO is the capability to open a client-side Recordset from a data source and then disconnect the Recordset from the data source. Once the Recordset has been disconnected, the connection to the data source can be closed, thereby releasing the resources on the server used to maintain it. You can continue to view and edit the data in the Recordset while it is disconnected and later reconnect to the data source and send your updates in batch mode.
To disconnect a Recordset, open it with a cursor location of adUseClient, and then set the ActiveConnection property equal to Nothing. (C++ users should set the ActiveConnection equal to NULL to disconnect.)
We will use a disconnected Recordset later in this section when we discuss Recordset persistence to address a scenario in which we need to have the data in a Recordset available to an application while the client computer is not connected to a network.
You already have it in an object variable, and it's not going anywhere while your program runs. It takes one line of code to disconnect, one line to save it to disk if you want it in future runs of your program (you don't have to save it), and one line to reconnect it.
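Putting those pieces together, a minimal ADO sketch (requires a reference to the Microsoft ActiveX Data Objects library; the connection string, SQL, and file path are placeholders):

Sub CacheMasterData()
    Dim cn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set cn = New ADODB.Connection
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
            "Initial Catalog=MyDb;Integrated Security=SSPI" ' placeholder

    Set rs = New ADODB.Recordset
    rs.CursorLocation = adUseClient ' client-side cursor: required to disconnect
    rs.Open "SELECT * FROM MasterData", cn, adOpenStatic, adLockBatchOptimistic

    Set rs.ActiveConnection = Nothing ' one line: disconnect
    cn.Close                          ' server resources are released

    ' rs stays fully usable here: Filter, Sort, Find, even edits.

    ' Optional one-liners for future runs (Save errors if the file exists):
    rs.Save "C:\temp\master.xml", adPersistXML
    ' ...later: Set rs = New ADODB.Recordset: rs.Open "C:\temp\master.xml"

    ' One line to reconnect (after reopening cn), then push edits in batch:
    ' Set rs.ActiveConnection = cn: rs.UpdateBatch
End Sub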

Excel Workbook Connection Makes File Size Large

I have a series of about 30 Excel reports (.xlsm), which each have a unique connection configured to a database. The connection is just a short SQL script which grabs the data to be shown on the report. The data is populated into a table (not a pivot table).
Every week we need to update these reports, so I use a simple PowerShell script to open each of the files and refresh the connection.
Every so often we need to send the base 30 reports to other work groups so they can manually update the files on their own. This can be a nuisance because some of the reports are very large (30 MB+), which makes emailing difficult, and uploading and downloading them several times a day is just a hassle.
To mitigate this, before we distribute the report templates I try to delete all the rows in the tables, and any unused range. This has helped, but several files are still VERY large (30 MB+) even though we've deleted everything in the workbook except the connection and the empty table.
Through testing, I've realized that if I delete the configured connection, the file size becomes sufficiently small (<1 MB), which is what I would expect. This leads me to believe that Excel connections have a sort of cache that needs to be cleared; however, I can't find any references to this.
Does anyone know a simple way of reducing the size of a connection, ideally one I could apply programmatically using VBA/PowerShell?
If deleting the configured connection reduces your file size enough, you could write a macro to delete your connections and another to re-establish them (see the sketch after the quote below). As Noldor130884 suggested, you can run the macros automatically from the Workbook_Open and Workbook_BeforeClose event handlers.
Office Online - Create, Edit & Manage connections to external data
The above reference seems to make the relevant statement below:
"Removing a connection only removes the connection and does not remove any object or data from the workbook."
It looks to me as if the problem is with the formatting. I don't know why, but in my files Excel reformatted all rows and columns while adding the table of data from the connection. The sheet was therefore very large, yet if you check the underlying XML it contains only formatting data. Once I manually deleted all the "empty" rows, the file size was normal again. Hope that helps; it helped in my case.
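If stray formatting is the culprit, here is a rough sketch that deletes everything below the last cell actually holding data; run it before saving, and test on a copy first:

Sub TrimExcessRows()
    Dim ws As Worksheet
    Dim lastCell As Range
    For Each ws In ThisWorkbook.Worksheets
        ' Find the last cell containing anything; Nothing means an empty sheet.
        Set lastCell = ws.Cells.Find(What:="*", LookIn:=xlFormulas, _
                                     SearchOrder:=xlByRows, _
                                     SearchDirection:=xlPrevious)
        If Not lastCell Is Nothing Then
            If lastCell.Row < ws.Rows.Count Then
                ' Delete the formatted-but-empty rows that bloat the saved XML.
                ws.Rows(lastCell.Row + 1 & ":" & ws.Rows.Count).Delete
            End If
        End If
    Next ws
End Sub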

Can I update data in Essbase using the Excel add-in/Smart View

Can I insert new data or update existing data in Essbase using the Excel add-in/Smart View, like I update data in the Palo multidimensional database?
regards,
Sri.
Yes. This is what Lock & Send is used for. After you have drilled to an intersection whose data you would like to update/load/change, you enter the new values directly within Excel. Then perform a Lock operation using the add-in or Smart View. This tells Essbase that you would like to update the data currently being shown on your spreadsheet. Then perform a Send operation. This will upload all of the data on your sheet back to the database, assuming that you have access to change that data (if you are a read-only user or don't have sufficient filter access, for example, then you can't change it). Note that all of the data in the spreadsheet will be sent up, so it is useful to navigate to the smallest possible subset of data that you would like to change.
After sending the data, it will automatically be unlocked. Then just retrieve the sheet to verify that the data you uploaded did in fact arrive. If you are trying to upload to members that are dynamic calc, for example, it won't work. Also note that data is typically loaded so that every intersection point is a Level-0 member; if not, a subsequent aggregation/calc in the database might erase the data you just uploaded.
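If you need to script the same Lock & Send flow, the classic Essbase Spreadsheet Add-in exposed a VBA toolkit; the sketch below is from memory, so verify the declarations against your toolkit's essxlvba module (Smart View ships its own, different VBA functions):

' Declarations as shipped with the classic (32-bit) add-in toolkit.
Declare Function EssVRetrieve Lib "essexcln.xll" _
    (ByVal sheetName As Variant, ByVal rangeObj As Variant, _
     ByVal lockFlag As Variant) As Long
Declare Function EssVSendData Lib "essexcln.xll" _
    (ByVal sheetName As Variant, ByVal rangeObj As Variant) As Long

Sub LockAndSend()
    Dim rc As Long
    rc = EssVRetrieve("Sheet1", Null, 2) ' 2 = retrieve and lock the blocks
    ' ...write the new values into the retrieved intersections here...
    rc = EssVSendData("Sheet1", Null)    ' sends the whole sheet, then unlocks
End Sub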

VSTO Cached dataset becoming NULL

We have a VSTO Excel add-in that loads data by making web service calls. Once we get the data in the web service response, we save it to a dataset and load it onto the Excel sheet with the help of a list object. The dataset is always cached so that when the user opens the workbook the next time, we need not make a web service call again to load the data; it must load from the cached dataset.
The problem is that for some workbooks (not all), the dataset saves and loads properly after the first save of the workbook, but the same dataset becomes null from the second save onwards. The dataset has 4,200+ records with 150+ columns. I would like to mention that from the second save onwards all the other cached variables still have values; only the cached dataset is null. The following steps illustrate the problem:
1) Open the Excel template and load data via web service calls. (The dataset is now loaded.)
2) Save and close the workbook.
3) Reopen the same workbook. The dataset is still intact, and the workbook loads properly.
4) Save and close the workbook.
5) Reopen the same workbook. The dataset is null, so data cannot be loaded into Excel. Existing Excel rows get deleted because the list object behaves as if it were opened for the first time.
Does anyone have any idea what could be wrong? We are using .NET Framework 3.5/4.0 and Excel 2007. Excel 2010 does not exhibit the problem.