Is there a way to have a split, multi-user Access database that queries one shared Excel file?

I ran into a multi-user related problem in my MS Access application.
I have a split database, with multiple front-ends and multiple back-ends. The file locations of the back-end databases are stored in an Excel file, which enables the users to switch between the different back-end databases. The front-end contains a combo box whose row source is a query that reads the data from the Excel file:
SELECT XL.*
FROM (SELECT * FROM [Speicherorte BE_DB$] AS xlData IN 'Z:\LocationXYZ\Speicherort_BE_DB.XLSX'[Excel 12.0 Xml;HDR=yes;IMEX=0;ACCDB=Yes]) AS XL
ORDER BY gemeinde;
However, I noticed that this approach does not work when multiple users run the application at the same time.
Error: "External table is not in the expected format"
This error does not occur when only one person is using the application.
Does anyone have an idea how to fix this? Maybe only query the Excel file once upon startup and then have "static values" in the combo box for the rest of the time?
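Something along these lines is what I have in mind; a minimal sketch, where the control name cboBackend is made up and the Excel source string is the one from the query above:

Private Sub Form_Load()
    Dim rs As DAO.Recordset
    Dim items As String
    ' Query the Excel file exactly once, at startup.
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT gemeinde FROM [Speicherorte BE_DB$] " & _
        "IN 'Z:\LocationXYZ\Speicherort_BE_DB.XLSX'[Excel 12.0 Xml;HDR=yes;IMEX=0;ACCDB=Yes] " & _
        "ORDER BY gemeinde")
    Do Until rs.EOF
        items = items & """" & rs!gemeinde & """;"
        rs.MoveNext
    Loop
    rs.Close
    ' From here on the combo box holds static values; the file is no longer touched.
    Me.cboBackend.RowSourceType = "Value List"
    Me.cboBackend.RowSource = items
End Sub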
Thx in advance and best regards,
Michael

Related

Datamacro can't be found after splitting database

I have an Access 2007 database with data macros in the tables. After splitting the database, because I want multiple users to work in it, the macros can't be found.
How do I fix this?
Open the backend database directly with Access. That should reveal the macros.

Import external .txt table to PowerPivot using VBA?

I'd like to import some .txt tables to PowerPivot without clicking "From Other Sources" --> "Text File", but rather by running a VBA macro.
The idea is that there are several .txt tables, say
C:\Table1.txt
C:\Table2.txt
C:\Table3.txt
etc.
and to create a user form or similar so that a user can select which tables they need; VBA then creates a single appended table in PowerPivot from the ones selected. I'd know how to do that if not for two parts:
1) How to import a table to PowerPivot from an external source (C:\) using a VBA command?
2) How to "append" those tables into one through VBA such that they wouldn't appear as different tables in PowerPivot, but rather as one table with always the same name?
I can find tangentially related questions and information, but no working examples of how to automate importing tables from external .txt sources (or .csv or .accdb, for that matter) into a single PowerPivot table like this.
Thank you very much!
Power Query provides a GUI-driven way to do exactly what you want without VBA. You may want to consider that tool instead, as it interfaces natively with Power Pivot and can be embedded in a hosted workbook, whereas the VBA solution could never work on SharePoint or Power BI.
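If you do want to stay in VBA, here is a rough sketch that drives Power Query from code instead of clicking. It assumes Excel 2016 or later (for Workbook.Queries), uses the three example file paths from the question, and makes up the query name "Combined"; a real version would build the M text from whatever the user picked on the form.

Sub CombineTextFilesIntoPowerPivot()
    Dim m As String
    ' Build the M formula; Csv.Document also reads plain .txt files,
    ' and Table.Combine assumes the files share the same column layout.
    m = "let" & vbCrLf & _
        "    T1 = Csv.Document(File.Contents(""C:\Table1.txt""))," & vbCrLf & _
        "    T2 = Csv.Document(File.Contents(""C:\Table2.txt""))," & vbCrLf & _
        "    T3 = Csv.Document(File.Contents(""C:\Table3.txt""))," & vbCrLf & _
        "    Combined = Table.Combine({T1, T2, T3})" & vbCrLf & _
        "in" & vbCrLf & _
        "    Combined"
    ThisWorkbook.Queries.Add Name:="Combined", Formula:=m
    ' Load the query into the data model, so Power Pivot always sees
    ' one table with the same name.
    ThisWorkbook.Connections.Add2 _
        Name:="Query - Combined", _
        Description:="Combined .txt tables", _
        ConnectionString:="OLEDB;Provider=Microsoft.Mashup.OleDb.1;Data Source=$Workbook$;Location=Combined", _
        CommandText:="Combined", _
        lCmdtype:=6, _
        CreateModelConnection:=True, _
        ImportRelationships:=False
End Sub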

Excel 2010 Data -> "Refresh All" slow

I am presently working with an Excel spreadsheet that makes at least 10 database queries to external Microsoft Access *.mdb files. For each project my company works on, we have a specific Excel file related to that project, so we have hundreds of these files. Usually when an analyst opens the Data tab and clicks "Refresh All" the refresh completes in a minute or two; however, on a new project for a given Excel file it is taking at least an hour to complete. Here is an example of one of the connection strings:
DSN=MS Access Database;DBQ=W:\Projects\Analysis\project.mdb;DefaultDir=W:\Projects\Analysis\Analysis;DriverId=25;FIL=MS Access;MaxBufferSize=2048;PageTimeout=5;
And here is the associated query:
SELECT Field.FieldNumber, Field.FieldName, Field.GroupMnemonic, Field.ClientFieldID
FROM Field Field
ORDER BY Field.FieldName
I have spent time studying various websites discussing slow Excel issues, like http://msdn.microsoft.com/en-us/library/ff700515.aspx; however, these websites deal more with calculations and VBA, whereas I suspect the performance problem is somewhere in an Access file. Does anyone have any suggestions on how to troubleshoot and resolve this issue? TIA.
UPDATE: As suggested in the answer below by JohnFx, I checked the queries and found that the tables had no keys defined, so I added keys in the Microsoft Access database generation like this:
CREATE UNIQUE INDEX PIndex ON [myTable] ([KEY])
Run the queries individually directly in Access to rule Excel in or out as part of the problem. If the queries are still slow in Access, consider adding indexes on any columns that are being sorted or filtered on.
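For example, since the query above sorts on Field.FieldName, that column is the first candidate. A minimal DAO sketch, run once against the back-end .mdb (the index name is made up):

Dim db As DAO.Database
Set db = DBEngine.OpenDatabase("W:\Projects\Analysis\project.mdb")
' Index the column the query sorts on.
db.Execute "CREATE INDEX idxFieldName ON Field (FieldName)"
db.Close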

Race condition caused by cursors persisted to temp files - is it possible?

I'm troubleshooting a problem with a Visual Fox Pro application (built with the Visual Fox Express framework) which I suspect is being caused by a race condition. The application is being hosted on a Citrix XenApp server and under certain conditions, data displayed on a certain form appears to be incorrect, and changes to something other than what the user is entering.
The form in question displays a list of records returned from a query on a SQL Server database based on certain information entered by the user.
If this is what is happening, I suspect the sequence of events is something like this:
1) User 1 enters data and causes the form to display a grid of results returned from the database.
2) User 2 opens the same form in a different Citrix session and enters data, causing the form to display a grid of results returned from the database. This cursor gets persisted to disk and overwrites, or somehow conflicts with, User 1's cursor for that form.
3) Some FoxPro cursor mechanism on User 1's instance sees changed data in the cursor (from User 2) and updates the screen with data from the cursor.
I don't know much about how FoxPro works, but from what I understand, in some circumstances a cursor will be persisted to a temp file. On our Citrix application server this temp folder may be shared by between 10 and 50 users. I'm looking for information on whether a race condition caused by a cursor written to a file in the temp folder is even possible, so that I can continue researching down that path or rule it out definitively.
I know there are ways to make the FoxPro temp files be written to a different folder for each user, and I am working on making that change, but I would like to find out if anyone else has seen a similar problem or thinks that what I suspect is actually possible.
It does sound strange, but yes, FoxPro creates temp tables for the cursors it uses for display and query results, such as local or remote data access. However, they are created as read-only or read-write, but ONLY per person, per connection. When a cursor is created, FoxPro generates a random file name for the results and uses that as the .dbf cursor presented to the user.
Could it be a race condition? I doubt it, but not knowing the specifics of the quite old Visual FoxExpress framework, I don't know what/where you would configure it to dynamically use a different location for temp files. They should go to the temp path from the Windows environment variables. So if users of the Citrix connection are using the same user/password for multiple sessions, yes, they would go to the same location, but when trying to generate a temp file, FoxPro would fail to get an exclusive handle and try again with the next random file name.
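For reference, the temp file location can be redirected in CONFIG.FPW; a minimal sketch, assuming each user or Citrix session is given its own folder (the paths are made up):

TMPFILES = D:\VFPTemp\User1
SORTWORK = D:\VFPTemp\User1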
I'd say it is very unlikely that temp files are implicated here. Each cursor you create uses a different temp file; I don't see how two users, even in a Citrix-type situation, would share a single temp file.

Writing data back to SQL from Excel sheet

I know it is possible to get data from a SQL database into an Excel sheet, but I'm looking for a way to edit the data in Excel and, after editing, write it back to the SQL database.
It appears this is not a built-in function in Excel, and Google didn't come up with much that was useful.
If you want to have the Excel file do all of the work (retrieve from DB; manipulate; update DB) then you could look at ActiveX Data Objects (ADO). You can get an overview at:
http://msdn.microsoft.com/en-us/library/ms680928(VS.85).aspx
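A minimal late-bound ADO sketch of the write-back step; the connection string, table, and field names are all assumptions:

Dim cn As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
' Push one edited value back to the database.
cn.Execute "UPDATE MyTable SET Field1 = 'Value1' WHERE Field2 = 'Value2'"
cn.Close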
You want the Import/Export wizard in SQL Server Management Studio. Depending on which version of SQL Server you are using, open SSMS (connect to the SQL instance you desire), right-click on the database you want to import into and select Tasks > "Import Data".
In the wizard, click Next (past the intro screen) and from the Data Source drop list select "Microsoft Excel". You specify the path and file name of the Excel spreadsheet and whether you have column headings or not, then press Next. Just follow the wizard through; it'll set up the destination (which can be SQL Server or another destination), etc.
There is help available for this process in SQL Server Books Online and more (a walkthrough) from MSDN.
If you need something deployable/more robust (or less wizard driven) then you'd need to take a look at SQL Server Integration Services (for a more "Enterprise" and security conscious approach). It's probably overkill for what you want to accomplish though.
There is a new Excel plug-in named "MySQL for Excel": http://www.mysql.com/why-mysql/windows/
I just had a need to do this, and this thread has been quiet for a long time, so I thought it might be useful to supply a recent data point.
In my application roving salespeople use a copy of an Excel workbook that tracks the progress of a prospect through a loan application. The current stage of the application needs to be automatically saved back to a remote SQL database so that we can run reporting on it.
Rejected methods for updating the database from Excel:
SSIS and OpenRowSet are both methods for allowing SQL Server to pull the data from Excel, and don't work very well when the Excel workbook is sitting in an undefined location on a user's computer, and certainly not when the workbook is currently open in Excel.
ADO is now, if not actually deprecated, nevertheless looking very long in the tooth. Also, I wanted the solution to be robust in the face of the user possibly not being connected to the internet.
I also considered running a web API on the destination server. Macros in the Excel workbook connect to the web API to transfer data. However, it can sometimes be painful to allow a web API to talk to the outside world. Also, the code to make it robust in the face of temporary loss of internet connection is painful.
The adopted solution:
The solution I plan to adopt is low-tech: email. Excel emails the data to an address hosted on an Exchange server. Everyone in the company has Outlook installed, so the emails are sent by programmatically adding them to the Outlook Outbox. Outlook nicely handles the case when the user is offline. At the server end, a custom C# executable, fired up at regular intervals by the Task Scheduler, polls the inbox and processes the emails.
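A rough sketch of the Excel-side hand-off, assuming Outlook is installed; the address and message format are placeholders:

Dim olApp As Object, mail As Object
Set olApp = CreateObject("Outlook.Application")
Set mail = olApp.CreateItem(0) ' 0 = olMailItem
mail.To = "loan-status@example.com"
mail.Subject = "ProspectID=12345;Stage=ApplicationSubmitted"
mail.Body = "Serialized row data goes here"
mail.Send ' lands in the Outbox; Outlook sends it when a connection is available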
You could try these add-ins:
www.QueryCell.com (I created this one)
www.SQLDrill.com
www.Excel-DB.net
You can use the OPENROWSET function to manipulate Excel data from a T-SQL script. Example usage would be:
UPDATE OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;DATABASE=c:\MySpreadsheet.xls',
'Select * from MyTable')
SET Field1='Value1' WHERE Field2 = 'Value2'