We have an Access DB which has a set of local tables and input forms etc. in which a user maintains their data.
We also have a SQL DB with the same tables which is used to display the data in a web search form.
What is the best way to allow the user to push his changes to the SQL DB while keeping the working copy local, so he can work offline and then upload the data when he is happy with the new version?
My first thought was to add the SQL tables as linked tables; I could then truncate (Access doesn't like that) or delete the content of each table and then do an insert for each table.
Can I call a stored procedure on the SQL Server from Access to truncate the tables? I am having problems running deletes.
I really do want to get it down to the user running a macro/SQL call that is repeatable etc.
Thanks for your help
You should be able to use the ADODB.Command object to execute stored procedures.
EDIT:
This example is copied from Using ADO in VB and Access
Sub ADO_COMMAND_CONNECTION_TEST()
    Dim cmd As New ADODB.Command
    Dim rs As ADODB.Recordset

    ' The connection string is assigned straight to the Command object
    cmd.ActiveConnection = "DRIVER={SQL Server};" & _
        "Server=UKDUDE;DATABASE=pubs;UID=sa;PWD=;"

    cmd.CommandText = "byroyalty"
    cmd.CommandType = adCmdStoredProc
    cmd.Parameters.Refresh            ' pull the parameter definitions from the server
    cmd.Parameters(1).Value = 25      ' first input parameter (index 0 is the return value)
    Set rs = cmd.Execute
    ' Recordset now has authors with 25% royalty.....
End Sub
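For the truncate case in the question, the same pattern works without bringing back a recordset. A rough sketch; the connection string and the stored procedure name (dbo.usp_TruncateStagingTables) are made-up placeholders, not anything from the question:
Sub TruncateViaStoredProc()
    Dim cmd As New ADODB.Command
    ' Hypothetical connection string - adjust server/database/credentials
    cmd.ActiveConnection = "DRIVER={SQL Server};Server=myServer;" & _
        "Database=myDatabase;Trusted_Connection=Yes;"
    cmd.CommandText = "dbo.usp_TruncateStagingTables"   ' hypothetical procedure name
    cmd.CommandType = adCmdStoredProc
    cmd.Execute , , adExecuteNoRecords   ' no recordset needed for a truncate
End Sub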
Don't ever use MS Access linked tables with MS SQL.
Not only are they slow, but Access can leave open client-side write cursors on the tables referenced. That's a really dumb way to create lots of deadlocks, but Access does it anyway.
Microsoft significantly improved this when they added Access Data Projects - in these the entire back end is replaced with SQL and Access just supplies the forms.
If you want user actions to write directly back then ADPs are by far the best method.
If you want to cache changes locally in your Access DB and then send them up to SQL you have a far more complex problem. You will need to be far more specific on exactly how you want synchronisation to happen - for instance if two users make offline changes who wins when they connect?
I don't understand why you just don't link directly to the SQL Server data and use it directly, rather than going to the trouble of maintaining a second copy of it. This is the standard Access way to do things -- why are you resisting the natural capabilities of the tool you're using?
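If you want to create the links in code rather than through the Access UI, a rough sketch using DAO (the server, database and table names below are placeholders):
Sub LinkSqlServerTable()
    Dim db As DAO.Database
    Dim td As DAO.TableDef

    Set db = CurrentDb
    Set td = db.CreateTableDef("Table1")
    ' ODBC connect string pointing at the SQL Server back end
    td.Connect = "ODBC;DRIVER={SQL Server};Server=myServer;" & _
                 "Database=myDatabase;Trusted_Connection=Yes"
    td.SourceTableName = "dbo.Table1"
    db.TableDefs.Append td
End Sub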
I'm having an issue where I've created an Access 2003 ADP and connected to a database on SQL Server 2008 R2. On my PC, the table and view names are the same as in the SQL db (e.g., Table1, View1, etc), and I linked the report forms I built to these tables using those names. However, when I granted a colleague permission to the SQL backend so she can open the ADP and run the reports I built, it fails because on her PC a "dbo_" prefix appears on each table and object name (e.g., dbo_Table1, dbo_View1, etc), which of course breaks the connection to the data source for her. I'm stuck with Access 2003 at the moment. Is there a way to control this either in Access or in the back end? I did change her schema in SQL to dbo as an attempt to fix this, no dice. Thoughts appreciated!
The team I work on uses a small VBA utility to remove all those "dbo_" prefixes before running code against newly linked tables. It works well for us. You can add it in your Access application in whatever way is most convenient for you. Hope it helps. Good luck with your project.
Public Sub Zap_dboLabelsOnLinks()
    Dim dbDef As DAO.Database
    Dim tblDef As DAO.TableDef

    Set dbDef = CurrentDb

    ' Loop through each table in the database and strip the "dbo_" prefix
    For Each tblDef In dbDef.TableDefs
        If UCase(Left(tblDef.Name, 4)) = "DBO_" Then
            tblDef.Name = Right(tblDef.Name, Len(tblDef.Name) - 4)
        End If
    Next tblDef
End Sub
One of my managers created an Access database and is working on some data analysis - what-if scenarios. Based on different conditions, he produces a report in Access.
He asked me to do some data manipulation, so I imported the database into SQL and wrote a routine with a cursor that'll do what he wants. I then export the results back into Access. Before I get any heat for using a cursor, this was supposed to be a one-time-only deal, so that was the fastest way for me to get it done.
As you'd expect, now he wants me to run it all the time and asked me to convert my routine to Access so he can just run it himself. Before you tell me to just use SQL, he's very set on Access and is often traveling and offline.
So, my question is: is there an "easy" way to convert a T-SQL query with a cursor into Access? It's been a long time since I worked with Access, but I suspect it'd have to be rewritten in VBA. I'm thinking that maybe another solution would be to call the query from Access and run it in SQL, but I don't know if that can be done or if it'd work in my case because of him being offline (maybe install SQL Express on his laptop?).
Any suggestions would be appreciated.
Thanks,
Alex
This is how I got around it:
1. Downloaded and installed SQL Server Express on the user's machine.
2. Uploaded the Access database structure and data to the local SQL Server.
3. Created the stored procedure that I wanted to run in the local SQL Server.
4. Back in Access, deleted all the tables and recreated them as linked tables to SQL.
5. Created a form in Access with a big button that executes the stored procedure:
Private Sub Command0_Click()
    Dim qdf As DAO.QueryDef

    ' Build a temporary pass-through query, borrowing the connect string
    ' from any existing linked table
    Set qdf = CurrentDb.CreateQueryDef("")
    qdf.Connect = CurrentDb.TableDefs("ANY TABLE").Connect
    qdf.SQL = "EXEC dbo.[stored procedure name]"
    qdf.ReturnsRecords = False
    qdf.Execute
    Set qdf = Nothing
End Sub
The stored procedure truncates and re-populates one of the tables. So after executing it I can open up the table in Access and see the changes. My manager can continue to use Access and SQL Server is used in the back end. Happy ending! :)
I've been searching for a while and I can't find any help on this issue.
I have a shared Access 2007 database as a source. I have Excel pivots that are linked to a table in that database. I will ultimately have multiple Excel files linked to the database (and multiple users), and for that reason I'd like to avoid residual open connections to the DB for Access performance reasons, even if the connections are read-only.
I've built the connection in Excel, but what I am attempting to do is write VBA code that opens the connection to the Access table, refreshes the pivot cache, and then drops the connection. I know that I'm not putting the pieces together correctly. Can someone help me out? Thank you in advance.
Sub ConnectToAccessAttempt()
    Dim cn As ADODB.Connection
    Set cn = New ADODB.Connection
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=DB_Name.accdb;Persist Security Info=False"
    ' The pivot cache is never actually wired to cn here - this is the part that isn't right
    ActiveWorkbook.PivotCaches(1).CommandText = "table_Name"
    ActiveWorkbook.RefreshAll
    cn.Close
End Sub
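One pattern that may get closer to the goal is to leave the workbook connection as built in Excel, refresh the pivot cache, and then tell Excel not to hold the OLE DB connection open afterwards. A rough, untested sketch; the cache index and the assumption that every connection is OLE DB are mine:
Sub RefreshPivotThenDisconnect()
    Dim wc As WorkbookConnection

    ' Refresh the pivot cache through the connection Excel already has
    ActiveWorkbook.PivotCaches(1).Refresh

    ' Ask Excel not to keep the OLE DB connection open after the refresh
    For Each wc In ActiveWorkbook.Connections
        If wc.Type = xlConnectionTypeOLEDB Then
            wc.OLEDBConnection.MaintainConnection = False
        End If
    Next wc
End Sub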
My company wants approximately 100 of its sales people (distributed around the country) to be able to run stored procedures from Excel and return the data to the spreadsheet.
We have SQL Server 2008. I need to figure out a safe way to do this.
I will create a form in Excel where the user can push a command button to refresh the data based on the parameters they choose.
How can I ensure that the connection from Excel to the SQL Server is secure?
How do I run a stored procedure from Excel?
I found this to be very good information: http://office.microsoft.com/en-us/excel-help/connect-to-import-sql-server-data-HA010217956.aspx
Windows Authentication - Select this option to use the Windows user name and password of the current user. This is the most secure method, but it can affect performance when many users are connected to the server.
However, I would like your input on this.
Yes, the sales reps do have Windows logins, but can I use this solution if they will actually be specifying the data criteria, then sending the criteria over to the stored procedure and getting the data back from the server?
Allowing users direct connections to your database is tricky. First off, you expose yourself to attack from without, as user accounts are compromised more frequently than well-isolated admin and service accounts. Having said that, the user account does need to be compromised to allow an attacker into the system, and you have good granularity of control built into SQL Server if every user has their own credentials.
Using the Excel-native interfaces isn't that different from doing it via VBA or VSTA, which is how most developers did it for the last decade or so. Those methods are about as secure as your network. I believe the Excel-native functionality works without extraneous references, as well, which is particularly nice for maintenance purposes. The main difference seems to be in the ability to do arbitrary queries. For security and data integrity purposes, this is probably for the best.
Running a stored procedure is probably not a good idea, as you can get into massive support requirements if your users start needing (wanting) tweaks frequently. Can you make do with a view? Excel's inbuilt filtering and sorting abilities are pretty powerful. That would be my first approach.
There are several approaches depending on your needs:
1 - Modify your schema to allow the database to tie data to individual users.
2 - Move the access code into a VBA macro associated with the workbook. This is not recommended, but it will allow you to use ADO directly. Be SURE you have a solid security configuration on the database side if you do this, as an attacker who gains access to a user's account will be able to do anything that user can do.
To go the VBA route, in the VBA environment use Tools -> References to add a reference to the latest Microsoft ADO version. The VBA code looks something like this:
Dim Connection As ADODB.Connection
Set Connection = New ADODB.Connection
Connection.Open "Provider=SQLNCLI;Server=myServerAddress;Database=myDataBase;Trusted_Connection=yes;"

Dim command As ADODB.Command
Set command = New ADODB.Command
Set command.ActiveConnection = Connection
command.CommandText = "sp_something"
command.CommandType = adCmdStoredProc

Dim Parameters(1 To 2) As ADODB.Parameter
Set Parameters(1) = New ADODB.Parameter
Parameters(1).Name = "field_name"
Parameters(1).Type = adVarChar
Parameters(1).Size = 50

Set Parameters(2) = New ADODB.Parameter
Parameters(2).Name = "field_name_2"
Parameters(2).Type = adVarChar
Parameters(2).Size = 50

Dim i As Integer
For i = LBound(Parameters) To UBound(Parameters)
    command.Parameters.Append Parameters(i)
Next i

' Assign Parameters(1).Value and Parameters(2).Value before executing
Dim Records As ADODB.Recordset
Set Records = command.Execute
Tie that macro to your button, set up your values via the sheet or an input box, and fire away. But I'll repeat my warning: Going this way leads to massive support requirements. If people want to extract custom data, then they get very particular about it.
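As a purely illustrative example of that last step (the sheet and cell names here are made up, not from the question):
' Hypothetical example: pull the criteria from a worksheet, run the proc,
' then dump the results onto a report sheet
Parameters(1).Value = Worksheets("Criteria").Range("B1").Value
Parameters(2).Value = Worksheets("Criteria").Range("B2").Value
Set Records = command.Execute
Worksheets("Report").Range("A5").CopyFromRecordset Records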
Instead of the approach in the article you linked, I'd rather use a VBA script with a reference to the ADO library and a normal connection string with a technical SQL user.
Since the password would be in the connection string in this case, this technical user should have no other rights than executing your stored procedures.
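For example (the server, database, login and password below are placeholders; SQLNCLI10 assumes the SQL Server 2008 Native Client is installed):
Dim cn As ADODB.Connection
Set cn = New ADODB.Connection
' Technical login that can only EXECUTE the reporting stored procedures
cn.Open "Provider=SQLNCLI10;Server=myServer;Database=myDatabase;" & _
        "Uid=excel_report_user;Pwd=myPassword;"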
Let me know if you need more details.
I am trying to copy a table in a database into another database on another connection in VB.NET, using OleDb. If they were on the same connection I would just use SELECT INTO, but they are not. I have two different OleDbConnection and cannot see an easy way to do this.
Right now I am attempting to just copy the table into a DataTable using an OleDbDataAdapter, and then loop through the DataTable and insert every record into the target database one at a time. This obviously takes a ton of time for the large DB I could potentially be dealing with, and I have to deal with escaping strings, null values, etc.
Is there an easier way to do this?
Thanks,
Logan
edit - just to make this more clear: I have two OleDbConnection objects, one is linked directly to a local .mdb file on my computer (JET). The other is linked to a database on our servers (SQLOLEDB). I am wanting to do this:
"SELECT * FROM fromDB INTO toDB"
But I can't because fromDB and toDB are on different connections, and the OleDbCommand object is only attached to one. The only way I can see how to do this is to connect to fromDB, copy it into a DataTable, connect to toDB, and copy all of the data in the DataTable row by row into toDB. I was wondering if there is an easier way to do this.
If you are constrained to this architecture, one idea is to write a stored procedure on the server that accepts a large chunk of row data in one call. It could then write out the row data to a file for a future bulk insert, or it could attempt to insert the rows directly.
This also has the benefit of speeding things up over high latency connections to the server.
Also, if you use parameterized statements, you can avoid having to escape strings etc.
If you are just copying from one to the other, why don't you do it in SQL?
You can create a Synonym within one database pointing at a table, view or stored proc on another database (on another server). You can then insert into this synonym just like you could into a table in the same db.
http://www.developer.com/db/article.php/3613301/Using-Synonyms-in-SQL-Server-2005.htm