SQL Replication

I've got two database servers. One is SQL 2000, the other 2008 R2.
The application I'm writing sits mainly on the 2008 server (and that's where all the writing takes place)... however it does do quite a bit of looking up on the SQL 2000 instance.
Queries that join across the linked servers are just too slow.
The SQL 2000 DB will actually be migrated in a couple of months... but for now I need to find a better option.
Anyone got any suggestions as to the most efficient way to replicate the tables I need?
I was thinking along the lines of the following options:
(a) backup and restore each night
(b) full replication from 2000 to 2008... but will I be able to use the database that is being replicated to?
(c) a sql job which refreshes the tables I need every nn hours
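For what it's worth, option (c) can be as simple as a stored procedure that a SQL Agent job runs every few hours. A minimal sketch, assuming a linked server to the 2000 box; the names (LNK2000, dbo.Customers, and the staging table) are placeholders:

```sql
-- Sketch of option (c): refresh a lookup table from the linked 2000 server.
-- Schedule this proc from a SQL Agent job every nn hours.
CREATE PROCEDURE dbo.RefreshLookupTables
AS
BEGIN
    SET NOCOUNT ON;

    -- Load into a staging table first so readers never see a half-empty table
    TRUNCATE TABLE dbo.Customers_Staging;

    INSERT INTO dbo.Customers_Staging (CustomerID, Name, Region)
    SELECT CustomerID, Name, Region
    FROM OPENQUERY(LNK2000,
        'SELECT CustomerID, Name, Region FROM dbo.Customers');

    -- Swap the fresh data in atomically
    BEGIN TRANSACTION;
        TRUNCATE TABLE dbo.Customers;
        INSERT INTO dbo.Customers
        SELECT CustomerID, Name, Region FROM dbo.Customers_Staging;
    COMMIT;
END;
```

The staging-then-swap step keeps the lookup table queryable while the (slow) linked-server pull runs.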
I'm guessing plenty have come across this issue so a little pointer would be greatly appreciated.
Thanks in advance,
Jim

I'd take a look at setting up transactional replication. I've used that technique in the past for exactly the scenario you're describing.
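In rough outline, the publisher side is configured with the replication system procedures on the 2000 instance. This is only a sketch, not a complete setup (it assumes a Distributor is already configured, and LegacyDB, Customers, SQL2008SRV, and AppDB are placeholder names):

```sql
-- Run on the SQL 2000 publisher, after the Distributor is configured.
EXEC sp_replicationdboption
    @dbname  = N'LegacyDB',
    @optname = N'publish',
    @value   = N'true';

-- Create a transactional publication
EXEC sp_addpublication
    @publication = N'LookupTables',
    @repl_freq   = N'continuous',
    @status      = N'active';

-- Add each table you need on the 2008 side as an article
EXEC sp_addarticle
    @publication   = N'LookupTables',
    @article       = N'Customers',
    @source_object = N'Customers';

-- Push subscription to the 2008 R2 server
EXEC sp_addsubscription
    @publication    = N'LookupTables',
    @subscriber     = N'SQL2008SRV',
    @destination_db = N'AppDB';
```

To your question (b): the subscriber database stays online and fully readable while replication is running, so your application can query the replicated tables freely.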

Related

Performance hit on DB2 transactional database after linking to SQL Server 2005

We have an AS400 mainframe running our DB2 transactional database. We also have a SQL Server setup that gets loaded nightly with data from the AS400. The SQL Server setup is for reporting.
I can link the two database servers, BUT, there's concern about how big a performance hit DB2 might suffer from queries coming from SQL Server.
Basically, the fear is that if we start hitting DB2 with queries from SQL Server we'll bog down the transactional system and screw up orders and shipping.
Thanks in advance for any knowledge that can be shared.
Anyone who has a pat answer for a performance question is wrong :-) The appropriate answer is always 'it depends.' Performance tuning is best done via measure, change one variable, repeat.
DB2 for i shouldn't even notice if someone executes a 1,000 row SELECT statement. Take Benny's suggestion and run one while the IBM i side watches. If they want a hint, use WRKACTJOB and sort on the Int column, which represents the interactive response time. I'd guess that the query will be complete before they have time to notice it was active.
If that seems unacceptable to the management, then perhaps offer to test it before or after hours, where it can't possibly impact interactive performance.
As an aside, the RPG guys can create Excel spreadsheets on the fly too. Scott Klement published some RPG wrappers over the Java POI/HSSF classes. Also, Giovanni Perrotti at Easy400.net has some examples of providing an Excel spreadsheet from a web page.
I'd mostly agree with Buck, a 1000 row result set is no big deal...
Unless of course the system is looking through billions of rows across hundreds of tables to get the 1000 rows you are interested in.
Assuming a useful index exists, 1000 rows shouldn't be a big deal. If you have IBM i Access for Windows installed, there's a component of System i Navigator called "Run SQL Scripts" that includes "Visual Explain", which shows a visual representation of the query execution plan. With that you can verify that an index is being used.
One key thing: make sure the work is being done on the i. When using a standard linked table, MS SQL Server will attempt to pull back all the rows and then apply its own WHERE clause:
select * from MYLINK.MYIBMI.MYLIB.MYTABLE where MYKEYFLD = '00335';
Whereas this format sends the statement to the remote server for processing and just gets back the results:
select * from openquery(MYLINK, 'select * from mylib.mytable where MYKEYFLD = ''00335''');
Alternately, you could ask the i guys to build you a stored procedure that you can call to get back the results you are looking for. Personally, that's my preferred method.
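If the linked server's provider supports RPC, that remote procedure can be invoked with EXEC ... AT, which, like OPENQUERY, runs everything on the remote side. A hedged sketch; MYLIB.GET_ORDERS is a hypothetical procedure the i team would have to provide:

```sql
-- Call a (hypothetical) stored procedure on the IBM i through the
-- linked server; the ? is the parameter placeholder for EXEC ... AT.
-- Requires RPC OUT to be enabled on the linked server definition.
EXEC ('CALL MYLIB.GET_ORDERS(?)', '00335') AT MYLINK;
```

All the filtering and index usage then happens on the i, and only the result set crosses the wire.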
Charles

Creating Baseline repository for DMV Data

I have started to use DMVs to identify the performance issues we have been encountering in our SQL Server.
Has anyone built a repository to capture this DMV data which can be then baselined and compared against the current data moving forward?
Regards
What version of SQL Server? If 2008, you can take a look at the Management Data Warehouse; otherwise have a look at the DMVStats project.
Take a look at Creating a baseline for SQL Server to get some ideas
You can't just grab someone else's baseline... your server is most likely completely different.
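A repository like this doesn't have to be elaborate. A minimal sketch, using sys.dm_os_wait_stats as the example DMV (the table name and schema here are made up; you'd add one capture table per DMV you care about):

```sql
-- Baseline repository sketch: snapshot a DMV on a schedule, diff later.
CREATE TABLE dbo.WaitStatsBaseline (
    capture_time        DATETIME     NOT NULL DEFAULT GETDATE(),
    wait_type           NVARCHAR(60) NOT NULL,
    waiting_tasks_count BIGINT       NOT NULL,
    wait_time_ms        BIGINT       NOT NULL,
    signal_wait_time_ms BIGINT       NOT NULL
);

-- Run this from an Agent job to capture a snapshot
INSERT INTO dbo.WaitStatsBaseline
    (wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms)
SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM sys.dm_os_wait_stats;

-- Later: compare current counters against the most recent baseline
SELECT w.wait_type,
       w.wait_time_ms - b.wait_time_ms AS wait_ms_delta
FROM sys.dm_os_wait_stats AS w
JOIN dbo.WaitStatsBaseline AS b ON b.wait_type = w.wait_type
WHERE b.capture_time = (SELECT MAX(capture_time) FROM dbo.WaitStatsBaseline)
ORDER BY wait_ms_delta DESC;
```

Bear in mind that cumulative DMVs like this one reset on instance restart, so the deltas are only meaningful between snapshots taken within the same uptime window.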

SQL Server native client 10.0 versus OleDb differences?

We have an SQL Server 2008 R1 Environment.
We have some quite complex, poorly performing queries across linked servers.
First question:
Are there any differences (particularly in performance) between SQL Native Client 10.0 and the generic OLE DB provider listed for linked servers? My understanding was that the Native Client just packages up OLE DB with some other items, but I would appreciate some guidance.
Additionally does anyone know where to find some good white papers on optimising Linked server queries?
many thanks
D
If you have poorly performing queries across linked servers, you might want to consider using OPENQUERY or one of its variations, or possibly putting the logic in a stored procedure on the remote server. This is much easier if the data doesn't have to co-mingle with local data. Sometimes it's faster to send the remote server a small dataset to work with or filter on.
E.g., in the past I've called a stored procedure on the remote server with a list of account IDs; this is much easier now with XML datatypes.
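A hedged sketch of that pattern; the linked server, database, and procedure names are all hypothetical. The ID list travels as an NVARCHAR(MAX) string rather than a native xml parameter, since the xml type has historically been awkward across linked server calls, and the remote procedure can CAST it back and shred it with nodes()/value():

```sql
-- Package a small set of account IDs as XML text and let the remote
-- server do the filtering.
DECLARE @ids NVARCHAR(MAX) =
    N'<accounts><id>1001</id><id>1002</id><id>1007</id></accounts>';

-- Four-part name call to a (hypothetical) remote stored procedure.
EXEC RemoteServer.SalesDB.dbo.GetAccountDetails @accountIdXml = @ids;
```

Only the small ID list goes over the wire in one direction and the filtered rows come back in the other, instead of joining large tables across the link.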
This difference is explained here... SQL Server linked server performance

How can I replicate a table from SQL 2000 to SQL 2008?

I have a table on a SQL Server 2000 database, which I want copied verbatim to a 2008 server.
I tried to do it manually with INSERT/UPDATE triggers, but this technique runs in a distributed transaction, which does not work because I would apparently have to enable MSDTC and open up the firewall; I don't want to do any of that.
Are there any other ways to replicate that table (in real-time) to the 2008 server?
My first question would be do you really need it to be replicated?
If you want to reference it real-time in Sql 2008, just do a linked server with a Synonym. This will make the table act like it is part of your schema, with minor limitations.
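A minimal sketch of that, assuming a linked server to the 2000 box already exists (LNK2000, LegacyDB, and the table names are placeholders):

```sql
-- Make a remote table look local via a synonym over the linked server.
CREATE SYNONYM dbo.Customers
FOR LNK2000.LegacyDB.dbo.Customers;

-- Queries now read as if the table were part of the local schema:
SELECT CustomerID, Name
FROM dbo.Customers
WHERE Region = 'EMEA';
```

Note this only changes the naming, not the performance: every query still goes across the linked server, so the jojoin-performance caveats elsewhere in this thread still apply.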
Have you tried to create a scheduled DTS/SSIS package to do the work?
Is it copying or replication? Copying implies a one-time action while replication implies an open-ended period of time during which these tables will be synchronized.

MS Access 2007 and SQL Server 2000

I've recently been upgraded to Office 2007. I have several Access databases (that I've kept in the Access 2000 format for several reasons) that are linked to SQL Server 2000 databases. I have dozens of queries in these databases that I use often. I create new queries daily, sorting, summarizing and generally analyzing the data.
Since the upgrade, some queries take an extremely long time to complete (minutes rather than seconds), and one new one I've tried doesn't complete at all; I have to End Task on Access. It's a rather simple query: it joins 3 tables and sorts on one of the fields. I do this ALL THE TIME, and now it appears I can't.
I've searched for discussions of similar problems, but haven't seen specific recommendations.
Any ideas?
I would suggest deleting all your ODBC linked tables and recreating them from scratch as a starting point.
If your queries do not need to make any changes to the data, you may find converting them to SQL pass-through queries will speed them up considerably. Note that these queries are not parsed by the Jet DB engine but are sent directly to the server, bypassing any linked tables.
You will have to use MS SQL syntax and lose the QBE grid, though, and the result will be read-only.
If you need to update data, then maybe stored procedures are the way to go.
When I converted to SQL Server backend, I used SQL Server Migration Assistant. I recommend it highly. It's very good at what it does.
Having said that, I assume you're using linked tables in your front end. I convert slow-running queries by copying the SQL from Access and pasting it into a new query window in SQL Server Management Studio. Then, working through the syntax changes one at a time, I convert the query to T-SQL and save it as a view with the same name as the query in Access.
I have a little routine that then renames the Access query with a "Local_" prefix and creates a linked table entry pointing at the view on SQL Server. You'll find that a query that used to run for minutes will run for seconds this way. You can, of course, do this manually.
SQL Server Migration Assistant, by the way, will convert many queries (it doesn't try to convert action queries, only select queries...) with little or no intervention.
I would use Access Data Projects with SQL Server 2000. It works great when your SQL backend is that old.