How to programmatically start SQL Server 2005 merge replication

We currently have merge replication set up to merge certain tables between two databases. I need to programmatically start one of the publications to make sure data has been synchronized prior to starting a certain job. SQL Server Books Online has not been too helpful.
So far, the only thing I have come up with is to use sp_start_job to start the merge replication SQL Agent job. Is it OK to do this?
Are there any other ways to programmatically start synchronizing a publication?

We ended up using sp_start_job with the name of the merge agent job for the publication. The only downside we found is that the job name is generated dynamically when the publication is created, so if the publication is dropped and recreated, the name will change. Other than that, sp_start_job has worked beautifully.
A couple of other things:
The SQL login was added to SQLAgentOperatorRole in msdb to allow the call to sp_start_job
sp_help_job was used to check whether the job had completed successfully before proceeding
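For reference, here is a minimal T-SQL sketch of that pattern. The job name below is a made-up placeholder (the real merge agent job name is generated when the publication is created), and it assumes the merge agent job runs on demand rather than continuously:

DECLARE @job_name sysname;
SET @job_name = N'PUBLISHER-PublicationDb-MergePublication-SUBSCRIBER-1';   -- placeholder generated name

EXEC msdb.dbo.sp_start_job @job_name = @job_name;

-- sp_start_job returns immediately, so give the merge agent time to run
-- (a real check would poll in a loop rather than wait a fixed 30 seconds).
WAITFOR DELAY '00:00:30';

-- Outcome of the most recent run: 1 = succeeded, 0 = failed.
SELECT TOP (1) h.run_status
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = @job_name
  AND h.step_id = 0            -- job outcome row
ORDER BY h.instance_id DESC;

-- Alternatively, inspect last_run_outcome in the output of:
-- EXEC msdb.dbo.sp_help_job @job_name = @job_name;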

Related

How to regularly update or create a SQL Server table?

I need to collect data from a SQL Server table, format it, and then put it into a different table.
I have access to SQL Server but cannot set up triggers or scheduled jobs.
I can create tables, stored procedures, views and functions.
What can I setup that will automatically collect the data and insert it into a SQL Server table for me?
I would probably create a stored procedure to do this task.
In the stored procedure you can create a CTE or use temp tables (depending on the task) and do all the data manipulation you require. Once done, you can use a SELECT INTO statement (or INSERT INTO ... SELECT if the destination table already exists, since SELECT INTO creates a new table) to move the data from the temp table into the SQL Server table you need.
https://www.w3schools.com/sql/sql_select_into.asp
You can then schedule this stored procedure to run at whatever time suits you.
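As a rough sketch of that approach (the procedure, table, and column names below are invented for illustration, not taken from the question):

CREATE PROCEDURE dbo.usp_CollectAndFormatData    -- hypothetical name
AS
BEGIN
    SET NOCOUNT ON;

    -- Stage and format the source rows in a temp table.
    SELECT s.Id,
           UPPER(s.CustomerName) AS CustomerName,
           CAST(s.Amount AS decimal(10, 2)) AS Amount
    INTO #Staged
    FROM dbo.SourceTable AS s;

    -- Load the existing destination table.
    INSERT INTO dbo.DestinationTable (Id, CustomerName, Amount)
    SELECT Id, CustomerName, Amount
    FROM #Staged;
END;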
A database is just a storage container. It doesn't "do" things automatically all by itself. Even if you did have the access to create triggers, something would have to happen to the table to cause the trigger to fire, typically a CRUD operation on the parent table. And something external needs to happen to initiate that CRUD operation.
When you start talking about automating a process, you're talking about the function of a scheduler program. SQL Server has one built in, the SQL Agent, and depending on your needs you may find that it's appropriate to enlist help from whoever in your organization does have access to it. I've worked in a couple of organizations, though, that only used the SQL Agent to schedule maintenance jobs, while data manipulation jobs were scheduled through an outside resource. The most common one I've run across is Control-M, but there are other players in that market. I even ran across one homemade scheduler protocol that was just built in C#.NET that worked great.
Based on the limitations you lay out in your question, and the comments you've made in response to others, it sounds to me like you need to socialize your challenge within your organization to find out what their routine mechanism is for setting up data transfers. It's unlikely that this is the first time it's come up, unless the company was founded in the last week or two. It will probably require that you set up your code, probably a stored procedure or maybe an SSIS package, and then work with someone else, perhaps a DBA or a Site Operations team or some such, to get that process automated to fire when you need it to, whether through an Agent job or maybe a file listener.
Well, you have two major options: a stored procedure (SP) or an SSIS package.
Both of them can be scheduled to run at a given time with a simple Job from the SQL Server Agent. Keep in mind that if you are doing this on a separate server you might need to add the source server as a Linked Server so you can access it from the script.
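A minimal linked-server sketch, using placeholder server, database, and table names (security mapping may also be needed via sp_addlinkedsrvlogin):

-- Register the source server as a linked server (name is a placeholder).
EXEC master.dbo.sp_addlinkedserver
     @server = N'SOURCESERVER',
     @srvproduct = N'SQL Server';

-- Then pull rows across with a four-part name from the destination server.
INSERT INTO dbo.DestinationTable (Id, Name)
SELECT Id, Name
FROM SOURCESERVER.SourceDb.dbo.SourceTable;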
I've done this approach in the past and it has worked great. Note, for security reasons, I am not able to access the remote server's task scheduler, so I go through the SQL Server Agent:
Run a SQL Server Agent job on a schedule of your choice
Use the SQL Server Agent to call an SSIS Package
The SSIS Package then calls an executable which can pull the data you want from your original table, evaluate it, and then insert a formatted version of it, one record at a time. Alternatively, you can simply create a C# script within the SSIS package via a Script Task.
I hope this helps. Please let me know if you need more details.

List All Dropped Databases in SQL Server

I need to confirm whether a particular table ever existed on our SQL Server. Is there an existing script or method one can use to list all dropped databases in a SQL Server instance?
There is a built-in report you can access through SSMS called Schema Changes History. You can run this to find what you're looking for.
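The Schema Changes History report is built on the default trace, so you can also query the trace directly. A sketch, assuming the default trace is still enabled (it only keeps a limited rolling window, so very old drops may already have aged out):

DECLARE @tracefile nvarchar(260);
SELECT @tracefile = path FROM sys.traces WHERE is_default = 1;

SELECT t.StartTime, t.DatabaseName, t.ObjectName, t.LoginName, t.EventClass
FROM sys.fn_trace_gettable(@tracefile, DEFAULT) AS t
WHERE t.EventClass IN (46, 47)     -- 46 = Object:Created, 47 = Object:Deleted
ORDER BY t.StartTime DESC;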

SQL Replication not working correctly

I have transactional SQL Server replication set up. Currently, everything I have reviewed seems to show that replication is working: I have checked Replication Monitor and the synchronization state, and both show that transactions are being pushed across correctly.
The problem I am having is that some of the replicated tables aren't coming across at all while others are. Is there a better way to look deeper into replication to find out why things aren't coming across?
Are you sure the tables are selected as articles in the publication properties? If so, you may want to try validating the subscriptions. As a last and most manual resort you can use tablediff.exe to compare two tables. The EXE is located on your SQL Server machine and needs to be run from the command line. Here's the syntax that worked for me; replace the all-caps items with names from your environment.
tablediff -sourceserver INSTANCE1 -sourcedatabase DATABASE1 -sourcetable TABLE1 -destinationserver INSTANCE2 -destinationdatabase DATABASE2 -destinationtable TABLE2
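For a transactional publication you can also kick off validation in T-SQL at the publisher; a hedged sketch with a placeholder publication name (the results show up in the distribution agent history and Replication Monitor):

-- Run at the publisher, in the publication database.
EXEC sp_publication_validation
     @publication   = N'MyPublication',   -- placeholder
     @rowcount_only = 2,                  -- 2 = row counts and binary checksums, 1 = row counts only
     @full_or_fast  = 2;                  -- 2 = fast count, falling back to a full count when needed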

SQL: Automatically copy records from one database to another database

I am trying to find an ideal way to automatically copy new records from one database to another (the databases have different structures). I achieved this by writing VBScript scripts which copy the data from one database to the other, triggered from another application that passes arguments to the script. But I ran into issues when there were more than 100 triggers, i.e. 100 wscript processes trying to access the database at once, and they couldn't complete the task.
I want to find a simpler solution inside SQL Server. I have read about triggers, stored procedures run from SQL Server Agent, replication, etc. The requirement is that I have to copy records to another database either periodically or whenever a new record is inserted.
Which method will suit me the best?
You can use CDC (Change Data Capture) to do this. Create an SSIS package using CDC and run that package periodically through a SQL Server Agent job. CDC will record all the changes to that table, and the package will apply those changes to the destination table when it runs. Please follow the link below.
http://sqlmag.com/sql-server-integration-services/combining-cdc-and-ssis-incremental-data-loads
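For reference, enabling CDC on the source side looks roughly like this (database and table names are placeholders; CDC requires SQL Server 2008 or later and is only available in certain editions):

USE SourceDb;
GO
-- Enable CDC at the database level.
EXEC sys.sp_cdc_enable_db;
GO
-- Track changes on the (placeholder) source table.
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'SourceTable',
     @role_name     = NULL;   -- NULL = no gating role required to read the change data
GO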
The word 'periodically' in your question suggests that you should go for jobs. You can schedule jobs in SQL Server using SQL Server Agent and assign them a schedule; the job will then run your script at the assigned frequency.
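Creating such a job in T-SQL looks roughly like this; the job, procedure, and database names are placeholders:

EXEC msdb.dbo.sp_add_job @job_name = N'CopyNewRecords';

EXEC msdb.dbo.sp_add_jobstep
     @job_name      = N'CopyNewRecords',
     @step_name     = N'Run copy procedure',
     @subsystem     = N'TSQL',
     @database_name = N'SourceDb',
     @command       = N'EXEC dbo.usp_CopyNewRecords;';

EXEC msdb.dbo.sp_add_jobschedule
     @job_name             = N'CopyNewRecords',
     @name                 = N'Every 15 minutes',
     @freq_type            = 4,    -- daily
     @freq_interval        = 1,    -- every day
     @freq_subday_type     = 4,    -- units of minutes
     @freq_subday_interval = 15;

EXEC msdb.dbo.sp_add_jobserver
     @job_name    = N'CopyNewRecords',
     @server_name = N'(LOCAL)';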
PrabirS: Change Data Capture
This is a good option, because it uses the transaction log to achieve something similar to the Command Query Responsibility Segregation (CQRS) pattern.
Alok Gupta: A SQL Job that runs in the SQL Agent
This too is a good option, provided you have something like a modified-date column so you can filter for the altered data. You can create a stored procedure and let it run regularly via the SQL Agent.
A third option could be triggers (the change will happen in the same transaction).
This option is useful for auditing and logging, but you should definitely avoid writing business logic in triggers, as triggers are more or less hidden and fire without being called directly (similar to CDC, actually). I actually created a trigger about half a year ago that captured the data and inserted it somewhere else in XML format, because the columns in the original table could change over time (multiple projects using the same database(s)).
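A minimal sketch of that trigger option, with placeholder database, table, and column names:

CREATE TRIGGER dbo.trg_SourceTable_CopyToOtherDb
ON dbo.SourceTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Runs in the same transaction as the original insert.
    INSERT INTO OtherDb.dbo.DestinationTable (Id, Name, CopiedAt)
    SELECT i.Id, i.Name, GETDATE()
    FROM inserted AS i;
END;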
-Edit-
By the way, your question more or less suggests the lack of a clear design pattern, and that the technique used is not the main problem. You could try reading up on how an ETL layer is built, or try to implement a 'separation of concerns'. Note: it is hard to tell whether this is the case, but given how you formulated your question, an unclear design is something that pops up in my mind as a possible problem.

SQL Server Replication - New stored procedures

Is there any way that when setting up replication between 2 servers instead of specifying each stored procedure as an article to be replicated, you could actually just replicate ALL stored procedures/user functions?
For example, I run replication to a secondary server just so I have an active backup server. As we develop the application and add new stored procedures to the primary server, these are not replicated unless we add the new articles to the publication, which obviously requires additional work.
Does anybody know of a shortcut?
There is no shortcut - you have to add the new article to replication explicitly. If you are just using this for backup, have you considered log shipping or database mirroring? That may be more "automatic" for you in that you'll get all tables, users, procs, functions, etc. and you don't have to remember to add a table or proc to the replication solution.
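For completeness, adding a new procedure as an article by hand looks roughly like this for a transactional publication (publication and procedure names are placeholders; a snapshot for the new article still has to be generated and delivered):

-- Run at the publisher, in the publication database.
EXEC sp_addarticle
     @publication   = N'MyPublication',
     @article       = N'usp_MyNewProc',
     @source_owner  = N'dbo',
     @source_object = N'usp_MyNewProc',
     @type          = N'proc schema only';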