How to regularly update or create a SQL Server table?

I need to collect data from a SQL Server table, format it, and then put it into a different table.
I have access to SQL Server but cannot set up triggers or scheduled jobs.
I can create tables, stored procedures, views, and functions.
What can I set up that will automatically collect the data and insert it into a SQL Server table for me?

I would probably create a stored procedure to do this task.
In the stored procedure you can create a CTE or use temp tables (depending on the task) and do all the data manipulation you require. Once done, use an INSERT INTO ... SELECT statement to move the data from the temp table into the existing SQL Server table you need (SELECT INTO works only when the target table does not exist yet, since it creates the table as part of the statement).
https://www.w3schools.com/sql/sql_select_into.asp
You can then schedule this stored procedure to run at a time of your choosing.
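For illustration, here is a minimal sketch of what that procedure could look like, using made-up object names (dbo.SourceData, dbo.FormattedData):

    CREATE PROCEDURE dbo.usp_LoadFormattedData
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Stage and format the source rows in a temp table
        SELECT
            Id,
            UPPER(LTRIM(RTRIM(SomeColumn))) AS FormattedColumn  -- example formatting
        INTO #Staged
        FROM dbo.SourceData;

        -- Move the formatted rows into the existing target table
        INSERT INTO dbo.FormattedData (Id, FormattedColumn)
        SELECT Id, FormattedColumn
        FROM #Staged;
    END;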

A database is just a storage container. It doesn't "do" things automatically all by itself. Even if you did have the access to create triggers, something would have to happen to the table to cause the trigger to fire, typically a CRUD operation on the parent table. And something external needs to happen to initiate that CRUD operation.
When you start talking about automating a process, you're talking about the function of a scheduler program. SQL Server has one built in, the SQL Agent, and depending on your needs you may find that it's appropriate to enlist help from whoever in your organization does have access to it. I've worked in a couple of organizations, though, that only used the SQL Agent to schedule maintenance jobs, while data manipulation jobs were scheduled through an outside resource. The most common one I've run across is Control-M, but there are other players in that market. I even ran across one homemade scheduler protocol that was just built in C#.NET that worked great.
Based on the limitations you lay out in your question, and the comments you've made in response to others, it sounds to me like you need to socialize your challenge within your organization to find out what their routine mechanism is for setting up data transfers. It's unlikely that this is the first time it's come up, unless the company was founded in the last week or two. It will probably require that you set up your code, probably a stored procedure or maybe an SSIS package, and then work with someone else, perhaps a DBA or a Site Operations team, to get that process automated to fire when you need it to, whether through an Agent job or maybe a file listener.

Well, you have two major options: a stored procedure (SP) or an SSIS package.
Both of them can be scheduled to run at a given time with a simple job from the SQL Server Agent. Keep in mind that if you are doing this on a separate server, you might need to add the source server as a linked server so you can access it from the script.
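As a rough sketch, registering and querying a linked server looks something like this (server and object names are placeholders):

    -- Register the source server as a linked server (name is a placeholder)
    EXEC sp_addlinkedserver
        @server = N'SourceServer',
        @srvproduct = N'SQL Server';

    -- Then query it with four-part naming
    SELECT *
    FROM [SourceServer].[SourceDb].[dbo].[SourceTable];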

I've done this approach in the past and it has worked great. Note, for security reasons, I am not able to access the remote server's task scheduler, so I go through the SQL Server Agent:
Run a SQL Server Agent job on a schedule of your choice
Use the SQL Server Agent job to call an SSIS package (a T-SQL sketch of these two steps follows the list)
The SSIS Package then calls an executable which can pull the data you want from your original table, evaluate it, and then insert a formatted version of it, one record at a time. Alternatively, you can simply create a C# script within the SSIS package via a Script Task.
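To make the first two steps concrete, here is a rough T-SQL sketch of creating such a job; the job name, schedule, and package path are all placeholders, and the same thing can be done through the SSMS UI:

    USE msdb;
    GO
    -- Create the job (name is a placeholder)
    EXEC dbo.sp_add_job @job_name = N'LoadFormattedData';

    -- Step that runs the SSIS package (path is a placeholder)
    EXEC dbo.sp_add_jobstep
        @job_name  = N'LoadFormattedData',
        @step_name = N'Run package',
        @subsystem = N'SSIS',
        @command   = N'/FILE "C:\Packages\LoadFormattedData.dtsx"';

    -- Run it daily at 02:00
    EXEC dbo.sp_add_schedule
        @schedule_name     = N'Daily 2am',
        @freq_type         = 4,   -- daily
        @freq_interval     = 1,   -- every day
        @active_start_time = 020000;

    EXEC dbo.sp_attach_schedule
        @job_name      = N'LoadFormattedData',
        @schedule_name = N'Daily 2am';

    -- Register the job on the local server
    EXEC dbo.sp_add_jobserver @job_name = N'LoadFormattedData';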
I hope this helps. Please let me know if you need more details.

Related

SQL - Continuous Integration (Data)

This is a general question and probably there are some solutions already. Most of the things I have found are related to database development, deployment, etc.
I am looking for a process that runs daily and performs some checks against some tables of a database. The data in these tables is loaded by a lot of users, and the idea is that, by defining some rules, the process will detect "wrong" values loaded by the users.
I know this is a very open question, but do you know if this possible with some tools: Jenkins, DBGhost, etc...?
Thank you,
Kat
You have many options. Here's one train of thought.
Create a table called data_audit with fields like so (a possible definition is sketched after the list):
audit_datetime
table
field
wrong_value
rule_violated
issue_description
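In SQL Server, for example, that table might be declared like this (the column types are assumptions, and table/field are suffixed with _name to avoid the reserved word TABLE):

    CREATE TABLE dbo.data_audit (
        audit_datetime    DATETIME2      NOT NULL DEFAULT SYSDATETIME(),
        table_name        SYSNAME        NOT NULL,  -- table holding the bad value
        field_name        SYSNAME        NOT NULL,  -- column holding the bad value
        wrong_value       NVARCHAR(4000) NULL,
        rule_violated     NVARCHAR(200)  NOT NULL,
        issue_description NVARCHAR(1000) NULL
    );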
Create stored procedures/functions that can detect wrong values and store them in this audit table.
Depending on your database, you can run the stored procedure on a schedule. For example, if you have SQL Server, you can run the job using SQL Server Agent. Once the job is finished, you can run another job that gets COUNT(*) from the audit table for today's date. If the count is higher than zero, use the Database Mail feature to email the relevant people to take action.
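That follow-up step could be as simple as this sketch, assuming Database Mail is already configured (the profile name and recipients are placeholders):

    -- Count today's rule violations and alert if there are any
    DECLARE @violations INT =
        (SELECT COUNT(*)
         FROM dbo.data_audit
         WHERE CAST(audit_datetime AS DATE) = CAST(GETDATE() AS DATE));

    IF @violations > 0
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = N'AlertsProfile',     -- placeholder profile
            @recipients   = N'team@example.com',  -- placeholder recipients
            @subject      = N'Data audit: rule violations found today',
            @body         = N'Check dbo.data_audit for details.';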
If you have a database like MySQL or PostgreSQL, write a short program in a language of your choice (PHP/Python/.NET/whatever) to execute the stored procedure, then do the COUNT(*) and email if the count is higher than zero. You can run this program using cron on Linux or Linux-like systems, or Task Scheduler on Windows.
You could also use a tool like Jenkins to schedule such activity, but Task Scheduler/cron are built into your operating system and are easy to use, so an additional installation like Jenkins is not necessary. If you already have Jenkins installed, you can certainly piggyback on it.

SQL: Automatically copy records from one database to another database

I am trying to find an ideal way to automatically copy new records from one database to another. The databases have different structures! I achieved it by writing VBS scripts that copy the data from one to the other, triggered from another application which passes arguments to the script. But I ran into issues at points where there were more than 100 triggers, i.e. 100 wscript processes trying to access the database, and they couldn't complete the task.
I want to find a simpler solution inside SQL Server. I have read about setting up triggers, stored procedures run from SQL Agent, replication, etc. The requirement is that I have to copy records to the other database periodically, or whenever a new record is created.
Which method will suit me the best?
You can use CDC (Change Data Capture) to do this. Create an SSIS package using CDC and run that package periodically through a SQL Server Agent job. CDC will store all the changes to that table and apply them to the destination table when you run the package. Please follow the link below.
http://sqlmag.com/sql-server-integration-services/combining-cdc-and-ssis-incremental-data-loads
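For reference, enabling CDC is roughly this (schema and table names are placeholders; the database-level step needs sysadmin rights, and SQL Server Agent must be running for the capture job):

    -- Enable CDC at the database level
    EXEC sys.sp_cdc_enable_db;

    -- Track changes on the source table (names are placeholders)
    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'SourceTable',
        @role_name     = NULL;  -- NULL = no gating role for reading changes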
The word "periodically" in your question suggests that you should go for jobs. You can schedule jobs in SQL Server using SQL Server Agent and assign them a frequency. The job will run your script at the assigned frequency.
PrabirS: Change Data Capture
This is a good option, because it uses the transaction log to create something similar to Command Query Responsibility Segregation (CQRS).
Alok Gupta: A SQL Job that runs in the SQL Agent
This too is a good option, given that you have something like a modified-date column so you can filter the altered data. You can create a stored procedure and let it run regularly from the SQL Agent.
A third option could be triggers (the change will happen in the same transaction).
This option is useful for auditing and logging, but you should definitely avoid writing business logic in triggers, as triggers are more or less hidden and fire without being called directly (similar to CDC, actually). I actually created a trigger about half a year ago that captured the data and inserted it somewhere else in XML format, because the columns in the original table could change over time (multiple projects using the same database(s)).
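As an illustration of that XML approach, such a trigger might look like this sketch (all object names are hypothetical):

    CREATE TRIGGER dbo.trg_SourceTable_CaptureInsert
    ON dbo.SourceTable          -- hypothetical source table
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Serialize the new rows as XML so later column changes
        -- don't break the audit table
        DECLARE @rows XML =
            (SELECT * FROM inserted FOR XML PATH('row'), ROOT('rows'), TYPE);

        INSERT INTO dbo.SourceTableAudit (captured_at, row_data)
        VALUES (SYSDATETIME(), @rows);
    END;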
-Edit-
By the way, your question more or less suggests the lack of a clear design pattern, and that the technique used is not the main problem. You could try to read up on how an ETL layer is built, or try to implement a "separation of concerns". Note: it is hard to tell whether this is the case, but given how you formulated your question, an unclear design is something that pops up in my mind as a possible problem.

I would like to set up a notification trigger when new data is added to a remote server. What is the best approach?

I'm a little lost and need some guidance on how to approach this feature I'd like to add.
Many operations I use require retrieving data from a remote server. My goal is to be able to receive an email notification if new data has been added to the remote server.
I thought about creating a stored procedure that uses OPENQUERY to compare the remote data to a local table, with a conditional statement that sends out an email if there are differences, and then scheduling a job to execute this stored procedure frequently. But this does not feel elegant at all...
If I understood your question correctly, it all depends on the permissions.
If I were the owner of the system
Find out which job is adding data to the system, and modify that process (ETL/SQL job, etc.) to send you an email. (Best way.)
If you have create permissions on the remote system
Create an AFTER INSERT trigger on the table in question that sends the notification; there are plenty of worked examples of such triggers online. (2nd best way.)
If you just have permissions to create a linked server
Keep whatever you wrote, or bring the data over from the remote server (just the primary keys from the table) and keep checking for new primary keys with a job that copies the keys to a local table and compares.
How to choose between these two depends on the size of the data; the second method (the one in point 3) will work even without a linked server, e.g. via OPENQUERY as you suggested.
But you will have to run it again and again; I can't think of any other way. Set up a SQL job/ETL process to do this for you.
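A rough sketch of that polling job over a linked server, with all object names as placeholders:

    -- Copy any primary keys we haven't seen yet, then alert if new ones arrived
    INSERT INTO dbo.SeenKeys (Id)
    SELECT r.Id
    FROM [RemoteServer].[RemoteDb].[dbo].[SourceTable] AS r
    WHERE NOT EXISTS (SELECT 1 FROM dbo.SeenKeys AS s WHERE s.Id = r.Id);

    IF @@ROWCOUNT > 0
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = N'AlertsProfile',   -- placeholder Database Mail profile
            @recipients   = N'me@example.com',  -- placeholder
            @subject      = N'New rows detected on the remote server';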

Auditing execution of stored procedures in Sql Server

My boss and I have been trying to work out an auditing plan for our stored procedures. Currently there are two external applications taking information from our database through stored procedures, and we're interested in auditing when they're being executed and what values are passed as parameters. So far what I've done is simply create a table for the stored procedures one of the apps is using; since they use the same input parameters, it has one column per parameter. Obviously this isn't the best choice, but we wanted quick info to see whether they were running batch processes and when they were running them. I've tried SQL Server Audit, but it doesn't catch the parameters unless you're executing the SP in a query.
SQL Server Profiler will do this for you; it's included for free. Set up a trace and let it run.
You can also apply quite a bit of filtering to the trace, so you don't need to track everything; you can also direct the output to a file or a SQL table for later analysis. This is probably your best bet for a time-limited audit.
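On newer versions of SQL Server, an Extended Events session is the recommended replacement for Profiler traces and captures the same information; a minimal sketch (the session, database, and file names are placeholders):

    -- Capture stored procedure calls, including parameter values, in the RPC text
    CREATE EVENT SESSION AuditProcCalls ON SERVER
    ADD EVENT sqlserver.rpc_completed (
        ACTION (sqlserver.client_app_name, sqlserver.username)
        WHERE (sqlserver.database_name = N'MyDatabase')  -- placeholder
    )
    ADD TARGET package0.event_file (SET filename = N'AuditProcCalls');

    ALTER EVENT SESSION AuditProcCalls ON SERVER STATE = START;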
I think I've used the SQL Server Profiler (http://msdn.microsoft.com/en-us/library/ms181091.aspx) in the past to audit SQL execution. It's not something you would run all the time, but you can get a snapshot of what's running and how it's being executed.
I haven't tried using them, but you might look at event notifications and see if they will work for you.
From BOL
Event notifications can be used to do the following:
Log and review changes or activity occurring on the database.

SQL Server trigger to update Sharepoint List

So, it looks like I'm gonna have to replicate a couple of reference tables from my SS2k5 DB to SP2k7 in order to do dropdown boxes on my document library. Small tables, maybe a hundred entries, and not often updated. The SP server is not the SS server.
I know how to build triggers, but how do I reference the SP table to update it from the SS trigger, and what are the authentication issues?
Anybody do this before?
I know there is a thing called the Business Data Catalog or something like that, but I don't have full privileges on this SP site, so I'm likely not able to get to that, and I've never used it before; hence the trigger idea.
Does it really need to be real time via a trigger? Or can it be delayed and processed via an ETL job? If the latter is acceptable, I recommend taking a look at Extracting and Loading SharePoint Data in SQL Server Integration Services. I have used this adapter on past projects to transfer data between SQL Server and SharePoint.
P.S. I would not recommend writing directly to the SharePoint content database. Making changes directly to a content database is not supported and is not considered a best practice.