SQL - Continuous Integration (Data)

This is a general question, and there are probably some solutions already. Most of the things I have found are related to database development, deployment, etc.
I am looking for a process that runs daily and performs some checks against some tables of a database. The data in these tables is loaded by many users, but the idea is that, by defining some rules, the process will detect "wrong" values entered by users.
I know this is a very open question, but do you know if this is possible with some tools: Jenkins, DBGhost, etc.?
Thank you,
Kat

You have many options. Here's one train of thought.
Create a table called data_audit with fields like so:
audit_datetime
table
field
wrong_value
rule_violated
issue_description
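In SQL Server, for example, the table could look something like this (a minimal sketch; the column types are assumptions, and table/field are renamed because TABLE is a reserved word):

    -- Minimal sketch of the audit table (column types are assumptions)
    CREATE TABLE data_audit (
        audit_datetime    DATETIME2      NOT NULL DEFAULT SYSDATETIME(),
        table_name        SYSNAME        NOT NULL,
        field_name        SYSNAME        NOT NULL,
        wrong_value       NVARCHAR(500)  NULL,
        rule_violated     NVARCHAR(200)  NOT NULL,
        issue_description NVARCHAR(1000) NULL
    );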
Create stored procedures/functions that detect wrong values and write the details into this audit table.
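As a sketch, each rule can be a query that writes its violations to the audit table (the orders table and the rule here are hypothetical):

    CREATE PROCEDURE check_data_rules AS
    BEGIN
        -- Hypothetical rule: order amounts must not be negative
        INSERT INTO data_audit (table_name, field_name, wrong_value, rule_violated, issue_description)
        SELECT 'orders', 'amount', CAST(amount AS NVARCHAR(500)),
               'amount >= 0', 'Order amount is negative'
        FROM orders
        WHERE amount < 0;
    END;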
Depending on your database, you can run the stored procedure on a schedule. For example, if you have SQL Server, you can run the job using SQL Agent. Once the job is finished, you can run another job that checks count(*) from the audit table for today's date. If the count is higher than zero, use the Database Mail feature to email the relevant people to take action.
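That follow-up step could be as simple as this (a sketch; the mail profile and recipients are placeholders you would have to configure):

    DECLARE @violations INT;
    SELECT @violations = COUNT(*)
    FROM data_audit
    WHERE CAST(audit_datetime AS DATE) = CAST(GETDATE() AS DATE);

    IF @violations > 0
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DataQualityProfile',     -- placeholder mail profile
            @recipients   = 'data-team@example.com',  -- placeholder recipients
            @subject      = 'Data audit: rule violations found today',
            @body         = 'Check the data_audit table for details.';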
If you have a database like MySQL or PostgreSQL, write a short program in a language of your choice (PHP/Python/.NET/whatever) that executes the stored procedure, checks count(*), and emails if the count is higher than zero. You can run this program using cron on Linux (or Linux-like systems) or Task Scheduler on Windows.
You could use tools like Jenkins to schedule such activity. Task Scheduler/cron are built into your operating system and are easy to use, so an additional installation like Jenkins is not necessary. If you already have Jenkins installed, you can certainly piggy-back on it.

Related

How to regularly update or create a SQL Server table?

I need to collect data from a SQL Server table, format it, and then put it into a different table.
I have access to SQL Server but cannot set up triggers or scheduled jobs.
I can create tables, stored procedures, views and functions.
What can I set up that will automatically collect the data and insert it into a SQL Server table for me?
I would probably create a stored procedure to do this task.
In the stored procedure you can use a CTE or temp tables (depending on the task) and do all the data manipulation you require; once done, you can use INSERT INTO ... SELECT to move all the data from the temp table into the SQL Server table you need (SELECT INTO, by contrast, creates a brand-new table, so it only applies when the target doesn't already exist).
https://www.w3schools.com/sql/sql_select_into.asp
You can then schedule this stored procedure to run at a time of your choosing.
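A minimal sketch of what that could look like (all table and column names here are made up):

    CREATE PROCEDURE dbo.CollectAndFormatData AS
    BEGIN
        -- Stage and format the source data in a temp table
        SELECT customer_id,
               UPPER(LTRIM(RTRIM(customer_name))) AS customer_name
        INTO #staged
        FROM dbo.SourceTable;

        -- Move the formatted rows into the existing target table
        INSERT INTO dbo.TargetTable (customer_id, customer_name)
        SELECT customer_id, customer_name
        FROM #staged;
    END;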
A database is just a storage container. It doesn't "do" things automatically all by itself. Even if you did have the access to create triggers, something would have to happen to the table to cause the trigger to fire, typically a CRUD operation on the parent table. And something external needs to happen to initiate that CRUD operation.
When you start talking about automating a process, you're talking about the function of a scheduler program. SQL Server has one built in, the SQL Agent, and depending on your needs you may find that it's appropriate to enlist help from whoever in your organization does have access to it. I've worked in a couple of organizations, though, that only used the SQL Agent to schedule maintenance jobs, while data manipulation jobs were scheduled through an outside resource. The most common one I've run across is Control-M, but there are other players in that market. I even ran across one homemade scheduler protocol that was just built in C#.NET that worked great.
Based on the limitations you lay out in your question, and the comments you've made in response to others, it sounds to me like you need to socialize your challenge within your organization to find out what their routine mechanism is for setting up data transfers. It's unlikely that this is the first time it's come up, unless the company was founded in the last week or two. It will probably require that you set up your code, probably a stored procedure or maybe an SSIS package, and then work with someone else, perhaps a DBA or a Site Operations team or some such, to get that process automated to fire when you need it to, whether through an Agent job or maybe a file listener.
Well, you have two major options: stored procedures (SPs) and SSIS.
Both of them can be scheduled to run at a given time with a simple Job from the SQL Server Agent. Keep in mind that if you are doing this on a separate server you might need to add the source server as a Linked Server so you can access it from the script.
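Registering the source as a linked server is a one-time setup along these lines (server and database names are placeholders):

    -- Register the remote source server (names are placeholders)
    EXEC sp_addlinkedserver
        @server     = 'SOURCE_SRV',
        @srvproduct = '',
        @provider   = 'SQLNCLI',
        @datasrc    = 'source-host';

    -- Then reference its tables with a four-part name
    SELECT * FROM SOURCE_SRV.SourceDb.dbo.SourceTable;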
I've done this approach in the past and it has worked great. Note, for security reasons, I am not able to access the remote server's task scheduler, so I go through the SQL Server Agent:
Run a SQL Server Agent job on a schedule of your choice
Use the SQL Server Agent to call an SSIS Package
The SSIS Package then calls an executable which can pull the data you want from your original table, evaluate it, and then insert a formatted version of it, one record at a time. Alternatively, you can simply create a C# script within the SSIS package via a Script Task.
I hope this helps. Please let me know if you need more details.

What's the best way to supply up-to-date data periodically to an application?

I would like to make an app in Xcode where users get scored based on football results. For this, I need to update a table of football scores every weekend. What would be the best way to go about this? I assume it would be through SQL queries, but what is the industry standard for grabbing up-to-date data?
The solution is going to depend on your DBMS, but here are a couple of ideas to get you started.
Utilizing SQL Server, you can create a job and schedule it to run on the server on a specific schedule; in that job you can write T-SQL code that reads CSV files and updates your database tables (see the sketch after this list)
You can write an integration package with SSIS and schedule that package to run weekly
You can write a solution in Python or another scripting language and schedule that to run on your server with Windows Task Scheduler
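For the CSV idea mentioned above, T-SQL's BULK INSERT can load the file directly; a sketch (path and table name are placeholders):

    -- Load the weekend's results from a CSV file (path/table are placeholders)
    BULK INSERT dbo.FootballScores
    FROM 'C:\data\scores.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2    -- skip the header row
    );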

SQL: Automatically copy records from one database to another database

I am trying to find an ideal way to automatically copy new records from one database to another. The databases have different structures! I achieved it by writing VBS scripts which copy the data from one to the other, triggered from another application which passes arguments to the scripts. But I ran into issues at points where there were more than 100 triggers, i.e. 100 wscript processes trying to access the database, and they couldn't complete the task.
I want to find a simpler solution inside SQL. I have read about setting up triggers, stored procedures run from SQL Agent, replication, etc. The requirement is that I have to copy records to the other database periodically, or whenever a new record is inserted.
Which method will suit me the best?
You can use CDC (Change Data Capture) to do this. Create an SSIS package using CDC and run that package periodically through a SQL Server Agent job. CDC will record all the changes to the source table, and the package will apply those changes to the destination table when you run it. Please follow the link below.
http://sqlmag.com/sql-server-integration-services/combining-cdc-and-ssis-incremental-data-loads
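For reference, enabling CDC is a one-time setup per database and per table, roughly like this (database and table names are placeholders):

    -- Enable CDC for the database, then for each table to track
    USE SourceDb;
    EXEC sys.sp_cdc_enable_db;

    EXEC sys.sp_cdc_enable_table
        @source_schema = N'dbo',
        @source_name   = N'Orders',   -- placeholder table
        @role_name     = NULL;        -- no gating role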
The word periodically in your question suggests that you should go for jobs. You can schedule jobs in SQL Server using SQL Server Agent and assign a frequency; the job will then run your script on that schedule.
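A job can be created through the SSMS UI or with the msdb stored procedures; a rough sketch (job name, schedule and command are placeholders):

    USE msdb;
    EXEC sp_add_job @job_name = N'CopyNewRecords';
    EXEC sp_add_jobstep
        @job_name  = N'CopyNewRecords',
        @step_name = N'Run copy script',
        @subsystem = N'TSQL',
        @command   = N'EXEC TargetDb.dbo.CopyNewRecords;';  -- placeholder procedure
    EXEC sp_add_schedule
        @schedule_name        = N'Every15Minutes',
        @freq_type            = 4,    -- daily
        @freq_interval        = 1,
        @freq_subday_type     = 4,    -- units of minutes
        @freq_subday_interval = 15;
    EXEC sp_attach_schedule @job_name = N'CopyNewRecords', @schedule_name = N'Every15Minutes';
    EXEC sp_add_jobserver @job_name = N'CopyNewRecords';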
PrabirS: Change Data Capture
This is a good option, because it uses the transaction log to create something similar to the Command Query Responsibility Segregation (CQRS) pattern.
Alok Gupta: A SQL Job that runs in the SQL Agent
This too is a good option, provided you have something like a modified date so that you can filter for the altered data. You can create a stored procedure and let it run regularly via the SQL Agent.
A third option could be triggers (the change will happen in the same transaction).
This option is useful for auditing and logging, but you should definitely avoid writing business logic in triggers, as triggers are more or less hidden and fire without being called directly (similar to CDC, actually). I actually created a trigger about half a year ago that captured the data and inserted it somewhere else in XML format, since the columns in the original table could change over time (multiple projects using the same database(s)).
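As a sketch of that kind of audit trigger (the tables here are hypothetical; OrdersAudit would have a datetime column and an xml column):

    CREATE TRIGGER trg_Orders_Audit
    ON dbo.Orders
    AFTER INSERT, UPDATE
    AS
    BEGIN
        -- Store the changed rows as XML so later column changes don't break the audit
        INSERT INTO dbo.OrdersAudit (changed_at, row_data)
        SELECT GETDATE(),
               (SELECT * FROM inserted FOR XML PATH('row'), TYPE);
    END;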
-Edit-
By the way, your question more or less suggests a lack of a clear design pattern, and that the technique used is not the main problem. You could try to read up on how an ETL layer is built, or try to implement a "separation of concerns". Note: it is hard to tell if this is the case, but given how you formulated your question, an unclear design is something that pops up in my mind as a possible problem.

Is there a way to automate a series of SQL processing scripts?

Assume I know or am able to learn any adjacent technology/language--what's the best way to go about automating a number of processing/summary SQL scripts?
I have a number of scripts, clean-up (e.g., update, delete), processing (e.g., joins), and post-processing summaries, that I wrote last month but would like to automate. What's the preferred method of automating the entire process as a series of sequential scripts?
EDIT: All of this is run on MySQL dbs.
Similar to MAW's answer, except I would use a Windows Service instead of a command line app (no GUI), and split the individual DB scripts into separate tasks within the service so that they can, if necessary, be called at different intervals, log their results separately, and be independently configured.
This depends greatly on your DBMS. In SQL Server there are scheduled jobs which can run any combination of stored procedures/commands on a schedule, similar to Windows scheduled tasks.
Since you've tagged this question with mysql, I'll assume that's what you're running.
One way to do this on mysql is:
http://dev.mysql.com/tech-resources/articles/event-feature.html
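With the event feature, the schedule lives in the database itself; something like this (the procedure name is a placeholder):

    -- The event scheduler must be enabled
    SET GLOBAL event_scheduler = ON;

    CREATE EVENT nightly_processing_event
    ON SCHEDULE EVERY 1 DAY
    DO
        CALL nightly_processing();  -- placeholder stored procedure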
If all else fails, you can reference all of these scripts in a stored procedure, then create a simple command-line program which connects to the db and calls that procedure. Then schedule that program with Windows Task Scheduler or similar.

Best practices for writing SQL scripts for deployment

I was wondering what are the best practices in order to write SQL scripts to set up databases for production and/or development, for instance:
Should I include the CREATE DATABASE statement?
Should I create users for the database in the same script?
Is it correct to disable FK checks before executing the body of the script?
May I include the whole script in a transaction?
Is it better to generate one script per database than one script for all of them?
Thanks!
The problem with your question is that it is hard to answer, as it depends on the way the scripts are used and on what you are trying to achieve. You also don't say which DB server you are using, as there are tools provided which can make some tasks easier.
Taking your points in order, here are some suggestions, which will probably be very different from everyone else's :)
Should I include the CREATE DATABASE statement?
What alternative are you thinking of using? If your question is whether you should put the CREATE DATABASE statement in the same script as the table creation, it depends. When developing a DB I use a separate create-DB script, since I have a script to drop all objects and so I don't need to create the database again.
Should I create users for the database in the same script?
I wouldn't, simply because the users may well change while your schema does not. You might as well manage those changes in a smaller script.
Is it correct to disable FK checks before executing the body of the script?
If you are importing the data in an attempt to recover the database, then you may well have to if you are using auto-increment IDs and want to keep the same values. Also, you may end up importing the tables "out of order" and not want the checks performed.
May I include the whole script in a transaction?
Yes, you can, but again it depends on the type of script you are running. If you are importing data after rebuilding a db, then the whole import should work or fail. However, your transaction log is going to be huge during the import.
Is it better to generate one script per database than one script for all of them?
Again, for maintenance purposes it's probably better to keep them separate.
This probably depends on what kind of database it is and how it is used and deployed. I am developing an n-tier standard application that is deployed at many different customer sites.
I do not add a CREATE DATABASE statement in the script. Creating the database is part of the installation script, which allows the user to choose server, database name and collation.
I have no knowledge about the users at my customers' sites, so I don't add create-user statements; the only user that needs access to the database is the user executing the middle-tier application.
I do not disable FK checks. I need them to protect the consistency of the database, even if it is I who wrote the body scripts. I use FK to capture my errors.
I do not include the entire script in one transaction. I require users to take a backup of the db before they run any db upgrade scripts. When creating a new database there is nothing to protect, so running in a transaction is unnecessary. For upgrades there are sometimes extensive changes to the db. A couple of years ago we switched from varchar to nvarchar in about 250 tables; not something you would want to do in one transaction.
I would recommend you to generate one script per database and version control the scripts separately.
Direct answers; please ask if you need me to expand on any point.
* Should I include the CREATE DATABASE statement?
Normally I would include it since you are creating and owning the database.
* Should I create users for the database in the same script?
This is also a good idea, especially if your application uses specific users.
* Is it correct to disable FK checks before executing the body of the script?
If the script includes data population, then it helps to disable them so that the order is not too important; otherwise you can end up with complex scripts that insert (without the FK link), create the FK record, and then update the FK column (see the sketch after this list).
* May I include the whole script in a transaction?
This is normally not a good idea, especially if data population is included, as the transaction can become quite unwieldy. Since you are creating the database, just drop it and start again if something goes awry.
* Is it better to generate one script per database than one script for all of them?
One per database is my recommendation so that they are isolated and easier to troubleshoot if the need arises.
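For the FK point above, in SQL Server disabling and re-enabling the checks looks like this (the table name is a placeholder):

    -- Disable all FK checks on a table during data population
    ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL;

    -- ... populate data ...

    -- Re-enable and re-validate the constraints afterwards
    ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;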
For development purposes it's a good idea to create one script per database object (one script for each table, stored procedure, etc). If you check them into your source control system that way then developers can check out individual objects and you can easily keep track of versions and know what changed and when.
When you deploy you may want to combine the changes for each release into one single script. Tools like Red Gate SQL Compare or Visual Studio Team System will help you do that.
Should I include the CREATE DATABASE statement?
Should I create users for the database in the same script?
That depends on your DBMS and your customer.
In an Oracle environment you will probably never be allowed to do such a thing (mainly because in the Oracle world a "database" is something completely different than e.g. in the PostgreSQL or MySQL world).
Sometimes the customer will have a DBA that won't let you create databases (or schemas or users - depending on the DBMS in use). So you will need to supply that information to the DBA in order for him/her to prepare the environment for your script.
May I include the whole script in a transaction?
That totally depends on the DBMS that you are using.
Some DBMS don't support transactional DDL and will implicitly commit any transaction when you execute a DDL statement, so you need to consider the order of your installation script.
For populating the tables with data I would definitely try to do that in a single transaction, but again this depends on your DBMS.
Some DBMS are faster if you commit only once or very seldom (Oracle and PostgreSQL fall into this category) but will slow down if you commit more often.
Other DBMS handle smaller but more frequent transactions better and will slow down if the transactions get too big (SQL Server and MySQL tend to fall into that category).
The best practices will differ considerably depending on whether it is a first-time set-up or a new version being pushed. For the first-time set-up, yes, you need create database and create table scripts. For a new version, you need to script only the changes from the previous version, so no create database and no create table unless it is a new table. Now you need alter table statements because you don't want to lose the existing data. I usually write stored procs, functions and views with a drop-and-create statement, as dropping those objects doesn't generally affect the underlying data.
I find it best to create all database changes with scripts that are stored in source control under the version. So if a client is new, you run the create version 1.0 scripts, then apply all the other versions in order. If a client is just upgrading from version 1.2 to version 1.3, then you run just the scripts in version 1.3 source control repository. This would also include scripts to populate or add records to lookup tables.
For transactions you may want to break them up into several chunks so as not to leave a prod database locked in one transaction.
We also write reversal scripts to return to the old version if need be. This makes life easier if you have a part of a change that causes unanticipated problems on prod (usually performance issues).