How to use Triggers in SQL Server 2012 to archive and delete - sql-server-express

I will be honest: I know nothing about SQL Server other than what I have tried to pack into my brain in the last two days. I found a couple of scripts on your website that sounded like they would work, in particular What are ways to move data older than 'Y' days to an archive/history table in MySQL?, which really seems like it would fit my needs.
But I want to insert the data into a table or database on another partition of the same server and can't figure out how to change the location.
I have SQL Server 2012 Express, running on Windows Server 2008 R2 Service Pack 1. We started the database on 11/21/2013 and we hit the 10 GB limit on 12/30/13. We design crowns & bridges, implants and dentures, so we have multiple CT scans per patient that get manipulated in 3D imaging and CAD programs multiple times, which creates a lot of data very quickly.
Questions:
Should I try to use the triggers built in to my PatientNetDB? [OnAfterDeleteDataSets & OnAfterInsertDatasets]
If so how do I change it to make it work like the question from the user I copied above?
We may need to pull data back out of this archive, how in the heck do I do that?
I really appreciate any help you can give me; remember, I am a total newb to this stuff and will unfortunately need extremely simple step-by-step or copy-and-paste directions/scripts.
Thank you so much!
Linda Saylor

No, don't use triggers for archiving/deleting. Triggers are fired when specific operations occur - INSERT, UPDATE or DELETE - on certain tables, and you cannot control when and how often they fire. Therefore, triggers should be very small and nimble - you should NOT put large, long-running operations into a trigger. A typical trigger might update a row that's been inserted, or it might put a row into a separate table (an Audit or Command table) - but the trigger itself should never do much processing.
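As an illustration of how small a trigger should stay, here is the kind of one-insert audit trigger the answer describes, held in a Python string. The table and column names (dbo.Scans, dbo.ScanAudit, ScanID) are invented placeholders, not objects from the poster's PatientNetDB:

```python
# Illustrative only: a trigger that does exactly one quick insert and
# nothing else -- no loops, no big scans, no long-running work.
TRIGGER_SQL = """\
CREATE TRIGGER trg_Scans_Audit ON dbo.Scans
AFTER INSERT AS
BEGIN
    SET NOCOUNT ON;
    -- copy just the new keys into the audit table
    INSERT INTO dbo.ScanAudit (ScanID, AuditedOn)
    SELECT ScanID, GETDATE() FROM inserted;
END;
"""
```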
What you can and should do is use scheduled tasks - unfortunately, SQL Server Agent is not available in the Express edition. With SQL Server Agent you could run certain processing operations (T-SQL scripts) at scheduled intervals, e.g. once every night.
Since you're using the Express edition, you'll have to find another way to run a task at given times - for example by writing a small wrapper in your language of choice (C#, VB.NET, whatever), scheduling it with the Windows scheduler (Scheduled Tasks in your Windows Start menu), and having it execute a T-SQL script that runs the cleanup process and archives your data.
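A minimal sketch of such a wrapper, assuming invented placeholder names throughout (dbo.Scans, dbo.ScansArchive, a CreatedOn column, and a 30-day cutoff): the wrapper holds the archive T-SQL and builds the sqlcmd command line a Windows Scheduled Task would run.

```python
# Hypothetical nightly cleanup job for the Express edition. The wrapper
# would save ARCHIVE_SQL to a .sql file, and the Scheduled Task would
# run the command returned by sqlcmd_args().
ARCHIVE_SQL = """\
BEGIN TRANSACTION;
INSERT INTO dbo.ScansArchive
    SELECT * FROM dbo.Scans
    WHERE CreatedOn < DATEADD(DAY, -30, GETDATE());
DELETE FROM dbo.Scans
    WHERE CreatedOn < DATEADD(DAY, -30, GETDATE());
COMMIT TRANSACTION;
"""

def sqlcmd_args(script_path: str, server: str = r".\SQLEXPRESS") -> list:
    """Command line for the task: -E = Windows auth, -b = stop on errors."""
    return ["sqlcmd", "-S", server, "-E", "-b", "-i", script_path]
```

Pulling data back out of the archive would then just be the reverse INSERT...SELECT from ScansArchive into Scans for the rows you need.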

Related

SQL database copied and updated

I have a main system that I use to add records to and run multiple routines on an SQL database using MS Access. Some routines take days to run.
I want to build a second PC where I can hopefully easily update its copy of the database and then run the long routines on while continuing to keep up the day-to-day activities on the main system.
How easy (or feasible) is it to take a copy of a SQL database from one computer and update it on another?
Those processing times sound rather large. I would suggest that you consider building some server-side routines that run on SQL Server – they will run much faster and, more importantly, reduce if not eliminate most network traffic. Keep in mind that working with lots of data from Access against SQL Server can and often will run SLOWER than just using Access without SQL Server.
As for moving the SQL database to another computer? The idea and concept are much like moving a file such as an Access file to another computer. From SQL Server management tools on the first computer, you simply create a backup file (it will be a single file – choose a device, then add a file location). You'll also find that such .bak files zip REALLY well, so if you're using FTP or the internet, zip the file before you transmit it.
Regardless, you can even transfer that .bak file with a jump drive or whatever. You then restore that database on the other computer and you're off to the races: on the target computer, simply choose "restore" and restore the database from the .bak file you transferred to the other computer running SQL Server. So the whole database, with its many tables etc., is turned into a single file – much like an Access file is.
So moving and making copies of a SQL database is a common task, and once you get the hang of this process, it's not really much harder than moving an Access file between computers.
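The backup/restore pair described above boils down to two T-SQL statements. A sketch, with the database name and file paths as placeholders you would adjust to your environment:

```python
# Build the two statements for the move: BACKUP on the source server,
# RESTORE on the target after copying the .bak file across.
def backup_sql(db: str, bak_path: str) -> str:
    return f"BACKUP DATABASE [{db}] TO DISK = N'{bak_path}' WITH INIT;"

def restore_sql(db: str, bak_path: str) -> str:
    # WITH REPLACE overwrites an existing copy on the target server
    return f"RESTORE DATABASE [{db}] FROM DISK = N'{bak_path}' WITH REPLACE;"
```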
I would, however, question why the processes are taking so long. They may well be complex, but the use of stored procedures and pass-through queries would substantially speed up your application compared to processing the data with a local client like Access + VBA. So try adopting more T-SQL and stored procedures – they will run "many" times faster; often 1 hour can be cut down to a minute or less. So moving is easy, but you might cut that whole "day" of processing down to a few minutes if you can run the processing routines server side as opposed to client side, which is what happens when Access is the client. (The Access client can most certainly call T-SQL routines that run server side – the main trick here is to get those processing routines running server side.)
Last time I used MS Access it was just a file, like an Excel file, which you could simply copy to the other machine and do whatever you want with.
If you are fairly comfortable with SQL/administration, and only need a one-way copy (main system to second PC), then you could migrate MS Access to MySQL:
http://www.kitebird.com/articles/access-migrate.html
(Telling Microsoft Access to Export Its Own Tables)
This process should be fairly easy to automate if you need to do this regularly.

SQL Server database : amalgamate 90 database update scripts into a single script

I have an application that has been released for several years. I have updated the SQL Server database around 90 times and each time I have created an update SQL script, so when the user runs the new software version the database is updated.
I would like to amalgamate all the updates into a single SQL script to make deployment faster and simpler, is there an easy way to do this? I presume I could simply grab the latest database after it has run through all 90 updates via SQL Server Management Studio?
EDIT
For clarification: the software I wrote automatically applies new database updates when the user downloads the latest version. This is done via C#/.NET; on startup it looks for embedded SQL scripts named in the format XX_update.sql and calls each script one by one, i.e.
1_update.sql - this creates the tables and initial data etc. This was my initial release database.
2_update.sql - updates to the initial database such as adding a new SP or changing column datatype etc
3_update.sql
4_update.sql
...
90_update.sql (4 years and lots of program updates later!).
Ideally, I would install my software and create a brand new database running through all 90 update scripts. Then take this database and convert it into a script which I can replace all 90 scripts above.
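For reference, one subtlety in a numbered-script scheme like the one above: a plain alphabetical sort runs 10_update.sql before 2_update.sql, so an updater has to sort on the leading number. A minimal sketch:

```python
# Order XX_update.sql files by their numeric prefix, not alphabetically.
def ordered_updates(names: list) -> list:
    return sorted(names, key=lambda n: int(n.split("_", 1)[0]))
```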
This is too long for a comment.
There is no "easy" way to do this. It depends on what the individual updates are doing.
There is a process you can follow in the future, though. Whenever an update occurs, you should maintain scripts both for incremental updates and for complete updates. You might also want to periodically introduce major versions, and be able to upgrade to and from those.
In the meantime, you'll need to build the complete update by walking through the individual ones.
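The bookkeeping this suggests can be sketched in a few lines: record the version a database is at, and apply only the scripts past it. The naming scheme here matches the question's XX_update.sql convention; the version numbers are illustrative:

```python
# Given the schema version a database is at and the latest script number,
# return the incremental scripts still to be applied, in order.
def pending(current: int, latest: int) -> list:
    return [f"{n}_update.sql" for n in range(current + 1, latest + 1)]
```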
I use a similar system at work, and while I prefer to run the scripts separately, I have sometimes amalgamated several scripts when they had to be deployed by another person, with no problems.
In SQL Server the rule is: as long as you separate the scripts with GO and use SQL Server Management Studio or another tool that processes the batch separator properly, there is no problem in amalgamating them, because they still look like separate scripts to SQL Server (instead of being sent to SQL Server as one big script, the tool sends it in batches, using GO as the batch separator).
The only problem is that if you have an error in the middle of the script, the tool will continue sending the rest of the batches to the server (at least by default; maybe there is an option to change this). That is why, when possible, I prefer to run them separately – to err on the safe side, stop if there is a problem, and easily locate the problematic command.
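What "sends it in batches" means concretely: GO is a client-side separator, not T-SQL, and the tool splits the file on it before sending each piece. A minimal sketch of that splitting:

```python
# Split a script the way client tools do: on lines consisting of GO.
def split_batches(script: str) -> list:
    batches, current = [], []
    for line in script.splitlines():
        if line.strip().upper() == "GO":
            if current:
                batches.append("\n".join(current))
            current = []
        else:
            current.append(line)
    if current:
        batches.append("\n".join(current))
    return batches
```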
Also, for new deployments your best option is, as you say, to use a database that is already updated instead of taking the original one and applying all the updates. And to avoid being in this situation again, you can keep an empty template database that is not used for any testing, and update it whenever you update the rest of your databases.
Are you running your updates manually on the server? Instead, you can create a .bat file, PowerShell script or an exe. The update scripts in your folder can be numbered, and the script can pick up the updates one by one and execute them over the database connection instead of you doing it.
If you have multiple script files and you want to combine them into one: rename them as .txt, combine them, then rename the resulting file back to .sql.
https://www.online-tech-tips.com/free-software-downloads/combine-text-files/
This is easy. Download SQL Server Data Tools and use that to create the deployment script. If there's no existing database, it will create all the objects, and if targeting an older version it will perform a diff against the target database and create scripts that create the missing objects, and alter the existing ones.
You can also generate scripts for the current version using SSMS, or use SSMS to take a backup and restore it during the install.

Update on table really slow in ACCESS, fast in Management Studio

There is a table with a trigger in our SQL Server 2014 database. When updating a record from Microsoft SQL Server Management Studio, the update takes about 1/10 of a second. When the record is changed in MS Access (in a form), it takes 5-6 seconds. There is a trigger on that table, but the same operation run directly from Management Studio is fast, so it is not a problem with the trigger itself. The trigger has to insert one record into a 15-million-row table. I am looking for reasons why the same operation from Access can take 40 times longer than from Management Studio. Any suggestions? Or links to known issues on this topic?
EDIT:
It's Access 2003. It's a subform bound to a form by an ID field. I am editing a simple integer column of the record. I use a normal connection to my SQL Server; it's an ADP project and we just use the typical connection for it. I tried to do the update from VBA (in the same project) from a simple module – same effect. So it does not matter whether it is edited from the form or an update command is sent from another module to that table; it still takes long.
To narrow things down, I would still suggest disabling the trigger and seeing whether the issue persists.
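The test is two one-line statements: disable the trigger, retry the Access update, then re-enable it. A sketch that builds them (trigger and table names are placeholders for whatever is on your table):

```python
# Build DISABLE/ENABLE TRIGGER statements for the diagnostic test.
def toggle_trigger(name: str, table: str, enable: bool) -> str:
    verb = "ENABLE" if enable else "DISABLE"
    return f"{verb} TRIGGER [{name}] ON [{table}];"
```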

New to SQL servers, wanting to schedule table / data moves every X hours

I have looked at a few Stack Overflow posts but nothing fits (or at least I don't think so) what I need help with.
I'm looking for general advice. My company has tasked me with looking at moving some data from tables stored in our parent company's databases into a database of our own that has all the information we need in one place.
For instance, if we want information related to one thing, we may have to pull data from several different databases. I think I can get my head around moving the data and can create a SQL query to do it; however, we're currently using SQL Server Express as our DB (the company is more than happy to buy/create a SQL Server, but as far as we can see SQL Express does what we need it to – feel free to correct me).
I need to look at scheduling the data move for every hour or every few hours to keep the data up to date for when reports are generated from it.
I have looked at a few programs, but as the queries and the database are on a Server 2008 R2 system, some of the programs don't like it, as they were last updated pre-2010. I have also installed SQL management suite 2012 because of SQL Server Agent, but I can't even get that working (the service is enabled and I have restarted the DB, but still nothing within the suite).
I'm not looking for a 'do this and that' type reply (though I'm happy to accept that amount of help), but if you guys/gals can point me in the right direction, that would be great.
Summary:
-Combining data already on databases from our parent company into a table / DB of our own making
-Currently using SQL Express but willing to upgrade to something else that does the job
-Schedule the data moves for every X hours (Windows scheduling?)
-Automating the entire thing so we don't have to do the moves manually.
Help on any of the points above would be greatly appreciated and I would 'love you long time' for the help.
JB
There are a bunch of limitations in SQL Server Express. One of them is that SQL Server Agent is not supported; SSIS, like SQL Agent, is not supported either.
http://msdn.microsoft.com/en-us/library/cc645993.aspx
Do not fret, you can always schedule a job with Windows Scheduler.
http://windows.microsoft.com/en-US/windows/schedule-task#1TC=windows-7
As for moving the data, it is up to you to select a solution.
1 - Write a PowerShell application to perform the Extract, Translate, and Load (ETL).
http://technet.microsoft.com/en-us/library/cc281945(v=sql.105).aspx
2 - Use the SQLCMD to perform logic like calling stored procedures.
http://technet.microsoft.com/en-us/library/ms162773.aspx
3 - Use BCP to dump and load data.
http://craftydba.com/?p=1245
http://craftydba.com/?p=1255
http://craftydba.com/?p=1584
http://craftydba.com/?p=1690
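Wiring the pieces above together amounts to one schtasks registration that runs sqlcmd on a schedule. A sketch; the task name, server, stored procedure and 2-hour interval are all placeholders, and you would run the resulting line once from an elevated prompt:

```python
# Compose a Windows Scheduled Task that calls a stored procedure via
# sqlcmd every N hours (-E = Windows auth, -Q = run query and exit).
def schtasks_command(task: str, server: str, proc: str, every_hours: int) -> str:
    run = f'sqlcmd -S {server} -E -Q "EXEC {proc}"'
    return (f"schtasks /Create /TN {task} /SC HOURLY /MO {every_hours} "
            f"/TR '{run}'")
```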
It is funny how youngsters think they need to spend a lot of $ to create a solution for a business. However, Microsoft supplies you with a lot of free tools; you just have to put them together into a solution.
PS: I remember about 10 years ago I created a custom ETL solution using VBScript. Unlike PowerShell, it is installed on old and new systems alike.
Good luck!
You can create a console application which executes the particular stored procedure that handles your logic. ( http://dotnet.dzone.com/articles/basics-stored-procedures-net )
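In any language, such a console app boils down to three steps: open a connection, execute one stored procedure, commit. A language-neutral sketch in DB-API style (the proc name dbo.usp_MoveData is a placeholder; a real app would build `conn` from its driver and connection string):

```python
# Minimal shape of the console app: run one stored procedure and commit.
def run_proc(conn, proc: str):
    cur = conn.cursor()
    cur.execute("EXEC " + proc)
    conn.commit()
    return cur
```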
Of course using SSIS is much easier but it's not available in SQL Server Express Edition.
I think you should have a look at Integration Services (SSIS), which is not available in the Express edition. Have a look at this article to get started with SSIS.

Microsoft SQL Server 2005 Merge replication deleting records on subscriber

I created a simple merge replication. I used the default settings to create the publication, which created a script for a snapshot job that occurs every 14 day(s) at 12:05:00 AM. First, I am not sure why it needs to run every 14 days. Second, after researching for hours, I could not figure out how not to replicate deletes. Whenever I create the subscription and run the script, it creates a mirror image of everything the publisher has and deletes any records in the subscriber that were not in the publisher. I need the publisher and subscriber to be merged without any deletes... just inserts and updates.
I am a beginner when it comes to Microsoft SQL Server Management Studio 2005. If I need to modify any scripts or stored procedures, could you please let me know how to do that? Whenever I try to modify a script, it asks me to save it to a file, and I am not sure how to get the changes applied to the database.
Thanks
-Dimitry