Using a Windows service to read/update SQL Server 2008 R2 - sql

I am new to the Entity Framework (MVC) methodology. What I need to do is create a Windows service that queries the table messageinfo in the database for a column msgtype.
The msgtype I am looking for is 10 (which translates to "to be archived").
In such a case I need to move (not copy) the whole corresponding row to another table, backupmessageinfo.
This has to be done using a Windows service, which can be scheduled to run at a specific time, let's say 12 AM every day.
Please help!
Any pointers much appreciated!!
Philip

IMHO you can achieve the desired result a lot more easily by utilizing SQL Server Agent for scheduling and writing a simple SQL script consisting of INSERT INTO ... SELECT FROM ... and DELETE statements.
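A minimal sketch of such a script, assuming messageinfo and backupmessageinfo have identical column layouts (otherwise list the columns explicitly); this is exactly the kind of batch you would schedule as a daily SQL Server Agent job:

BEGIN TRANSACTION;

-- Copy the rows flagged as "to be archived" (msgtype = 10) into the backup table
INSERT INTO dbo.backupmessageinfo
SELECT * FROM dbo.messageinfo WHERE msgtype = 10;

-- Then remove them from the source table so the rows are moved, not copied
DELETE FROM dbo.messageinfo WHERE msgtype = 10;

COMMIT TRANSACTION;

-- On SQL Server 2008 the same move can also be done in one atomic statement:
-- DELETE FROM dbo.messageinfo OUTPUT DELETED.* INTO dbo.backupmessageinfo WHERE msgtype = 10;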

Related

How to query a table to a view and publish to a different database

I have 13 SQL databases, some 2005 and others 2008, on a VPN. I'd like to take all of the data from the "Employees" table in each database and make it a view at each location. I would then like to publish these views to one database on another server, all in one table, marking where each row came from within the original databases. For example, the database where all the information goes would look like this:
User   Name     Location
bik    Bob K    1
JS     John S   2
Etc.
Any help is appreciated.
I assume you want the data on the final server to be viewable, but not modifiable, and to reflect changes made to the source databases?
This would probably not perform all that well, but one do-it-yourself way to do it would be the following (disclaimer: I haven't tried doing this myself):
Set up all the source servers as linked servers on the final server (see the sketch after the view definition below).
Create a view in this form (the linked server, database, and table names here are placeholders):
CREATE VIEW dbo.AllEmployees AS
SELECT *, 1 AS Location
FROM [Linked Server 1].Database1.dbo.Table1
UNION ALL
SELECT *, 2 AS Location
FROM [Linked Server 2].Database2.dbo.Table2
-- ... and so on for each linked server ...
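Setting up the linked servers for the first step above is done with sp_addlinkedserver; a rough sketch, with the server and data source names as placeholders:

EXEC sp_addlinkedserver
    @server = N'Linked Server 1',          -- name you will reference in queries
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteHost1\Instance1';   -- network name of the remote SQL Server

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'Linked Server 1',
    @useself = N'True';                    -- use the caller's own credentials on the remote server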
You might want to read this documentation on distributed queries, if you haven't already.
I believe it's also possible to use SSIS as the source of a distributed query, but a quick scan through the documentation didn't find anything about it. I mention it because SSIS would make pulling and transforming data from disparate data sources very easy, and if you could use the final recordset as a data source, you could use an SSIS package as the back end to your view. Again, though, performance would probably require considerable tuning.
If the queries don't have to be real time, you could look into using SQL Server Integration Services (SSIS) to pull the data into a local DB. You could schedule the job to run hourly/daily/weekly.

SQL SELECT WHERE value IN ('Huge list of Values')

Note: C# 3.5 application calling a SQL Server 2005 DB on a remote server.
I'm developing a two step process.
1) I search a Windows Indexing Service for a list of files that contain a given word, such as "Bob".
2) I then need to retrieve a list of rows from a DOCUMENT table in a SQL DB by passing in the list of filenames from the Indexing Service.
At the moment I retrieve a list from the indexing service AND all rows from the DOCUMENT table, then filter them in code. This isn't practical, as there are 10,000+ documents and the database is accessed through a firewall.
I considered creating a query such as:
SELECT DocName FROM Documents WHERE DocName IN ({list of files from indexing service})
...but given that the list of files could run to thousands of entries, this won't work.
So, what's the best thing I can do? I don't want to query the DB for all 10,000+ rows and pass them back over the firewall (takes 10 minutes). I somehow need to pass in the list of filenames retrieved from the indexing service.
How would LINQ work in this scenario?
Any advice greatly appreciated.
If you had SQL Server 2008, you could use Table Valued Parameters, but for 2005, there's nothing quite as elegant.
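For reference, a rough sketch of what the SQL Server 2008 TVP approach would look like (the type and procedure names here are made up for illustration):

CREATE TYPE dbo.DocNameList AS TABLE (DocName NVARCHAR(260) NOT NULL);
GO
CREATE PROCEDURE dbo.GetDocumentsByNameList
    @Names dbo.DocNameList READONLY
AS
    SELECT d.*
    FROM dbo.Documents AS d
    JOIN @Names AS n ON n.DocName = d.DocName;
GO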
The simplest solution I can think of is (a sketch follows below):
Create a table in the database
Bulk Insert the results of your Indexing Service into the table
Join your query to this table to filter the results
Retrieve the filtered results
It's not a great solution, but I don't know that a great solution exists - that's why TVPs were created.
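A minimal sketch of that staging-table approach, with hypothetical table and column names (the bulk insert itself would typically come from the application via SqlBulkCopy):

-- Staging table for the file names returned by the Indexing Service
CREATE TABLE dbo.DocNameFilter (DocName NVARCHAR(260) NOT NULL PRIMARY KEY);

-- ... bulk insert the list of file names here ...

-- Filter the DOCUMENT table by joining to the staging table
SELECT d.*
FROM dbo.Documents AS d
JOIN dbo.DocNameFilter AS f ON f.DocName = d.DocName;

-- Clear the staging table for the next search
TRUNCATE TABLE dbo.DocNameFilter;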
You can evaluate different solutions for this kind of "massive" operation; it may not be necessary to use LINQ. For example, try implementing a stored procedure on SQL Server that receives the list of file names as input and returns the list of documents.
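On SQL Server 2005 such a procedure could accept the list as one delimited string and split it with the XML trick; a sketch with hypothetical names, assuming the file names contain no characters that need XML escaping:

CREATE PROCEDURE dbo.GetDocumentsByNames
    @DocNames NVARCHAR(MAX)   -- comma-separated list of file names
AS
BEGIN
    DECLARE @xml XML;
    SET @xml = CAST('<n>' + REPLACE(@DocNames, ',', '</n><n>') + '</n>' AS XML);

    SELECT d.*
    FROM dbo.Documents AS d
    JOIN (SELECT x.value('.', 'NVARCHAR(260)') AS DocName
          FROM @xml.nodes('/n') AS t(x)) AS names
      ON names.DocName = d.DocName;
END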
I opted for a solution similar to what Bazzz mentioned.
I've set up a nightly operation to copy the required fields from the database and set meta tags on the document files (PDFs). The metadata can then be used by the Indexing Service ;o)
This has proved to be a good solution in this instance, but otherwise what Hallainzil said would've been the best option, albeit painful on SQL Server 2005.

Create SQL script that creates database and tables

I have a SQL database and tables that I would like to replicate on another SQL Server. I would like to create a SQL script that creates the database and tables in a single script.
I can create a "Create" script using SQL Server Management Studio for each case (database and tables), but I would like to know whether combining both "Create" scripts into a single script would be enough.
Thanks.
Although Clayton's answer will get you there (eventually), in SQL2005/2008/R2/2012 you have a far easier option:
Right-click on the Database, select Tasks and then Generate Scripts, which will launch the Script Wizard. This allows you to generate a single script that can recreate the full database, including tables/indexes & constraints/stored procedures/functions/users/etc. There are a multitude of options that you can configure to customise the output, but most of it is self-explanatory.
If you are happy with the default options, you can do the whole job in a matter of seconds.
If you want to recreate the data in the database (as a series of INSERTS) I'd also recommend SSMS Tools Pack (Free for SQL 2008 version, Paid for SQL 2012 version).
In SQL Server Management Studio you can right-click on the database you want to replicate and select "Script Database as" to have the tool create the appropriate SQL file to replicate that database on another server. You can repeat this process for each table you want to create, and then merge the files into a single SQL file. Don't forget to add a USE statement after you create your database but prior to any table creation.
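For example, the merged script would follow this shape (names are placeholders):

CREATE DATABASE MyDatabase;
GO
USE MyDatabase;
GO
CREATE TABLE dbo.MyTable (Id INT PRIMARY KEY, Name NVARCHAR(100));
GO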
In more recent versions of SQL Server you can get this in one file in SSMS.
Right click a database.
Tasks
Generate Scripts
This will launch a wizard where you can script the entire database or just portions. There does not appear to be a T-SQL way of doing this.
An excellent explanation can be found here: Generate script in SQL Server Management Studio
Courtesy of Ali Issa, here's what you have to do:
Right-click the database (not the table) and select Tasks --> Generate Scripts
Next --> select the requested table/tables (from "select specific database objects")
Next --> click Advanced --> Types of data to script = Schema and data
If you want to create a script that just generates the tables (no data), you can skip the advanced part of the instructions!
Not sure why SSMS doesn't take execution order into account, but it just doesn't. This is not an issue for small databases, but what if your database has 200 objects? In that case the order of execution does matter, because it's not really easy to go through all of them by hand.
For unordered scripts generated by SSMS you can do the following:
a) Execute the script (some objects will be inserted, some won't; there will be some errors)
b) Remove from the script all objects that have been added to the database
c) Go back to a) until everything is eventually executed
An alternative option is to use a third-party tool such as ApexSQL Script or any of the other tools already mentioned in this thread (SSMS Tools Pack, Red Gate and others).
All of these will take care of the dependencies for you and save you even more time.
Yes, you can add as many SQL statements into a single script as you wish. Just one thing to note: the order matters. You can't INSERT into a table until you CREATE it, and you can't insert a row that references a foreign key until the referenced primary-key row has been inserted.
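A small illustration of that ordering, with made-up table names:

-- 1. Create the referenced table first
CREATE TABLE dbo.Department (DeptId INT PRIMARY KEY, Name NVARCHAR(100));

-- 2. Then the table whose foreign key references it
CREATE TABLE dbo.Employee (
    EmpId  INT PRIMARY KEY,
    DeptId INT NOT NULL REFERENCES dbo.Department (DeptId)
);

-- 3. Insert parent rows before the rows that reference them
INSERT INTO dbo.Department (DeptId, Name) VALUES (1, N'Sales');
INSERT INTO dbo.Employee (EmpId, DeptId) VALUES (1, 1);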

Using 2 different DBs in the same SP in SQL Azure

I am using an SP which will insert data into 2 tables in 2 different DBs. The SP has been designed like that to maintain the transaction. It's working fine in a SQL Server environment.
Like:
INSERT INTO AdminDB.EmpSiteConfig VALUES (,,,)
INSERT INTO MainDB.EmpDetails VALUES (,,,)
where AdminDB and MainDB are the database names.
But when I migrate it to SQL Azure, I am getting an error as follows.
"Reference to database and/or server name in 'MainDB.dbo.EmpDetails' is not supported in this version of SQL Server."
Can somebody tell me how to get rid of this error? Or is there any workaround for this?
Thanks in advance.
SQL Azure does not currently support linking to another server. As to workarounds, you could create a queue message requesting a specific action for data insertion. In your worker role, consume the queue message and call a separate stored procedure on each database.
Although I think it is better to use David Makogon's solution, you might want to take your chances with SQL Shard: http://enzosqlshard.codeplex.com/

How do I make a script in SQL Management Studio 2005?

I have a table in an MS SQL Server db. I want to create a script that will put the table and all its records into another db. So I right-click the table in Management Studio and select Script Table as --> CREATE To --> New Query Editor Window, but all I get is the table structure.
How exactly do I get the values too?
This is one of the things I really like about the tools for MySQL, and that SQL Server is missing out of the box, to be certain.
You can use a script to do it, however.
You might also want to consider using something like Red Gate SQL Compare and Red Gate SQL Data Compare. They aren't cheap tools, priced at $395 each (for the standard editions), but there are 14-day free trials available for download, and they make copying schema and data from one SQL Server to another very easy.
If both are on the same machine (or on different machines but the servers are linked), you can create the table with the script you can generate automatically and then copy the data like this:
INSERT INTO [destinationdb].[dbo].[destinationtable]
SELECT * FROM [originaldb].[dbo].[originaltable]
(Prepend [servername] to the database name if you'll be using linked servers.)
Another option is to enable xp_cmdshell (do so with care; it relaxes security constraints) and use the bcp command-line utility from Management Studio to create copies you can then import into the other database/server. You can also run bcp directly from the shell, and you don't need to enable xp_cmdshell in that case, of course.
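A rough sketch of that bcp round trip run through xp_cmdshell (server, database and path names are placeholders; -T uses Windows authentication, -n keeps the native format):

-- Export from the source database
EXEC xp_cmdshell 'bcp originaldb.dbo.originaltable out C:\temp\originaltable.dat -S SourceServer -T -n';

-- Import into the destination database/server
EXEC xp_cmdshell 'bcp destinationdb.dbo.destinationtable in C:\temp\originaltable.dat -S DestinationServer -T -n';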
It doesn't really create a "SQL script", but it does the job:
select the database in the object explorer
right click
select import/export data
follow the wizard
at the end of the process you can save the "integration service package" to reuse it
later you can modify the details by opening the .dtsx
(it will take care of security, and won't cost one more penny; it seems we have to compete with the other answers :) )
Hope it helps.