How to store emails in a SQL database and query them from an ASP.NET application - sql

I'm creating a backup monitoring application that I am going to write in Visual Studio using ASP.NET.
The way I would like this to work is that backup emails are sent from the server running the backup monitoring solution and stored in a SQL database (SQL Server Express, MySQL, or whatever would be best for something like this). I then plan to query this data to show statistics such as which servers have backed up successfully for each customer and which have failed, as well as which servers back up successfully most often and error trends such as insufficient disk space.
Would this be possible, and if so, could someone point me in the right direction: should I start by getting the information into a database, and how would I achieve that, or should I start by building the application itself (login, dashboard, etc.)? I don't have a strong programming background; we covered some Visual Basic and ASP.NET at university, and I was hoping to learn a lot from this project.
I was thinking of setting up a test environment with a server running a backup product and deliberately making backups fail for testing; I have access to the hardware/software resources through working for an IT consultancy / support company.
Thanks in advance!

After some searching I came across a website with a great piece of code that reads emails from an inbox (such as Gmail) and stores them in a SQL database.
protected void SaveEmails(object sender, EventArgs e)
{
    for (int i = 0; i < this.Emails.Count; i++)
    {
        string constr = ConfigurationManager.ConnectionStrings["ConString2"].ConnectionString;
        string sqlStatement = "INSERT INTO [Emails] ([From], [Subject], [Body], [Date]) VALUES (@From, @Subject, @Body, @Date)";
        using (SqlConnection con = new SqlConnection(constr))
        {
            using (SqlCommand cmd = new SqlCommand(sqlStatement, con))
            {
                con.Open();
                // Parameterised values avoid SQL injection and quoting issues
                cmd.Parameters.AddWithValue("@From", this.Emails[i].From);
                cmd.Parameters.AddWithValue("@Subject", this.Emails[i].Subject);
                cmd.Parameters.AddWithValue("@Body", this.Emails[i].Body);
                cmd.Parameters.AddWithValue("@Date", this.Emails[i].DateSent);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
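The snippet above assumes this.Emails has already been populated from the mailbox; that fetching side isn't shown. As one possibility (an assumption on my part, not necessarily what the linked article used), here is a minimal sketch using the open-source MailKit library to pull messages over IMAP; the Email class and the Gmail credentials are hypothetical placeholders:

// Sketch only: fetch messages over IMAP with MailKit (assumed library).
// The Email class is a hypothetical stand-in for whatever this.Emails
// holds (From, Subject, Body, DateSent).
using System.Collections.Generic;
using MailKit;
using MailKit.Net.Imap;

List<Email> FetchEmails()
{
    var emails = new List<Email>();
    using (var client = new ImapClient())
    {
        client.Connect("imap.gmail.com", 993, true); // host, port, use SSL
        client.Authenticate("user@gmail.com", "app-password");
        client.Inbox.Open(FolderAccess.ReadOnly);
        for (int i = 0; i < client.Inbox.Count; i++)
        {
            var message = client.Inbox.GetMessage(i);
            emails.Add(new Email
            {
                From = message.From.ToString(),
                Subject = message.Subject,
                Body = message.TextBody,
                DateSent = message.Date.DateTime
            });
        }
        client.Disconnect(true);
    }
    return emails;
}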

For the architecture, you could try something like this:
External systems: write the backup emails to the SQL database.
Database: only needs one table, with columns along these lines (to give you an idea):
Receiver
Sender
IsSent
BodyMessage
SentTime
Sender: a sender application/service that queries the database at regular intervals for unsent emails and sends them; a minimal sketch follows below.
If you're wondering how you technically query and insert into a SQL database, there are plenty of resources on that.
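For illustration, here is a minimal sketch of that sender, assuming the single-table design above plus an Id primary key column (my addition), the ConString2 connection string from the earlier snippet, and a placeholder SMTP host. It is a sketch of the polling idea, not a production implementation:

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Threading;

class EmailSenderService
{
    static void Main()
    {
        string constr = ConfigurationManager.ConnectionStrings["ConString2"].ConnectionString;
        while (true)
        {
            using (var con = new SqlConnection(constr))
            using (var select = new SqlCommand(
                "SELECT Id, Receiver, Sender, BodyMessage FROM [Emails] WHERE IsSent = 0", con))
            {
                con.Open();
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Send the email...
                        var message = new MailMessage(
                            reader.GetString(2),  // Sender (from)
                            reader.GetString(1),  // Receiver (to)
                            "Backup notification",
                            reader.GetString(3)); // BodyMessage
                        new SmtpClient("smtp.example.com").Send(message);

                        // ...then flag the row as sent, on a second connection
                        // because the open reader still occupies this one.
                        using (var con2 = new SqlConnection(constr))
                        using (var update = new SqlCommand(
                            "UPDATE [Emails] SET IsSent = 1 WHERE Id = @Id", con2))
                        {
                            update.Parameters.AddWithValue("@Id", reader.GetInt32(0));
                            con2.Open();
                            update.ExecuteNonQuery();
                        }
                    }
                }
            }
            Thread.Sleep(TimeSpan.FromMinutes(5)); // poll interval
        }
    }
}

If more than one sender instance can run at once, you would want a transaction around the select-and-update so two instances don't send the same email twice.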

Related

Cannot restore database as file is being used by another process

I have developed this application for a store owner. I want to allow the owner to back up and restore the database from within the application. The backup runs fine, but the restore causes an exception which says: Operating system error 32 (the process cannot access the file because it is being used by another process). RESTORE DATABASE is terminated abnormally.
using (var conn = new SqlConnection(ConnectionString))
{
    using (SqlCommand cmd = conn.CreateCommand())
    {
        string datadirectory = Path.Combine(Environment.CurrentDirectory, @"Data");
        string query = @"RESTORE DATABASE ""{0}"" FROM DISK = '{1}' WITH REPLACE";
        query = String.Format(query, backupfile, datadirectory + "\\Database.mdf");
        conn.Open();
        SqlCommand command = new SqlCommand(query, conn);
        command.ExecuteNonQuery();
    }
}
How can I solve this issue? Thanks in advance.
You have to dispose of every SqlConnection, SqlCommand and SqlDataReader once you are done using it. The second command that you create isn't correctly being disposed of.
That aside, your code sample doesn't really make sense: you create a command (cmd) that is never used, then you create a second command (command) that isn't properly disposed of.
Is it the restore-file that is blocking? Or is the database itself still running?
If it is the database that is being used, you can put the database in single-user mode. Another option is taking the database temporarily offline and bringing it online again; that should close all existing connections. Tip: in SSMS you can turn almost every action into a SQL script, for example the button to bring a database offline. Click 'Script' and you get something like 'USE master GO ALTER DATABASE [AdventureWorks] SET OFFLINE GO'.
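As a minimal sketch of that idea in C#, connecting to master so the restore isn't blocked by your own connection; the database name and backup path are hypothetical placeholders:

// Force the database into single-user mode, restore, then return it to
// multi-user. [StoreDb] and the .bak path are placeholders, not the
// asker's actual names.
using (var conn = new SqlConnection("Server=.;Database=master;Integrated Security=true"))
{
    conn.Open();
    var sql =
        @"ALTER DATABASE [StoreDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
          RESTORE DATABASE [StoreDb] FROM DISK = 'C:\Backups\StoreDb.bak' WITH REPLACE;
          ALTER DATABASE [StoreDb] SET MULTI_USER;";
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.ExecuteNonQuery();
    }
}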
Wait, this is good news. It looks like the application is running and the DB is online and live, so why do you want to restore? Backups are something you do daily/hourly, but restores you ONLY do if something goes wrong. Of course you got an error: the DB is live and the SQL service is using the files, and it's good it didn't let you restore, or else you would have lost a lot of data.
If all you want is to test the restore, then you need to shut down the SQL service first. BUT make sure you take a backup just before that, so you restore the latest.

Automatically Sync SQL Databases across two computers

I am using a cloud backup/sync service (SpiderOak) which automatically Syncs folders across several computers / devices.
I am trying to figure out a way to automatically sync all my databases across my work computer and personal laptop, without actually needing to backup/restore from one instance to the other.
So what I am thinking of is to create a new SQL instance on my laptop, identical to my work desktop instance, and then have the SQL Server directories under Program Files (the whole root SQL Server folders) sync with each other using SpiderOak.
Will this be enough for my two instances to sync with each other? Meaning, if I create a new database on my computer at work, will I see this database on my laptop when I open SQL Server Management Studio?
I am almost sure that databases which already exist will sync with each other (since the root folders contain the mdf and ldf files, but correct me if I am wrong); however, I am not sure whether a new database will be created if it doesn't already exist on one of the machines.
Are there any other folders I need to sync besides the ones I already specified?
You could use the Sql Sync Framework; you can download it here.
Here is some more reading material.
It works with Sql Server 2005.
Download the framework, add the references, and include these usings alongside the default ones:
using System.Data.Sql;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;
using System.Diagnostics;
using System.Reflection;
using System.Net;
Then the actual code:
private void SyncTables()
{
    SqlConnection ConStringOnline = new SqlConnection("connstring");
    SqlConnection ConStringOffline = new SqlConnection("connString");

    SyncOrchestrator sync = new SyncOrchestrator();
    sync.Direction = SyncDirectionOrder.Download; // or DownloadAndUpload

    // The 'scope1' is important, read more about it in the articles
    var provider1 = new SqlSyncProvider("scope1", ConStringOnline);
    var provider2 = new SqlSyncProvider("scope1", ConStringOffline);

    PrepareServerForProvisioning(provider1);
    PrepareClientForProvisioning(provider2, ConStringOnline);

    sync.LocalProvider = provider2;
    sync.RemoteProvider = provider1;
    sync.Synchronize();
}

private static void PrepareServerForProvisioning(SqlSyncProvider provider)
{
    SqlConnection connection = (SqlConnection)provider.Connection;
    SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning(connection);
    if (!config.ScopeExists(provider.ScopeName))
    {
        DbSyncScopeDescription scopeDesc = new DbSyncScopeDescription(provider.ScopeName);
        scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("TABLENAME", connection));
        config.PopulateFromScopeDescription(scopeDesc);
        config.SetCreateTableDefault(DbSyncCreationOption.CreateOrUseExisting);
        config.Apply();
    }
}

private static void PrepareClientForProvisioning(SqlSyncProvider provider, SqlConnection sourceConnection)
{
    SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning((SqlConnection)provider.Connection);
    if (!config.ScopeExists(provider.ScopeName))
    {
        DbSyncScopeDescription scopeDesc = SqlSyncDescriptionBuilder.GetDescriptionForScope(provider.ScopeName, sourceConnection);
        config.PopulateFromScopeDescription(scopeDesc);
        config.Apply();
    }
}
The downside of using Sync Framework: it is a pain in the a** to add the prerequisites to your application before publishing. No problem if you just use the application yourself or within your company, but publishing it online is a bit harder. I already had a topic about that.
However, with tools like Inno Setup you can install the prerequisites while installing the application. Here is how.
Now for the ScopeName: it is important that you don't use the same name twice, I believe. I had multiple tables, so I just named the scopes scope1, scope2, scope3, scope4. Apparently Sync Framework does the rest of the work for you. It also automatically adds _tracking tables to your database; this is just metadata used to synchronize properly.

Updating Data Source Login Credentials for SSRS Report Server Tables

I have added a lot of reports with an invalid data source login to an SSRS report server, and I want to update the user name and password with a script so I don't have to update each report individually.
However, from what I can tell the fields are stored as Image columns and are encrypted. I can't find anything about how they are encrypted or how to update them. It appears that the user name and password are stored in the dbo.DataSource table. Any ideas? I want the script to run in SQL.
I would be very, very, VERY leery of hacking the Reporting Services tables. It may be that someone out there can offer a reliable way to do what you suggest, but it strikes me as a good way to clobber your entire installation.
My suggestion would be that you make use of the Reporting Services APIs and write a tiny app to do this for you. The APIs are very full-featured -- pretty much anything you can do from the Report Manager website, you can do with the APIs -- and fairly simple to use.
The following code does NOT do exactly what you want -- it points the reports to a shared data source -- but it should show you the basics of what you'd need to do.
public void ReassignDataSources()
{
    using (ReportingService2005 client = new ReportingService2005())
    {
        var reports = client.ListChildren(FolderName, true).Where(ci => ci.Type == ItemTypeEnum.Report);
        foreach (var report in reports)
        {
            SetServerDataSource(client, report.Path);
        }
    }
}

private void SetServerDataSource(ReportingService2005 client, string reportPath)
{
    var itemSources = client.GetItemDataSources(reportPath);
    if (itemSources.Any())
        client.SetItemDataSources(
            reportPath,
            new DataSource[] {
                new DataSource() {
                    Item = CreateServerDataSourceReference(),
                    Name = itemSources.First().Name
                }
            });
}

private DataSourceDefinitionOrReference CreateServerDataSourceReference()
{
    return new DataSourceReference() { Reference = _DataSourcePath };
}
I doubt this answers your question directly, but I hope it can offer some assistance.
MSDN Specifying Credentials
MSDN also suggests using shared data sources for this very reason: See MSDN on shared data sources
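If you do go the API route for the credentials themselves, something along these lines should work. This is a minimal sketch using the same ReportingService2005 proxy as the code above; the data source path, user name and password are hypothetical placeholders:

// Sketch: update the stored credentials of a shared data source through the
// ReportingService2005 SOAP API rather than touching the database directly.
// The path and credentials below are placeholders.
private void UpdateSharedDataSourceCredentials(ReportingService2005 client)
{
    DataSourceDefinition definition = client.GetDataSourceContents("/Data Sources/MyDataSource");
    definition.CredentialRetrieval = CredentialRetrievalEnum.Store;
    definition.UserName = "myUser";
    definition.Password = "myPassword";
    client.SetDataSourceContents("/Data Sources/MyDataSource", definition);
}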

Create an automatic SQL script that runs every hour - in C# or any other easy way

I have a simple SQL script:
Select * from student where score > 60
What I am trying to do is run the above script every hour and get notified on my computer, in any way possible, that the condition was met. Basically, I don't want to have to go in there and hit F5 on the statement every hour to see if I get any results. I am hoping someone out here has something exactly for this; if you do, please share the code.
You can use SQL Server Agent to create a job; SQL Server 2008 also has mail functionality.
Open SQL Server Management Studio and connect to your SQL Server.
Expand the SQL Server Agent node (if you don't see it, use SQL Server Configuration Manager or check services and ensure that SQL Server Agent (SQLINSTANCENAME) is started).
Right click on Jobs and choose 'New Job'.
You can run a SQL statement in a job. I'll let you figure out the rest of that part (it's pretty intuitive); a scripted version is sketched below.
You may want to send your mail using xp_sendmail.
Check out the SQL documentation for xp_sendmail:
http://msdn.microsoft.com/en-us/library/ms189505(v=sql.105).aspx
You might need to turn the feature on (AFAIK it's off by default), and you need a server/machine to deliver the mail (so you might need IIS and SMTP installed if on a local machine).
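For reference, a minimal sketch of creating such a job programmatically from C# against msdb. The msdb stored procedures (sp_add_job, sp_add_jobstep, sp_add_jobschedule, sp_add_jobserver) do the same thing as the New Job dialog; the job name and schedule values here are illustrative assumptions:

// Sketch: script the hourly Agent job instead of clicking through SSMS.
// Job name, step command and connection string are placeholders.
string createJob = @"
    EXEC msdb.dbo.sp_add_job @job_name = N'HourlyStudentCheck';
    EXEC msdb.dbo.sp_add_jobstep @job_name = N'HourlyStudentCheck',
        @step_name = N'Check scores',
        @subsystem = N'TSQL',
        @command = N'SELECT * FROM student WHERE score > 60';
    EXEC msdb.dbo.sp_add_jobschedule @job_name = N'HourlyStudentCheck',
        @name = N'Every hour',
        @freq_type = 4,              -- daily
        @freq_interval = 1,
        @freq_subday_type = 8,       -- repeat every N hours
        @freq_subday_interval = 1;   -- N = 1
    EXEC msdb.dbo.sp_add_jobserver @job_name = N'HourlyStudentCheck';";

using (var conn = new SqlConnection("Server=.;Database=msdb;Integrated Security=true"))
using (var cmd = new SqlCommand(createJob, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}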
Edit:
Assuming you can't access the server and want to do this on the client side, you can create a .NET Framework app or Windows service to do the work for you, using either a schedule or a timer approach:
Schedule approach:
Create a simple command-line application which runs the query and mails the results, and use the Windows Task Scheduler to invoke it every hour (or whatever your interval may be).
Timer approach:
Create a simple application or Windows service that runs a timer thread which does the work every x minutes; a minimal sketch follows.
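For illustration, a rough sketch of the timer approach, assuming a DoWork method that would hold the query-and-mail logic shown in the console app below:

// Sketch: System.Threading.Timer fires DoWork on a thread-pool thread at a
// fixed interval. DoWork is a hypothetical stand-in for the query-and-mail
// logic in the console example below.
using System;
using System.Threading;

class TimerRunner
{
    static void Main()
    {
        using (var timer = new Timer(_ => DoWork(), null,
            TimeSpan.Zero, TimeSpan.FromHours(1)))
        {
            Console.WriteLine("Running; press Enter to quit.");
            Console.ReadLine();
        }
    }

    static void DoWork()
    {
        // Run the query and send the mail here (see the console app below).
    }
}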
I'd probably just go for the former (the schedule approach). The code would be quite simple; a new console app:
// Requires: using System.Collections.Generic; using System.Data.SqlClient;
// using System.Net; using System.Net.Mail;
static void Main(string[] args)
{
    // No arguments needed so just do the work
    var results = new List<myClass>();
    using (SqlConnection conn = new SqlConnection("ConnectionString"))
    {
        using (SqlCommand cmd = new SqlCommand("sql query text", conn))
        {
            conn.Open();
            using (var dr = cmd.ExecuteReader())
            {
                // Read the rows
                while (dr.Read())
                {
                    var someValue = dr.GetString(dr.GetOrdinal("ColumnName"));
                    // etc
                    // stuff these values into myClass and add to the list
                    results.Add(new myClass(someValue));
                }
            }
        }
    }
    if (results.Count > 0) // Send mail
    {
        // Send the message.
        SmtpClient client = new SmtpClient("smtp.server.address");
        // Add credentials if the SMTP server requires them.
        client.Credentials = CredentialCache.DefaultNetworkCredentials;
        MailMessage message = new MailMessage(
            "sender@test.com",    // from
            "recipient@test.com", // to
            "Subject",
            "Body");
        // Obviously you'd have to read the rows from your list, maybe override ToString() on
        // myClass and call that using a StringBuilder to build the email body and append the rows.
        // This may throw exceptions - maybe some error handling (in any of this code) is advisable.
        client.Send(message);
    }
}
Disclaimer: probably none of this will compile :D
Edit 2: I'd go this way as it's much easier to debug than a Windows service, since you can just run it from the command line. You can also pass command-line arguments so you don't need an application configuration file.

RavenDB, RavenHQ and Appharbor - document size error with very first document

I have a completely empty RavenHQ database that's linked to my AppHarbor application. The amount of space the database is currently using is 1.1 MB out of the 25 MB available on my bronze account. The database previously had records in it, but I have deleted them using "delete collection" in the management studio.
The very first time I call session.Store(myobject), and BEFORE I call .SaveChanges(), I get the following error.
System.InvalidOperationException: Url: "/docs/Raven/Hilo/AccItems"
Raven.Database.Exceptions.OperationVetoedException: PUT vetoed by Raven.Bundles.Quotas.Triggers.DatabaseSizeQoutaForDocumetsPutTrigger because: Database size is 45,347 KB, which is over the allowed quota of 25,600 KB. No more documents are allowed in.
Now, the document is definitely not that big, so I don't know what this error can mean, especially as I don't think I've even hit the database at that point, since I haven't flushed the session by calling SaveChanges(). Any ideas? Here's the code itself.
XDocument doc = XDocument.Parse(rawXml);
var accItems = ExtractItemsFromFeed(doc);

using (IDocumentSession session = _store.OpenSession())
{
    var dbItems = session.Query<AccItem>().ToList();
    foreach (var item in accItems)
    {
        var existingRecord = dbItems.SingleOrDefault(x => x.SourceId == item.SourceId);
        if (existingRecord == null)
        {
            session.Store(item);
            _logger.Info("Saved new item {0}.", item.ShortName);
        }
        else
        {
            existingRecord.ShortName = item.ShortName;
            _logger.Info("Updated item {0}.", item.ShortName);
        }
        session.SaveChanges();
    }
}
Any other comments about the style of this code would be most welcome, as I was unsure of the best way to approach the "update existing item or create if it isn't there" scenario.
The answer here was as follows.
RavenHQ support found that the database was indeed oversized, but it seemed that the size reported in the AppHarbor-branded RavenHQ control panel was incorrect. I had filled the database way over the limit with a previous, faulty version of the code posted above, so the error message I received was actually correct.
Fixing this problem without paying to upgrade the database wasn't straightforward, as it's not possible to shrink the database. Since I also wasn't able to delete my single AppHarbor/RavenHQ database or create another one, that left me with the choice of creating an entirely new AppHarbor application or registering directly with RavenHQ for a new account. I chose the latter. The RavenHQ-branded control panel is slightly different from the AppHarbor one, in that it can create and delete databases.
So to summarize: there doesn't seem to be any benefit to using RavenHQ as an add-on to AppHarbor; you might as well go and get a proper free RavenHQ account directly.