I have read many articles here, but haven't quite found the solution. I have a SQL Server Express database that is used by my VB.NET application. I have packaged and deployed the application via an MSI file and everything works great except I cannot figure out how to include my database file with the package. I understand there are three general ways to do this (copy the files over manually, custom actions, and SQL scripts). I didn't need anything fancy here, just a quick way to put the DB on the client machine so my app can access it.
I decided copying the DB over manually was the quickest option. I tried putting it in the working directory and in the \DATA directory of the client's SQL Server Express install, but my app wouldn't connect. I also tried changing my connection in the project to .\SQLEXPRESS instead of [my_computer_name]\SQLEXPRESS, followed by a rebuild of the deployment project and a reinstall on the client machine, but no soup for me. Same issue. I tried changing the "UserInstance" property in the project to "True", but my project would not let me save that change.
Am I correct that a manual copy is the quickest and easiest way to get this done?
You should attach your file to the SQL Server instance.
CREATE DATABASE YourDatabaseName
ON (FILENAME = 'C:\your\data\directory\your_file.mdf'),
(FILENAME = 'C:\your\data\directory\your_file_Log.ldf')
FOR ATTACH;
You need to attach your database file to the running SQL Server on the client machine.
This can easily be done with a variation on the connection string stored in your configuration file (app.config or web.config):
Server=.\SQLExpress;AttachDbFilename=where_you_have_stored_the_mdf_file;
Database=dbname; Trusted_Connection=Yes;
Alternatively, you could use the |DataDirectory| substitution string.
This shortcut eliminates the need to hard-code the full path.
Using DataDirectory, you can have the following connection string:
Server=.\SQLExpress;AttachDbFilename=|DataDirectory|\yourfile.mdf;
Database=dbname; Trusted_Connection=Yes;
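Since the original question also mentions the "UserInstance" setting, one more hedged variation you could try (assuming a SQL Server Express edition that still supports user instances) adds the User Instance keyword so Express spins up a per-user copy of the attached file:
Server=.\SQLExpress;AttachDbFilename=|DataDirectory|\yourfile.mdf;
Database=dbname; Trusted_Connection=Yes; User Instance=True;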
I have a Windows Forms application that requires users to log in to access the information. I have created a local compact database file for the credentials to be stored in. I added the database file to the folder, but when I open my application and try to log in, it tells me that it cannot find the database file.
Should the file be stored in a different folder, or do I need to install an instance of SQL Server on the user's computer?
This is my first deployment, so I am not sure how to go about it. I have done some research on the subject, but it does not seem related to my issue. The help section of InstallShield was not clear either.
I am looking for some resources on how to accomplish this.
I figured out the issue: in order for it to work, all files, including the database files, need to be placed under the user profile folder.
I've successfully created a site using Umbraco and now it's time to upload it to the hosting server.
I've searched and found one paid product for this, but I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If yes, then please help me with doing that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, and then I go through this process for deploying:
FTP over to the remote website and copy the whole website (I actually use Beyond Compare).
Connect to my local database with management studio and create a .bak file.
Upload the .bak file to the database server.
Restore that database (see the T-SQL sketch after this list).
Review connection strings in web.config.
Then I'm pretty much done.
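For the backup/restore steps, a minimal T-SQL sketch (the database name, paths, and logical file names are placeholders; run RESTORE FILELISTONLY against the .bak to see the real logical names) would be:
-- On the local machine: create the .bak file
BACKUP DATABASE UmbracoDb
TO DISK = 'C:\backups\UmbracoDb.bak'
WITH INIT;
-- On the hosting server: restore it, relocating the data and log files
RESTORE DATABASE UmbracoDb
FROM DISK = 'D:\uploads\UmbracoDb.bak'
WITH MOVE 'UmbracoDb' TO 'D:\sqldata\UmbracoDb.mdf',
     MOVE 'UmbracoDb_log' TO 'D:\sqldata\UmbracoDb_log.ldf',
     REPLACE;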
Once I'm "live" and have content I don't want to lose, when I want to work on the website, I bring back the live database through a .bak file, then I make my changes. They often include DB changes since the schema is basically in the database. I note all the operations I do. Once my changes are ready I manually replicate the changes on the live site as I update the files.
This is very painful, but I tried solutions like Courier and other things like that, and I find they are not reliable enough for production. Doing it manually is the only risk-free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your webserver, modify the web.config to point to the correct database and your website should be up-and-running.
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure websites with SQL; not sure what hosting/db you're using:
1) Create an empty db on azure with the same login and user as my local db.
2) Create an empty site on azure connected to my db.
3) Download the publishing profile.
4) Upload the db the first time with Sql Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and db with WebMatrix.
WebMatrix uses WebDeploy or FTP; you can also use WebDeploy through IIS directly, or plain FTP, if you like.
I'm trying to go through multiple .trc files to find out who has been logging into SQL Server over the last few months. I didn't set up the trace, but what I've got are a bunch of .trc files,
ex:
C:\SQLAuditFile2012322132923.trc,
C:\SQLAuditFile201232131931.trc
etc.
I can load these files into SQL Profiler and look at them individually, but I was hoping for a way to load them all up, so that I can quickly scan them for logins. Either using a filter, or better yet, load them into a SQL Server table and query them.
I tried loading the files into a table using:
use <databasename>
GO
SELECT * INTO trc_table
FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', 10);
GO
But when I do this, I get the error message:
File 'C:\SQLAuditFile2012322132923.trc' either does not exist or is not a recognizable trace file. Or there was an error opening the file.
However, I know the file exists, and I have the correct name. Also they appear to be recognizable because I can load them up into SQL Profiler and view them fine.
Anybody have an idea why I'm getting this error message, and if this won't work, perhaps another way of analyzing these multiple .trc files more easily?
Thanks!
You may be having permissions issues on the root of C:. Try placing the file into a subfolder, e.g. c:\tracefiles\, and ensuring that the SQL Server account has at least explicit read permissions on that folder.
Also try starting simpler, e.g.
SELECT * FROM ::fn_trace_gettable('C:\SQLAuditFile2012322132923.trc', default);
Anyway unless you were explicitly capturing successful login events, I don't know that these trace files are going to contain the information you're looking for... this isn't something SQL Server tracks by default.
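If the trace did capture login auditing, a minimal sketch for pulling those rows out (assuming the subfolder suggested above and the standard Audit Login event class, which is 14; since your files look like separately named traces rather than numbered rollover files, you would run this once per file or UNION ALL the results) would be:
SELECT t.StartTime, t.LoginName, t.HostName, t.ApplicationName
FROM fn_trace_gettable('C:\tracefiles\SQLAuditFile2012322132923.trc', default) AS t
WHERE t.EventClass = 14  -- 14 = Audit Login
ORDER BY t.StartTime;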
I had pretty much the same issue and thought I'd copy my solution from
Database Administrators.
I ran an SQL trace on a remote server and transferred the trace files to a
local directory on my workstation so that I could load the data into a table
on my local SQL Server instance for running queries against.
At first I thought the error might be related to permissions, but I ruled this
out since I had no problem loading the .trc files directly into SQL Profiler
or as a file into SSMS.
After trying a few other ideas, I thought about it a bit more and realised
that it was due to permissions after all: the query was being run by the SQL
Server process (sqlservr.exe) as the user NT AUTHORITY\NETWORK SERVICE,
not my own Windows account.
The solution was to grant Read and Execute permissions to NETWORK
SERVICE on the directory that the trace files were stored in and the trace
files themselves.
You can do this by right-clicking on the directory, go to the Security
tab, add NETWORK SERVICE as a user and then select Read & Execute for
its Permissions (this should automatically also select Read and
List folder contents). These file permissions (ACLs) should automatically
propagate to the directory contents.
If you prefer to use the command line, you can grant the necessary permissions to
the directory – and its contents – by running the following:
icacls C:\Users\anthony\Documents\SQL_traces /t /grant "Network Service:(RX)"
I would like to make a complete backup of my whole Joomla 1.5 based site from time to time. How would this ideally be done? Are there any common pitfalls? Note that I only have FTP access to the hosting server. Is there a step-by-step tutorial somewhere? I am using the latest JoomGallery and Kunena 1.0.9 (Legacy mode).
Maybe there is a good way to automate this?
There are two parts of the backup you have to worry about: the database and the files.
The first part is the database. It can be backed up using something like phpMyAdmin. If you don't have this available on your server already, it's not too hard to upload and get it going yourself. From there, you can just Export the entire database to a gzip file.
The second part is the code and uploaded files. The code base shouldn't change too often, so you could probably just make one backup of this. There are a number of ways. The simplest is to just download the entire folder via FTP, though if you're on Linux, I'm sure someone will know a single command line to get all the changed files (rsync?).
The database is the main thing you have to worry about though: everything else should be able to be rebuilt just by reinstalling.
I think this: http://www.joomlapack.net/ is what you need. I use it myself and it works like a charm. Both for backups and for moving my Joomla installations from developer sites and to the real site.
Get an FTP synchronisation tool and keep an up-to-date copy of your site locally. Then you could run the batch script
mysqldump -hhost -uuser -p%1 schema > C:\backup.sql
to create a backup of your mysql tables at various points in time.
Edit:
You would have to have MySQL Server installed on your local machine and the path to its bin directory in your PATH in order to run the mysqldump command without much hassle. -p%1 takes the password provided on the command line, as you wouldn't want to store passwords in your batch script.
If you only have FTP access you have a bit of a problem, as besides all the files you'll also have to back up the database. Without access to the database, a files-only backup won't do you any good.
Whatever backup strategy you choose, be sure it can handle UTF-8 correctly. Joomla 1.5 stores all content as UTF-8, even when the database charset is set to 'iso-8859-1' - so if the backup solution detects the database charset, characters like € or é will come out as mojibake such as 'â‚¬' or 'Ã©' - not really what you'll want.
I absolutely endorse using JoomlaPack - it works great. The optional remote tools allow you to initiate the backup from a Windows desktop machine - it performs the backup and downloads it. The remote tool has a scheduler, and you can also set it off to back up and download a list of sites.
Joomlapack also provides a file "kickstart.php" which you copy to your empty server account along with the backup, which automates the restore procedure. You do have to create an empty database with PHPMyAdmin or similar, and you are given the opportunity to supply the database parameters (host, database, username, password) during the process.
One pitfall I did run into with this though is that some common components can have absolute URLs in their configuration - e.g. SOBI2, Virtuemart. It's then just a matter of finding the appropriate configuration file, editing it and re-uploading it.
Another problem was that one archive file (either ZIP or their JPA format) had a filename with a "?" character in it (from a Linux server), and this caused a bit of a problem when trying to install it locally on a Windows WAMP stack - the extract process on the ZIP file failed, and it stopped the process from completing cleanly.
I suggest using the automatic backup service at http://www.everlive.net
Update:
Ok, here is some more information. EverLive.net is a website where you can create a free account. Enter your website details and you are ready to take your backups with just one click. Restore is also possible in the same way.
Further, you can use the automatic backup option to take backups at defined intervals. Other than that, you can use the website health check service to inform you if your website is not available.
I work on quite a few DotNetNuke sites, and occasionally (I haven't figured out the common factor yet), when I use the Database Publishing Wizard from Microsoft to create scripts for the site I've created on my dev server, after running the scripts at the host (usually GoDaddy.com) and uploading the site files, I get an error... I'm 99.9% sure that it's not file related, so I'm not sure where to begin in the DB. Unfortunately with DotNetNuke you don't get the YSOD, but a generic error, with no real way to find the actual exception that has occurred.
I'm just curious if anyone has had similar deployment issues using the Database Publishing Wizard, and if so, how they overcame them. I own the Red Gate toolset, but some hosts like GoDaddy don't allow you to connect directly to their servers...
The Database Publishing Wizard's generated scripts usually need to be tweaked, since it sometimes gets the order of table/procedure creation wrong when dealing with constraints. What I do is first back up the database, then run the script, and if I get an error, I move that query to the end of the script. I keep restoring the database and running the script until it works.
There are two areas that I would look at:
Are you running in the dbo schema, and was your scripted database using dbo?
Are you using an objectqualifier in either your dev or your production environment? (Look at your SqlDataProvider configuration settings.)
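If you want to sanity-check both of these directly against the database, here is a rough T-SQL sketch (the 'dnn_' prefix is only a hypothetical example of an objectqualifier; substitute whatever your installation uses):
-- List user tables and stored procedures with their schema, so you can spot
-- a non-dbo schema or an objectqualifier prefix such as 'dnn_'
SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
FROM sys.objects AS o
JOIN sys.schemas AS s ON o.schema_id = s.schema_id
WHERE o.type IN ('U', 'P')
ORDER BY s.name, o.name;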
You should be able to expose the underlying error message by setting the following in the web.config:
<customErrors mode="Off" />
Could you elaborate on "and uploading the site files"? Is this a new instance of DNN, an update to an existing site, or an upgrade of the DNN version? If it's an upgrade or update, what files are you adding/overwriting?
Also, when using GoDaddy, can you verify that the website's identity (Network Service or the ASP.NET machine account, depending on your IIS version) has sufficient permissions to the website's file system? It should have Modify permissions, and these may need to be reapplied if you are overwriting files.
IIS6 (XP, Server 2000, 2003) = ASP.Net Machine Account
IIS7 (Vista, Server 2008) = Network Service
Test your generated scripts on a new local database (using the free SQL Express product or the full meal deal). If it runs fine locally, then you can be confident that it will run elsewhere, all things being equal.
If it bombs when you run it locally, use the process of elimination and work your way through the script execution to find the offending code.
My hunch is that the order of scripts could be off. I think I've had that happen before with the database publishing wizard.
Just read your follow up. In every case that I've had your problem, it was always something to do with the connection string in web.config. Even after hours of staring at it, it was always a connection string issue in web.config. Get up, take a walk and then come back.
If you are getting one of DNN's error pages, there is a chance it may have logged the error to the eventlog table.
Depending on exactly what is happening and what DNN is showing you, you might be able to manually look inside the EventLog table, pull out the XML data stored there, and parse it to find the stack trace and detailed information regarding the specific error at hand.
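A rough sketch of that query (assuming the default EventLog table name with no objectqualifier, and the LogCreateDate, LogTypeKey, and LogProperties columns of a standard DNN install; adjust names to match your schema) might be:
-- Most recent exception entries; LogProperties holds the serialized XML
-- containing the stack trace and error details
SELECT TOP 20 LogCreateDate, LogTypeKey, LogProperties
FROM dbo.EventLog
WHERE LogTypeKey LIKE '%EXCEPTION%'
ORDER BY LogCreateDate DESC;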
I have found, however, that I get MUCH better overall experiences with deployments using backups and restores of my database; that way I am 100% sure that all objects moved correctly, and honestly it works better in my experience.
With GoDaddy, I know another MAJOR common issue is incorrect file permissions, preventing DNN from modifying the web.config and other files that it needs to change.