I'm studying a course about database management based on the Transact-SQL language. The first problem I found was with my OS, because I'm using a Mac. I had to create a SQL Server container in Docker and create a connection using Azure Data Studio, where I have my server, but my problem (where I'm blocked) is this:
when I have to create the FILENAME for this database, I don't know which directory I'm supposed to save the .mdf and .ldf files in, because I'm running my server from a 'virtual' location that is not in my computer's file system.
How can I solve this?
I know nothing about containers; this is my first time using Docker or anything similar...
I already tried to do it in a virtual machine running Windows, but it is far too slow, so I have discarded that option for the moment. I have to use SQL Server because it's the language used in the course; I don't have any other option.
In a default Docker container, database files get created in the following directory (you can confirm this for any existing database with EXEC master..sp_helpfile;):
/var/opt/mssql/data/
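If you'd rather confirm the defaults on your own instance than trust the path above, you can also ask SQL Server directly (this works whether or not the instance is in a container):
SELECT
    SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
    SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;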
So you can say:
CREATE DATABASE floob
ON
(
NAME = N'floob_data',
FILENAME = N'/var/opt/mssql/data/floob.mdf'
)
LOG ON
(
NAME = N'floob_log',
FILENAME = N'/var/opt/mssql/data/floob.ldf'
);
This is the bare minimum set of properties you need in order to dictate the location of the files; it will take other properties (like size/growth) from model.
But unless the course materials dictate that your CREATE DATABASE command absolutely must include FILENAME parameters, you can generally leave them out when building some skunkworks stuff on your own machine, as they will be placed in the default location that you don't really need to know (until that drive fills up).
In production that's generally not the best idea, though, because we often want to customize size, growth, separate data from log, etc.
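For contrast, a production-leaning version of the same statement might spell those properties out explicitly; the sizes and growth increments below are only illustrative, and on a real server you would typically also point the log file at a different drive or volume:
CREATE DATABASE floob
ON
(
    NAME = N'floob_data',
    FILENAME = N'/var/opt/mssql/data/floob.mdf',
    SIZE = 256MB,
    FILEGROWTH = 128MB
)
LOG ON
(
    NAME = N'floob_log',
    FILENAME = N'/var/opt/mssql/data/floob.ldf',
    SIZE = 128MB,
    FILEGROWTH = 64MB
);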
Related
I am trying to find the best procedure to get data from our SQL Server at headquarters out to apps running on local machines in various locations that are not connected to our network. Our current data and application are in FoxPro, where you simply copied the data file, so I am not very familiar with using SQL databases.
The field app uses LocalDB, and users don't save anything to the database. When the app opens, it checks a web site for updates. I tried detaching our HQ .mdf and .ldf, downloading them and overwriting them on the local machine, but LocalDB would not attach to the new file (same name). I thought LocalDB closes and detaches when the application closes, but maybe I am wrong. I also wonder if I need the log file, since no changes are made and I don't need to roll back anything. I have searched for a good article on this topic but haven't found anything. This must be a fairly common scenario in many companies.
You want to look into using replication, probably snapshot replication. This allows you to distribute one or more tables, or other objects, to off-site SQL Server instances on whatever schedule is applicable. You can use HTTP to send the data.
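If you want a feel for what setting this up looks like in T-SQL, here is a heavily trimmed sketch of creating a snapshot publication on the HQ server. The database, publication and table names are placeholders, the server must already be configured as its own Distributor (the SSMS replication wizards can do that part), and you would still need a Snapshot Agent job plus a subscription for each field machine:
-- Run on the publisher (HQ) instance.
USE HQData;
GO

-- Allow this database to be published.
EXEC sp_replicationdboption
    @dbname  = N'HQData',
    @optname = N'publish',
    @value   = N'true';

-- Create a snapshot publication.
EXEC sp_addpublication
    @publication = N'HQSnapshotPub',
    @repl_freq   = N'snapshot',
    @status      = N'active';

-- Publish the table(s) the field app needs.
EXEC sp_addarticle
    @publication   = N'HQSnapshotPub',
    @article       = N'Products',
    @source_owner  = N'dbo',
    @source_object = N'Products';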
I have a PostgreSQL database on another computer (B), and I also have images saved in 'My Documents' on that computer (B). The information system software I created using Visual Studio 2012, deployed on my computer (A), gets information from the database on computer (B).
How can I access the images from computer (B), given that they are linked by ID to the records in the database?
UPDATE: I have already solved the issue by sharing the folder of images to (A) and using UNC paths in my source code. Is there any other way to make this possible? Would NetBIOS help me with this?
Share the folder from (B) to (A).
It looks like you're using Windows, so that's straightforward enough.
If the two machines aren't on the same local network you might well need to change your firewall settings to allow access. Allow just the IP address of machine (A).
Unless the images themselves are stored in the database, you're not going to be able to read them from it.
If you have some form of web server running on (B) then you could feasibly write code to pull the image you want based on the database information and send it to (A) through that.
I've not done much SQL and am still pretty new to this, so please excuse what's probably a basic question.
I've been asked to look into creating a SQL job to back up our databases, store the .baks on another machine, and then restore them to a second server. I've been doing a bit of research and playing with SSMS, and I have backed up the database to my personal machine by setting up a share and running a backup job to the share location. I'm now trying to create a new database (on the same server I backed up from) by restoring the .bak file, but giving the database a new name and so on. However, I'm unable to specify restoring it from the share like I did when backing it up; I can't find how to specify other network locations and am just browsing the server's C drive when I try to locate the file.
For now, I'm just using the built-in wizards to try and achieve this (open SSMS -> connect to the server -> right-click Databases -> Restore Database, then select From Device and browse to find the file).
This isn't the final process, just me trying to get to grips with how this works. As I said, the idea is ultimately to have a scheduled job to back up the DB from server1 to a .bak on, say, my personal machine and then to restore that to a DB on server2 (different network, different city), probably with a series of SQL commands rather than using the wizard every time (there are a few DBs that'll ultimately need backing up).
My apologies for the possibly quite drawn-out and convoluted question - essentially, all I need to know is: can I, and how can I, restore a DB in SSMS from a .bak on a different machine?
Many thanks
You could use something like the following script. It restores a database from a backup file on a network share, overwrites the existing database named "MyDB", and moves the files to new locations of your choice in the process.
RESTORE DATABASE MyDB
FROM DISK = '\\MyShare\MyBackup.bak'
WITH
    MOVE 'DataFile' TO 'D:\myNewDBLocation\DataFile.mdf',
    MOVE 'LogFile' TO 'E:\myNewDBLocation\LogFile.ldf',
    REPLACE;
You can find out the names of the logical files (in the above, those are called DataFile and LogFile) by running the following:
RESTORE FILELISTONLY
FROM DISK = '\\MyShare\MyBackup.bak'
Additional information about various options and parameters:
RESTORE (Transact-SQL)
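For the backup half of the same workflow, the scripted equivalent is along these lines; the share path is the same placeholder used above, and the SQL Server service account on the source server needs write permission on that share:
BACKUP DATABASE MyDB
TO DISK = '\\MyShare\MyBackup.bak'
WITH INIT,       -- overwrite any existing backup set in the file
     STATS = 10; -- print progress every 10 percent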
I have a live database on a shared hosting server. I am making some major changes to my site's code and I would like to fix some stupid mistakes I made in initially designing the database. These changes involve altering the size of a large number of fields, and enforcing referential integrity between tables properly. I would like to make the changes on both my local test server and the remote server if possible.
I should note that while I'm fairly comfortable with writing complex queries to handle data, I have very little experience modifying database structure without a graphical interface.
I can access the remote database in the Visual Studio database explorer, but I cannot use that for anything other than data manipulation. I installed SQL Server Management Studio Express last night and, after 40+ crashes, I gave up - I couldn't even patch the damn thing.
The remote server is SQL Server 2005, and the MyLittleAdmin web interface is available.
So my question is: what is the best way to accomplish these changes? Is there a graphical interface I can use on the remote server? If not, is there an easy way to copy the database to my local machine, fix it, and re-upload it? Finally, if none of the above are viable, does anyone have links to decent info on fixing referential integrity via queries?
Sorry for the somewhat general question - I feel like I am making this far harder than it should be, but after searching and trying all night I haven't gotten anywhere. Thanks in advance for the help. I really appreciate it.
...Also, does anyone have a time machine I can borrow - I need to go kick my past self's ass for this.
Usually hosting providers allow you to back up and restore your database, so the easiest way to accomplish the move is to back up your live DB, download the backup file, restore it locally, make all the changes, take a backup of the local DB, upload it, then restore it in the live service. Your site should be placed in an administrative shutdown during this time so it does not continue to update the data while you're doing these operations. You have to make sure your local SQL instance is at exactly the same build version (@@VERSION) as the hosting provider's, otherwise your local SQL Server may upgrade the database structure and you'll be unable to restore it back on the hosting provider (or you'll be unable to restore it on your local server if your version is earlier than the host's). The MSDN BOL has a detailed guide on how to Copy Databases using Backup and Restore.
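A quick way to compare builds on both sides (run on each server):
SELECT @@VERSION;                         -- full version and build string
SELECT SERVERPROPERTY('ProductVersion');  -- just the build number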
An alternative to backup/restore is to detach/attach the database, but I do not recommend this because you need to move both the MDF and the LDF in sync, and they're also larger in size than a backup.
This assumes you can do all the schema changes on your local copy in a wizardly manner, i.e. fast and correct. Of course, that is not easy. The recommended way is to prepare, ahead of time, a script that applies all the transformations needed to reach the new schema. There are tools like SQL Diff, SQL Compare, SQL Delta and others that can generate such a script. Visual Studio Database Edition can also do this.
How I would do this is roughly as follows:
1. Ensure I have exactly the same schema on my dev machine as on the live host. If not sure, I can take a backup of the live server and restore it locally. This would be my reference v1 schema.
2. Keep the backup of v1 for reference.
3. Start developing a script that changes the schema to my target. Sometimes I need to refresh my memory on the script syntax myself; what I do is go to the SQL Server Management Studio wizard for the operation I want, select all the options in the UI, and then use the dialog's Script option, which shows me exactly the script SSMS would run to accomplish the change.
4. For each change I add to the script, I can test it by restoring the v1 reference backup from step 1 and running the script.
5. Keep iterating on the script, adding one change at a time, until all the needed schema changes are done. After each change, I can test it again as in step 4 (see the fragment after this list for the kind of statements such a script contains).
6. The script should contain not only DDL changes to the schema, but also any DML changes needed (modifying reference data, changing values, moving columns between tables, etc.).
7. When the script is ready, I can download a newer backup, apply the script, then upload the updated backup and restore it on the live host. Alternatively, you can simply run the script on the live host (after backing it up, of course, in case something goes horribly wrong).
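To give a flavor of what such a change script might contain for the kinds of fixes described in the question (resizing columns and enforcing referential integrity), a fragment could look like the following; all table and column names here are invented for illustration:
-- Widen a column that was created too small.
ALTER TABLE dbo.Orders
    ALTER COLUMN CustomerName nvarchar(200) NOT NULL;

-- Enforce referential integrity between two existing tables.
-- WITH CHECK validates the existing rows as the constraint is added,
-- so any orphaned rows have to be cleaned up first.
ALTER TABLE dbo.Orders WITH CHECK
    ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES dbo.Customers (CustomerID);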
In my projects I always rely on scripts to deploy and upgrade the database. In fact, I use the database's extended properties to store a 'version' of my application's deployed schema, and in my code I simply roll forward all the scripts that bring the schema up to the latest version. I have an article on my blog describing this technique: Version Control and your Database.
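As a small illustration of the extended-property trick (the property name and version value are just examples):
-- Stamp the database with a schema version (database-level extended property).
EXEC sp_addextendedproperty
    @name  = N'SchemaVersion',
    @value = N'1.2';

-- Read it back, e.g. at application start-up, to decide which upgrade
-- scripts still need to run (class = 0 means a database-level property).
SELECT value AS SchemaVersion
FROM sys.extended_properties
WHERE class = 0 AND name = N'SchemaVersion';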
I have a VB project that runs on SQL Server 2005; while making the setup file for it, how do I include the DB?
You don't.
Typically you have a DB generation script that is run either as part of setup or on the first run of the application.
You also need to consider migrations (changes to the DB when new releases of your application are published).
Consider using MigratorDotNet or RikMigrations to solve these problems in a separate installer/upgrade program if you are still using VB6.
I disagree; you could include the database. Simply distribute the .MDF file with your application.
Of course, the setup application would have to know how to attach the database to an existing SQL Server RDBMS.
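For reference, attaching a shipped .MDF from a setup step boils down to something like this (the path and database name are placeholders; if the log file is missing, SQL Server can rebuild it by using FOR ATTACH_REBUILD_LOG instead):
CREATE DATABASE MyAppDb
ON (FILENAME = N'C:\Program Files\MyApp\Data\MyAppDb.mdf')
FOR ATTACH;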
Both methods given in the above answers will work; I have tried them both. However, using a DB generation script reduces the size of the final deployment files considerably. I would launch the script on the first run of the application and not in the setup itself.
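A first-run generation script usually has an idempotent shape along these lines (the database and table names are invented for the example):
-- Create the database only if it does not already exist.
IF DB_ID(N'MyAppDb') IS NULL
    CREATE DATABASE MyAppDb;
GO

USE MyAppDb;
GO

-- Create each object only if it is missing, so the script is safe to re-run.
IF OBJECT_ID(N'dbo.Customers', N'U') IS NULL
    CREATE TABLE dbo.Customers
    (
        CustomerID int IDENTITY(1,1) PRIMARY KEY,
        Name       nvarchar(100) NOT NULL
    );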
I will second jack on this one.
In my experience, installs that require an actual database file tend to have more issues, both on first install and when updating, than those that run scripts. As jack mentioned, another bonus is the reduced file size.
You can create whole database scripts by right-clicking on the required database and selecting the script database option. Note, however, that this will only create the tables and fields and will not replicate any data.