Where do I put .mdf and .ldf files to share an SQL script through git

I am attempting to share, through git, a script that builds and populates an SQL database, but the script won't create the DB on my team members' machines because the .mdf and .ldf files are located on my machine. How can I rectify this?

If you want to share a SQL script, you don't have to share the database with it!
What is generally done (best practice) is to keep the script that creates the database (and, if needed, populates it with static/test data) in git; each user then runs that script to build the database locally.
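For instance, a minimal create-and-populate script kept in the repository might look like the following (the database and table names here are made up for illustration):

-- create_db.sql: hypothetical script kept in git; each team member runs it locally
CREATE DATABASE TeamDb;
GO
USE TeamDb;
GO
CREATE TABLE Customers (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL
);
-- optional static/test data
INSERT INTO Customers (Name) VALUES (N'Example Customer');
GO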
git is there to keep track of your source code and the changes made to it; you shouldn't put generated files in it, and .mdf/.ldf files are exactly the kind of file that does not belong in your repository. For generated files inside your working folder, there are ways to configure git to ignore them.
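For example, two .gitignore entries along these lines keep generated SQL Server data and log files out of the repository (a sketch; adjust the patterns to your layout):

*.mdf
*.ldf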
The value of git is in recording differences between files; if you want to distribute your software, git is definitely not the right tool. Put those files on a shared folder (NAS), on Dropbox, on a USB key, or whatever suits you.
However, if you really want to do this (a bad idea), I guess you can add the files to your repository and either configure SQL Server to find them there or create a symbolic link.
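For instance, on Windows you could keep the files inside the repository folder and attach them from there; a sketch, with hypothetical paths:

-- attach the data and log files directly from the repository folder
CREATE DATABASE TeamDb
    ON (FILENAME = 'C:\repo\db\TeamDb.mdf'),
       (FILENAME = 'C:\repo\db\TeamDb_log.ldf')
    FOR ATTACH;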

Related

IBM Worklight - How to retrieve an already-deployed .wlapp file?

I have an already running Worklight application. We are planning to move to another production cluster tomorrow, but first I want to retrieve the old .wlapp that is already deployed on the first cluster.
How can I get it? I found a directory in the temp area of the WebSphere Application Server, but the application there is in exploded form. When I zipped one of them and converted it to .wlapp, the deployment was successful, but I got a Direct Update and the application failed to start...
Is this the right approach?
When you build your application in Worklight Studio, the build produces .wlapp files and stores them in the yourProject\bin folder. These files do not get deleted from the bin folder unless you delete them manually:
appname-envname-version.wlapp
appname-common-version.wlapp
appname-all.wlapp
where the version value depends on what you've set in yourProject\yourApp\application-descriptor.xml.
A good practice is to back up the artifacts of each version as well as the project source code (using a source control system), so you can always restore any version, especially if you are already in production.
After deployment, the .wlapp file is not stored in the filesystem, but in the database.
So, as for retrieving previously deployed .wlapp files when you do not have a backup (which is very bad, by the way): they are stored in your database, in the deployable column of the APP_SYNC_DATA table. You can try to extract the relevant APP_SYNC_DATA.deployable value and save it as a .wlapp file.
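A sketch of what that lookup might be (only the table and column names above come from the answer; how you identify the right row depends on your schema):

-- run against the Worklight runtime database
SELECT deployable
FROM APP_SYNC_DATA;
-- find the row for your app/version, then export the binary value of
-- deployable as a .wlapp file using your database tool's blob export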
But this really does not guarantee that it will "work" any better than your other attempt. That .wlapp is configured to work with certain server URLs, and if the new cluster is not a replica of the previous one, I don't see this working either. And without knowing more about the errors you get, it is hard to say what they relate to (but that is not for this question).

Backing up source files managed by source control software: TortoiseSVN

I am new to source control and I am confused by something I read on a webpage yesterday (I don't have the link). I have followed these instructions: "create folder structure", then "start the Repo-browser", then copy the source files into the trunk folder.
However, when I navigate to the folder using Windows Explorer, I do not see this folder structure.
Therefore I am wondering: where are the files physically stored? The reason I ask is that I want to ensure that NetBackup (our corporate backup tool) backs up the correct directories.
To make sense of the repository structure you would need to read the SVN documentation, but the preferred way to back up an SVN repository is with the command
svnadmin dump your_svn_repository_path > destination_filename_backup.svn
You could put this command in a scheduled task that runs some time before your corporate tool executes the full backup of your data, and include destination_filename_backup.svn in your backup job.
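A minimal batch wrapper for that scheduled task might be (the paths are placeholders):

@echo off
rem dump the repository to a file that the corporate backup job will pick up
svnadmin dump C:\svn\your_repository > D:\backups\destination_filename_backup.svn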
If you ever need to restore the backup (after recreating the repository) you could use the command
svnadmin load your_svn_repository_path < destination_filename_backup.svn

FTP Concurrency issues using Ipswitch WS-FTP Pro

I think we have a problem in our FTP scripts that pull files from a remote server to a local machine. I couldn't find an answer in the vendor's knowledge base or in the scripting documentation.
We do an MGET *.* and then an MDELETE *.* immediately after it. I think what is happening is that, while we are copying files from the server, additional files arrive in the same directory, and then the delete command removes everything from the server, so we end up deleting files we never copied down.
Is there a straightforward way to delete only the files that were copied, or is it going to be some sort of hack job where we generate a dynamic delete script based on what we actually copied down?
Product-specific answers would be much appreciated!
Here were the options I came up with and what I ended up doing.
Rename the files on the server (change their extension), copy the renamed files, and then delete them. This could not work because there is no FTP rename command that works with wildcards (the Windows rename command does, by the way).
Move the files to a subdirectory on the server, copy the files from that location, and then delete them from the remote location. This could not work because there is no FTP command to move files on the remote server.
Copy the files down in one script, then SHELL a batch file on the local side that dynamically builds a script to connect to the server and delete the files that were copied down. This is the solution I ended up using, sketched below.
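A rough sketch of that approach using the standard Windows command-line ftp client rather than WS_FTP specifically (host, credentials, and paths are placeholders):

@echo off
rem build an FTP script that deletes only the files we actually copied down
echo open ftp.example.com> del.ftp
echo user myuser mypass>> del.ftp
for %%F in (C:\incoming\*.*) do echo delete %%~nxF>> del.ftp
echo bye>> del.ftp
rem -n suppresses auto-login, -s runs the generated script
ftp -n -s:del.ftp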

Which files are used by a program?

I have written a program in Visual Basic. In the debug folder, there are many files:
Database1.mdf
Database1_log.ldf
MyData.Designer.vb
MyData.xsc
MyData.xsd
MyData.xss
WindowsApplication1.exe
WindowsApplication1.config
WindowsApplication1.pdb
WindowsApplication1.vshost
WindowsApplication1.vshost.exe
WindowsApplication1.vshost.exe.manifest
WindowsApplication1.xml
I want to publish my program. Are all of those files necessary for the program? Which of them are used for my database?
I ask because I want to put a button in my program that backs up the database. Which files must be backed up?
First of all, you should publish the Release version of your software, not the Debug version, so the files will be slightly different. As for which files to publish: if you use a Setup project, you will be able to select the files based on what your application needs. For example, it looks like you are including database files with your application (Database1.mdf and Database1_log.ldf); you could add these files to the setup project.
The setup project will know to include your exe and your config file (unless you tell it not to), so you are covered there. Here are a written walkthrough and a video on how to create a Setup project:
http://msdn.microsoft.com/en-us/library/ms241903.aspx
http://www.youtube.com/watch?v=Lcue0jo41AM
As for your PDB files, these are Program Database files, which are used for debugging (and should never be given to the customer/end user).
As for backing up your database, back up the MDF and LDF files.
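Note that if the database is attached to a running SQL Server instance, the MDF/LDF files are locked and cannot simply be copied; a hedged alternative is the standard BACKUP DATABASE command (the backup path below is a placeholder):

BACKUP DATABASE Database1
TO DISK = 'C:\Backups\Database1.bak';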
No. All of the files above come from your Debug compile output. You can change what is output by switching your build configuration: go to Build > Configuration Manager and switch to Release (it's also available on the toolbar).
In general, your ProjectName.exe (but not the .vshost.exe), your .config (but not the .vshost.exe.config), and the MDF/LDF files are needed for publishing. You also have an XSD file, which will be needed as well.
The MDF/LDF files are your database.

FTP only changed files in MSBuild

An MSBuild project copies its output to a directory on a server. Each day, only a few files change and most have an older creation date.
I can FTP this to a remote server with MSBuild tasks, but how can I make the FTP step copy only the few files that have changed?
To do this you'll need something that manages the sync for you, that is, something that keeps track of which file is where and updates accordingly.
We have used FTPSync to do the file-sync part very tidily for a number of sites.
From MSBuild you can call an external program, so putting the two together will probably work, provided you are consistently syncing from the same location (otherwise it's going to be more interesting!). A sketch follows.
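One way to wire that together is MSBuild's built-in Exec task (the ftpsync command line shown is hypothetical; check your tool's actual arguments):

<Target Name="DeployViaFtpSync" AfterTargets="Build">
  <!-- hand the job to the external sync tool; it uploads only changed files -->
  <Exec Command="ftpsync $(OutputPath) ftp://example.com/site/" />
</Target>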