Move a SQL Server database from a newer version to an older version

We have a database created on SQL Server 2017 (Express edition) on one of our servers, and we are trying to move this database to another server that has SQL Server 2014 (licensed edition) installed. We have tried restoring a backup, detaching/attaching the database files, and generating scripts and running them on the new server.
Unfortunately, we are not able to restore the database; backup/restore fails with an error message.
As for generating scripts and running them on the new server, the problem is that the script file is around 3.88 GB. We are not able to edit this file before executing it because it is too large. We also tried generating the scripts first without data and then with data, but the script with data alone still comes to about 3.88 GB (there is only a very small difference when only data is selected).
What are the options we have?

Since you probably already have the schema and data scripts, you could use VS Code with the SQL Server extension to execute them against the older SQL Server version. Sublime Text can also handle large files; see this question for a list of editors with large-file support.
This will only work if you generated the scripts targeting the correct server version (SQL Server 2014 in your case).
Be aware that not everything is scripted by default (triggers, for example), so you might want to change the other scripting options as well.
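As a sanity check on the target side, you can verify and, if needed, lower the database's compatibility level once the scripted objects and data have been loaded. A minimal sketch, assuming the database is called MyDatabase (a placeholder name); 120 is the level corresponding to SQL Server 2014:
-- Check the current compatibility level
SELECT name, compatibility_level FROM sys.databases WHERE name = 'MyDatabase';
-- Make sure the rebuilt database does not target anything newer than SQL Server 2014
ALTER DATABASE MyDatabase SET COMPATIBILITY_LEVEL = 120;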

Related

Using Liquibase to version control remote SQLite Databases

The system I have is a local development machine with the dev DB and a number of remote servers with the production database. While looking for a system to manage the versions of my SQLite database I found Liquibase, but I can't tell whether it will work for what I need, which is updating the schema of the production databases when I release a new version, applying the changes configured in Liquibase's changelog file for that version. Of course all the rest of the code is under Git, so if Liquibase only needs the changelog files I can put them in the repository, but if it needs something else it could become a problem.
Yes, it should work. If you are using Liquibase for the first time, it will run all the migrations and record what it has applied by creating separate tracking tables for itself in your database. You should still verify that the structure is the same in both local and production environments so the migrations won't cause errors.
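Liquibase's bookkeeping lives in a DATABASECHANGELOG table (plus a DATABASECHANGELOGLOCK table) that it creates on first run, so you can confirm what has already been applied to a given environment with a plain query. A minimal sketch:
-- List applied changesets in the order they were executed
SELECT id, author, filename, dateexecuted
FROM DATABASECHANGELOG
ORDER BY dateexecuted;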

Issue with SSIS executing task to convert Excel to CSV

We have a task where we need to automatically convert an Excel file to a CSV to prep it for loading into a SQL database. The developers built this process into an SSIS package. For the conversion, they initially had a task in the SSIS package execute a VBScript to convert the file. When they ran this on their local machines, it worked correctly. When they ran the package manually through VS on the server, it ran correctly. When they ran the package manually via the Integration Catalog, it ran correctly. We did this both under our own accounts and under the service account and got the same results. However, when we scheduled it as a job, it would hang on the part of the process that executed the VBScript. No errors; it would just hang until you killed the job.
The job executes as the service account, which has full admin access on the server, explicit full access to the share where the files are stored and converted (which is on the same server), and full admin access to SQL. The job owner is set to sa, which uses the service account. All the job does is execute the package from the Integration Catalog, which works if you run it independently of a job. When we compared the SSISDB execution report for the manual run in the Integration Catalog (using the service account) to the job run, they looked the same, except that the job hung on the conversion task and the other did not.
After spending some time trying to figure this out, the developers tried a different solution. They changed the conversion script from VBScript to C#. They ran the package from their local machines and once again it worked. This time, when they ran it manually on the server, it failed. When we ran it from the Integration Catalog it failed, and when we ran it from a job it failed.
The error we keep getting is "Create CSV file: Error: Exception has been thrown by the target of an invocation". After spending several hours looking into this error, none of the suggested fixes seem to work.
We also tried these same solutions on a newly built server to make sure we weren't dealing with an odd configuration setting that could have been changed (it is a dev server), and it still failed there.
At this point, we are pretty lost as to what is happening. The first solution worked but for some reason would never work as a job. The second solution works everywhere except when run on the server.
We are looking at other ways to get around this. The next thing may be trying to use PowerShell to convert the file, but we're not sure whether that will bring us back to the same issue. Any suggestions you have will be greatly appreciated.
Also, we are using SQL Server 2012 Developer Edition, VS 2012 Pro, and Windows Server 2012 R2.
This might be caused by a bug Excel has when jobs that use Excel run while no user is logged on to the machine; it can also affect the Excel library. The solution is to create the following two folders:
C:\Windows\SysWOW64\config\systemprofile\Desktop
C:\Windows\System32\config\systemprofile\Desktop
and then restart the machine. :)
(Edited to show that a restart is needed; thanks to Dustin for checking this.)

Create an oracle pl/sql package using a package on a different remote database

Is it possible to create a package or replace an existing package in a local database using a package from a different database without having to export it from the remote database?
Basically, I have two environments/servers (DEV and QA).
The developers who work on the packages use the development environment, and I would like to update the same packages in the QA environment using the package in DEV (ignore possible issues such as compilation failures for now).
Is it possible to frequently update the package in QA using the package in DEV as the source (instead of compiling from a .sql file)? Maybe via a database link?
Yes, it's possible. You could create a process on your target system that uses the DBMS_METADATA package on the remote system to fetch the DDL for the desired package spec and body, and then use dynamic SQL on the local system to compile the fetched code.
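A minimal sketch of that idea, run on the QA database, assuming a database link named dev_link pointing at DEV and a package MY_PKG owned by APP_OWNER (all placeholder names). It pulls the source lines from ALL_SOURCE over the link rather than calling DBMS_METADATA remotely, which sidesteps the restrictions on passing CLOBs across database links, and it needs 11g or later for EXECUTE IMMEDIATE on a CLOB:
DECLARE
  l_ddl   CLOB;
  l_found BOOLEAN;
BEGIN
  -- Rebuild the spec first, then the body
  FOR t IN (SELECT 'PACKAGE' AS obj_type FROM dual
            UNION ALL
            SELECT 'PACKAGE BODY' FROM dual)
  LOOP
    l_ddl   := 'CREATE OR REPLACE ';  -- ALL_SOURCE stores the text without this prefix
    l_found := FALSE;
    FOR src IN (SELECT text
                  FROM all_source@dev_link
                 WHERE owner = 'APP_OWNER'
                   AND name  = 'MY_PKG'
                   AND type  = t.obj_type
                 ORDER BY line)
    LOOP
      l_ddl   := l_ddl || src.text;
      l_found := TRUE;
    END LOOP;
    IF l_found THEN
      EXECUTE IMMEDIATE l_ddl;        -- compile the fetched spec/body locally
    END IF;
  END LOOP;
END;
/
This could be wrapped in a scheduled job if you want QA refreshed on a regular basis.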
Alternatively, you could use a tool such as Oracle's SQL Developer to migrate the code, using either the database diff functionality to detect differences and prepare the appropriate DDL scripts, or the Cart functionality to pick and choose what gets migrated. However, I'm not sure how well the SQL Developer approach can be automated.

Prestashop 1.3.6 to 1.4.1

I have an eshop running PrestaShop 1.3.6. On my local copy I've updated it to 1.4 first and then to 1.4.1...
Now I would like to update the server. Is it possible to just upload the files from my local 1.4.1 installation, adjust the settings file, and run the update script from 1.4.1 directly (without the intermediate step to 1.4)?
I can see there are database update scripts for each version, so it should be safe to do it that way, but I want to be sure before I run it on the server... thanks.
I usually do major upgrades this way:
Take a snapshot of the current site (tar.gz) and back up the database using the mysqldump tool (for compatibility).
Download all the files and set up the site on your local machine using the database dump (via the mysql command) and the downloaded snapshot. Adjust settings if necessary.
Perform the upgrade on your local site, test it thoroughly, and test it again with your code & theme.
Repackage the updated files and database (tar.gz & mysqldump) and upload them to the server.
Erase the old site and untar the upgraded site into its folder to take its place.
Replace the old database with the upgraded one (using the mysql command on the server).
Adjust settings if necessary. Test and run it! :)
That should be all. If you're more experienced you can optimize most of the steps. Give me a shout if you need all the useful commands to back up and restore the files & DB.

SSIS Packages are not listed in integration services

We have multiple SSIS packages deployed on a production box. All the scheduled jobs that call these packages run fine.
Whenever I create a new package and deploy it, I can see the folder structure and the deployment completes fine.
Here is the problem I am facing:
When I log into integration services from SSMS, none of the deployed packages are visible. The tree structure under "Stored Packages" does not expand.
The setup is a 64-bit cluster running SQL Server 2005 (build 9.0.4226).
Any help would be greatly appreciated.
Thanks,
Raj
Just a couple of troubleshooting questions:
Is this problem specific to your machine?
Can you duplicate the problem from another machine?
If you log into the server itself and use its version of SSMS, does it also have the problem?
Have you tried reinstalling SQL Server on the server?
Are you able to see the File System and MSDB folders under Stored Packages, or does Stored Packages never expand at all, as you implied in your post?
If you run the following queries, do they return any values? The first query should display the list of folders, including a Maintenance Plans folder and a blank folder. The second query should show all of your deployed packages if you are relying on server storage. (On SQL Server 2005 the equivalent tables are msdb.dbo.sysdtspackagefolders90 and msdb.dbo.sysdtspackages90.)
select *
from msdb.dbo.sysssispackagefolders
select *
from msdb.dbo.sysssispackages