Recover failed sonar upgrade (missing table) - migration

Due to a backup issue, the migration from Sonar 3.5.1 to 3.7 got messed up a bit. Now some tables are missing, but the migration is marked as done.
Is there some way I can rerun the db migration to create the missing tables?
Note that so far I have only seen one problem, and it shows in the log as:
MySQLSyntaxErrorException: Table 'sonar.issue_filters' doesn't exist
when I open the issues page or the issues drilldown. I can see that the table is created by war/sonar-server/WEB-INF/db/migrate/411_create_issue_filters.rb
Based on the info there it seems I could create that table manually in SQL, but is there a better, safer way to recover this migration? (I suspect issue_filters is not the only problem.)
Using MySQL for the db.
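Before creating anything by hand, it may help to see which migrations the server thinks it has applied. A minimal sketch, assuming Sonar records applied migrations in a Rails-style schema_migrations table (the numbered .rb files under WEB-INF/db/migrate suggest it does; verify the table exists in your schema first):

-- Is migration 411 recorded even though issue_filters is missing?
SELECT version FROM schema_migrations WHERE version = '411';
-- Which issue-related tables actually exist, to compare against the migrate/ directory?
SHOW TABLES LIKE 'issue%';

If a migration is recorded there but its objects are missing, the upgrade will normally not recreate them on its own, which is why restoring a consistent backup and re-running the upgrade is usually safer than hand-written DDL.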

Related

Data Migration Assistant

I am using Data Migration Assistant to assess compatibility issues migrating a SQL database to Azure SQL. After running for a couple of minutes, it throws an error saying "The file contains the XML node type {0}. This type is unsupported or in an unsupported location." I have successfully assessed other databases using DMA but this particular database always aborts after throwing this error.
I decided to go ahead and migrate the database using the wizard (Deploy Database to Microsoft Azure SQL Database) from SSMS, and ran into several compatibility issues that showed as errors. The database had several triggers, created by a third-party database tool, that referred to table objects using 3- and 4-part naming, which is not supported on Azure SQL. There were several other errors in addition to these, but I decided to delete those triggers first and run Data Migration Assistant again. This time it ran to completion and I got the compatibility report. I am not sure whether it was the sheer number of issues found or something in the triggers I deleted that caused the error.
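For anyone hitting the same thing, a hedged T-SQL sketch for locating trigger definitions that reference objects in other databases or servers (3- or 4-part names) before running DMA again; it relies only on the standard catalog views:

-- Triggers whose definitions reference another database or server.
SELECT OBJECT_NAME(d.referencing_id) AS trigger_name,
       d.referenced_server_name,
       d.referenced_database_name,
       d.referenced_entity_name
FROM sys.sql_expression_dependencies AS d
JOIN sys.triggers AS t ON t.object_id = d.referencing_id
WHERE d.referenced_database_name IS NOT NULL
   OR d.referenced_server_name IS NOT NULL;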

SSIS Error: VS_NEEDSNEWMETADATA

I'm currently updating all of our ETLs using Visual Studio 2015 (made in BIDS 2008) and redeploying them to a new reporting server running on SQL Server 2016 (originally 2008R2).
While updating one of the ETLs and trying to run it on the new server, I got this error:
The package execution failed. The step failed.
Sometimes it also produces this error:
Source: Load Fact Table SSIS.Pipeline Description: "Copy To Fact Table" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
I've tried deleting and re-adding the OLE DB Destination and the connection strings, and opened up the column mappings to refresh the metadata. I also recreated the whole data flow task, but I'm still getting the same error.
The package runs fine on my local machine.
UPDATE:
I started taking the package apart and running only pieces of it to try and narrow down which part was failing. It seemed to be failing on loading into the staging table but I couldn't find out why.
I eventually decided to just try and re-create the whole thing. After re-creating the entire package, still no luck. The picture below is from the event viewer on the server itself but it didn't give me any new information.
Package error from event viewer
I tried all the solutions provided above and on other sites. Nothing worked.
I got a suggestion from a friend which worked for me.
Here are the steps:
Right click on the Source/Target Data flow component.
Go to Advanced Editor -> Component Properties
Find ValidateExternalMetadata and set it to False.
Try your luck. This is a nasty issue and it left me clueless for 2 days.
I finally found the issue and here's how I did it.
Because the error messages I was getting from SSMS weren't very insightful I first opened up my remote desktop and logged into the server. Then I went to Administrative Tools>Event Viewer and then Windows Logs>Application to see if the failed event would provide greater detail.
It still didn't give me much.
The next step I took was to run the package from the command line because the messages should be more verbose. Opened up cmd, changed directory to the one my package was in and then...
DTEXEC /FILE YourPackageName.dtsx
Finally, the error message here showed a missing column in the tables the package was trying to write to. I added those columns and voila!
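For illustration, the database-side fix was simply adding the columns the destination table was missing; a sketch with made-up table and column names (not the ones from the original package):

-- Hypothetical example: add the column the data flow expected to find.
ALTER TABLE dbo.FactTable
ADD LoadBatchId INT NULL;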
As stated in comments,
if it runs ok in your development environment, then the problem isn't with the package, it's with the scheduled job on the server. Try recreating that.
If that doesn't work,
It seems like the server has a cached instance of the package it's using instead of the updated one. Try renaming your package and creating a new job with the new package name and see if that works.
If that doesn't work,
all I can recommend at that point is to cut the package down until it succeeds, then add the next step that fails.
Sounds like from your solution the development environment is more forgiving of schema updates than the deployed solution. Glad you were able to resolve, eliminating clutter helps.
I had the same problem, and my issue was a difference between two environments: the same field in the same table was written with a capital letter in one and not in the other. So the name was the same except for the casing (e.g. isActive vs IsActive).
This came from a refactoring effort, where we used VS database publish and it did not update the field name.
Have you tried deleting and re-creating the source? When I get this I can generally modify any object that has the error, but I have to delete and rebuild the paths between them; sometimes I have to delete everything in the data flow and re-create it.
A Proxy for SSIS Package Execution should be created under the SQL Server Agent. You should then change your job step (or steps) to Run As the Proxy you've created.
I had your same problem some time ago and the proxy fixed it.
Forgive me if you've already tried this.
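For anyone doing this in T-SQL rather than the GUI, a hedged sketch of creating such a proxy (the account, credential, and proxy names here are made up; the proxy still has to be selected under "Run as" in the job step):

-- 1. A credential mapped to a Windows account that has the needed permissions.
CREATE CREDENTIAL SSISRunCredential
    WITH IDENTITY = N'DOMAIN\ssis_runner',  -- hypothetical account
         SECRET   = N'<password>';
GO
-- 2. A SQL Server Agent proxy based on that credential, granted to the SSIS subsystem.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SSISRunProxy',
    @credential_name = N'SSISRunCredential',
    @enabled = 1;
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSISRunProxy',
    @subsystem_name = N'SSIS';
GO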
It is very common to get that message when 2 columns in the source file are being inserted into the same field of the table.
i.e.
My text file has "neighborhood" twice (the same label on two different columns), and my table has "neighborhood" and "neighborhoodb" (notice the "b" at the end). The import tries to load both text columns into the "neighborhood" field and ignores the "neighborhoodb" field, and it fails with the "VS_NEEDSNEWMETADATA" error.
Re-creating the job worked for me. Some cached version of the job may have been causing the VS_NEEDSNEWMETADATA error. The package executed correctly on its own, but it failed when it was run by an Agent job.
This ended up being a permissions issue for me. The OLE DB Source was using a stored procedure that selected from a SQL view. This view joined to a table in another database and unfortunately the proxy account the SQL Agent job step was running the package under did not have SELECT permission to the table in that database. This is why the package ran fine in Visual Studio but not from a job when deployed to the server. I found the root cause of the error by taking the SELECT statement out of the stored procedure and putting it directly in the Source Query box of the OLE DB Source control which caused it to finally return the 'SELECT permission denied' error message. This error was apparently hidden from SSIS since the proxy account DID have execute permission on the stored procedure.
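For completeness, the eventual fix was just granting the missing permission; a sketch with hypothetical database, table, and account names (assuming a database user already exists for the proxy account in the other database):

USE OtherDatabase;
GO
GRANT SELECT ON dbo.JoinedTable TO [DOMAIN\ssis_proxy_account];
GO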
It worked for me after changing ValidateExternalMetadata to False. I was transferring data from an MSSQL database to a MySQL database and changed it on the "ADO NET Destination".
You may need to strongly type your Source Query.
Example:
If your DestinationDB has a FullName field of NVARCHAR(255), and in your source query you have:
select firstname + lastname as FullName from...
Try this:
Select CONVERT(NVARCHAR(255),firstname + lastname) as Fullname from...
If you are going from db to db and both columns are NVARCHAR(255), you won't have this issue, but if you are concatenating fields in your query, specify the data type and length.
This error can also occur when an entire SSIS project needs to be redeployed rather than just one of the packages (for VS versions that allow deployment of a single package in a multi-package project), particularly when a project connection has been changed or added. For example, if you've added or removed columns from a flat-file project connection. In that case, you need to deploy the entire project to push out the updated project connection properties. This can be true even if the project only has one package in it. In VS Solution Explorer, rather than click on the package name to deploy, select the bolded project name at the top, and then click deploy.

I am getting a DB collation error when loading data

Hi, I am loading data from a MySQL staging database to a MySQL destination.
I get this error: Illegal mix of collations (latin1_swedish_ci, COERCIBLE) and (latin1_german1_ci, COERCIBLE) for operation '='
Does this have anything to do with Pentaho? The same job runs fine on the Production server but gives this error on the Dev server.
Probably not Pentaho, since it works in one environment but not the other. Try:
Moving the code from your prod box to your dev box to make sure you didn't introduce any changes unintentionally.
Checking that your MySQL instances are the same version, that they are supported by Pentaho, and that the drivers are the right ones and stored in the correct places. Make sure you don't have two MySQL drivers in different folders, to avoid conflicts.
Running the job in row-level logging mode to get the most detailed messages about what is occurring. It could give you important clues.
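If the root cause does turn out to be mismatched column collations between the Dev and Production schemas, a hedged sketch of a query-level workaround and of how to inspect the collations (the table, column, and schema names here are made up):

-- Force one explicit collation in the comparison so MySQL stops complaining
-- about the '=' operation (both collations must belong to the same charset).
SELECT s.customer_name
FROM staging_customers AS s
JOIN dest_customers AS d
  ON s.customer_name = d.customer_name COLLATE latin1_swedish_ci;

-- Longer term: compare column collations between the two servers.
SELECT table_name, column_name, collation_name
FROM information_schema.columns
WHERE table_schema = 'your_schema'
  AND collation_name IS NOT NULL;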

Unable to deploy database to Azure

I created an MS SQL database in SSMS 2012, connected successfully to Azure, and am trying to deploy the db to the cloud.
I am encountering the following errors:
Please see the screenshot.
Numerous "Unsupported property" errors: not supported when used as part of a data package.
You're likely using a feature that is not supported in Azure SQL Database. Please refer to this list of unsupported features to help you pinpoint the problem:
http://msdn.microsoft.com/en-us/library/azure/ff394115.aspx
This happened to me too. In my case, I changed the schema of a table after creating it for the first time. After deleting that table, the database deployed correctly. Usually this error occurs when schema validation fails.

ALTER TABLE DROP COLUMN fails in SSDT because of a dependency on a nonclustered index

I have created an SSDT project for a SQL Server 2012 database. Since I already have the database present in the SQL Server database engine, I used the import feature to import all the objects into SSDT. Everything works fine, but I am now facing 2 problems:
1) One of the tables uses the HIERARCHYID data type for a column (col1), and there is a computed column based on that column. The definition of the computed column is something like CASE WHEN Col1 = hierarchyid::GetRoot() THEN NULL ELSE someexpression END. After importing the table script into SSDT, "unresolved reference" errors started coming up.
If I change the definition to something like CASE WHEN hierarchyid::GetRoot() = Col1 THEN NULL ELSE someexpression END (note that Col1 is now on the right-hand side), it works fine.
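To make problem 1 concrete, here is a stripped-down sketch of the two variants (the table and the ELSE expression are simplified stand-ins, not the actual schema):

CREATE TABLE dbo.OrgNode
(
    Col1 HIERARCHYID NOT NULL,
    -- Variant SSDT flagged with an unresolved-reference error:
    --   ComputedCol AS (CASE WHEN Col1 = hierarchyid::GetRoot() THEN NULL ELSE Col1.GetLevel() END)
    -- Variant with the operands swapped, which SSDT accepted:
    ComputedCol AS (CASE WHEN hierarchyid::GetRoot() = Col1 THEN NULL ELSE Col1.GetLevel() END)
);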
2) If I keep the above workaround (i.e. keeping Col1 after the =), then at publish time SSDT has to drop the column on the production server and recreate it. Since there is an index that depends on this column, the deployment fails every time with an error like "ALTER TABLE DROP COLUMN failed because another object accesses it". I have no control over how SSDT designs / publishes the script, and if I have to keep an eye on dropping every dependent object before publishing the database project, then I think there is no point in using it.
Please suggest how I can resolve this.
I was able to reproduce the reference resolution problem you described. I would suggest submitting that issue to Microsoft via Connect here: https://connect.microsoft.com/SQLServer/feedback/CreateFeedback.aspx
I was not able to reproduce the publish failure. Which version of SSDT does the Visual Studio Help > About dialog show is installed? The most recent version ends with 40403.0. If you're not using the most recent version, I would suggest installing it to see if that fixes the publish failure. You can use Tools > Extensions and Updates to download SSDT updates.
If you do have the most recent version, could you provide an example schema that demonstrates the problem?
Compare your project to a production dacpac and have it generate scripts to make the changes. Then if need be, you can edit the scripts before they get applied to production. This is how my dev teams do it.
I had been running into the same issue for a number of days. After finding your post confirming the issue was in SSDT, I realized that it might be fixed in a later version than the one we are currently using: 12.0.50730.0 (VS 2013, the version this project uses).
I also have version 14.0.3917.1 installed from VS 2017. I just attempted the deployment with that and had no issues, so the solution is to upgrade your SSDT version.
Please ignore that solution, it appears my success last night was anomalous. While attempting to repeat it today after restoring a database with the issue, the deployment failed to account for at least one index again.
EDIT:
I have posted about this on User Voice: https://feedback.azure.com/forums/908035-sql-server/suggestions/33850309-computed-column-indexes-are-ignored-with-dacpac-de
Also, to keep this as at least a workable answer of sorts: the workaround I am implementing involves dropping and recreating the missed indexes myself using pre- and post-deployment scripts (see the sketch below).
This is not an ideal solution if the dacpac is meant to update various versions of the database that could have different levels of drift from the model; however, it works for us, as we have tight control over all instances and can expect roughly the same delta to be generated each release for each db instance.
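For reference, a hedged sketch of what such a post-deployment script can look like (the index, table, and column names here are made up):

-- Recreate the index on the computed column if the dacpac deployment dropped
-- it and did not put it back.
IF NOT EXISTS (SELECT 1
               FROM sys.indexes
               WHERE name = N'IX_OrgNode_ComputedCol'
                 AND object_id = OBJECT_ID(N'dbo.OrgNode'))
BEGIN
    CREATE NONCLUSTERED INDEX IX_OrgNode_ComputedCol
        ON dbo.OrgNode (ComputedCol);
END;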