Umbraco Contour on Azure fails when form is submitted - SQL

I have installed my Umbraco site on Azure and it is working beautifully. I installed Contour on my Umbraco installation manually, in order to get the database installed successfully. The whole form admin seems fine, and the form I have created displays fine, but when I submit it I get the following error:
Tables without a clustered index are not supported in this version of
SQL Server. Please create a clustered index and try again.
I have searched high and low and can't find a solution to this problem. Can anyone help?

Unfortunately, this is indeed a limitation of SQL Azure. You'll need to do what the error says and create a clustered index on each table.
Details are here: http://msdn.microsoft.com/en-us/library/windowsazure/ee336245.aspx#cir
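
If it helps, adding the index is a one-statement fix per offending table. A minimal sketch, assuming a Contour-style records table (the table and column names below are illustrative, not necessarily the actual Contour schema):

    -- Illustrative only: add a clustered primary key to a heap table so that
    -- SQL Azure will accept inserts into it.
    ALTER TABLE dbo.UFRecords
        ADD CONSTRAINT PK_UFRecords PRIMARY KEY CLUSTERED (Id);

    -- Alternatively, for a table with no suitable key column, a plain clustered
    -- index also satisfies the requirement (each table can have only one
    -- clustered index):
    CREATE CLUSTERED INDEX IX_UFRecordFields_RecordId
        ON dbo.UFRecordFields (RecordId);

Repeat for every table the error complains about.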

Related

Sonarqube DB Queries - How to find new issues?

I need to find all the issues discovered in a snapshot/scan in Sonarqube. I can't use the web API since the volume can be excessive for new projects on first scan. I have a query that can find the latest snapshot with the project information. I can query issues by project. I can't figure out how to relate issues to a snapshot. There has to be a way since Sonarqube does it - New issues on the Project page.
Has anyone done this or have enough experience with the crazy schema to be able to figure it out? Can't wait for the schema rationalization...
Sonarqube 5.6.3 on Windows 2012 R2 with SQL Server 2012.
There is currently no association between snapshot and issue. Nor has there ever been one. The closest you can come is to use date parameters to narrow the set of issues created right around the time of your analysis. Note that this could be difficult if you run analyses close together.
The "new issues" metrics shown on the project homepage are just that - metrics. However, if you click through on one, you'll find yourself in a date-based Issues search.
You can do the same sort of thing using the web service, again, via date-based criteria. Or you could use the sinceLeakPeriod parameter.
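
If you do want to stay in the database rather than use the web service, a rough sketch of that date-window idea is below. The table and column names (snapshots.created_at, issues.issue_creation_date, both millisecond timestamps) are my assumptions from the 5.6-era schema, so verify them against your instance first:

    -- Sketch only: approximate "issues from the latest analysis" by taking
    -- issues created at or after the most recent snapshot of the project.
    -- Table and column names are assumptions based on the SonarQube 5.6 schema.
    DECLARE @projectUuid VARCHAR(50) = 'YOUR_PROJECT_UUID';

    SELECT i.kee, i.severity, i.message, i.issue_creation_date
    FROM issues AS i
    WHERE i.project_uuid = @projectUuid
      AND i.issue_creation_date >=
          (SELECT MAX(s.created_at)
           FROM snapshots AS s
           JOIN projects AS p ON p.id = s.project_id
           WHERE p.uuid = @projectUuid);

Treat the window as approximate, for the reasons given above.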

Azure SQL Server Database creation not working correctly

I have Azure and I just upgraded to the Pay-As-You-Go option, as I thought being on the trial might be causing my issue, but the problem persists.
I try to create a database in SSMS and I get an error saying I don't have the right subscription:
The reason I want to do it from SSMS is that when I try to add the database through the Azure portal, it doesn't show up in the sys.databases view:
One of my databases depends on another and can't find it when I try to add a stored procedure, because the new database doesn't seem to be registered correctly with master.
What is going on and how do I fix it?
I figured it out. When creating a new database you need to go to the Options page and change the service level (edition) to Basic.
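For what it's worth, the same thing can be done in T-SQL from a query window connected to the logical server's master database, specifying the edition explicitly; the database name and tier below are just examples:

    -- Run while connected to the master database of the Azure SQL logical server.
    -- Specifying the edition/service objective explicitly avoids defaulting to a
    -- tier your subscription does not allow.
    CREATE DATABASE MyAppDb
    (
        EDITION = 'Basic',
        SERVICE_OBJECTIVE = 'Basic',
        MAXSIZE = 2 GB
    );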

Unable to deploy database to Azure

I created an MS SQL database in SSMS 2012, connected successfully to Azure, and am trying to deploy the db to the cloud.
I'm encountering the following errors:
Please see the screenshot.
Numerous "Unsupported property" errors: not supported when used as part of a data package.
You're likely using a feature that is not supported in Azure SQL Database. Please refer to this list of unsupported features to help you pinpoint the problem:
http://msdn.microsoft.com/en-us/library/azure/ff394115.aspx
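
One of the most common blockers is a table without a clustered index (a heap), which data-tier package validation rejects. As a starting point, a quick check you can run against the source database before deploying:

    -- List user tables that have no clustered index (heaps).
    -- Each one needs a clustered index or clustered primary key before the
    -- database can be deployed to (v11-era) Azure SQL Database.
    SELECT s.name AS schema_name, t.name AS table_name
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id
    JOIN sys.indexes AS i ON i.object_id = t.object_id
    WHERE i.type = 0            -- 0 = heap
      AND t.is_ms_shipped = 0
    ORDER BY s.name, t.name;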
This happened to me too. In my case, I had changed the schema of a table after creating it for the first time. After deleting that table, the database deployed correctly. This error usually occurs when schema validation fails.
Regards,
Manoj Bojja

ALTER TABLE DROP COLUMN fails in SSDT because of a dependency on a non-clustered index

I have created an SSDT project for a SQL Server 2012 database. Since the database already exists in the SQL Server database engine, I used the import feature to import all the objects into SSDT. Everything works fine, but I am now facing two problems:
1) One of the tables uses the HIERARCHYID data type for a column (Col1), and there is a computed column based on that column. The definition of the computed column is something like CASE WHEN Col1 = hierarchyid::GetRoot() THEN NULL ELSE someexpression END. After importing the table script into SSDT, "unresolved reference" errors started coming up.
If I change the definition to something like CASE WHEN hierarchyid::GetRoot() = Col1 THEN NULL ELSE someexpression END (note that Col1 is now at the end), it works fine.
2) If I keep the above workaround (i.e. keep Col1 after the =), then at publish time SSDT has to drop the column on the production server and then recreate it. Since there is an index that depends on this column, the deployment fails every time with an error like "ALTER TABLE DROP COLUMN failed because one or more objects access this column". I have no control over how SSDT designs/publishes the script, and if I have to keep an eye on dropping every dependent object before publishing the database project, then I see little point in using it.
Please suggest how I can resolve this.
Thanks
Atul
I was able to reproduce the reference resolution problem you described. I would suggest submitting that issue to Microsoft via Connect here: https://connect.microsoft.com/SQLServer/feedback/CreateFeedback.aspx
I was not able to reproduce the publish failure. Which version of SSDT does the Visual Studio Help > About dialog show is installed? The most recent version ends with 40403.0. If you're not using the most recent version, I would suggest installing it to see if that fixes the publish failure. You can use Tools > Extensions and Updates to download SSDT updates.
If you do have the most recent version, could you provide an example schema that demonstrates the problem?
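An example schema of the shape described in the question might look something like this; the table, column, and index names are illustrative assumptions, not the actual objects:

    -- Illustrative repro sketch: a HIERARCHYID column, a computed column that
    -- compares it to hierarchyid::GetRoot(), and a non-clustered index that
    -- depends on the computed column.
    CREATE TABLE dbo.OrgNode
    (
        Id     INT NOT NULL CONSTRAINT PK_OrgNode PRIMARY KEY,
        Col1   HIERARCHYID NOT NULL,
        -- SSDT reportedly resolves the reference only when Col1 is written on
        -- the right-hand side of the comparison.
        IsRoot AS (CASE WHEN hierarchyid::GetRoot() = Col1 THEN 1 ELSE 0 END) PERSISTED
    );

    -- The dependent index that blocks ALTER TABLE ... DROP COLUMN at publish time.
    CREATE NONCLUSTERED INDEX IX_OrgNode_IsRoot ON dbo.OrgNode (IsRoot);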
Compare your project to a production dacpac and have it generate scripts to make the changes. Then if need be, you can edit the scripts before they get applied to production. This is how my dev teams do it.
I was running into the same issue for a number of days now. After finding your post to confirm the issue was in SSDT, I realized that it may be fixed in a later version than the one we are currently using: 12.0.50730.0 (VS 2013, the version this project uses).
I also have version 14.0.3917.1 installed from VS 2017. I just attempted with that, no issues. So the solution is to upgrade your SSDT version.
Please ignore that solution, it appears my success last night was anomalous. While attempting to repeat it today after restoring a database with the issue, the deployment failed to account for at least one index again.
EDIT:
I have posted about this on User Voice: https://feedback.azure.com/forums/908035-sql-server/suggestions/33850309-computed-column-indexes-are-ignored-with-dacpac-de
Also, so that this remains at least a workable answer of sorts: the workaround I am implementing involves dropping and recreating the missed indexes myself using pre- and post-deployment scripts.
Not an ideal solution if the dacpac was meant to update various versions of the database that could have different levels of drift from the model, however it works for us as we have a tight control over all instances and can expect about the same delta generated each release for each db instance.
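Roughly, the pre/post-deployment workaround looks like this; the index and table names are placeholders for whatever objects the dacpac keeps missing:

    /* Pre-deployment script: drop the dependent index so the generated
       ALTER TABLE ... DROP COLUMN can succeed. */
    IF EXISTS (SELECT 1 FROM sys.indexes
               WHERE name = 'IX_OrgNode_IsRoot'
                 AND object_id = OBJECT_ID('dbo.OrgNode'))
        DROP INDEX IX_OrgNode_IsRoot ON dbo.OrgNode;

    /* Post-deployment script: recreate the index once the schema changes
       have been applied. */
    IF NOT EXISTS (SELECT 1 FROM sys.indexes
                   WHERE name = 'IX_OrgNode_IsRoot'
                     AND object_id = OBJECT_ID('dbo.OrgNode'))
        CREATE NONCLUSTERED INDEX IX_OrgNode_IsRoot ON dbo.OrgNode (IsRoot);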

MS Search Server Express Throws an Exception

I installed MS Search Server Express and crawled some Ektron documents, and those documents were successfully indexed.
Next, when I tried to search, I got the following error.
I also installed some hotfixes as suggested by several websites. However, this error is still thrown when I try to search my documents.
Do you have any idea?
I had this happen and ended up going to the search admin site - something like yourhost:1234 or some other port. Once there, you navigate to the search service configuration screen and you'll notice the database connections, etc. at the bottom. An Ektron support person had me create a new configuration and delete the old one. This actually drops the tables in SQL Server and creates new tables with long numbers at the end of the table names. After doing this, it all started working again. You can always drop your SharePoint* tables, register your site again and do a full crawl to start over.