How to update a domain service when the .edmx file is updated? - silverlight-4.0

Is it possible to update a Domain Service?
Right now I delete the service and add it again, which is stupid.

There's no update support in the UI, but if you're just returning whole entities there should be no reason to regenerate; everything should just flow through when you update the edmx and rebuild the project (to re-run codegen for the client).
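For reference, a query-only domain service that just returns whole entities looks roughly like the sketch below (the context and entity names are placeholders for whatever the edmx generates); with methods like this, updating the edmx and rebuilding is enough, so there is no need to delete and re-add the service.

using System.Linq;
using System.ServiceModel.DomainServices.EntityFramework;
using System.ServiceModel.DomainServices.Hosting;

// Minimal sketch of a query-only domain service. "NorthwindEntities" and
// "Customer" are placeholder names for the ObjectContext and entity
// generated from a hypothetical edmx model.
[EnableClientAccess]
public class CustomerDomainService : LinqToEntitiesDomainService<NorthwindEntities>
{
    // Returns whole entities, so a renamed column or new association flows
    // through to the client once the edmx is updated and the solution rebuilt.
    public IQueryable<Customer> GetCustomers()
    {
        return this.ObjectContext.Customers;
    }
}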

Just for information if anyone else stumbles upon this thread in desperation (like I just did) ...
I updated my edmx model in a Silverlight application, having only renamed one database column and added a couple of foreign keys, and the Client project immediately started generating 30+ fatal errors. This was all because, despite the Web project building "successfully", it did not generate the app.g.vb "generated code file" for the client, resulting in "WebContext not defined", missing ApplicationResources, etc. etc.
The first time this happened I just gave up, deleted the edmx file and the domain service and started again.
The second time it happened I got angry, but this time I just deleted and re-created the domain service, and everything went back to normal.
So it looks like there are definitely some modifications you can't make to the edmx model without requiring re-creation of the Domain Service.
Ade.

Related

Entity Framework: enabling migrations

I was working on an application and dropping/creating the database on each launch. Now, before going into production, I enabled migrations. I created an initial migration, then changed one of my models, added a new migration, updated the database, and everything was dandy.
Now I've tried to change the database name in the context, to see how it would behave on a new database, and every time I launch the application I get a different exception.
Right now it's "Invalid object name"... on one of my tables, I think. I'm looking at answers here and they say to refresh the IntelliSense cache, but I can't find where. I went to Edit -> IntelliSense -> there's no Refresh Cache, there's Refresh Remote Resources, which didn't fix anything.
By the way, do I have to turn automatic migrations on (i.e. set the flag to 'true')?
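For anyone else wondering about that flag: automatic migrations are not required when you add explicit migrations with Add-Migration. The flag lives in the Migrations\Configuration class that Enable-Migrations generates, roughly like the sketch below (the context name is a placeholder).

using System.Data.Entity.Migrations;

// Rough sketch of the generated Migrations\Configuration class.
// "MyAppContext" is a placeholder for the real DbContext type.
internal sealed class Configuration : DbMigrationsConfiguration<MyAppContext>
{
    public Configuration()
    {
        // False by default: explicit migrations created with Add-Migration
        // work fine without turning this on. Set it to true only if you want
        // EF to apply model changes without an explicit migration class.
        AutomaticMigrationsEnabled = false;
    }
}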

Remove duplicate Context.vb files from Entity Framework

I'm having a bit of an issue with EF6 in VS2013. I had to modify the connection string for the database-first model in a Web Application project, so I followed the advice in the best answer for How should I edit an Entity Framework connection string? and deleted it from my Web.Config file.
At first it seemed to work fine: I deleted the connection string, then from the Entity Designer I ran "Update Model from Database" and re-created the connection string, but then my build failed with multiple errors similar to:
Public Sub New() has multiple definitions with identical signatures
After some digging I figured out that when I re-created the connection string, EF created a second Model.Context.vb file named Model.Context1.vb, and both are still referenced somewhere. Since then I've opened every file in the folder containing my EF model with Notepad, searching for a reference to Context.vb or Context1.vb, and have come up empty. If I remove either file, my build fails stating the file cannot be found, so as a workaround I opened the Context.vb file and removed all the code so there are no duplicates. I'd like to fix it properly by removing the reference to the file and deleting it, if anyone knows how I can go about doing that.
I got it sorted out. After attempting to restore an older version of the EF files from source control and still running into the same issue, I realized the reference was probably in a project file.
In [projectName].vbproj I found these two entries:
<Compile Include="Data\schedulerModel.Context1.vb">
<AutoGen>True</AutoGen>
<DesignTime>True</DesignTime>
<DependentUpon>schedulerModel.Context.tt</DependentUpon>
</Compile>
<Content Include="Data\schedulerModel.Context.tt">
<Generator>TextTemplatingFileGenerator</Generator>
<DependentUpon>schedulerModel.edmx</DependentUpon>
<LastGenOutput>schedulerModel.Context1.vb</LastGenOutput>
</Content>
I removed the first entry and dropped the 1 from the Context1.vb filename in the second (so LastGenOutput points at schedulerModel.Context.vb again), then reopened the project and ran a rebuild without issue.
I had the same problem but a slightly different resolution. For whatever reason, updating the edmx file one time seemed to remove a seemingly important line from the project file, the line reading <LastGenOutput>MyEntityModel.Context.cs</LastGenOutput>.
I re-added the line to my project file and updating the model didn't result in any more duplicate context files. The whole block looked like the following when fixed:
<Content Include="MyEntityModel.Context.tt">
<Generator>TextTemplatingFileGenerator</Generator>
<DependentUpon>MyEntityModel.edmx</DependentUpon>
<LastGenOutput>MyEntityModel.Context.cs</LastGenOutput>
</Content>
Just thought I'd add my findings to this as it has been driving me to distraction for a few weeks - every time I updated my Model from Database, I got "duplicate" context and designer files etc., and then hundreds of errors. However, the new stored procedure or table or whatever I had added was only present in the new "Context1" files, not the originals, so when I wound it back I had to go through the same process again, etc. etc.
Then finally a light went on when I thought of ... Source Control! I use TFS, and I found that unless ALL model-related files are checked out before doing the Update - that's the Context, Designer and Service files - EF generates new versions of almost everything, presumably because it can't modify files that are read-only due to source control.
The key then is to fix the project file as stated in answers above before getting everything checked out and THEN doing the update. If you don't get that tag right in the proj file, it goes and does it all wrong again even though everything is checked out.
Hope this helps - my sanity is slowly returning anyway.
Ade

Migrations don't run on hosting

I'm using MigratorDotNet to manage Rails-style migrations for my web app. I have a workflow where, if I delete all the tables in the database, I can access an installation view that will run MigratorDotNet and create all the necessary tables.
This works locally. For some reason, when I upload my code to my Arvixe hosting, the migrations just never run. I get this odd error:
There is already an object named 'SchemaInfo' in the database.
This is odd because, prior to running migrations, I manually deleted all the tables in the database (to make sure it wasn't left over from a previous install).
My code essentially boils down to:
new Migrator.Migrator("SqlServer", connectionString.ToString(), migrationsAssembly).MigrateToLastVersion();
I've already verified by logging that the connection string is correct (production/hosting settings), and the assembly is correctly loaded (name and version).
Works locally, but not on Arvixe. How do I troubleshoot this?
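To make the setup concrete, the installation code plus the logging checks mentioned above look roughly like this sketch; the connection string key, class name and logging are made up for illustration, and only the Migrator call itself comes from the real code.

using System;
using System.Configuration;
using System.Reflection;

public static class DatabaseInstaller
{
    public static void MigrateToLatest()
    {
        // "AppDb" and the migrations assembly below are placeholders; the
        // real project reads its own production/hosting settings.
        string connectionString =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;
        Assembly migrationsAssembly = typeof(DatabaseInstaller).Assembly;

        // The kind of logging used to verify the settings on the host.
        Console.WriteLine("Migrating using '{0}' with assembly {1}",
            connectionString, migrationsAssembly.FullName);

        new Migrator.Migrator("SqlServer", connectionString, migrationsAssembly)
            .MigrateToLastVersion();
    }
}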
This is a dark day.
It turns out (oddly) that the root cause was that my hosting company used a schema other than dbo for my database. Because of this, the error message I saw (SchemaInfo already exists) was talking about their table.
My solution, unfortunately, was to rip out MigratorDotNet and go with FluentMigrator instead. Not only did this solve the problem, but it also gave me a more intelligible error message (one referring to the schema names).
While it doesn't seem possible to auto-set the schema, and while I need to switch the schema between my dev and production machines, it's still a solvable problem (and a better API, IMO). I googled, but did not find any way to change the default schema in MigratorDotNet.
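For anyone going the same way, a FluentMigrator migration can target a non-dbo schema explicitly, along the lines of the sketch below (the table, column and schema names are invented for illustration).

using FluentMigrator;

// Hypothetical migration that creates its table in an explicit schema
// instead of relying on the connection's default (dbo).
[Migration(1)]
public class CreateWidgetsTable : Migration
{
    public override void Up()
    {
        Create.Table("Widgets").InSchema("hostschema")
            .WithColumn("Id").AsInt32().PrimaryKey().Identity()
            .WithColumn("Name").AsString(100).NotNullable();
    }

    public override void Down()
    {
        Delete.Table("Widgets").InSchema("hostschema");
    }
}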
I'm sorry for the issues that you were having. On shared hosting, unfortunately the only way that we may be able to change the schema is manually. If you are still looking for a solution that requires our assistance, please forward your ticket ID to qa .at. arvixe.com as well as arvand .at. arvixe.com and we can look into the best way to resolve this.

Data changes in RavenDB by itself

I have set up a RavenDB for evaluation. I wrote some code which pushed some documents into it. I then have a web site which renders those documents.
Throughout the day, I used the Raven Studio to modify some text in those documents, so that I could see the changes come through in my web site.
Problem: It seems that after going home for the night, when I come in the next day my database has changed - my documents have reverted to the 'pre-changed' versions... what's going on??
I've looked through the Raven console output, and there were no update commands issued on my developer machine overnight (nor would I expect there to be!!)
Note: this is just running on my development machine.
As far as I know, RavenDB has no code in it that would automatically undo committed write operations, and honestly, that would really scare me. Altogether this sounds really weird and I can't think of a scenario where that could actually happen. I suggest you send the log files to RavenDB support if it happens again, because this would be a really serious issue.
My colleague had this very problem with updates being reverted. The update we made was to add a property, and also a document-specific value for this property, to all the documents. We called SaveConfiguration() and saw the change being made in the Raven Studio. A while later some of the documents had lost their new property.
I decided to turn on the logging and therefore added an NLog.config file; to get the logging started I touched the web.config. This of course restarted the application, and "voila", the updates appeared in the Raven Studio again.
After a while they disappeared from the Raven Studio, so I assumed that this was a studio problem. I therefore tried to retrieve the objects from the database in a test controller; unfortunately the objects were lacking the property value here too, so it wasn't just a studio problem.
With the logging turned on we updated the documents of the specific type again, and according to the logs and also the studio we actually updated the documents. Not long thereafter the documents reverted by losing their added property yet again (my colleague started crying at this point - true story).
Later I came to realize that this was all because our live web application still had the old version of the class. When a document was read in the web application, the data was returned without the extra property. Because of this it seems like our DocumentSession thought the object had changed (in all fairness), so when we called SaveChanges even these objects were written to the database - without the extra property.
Is my conclusion correct? What is the solution to this problem? I'm thinking CQRS, because then we will never call "SaveChanges()" on the DocumentSession for reads.
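To illustrate the failure mode I'm describing, here is a rough sketch (the class, property and id below are made up): the live site still has the old class without the new property, so a read-path SaveChanges re-serializes the document without it.

using Raven.Client;

// Old version of the class still deployed on the live site; it lacks the
// property that the batch update added to the stored documents.
public class ProductPage
{
    public string Id { get; set; }
    public string Title { get; set; }
}

public class PageRenderer
{
    private readonly IDocumentStore documentStore;

    public PageRenderer(IDocumentStore documentStore)
    {
        this.documentStore = documentStore;
    }

    public ProductPage LoadForDisplay(string id)
    {
        using (IDocumentSession session = documentStore.OpenSession())
        {
            ProductPage page = session.Load<ProductPage>(id);
            // ... render the page ...

            // The session compares the re-serialized object with what it
            // loaded, sees the missing property as a change, and writes the
            // document back without it - silently reverting the update.
            session.SaveChanges();
            return page;
        }
    }
}

Not calling SaveChanges on pure read paths (which is where the CQRS thought comes from) avoids the write-back entirely.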
Adam,
Just making sure, did you call SaveChanges() after you made your modifications?
There is absolutely nothing in RavenDB that would cause this behavior.

SharePoint Content Type Event Receivers Impossible to Remove

I have a very odd situation in my SharePoint staging environment. We recently stood up a new SharePoint 2010 server (single WFE + a DB server), and attached a backed-up content database from our existing environment. We created a new web application, and pointed it at the attached content database. All of our site collections, sites, lists, etc. appeared, and things looked good.
We had deployed some custom content types to our existing environment prior to moving the database, and we wanted to upgrade those content types. Specifically, we attach event receivers to the content types (using code, not XML) and we needed to update the assembly version that those event receivers point to. So we ran our usual code (part of a feature receiver) to remove the event receivers, but to our surprise, the receivers remained.
In an attempt to remedy the situation, we wrote a console application that iterates over all content types (SPWeb.ContentTypes) in the root site of each site collection and deletes their event receivers, then calls SPContentType.Update(true) on each content type. There are no errors returned from the call to Update, but again, to our even greater surprise, SharePoint still reports the event receivers as attached.
In a desperate last-ditch effort, we even went into the content database (after taking a snapshot - and remember, this is staging, not production!) and manually DELETED the offending receivers from the EventReceivers table. We figured that should have at least some kind of effect. Alas, SharePoint still reports the receivers as being present.
We perform these types of upgrades on content type event receivers all the time, but have never run into this issue on any other SharePoint farm. Does it sound like an environmental problem? Is it something that could have been caused by moving the content database? Any help would be appreciated, because we are completely stumped at this point.
First of all, I would never recommend changing anything directly in the database. It will surely give you trouble in the long run.
You did mention that you tried to remove the event receiver at the Web level, but I'm not sure if you have tried removing it at the List/Library level.
Use the SPContentTypeUsage class and try deleting at the List/Library level; see the sketch after the link below.
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spcontenttypeusage.aspx
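A rough sketch of that approach (not tested against your farm; the method and class names are mine, and it assumes the list usages live in the same web):

using System.Collections.Generic;
using Microsoft.SharePoint;

// Sketch: remove event receivers from a site content type and from every
// list-level copy of it, found via SPContentTypeUsage.
public static class ContentTypeReceiverCleanup
{
    public static void RemoveReceivers(SPWeb web, SPContentTypeId contentTypeId)
    {
        SPContentType siteContentType = web.ContentTypes[contentTypeId];

        // List-level copies first.
        IList<SPContentTypeUsage> usages = SPContentTypeUsage.GetUsages(siteContentType);
        foreach (SPContentTypeUsage usage in usages)
        {
            if (!usage.IsUrlToList)
                continue;

            // Assumes the list is in this web; usages in subwebs would need
            // web.Site.OpenWeb(...) first.
            SPList list = web.GetList(usage.Url);
            SPContentTypeId bestMatch = list.ContentTypes.BestMatch(contentTypeId);
            SPContentType listContentType = list.ContentTypes[bestMatch];
            DeleteReceivers(listContentType);
            listContentType.Update();
        }

        // Then the site content type itself.
        DeleteReceivers(siteContentType);
        siteContentType.Update(true);
    }

    private static void DeleteReceivers(SPContentType contentType)
    {
        // Delete from the end so the indexes stay valid while removing.
        for (int i = contentType.EventReceivers.Count - 1; i >= 0; i--)
        {
            contentType.EventReceivers[i].Delete();
        }
    }
}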