Silverlight is not fetching data from my WCF RIA service

I just started learning Silverlight by walking through the labs posted on Channel9. When I tried to explore a little bit I found that my queries were not working as I thought they would.
To recreate what I have done, you would need to create a new Silverlight Business Application, create a data entity pointed at the AdventureWorks LT database, and generate the web services for those entities (including edit).
I then simply dragged a RichTextBox onto Home.xaml, and in Home.xaml.cs I added this code, first to OnNavigatedTo and, when that didn't work, to the constructor.
AdventureWorksDomainContext ctx = new AdventureWorksDomainContext();
EntityQuery<Product> query =
    from p in ctx.GetProductsQuery()
    select p;
LoadOperation<Product> loadOp = ctx.Load(query);
var paragraph = new Paragraph();
foreach (var product in loadOp.Entities)
{
    paragraph.Inlines.Add(new Run { Text = product.Name });
}
richTextBox1.Blocks.Add(paragraph);
When I run the page, loadOp.Entities never contains a value, and the query I expect only goes across the wire after all my code has already executed.
I feel like I'm missing something fundamental and this will make more sense if I can find someone to explain it to me.
Thanks,
Eric

The problem is related to how you are loading the data. The Load operation is asynchronous, as are all Silverlight network calls. You are calling ctx.Load(query) and then immediately building the paragraph from the entities. You need to use a callback that runs when the Load is completed. Something like this:
AdventureWorksDomainContext ctx = new AdventureWorksDomainContext();
EntityQuery<Product> query =
    from p in ctx.GetProductsQuery()
    select p;
// The callback only runs once the results have come back from the server.
LoadOperation<Product> loadOp = ctx.Load(query, op =>
{
    var paragraph = new Paragraph();
    foreach (var product in op.Entities)
    {
        paragraph.Inlines.Add(new Run { Text = product.Name });
    }
    richTextBox1.Blocks.Add(paragraph);
}, null);
Since you aren't using the entities directly in a binding and are just iterating them, you need to make sure you wait until they are loaded. I don't remember the exact signature of the Load method off the top of my head, so you may need to adjust the lambda slightly to make it compile.
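If you'd rather not pass a callback into Load, the LoadOperation it returns also exposes a Completed event you can hook; here is a rough, untested sketch of the same idea using the names from above:
LoadOperation<Product> loadOp = ctx.Load(query);
loadOp.Completed += (s, e) =>
{
    if (loadOp.HasError)
        return; // inspect loadOp.Error / call loadOp.MarkErrorAsHandled() as appropriate

    var paragraph = new Paragraph();
    foreach (var product in loadOp.Entities)
    {
        paragraph.Inlines.Add(new Run { Text = product.Name });
    }
    richTextBox1.Blocks.Add(paragraph);
};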

Related

How to configure multiple sitemaps using MVCSiteMapProvider v4 with StructureMap DI

The problem, essentially, is that I can't get my sitemap config to support multiple sitemaps. It's always looking for "default" even when I name my instances and request another. Now for the background.
I've been poring over the docs for the new implementation of MvcSiteMapProvider. It now uses dependency injection to configure the SiteMapProvider. We have an existing StructureMap DI implementation, so I followed the instructions and added, in our case:
ObjectFactory.Configure(x =>
{
...
x.AddRegistry<MvcSiteMapProviderRegistry>();
...
});
Then I started tweaking the MvcSiteMapProviderRegistry.cs file to implement my multiple-sitemap scenario. I have multiple sitemap files; either will work as long as it's called "default". If I remove the "default" item, it breaks and complains that "default" is missing, which I assume is because it can't find my instance. Here's how I have them defined. I suspect the problem is somewhere in here... the loader, which the docs say I have to configure in Global.asax, is looking for ISiteMapLoader, but I'm adding my multiple configuration to SiteMapBuilderSet... anyway, here's the code.
// Register the sitemap builder
string absoluteFileName = HostingEnvironment.MapPath("~/Main.sitemap");
string absoluteFileName2 = HostingEnvironment.MapPath("~/Test.sitemap");

var xmlSource = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName);

var reservedAttributeNameProvider = this.For<ISiteMapXmlReservedAttributeNameProvider>()
    .Use<SiteMapXmlReservedAttributeNameProvider>()
    .Ctor<IEnumerable<string>>("attributesToIgnore").Is(new string[0]);

var builder = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });

var xmlSource2 = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName2);

var builder2 = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource2);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });

// Configure the builder sets
this.For<ISiteMapBuilderSetStrategy>().Use<SiteMapBuilderSetStrategy>()
    .EnumerableOf<ISiteMapBuilderSet>().Contains(x =>
    {
        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("default")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */
        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("MainSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */
        x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("TestSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder2)
            .Ctor<ICacheDetails>().Is(cacheDetails);
    });
In my global.asax.cs I added
MvcSiteMapProvider.SiteMaps.Loader = Resolver.Get<ISiteMapLoader>();
and to reference it in my view I have
@Html.MvcSiteMap("TestSiteMapProvider").Menu(false, true, true)
but it must not be able to find "TestSiteMapProvider" because it always displays "default" or complains if it doesn't exist.
I also thought it might have something to do with the Cache, as I see the filename referenced there, but I don't know how to add multiple instances to the cache, so I just disabled it. I'm really not doing anything fancy with my sitemaps anyway, and this whole thing is really feeling like massive overkill just to get some flippin automatic breadcrumbs!
Apparently there was another help doc that I wasn't aware of. I had completed all of the steps thus far properly, but I also needed to implement ISiteMapCacheKeyGenerator.
See this doc (which wasn't named this when I started):
https://github.com/maartenba/MvcSiteMapProvider/wiki/Multiple-Sitemaps-in-One-Application
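In rough terms, that wiki page has you give each sitemap its own cache key so the loader stops falling back to "default". The following is an illustrative sketch only: the class name is mine, and the interface members may differ between v4 builds, so verify against the wiki page above before copying it.
// Illustrative only -- verify against the MvcSiteMapProvider v4 wiki.
public class TestSiteMapCacheKeyGenerator : ISiteMapCacheKeyGenerator
{
    // Return a distinct key so the "TestSiteMapProvider" sitemap is cached
    // and resolved separately from the default one.
    public string GenerateKey()
    {
        return "TestSiteMapProvider";
    }
}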

Error when trying to add data to RavenDB

I'm using Autofac and the interfaces are correctly resolved, but this code fails with "No connection could be made because the target machine actively refused it 127.0.0.1:8081":
using (var store = GetService<IDocumentStore>())
{
    using (var session = store.OpenSession())
    {
        session.Store(new Entry { Author = "bob", Comment = "My son says this", EntryId = Guid.NewGuid(), EntryTime = DateTime.Now, Quote = "I hate you dad." });
        session.SaveChanges();
    }
}
Here is the registration
builder.Register<IDocumentStore>(c =>
{
    var store = new DocumentStore { Url = "http://localhost:8081" };
    store.Initialize();
    return store;
}).SingleInstance();
When I navigate to http://localhost:8081 I do get the Silverlight management UI. (Although I'm running a Windows VM, and VMware and Silverlight 5 don't play together; that's another issue entirely.) Anyway, does anyone see what I'm doing wrong here, or what I should be doing differently? Thanks for any code, tips, or tricks.
On a side note, can I enter some dummy records from a command line interface? Any docs or examples of how I can do that?
Thanks All.
Just curious, are you switching RavenDB to listen on 8081? The default is 8080. If you're getting the management studio to come up, I suspect you are.
I'm not too familiar with Autofac, but it looks like you're wrapping your singleton DocumentStore in a using statement, which disposes it as soon as the block ends.
Try:
using (var session = GetService<IDocumentStore>().OpenSession())
{
    session.Store(new Entry { Author = "bob", Comment = "My son says this", EntryId = Guid.NewGuid(), EntryTime = DateTime.Now, Quote = "I hate you dad." });
    session.SaveChanges();
}
As far as dummy records go, the management studio will ask you if you want to generate some dummy data if your DB is empty. If you can't get Silverlight to work in the VM, I'm not sure if there's another automated way to do it.
Perhaps using smuggler:
http://ravendb.net/docs/server/administration/export-import
But you'd have to find something to import.
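Failing that, a tiny console app using the same .NET client will seed data from the command line. A rough sketch reusing the Entry class from your question (the field values are just placeholders):
using System;
using Raven.Client.Document;

class Seed
{
    static void Main()
    {
        // Same URL as the registration above; adjust if the server listens elsewhere.
        using (var store = new DocumentStore { Url = "http://localhost:8081" })
        {
            store.Initialize();
            using (var session = store.OpenSession())
            {
                for (var i = 0; i < 10; i++)
                {
                    session.Store(new Entry
                    {
                        EntryId = Guid.NewGuid(),
                        Author = "author " + i,
                        Comment = "dummy comment " + i,
                        Quote = "dummy quote " + i,
                        EntryTime = DateTime.Now
                    });
                }
                session.SaveChanges(); // one round trip for all ten documents
            }
        }
    }
}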

Given a list of objects using C#, push them to RavenDB without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model, and an Id.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the Raven collection individually to find which to update and which to insert? At the moment I'm going about it like so, which is totally inefficient.
Note: _session is a wrapper around IDocumentSession where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var page = 0;
    const int total = 30;
    do
    {
        var paged = sales.Skip(page * total).Take(total);
        if (!paged.Any()) return;
        foreach (var sale in paged)
        {
            var current = sale;
            var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
            if (existing != null)
                existing = current;
            else
                _session.Add(current);
        }
        _session.Commit();
        page++;
    } while (true);
}
Your session code doesn't seem to track with the RavenDB API (we don't have Add or Commit).
Here is how you do this in RavenDB
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    foreach (var sale in sales)
        session.Store(sale); // Store will create or overwrite the document with that id
    session.SaveChanges();   // one round trip for the whole batch
}
Your code sample doesn't work at all. The main problem is that you cannot just switch out the references and expect RavenDB to recognize that:
if (existing != null)
existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is the way you can facilitate change tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code, I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: if you fetch a document by its id, you always want to use Load!
The main difference is that one uses the local cache and the other does not (the same applies to the equivalent NHibernate methods).
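To make the difference concrete, here is a small sketch (assuming string ids of the usual "sales/1" form; adjust if your Id is numeric):
// Load goes through the session's identity map, so repeated loads of the
// same id within one session cost nothing extra.
var byLoad = session.Load<Sale>("sales/1");

// Query always goes through an index; indexes are updated asynchronously,
// so a just-stored document may not be visible to a query yet.
var byQuery = session.Query<Sale>().FirstOrDefault(s => s.Id == "sales/1");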
Ok, so now I can answer your question:
If I understand you correctly, you want to save a bunch of Sale instances to your database, where each is either added if it doesn't exist or updated if it does. Right?
One way is to correct your sample code with the hints above and let it work. However, that will issue one extra request (Session.Load(existingId)) per iteration. You can easily avoid that if you set up an index that selects the Ids of all documents in your Sales collection; before you loop through your items, you can then load all the existing Ids in one go.
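One way that index-based pre-check could look; the index class name is mine and this is an untested sketch (AbstractIndexCreationTask lives in Raven.Client.Indexes):
// Hypothetical index that projects just the ids of the Sales collection.
public class Sales_Ids : AbstractIndexCreationTask<Sale>
{
    public Sales_Ids()
    {
        Map = sales => from sale in sales
                       select new { sale.Id };
    }
}

// Fetch the existing ids once, then decide insert vs. update locally.
// Mind Raven's default page size (128); add .Take(...) for larger sets.
var existingIds = session.Query<Sale, Sales_Ids>()
                         .Select(s => s.Id)
                         .ToList();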
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var ids = sales.Select(i => i.Id);
    var existingSales = _ravenSession.Load<Sale>(ids);
    existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));
    var existingIds = existingSales.Select(i => i.Id);
    var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
    nonExistingSales.ForEach(i => _ravenSession.Store(i));
    _ravenSession.SaveChanges();
}

Silverlight RIA query with client-side filtering

I have the following query that returns empty (enumeration yielded no results):
CurrentStudent = _ctx.Students.SingleOrDefault();
var qry = _ctx.GetStudentsWithAdultContactsQuery();
_ctx.Load(qry.Where(s => s.StudentID == studentId));
dataForm1.CurrentItem = CurrentStudent;
BeginEdit();
Instead, I have to do the following, as if I were working with a regular SL WCF Service:
var qry = _ctx.GetStudentByIDQuery(studentId);
var load = _ctx.Load(qry);
load.Completed += (s, e) =>
{
    CurrentStudent = _ctx.Students.FirstOrDefault();
    dataForm1.CurrentItem = CurrentStudent;
    BeginEdit();
};
Why does the first method not work? The server-side query in the second version does the same filtering by ID that the first does, so it's not the filtering itself. None of the examples of using RIA that I've seen use the Completed event handler; RIA is supposed to handle the asynchronous loading behind the scenes. What gives?
EDIT
I wanted to refocus the question a bit. Here's another query that I have that works just as you would expect:
var query = ctx.GetStudentsWithAdultContactsQuery();
studentDataGrid.ItemsSource = ctx.Students;
ctx.Load(query);
I am not explicitly handling the Completed callback here, and that's how I see RIA used in examples on the web, including here. So the obvious differences between this query and the one that doesn't work are (a) the filtering and (b) the databinding target. But why should either make a difference?
I think I've worked out an answer:
The two situations -- binding a grid and binding a dataform -- are different because in the former scenario I am binding to an EntitySet (Students) while in the case of the dataform I am binding to an individual entity (Student). Well, duh! But so what?
So, an EntitySet implements INotifyCollectionChanged, which raises a notification event when entities are added to the set (as when the query returns with the results), causing the grid to update. The dataform CurrentItem, on the other hand, is not binding to an EntitySet, so it's not listening to changes in the Students collection and doesn't know when CurrentStudent is no longer null. Hence, I have to wait for the load to complete, or I can do something that the grid presumably does, and listen to the Students collection, which works to the same effect:
var qry = _ctx.GetStudentsWithAdultContactsQuery().Where(s => s.StudentID == studentId);
_ctx.Load(qry);
_ctx.Students.EntityAdded += new EventHandler<EntityCollectionChangedEventArgs<Student>>(Students_EntityAdded);
//...
void Students_EntityAdded(object sender, EntityCollectionChangedEventArgs<Student> e)
{
dataForm1.CurrentItem = e.Entity;
}
Or, more succinctly:
var qry = _ctx.GetStudentsWithAdultContactsQuery();
_ctx.Load(qry.Where(s => s.StudentID == studentId));
_ctx.Students.EntityAdded += (s, e) => {
dataForm1.CurrentItem = e.Entity;
};
I'm marking the question as answered, but please feel welcome to post your explanation if it differs from mine; I will happily vote it up if it offers new insight.

Encapsulating common logic (domain driven design, best practices)

Updated: 09/02/2009 - Revised question, provided better examples, added bounty.
Hi,
I'm building a PHP application using the data mapper pattern between the database and the entities (domain objects). My question is:
What is the best way to encapsulate a commonly performed task?
For example, one common task is retrieving one or more site entities from the site mapper, and their associated (home) page entities from the page mapper. At present, I would do that like this:
$siteMapper = new Site_Mapper();
$site = $siteMapper->findByid(1);
$pageMapper = new Page_Mapper();
$site->addPage($pageMapper->findHome($site->getId()));
Now that's a fairly trivial example, but it gets more complicated in reality, as each site also has an associated locale, and the page actually has multiple revisions (although for the purposes of this task I'd only be interested in the most recent one).
I'm going to need to do this (get the site and associated home page, locale, etc.) in multiple places within my application, and I can't think of the best way/place to encapsulate this task so that I don't have to repeat it all over the place. Ideally I'd like to end up with something like this:
$someObject = new SomeClass();
$site = $someObject->someMethod(1); // or
$sites = $someObject->someOtherMethod();
Where the resulting site entities already have their associated entities created and ready for use.
The same problem occurs when saving these objects back. Say I have a site entity and an associated home page entity, and they've both been modified; I have to do something like this:
$siteMapper->save($site);
$pageMapper->save($site->getHomePage());
Again, trivial, but this example is simplified. Duplication of code still applies.
In my mind it makes sense to have some sort of central object that could take care of:
Retrieving a site (or sites) and all necessary associated entities
Creating new site entities with new associated entities
Taking a site (or sites) and saving it and all associated entities (if they've changed)
So back to my question, what should this object be?
The existing mapper object?
Something based on the repository pattern?*
Something based on the unit of work pattern?*
Something else?
* I don't fully understand either of these, as you can probably guess.
Is there a standard way to approach this problem, and could someone provide a short description of how they'd implement it? I'm not looking for anyone to provide a fully working implementation, just the theory.
Thanks,
Jack
Using the repository/service pattern, your Repository classes would provide a simple CRUD interface for each of your entities, then the Service classes would be an additional layer that performs additional logic like attaching entity dependencies. The rest of your app then only utilizes the Services. Your example might look like this:
$site = $siteService->getSiteById(1); // or
$sites = $siteService->getAllSites();
Then inside the SiteService class you would have something like this:
function getSiteById($id) {
    $site = $this->siteRepository->getSiteById($id);
    foreach ($this->pageRepository->getPagesBySiteId($site->id) as $page)
    {
        $site->pages[] = $page;
    }
    return $site;
}
I don't know PHP that well so please excuse if there is something wrong syntactically.
[Edit: this entry attempts to address the fact that it is oftentimes easier to write custom code to directly deal with a situation than it is to try to fit the problem into a pattern.]
Patterns are nice in concept, but they don't always "map". After years of high end PHP development, we have settled on a very direct way of handling such matters. Consider this:
File: Site.php
class Site
{
    public static function Select($ID)
    {
        //Ensure current user has access to ID
        //Lookup and return data
    }

    public static function Insert($aData)
    {
        //Validate $aData
        //In the event of errors, raise a ValidationError($ErrorList)
        //Do whatever it is you are doing
        //Return new ID
    }

    public static function Update($ID, $aData)
    {
        //Validate $aData
        //In the event of errors, raise a ValidationError($ErrorList)
        //Update necessary fields
    }
}
Then, in order to call it (from anywhere), just run:
$aData = Site::Select(123);
Site::Update(123, array('FirstName' => 'New First Name'));
$ID = Site::Insert(array(...));
One thing to keep in mind about OO programming and PHP... PHP does not keep "state" between requests, so creating an object instance just to have it immediately destroyed does not often make sense.
I'd probably start by extracting the common task to a helper method somewhere, then waiting to see what the design calls for. It feels like it's too early to tell.
What would you name this method ? The name usually hints at where the method belongs.
class Page {
    public $id, $title, $url;
    public function __construct($id=false) {
        $this->id = $id;
    }
    public function save() {
        // ...
    }
}

class Site {
    public $id = '';
    public $pages = array();
    function __construct($id) {
        $this->id = $id;
        foreach ($this->getPages() as $page_id) {
            $this->pages[] = new Page($page_id);
        }
    }
    private function getPages() {
        // ...
    }
    public function addPage($url) {
        $page = ($this->pages[] = new Page());
        $page->url = $url;
        return $page;
    }
    public function save() {
        foreach ($this->pages as $page) {
            $page->save();
        }
        // ..
    }
}
$site = new Site($id);
$page = $site->addPage('/');
$page->title = 'Home';
$site->save();
Make your Site object an Aggregate Root to encapsulate the complex association and ensure consistency.
Then create a SiteRepository that has the responsibility of retrieving the Site aggregate and populating its children (including all Pages).
You will not need a separate PageRepository (assuming that you don't make Page a separate Aggregate Root), and your SiteRepository should have the responsibility of retrieving the Page objects as well (in your case by using your existing Mappers).
So:
$siteRepository = new SiteRepository($myDbConfig);
$site = $siteRepository->findById(1); // will have Page children attached
And then the findById method would be responsible for also finding all Page children of the Site. This will have a similar structure to the answer CodeMonkey1 gave; however, I believe you will benefit more from using the Aggregate and Repository patterns rather than creating a specific Service for this task. Any other retrieval/querying/updating of the Site aggregate, including any of its child objects, would be done through the same SiteRepository.
Edit: Here's a short DDD Guide to help you with the terminology, although I'd really recommend reading Evans if you want the whole picture.