How to configure multiple sitemaps using MVCSiteMapProvider v4 with StructureMap DI - asp.net-mvc-4

The problem, essentially, is that I can't get my sitemap config to support multiple sitemaps. It's always looking for "default" even when I name my instances and request another. Now for the background.
I've been poring over the docs for the new implementation of MVCSiteMapProvider. They are now using Dependency Injection to configure the SiteMapProvider. We have an existing StructureMap DI implementation, so I followed the instructions and added, in our case:
ObjectFactory.Configure(x =>
{
    ...
    x.AddRegistry<MvcSiteMapProviderRegistry>();
    ...
});
Then I started tweaking the MvcSiteMapProviderRegistry.cs file to implement my multiple-sitemap scenario. I have multiple sitemap files; either one will work as long as it's named "default". If I remove the "default" item, it breaks and complains that "default" is missing, which I assume is because it can't find my named instance. Here's how I have them defined. I suspect the problem is somewhere in here: the loader, which the docs say I have to configure in Global.asax, looks for ISiteMapLoader, but I'm adding my multiple configuration to SiteMapBuilderSet. Anyway, here's the code.
// Register the sitemap builder
string absoluteFileName = HostingEnvironment.MapPath("~/Main.sitemap");
string absoluteFileName2 = HostingEnvironment.MapPath("~/Test.sitemap");

var xmlSource = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName);

var reservedAttributeNameProvider = this.For<ISiteMapXmlReservedAttributeNameProvider>()
    .Use<SiteMapXmlReservedAttributeNameProvider>()
    .Ctor<IEnumerable<string>>("attributesToIgnore").Is(new string[0]);

var builder = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });

var xmlSource2 = this.For<IXmlSource>().Use<FileXmlSource>()
    .Ctor<string>("fileName").Is(absoluteFileName2);

var builder2 = this.For<ISiteMapBuilder>().Use<CompositeSiteMapBuilder>()
    .EnumerableOf<ISiteMapBuilder>().Contains(y =>
    {
        y.Type<XmlSiteMapBuilder>()
            .Ctor<ISiteMapXmlReservedAttributeNameProvider>().Is(reservedAttributeNameProvider)
            .Ctor<IXmlSource>().Is(xmlSource2);
        y.Type<ReflectionSiteMapBuilder>()
            .Ctor<IEnumerable<string>>("includeAssemblies").Is(includeAssembliesForScan)
            .Ctor<IEnumerable<string>>("excludeAssemblies").Is(new string[0]);
        y.Type<VisitingSiteMapBuilder>();
    });
// Configure the builder sets
this.For<ISiteMapBuilderSetStrategy>().Use<SiteMapBuilderSetStrategy>()
    .EnumerableOf<ISiteMapBuilderSet>().Contains(x =>
    {
        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("default")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */

        /* x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("MainSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder)
            .Ctor<ICacheDetails>().Is(cacheDetails); */

        x.Type<SiteMapBuilderSet>()
            .Ctor<string>("instanceName").Is("TestSiteMapProvider")
            .Ctor<bool>("securityTrimmingEnabled").Is(securityTrimmingEnabled)
            .Ctor<bool>("enableLocalization").Is(enableLocalization)
            .Ctor<ISiteMapBuilder>().Is(builder2)
            .Ctor<ICacheDetails>().Is(cacheDetails);
    });
In my global.asax.cs I added
MvcSiteMapProvider.SiteMaps.Loader = Resolver.Get<ISiteMapLoader>();
and to reference in my view I have
@Html.MvcSiteMap("TestSiteMapProvider").Menu(false, true, true)
but it must not be able to find "TestSiteMapProvider" because it always displays "default" or complains if it doesn't exist.
I also thought it might have something to do with the Cache, as I see the filename referenced there, but I don't know how to add multiple instances to the cache, so I just disabled it. I'm really not doing anything fancy with my sitemaps anyway, and this whole thing is really feeling like massive overkill just to get some flippin automatic breadcrumbs!

Apparently there was another help doc that I wasn't aware of. I had completed all of the steps above correctly, but I also needed to implement ISiteMapCacheKeyGenerator.
See this doc (which wasn't named this when I started):
https://github.com/maartenba/MvcSiteMapProvider/wiki/Multiple-Sitemaps-in-One-Application
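For anyone landing here later: as I read that wiki page, it pairs a custom ISiteMapCacheKeyGenerator with an ISiteMapCacheKeyToBuilderSetMapper so that the key you pass to Html.MvcSiteMap("...") can be resolved to a named builder set instead of always falling back to "default". A rough sketch of the mapper side only; the interface shape and the registration line are assumed from the v4 wiki, and the class name is illustrative:

// Sketch only; assumes ISiteMapCacheKeyToBuilderSetMapper exposes
// GetBuilderSetName(string cacheKey) as described on the wiki page above
// (add the appropriate MvcSiteMapProvider using for the interface).
public class CustomSiteMapCacheKeyToBuilderSetMapper : ISiteMapCacheKeyToBuilderSetMapper
{
    public virtual string GetBuilderSetName(string cacheKey)
    {
        // Map the key used in the view onto the builder set instance name
        // registered in MvcSiteMapProviderRegistry.
        switch (cacheKey)
        {
            case "TestSiteMapProvider":
                return "TestSiteMapProvider";
            default:
                return "default";
        }
    }
}

// In MvcSiteMapProviderRegistry, point the mapper registration at the custom class:
// this.For<ISiteMapCacheKeyToBuilderSetMapper>().Use<CustomSiteMapCacheKeyToBuilderSetMapper>();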

Related

Dependency Injection Access While Configuring Service Registrations in asp.net Core (3+)

I have cases, where I want to configure services based on objects which are registered in the dependency injection container.
For example I have the following registration for WS Federation:
authenticationBuilder.AddWsFederation((options) =>
{
    options.MetadataAddress = "...";
    options.Wtrealm = "...";
    options.[...] = ...
});
My goal in the above case is to use a configuration object, which is available via the DI container, to configure the WS-Federation middleware.
It looks to me like IPostConfigureOptions<> is the way to go, but so far I have not found a way to accomplish this.
How can this be done, or is it not possible?
See https://andrewlock.net/simplifying-dependency-injection-for-iconfigureoptions-with-the-configureoptions-helper/ for the I(Post)ConfigureOptions<T> way, but I find that way too cumbersome.
I generally use this pattern:
// Get my custom config section
var fooSettingsSection = configuration.GetSection("Foo");

// Parse it to my custom section's settings class
var fooSettings = fooSettingsSection.Get<FooSettings>()
    ?? throw new ArgumentException("Foo not configured");

// Register it for services who ask for an IOptions<FooSettings>
services.Configure<FooSettings>(fooSettingsSection);

// Use the settings instance
services.AddSomeOtherService(options =>
{
    options.ServiceFoo = fooSettings.ServiceFoo;
});
A little more explicit, but you have all your configuration and DI code in one place.
Of course this bypasses the I(Post)ConfigureOptions<T> entirely, so if there's other code that uses those interfaces to modify the FooSettings afterwards, my code won't notice it as it's reading directly from the configuration file. Given I control FooSettings and its users, that's no problem for me.
This should be the approach if you do want to use that interface:
First, register your custom config section that you want to pull the settings from:
var fooSettingsSection = configuration.GetSection("Foo");
services.Configure<FooSettings>(fooSettingsSection);
Then, create an options configurer:
public class ConfigureWSFedFromFooSettingsOptions
    : IPostConfigureOptions<Microsoft.AspNetCore.Authentication.WsFederation.WsFederationOptions>
{
    private readonly FooSettings _fooSettings;

    public ConfigureWSFedFromFooSettingsOptions(IOptions<FooSettings> fooSettings)
    {
        _fooSettings = fooSettings.Value;
    }

    public void PostConfigure(string name, WsFederationOptions options)
    {
        options.MetadataAddress = _fooSettings.WsFedMetadataAddress;
        options.Wtrealm = _fooSettings.WsFedWtRealm;
    }
}
And finally link the stuff together:
services.AddTransient<IPostConfigureOptions<WsFederationOptions>, ConfigureWSFedFromFooSettingsOptions>();
The configurer will get your IOptions<FooSettings> injected, instantiated from the appsettings, and then be used to further configure the WsFederationOptions.
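As the article linked earlier (about the ConfigureOptions helper) points out, you can also let the framework discover the implemented I(Post)ConfigureOptions<> interfaces for you instead of registering the interface mapping by hand. A minimal sketch, assuming a reasonably recent Microsoft.Extensions.Options:

// Registers ConfigureWSFedFromFooSettingsOptions against every
// IConfigureOptions<>/IPostConfigureOptions<> interface it implements.
services.ConfigureOptions<ConfigureWSFedFromFooSettingsOptions>();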

Mojolicious template cache is stale

I'm currently developing a small single-page Web app using Mojolicious. The app has a Javascript frontend (using Backbone) that talks to a REST-ish API; the layout of the source is roughly:
use Mojolicious::Lite;

# ... setup code ...

get '/' => sub {
    my $c = shift;
    # fetch+stash data for bootstrapped collections...
    $c->render('app_template');
};

get '/api_endpoint' => sub {
    my $c = shift;
    # fetch appropriate API data...
    $c->render(json => $response);
};

# ... more API endpoints ...

app->start;
The app template uses EP, but very minimally; the only server-side template directives just insert JSON for bootstrapped collections. It's deployed via Apache as a plain CGI script. (This isn't optimal, but it's for low-traffic internal use, and more intricate server configuration is problematic in context.) Perl CGI is configured via mod_perl.
This works most of the time, but occasionally the renderer somehow gets the idea that it should cache the template and ignore changes to it. The debug records in error_log show "Rendering cached template" rather than the normal "Rendering template", and my new changes to the template stop appearing in the browser. I can't find a reliable way to stop this, though it will eventually stop on its own according to conditions I can't discern.
How can I make the app notice template changes reliably? Alternatively, how can I disable template caching completely?
How can I make the app notice template changes reliably?
This is what the morbo development server is for. Morbo wouldn't be used for your live code deployment, but for a development environment where you are continually changing your code and templates. Generally changes to live code and templates are meant to be handled by restarting the application server, or Apache in your case. (Hypnotoad has a hot-restart capability for this purpose)
Alternatively, how can I disable template caching completely?
To do this, add the following setup code (outside of routes, after use Mojolicious::Lite):
app->renderer->cache->max_keys(0);
For old answer see below.
I turned the findings of this answer into a plugin and released it on CPAN as Mojolicious::Plugin::Renderer::WithoutCache after discussing it with Grinnz on IRC, where they encouraged a release.
You can use it like this:
use Mojolicious::Lite;
plugin 'Renderer::WithoutCache';
It will create a new Cache object that does nothing, and install that globally into the renderer. That way, it doesn't need to be created every time like my initial answer below did.
In theory, this should be faster than Grinnz' approach (which is more sensible), and since you explicitly don't want to cache, you obviously want things to be as fast as possible, right? It's supposedly faster because the real Mojo::Cache would still need to go and try to set the cache, but then abort because there are no more free keys, and it also would try to look up the values from the cache every time.
I benchmarked this with both Dumbbench and Benchmark. Both of them showed negligible results. I ran them each a couple of times, but they fluctuated a lot, and it's not clear which one is faster. I included output of a run where my implementation was faster, but it still shows how minuscule the difference is.
Benchmark with Dumbbench:
use Dumbbench;
use Mojolicious::Renderer;
use Mojolicious::Controller;
use Mojolicious::Plugin::Renderer::WithoutCache::Cache;

my $controller = Mojolicious::Controller->new;

my $renderer_zero_keys = Mojolicious::Renderer->new;
$renderer_zero_keys->cache->max_keys(0);

my $renderer_nocache = Mojolicious::Renderer->new;
$renderer_nocache->cache( Mojolicious::Plugin::Renderer::WithoutCache::Cache->new );

my $bench = Dumbbench->new(
    target_rel_precision => 0.005,
    initial_runs         => 5000,
);
$bench->add_instances(
    Dumbbench::Instance::PerlSub->new(
        name => 'max_keys',
        code => sub {
            $renderer_zero_keys->render( $controller, { text => 'foobar' } );
        }
    ),
    Dumbbench::Instance::PerlSub->new(
        name => 'WithoutCache',
        code => sub {
            $renderer_nocache->render( $controller, { text => 'foobar' } );
        }
    ),
);
$bench->run;
$bench->report;
__END__
max_keys: Ran 8544 iterations (3335 outliers).
max_keys: Rounded run time per iteration: 5.19018e-06 +/- 4.1e-10 (0.0%)
WithoutCache: Ran 5512 iterations (341 outliers).
WithoutCache: Rounded run time per iteration: 5.0802e-06 +/- 5.6e-09 (0.1%)
Benchmark with Benchmark:
use Benchmark 'cmpthese';
use Mojolicious::Renderer;
use Mojolicious::Controller;
use Mojolicious::Plugin::Renderer::WithoutCache::Cache;

my $controller = Mojolicious::Controller->new;

my $renderer_zero_keys = Mojolicious::Renderer->new;
$renderer_zero_keys->cache->max_keys(0);

my $renderer_nocache = Mojolicious::Renderer->new;
$renderer_nocache->cache( Mojolicious::Plugin::Renderer::WithoutCache::Cache->new );

cmpthese(
    -5,
    {
        'max_keys' => sub {
            $renderer_zero_keys->render( $controller, { text => 'foobar' } );
        },
        'WithoutCache' => sub {
            $renderer_nocache->render( $controller, { text => 'foobar' } );
        },
    }
);
__END__
Rate max_keys WithoutCache
max_keys 190934/s -- -2%
WithoutCache 193846/s 2% --
I reckon that in a heavy-load environment with lots of calls it would eventually make a difference, but that is very hard to prove. So if you don't want to think about the internals of the cache, this plugin might be useful.
Old answer:
Looking at Mojolicious::Plugin::EPRenderer I found out that there is a cache. It's a Mojo::Cache instance, which has the methods get, set and max_keys, and inherits from Mojo::Base (like probably everything in Mojolicious).
The ::EPRenderer gets a $renderer, which is a Mojolicious::Renderer. It holds the Mojo::Cache instance. I looked at $c with Data::Printer, and found out that there is a $c->app that holds all of those.
Knowing this, you can easily make your own cache class that does nothing.
package Renderer::NoCache;
use Mojo::Base -base;
sub get {}
sub set {}
sub max_keys {}
Now you stick it into $c.
package Foo;
use Mojolicious::Lite;

get '/' => sub {
    my $c = shift;
    $c->app->renderer->cache( Renderer::NoCache->new );
    $c->render(template => 'foo', name => 'World');
};

app->start;

__DATA__
@@ foo.html.ep
Hello <%= $name =%>.
Now every attempt to get or set the cache simply does nothing. It will try caching, but it will never find anything.
Of course it's not great to make a new object every time. It would be better to make that object once at startup and get it into the internal permanent version of app. You have CGI, so it might not make a difference.
You could also just monkey-patch the get out of Mojo::Cache. This more hacky approach will do the same thing:
package Foo;
use Mojolicious::Lite;

*Mojo::Cache::get = sub { };

get '/' => sub {
    my $c = shift;
    $c->render(template => 'foo', name => 'World');
};

app->start;
But beware: we just disabled fetching from every cache in your application that uses Mojo::Cache. This might not be what you want.

How to link to a child site-map file from a parent site map in ASP.NET MVC4 using MVCSitemapProvider?

I am using MVCSitemapProvider by Maarten Balliauw with Ninject DI in MVC4. This is a large-scale web app, and enumerating the records to generate the sitemap XML accounts for 70% of the page load time. To address that, I split the sitemap into separate files, one for each level-n dynamic node provider.
<mvcSiteMap xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-4.0"
            xsi:schemaLocation="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-4.0 MvcSiteMapSchema.xsd">
  <mvcSiteMapNode title="$resources:SiteMapLocalizations,HomeTitle" description="$resources:SiteMapLocalizations,HomeDescription" controller="Controller1" action="Home" changeFrequency="Always" updatePriority="Normal" metaRobotsValues="index follow noodp noydir">
    <mvcSiteMapNode title="$resources:SiteMapLocalizations,AboutTitle" controller="ConsumerWeb" action="Aboutus"/>
    <mvcSiteMapNode title="Sitemap" controller="Consumer1" action="SiteMap"/>
    <mvcSiteMapNode title=" " action="Action3" controller="Consumer2" dynamicNodeProvider="Comp.Controller.Utility.NinjectModules.PeopleBySpecDynamicNodeProvider, Comp.Controller.Utility" />
    <mvcSiteMapNode title="" siteMapFile="~/Mvc2.sitemap"/>
  </mvcSiteMapNode>
</mvcSiteMap>
But, it doesn't seem to work. For localhost:XXXX/sitemap.xml, the child nodes from Mvc2.sitemap don't seem to load.
siteMapFile is not a valid XML attribute in MvcSiteMapProvider (although you could use it as a custom attribute), so I am not sure what guide you are following to do this. But, the bottom line is there is no feature that loads "child sitemap files", and even if there was, it wouldn't help with your issue because all of the nodes are loaded into memory at once. Realistically on an average server there is an upper limit of around 10,000 - 15,000 nodes.
The problem that you describe is a known issue. There are some tips available in issue #258 that may or may not help.
We are working on a new XML sitemap implementation that will allow you to connect the XML sitemap directly to your data source, which can be used to circumvent this problem (at least as far as the XML sitemap is concerned). This implementation is stream-based and has paging that can be tied directly to the data source, and it will seamlessly page over multiple tables, so it is very efficient. However, although there is a working prototype, it is still some time off from being made into a release.
If you need it sooner rather than later, you are welcome to grab the prototype from this branch.
You will need some code to wire it into your application (this is subject to change for the official release). I have created a demo project here.
Application_Start
var registrar = new MvcSiteMapProvider.Web.Routing.XmlSitemapFeedRouteRegistrar();
registrar.RegisterRoutes(RouteTable.Routes, "XmlSitemap2");
XmlSitemap2Controller
using MvcSiteMapProvider.IO;
using MvcSiteMapProvider.Web.Mvc;
using MvcSiteMapProvider.Xml.Sitemap.Configuration;
using System.Web.Mvc;

public class XmlSitemap2Controller : Controller
{
    private readonly IXmlSitemapFeedResultFactory xmlSitemapFeedResultFactory;

    public XmlSitemap2Controller()
    {
        var builder = new XmlSitemapFeedStrategyBuilder();
        var xmlSitemapFeedStrategy = builder
            .SetupXmlSitemapProviderScan(scan => scan.IncludeAssembly(this.GetType().Assembly))
            .AddNamedFeed("default", feed => feed.WithMaximumPageSize(5000).WithContent(content => content.Image().Video()))
            .Create();
        var outputCompressor = new HttpResponseStreamCompressor();
        this.xmlSitemapFeedResultFactory = new XmlSitemapFeedResultFactory(xmlSitemapFeedStrategy, outputCompressor);
    }

    public ActionResult Index(int page = 0, string feedName = "")
    {
        var name = string.IsNullOrEmpty(feedName) ? "default" : feedName;
        return this.xmlSitemapFeedResultFactory.Create(page, name);
    }
}
IXmlSitemapProvider
And you will need one or more IXmlSitemapProvider implementations. For convenience, there is a base class, XmlSitemapProviderBase. These are similar to creating controllers in MVC.
using MvcSiteMapProvider.Xml.Sitemap;
using MvcSiteMapProvider.Xml.Sitemap.Specialized;
using System;
using System.Linq;

public class CategoriesXmlSitemapProvider : XmlSitemapProviderBase, IDisposable
{
    private EntityFramework.MyEntityContext db = new EntityFramework.MyEntityContext();

    // This is optional. Don't override it if you don't want to use last modified date.
    public override DateTime GetLastModifiedDate(string feedName, int skip, int take)
    {
        // Get the latest date in the specified page
        return db.Category.OrderBy(x => x.Id).Skip(skip).Take(take).Max(c => c.LastUpdated);
    }

    public override int GetTotalRecordCount(string feedName)
    {
        // Get the total record count for all pages
        return db.Category.Count();
    }

    public override void GetUrlEntries(IUrlEntryHelper helper)
    {
        // Do not call ToList() on the query. The idea is that we want to force
        // EntityFramework to use a DataReader rather than loading all of the data
        // at once into RAM.
        var categories = db.Category
            .OrderBy(x => x.Id)
            .Skip(helper.Skip)
            .Take(helper.Take);

        foreach (var category in categories)
        {
            var entry = helper.BuildUrlEntry(string.Format("~/Category/{0}", category.Id))
                .WithLastModifiedDate(category.LastUpdated)
                .WithChangeFrequency(MvcSiteMapProvider.ChangeFrequency.Daily)
                .AddContent(content => content.Image(string.Format("~/images/category-image-{0}.jpg", category.Id)).WithCaption(category.Name));
            helper.SendUrlEntry(entry);
        }
    }

    public void Dispose()
    {
        db.Dispose();
    }
}
Note that there is currently no IXmlSitemapProvider implementation that reads the nodes from the default (or any) SiteMap, but creating one is similar to what is shown above, except you would query the SiteMap for nodes instead of a database for records.
Alternatively, you could use a 3rd party XML sitemap generator. Although, nearly all of them are set up in a non-scalable way for large sites, and most leave it up to you to handle the paging. If they aren't streaming the nodes, it will not realistically scale to more than a few thousand URLs.
The other detail you might need to take care of is to use the forcing a match technique to reduce the total number of nodes in the SiteMap. If you are using the Menu and/or SiteMap HTML helpers, you will need to leave all of your high-level nodes alone. But any node that does not appear in either is a good candidate for this. Realistically, nearly any data-driven site can be reduced to a few dozen nodes using this technique, but keep in mind every node that is forced to match multiple routes in the SiteMap means that individual URL entries will need to be added in the XML sitemap.
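For example, the forcing-a-match technique usually comes down to marking a single node with preservedRouteParameters so it matches every value of that route parameter, rather than needing one node per record. A sketch; the controller, action, and parameter names are illustrative:

<!-- One node matches /Category/Details/1, /Category/Details/2, ... -->
<mvcSiteMapNode title="Category Details" controller="Category" action="Details"
                preservedRouteParameters="id" />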

What options do I have for automating bindings with NInject

Rather than manually having to bind every class, what methods and patterns, if any, are recommended for automatically setting up bindings?
For example, the vast majority of bindings simply look like this:
Bind<ICustomerRepository>().To<CustomerRepository>();
Once modules get large, you can end up with 100s of bindings that all look exactly the same. Can this be automated?
Check out the conventions extension:
https://github.com/ninject/ninject.extensions.conventions
using (IKernel kernel = new StandardKernel())
{
    var scanner = new AssemblyScanner();
    scanner.From(Assembly.GetExecutingAssembly());
    scanner.BindWith<DefaultBindingGenerator>();
    kernel.Scan(scanner);

    var instance = kernel.Get<IDefaultConvention>();
    instance.ShouldNotBeNull();
    instance.ShouldBeInstanceOf<DefaultConvention>();
}
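For reference, newer releases of the conventions extension also expose a fluent syntax directly on the kernel. A short sketch, assuming Ninject.Extensions.Conventions 3.x:

using Ninject;
using Ninject.Extensions.Conventions;

var kernel = new StandardKernel();

// Bind every concrete class in this assembly to its default interface,
// e.g. CustomerRepository -> ICustomerRepository.
kernel.Bind(x => x
    .FromThisAssembly()
    .SelectAllClasses()
    .BindDefaultInterface());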

Encapsulating common logic (domain driven design, best practices)

Updated: 09/02/2009 - Revised question, provided better examples, added bounty.
Hi,
I'm building a PHP application using the data mapper pattern between the database and the entities (domain objects). My question is:
What is the best way to encapsulate a commonly performed task?
For example, one common task is retrieving one or more site entities from the site mapper, and their associated (home) page entities from the page mapper. At present, I would do that like this:
$siteMapper = new Site_Mapper();
$site = $siteMapper->findByid(1);
$pageMapper = new Page_Mapper();
$site->addPage($pageMapper->findHome($site->getId()));
Now that's a fairly trivial example, but it gets more complicated in reality, as each site also has an associated locale, and the page actually has multiple revisions (although for the purposes of this task I'd only be interested in the most recent one).
I'm going to need to do this (get the site and associated home page, locale, etc.) in multiple places within my application, and I can't think of the best way/place to encapsulate this task so that I don't have to repeat it all over the place. Ideally I'd like to end up with something like this:
$someObject = new SomeClass();
$site = $someObject->someMethod(1); // or
$sites = $someObject->someOtherMethod();
Where the resulting site entities already have their associated entities created and ready for use.
The same problem occurs when saving these objects back. Say I have a site entity and associated home page entity, and they've both been modified, I have to do something like this:
$siteMapper->save($site);
$pageMapper->save($site->getHomePage());
Again, trivial, but this example is simplified. Duplication of code still applies.
In my mind it makes sense to have some sort of central object that could take care of:
Retrieving a site (or sites) and all necessary associated entities
Creating new site entities with new associated entities
Taking a site (or sites) and saving it and all associated entities (if they've changed)
So back to my question, what should this object be?
The existing mapper object?
Something based on the repository pattern?*
Something based on the unit of work patten?*
Something else?
* I don't fully understand either of these, as you can probably guess.
Is there a standard way to approach this problem, and could someone provide a short description of how they'd implement it? I'm not looking for anyone to provide a fully working implementation, just the theory.
Thanks,
Jack
Using the repository/service pattern, your Repository classes would provide a simple CRUD interface for each of your entities, then the Service classes would be an additional layer that performs additional logic like attaching entity dependencies. The rest of your app then only utilizes the Services. Your example might look like this:
$site = $siteService->getSiteById(1); // or
$sites = $siteService->getAllSites();
Then inside the SiteService class you would have something like this:
function getSiteById($id) {
    $site = $siteRepository->getSiteById($id);
    foreach ($pageRepository->getPagesBySiteId($site->id) as $page)
    {
        $site->pages[] = $page;
    }
    return $site;
}
I don't know PHP that well so please excuse if there is something wrong syntactically.
[Edit: this entry attempts to address the fact that it is oftentimes easier to write custom code to directly deal with a situation than it is to try to fit the problem into a pattern.]
Patterns are nice in concept, but they don't always "map". After years of high end PHP development, we have settled on a very direct way of handling such matters. Consider this:
File: Site.php
class Site
{
    public static function Select($ID)
    {
        //Ensure current user has access to ID
        //Lookup and return data
    }

    public static function Insert($aData)
    {
        //Validate $aData
        //In the event of errors, raise a ValidationError($ErrorList)
        //Do whatever it is you are doing
        //Return new ID
    }

    public static function Update($ID, $aData)
    {
        //Validate $aData
        //In the event of errors, raise a ValidationError($ErrorList)
        //Update necessary fields
    }
}
Then, in order to call it (from anywhere), just run:
$aData = Site::Select(123);
Site::Update(123, array('FirstName' => 'New First Name'));
$ID = Site::Insert(array(...));
One thing to keep in mind about OO programming and PHP... PHP does not keep "state" between requests, so creating an object instance just to have it immediately destroyed does not often make sense.
I'd probably start by extracting the common task to a helper method somewhere, then waiting to see what the design calls for. It feels like it's too early to tell.
What would you name this method ? The name usually hints at where the method belongs.
class Page {
    public $id, $title, $url;

    public function __construct($id=false) {
        $this->id = $id;
    }

    public function save() {
        // ...
    }
}
class Site {
    public $id = '';
    public $pages = array();

    function __construct($id) {
        $this->id = $id;
        foreach ($this->getPages() as $page_id) {
            $this->pages[] = new Page($page_id);
        }
    }

    private function getPages() {
        // ...
    }

    public function addPage($url) {
        $page = ($this->pages[] = new Page());
        $page->url = $url;
        return $page;
    }

    public function save() {
        foreach ($this->pages as $page) {
            $page->save();
        }
        // ..
    }
}
$site = new Site($id);
$page = $site->addPage('/');
$page->title = 'Home';
$site->save();
Make your Site object an Aggregate Root to encapsulate the complex association and ensure consistency.
Then create a SiteRepository that has the responsibility of retrieving the Site aggregate and populating its children (including all Pages).
You will not need a separate PageRepository (assuming that you don't make Page a separate Aggregate Root), and your SiteRepository should have the responsibility of retrieving the Page objects as well (in your case by using your existing Mappers).
So:
$siteRepository = new SiteRepository($myDbConfig);
$site = $siteRepository->findById(1); // will have Page children attached
And then the findById method would be responsible for also finding all Page children of the Site. This will have a similar structure to the answer CodeMonkey1 gave, however I believe you will benefit more by using the Aggregate and Repository patterns, rather than creating a specific Service for this task. Any other retrieval/querying/updating of the Site aggregate, including any of its child objects, would be done through the same SiteRepository.
Edit: Here's a short DDD Guide to help you with the terminology, although I'd really recommend reading Evans if you want the whole picture.