Multiple sites in one Piranha-CMS

I'm planning to run multiple sites in one instance of Piranha-CMS. The sites are small (< 20 pages) and all have their own layout.
According to this line in the documentation, I have to create multiple IIS websites:
"A note on caching: As most meta-data entities are memory cached on the server side to ensure scalability it is necessary to create a site in IIS for each site in the manager interface."
Is this still true for the current release, 2.2.1? If so, is there a workaround so I can still use multiple Piranha sites under one IIS site, for example by using unique page types for each site?

This issue has been resolved and the documentation should be updated. In previous versions there was a bug that caused cache errors on the start pages of the different navigation trees.
Regards
Håkan

Related

DotNetNuke URL tracking

I have an internal DNN (5) website for which I need to track URLs. I am unable to use any external analytics (Google or otherwise), so I have to rely on the internal installation as-is. I have also tried a tracking tool, but it does not give me what I need, which is really just raw page and user info. I have also looked at the [UrlLog], [Urls] and [UrlTracking] tables, but even though I am navigating pages, nothing is listed there. The [eventlog] is logging events, so I know I am on the right site. I don't necessarily want to create new tables and log events separately if DNN already logs this anyway. I would prefer to just be able to query it directly via SQL (or via the reports module).
Any guidance will be appreciated.
You can enable the Site Log in Host settings; that will start tracking a lot of information in DNN. By default it is set to 0 days, which won't store any history.
For existing portals, you will need to go into each portal's individual settings and update the Site Log to reflect the correct number of days; the Host setting is only applied to newly created sites.
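
Once the Site Log is storing history, you should be able to query the raw page/user data directly, as the asker prefers. The following is only a sketch under assumptions: the dbo.SiteLog table and its [DateTime], UserId, TabId and Referrer columns match a typical DNN 5 install, but verify the names against your own database, and adjust the connection string and portal ID.

```csharp
// Hedged sketch: read recent Site Log entries straight from the DNN database.
// Assumptions: table dbo.SiteLog with [DateTime], UserId, TabId and Referrer
// columns (typical for DNN 5); verify against your own schema.
using System;
using System.Data.SqlClient;

class SiteLogReport
{
    static void Main()
    {
        // Adjust to your own DNN database.
        const string connStr = "Server=.;Database=DotNetNuke;Integrated Security=true";
        const string sql = @"
            SELECT TOP 100 [DateTime], UserId, TabId, Referrer
            FROM dbo.SiteLog
            WHERE PortalId = @portalId
            ORDER BY [DateTime] DESC";

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@portalId", 0); // 0 = the default portal
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}  user={1}  tab={2}  referrer={3}",
                        reader.GetDateTime(0),
                        reader.IsDBNull(1) ? "?" : reader.GetInt32(1).ToString(),
                        reader.IsDBNull(2) ? "?" : reader.GetInt32(2).ToString(),
                        reader.IsDBNull(3) ? "" : reader.GetString(3));
                }
            }
        }
    }
}
```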

Preventing duplicate sites / pages / layouts / templates / web parts: is it possible?

We have a SharePoint environment with many sites (and sometimes many site collections). Each site (or site collection) has the same default page with some custom web parts that use site column values (for example a project code or client code) to show information from external systems. (For other reasons, we have to create a separate site (or site collection) for each project.)
What is the best approach to minimize duplication? The dynamic parts of the page are stored in site columns. When we add a new web part, ideally the default page of every site should show it without us having to push the update to each individual page.
Thanks
One approach you may want to take is to use the web part as a wrapper for a user control. The user control does the heavy lifting on the site. Once the web part is included on your pages, the user control should be able to tell which site it is being executed on and pull the necessary dynamic data from your site columns.
When you need to make updates, you update the user control and then redeploy the solution package to the farm. Each site will pick up the change as soon as the solution is deployed.
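
To make the pattern concrete, here is a minimal sketch of such a wrapper web part. The control path and type names are hypothetical examples, not code from the linked articles:

```csharp
// Minimal sketch of the wrapper pattern: the web part only loads a user
// control; all markup and logic live in the .ascx, so redeploying the
// solution updates every site at once. The path below is a hypothetical
// example of a control deployed to the CONTROLTEMPLATES folder.
using System.Web.UI;
using System.Web.UI.WebControls.WebParts;

public class ProjectInfoWebPart : WebPart
{
    private const string ControlPath = "~/_controltemplates/ProjectInfoControl.ascx";

    protected override void CreateChildControls()
    {
        // The user control can inspect the current site (e.g. via SPContext)
        // and pull the dynamic values, such as a project code, from site columns.
        Control control = Page.LoadControl(ControlPath);
        Controls.Add(control);
    }
}
```

Because the web part itself never changes, updating the .ascx and redeploying is enough for every page that hosts it.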
Here is a little information about this approach:
http://msdn.microsoft.com/en-us/library/ff649867.aspx
The above article relates to WSS 3.0, but that should give you a starting point.
An approach you may want to look at for SharePoint 2010 is a visual web part. More info can be found here: http://msdn.microsoft.com/en-us/library/ff597539.aspx.

What's the best practice for using subdomains to achieve SEO, keep the system scalable, and isolate the applications?

We are developing a website quite similar to ebay.com, and in order to upgrade/maintain it without much effort we decided to split/isolate different parts of the website like eBay does (e.g. the item page/application will be served from cgi.domain.com, the sign-in application from signin.domain.com, the shopping cart application from offer.domain.com, search features from search.domain.com, etc.). Each major application/function of the site will be deployed on a different server. Another reason for isolating the applications is security.
I should also mention that one application is deployed on Google App Engine.
However, we received some "warnings" that this will dramatically affect SEO, so I have two questions :)
Is it true? Do the subdomains decrease the PageRank of the website?
If it's true, how can we sort this out? Should we use a separate server acting as a routing proxy that rewrites requests (e.g. search.domain.com => domain.com/search, etc.)?
What's the best practice to achieve simplicity/isolation of the applications + SEO + security + scalability in a website?
Thank you in advance!
Search engines no longer see subdomains as separate sites; this changed around September 2011. Whether your link juice carries over is another matter, and it's not really explained (as of yet). Here is a reference: http://searchengineland.com/new-google-classifies-subdomains-as-internal-links-within-webmaster-tools-91401
No, multiple subdomains will not decrease page rank of the main website. However, they don't contribute to page rank either (because the search engines see them as separate sites).
However, for the sort of site that you're working on, that should be OK. For example, the only thing you really want indexed is the product listings anyway; you don't need the login pages, search results and so on to be indexed. And since external websites aren't going to link to your login pages or search results either (I assume they'll only link to product pages as well), you don't really care about those other subdomains contributing to your PageRank.
Personally, I think people put too much focus on making sites "SEO" friendly. As long as the site is user-friendly then SEO-friendly will follow as well.

Difference in website display depending on domain

I'm getting a slightly different display of a website depending on which URL I use to access it (two different servers, both serving the same files). One looks "thinner" than the other in Firefox 3.0 (no discernible difference in IE).
For example, compare:
http://www.ece.ualberta.ca/support/
and
http://www1.ece.ualberta.ca/support/
This is not a major issue, but I just noticed this and am extremely curious as to what could cause it. Is it some kind of Firefox bug? I haven't yet tried the newest version.
EDIT: My bad for assuming those URLs were actually serving the same content (it's not my server, but I do attend that school). Comparing:
http://www.ece.ualberta.ca/~ecegsa/links.html (it seems this server is down at the moment) and http://www1.ece.ualberta.ca/~ecegsa/links.html
shows the same issue, but the HTML is identical according to diff run on the saved files. I don't see the problem in anything other than FF 3.0 at my work, so I'm guessing it's some idiosyncrasy of that browser. Still curious though.
Looking briefly at those two URLs, they're serving different HTML!
For example, http://www.ece.ualberta.ca/support/ has this text:
Windows Vista/7 (volume license)
Activation
While http://www1.ece.ualberta.ca/support/ has this text:
Windows Vista (volume license)
Activation
I suspect that different HTML accounts for the difference you're seeing.
If these are actually the same servers hosting the same content, this kind of disparity could be caused by intermediate caches (e.g. proxies, CDNs, etc.) refreshing at different rates. For example, if www points to a load-balancing caching proxy and www1 points directly to the host, that may cause the difference. You might also be seeing a bug or lag in how content is propagated to the different servers in a load-balanced cluster.
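
One way to narrow this down is to fetch the same path from both host names and compare a hash of the body plus any cache-related response headers. A quick sketch, using the URLs from the question (headers like Via or X-Cache only appear if a proxy adds them, so their absence proves nothing):

```csharp
// Hedged sketch: compare the two hosts' responses for the same path.
// Different body hashes mean different content; Via/Age/X-Cache headers
// usually betray an intermediate proxy or CDN.
using System;
using System.Net;
using System.Security.Cryptography;

class CompareHosts
{
    static void Main()
    {
        string[] urls = {
            "http://www.ece.ualberta.ca/support/",
            "http://www1.ece.ualberta.ca/support/"
        };

        foreach (var url in urls)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            using (var response = (HttpWebResponse)request.GetResponse())
            using (var stream = response.GetResponseStream())
            using (var md5 = MD5.Create())
            {
                Console.WriteLine("{0}", url);
                Console.WriteLine("  body MD5: {0}",
                    BitConverter.ToString(md5.ComputeHash(stream)));
                foreach (var name in new[] { "Via", "Age", "X-Cache", "Last-Modified" })
                    Console.WriteLine("  {0}: {1}", name, response.Headers[name] ?? "(none)");
            }
        }
    }
}
```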

Website migration and differences in Firebug time profiles

I have a PHP website under Apache (at enginehosting.com). I rewrote it in ASP.NET MVC and installed it at discountasp.net. I am comparing response times with Firebug.
Here is the old time profile:
Here is the new one:
Basically, I get longer response times with the new site (not obvious in the pictures I posted here, but on average yes, sometimes with a big difference, like 2 s for the old site and 9 s for the new one) and images appear more progressively (as opposed to almost instantly with the old site). Moreover, the time profile is completely different. As you can see in the second picture, a long time is spent in DNS lookup, and this happens for images only (the raw HTML is even faster on the new site). I thought that once a URL had been resolved, the result would be reused for all subsequent requests...
Also note that since I still want to keep my domain pointed at the old location while I'm testing, my new site is under a weird URL like myname.web436.discountasp.net. Could that be the reason? If not, what else?
If this is more of a Server Fault question, feel free to move it.
Thanks
Unfortunately you're comparing apples and oranges here. The test results shown are of little use because you're trying to compare the performance of applications written using different technologies AND running on different hosting companies' shared platforms.
We could speculate any number of reasons why there may be a difference:
A slow ASP.NET MVC first hit due to warm-up and compilation
The server that you're hosting on at DiscountASP may be under heavy load
The server at EngineHosting may be under utilised
The bandwidth available at DiscountASP may be under contention
You perhaps need to profile and optimise your code
...and so on.
But until you benchmark both applications on the same machine, you're not making a proper scientific comparison and are just clutching at straws.
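
As a first step towards a fairer comparison, you could at least time both deployments repeatedly from the same client machine, so the client side is held constant. A rough sketch follows; the URLs are placeholders, and remember that the first hit to the ASP.NET site includes warm-up time:

```csharp
// Hedged sketch: crude client-side timing of repeated requests to each site.
// It holds the client constant but still cannot separate application speed
// from the hosting platform; only same-machine benchmarks can do that.
using System;
using System.Diagnostics;
using System.Net;

class ResponseTimer
{
    static void Main()
    {
        string[] urls = {
            "http://old-site.example.com/",          // placeholder for the old site
            "http://myname.web436.discountasp.net/"  // temporary test URL
        };

        const int runs = 10;
        foreach (var url in urls)
        {
            long totalMs = 0;
            for (int i = 0; i < runs; i++)
            {
                var sw = Stopwatch.StartNew();
                var request = (HttpWebRequest)WebRequest.Create(url);
                using (var response = (HttpWebResponse)request.GetResponse())
                using (var stream = response.GetResponseStream())
                {
                    // Drain the body so transfer time is part of the measurement.
                    var buffer = new byte[8192];
                    while (stream.Read(buffer, 0, buffer.Length) > 0) { }
                }
                sw.Stop();
                totalMs += sw.ElapsedMilliseconds;
            }
            Console.WriteLine("{0}: avg {1} ms over {2} runs", url, totalMs / runs, runs);
        }
    }
}
```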
Finally, ignore the myname.web436.discountasp.net URL; that's just a host name/header that DiscountASP and many other hosts add so you can test your site while you're waiting for a domain to be transferred/registered, or for DNS propagation of the real domain name to complete. You usually can't use the IP address of your site because most shared hosts share a single IP address across multiple sites on the same server and rely on HTTP Host headers.