SharePoint 2010: what's the recommended way to store news?

To store news for a news site, what's a good recommendation?
So far, I'm opting for creating a News Site, mainly because I get some web parts for free (RSS, "week in pictures"), workflows are already in place, and the authoring experience in SharePoint seems reasonable.
On the other hand, I see that by just creating a Document Library I can store Word documents based on the "Newsletter" template and saved as web pages; they look great, and the authoring experience in Word is better than it is in SharePoint.
And what about just creating a blog site!
Anyway, what would people do? Am I missing a crucial factor here for one or the other? What's a good trade-off here?
Thanks!

From my experience, the best option would be to:
Create a new News Site.
Create a custom content type with properties like Region (Choice), Category (Choice), Show on homepage (Boolean), Summary (Note), etc.
Create a custom page layout attached to the above content type, and give it the look and feel you want your news articles to have.
Make that page layout (and its content type) the default for the Pages library of the News site.
The advantage of this approach is that you can use a Content Query Web Part (CQWP) on the home page to show, say, the latest five articles. You can also show a one-liner or a picture if you make that a property of the custom content type as well.
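For illustration, a rough sketch of how those fields might be declared in CAML; the internal names and GUIDs are placeholders, and the content type itself would inherit from the publishing Page content type:

<Field ID="{11111111-1111-1111-1111-111111111111}" Name="NewsRegion" DisplayName="Region" Type="Choice">
  <CHOICES>
    <CHOICE>EMEA</CHOICE>
    <CHOICE>Americas</CHOICE>
    <CHOICE>APAC</CHOICE>
  </CHOICES>
</Field>
<Field ID="{22222222-2222-2222-2222-222222222222}" Name="NewsCategory" DisplayName="Category" Type="Choice">
  <CHOICES>
    <CHOICE>Press release</CHOICE>
    <CHOICE>Internal</CHOICE>
  </CHOICES>
</Field>
<Field ID="{33333333-3333-3333-3333-333333333333}" Name="ShowOnHomepage" DisplayName="Show on homepage" Type="Boolean" />
<Field ID="{44444444-4444-4444-4444-444444444444}" Name="NewsSummary" DisplayName="Summary" Type="Note" />

The CQWP on the home page can then filter on Show on homepage and display the Summary field.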
By storing news in Word documents, you are not really using SharePoint as a publishing environment but only as a repository. The choice is yours.

D. All of the above
SharePoint gives you a lot of options because there is no one-size-fits-all solution that works for everyone. The flexibility is not meant to overwhelm you with choices, but to let you focus on your process, either as it exists now or as you want it to be, and then select the option that best fits that process.
My company's intranet is a team site and news is placed in an Announcements list. We do not need anything flashy; the plain text just needs to be communicated to the employees. On the other hand, our public internet site is a publishing site, which gives our news pages a more finished touch in terms of styling and images. It also allows us to take advantage of scheduling, content roll-up, and friendly URLs, along with the security of locking down the view forms. Authoring and publishing such a page is more involved than the Announcements list, but each option perfectly fits what we want to accomplish in each environment.
Without knowing more about your needs or process, based only on your highlighting Word as the preferred authoring tool, I would recommend a Blog. It is not as fully featured as a publishing site, but there is some overlap. And posts can be authored in Word.
In the end, if you can list what you want to accomplish and how you want to accomplish it, and pick the closest option (News Site, Team Site, Publishing Site, Blog, Wiki, etc.), then you will have made the correct choice.

I tend to use publishing sites for news, for the reasons you mention and for the page editing features.
They also let you set scheduled go-live and unpublish dates, which is pretty much critical for news items.

Related

Google duplicate content issue for social network applications

I am making a social network application where users come and share posts, like on Facebook. But now I have some doubts: let's say a user shares content by copying it from another site, and the same with images. Does the Google crawler consider that duplicate content or not?
If yes, how can I tell the Google crawler, "don't consider it spam; it's a social networking site and the content is shared by the user, not by me"? Is there any way or technique that could help me?
Google might consider it to be duplicate content, in which case the search algorithm will choose one version, the one it believes to be the original or more important, and drop the other.
This isn't a bad thing per se - unless you see that most of your site's content is becoming duplicated.
You can use canonical URL declarations to do what you are describing, but I wouldn't advise it.
If your website belongs to one of these types, forum or e-commerce, it will not be punished for duplicate content. I think a "social platform" is a type of forum.
If your pages are too similar, the two or more similar pages will split the click-through rate, link flow, etc., so their rankings in the SERPs may suffer.
I suggest not using "canonical", because that instruction tells crawlers not to index or count the page separately. If you use it, you will see the number of indexed pages drop a lot in Webmaster Tools.
Do not worry too much about the duplicate content issue. See this article: Google's Matt Cutts: Duplicate Content Won't Hurt You, Unless It Is Spammy

Google displaying website title differently in search results

Google displays my website's page title differently from how it is meant to appear.
The page title should be:
Graphic Designer Brighton and Lewes | Lewis Wallis Graphic Design
It displays fine in Bing, Yahoo and on my actual website.
However, Google displays it differently:
Lewis Wallis Graphic Design: Graphic Designer Brighton and Lewes
This is annoying as I want my keywords "graphic designer brighton" to go before my name.
I am using the Yoast SEO plugin and my only suspicion is that there might be a conflict between that and my theme, Workality.
Has anyone got any suggestions as to why this might be happening?
Google Search may change the page titles it shows on the results page (it has done this since January 2012):
We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one. But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages.
See also the documentation at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35624:
Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. The goal of the snippet and title is to best represent and describe each result and explain how it relates to the user's query.
[…]
While we can't manually change titles or snippets for individual sites, we're always working to make them as relevant as possible.
In my answer on Webmasters SE I linked to questions from people having the same issue.
Is it possible that you changed the title, or installed the plugin, and Google hasn't picked up the changes yet?
It can take a few weeks for Google to pick up changes to your site, depending on how often it spiders it. The HTML looks fine so I can only think that Google hasn't got round to picking up the changes yet.

SEO: Allowing crawler to index all pages when only few are visible at a time

I'm working on improving the site for SEO purposes and hit an interesting issue. The site, among other things, includes a large directory of individual items (it doesn't really matter what these are). Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
The directory is large, with about 100,000 items in it. Naturally, any given page only lists a few items. For example, the main site homepage links to about 5 or 6 items, some other page links to about a dozen different items, etc.
When real users visit the site, they can use the search form to find items by keyword or location, and a list matching their search criteria is produced. However, when a Google crawler, for example, visits the site, it won't attempt to put text into the keyword search field and submit the form. So as far as the bot is concerned, after indexing the entire site it has covered only a few dozen items at best. Naturally, I want it to index each individual item separately. What are my options here?
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load, and I'm not sure how Googlebot would react to this.
Any other things I can do? What are best practices here?
Thanks in advance.
One thing I considered is to check the user agent and IP ranges and, if the requester is a bot (as best I can tell), add a div to the end of the most relevant page with links to each individual item. Yes, this would be a huge page to load, and I'm not sure how Googlebot would react to this.
That would be a very bad thing to do. Serving up different content to the search engines specifically for their benefit is called cloaking and is a great way to get your site banned. Don't even consider it.
Whenever a webmaster is concerned about getting their pages indexed, having an XML sitemap is an easy way to ensure the search engines are aware of the site's content. Sitemaps are very easy to create and update, too, if your site is database driven. The XML file does not have to be static, so you can produce it dynamically whenever the search engines request it (Google, Yahoo, and Bing all support XML sitemaps). You can find out more about XML sitemaps at sitemaps.org.
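As an illustration, a sitemap for the item pages could look roughly like this (the id, title, and date are made-up values; you would emit one <url> entry per item from the database):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/item.php/123/example-item-title</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <!-- one <url> element per item -->
</urlset>

Note that a single sitemap file is limited to 50,000 URLs, so with around 100,000 items you would split it into several files referenced from a sitemap index. You can also point crawlers at it with a Sitemap: line in robots.txt in addition to submitting it through the webmaster tools.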
If you want to make your content available to search engines and want to benefit from semantic markup (i.e. HTML), you should also make sure all of your content can be reached through hyperlinks (in other words, not only through form submissions or JavaScript). The reason for this is twofold:
The anchor text in the links to your items will contain the keywords you want to rank well for. This is one of the more heavily weighted ranking factors.
Links count as "votes", especially to Google. Links from external websites, especially related websites, are what you'll hear people recommend the most and for good reason. They're valuable to have. But internal links carry weight, too, and can be a great way to prop up your internal item pages.
(Bonus) Google has PageRank, which used to be a huge part of their ranking algorithm but plays only a small part now. It still has value, though, and links "pass" PageRank to each page they link to, increasing the PageRank of that page. When you have as many pages as you do, that's a lot of potential PageRank to pass around. If you build your site well you could probably get your home page to a PageRank of 6 from internal linking alone.
Having an HTML sitemap that links to all of your products is a great way to ensure that search engines, and users, can easily find all of your products. It is also recommended that you structure your site so that more important pages are closer to the root of your website (the home page), branching out to sub-pages (categories) and then to specific items. This gives search engines an idea of which pages are important and helps them organize them (which helps them rank them). It also helps them follow those links from top to bottom and find all of your content.
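A bare-bones HTML sitemap or category index page along those lines needs nothing more than plain, keyword-bearing links (the paths and titles below are just examples):

<h2>Category A</h2>
<ul>
  <li><a href="/item.php/123/example-item-title">Example Item Title</a></li>
  <li><a href="/item.php/124/another-item-title">Another Item Title</a></li>
</ul>
<!-- repeat per category, split across several pages if it gets too long -->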
Each item has its own details page, which is accessed via
http://www.mysite.com/item.php?id=item_id
or
http://www.mysite.com/item.php/id/title
This is also bad for SEO. When you can pull up the same page using two different URLs you have duplicate content on your website. Google is on a crusade to increase the quality of their index and they consider duplicate content to be low quality. Their infamous Panda Algorithm is partially out to find and penalize sites with low quality content. Considering how many products you have it is only a matter of time before you are penalized for this. Fortunately the solution is easy. You just need to specify a canonical URL for your product pages. I recommend the second format as it is more search engine friendly.
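In practice that just means emitting the same tag in the <head> of the item page regardless of which URL variant was requested (the id and title here are example values):

<link rel="canonical" href="http://www.mysite.com/item.php/123/example-item-title" />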
Read my answer to an SEO question at the Pro Webmaster's site for even more information on SEO.
I would suggest, for starters, having an XML sitemap. Generate a list of all your pages and submit it to Google via Webmaster Tools. It wouldn't hurt to have a "friendly" sitemap either: a page linked from the front page that lists all these pages, preferably by category too.
If you're concerned with SEO, then having links to your pages is hugely important. Google could see your page, think "wow, awesome!", and give you lots of authority; this authority (some like to call it "link juice") is then passed down to the pages linked from it. You ought to make a hierarchy of pages, with more important ones closer to the top, and/or make the hierarchy wide instead of deep.
Also, showing different stuff to the Google crawler than the "normal" visitor can be harmful in some cases, if Google thinks you're trying to con it.
Sorry, a little bias towards Google here, but the other engines are similar.

Prevent duplicate site / page / layouts / templates / webparts, possible?

We have a SharePoint environment with many sites (and sometimes many site collections). Each site (or site collection) has the same default page with some custom web parts that use site column values (for example a project code or client code) to show information from external systems. (For each project we have to create a separate site or site collection for other reasons.)
What is the best approach to minimize duplication? The dynamic parts of the page are stored in site columns. When we add a new web part, ideally the default page of every site should show it without us having to push the update to the individual pages.
Thanks
One approach you may want to take is to use the web part as a wrapper for a user control. The user control does the heavy lifting on the site. Once the web part is included on your pages, the user control should be able to tell which site it is being executed on and pull the necessary dynamic data from your site columns.
When you need to make updates, you update the user control and then redeploy the solution package to the farm. Each site will pick up the change as soon as the solution is deployed.
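A minimal sketch of that wrapper pattern in C# (the class name, the .ascx path, and the ProjectCode column are hypothetical; this assumes a farm solution that deploys the user control to CONTROLTEMPLATES):

using System.Web.UI;
using System.Web.UI.WebControls.WebParts;

// Web part that does nothing but host the user control
public class ProjectInfoWebPart : WebPart
{
    // Deployed by the solution package to {SharePointRoot}\TEMPLATE\CONTROLTEMPLATES\Contoso
    private const string AscxPath = "~/_CONTROLTEMPLATES/Contoso/ProjectInfo.ascx";

    protected override void CreateChildControls()
    {
        Controls.Add(Page.LoadControl(AscxPath));
    }
}

// In the user control's code-behind you can read the current site's columns, e.g.
//   string projectCode = SPContext.Current.ListItem["ProjectCode"] as string;
// and use that value to query the external systems.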
Here is a little information about this approach:
http://msdn.microsoft.com/en-us/library/ff649867.aspx.
The above article relates to WSS 3.0, but that should give you a starting point.
An approach you may want to look at for SharePoint 2010 is a visual web part. More info can be found here: http://msdn.microsoft.com/en-us/library/ff597539.aspx.

How to setup a simple site search for a simple website?

I'm maintaining an existing website that wants a site search. I implemented the search using the Yahoo API. The problem is that the API is returning irrelevant results. For example, there is a sidebar with a list of places, and if a user searches for "New York" the top results will be pages that do not have "New York" in the main content section. I tried adding Yahoo's class="robots-nocontent" to the sidebar; however, that was two weeks ago and there has been no update.
I also tried out Google's Search API but am having the same problem.
This site has mostly static content and about 50 pages total so it is very small.
How can I implement a simple search that only searches the main content portions of the page?
At the risk of sounding completely self-promoting as well as pushing yet another API on you, I wrote a blog post about implementing Bing for your site using jQuery.
The advantage in using the jQuery approach is that you can tune the results quite specifically based on filters passed to the API and playing around with the JSON (or XML / SOAP if you prefer) result Bing returns, as well as having the ability to be more selective about what data you actually have jQuery display.
The other thing you should probably be aware of is how to effectively use rel attributes on your content (especially links) so that search engines understand the relationship between the content they're crawling and the destination content it links to.
First, post a link to your website... we can probably help you more if we can see the problem.
It sounds like you're doing it wrong. Google Search should work on your website, unless your content is hidden behind JavaScript or forms or something, or your site isn't properly interlinked. Google solved crawling static pages, so if that's what you have, it will work.
So, tell me: does your site say New York anywhere? If it does, have a look at the page and see how the word is used; maybe your site isn't as static as you think. Also, are people really going to search your site for New York? Why not try some search terms that are actually likely to appear on your site?
Another thing to consider: if your site really is just 50 pages, is it realistic that people will want to search it? Maybe you don't need search; maybe you just need a commonly used links section.
The BOSS Site Search Widget is pretty slick.
I use the bookmarklet thing, but set it as my "home" page in my browser. So whatever site I'm on, I can hit my "home" button (which I never used anyway) and it pops up that handy site search thing.