SEO webmaster sitelinks issue

I get sitelinks when I type the whole domain – eg: domain.com
But the sitelinks do not show up when only the domain name is typed, without the TLD. What could be the issue?

I don't think there is an issue; this happens when a website is new or not yet popular in search engines. I have come across this multiple times with my clients' websites. Do some online promotion for your website. Once it starts getting decent traffic, the sitelinks will start showing up when the domain name alone is typed.

Related

When will Google stop showing a site's page after a robots.txt has been placed in it?

Google is showing www.example.com/myPage as a search result.
I do not want /myPage to be indexed by Google, so a robots.txt rule disallowing it was added.
How long will it take for the page to stop being shown in Google?
I know that people can still visit it if they have the URL; my aim is just to remove it from Google's search results.
My knowledge of SEO is limited, and I suspect the answer varies with site traffic and other SEO-related factors, but in general terms, how long would this take?
Crawls are based on many factors such as PageRank, links to a page, and crawling constraints such as the number of parameters in a URL. Any number of factors can affect the crawl frequency of individual sites.
The crawl process is algorithmic; computer programs determine which sites to crawl, how often, and how many pages to fetch from each site. They don't accept payment to crawl a site more frequently. For tips on maintaining a crawler-friendly website, please visit the Webmaster Guidelines.
I would suggest using Google Webmaster Tools for your SEO. It will show you when Google last crawled your website, and it offers many options that help you get your site indexed better.
There is also an option in Webmaster Tools to ask Google to crawl your site again, telling Googlebot to re-crawl because the content on your site has changed.
This link might help you understand better. Also, to get an overview of Webmaster Tools setup and features, visit this link
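For reference, blocking a single page is done with a Disallow rule in a robots.txt file at the site root, not "in the page" itself; a minimal version matching the /myPage from the question would be:

```
User-agent: *
Disallow: /myPage
```

Keep in mind that robots.txt only stops crawling; an already-known URL can still appear in results if other sites link to it, so a `<meta name="robots" content="noindex">` tag on the page itself is often the more reliable way to keep it out of search results.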

What is going on with my Organic Google Traffic?

You can see my Google organic traffic graph attached; as you can see, back then I had 10K searches and 1K daily hits, but nowadays (for about two months or so) I get only 200-300 unique visitors from Google.
My site is totally unique and the content is not copy-pasted; it is fully written by me, and I update it daily.
So, do you have a guess about what's going on with my organic Google traffic? Any help?
Have you noticed a drop in organic positions for keywords you wish to appear high in Google for? Normally the reason for a drop in organic traffic is directly related to weaker organic rankings.
Have you recently changed anything on the domain? HTTP to HTTPS, for example? Webmaster Tools is a bit 'dumb' and needs both versions submitted.
Ideally you should be tracking organic traffic through Google Analytics not webmaster tools.
You'll find the reason while performing these actions:
- Log in to your Google Analytics account and check your top organic-traffic pages (use the date-comparison feature to detect the pages that caused the drop).
- Check on Google whether your main traffic pages are indexed (by typing "site:www.yourwebsite.com" (without quotes) in the search box). Sometimes important pages get removed by admins without an appropriate redirect.
- Log in to Google Webmaster Tools and check whether any manual actions have been detected: Search Traffic/Manual Actions.
- Also in Google Webmaster Tools, take a look at Search Traffic/Links to your site. Are there weird websites at the top of the list with hundreds or thousands of links pointing to your website? If yes, contact those websites to ask them to unlink from you, and if you cannot cooperate with them, submit a disavow request to Google.
- Check the robots.txt file for any new rules that restrict search engine bots' access to your pages.
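The robots.txt check in the last step can even be scripted; a minimal sketch using Python's standard library `urllib.robotparser` (the rules and paths here are hypothetical, not from the asker's site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch it from
# https://yourwebsite.com/robots.txt and split it into lines.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A Disallow rule blocks everything under /private/ for all bots,
# while other paths remain crawlable.
print(parser.can_fetch("Googlebot", "/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "/blog/post"))            # True
```

Running this against your live robots.txt for each of your main traffic pages quickly shows whether a new rule is accidentally locking Googlebot out.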

Google Policy on interlinking my websites together

I was wondering what Google's official policy is on linking my own websites together: do they forbid it, allow it, allow it as long as it's nofollow, etc.?
For clarification I will give both a white-hat and a black-hat example:
white-hat:
I'm a web designer who also has several affiliate websites. I designed those websites, so I would like to give myself credit by linking from each affiliate website to my professional bio website, where people can hire me as a designer.
black-hat:
I buy 100 different domains and link each one to the other 99, sharing all the link juice between them. The content of each website abides by Google's policy and isn't spammy; the only thing wrong is the fact that I've got 99 links to each of them and I'm the only one doing the linking.
First solution - nofollow:
Well, if they are nofollow, I don't see why Google would care.
So, you'd probably be safe with that, if what you want to achieve is indeed giving yourself credit.
But, as for SEO optimization, as you already know, the sites wouldn't benefit much.
However, with nofollow, even if you don't increase PageRank, the number of visits to each site should increase (traffic from your other sites). This could also be beneficial.
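For completeness, a nofollow link is just a regular anchor tag with a rel attribute (the URL and anchor text here are placeholders):

```html
<!-- Credit link that passes visitor traffic but asks Google not to pass PageRank -->
<a href="https://my-bio-site.example" rel="nofollow">Designed by Me</a>
```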
Second solution - portfolio site:
There is one scenario which could suit your purpose:
Create your "portfolio": a site with links to all the sites you created, as an example of your skills and work.
Place a link on each of your sites to this portfolio.
Now, you have a page with 100 outbound links, each perfectly legitimate. And each of your sites contains just one outbound link connecting it to your other sites.
This should be fine both for your presentation and for SEO, and you avoided having a link farm.
EDIT: You can find actual info from Google here: http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf

Sitemap.xml - Google not indexing

I have created a sitemap for my site and it complies with the protocol set by http://www.sitemaps.org/
Google has been told about this sitemap via Webmaster Tools. It has tracked all the URLs within the sitemap (500+ URLs) but has only indexed 1 of them. The last time Google downloaded the sitemap was on the 21st of Oct 2009.
When I do a google search for site:url it picks up 2500+ results.
Google says it can crawl the site.
Does anyone have any ideas as to why only 1 url is actually indexed?
Cheers,
James
First off, make sure Google hasn't been forbidden from those pages using robots.txt, etc. Also make sure those URLs are correct. :)
Second, Google doesn't just take your sitemap at face value. It uses other factors, such as inbound links, etc, to determine whether it wants to crawl all of the pages in your sitemap. The sitemap then serves mostly as a hint more than anything else (it helps Google know when pages are updated more quickly, for example). Get high-quality, relevant, useful links (inbound and outbound) and your site should start getting indexed.
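As an aside, a sitemap that complies with the sitemaps.org protocol is straightforward to generate; a minimal sketch in Python's standard library, with placeholder URLs rather than the asker's real pages:

```python
# Sketch: build a minimal sitemaps.org-compliant sitemap.
# The URLs are placeholders for illustration only.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    # Register the default namespace so the output uses plain tag names
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Submitting a well-formed sitemap is cheap insurance, but as noted above, Google still treats it only as a hint, not a command to index.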
Your two statements seem to contradict one another.
but has only indexed 1 of them.
and
When I do a google search for site:url it picks up 2500+ results
bdonlan is correct in their logic (robots.txt and Google's lack of trust in sitemaps), but I think the issue is what you "think" is true about your site.
That is, Google Webmaster Tools says you only have 1 page indexed, but site:yoursite.com shows 2.5K.
Google Webmaster Tools isn't very accurate. It is nice, but it is buggy and MIGHT help you learn about issues with your site. Trust the site: command. You're in Google's index if you search site:yoursite.com and see more than 1 result.
I'd trust site:yoursite.com. You have 2.5K pages in Google, indexed and searchable.
So, now optimize those pages and see the traffic flow. :D
Sidenote: Google can crawl any kind of site: Flash, JavaScript, etc.

Why doesn't Googlebot index pages it crawls?

Three months ago I published my small personal website (~10 pages), submitted the URL to Google, and a few days later Googlebot showed up. Over the course of the last couple of weeks, Googlebot visits my website approximately twice a week and crawls maybe every other page.
Ever since Googlebot first crawled my website, whenever I run a search for site:example.com Google returns only my homepage. (Interestingly, so does Bing, so maybe the problem isn't specific to Google.)
I built the website with CodeIgniter mainly to familiarize myself with it. It's really simple, only a couple of pages about me and my projects. I am not using any black-hat SEO techniques, JavaScript, or anything like that.
What could be possible reasons why Googlebot would crawl my pages but not index them?
EDIT:
I do have a Webmaster Tools account. There are no crawl errors, internal links are listed, but listed keywords come only from my homepage.
Create an XML sitemap and "tell" Google about it in Google Webmaster Tools
Ensure your 10 pages have distinct content, since search engines can eliminate pages whose content looks identical
Ensure links from the home page lead to all of the other 9 pages. If so place link in
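Beyond those steps, it's worth ruling out an accidental noindex directive, one common reason a crawled page never appears in results. A minimal sketch in Python (standard library only; the HTML snippets are hypothetical) that detects a robots meta tag:

```python
# Sketch: detect a <meta name="robots" content="noindex"> tag in a page.
# Such a tag tells crawlers to fetch the page but not index it.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            content = (d.get("content") or "").lower()
            if name == "robots" and "noindex" in content:
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex, follow">'))  # True
print(has_noindex('<meta name="description" content="my projects">'))  # False
```

Running something like this over each of the ~10 pages (or simply viewing their source) takes a minute and eliminates one easy-to-miss cause before digging into link or content issues.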