Google Policy on interlinking my websites together

I was wondering what Google's official policy is on linking my own websites together. Do they forbid it, allow it, or allow it only as long as the links are nofollow?
For clarification, I will give both a white-hat and a black-hat example.
White-hat:
I'm a web designer who also has several affiliate websites. I designed those websites, so I would like to give myself credit by linking from each affiliate website to my professional bio website, where people can hire me as a designer.
Black-hat:
I buy 100 different domains and link each one to the other 99, sharing all the link juice between them. The content of each website abides by Google's policies and isn't spammy; the only thing wrong is that each site gets 99 links and I'm the only one doing the linking.

First solution - nofollow:
Well, if they are nofollow, I don't see why Google would care.
So, you'd probably be safe with that, if what you want to achieve is indeed giving yourself credit.
But as far as SEO goes, as you already know, the sites wouldn't benefit much.
However, even though nofollow links don't pass PageRank, the number of visits to each site should still increase (traffic arriving from your other sites). That can also be beneficial.
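For reference, a nofollow link is just an ordinary anchor with a rel attribute; the URL below is a placeholder:

    <a href="http://www.example-bio-site.com" rel="nofollow">Designed by John Doe</a>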
Second solution - portfolio site:
There is one scenario which could suit your purpose:
Create your "portfolio": a site with links to all the sites you have created, as a showcase of your skills.
Place a link on each of your sites to this portfolio.
Now, you have a page with 100 outbound links, each perfectly legitimate. And each of your sites contains just one outbound link connecting it to your other sites.
This should be fine both for your presentation and for SEO, and you avoided having a link farm.
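In sketch form, the structure looks like this (all domains hypothetical):

    <!-- portfolio.example/index.html: the hub page linking out to every site you built -->
    <ul>
      <li><a href="http://affiliate-site-1.example">Affiliate Site 1</a></li>
      <li><a href="http://affiliate-site-2.example">Affiliate Site 2</a></li>
    </ul>

    <!-- on each individual site: a single link back to the hub -->
    <a href="http://portfolio.example">Designed by John Doe</a>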
EDIT: You can find actual info from Google here: http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf

Related

Track how often a link was clicked

I am currently running a website where I promote different coffees from pubs in my city. On my website I have links to the different coffees.
I have recently seen some of these links being shared on Facebook and other social networks.
So I was wondering: is it somehow possible to track how often one of these links is clicked?
I have tried using redirects through my site, but then Facebook uses my pictures in the previews, which I don't want because it is misleading.
I have seen that this works with Bitly, so it must be possible somehow.
There are of course various services providing this, but it would be nice if it could run without any third-party services.
So basically I am looking for a solution that will let me know how often a link originating from my site was clicked on Facebook, Google+, or any other forum.
There definitely is. Try looking into Google Analytics; it will show you so much data about your websites and links that it can blow your mind! Here is how Google describes it:
Google Analytics helps you analyze visitor traffic and paint a complete picture of your audience and their needs. Track the routes people take to reach you and the devices they use to get there with reporting tools like Traffic Sources. Learn what people are looking for and what they like with In-Page Analytics. Then tailor your marketing and site content for maximum impact.
There is even a free package you can use!
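If you want per-link click counts rather than just page views, Google Analytics event tracking can do it. A minimal sketch, assuming the standard analytics.js snippet is already installed on the page; the link URL and category names here are made up:

    <a href="http://pub.example/house-blend"
       onclick="ga('send', 'event', 'coffee-links', 'click', this.href);">
      House blend at The Example Pub
    </a>

One caveat: for outbound links the browser may leave the page before the event is sent, so busier sites usually delay navigation using the hitCallback option.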
Hope this helps!
Yes, you have plenty of analytics options.
Something as straightforward as Google Analytics, for example.
If you are using cPanel on your host's server, you even have options such as AWStats, which will also provide this information.
If all else fails, you can even mine the request data stored in your Apache/Nginx access logs.
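A rough sketch of that last option: count, per URL, the requests that arrived from Facebook, assuming a combined-format access log. The log path is hypothetical, and the regex may need adjusting to your own log format:

    import re
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path
    # combined log format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
    line_re = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<referer>[^"]*)"')

    hits = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            m = line_re.search(line)
            if m and "facebook.com" in m.group("referer"):
                hits[m.group("path")] += 1  # a click that arrived from Facebook

    for path, count in hits.most_common(10):
        print(count, path)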
Since you have amended your question, you might want to check out this tool. It is not Google. :)
It is called ClickMeter, and it performs link tracking, provides click reports, etc.
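At its core, ClickMeter-style (or Bitly-style) tracking is just a redirect that counts before forwarding, and you can self-host that. A minimal sketch, assuming Flask; the slugs, destinations, and in-memory counter are all hypothetical (use a database in practice). Because it answers with a real HTTP 302 rather than an HTML page, Facebook's scraper should generally follow it and build the preview from the destination page, not from yours:

    from flask import Flask, abort, redirect

    app = Flask(__name__)

    DESTINATIONS = {  # short slug -> real coffee page
        "espresso-at-joes": "http://joespub.example/espresso",
    }
    clicks = {slug: 0 for slug in DESTINATIONS}

    @app.route("/go/<slug>")
    def go(slug):
        if slug not in DESTINATIONS:
            abort(404)
        clicks[slug] += 1  # count the click before forwarding
        return redirect(DESTINATIONS[slug], code=302)

    @app.route("/stats")
    def stats():
        return clicks  # Flask (1.1+) serializes the dict as JSON

You would then share links like http://yoursite.example/go/espresso-at-joes instead of the raw destination URLs.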

What is going on with my Organic Google Traffic?

As you can see from my attached Google organic traffic graph, I used to get 10K searches and 1K daily hits, but nowadays (for about two months now) I get only 200-300 unique visitors from Google.
My site is totally unique and the content is not copy-pasted; it is fully written by me, and I update it daily.
So, do you have a guess about what's going on with my organic Google traffic? Any help?
Have you noticed a drop in organic positions for keywords you wish to appear high in Google for? Normally the reason for a drop in organic traffic is directly related to weaker organic rankings.
Have you recently changed anything on the domain, HTTP to HTTPS for example? Webmaster Tools is a bit 'dumb' and needs both versions submitted.
Ideally you should be tracking organic traffic through Google Analytics, not Webmaster Tools.
You'll find the reason while performing these actions:
-Log in to your Google Analytics account and check your top organic traffic pages (use the date-comparison feature to detect the pages that caused the drop).
-Check on Google whether your main traffic pages are indexed, by typing "site:www.yourwebsite.com" (without quotes) in the search box. Sometimes important pages get removed by admins without an appropriate redirect.
-Log in to Google Webmaster Tools and check that no manual actions have been taken: Search Traffic/Manual Actions.
-Also in Google Webmaster Tools, take a look at Search Traffic/Links to your site. Are there weird websites at the top of the list with hundreds or thousands of links pointing to your website? If yes, contact these websites and ask them to remove the links; if you cannot get their cooperation, submit a disavow request to Google.
-Check your robots.txt file for new rules that restrict search engine bots' access to your pages (see the example below).
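To illustrate that last point, a single accidentally deployed rule like this hypothetical one is enough to block every crawler from the whole site:

    # robots.txt - blocks all crawlers from all pages
    User-agent: *
    Disallow: /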

Running multiple websites with the same profiles

Is it bad practice, in terms of search traffic, to maintain multiple websites in the same niche? For example, using the same set of social profiles from Twitter, Facebook, and Google+ on two websites related to laptop shopping.
I am interested in the impact on search traffic with and without any social sharing.
No, it is not a bad practice for SEO at all. You could be penalized for duplicate content, but shared social profiles will not cause that.
The impact of social networks grows more important every day, both for reputation and for traffic. Your Google+ profile in particular will improve your standing as an author.
Author reputation will soon matter a great deal: even without a good SEO position for your own website, if you are well positioned as an author, the sites you collaborate with will gain a better reputation.
I hope this helps you.

Is this a blackhat SEO technique?

I have a site which has been developed completely in Flash. The site owners do not want to move to a more text/HTML-based site, so I am planning to create an alternative HTML/text-based site that Googlebot will be redirected to (by checking the user agent). My question is: is this officially allowed by Google?
If not, then how come there are many subscription-based sites which display a different set of data to Google than to users? Is that allowed?
Thank you very much.
I've dealt with this exact scenario for a large ecommerce site, and Google essentially ignored the site. Google considers it cloaking and addresses it directly here, saying:
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
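Concretely, the user-agent check proposed in the question is exactly this pattern. A sketch only, and deliberately so: this is the practice that gets sites removed from the index, shown to make it recognizable, not to recommend it:

    # WARNING: user-agent cloaking - the practice Google penalizes. Illustration only.
    def app(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if "Googlebot" in ua:
            body = b"<html>text/HTML version served only to the crawler</html>"
        else:
            body = b"<html><object>Flash version served to real users</object></html>"
        start_response("200 OK", [("Content-Type", "text/html")])
        return [body]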
Instead, create an ADA-compliant version of the website so that users with screen readers and vision aids can use your website. As long as there is a link from your home page to your ADA-compliant pages, Google will index them.
The official advice seems to be: offer a visible link to a non-Flash version of the site. Fooling Googlebot is a surefire way to get in trouble. And remember, Google results will link to whichever page was indexed, so do not create results that are useless to searchers.
Google already indexes Flash content, so my suggestion would be to check how your site is currently being indexed. Maybe you don't have to do anything.
I don't think showing an alternate version of the site is good from Google's perspective.
If you serve your page at the exact same address, then you're probably fine. For example, if you show 'http://www.somesite.com/' to users but direct Googlebot to 'http://www.somesite.com/alt.htm', then Google might send search users to alt.htm. You don't want that, right?
This is called cloaking. I'm not sure exactly what its effects are, but it is certainly not white-hat. I am pretty sure Google is working on a way to crawl Flash now, so it might not even be a concern.
I'm assuming you're not really doing an HTTP redirect but rather a server-side include (a PHP include or something similar), so the alternative content shows up at the same URL. If you're actually redirecting, then Google will just index the other page like normal.
Some sites offer a different level of content: they LIMIT the content rather than offering alternative or additional content. This is generally done so that unrelated things don't get indexed.

How do sites like Hubspot track inbound links?

Are all these types of sites just illegally scraping Google or another search engine?
As far as I can tell there is no 'legal' way to get this data for a commercial site. The Yahoo! API ( http://developer.yahoo.com/search/siteexplorer/V1/inlinkData.html ) is only for noncommercial use, Yahoo! BOSS does not allow automated queries, etc.
Any ideas?
For example, if you wanted to find all the links to Google's homepage, search for
link:http://www.google.com
So if you want to find all the inbound links, you can simply traverse your website's tree and, for each page it contains, build that page's URL. Then query Google for:
link:URL
And you'll get a collection of all the links that Google knows about from other websites into your website.
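In sketch form, the loop described above just turns every page URL into a link: query. Submitting these automatically would violate Google's terms of service, so this hypothetical script only prints the query URLs for manual use; the hard-coded list stands in for a real sitemap crawl:

    from urllib.parse import quote

    site_urls = [  # hypothetical: in practice, read these from your sitemap
        "http://www.example.com/",
        "http://www.example.com/about",
        "http://www.example.com/blog/first-post",
    ]

    for url in site_urls:
        query = "link:" + url
        print("http://www.google.com/search?q=" + quote(query))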
As for the legality of such harvesting, I'm sure it's not-exactly-legal to make a profit from it, but that's never stopped anyone before, has it?
(So I wouldn't bother wondering whether they did it or not. Just assume they do.)
I don't know what hubspot do, but, if you wanted to find out what sites link to your site, and you don't have the hardware to crawl the web, one thing you can do is monitor the HTTP_REFERER of visitors to your site. This is, for example, how Google Analytics (as far as I know) can tell you where your visitors are arriving from. This is not 100% reliable as not all browsers set it, particularly in "Privacy Mode", but you only need one visitor per link to know that it exists!
This is often accomplished by embedding a script in each of your web pages (often in a common header or footer). For example, if you examine the source of the page you are currently reading, you will find (right down at the bottom) a script that reports information about your visit back to Google.
Now this won't tell you if there are links out there that no one has ever used to get to your site, but let's face it, they are a lot less interesting than the ones people actually use.
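A minimal sketch of the server-side version of this referrer monitoring: a WSGI middleware that records every external Referer it sees, building up a list of pages that demonstrably link to you. The function name and log path are hypothetical:

    from urllib.parse import urlparse

    def referrer_logger(inner_app, log_path="referrers.log"):
        def app(environ, start_response):
            referer = environ.get("HTTP_REFERER", "")
            host = urlparse(referer).netloc
            # an external referrer means a live inbound link someone just used
            if host and host != environ.get("HTTP_HOST", ""):
                with open(log_path, "a") as f:
                    f.write(referer + "\n")
            return inner_app(environ, start_response)
        return app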