Running multiple websites with the same profiles - SEO

Is it a bad practice in terms of search traffic to maintain multiple websites in the same niche? For example, using the same set of social profiles from Twitter, Facebook, and Google+ on two websites related to laptop shopping.
I am interested in the search traffic impact with and without using social sharing at all.

No, it is not a bad practice for SEO at all. You could be penalized for duplicate content, but shared social profiles will not cause that.
The impact of social networks grows more important every day for building reputation and driving traffic. Your Google+ profile in particular will help build your reputation as an author.
Author reputation is expected to matter more and more: even if your website itself is not well positioned, being well positioned as an author will lend better reputation to the sites you contribute to.
I hope this helps.

Related

Get social share count for Asian social networks (QZone, Renren & Mixi)

I'm trying to get the share counts for any (or all) of the social networking sites QZone, Renren & Mixi. The counts are being fetched server-side so I was hoping for some kind of API but I don't speak Chinese or Japanese so I'm having trouble finding the right information.
Any pointer in the right direction would be very much appreciated as my searches so far haven't yielded anything useful. Thanks!
Requests for offsite resources are off-topic for Stack Overflow. However, a quick look found http://dev.renren.com, which is usable via online translation (e.g. Google Translate) and has a full API. The best place to ask would be a developer group aimed mostly at Asia-based developers.
I'm not really sure what you are trying to do, or whether you already have functionality to share to those networks. If not, many of the third-party sharing services (e.g. AddThis, AddToAny) provide that and will do the share counting for you. Be aware that some sites based in Asia are restricted by country of origin, so signing up for a test account may require a mobile number registered in specific countries; they are not anonymous and open to all.
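If you do end up calling one of the networks' APIs directly, the server-side pattern is usually the same: request a JSON endpoint with the page URL as a parameter and read a count field out of the response. A minimal sketch of that pattern in Python -- the endpoint URL and the `count` field here are placeholders, not a documented QZone/Renren/Mixi API:

```python
import json
import urllib.parse
import urllib.request

def fetch_share_count(page_url: str) -> int:
    """Fetch a share count for page_url from a JSON endpoint.

    The endpoint and the 'count' field below are placeholders -- substitute
    the real API URL and response field documented by the network you query.
    """
    endpoint = "https://api.example-social-network.com/shares"  # placeholder
    query = urllib.parse.urlencode({"url": page_url})
    with urllib.request.urlopen(f"{endpoint}?{query}", timeout=10) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return int(data.get("count", 0))  # placeholder field name

if __name__ == "__main__":
    print(fetch_share_count("https://example.com/my-page"))
```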

Track how often a link was clicked

I am currently running a website where I promote different coffees from pubs in my city. On my website I have links to the different coffees.
I have recently seen some of these links being shared on Facebook and other social networks.
So I was wondering: is it somehow possible to track how often one of these links is clicked?
I have tried using redirects through my site, but then Facebook uses my pictures in the previews, which I don't want because it is misleading.
I have seen that this works with Bitly, so it must be possible somehow.
There are of course various services providing this, but it would be nice if it could run without any third-party services.
So basically I am looking for a solution which will let me know how often a link originating from my site was clicked on Facebook, Google+, or any other forum.
There definitely is. Try looking into Google Analytics; it will show you so much data from your personal websites and links that it can blow your mind! Here is how Google describes it:
"Google Analytics helps you analyze visitor traffic and paint a complete picture of your audience and their needs. Track the routes people take to reach you and the devices they use to get there with reporting tools like Traffic Sources. Learn what people are looking for and what they like with In-Page Analytics. Then tailor your marketing and site content for maximum impact."
You can even get a free package to use!
Hope this helps!
Yes, you have plenty of analytics options.
Something as straightforward as Google Analytics, for example.
If you are using cPanel on your host's server, you also have options such as AWStats, which will provide similar information.
If all else fails, you can even use the request data stored in your Apache/Nginx access logs.
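For the log-based route, a small script is often enough: the standard "combined" log format records the referrer, so you can count how many hits on each page arrived from Facebook, Google+, and so on. A rough sketch, assuming the combined log format -- the log path and referrer list here are illustrative:

```python
import re
from collections import Counter

# Matches the common "combined" Apache/Nginx log format:
# IP - - [date] "METHOD /path HTTP/1.1" status size "referrer" "user-agent"
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" '
    r'\d{3} \S+ "(?P<referrer>[^"]*)"'
)

SOCIAL_REFERRERS = ("facebook.com", "fb.com", "plus.google.com", "t.co")  # adjust to taste

def count_social_clicks(log_path: str) -> Counter:
    """Count hits per page whose referrer matches one of the social networks above."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.match(line)
            if not match:
                continue
            referrer = match.group("referrer").lower()
            if any(host in referrer for host in SOCIAL_REFERRERS):
                counts[match.group("path")] += 1
    return counts

if __name__ == "__main__":
    for path, hits in count_social_clicks("/var/log/nginx/access.log").most_common():
        print(f"{hits:6d}  {path}")
```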
Since you have amended your question, you might want to check out this tool. It is not Google. :)
It is called ClickMeter; it performs link tracking and provides click reports, etc.

How do the social media monitoring sites fetch the huge number of user posts?

There are many social media monitoring sites on the market. I am very curious how these sites fetch posts from such a huge number of users. How do they know which users' posts should be fetched?
For example, if a site requires me to log in with my Facebook account and then only fetches/analyzes my posts or my friends' posts, that would be reasonable. But when I tried several social media monitoring services a few days ago, I found that a massive amount of data was fetched, with users of all kinds included.
How do the services know which users' data they should fetch? If they fetch all the posts of a given social site, how do they achieve that? Don't social sites' APIs prohibit apps from fetching data in such large amounts?
The application Social Radar is primarily crawler driven. This is similar to how the Google.com search engine works.
Google doesn't really worry about which users' content it is crawling; it just indexes what it can find. Content is typically structured in ecosystems, so if you can find part of a conversation, you can often discover the rest of it as well. This also turns out to be helpful for spam filtering.
APIs are leveraged as well, terms differ by service.
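To make the crawler-driven approach concrete, here is a minimal breadth-first crawl sketch in Python (standard library only; the seed URL and page limit are illustrative, and a real crawler would also respect robots.txt, rate limits, and each site's terms of service):

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed: str, max_pages: int = 20):
    """Breadth-first crawl starting from seed, yielding (url, html) pairs."""
    seen, queue, fetched = {seed}, deque([seed]), 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            req = Request(url, headers={"User-Agent": "toy-crawler/0.1"})
            html = urlopen(req, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip pages that fail to download
        fetched += 1
        yield url, html
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

if __name__ == "__main__":
    for page_url, page_html in crawl("https://example.com/"):
        print(page_url, len(page_html))
```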

Google Policy on interlinking my websites together

I was wondering what Google's official policy is on linking my own websites together: do they forbid it, allow it, allow it as long as it's nofollow, etc.?
For clarification, I will give both a white-hat and a black-hat example:
white-hat:
I'm a web designer who also has several affiliate websites. I designed those websites, so I would like to give myself credit by linking from the affiliate websites to my professional bio website, where people can hire me as a designer.
black-hat:
I buy 100 different domains and link each one to the other 99, sharing all the link juice between them. The content of each website abides by Google's policy and isn't spammy; the only thing that's wrong is that each of them gets 99 links and I'm the only one doing the linking.
First solution - nofollow:
Well, if they are nofollow, I don't see why Google would care.
So, you'd probably be safe with that, if what you want to achieve is indeed giving yourself credit.
But, as for SEO, as you already know, the sites wouldn't benefit much.
However, with nofollow, even if you don't increase PageRank, the number of visits to each site should still increase (traffic from your other sites), which could also be beneficial.
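If you take the nofollow route, a quick audit like the sketch below can confirm the attribute is actually present on each cross-link (Python standard library only; the site list is illustrative):

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

# Illustrative list -- replace with your own domains.
MY_SITES = ["https://affiliate-example.com/", "https://bio-example.com/"]

class CrossLinkChecker(HTMLParser):
    """Record anchors pointing at my other sites and whether they carry rel=nofollow."""
    def __init__(self, other_sites):
        super().__init__()
        self.other_sites = other_sites
        self.findings = []  # list of (href, is_nofollow)

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if any(site in href for site in self.other_sites):
            rel = (attrs.get("rel") or "").lower()
            self.findings.append((href, "nofollow" in rel))

for site in MY_SITES:
    req = Request(site, headers={"User-Agent": "link-audit/0.1"})
    html = urlopen(req, timeout=10).read().decode("utf-8", "replace")
    checker = CrossLinkChecker([s for s in MY_SITES if s != site])
    checker.feed(html)
    for href, is_nofollow in checker.findings:
        print(f"{site} -> {href}  nofollow={is_nofollow}")
```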
Second solution - portfolio site:
There is one scenario which could suit your purpose:
Create your "portfolio": a site with links to all the sites you created, as an example of your skills and work.
Place a link on each of your sites to this portfolio.
Now, you have a page with 100 outbound links, each perfectly legitimate. And each of your sites contains just one outbound link connecting it to your other sites.
This should be fine both for your presentation and for SEO, and you avoided having a link farm.
EDIT: You can find actual info from Google here: http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf

Is this a blackhat SEO technique?

I have a site which has been developed completely in Flash. The site owners do not want to shift to a more text/HTML-based site, so I am planning to create an alternative HTML/text-based version that Googlebot will be redirected to (by checking the user agent). My question is: is this officially allowed by Google?
If not, then how come there are many subscription-based sites which display a different set of data to Google than to users? Is that allowed?
Thank you very much.
I've dealt with this exact scenario for a large ecommerce site and Google essentially ignored the site. Google considers it cloaking and addresses it directly here and says:
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.
Instead, create an ADA-compliant version of the website so that users with screen readers and vision aids can use your site. As long as there is a link from your home page to your ADA-compliant pages, Google will index them.
The official advice seems to be: offer a visible link to a non-Flash version of the site. Fooling Googlebot is a surefire way to get in trouble. And remember, Google results will link to the matching page! Do not create useless results.
Google already indexes Flash content, so my suggestion would be to check how your site is being indexed. Maybe you don't have to do anything.
I don't think showing an alternate version of the site is good from a Google perspective.
If you serve up your page at the exact same address, then you're probably fine. For example, if you show 'http://www.somesite.com/' but direct Googlebot to 'http://www.somesite.com/alt.htm', then Google might send search users to alt.htm. You don't want that, right?
This is called cloaking. I'm not sure what the exact effects are, but it is certainly not white-hat. I am pretty sure Google is working on a way to crawl Flash now, so it might not even be a concern.
I'm assuming you're not really doing a redirect but instead a PHP import or something similar so it shows up as the same page. If you're actually redirecting then it's just going to index the other page like normal.
Some sites offer a different level of content -- they LIMIT the content rather than offering alternative or additional content. This is generally done so Google doesn't index unrelated things.
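If you want to check whether a site is already behaving differently for crawlers, a rough self-check is to request the same URL with a normal browser User-Agent and with Googlebot's User-Agent and compare what comes back (this only tests user-agent handling; Google also verifies its crawler by IP, so treat it as a diagnostic, not proof either way). A sketch, with the URL as a placeholder for your own page:

```python
from urllib.request import Request, urlopen

URL = "https://www.example.com/"  # replace with your own page

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch(url: str, user_agent: str):
    """Return the final URL and body served for the given User-Agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req, timeout=10) as resp:
        return resp.geturl(), resp.read()

browser_url, browser_body = fetch(URL, USER_AGENTS["browser"])
bot_url, bot_body = fetch(URL, USER_AGENTS["googlebot"])

if browser_url != bot_url or browser_body != bot_body:
    print("Different URL or content served to the Googlebot UA -- this looks like cloaking.")
else:
    print("Same response for both user agents.")
```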