Alexa Rank: No Data - seo

I have owned a domain and hosting for more than 2 years, but I get "No Data" for my Alexa rank.
I claimed my website on Alexa more than a month ago, and I still get "No Data".
Should I use a special technique to have my website indexed by Alexa?

Running Google Analytics on your site shouldn't have any impact on whether you're listed on Alexa. An Alexa listing is entirely a measure of traffic volume over a 90-day period; if your website doesn't generate significant traffic, it will not be listed.

Just thought to share an answer I discovered for getting onto Alexa's radar. Some
of the blogging wizards say to follow three steps to get your site/blog
recognized by Alexa:
A) Install the Alexa toolbar in your Firefox and Chrome browsers.
B) Install an Alexa Rank plugin, or add the Alexa ranking code to your
blog or site.
C) Do something to drive traffic to your website or blog.
If someone finds a better idea, kindly do share it. I just followed the above
three and am waiting for Alexa to rank my blog. I added a Text Widget to my blog
and embedded the Alexa Rank HTML code. If you like, you can check it at
TechBeamers.com.

"No data" from Alexa usually means they have recorded (or estimated) little or no traffic or visits over the last 3 months. This is what Alexa says about it:
Alexa’s traffic rankings are based on the past three months of global
traffic measured from our global traffic panel and are updated daily.
An overall traffic ranking of “No data”, or “No rank” in the Toolbar,
indicates that none of our sources show visits to the site in question
during the past three months (as of our last update).
Do you have any analytics deployed? If yes, do you record any traffic (excluding your own)?

Related

Track how often link was clicked

I am currently running a website where I promote different coffees from pubs in my city. On my website I have links to the different coffees.
I have recently seen some of these links being shared on Facebook and other social networks.
So I was wondering if it is somehow possible to track how often one of these links is clicked?
I have tried using redirects through my site, but then Facebook uses my pictures in the previews, which I don't want because it is misleading.
I have seen that this works with Bitly, so it must be possible somehow.
There are of course various services providing this, but it would be nice if it could run without any third-party services.
So basically I am looking for a solution that will let me know how often a link originating from my site was clicked on Facebook, Google+ or any other forum.
There definitely is. Try looking into Google Analytics; it will show you so much data about your websites and links that it can blow your mind! Here is the link
Google Analytics helps you analyze visitor traffic and paint a
complete picture of your audience and their needs. Track the routes
people take to reach you and the devices they use to get there with
reporting tools like Traffic Sources. Learn what people are looking
for and what they like with In-Page Analytics. Then tailor your
marketing and site content for maximum impact.
You can even get a free package to use!
Hope this helps!
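One concrete way to make those shared links measurable in Google Analytics is to tag them with UTM campaign parameters, which then show up in the traffic-source reports. A minimal sketch, assuming the standard GA UTM parameter names; the URL and campaign values are placeholders:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_link(url, source, medium, campaign):
    """Append the standard Google Analytics UTM parameters to a URL."""
    parts = urlsplit(url)
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string the link already has.
    query = parts.query + ("&" if parts.query else "") + params
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

tagged = tag_link("https://example.com/coffee/espresso", "facebook", "social", "coffee-promo")
print(tagged)
# https://example.com/coffee/espresso?utm_source=facebook&utm_medium=social&utm_campaign=coffee-promo
```

Share the tagged version of each link; clicks then appear in Analytics grouped by source and campaign.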
Yes, you have plenty of analytics options.
Something as straightforward as Google Analytics, for example.
If you are using cPanel on your host's server, you also have options such as AWStats, which will provide similar information.
If all else fails, you can use the request data stored in your Apache/Nginx logs.
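For the logs route, a small script over the standard Apache/Nginx "combined" access-log format is enough to count hits per tracked link. A sketch; the sample log lines and the /coffee/ path prefix are invented for illustration:

```python
import re
from collections import Counter

# Matches the request line and status code in the common/combined log format.
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[^"]+"\s+(?P<status>\d{3})')

def count_clicks(log_lines, prefix="/coffee/"):
    """Count successful (2xx) requests per tracked path."""
    clicks = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status").startswith("2") and m.group("path").startswith(prefix):
            clicks[m.group("path")] += 1
    return clicks

sample = [
    '1.2.3.4 - - [10/Oct/2014:13:55:36 +0000] "GET /coffee/espresso HTTP/1.1" 200 512 "-" "Mozilla"',
    '1.2.3.5 - - [10/Oct/2014:13:56:01 +0000] "GET /coffee/espresso HTTP/1.1" 200 512 "-" "Mozilla"',
    '1.2.3.6 - - [10/Oct/2014:13:57:12 +0000] "GET /about HTTP/1.1" 200 128 "-" "Mozilla"',
]
print(count_clicks(sample))  # Counter({'/coffee/espresso': 2})
```

Run it over the real access log (e.g. lines from /var/log/apache2/access.log) to get per-link totals with no third-party service involved.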
Since you have amended your question, you might want to check out this tool. It is not Google. :)
It is called ClickMeter; it performs link tracking and provides click reports, etc.

What is going on with my Organic Google Traffic?

You can see my Google organic traffic graph attached. As you can see, I used to get 10K searches and 1K daily hits, but nowadays (for about 2 months or so) I get only 200-300 unique visitors from Google.
My site is totally unique and the content is not copy-pasted; it is fully written by me, and I update it daily.
So, do you have a guess about what's going on with my organic Google traffic? Any help?
Have you noticed a drop in organic positions for keywords you wish to appear high in Google for? Normally the reason for a drop in organic traffic is directly related to weaker organic rankings.
Have you recently changed anything on the domain? HTTP to HTTPS, for example? Webmaster Tools is a bit 'dumb' and needs both versions submitted.
Ideally you should be tracking organic traffic through Google Analytics not webmaster tools.
You'll find the reason while performing these actions:
-Log in to your Google Analytics account and check your top organic traffic pages (use the date-comparison feature to detect the pages that caused the drop).
-Check on Google if your main traffic pages are indexed (by typing "site:www.yourwebsite.com" (without quotes) in the search box). Sometimes important pages get removed by admins without an appropriate redirect.
-Log in to Google Webmaster Tools and check whether any manual actions were detected: Search Traffic/Manual Actions.
-Also in Google Webmaster Tools, take a look at Search Traffic/Links to your site. Are there weird websites at the top of the list with hundreds or thousands of links pointing to your website? If yes, contact these websites to unlink from you, and if you cannot co-operate with them, submit a disavow request to Google.
-Check the robots.txt file for any new rules that restrict search engine bots' access to your pages.
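That last robots.txt check can be scripted with Python's standard urllib.robotparser. A sketch; the robots.txt content below is an invented example of the kind of accidental block to look for, and in practice you would fetch your site's real file:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content -- in practice, fetch https://www.yourwebsite.com/robots.txt
robots_txt = """
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A rule like the Googlebot one above silently drops /blog/ pages from Google over time.
print(parser.can_fetch("Googlebot", "https://www.yourwebsite.com/blog/my-post"))  # False
print(parser.can_fetch("Googlebot", "https://www.yourwebsite.com/products"))      # True
```

Checking every top traffic page this way quickly reveals whether a new rule is the cause of the drop.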

Selenium based malware (malvertising) checking - A few questions

We recently had an issue where an advertiser who purchased advertisements via a 3rd party was distributing malware via the ads they purchased.
This led to Google blacklisting our web property for a short period of time.
This issue is now resolved.
After this happened, we decided that we would self-audit our advertisers.
After searching the web for services that do this, we found a few... Armorize (www.armorize.com), amongst others, provides this type of service. But after speaking with their sales on the telephone, we found that they charge approx. 10K-15K USD/year for it. Way out of our price range.
We don't have that kind of cake.
What we do have is a smart head on our (err, my) shoulders.
So, here is what I have developed.
A) Selenium running Firefox.
B) Firefox proxying all requests via a locally hosted Squid proxy.
The result?
Pipe in the advertiser's URL -> Selenium Firefox -> Squid access log -> A nice clean list of all URLs hit by the advertisement(s).
The next step was to test these against some sort of malware list. We are now testing them against Google's Safe Browsing API ( https://developers.google.com/safe-browsing/ ).
The result is exactly what we wanted: a way to test, via a "real browser", each of the URLs hit by our advertisers.
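The "nice clean list" step can be sketched in a few lines of Python. This assumes Squid's default native access-log format, where the requested URL is the seventh whitespace-separated field; the sample log lines below are invented:

```python
def urls_from_squid_log(log_lines):
    """Extract the unique URLs requested, from Squid's native access.log format."""
    urls = set()
    for line in log_lines:
        fields = line.split()
        # Native format: timestamp duration client result/status bytes method URL ...
        if len(fields) >= 7:
            urls.add(fields[6])
    return sorted(urls)

sample = [
    "1402035000.123    87 127.0.0.1 TCP_MISS/200 4512 GET http://ads.example.net/banner.js - DIRECT/93.184.216.34 text/javascript",
    "1402035001.456   120 127.0.0.1 TCP_MISS/302 512 GET http://tracker.example.org/click?id=7 - DIRECT/93.184.216.35 text/html",
]
print(urls_from_squid_log(sample))
```

If your squid.conf uses a custom logformat, the field index would need adjusting to match.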
So, the questions are as follow:
a) Is using their (Google's) API like this acceptable as far as Google is concerned? We will be keeping this 100% in house and will not be reselling this service. It's 100% for us.
b) Does the Google Safe Browsing API allow checking of FULL URLs, or does it work only on a per-domain basis?
c) Does anyone know any other APIs where we can test these URLs? Free / low cost would be great :)
Thanks!
a. Reviewing the Safe Browsing API Terms of Service together with the Google APIs Terms of Service, I cannot find anything in what you are doing that falls outside of these.
b. The docs consistently refer to URLs rather than domains. Having performed some tests (e.g. liderlab.ru / absa/ vs. liderlab.ru /absa / page/ 1), the first is a phishing site and gives the appropriate warning, whereas the second doesn't.
c. PhishTank is good and free and seems to be a little more current than Google (from a brief investigation). BrightCloud is a reasonably priced paid service. URL Blacklist is a paid service that works on an honour system, so you can see their data.
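For reference, a full-URL lookup against the current (v4) Safe Browsing Lookup API can be sketched as below. The endpoint and request shape follow Google's v4 documentation; the clientId, URL, and API key are placeholders, and actually sending the request requires a real key:

```python
import json
from urllib.request import Request

LOOKUP_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_request(urls, api_key):
    """Build a Safe Browsing v4 threatMatches:find request for a batch of full URLs."""
    body = {
        "client": {"clientId": "in-house-ad-audit", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }
    return Request(
        LOOKUP_ENDPOINT + "?key=" + api_key,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_lookup_request(["http://tracker.example.org/click?id=7"], "YOUR_API_KEY")
# An empty JSON object {} in the response means no matches; a "matches" list means flagged.
```

Sending the request (e.g. with urllib.request.urlopen) and checking for a "matches" key then flags any URL from the Squid log.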

Google Policy on interlinking my websites together

I was wondering what Google's official policy is on linking my own websites together: do they forbid it, allow it, allow it as long as it's nofollow, etc.?
For clarification I will give both a white-hat and a black-hat example:
white-hat:
I'm a web designer who also has several affiliate websites. I designed those websites, so I would like to give myself credit by linking from the affiliate websites to my professional bio website, where people can hire me as a designer.
black-hat:
I buy 100 different domains and link each one to the other 99, sharing all the link juice between them. The content of each website abides by Google's policies and isn't spammy; the only thing that's wrong is the fact that each one gets 99 links and I'm the only one doing the linking.
First solution - nofollow:
Well, if they are nofollow, I don't see why Google would care.
So you'd probably be safe with that, if what you want to achieve is indeed giving yourself credit.
But, as for SEO, as you already know, the sites wouldn't benefit much.
However, with nofollow, even if you don't increase PageRank, the number of visits to each site should increase (the traffic from your other sites). This could also be beneficial.
Second solution - portfolio site:
There is one scenario which could suit your purpose:
Create your "portfolio": a site with links to all the sites you created, as an example of your skills and work.
Place a link on each of your sites to this portfolio.
Now you have a page with 100 outbound links, each perfectly legitimate. And each of your sites contains just one outbound link connecting it to your other sites.
This should be fine both for your presentation and for SEO, and you avoided having a link farm.
EDIT: You can find actual info from Google here: http://www.google.com/webmasters/docs/search-engine-optimization-starter-guide.pdf

How do sites like Hubspot track inbound links?

Are all these types of sites just illegally scraping Google or another search engine?
As far as I can tell there is no 'legal' way to get this data for a commercial site. The Yahoo! API ( http://developer.yahoo.com/search/siteexplorer/V1/inlinkData.html ) is only for noncommercial use, Yahoo! BOSS does not allow automated queries, etc.
Any ideas?
For example, if you wanted to find all the links to Google's homepage, search for
link:http://www.google.com
So if you want to find all the inbound links, you can simply traverse your website's tree and, for each page you find, build a URL. Then query Google for:
link:URL
And you'll get a collection of all the links that Google has from other websites into your website.
As for the legality of such harvesting, I'm sure it's not-exactly-legal to make a profit from it, but that's never stopped anyone before, has it?
(So I wouldn't bother wondering whether they did it or not. Just assume they do.)
I don't know what hubspot do, but, if you wanted to find out what sites link to your site, and you don't have the hardware to crawl the web, one thing you can do is monitor the HTTP_REFERER of visitors to your site. This is, for example, how Google Analytics (as far as I know) can tell you where your visitors are arriving from. This is not 100% reliable as not all browsers set it, particularly in "Privacy Mode", but you only need one visitor per link to know that it exists!
This is often accomplished by embedding a script in each of your webpages (often in a common header or footer). For example, if you examine the source of the page you are currently reading, you will find (right down at the bottom) a script that reports information about your visit back to Google.
Now this won't tell you if there are links out there that no one has ever used to get to your site, but let's face it, they are a lot less interesting than the ones people actually use.
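As a sketch of the HTTP_REFERER approach without any tracking script: the Referer is already recorded in the standard Apache/Nginx "combined" access-log format, so you can count external referring domains straight from the logs. The sample lines and the own_domain value are invented placeholders:

```python
import re
from collections import Counter
from urllib.parse import urlparse

# In the combined log format, the Referer is the first quoted field after "status bytes".
COMBINED_RE = re.compile(r'" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "')

def referring_domains(log_lines, own_domain="example.com"):
    """Count external domains that sent visitors, from the Referer log field."""
    domains = Counter()
    for line in log_lines:
        m = COMBINED_RE.search(line)
        if not m:
            continue
        host = urlparse(m.group("referer")).netloc
        # Skip empty referers ("-") and internal navigation on our own site.
        if host and own_domain not in host:
            domains[host] += 1
    return domains

sample = [
    '1.2.3.4 - - [10/Oct/2014:13:55:36 +0000] "GET /page HTTP/1.1" 200 512 "https://www.facebook.com/" "Mozilla"',
    '1.2.3.5 - - [10/Oct/2014:13:56:01 +0000] "GET /page HTTP/1.1" 200 512 "-" "Mozilla"',
    '1.2.3.6 - - [10/Oct/2014:13:57:12 +0000] "GET /other HTTP/1.1" 200 128 "https://example.com/home" "Mozilla"',
]
print(referring_domains(sample))  # Counter({'www.facebook.com': 1})
```

Each domain this turns up is, by definition, a page that links to you and that someone actually followed.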