How to do a Google search without getting a captcha - VBA

I have this VBA script which performs about 60 to 80 searches on google.com, but I am getting a captcha after some queries. I understand it's a violation of Google's terms of use. Is there any way I can resolve this issue?
Can Google Custom Search (https://cse.google.com/cse/) help me in any way?

There are a few ways:
1.) Both Google and Bing offer a Custom Search API, but they limit how many queries you can run per day/per month. Those limits can be raised with paid options.
2.) The more IPs you have, the more you can "search" before getting a captcha block.
3.) You can purchase a back-connect rotating proxy; the IPs on these proxies change every 10 minutes, which lets you search more.
4.) You can purchase or write captcha-solving software that will solve the captcha for you.
Those are your options. Obviously the correct way is #1, and from there you start to go into shades of grey.
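If you go with option #1, Google's Custom Search JSON API can be called over plain HTTPS. Here is a minimal Python sketch, where API_KEY and CX are placeholders you would replace with your own credentials:

```python
from urllib.parse import urlencode

# Placeholders -- substitute your own Custom Search credentials.
API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

def build_cse_url(query, api_key=API_KEY, cx=CX, start=1):
    """Build a request URL for the Google Custom Search JSON API.

    The API returns up to 10 results per call; `start` pages through them.
    """
    params = {"key": api_key, "cx": cx, "q": query, "start": start}
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

# To actually run the search (needs a valid key and remaining quota):
# import json, urllib.request
# with urllib.request.urlopen(build_cse_url("excel vba")) as resp:
#     results = json.load(resp)
#     for item in results.get("items", []):
#         print(item["title"], item["link"])
```

Since the API has a daily free quota, you would typically cache results and only page as far as you need.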

Related

Fastest way to determine whether or not Instagram account follows certain user?

Currently trying to figure out how to check whether or not very large Instagram accounts (10M+ followers) are following a list of 30-40k users. Scraping all followers doesn't seem to be a very viable solution, as Instagram rate limits at every ~10k requests (so naturally, scraping 50-60M usernames would take very long).
The Instagram app has a search-bar feature that obviously lets one check whether or not an account follows someone; however, I'm not entirely sure how to replicate this functionality on web.
Does anyone know of any ways to achieve this functionality through a Selenium-like bot and / or utilize the Instagram API?
Figured it out! For anyone who has a similar goal, I accomplished the above with two approaches:
1.) Appium / Selenium automation with an Android emulator (takes advantage of Instagram's native-only search functionality)
2.) Overkill: Fiddler and a patched Instagram APK (SSL cert-pinning disabled) to find the actual query that shows whether one user is following another.

Track how often link was clicked

I am currently running a website where I promote different coffees from pubs in my city. On my website I have links to the different coffees.
I have recently seen some of these links being shared on Facebook and other social networks.
So I was wondering if it is somehow possible to track how often one of these links is clicked?
I have tried using redirects to my site, but Facebook then uses my pictures in the previews, which I don't want because it is misleading.
I have seen that this works with Bitly, so it must be possible somehow?
There are of course different services providing this, but it would be nice if it could run without any third-party services.
So basically I am looking for a solution which will let me know how often a link originating from my site was clicked on Facebook, Google+ or any other forum.
There definitely is. Try looking into Google Analytics; it will show you so much data from your personal websites and links that it can blow your mind! Here is the link:
Google Analytics helps you analyze visitor traffic and paint a complete picture of your audience and their needs. Track the routes people take to reach you and the devices they use to get there with reporting tools like Traffic Sources. Learn what people are looking for and what they like with In-Page Analytics. Then tailor your marketing and site content for maximum impact.
You can even get a free package to use!
Hope this helps!
Yes, you have plenty of analytics options.
Something as straightforward as Google Analytics, for example.
If you are using cPanel on your host's server, you even have options such as AWStats, which will also provide this information.
If all else fails, you can even use the request data stored in your Apache/Nginx access logs.
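To illustrate the access-log route: a short Python sketch that counts clicks per page and referrer from combined-format Apache/Nginx log lines. The /coffee/ path prefix is just an assumed example for the coffee links.

```python
import re
from collections import Counter

# Apache/Nginx Combined Log Format, e.g.:
# 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /coffee/espresso HTTP/1.1" 200 512 "https://www.facebook.com/" "Mozilla/5.0"
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "[^"]*"'
)

def count_clicks(lines, path_prefix="/coffee/"):
    """Count requests per (path, referer) pair from combined-format log lines."""
    clicks = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("path").startswith(path_prefix):
            clicks[(m.group("path"), m.group("referer"))] += 1
    return clicks
```

Feeding it `open("/var/log/nginx/access.log")` would tally how often each coffee page was hit and from where, with no third-party service involved.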
Since you have amended your question, you might want to check out this tool. It is not Google. :)
It is called ClickMeter; it performs link tracking and provides click reports, etc.

Preventing Google from detecting your traffic from a VB browser

I have had this problem for a long time and have tested a lot of ways to fix it, but I can't. My main problem is that when my VB WebBrowser control searches on Google, after a while Google detects "unusual traffic from my computer network" (as Google puts it), so I am forced to type the characters from the image Google sends me every 5 minutes and retry the last search.
Does anyone know a way to prevent this?
Thanks
You probably need to use the Google APIs. You can download them from their site.

How is it possible for new content to appear in Google results mere minutes after it is created?

For example, when I post to Stackoverflow, the post appears in the Google index a minute later. How is this accomplished? What do I have to do to my web-site to get the same frequency of indexing?
You could start by:
getting 65,000-odd regular users on your site.
making your site linked to from all over the place.
making your site very active.
providing very useful content.
This is all standard SEO stuff which will up your "importance" in the eyes of Google (and other search engines, presumably, but who cares :-).
The faster a page changes, the more often Google will re-index it.
Obviously, that only happens if your site is "important" enough for Google.
You should check out Google Webmaster Tools: http://www.google.com/webmasters/tools
To help with indexing from Google, but also Yahoo and MS, you'll want to use the sitemap protocol, see http://en.wikipedia.org/wiki/Sitemaps.
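The sitemap file itself is tiny; a minimal Python sketch that renders a sitemap.xml per the Sitemaps 0.9 protocol (the URLs passed in are placeholder examples):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal sitemap.xml per the Sitemaps 0.9 protocol."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )
```

You would regenerate this whenever content changes, serve it at /sitemap.xml, and point search engines at it (for example via a Sitemap: line in robots.txt or through Webmaster Tools).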
Simply put, if you want to do that you first need to lure the Google robot to your site.
To do this you should do these things:
Get as many hyperlinks as possible from high-ranked, active, relevant sites.
Make your own site active. That way, Google believes your site is worthwhile to visit frequently!
In addition, you can provide premier content and structure (a site map).
To sum it all up, you need to build a great site in the eyes of search engines!
Good luck!

How do sites like Hubspot track inbound links?

Are all these types of sites just illegally scraping Google or another search engine?
As far as I can tell there is no 'legal' way to get this data for a commercial site. The Yahoo! API (http://developer.yahoo.com/search/siteexplorer/V1/inlinkData.html) is only for non-commercial use, Yahoo! BOSS does not allow automated queries, etc.
Any ideas?
For example, if you wanted to find all the links to Google's homepage, search for
link:http://www.google.com
So if you want to find all the inbound links, you can simply traverse your website's tree and, for each page you find, build a URL. Then query Google for:
link:URL
And you'll get a collection of all the links that Google has from other websites into your website.
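As a sketch, assuming the link: operator behaves as described above, building those queries for a list of pages is trivial in Python (actually scraping the result pages programmatically would of course run into Google's captcha limits):

```python
from urllib.parse import quote_plus

def link_queries(page_urls):
    """Build Google search URLs using the link: operator for each page."""
    return [
        "https://www.google.com/search?q=" + quote_plus("link:" + u)
        for u in page_urls
    ]
```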
As for the legality of such harvesting, I'm sure it's not-exactly-legal to make a profit from it, but that's never stopped anyone before, has it?
(So I wouldn't bother wondering whether they did it or not. Just assume they do.)
I don't know what hubspot do, but, if you wanted to find out what sites link to your site, and you don't have the hardware to crawl the web, one thing you can do is monitor the HTTP_REFERER of visitors to your site. This is, for example, how Google Analytics (as far as I know) can tell you where your visitors are arriving from. This is not 100% reliable as not all browsers set it, particularly in "Privacy Mode", but you only need one visitor per link to know that it exists!
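A minimal sketch of that referrer-monitoring idea in Python. In a real site you would call record() from your request handler with the incoming Referer header; grouping by domain is just one choice of granularity:

```python
from collections import Counter
from urllib.parse import urlparse

class RefererTracker:
    """Tally inbound links by the domain in the Referer header."""

    def __init__(self):
        self.sources = Counter()

    def record(self, referer):
        # Many browsers omit the header, so this undercounts by design.
        if referer:
            domain = urlparse(referer).netloc
            if domain:
                self.sources[domain] += 1

    def top(self, n=5):
        """Return the n most common referring domains."""
        return self.sources.most_common(n)
```

One visit via a given link is enough to register that the link exists, which is exactly the point made above.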
This is often accomplished by embedding a script into each of your webpages (often in a common header or footer). For example, if you examine the source for the page you are currently reading, you will find (right down at the bottom) a script that reports back to Google information about your visit.
Now this won't tell you if there are links out there that no one has ever used to get to your site, but let's face it, they are a lot less interesting than the ones people actually use.