Complete list of Google Gears enabled sites [closed] - google-gears

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 10 years ago.
Is there a complete list of Google Gears enabled sites? I'm aware of rememberthemilk.com, Google Docs, and Google Calendar.

MySpace's message system is one of the biggest Gears deployments outside of Google, at least according to this article.
Other sites are:
Wordpress
MindMeister
PassPack
ZohoWriter
BuxFer
SomeThings

So far it seems that:
wordpress.com
Zoho writer
Buxfer
PassPack
Google Reader
MindMeister
Google Picasa Web Albums
MySpace (Mail Search)
YouTube
support Google Gears in some fashion (list compiled from Wikipedia and this site here).

Listed in the order of perceived decreasing complexity of offline implementation:
Autodesk Labs Project Draw
Buxfer
Gmail
Google Calendar
Google Docs
Google Reader
MindMeister
Myspace
Paymo
Passpack
Picasa Web Albums
Remember The Milk
Some Things
WordPress
Zoho Mail
Zoho Writer
Taken from:
http://tarunupadhyay.com/2009/06/15/full-list-of-websites-that-you-can-take-offline-with-google-gears/

Gmail now uses Gears too.

Related

Is it ok to have facebook and twitter as authentication for an e-commerce site? [closed]

I have been looking around and noticed that e-commerce sites are not using social auth. I am building an e-commerce site and wanted to know the reasons for avoiding it.
It depends, will you be storing credit card details? You might hit some problems with PCI compliance if purchases can be made through social network logins.
If you're just using it as a useful way of signing up, you should be OK.
In a recent survey my employer conducted, we found that younger (<30 years) users did not want to link financial data with their social networking data, primarily because of a lower level of trust in social networking brands compared with Visa et al.
I would never trust any site that allows me to log in with my Facebook account. Too much risk of getting my username and password sniffed out and abused.

What to use now Google News API is deprecated? [closed]

As part of a Project I'm working on I've been instructed to implement Google News API into a Web Application.
However, I've checked the Google News API site, and I see the following message:
Important: The Google News Search API has been officially deprecated as of May 26, 2011. It will continue to work as per our deprecation policy, but the number of requests you may make per day may be limited.
I've checked SO Questions but I've not been able to find a question related to the News API.
What should I use now that Google News API is redundant?
Is it the Custom Search API?
And if so, how can I make this relevant for just
News Results for a particular query for my Web Application?
I've checked the Google News RSS, but this uses HTML in the description which won't work for my requirements as I just need the text.
Depending on your needs, you may want to use their section feeds or their search feeds:
http://news.google.com/news?q=apple&output=rss
or Bing News Search.
http://www.bing.com/toolbox/bingdeveloper/
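On the asker's point that the RSS descriptions contain HTML: the tags can be stripped with nothing but the standard library. A minimal sketch (the sample description below is made up for illustration, not taken from a real feed):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the text content of an HTML fragment, dropping all tags."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_html(fragment: str) -> str:
    parser = TextExtractor()
    parser.feed(fragment)
    return "".join(parser.parts).strip()

# Hypothetical HTML-laden <description> like the ones the feed returns:
desc = '<a href="http://example.com"><b>Apple</b> announces...</a> &amp; more'
print(strip_html(desc))  # plain text, entities decoded
```

`HTMLParser` also decodes character references like `&amp;` by default, so the result is plain text suitable for display.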
I'm running into the same issue with one of my own apps. So far I've found the only non-deprecated way to access Google News data is through their RSS feeds. They have a feed for each section and also a useful search function. However, these are only for noncommercial use.
As for viable alternatives I'll be trying out these two services: Feedzilla, Daylife
Looks like you might have until the end of 2013 before they officially close it down.
http://groups.google.com/group/google-ajax-search-api/browse_thread/thread/6aaa1b3529620610/d70f8eec3684e431?lnk=gst&q=news+api#d70f8eec3684e431
Also, it sounds like they are building a replacement... but it's going to cost you.
I'd say go to a different service. I think Bing has a news API.
You might enjoy (or not) reading: http://news.ycombinator.com/item?id=1864625

TV guide listing API [closed]

Does anybody know a provider offering TV listings (through an API or download) for all channels and cable providers?
Or is there any independent company collecting/providing such data?
An API/REST/SOAP interface would be great.
The MythTV folks have gathered resources for various countries here. If you're in the US or Canada, they recommend the Schedules Direct service.
These services are generally based on the XMLTV data format/toolset.
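For a feel of what the XMLTV format looks like, here is a sketch that parses a hand-made, heavily simplified XMLTV fragment with the standard library (real feeds carry many more elements and attributes):

```python
import xml.etree.ElementTree as ET

# Simplified, hand-made XMLTV fragment for illustration only:
xmltv = """
<tv>
  <channel id="bbc1.uk"><display-name>BBC One</display-name></channel>
  <programme start="20120101200000 +0000" channel="bbc1.uk">
    <title>EastEnders</title>
    <desc>Soap opera set in the East End of London.</desc>
  </programme>
</tv>
"""

root = ET.fromstring(xmltv)
# Each <programme> carries its channel and start time as attributes:
programmes = [(p.get("channel"), p.findtext("title"), p.get("start"))
              for p in root.iter("programme")]
print(programmes)
```

The XMLTV toolset itself provides grabbers that produce files in this format for many countries.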
Rovi offers both SOAP and REST APIs for TV listings. It supplies listings for all channels of the cable, satellite, and broadcast services in multiple countries. Rovi is the source the cable companies use. See this website.
Schedules Direct doesn't grant commercial licenses. I'm checking out TVRage and Rovi right now.
Edit: If this is a commercial project, I recommend contacting Tribune Media Services (click "License our Content" in the footer at Zap2It.com). That was the solution I chose for my company, simply because they were much more prompt in their response than Rovi. The paid listings are XML files.
If all you need is a non-commercial license, SchedulesDirect works very well. If you have questions about their licensing, they encourage you to ask. We started using their listings and then had to change tack because of licensing issues.
You could contact the WebTelevideo staff at info#webtelevideo.com; they have solved this problem and may provide you an account for their own API.
The WebTelevideo API has TV schedules for many countries, plus metadata (actors, directors, plot, trailers, posters, etc.).

Alternative to Google Custom Search [closed]

I'm using Google Custom Search on a client website. They are not very happy about rival companies showing up on sponsored links on their own site. I know we can use Google Site Search but it has an annual fee. I've been looking all around for a Free/OpenSource alternative for Google CSE, but found little I can use. Anyone have any suggestions?
Check this question: What is a good search engine for embedding in a web site? IMO, if the client doesn't want to pay for a search engine, they will have to live with the advertisements if they want a good one.
Google has a paid version of search. You can read about it here. We use it in our intranet.
Check out the Google JSON/AJAX Search API. It's a lightweight way of doing a query and returning pure search results that you can then display.
http://code.google.com/apis/ajaxsearch/documentation/
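A query against that API is just a GET request with a few parameters. A minimal sketch of building the request URL (endpoint and parameter names as documented for the now-deprecated AJAX Search API; verify against the current docs before relying on them):

```python
from urllib.parse import urlencode

# Endpoint as documented for the (deprecated) Google AJAX Search API:
BASE = "http://ajax.googleapis.com/ajax/services/search/news"

def news_query_url(query: str, results_per_page: int = 8) -> str:
    """Build a news-search request URL; v=1.0 is the required API version."""
    params = {"v": "1.0", "q": query, "rsz": results_per_page}
    return BASE + "?" + urlencode(params)

print(news_query_url("apple"))
```

The response is JSON, so the results can be rendered however the site needs rather than as Google's widget.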
Search is very big business right now because it is relatively immature as an industry, similar to the OS industry many years ago. Anyone with something good is going to charge for it. The open-source community will only catch up when the core concepts around search stabilize and become more widely understood (and therefore reproducible). Right now many of the basics are still trade secrets.
Short answer - if you want something even remotely as good as Google, expect to pay for it.
You can block your competitors just as with AdSense, "While AdSense allows you to filter ads by URLs, you can also filter URLs from your search results within your CSE account." - https://www.google.com/support/adsense/bin/answer.py?answer=91652
The Austrian company Mindbreeze ( http://www.mindbreeze.com/index_en.html ) has a good alternative site search for websites.
You can test it here for free: http://www.mindbreeze.com/RegisterInSite.html
An open-source alternative:
http://lucene.apache.org/

Automatically verify my website's links are pointing to urls that exist? [closed]

Is there a tool to automatically search through my site and test all the links? I hate running across bad urls.
Xenu's Link Sleuth is excellent (and free).
w3.org checklink
If I were you, I'd check out the W3C Link Checker.
Something like this should work: http://www.dead-links.com/
Do google searches for "404 checker" or "broken link checker"
I used Xenu's Link Sleuth in the past. It will crawl your site and tell you which links point to nowhere. It is not super fancy but it works.
http://en.wikipedia.org/wiki/Xenu%27s_Link_Sleuth
The Wikipedia page lists a whole bunch of other products.
WebHTTrack
It can take a long time to go through a large website (I archived a 250 MB website and it took approximately two hours, though it wasn't local). It has a log, so you should be able to track 404s easily.
Also check out Google's webmaster tools.
http://www.google.com/webmasters/tools/
They give you the ability to see the 404's that GoogleBot discovers when crawling your website (along with lots and lots of other stuff).
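The core of what these tools do is small enough to sketch: collect the hrefs on a page, then request each one and flag failures. A rough do-it-yourself version using only the standard library (the sample page is made up; `is_dead` is a naive check and ignores robots.txt, redirects-to-error-pages, and rate limiting, which the dedicated tools handle):

```python
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Gathers href attributes from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str):
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def is_dead(url: str) -> bool:
    """HEAD the URL; treat any HTTP error or network failure as dead."""
    try:
        urlopen(Request(url, method="HEAD"), timeout=10)
        return False
    except Exception:
        return True

# Made-up page fragment; a real checker would fetch and recurse.
page = '<p><a href="http://example.com/ok">ok</a> <a href="/relative">rel</a></p>'
print(extract_links(page))
```

A real checker would also resolve relative links against the page URL (`urllib.parse.urljoin`) and recurse over pages on the same host, which is exactly what Link Sleuth and the W3C checker automate.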