Meaning of Esri product life cycle support terms - ArcGIS

Does anyone know the meaning of these terms in the Esri ArcGIS product life cycle?
General Availability?
Extended Support?
Mature Support?
Thanks

A quick Google search finds the Esri Product Lifecycle Support Policy (https://downloads2.esri.com/support/TechArticles/Product-Life-Cycle.pdf). Since the definition of those terms depends on the product line, and you didn't state which products you are using, I won't post any excerpts.

Related

Can we use Google Translate API to index files for searching?

I am using Lucene 4.2.1 to index files. I need to index multilingual content, for which we use an Analyzer based on the language to tokenize and index keywords. However, Lucene 4.2.1 does not have analyzers for some languages, like Japanese and Korean. One solution is updating the Lucene version, but since that involves a lot of changes for deprecated functions, I'm trying to find a workaround. Does anyone have any suggestions? Thank you!
Personally, I would strongly suggest investing this amount of time and upgrading to the most recent version. This "problem" is already solved in the newer versions, and building your own solution may be much more time-consuming than upgrading.
IMO, working with such an old version is technical debt which should be paid down. Technical debt always fires back and usually costs more money the longer it exists.
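For what it's worth, here is a minimal sketch of what the newer versions give you: a PerFieldAnalyzerWrapper that routes language-specific fields to the kuromoji (Japanese) and nori (Korean) analysis modules. The field names here are my own assumptions, and module names vary slightly between Lucene versions, so check them against the version you upgrade to.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.ja.JapaneseAnalyzer;             // kuromoji module
import org.apache.lucene.analysis.ko.KoreanAnalyzer;               // nori module
import org.apache.lucene.analysis.miscellaneous.PerFieldAnalyzerWrapper;
import org.apache.lucene.analysis.standard.StandardAnalyzer;

public class MultilingualAnalyzerFactory {
    // Routes each language-specific field to a matching analyzer and
    // falls back to StandardAnalyzer for everything else.
    public static Analyzer create() {
        Map<String, Analyzer> perField = new HashMap<>();
        perField.put("body_ja", new JapaneseAnalyzer()); // Japanese morphological tokenization
        perField.put("body_ko", new KoreanAnalyzer());   // Korean morphological tokenization
        return new PerFieldAnalyzerWrapper(new StandardAnalyzer(), perField);
    }
}
```

The returned analyzer can be passed straight to an IndexWriterConfig, so you keep one index with one field per language rather than maintaining separate per-language indexes.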
Not sure, but I found this on Google: the Google Cloud Translation API can dynamically translate text between thousands of language pairs. The Cloud Translation API lets websites and programs integrate with the translation service programmatically. The Google Translation API is part of the larger Cloud Machine Learning API family. Please refer here too: https://cloud.google.com/translate/docs/

How to optimise Google Translate API calls to translate multiple words in a single request

Hi everyone. Recently, Google Translate was integrated into my project, where it translates product names, product descriptions and product-related category names. But because there are plenty of products in my database (and the number increases quickly), the Google Translate API would cost considerable money.
I want to translate through Google as little as possible. Many words are the same across many products, for example: 阿迪达斯 - Adidas, 苹果 - iPhone, 篮球 - Basketball, and so on. I want to exploit this with some trick, but have found no way to do it.
Has anyone encountered such a problem?
Any help would be appreciated.
It sounds like what you need is actually the ability to reuse translations at the string or substring level (in other words, per database entry). You can't really do that with Google, as far as I know. You've got a few options, as I see it:
You could switch over to Microsoft Translator and use their methods that allow you to place translations yourself, such as their Collaborative Translation feature that lets you override the MT with a preferred translation and even vote translations up/down. Quality here will be broadly comparable to Google (I often find it better), and you have methods at your disposal that allow this override. Also, unlike Google, the Microsoft API is free up to a certain volume. Take a look: http://www.microsoft.com/en-us/translator/developers.aspx
Microsoft also has a unique feature called the Microsoft Translator Hub, which can use your terminology, for example, for translations. However, depending on how you implemented any solution with Microsoft, you might still have the problem that you are making more calls out to Microsoft than you'd like, and, moreover, that "matching" only takes place at the level of a whole record or string, so it would not cover the case of shared linguistic elements being concatenated into one string.
There's a commercial offering called GeoFluent (full disclosure: I am the product manager for this product, so I'm clearly biased :)) that works with Microsoft Translator but provides pre- and post-translation processing that can deal with sub-segment matching and may therefore reduce the volume you are putting through translation each time. It could make sense if, as you mention, you are rapidly adding to your database. Of course, this is a commercial offering too, so you'd have to balance the costs.
Let me know if this helps, and happy to answer any other questions you have.
Marcus
There is a PHP sample here: http://weblite.ca/svn/dataface/modules/tm/trunk/lib/googleTranslatePlugin.php
It allows you to send an array and get an array back: its getTranslations() translates all of the user-provided strings into the target language using the Google Translate API and returns an array of source=>target strings.
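Building on the string-level reuse idea above, here is a rough Java sketch (my own, not from the plugin) of a translation memory: deduplicate the input, check a local cache, and send only the misses in one batched request. translateBatch is a placeholder for whatever Google Translate client call you actually use, not a real API.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class TranslationMemory {
    // Local cache: source string -> previously fetched translation.
    private final Map<String, String> memory = new LinkedHashMap<>();

    // Placeholder for the real client call that sends a batch of strings
    // to the Google Translate API and returns the translations in order.
    private final Function<List<String>, List<String>> translateBatch;

    public TranslationMemory(Function<List<String>, List<String>> translateBatch) {
        this.translateBatch = translateBatch;
    }

    public Map<String, String> translate(List<String> sources) {
        // Collect only the strings we have never translated before.
        List<String> misses = new ArrayList<>();
        for (String s : sources) {
            if (!memory.containsKey(s) && !misses.contains(s)) {
                misses.add(s);
            }
        }
        // One batched API call for all misses instead of one call per word.
        if (!misses.isEmpty()) {
            List<String> translated = translateBatch.apply(misses);
            for (int i = 0; i < misses.size(); i++) {
                memory.put(misses.get(i), translated.get(i));
            }
        }
        // Assemble the result entirely from the cache.
        Map<String, String> result = new LinkedHashMap<>();
        for (String s : sources) {
            result.put(s, memory.get(s));
        }
        return result;
    }
}
```

With product vocabularies like the ones in the question (brand names and category words repeated across thousands of records), the hit rate of such a cache grows quickly, so most records cost no API calls at all.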

Deprecated YouTube APIs

On this page, https://developers.google.com/youtube/youtube-api-list, I read the heading "YouTube API Subject to the Deprecation Policy".
Listed on that page is the YouTube IFrame Player API.
Does this mean that this API is deprecated?
I hope someone can shed some light on this.
Thanks
I think that section 7 of the YouTube API terms of service helps clarify things a bit:
Google will announce if it intends to discontinue or make backwards incompatible changes to this API or Service. Google will use commercially reasonable efforts to continue to operate those YouTube API versions and features identified at http://developers.google.com/youtube/youtube-api-list without these changes until the later of: (i) one year after the announcement or (ii) April 20, 2015, unless (as Google determines in its reasonable good faith judgment): required by law or third party relationship (including if there is a change in applicable law or relationship), or doing so could create a security risk or substantial economic or material technical burden.
So in other words, there's no announced deprecation of the APIs on the list you pointed to, but Google reserves the right to announce deprecations that fall under the policy above. Here are the API technologies that have been officially deprecated:
https://developers.google.com/youtube/2.0/developers_guide_protocol_deprecated

Entity Extraction/Recognition with free tools while feeding Lucene Index

I'm currently investigating options to extract person names, locations, tech words and categories from text (a lot of articles from the web), which will then be fed into a Lucene/ElasticSearch index. The additional information is then added as metadata and should increase the precision of the search.
E.g. when someone queries 'wicket' he should be able to decide whether he means the cricket sport or the Apache project. I tried to implement this on my own with minor success so far. Now I have found a lot of tools, but I'm not sure whether they are suited for this task, which of them integrates well with Lucene, or whether the precision of their entity extraction is high enough.
Dbpedia Spotlight, the demo looks very promising
OpenNLP requires training. Which training data to use?
OpenNLP tools
Stanbol
NLTK
balie
UIMA
GATE -> example code
Apache Mahout
Stanford CRF-NER
maui-indexer
Mallet
Illinois Named Entity Tagger (not open source, but free)
wikipedianer data
My questions:
Does anyone have experience with some of the tools listed above and their precision/recall? And where training data is required, is it available?
Are there articles or tutorials where I can get started with entity extraction (NER) for each tool?
How can they be integrated with Lucene?
Here are some questions related to that subject:
Does an algorithm exist to help detect the "primary topic" of an English sentence?
Named Entity Recognition Libraries for Java
Named entity recognition with Java
The problem you are facing in the 'wicket' example is called entity disambiguation, not entity extraction/recognition (NER). NER can be useful, but only when the categories are specific enough. Most NER systems don't have enough granularity to distinguish between a sport and a software project (both types would fall outside the typically recognized types: person, org, location).
For disambiguation, you need a knowledge base against which entities are disambiguated. DBpedia is a typical choice due to its broad coverage. See my answer to How to use DBPedia to extract Tags/Keywords from content?, where I provide more explanation and mention several tools for disambiguation, including:
Zemanta
Maui-indexer
Dbpedia Spotlight
Extractiv (my company)
These tools often use a language-independent API like REST, and I do not know of any that directly provide Lucene support, but I hope my answer has been beneficial for the problem you are trying to solve.
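To give a feel for the REST style these tools expose, here is a small Java sketch against the public DBpedia Spotlight annotate endpoint. The URL and parameters reflect the public demo service as I know it and may have changed, so treat them as assumptions rather than a definitive client.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SpotlightDemo {
    public static void main(String[] args) throws Exception {
        // Ask Spotlight to disambiguate "Wicket" against DBpedia.
        String text = "Welcome to Apache Wicket";
        URL url = new URL("https://api.dbpedia-spotlight.org/en/annotate?text="
                + URLEncoder.encode(text, "UTF-8") + "&confidence=0.5");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Accept", "application/json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // JSON listing disambiguated DBpedia URIs
            }
        }
    }
}
```

The response maps each surface form to a DBpedia resource URI, which is exactly the distinction the 'wicket' example needs, and those URIs can then be stored as metadata fields in the index.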
You can use OpenNLP to extract names of people, places and organisations without training. You just use pre-existing models, which can be downloaded from here: http://opennlp.sourceforge.net/models-1.5/
For an example of how to use one of these models, see: http://opennlp.apache.org/documentation/1.5.3/manual/opennlp.html#tools.namefind
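Tying this back to the Lucene-integration question, here is a minimal sketch that runs the pre-trained person-name model and copies each hit into a Lucene document as a metadata field. The model path and the field names are my own assumptions.

```java
import java.io.FileInputStream;
import java.io.InputStream;

import opennlp.tools.namefind.NameFinderME;
import opennlp.tools.namefind.TokenNameFinderModel;
import opennlp.tools.tokenize.SimpleTokenizer;
import opennlp.tools.util.Span;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;

public class EntityEnricher {
    public static void main(String[] args) throws Exception {
        // Pre-trained model from the models-1.5 download linked above.
        try (InputStream modelIn = new FileInputStream("en-ner-person.bin")) {
            NameFinderME finder = new NameFinderME(new TokenNameFinderModel(modelIn));

            String text = "Wicket Wystri Warrick was a male Ewok scout.";
            String[] tokens = SimpleTokenizer.INSTANCE.tokenize(text);

            // Index the raw text plus one keyword field per recognized person.
            Document doc = new Document();
            doc.add(new TextField("body", text, Field.Store.YES));
            for (Span span : finder.find(tokens)) {
                StringBuilder name = new StringBuilder();
                for (int i = span.getStart(); i < span.getEnd(); i++) {
                    if (name.length() > 0) name.append(' ');
                    name.append(tokens[i]);
                }
                doc.add(new StringField("person", name.toString(), Field.Store.YES));
            }
            // doc can now go to IndexWriter.addDocument(doc) as usual.
        }
    }
}
```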
Rosoka is a commercial product that provides a computation of "salience", which measures the importance of a term or entity to the document. Salience is based on linguistic usage rather than frequency. Using the salience values, you can determine the primary topic of the document as a whole.
The output is in your choice of XML or JSON which makes it very easy to use with Lucene.
It is written in java.
There is an Amazon Cloud version available at https://aws.amazon.com/marketplace/pp/B00E6FGJZ0. The cost to try it out is $0.99/hour. The Rosoka Cloud version does not have all of the Java API features available to it that the full Rosoka does.
Yes, both versions perform entity and term disambiguation based on linguistic usage.
Disambiguation, whether by human or software, requires enough contextual information to determine the difference. The context may be contained within the document, within a corpus constraint, or within the context of the users, the former being more specific and the latter having the greater potential ambiguity. I.e. typing the keyword "wicket" into a Google search could refer to cricket, Apache software or the Star Wars Ewok character (i.e. an entity). The sentence "The wicket is guarded by the batsman" has contextual clues within the sentence to interpret it as an object. "Wicket Wystri Warrick was a male Ewok scout" should interpret "Wicket" as the given name of the person entity "Wicket Wystri Warrick". "Welcome to Apache Wicket" has the contextual clues that "Wicket" is part of a project name, etc.
Lately I have been fiddling with Stanford CRF NER. They have released quite a few versions: http://nlp.stanford.edu/software/CRF-NER.shtml
The good thing is you can train your own classifier. You should follow the link, which has guidelines on how to train your own NER: http://nlp.stanford.edu/software/crf-faq.shtml#a
Unfortunately, in my case, the named entities are not extracted effectively from the documents. Most of the entities go undetected.
Just in case you find it useful.
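If you want to try the same experiment, loading one of the serialized classifiers that ship with the Stanford NER download takes only a few lines; the classifier path below is an assumption based on the standard distribution layout, so adjust it to wherever your model lives.

```java
import edu.stanford.nlp.ie.crf.CRFClassifier;
import edu.stanford.nlp.ling.CoreLabel;

public class StanfordNerDemo {
    public static void main(String[] args) throws Exception {
        // Serialized 3-class model (PERSON/ORGANIZATION/LOCATION) from the
        // Stanford NER download; swap in your own model after training.
        CRFClassifier<CoreLabel> classifier = CRFClassifier.getClassifier(
                "classifiers/english.all.3class.distsim.crf.ser.gz");

        // classifyToString keeps the original text and appends /TAG markers.
        System.out.println(classifier.classifyToString(
                "The wicket is guarded by the batsman at Lord's in London."));
    }
}
```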

Cheap places for MSDN Licenses

A similar question has been asked: MSDN subscriptions on the cheap?, but I am not interested in the solutions provided:
I am not developing a product for sale; I am starting up a consulting company, so Empower is not an option.
I have visited the links to MS regarding MSDN subscriptions and they do not point to a way to get an inexpensive copy.
I am not interested in suggestions that I become an MVP. Frankly, I want to focus on developing my company, not jumping through MS's hoops.
There are really only a few options available:
Buy it at standard price
Become a Microsoft Certified Partner and get a good discount (actually much simpler than you would think; I did it in under two weeks for my business)
Find an MVP buddy who is willing to share a free giveaway
But in all reality, these are the ONLY legal options. You might also try calling Microsoft, you never know what might happen, they have many special programs that are not necessarily publicly advertised.
What you want is the Action Pack: https://partner.microsoft.com/US/40016455.
(Note, as an employee of Microsoft, I apologize that you have to LOG OUT of your LiveID to see this page if that LiveID is not already attached to a Registered Partner.)
You don't have to be certified to get access to this, just registered (there are three levels of partnership: 1. Registered, 2. Certified, 3. Gold Certified). You do have to pass a fairly simple assessment test, though.
See the pdf referenced at https://partner.microsoft.com/US/40082823 for an overview of the process.
One last thing - if you are a student (I suspect the OP is not), you can get many Microsoft tools free from http://www.dreamspark.com.
Surely your consultancy will need a website in ASP.NET and perhaps your clients would like a widget that talks directly to a web service on your site? There's your product.
Also, look into "Value added Services" amongst the Empower documentation.
I'm on the Empower program - there really aren't any barriers to entry, as such.
I used to go directly through MS, but nowadays, I always order mine through Xtras.net - they have good multi-year discounts and you manage the subscription online through Microsoft's site as normal.
Does Empower require that the 'main' use is developing a product?
You can always develop a product as well - it doesn't have to be very successful; perhaps something to display the time in a window?