I just deployed an app to production, and using the Flight Inspiration Search for European airports returns a "DATA DOMAIN NOT FOUND FOR REQUEST" error. Are some countries not available in this API?
Here is a list of the airports that return the error:
LIL
CDG
BRU
EIN
CRL
OST
RTM
ANR
SEN
The Flight Inspiration Search and Flight Cheapest Date Search APIs are "inspirational APIs" built on a pre-computed cache. That cache covers a limited number of origin-destination pairs and does not include all possible cities; for those you will need to use the Flight Offers Search API.
We are working to increase the data coverage of these two APIs to offer more and better data.
Note: the API works with city IATA codes, not airport codes (NYC for New York rather than JFK, and PAR for Paris rather than CDG).
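For example, a quick way to check whether a city code is covered by the cache is a plain request against the Flight Inspiration Search endpoint. A minimal sketch in Python, assuming the v1 /shopping/flight-destinations path and an OAuth2 access token you have already obtained (both are my reading of the current docs, so verify against them):

```python
import requests

# Sketch only: query Flight Inspiration Search with a *city* code (PAR),
# not an airport code (CDG). ACCESS_TOKEN is a placeholder for a token
# obtained via the OAuth2 client-credentials flow.
ACCESS_TOKEN = "<your access token>"

resp = requests.get(
    "https://test.api.amadeus.com/v1/shopping/flight-destinations",
    params={"origin": "PAR"},  # city code, e.g. PAR rather than CDG
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
print(resp.status_code)
print(resp.json())
```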
I'm trying to use Amadeus' Airport Nearest Relevant API and I'm running into a weird case. Given the following parameters:
latitude: 45.366431
longitude: -75.7955389
radius: 100
I would expect to get the nearest airport, Ottawa International, but instead I'm getting OGS, an airport in New York state roughly 86 km away, compared to Ottawa's 10 km. In fact, Ottawa doesn't even show up in the results; the response exclusively lists New York state airports, none in Canada, let alone Ottawa. This seems weird.
One thing to note is that I'm using the sandbox API, not the production API. Would that affect anything?
You are right: the test environment offers only a subset of our data (you can find the list of supported countries for Airport Nearest Relevant here). If you want access to all the data, you will need to move to production.
The list of supported countries/cities/airports per API will soon be published on the portal.
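For reference, here is a hedged sketch of the same Airport Nearest Relevant query against both the test and the production host; the path and parameter names follow the Amadeus docs as I understand them, so double-check them before relying on this:

```python
import requests

# Sketch only: run the query from the question against both environments.
# ACCESS_TOKEN is a placeholder (each environment needs its own token).
ACCESS_TOKEN = "<your access token>"
params = {"latitude": 45.366431, "longitude": -75.7955389, "radius": 100}

for host in ("https://test.api.amadeus.com", "https://api.amadeus.com"):
    resp = requests.get(
        f"{host}/v1/reference-data/locations/airports",
        params=params,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    print(host, resp.status_code)
```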
I have seen how easily pvlib-python can obtain weather forecasts, as presented in this link: https://pvlib-python.readthedocs.io/en/latest/forecasts.html
In that link the example is just for illustration, and the retrieved weather data seem to be limited in length (no more than about a month into the past). So I wonder whether the archived weather forecasts retrieved by pvlib for a practical implementation can go back further.
Can pvlib-python retrieve archived GFS weather forecasts for a year?
For example, I am looking for the temperature and solar irradiance (GHI) for all of 2018. Can pvlib-python do that, and if so, how?
This is not possible with pvlib-python. I think it's out-of-scope and I don't anticipate adding this feature in the future.
However, I wrote a Python script to download some archived point forecast data from the NOAA NOMADS server: https://github.com/wholmgren/get_nomads/ It's efficient in that it only downloads the data that you need, but it's still fairly slow and error-prone.
I wrote a small client for the CAMS radiation service: https://github.com/GiorgioBalestrieri/cams_radiation_python.
It contains a notebook showing how to combine this with pvlib.
From the website:
Copernicus Atmosphere Monitoring Service (CAMS) radiation service provides time series of Global, Direct, and Diffuse Irradiations on horizontal surface, and Direct Irradiation on normal plane (DNI) for the actual weather conditions as well as for clear-sky conditions. The geographical coverage is the field-of-view of the Meteosat satellite, roughly speaking Europe, Africa, Atlantic Ocean, Middle East (-66° to 66° in both latitudes and longitudes). Time coverage is 2004-02-01 up to 2 days ago. Data are available with a time step ranging from 1 min to 1 month. The number of automatic or manual requests is limited to 40 per day.
See the repo readme file for more information.
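If you are on a recent pvlib version (0.8 or later, if I remember correctly), iotools also ships a get_cams helper that talks to the CAMS/SoDa service directly, which may save you the separate client. A rough sketch; the exact signature and the mapped column names are my assumptions, so check the pvlib.iotools.get_cams documentation before using it:

```python
import pandas as pd
from pvlib.iotools import get_cams  # available in recent pvlib releases (assumption: 0.8+)

# Sketch only: fetch hourly CAMS radiation data for all of 2018 at an example
# site inside the Meteosat field of view. CAMS/SoDa requires a registered email.
data, meta = get_cams(
    latitude=48.0,
    longitude=2.0,
    start=pd.Timestamp("2018-01-01"),
    end=pd.Timestamp("2018-12-31"),
    email="you@example.com",        # your registered SoDa email (placeholder)
    identifier="cams_radiation",    # observed-weather series, as opposed to 'mcclear'
    time_step="1h",
)

print(data.columns)  # should include 'ghi' if pvlib maps the variable names
print(data.head())
```

Note that CAMS covers irradiance only; temperature for 2018 would still have to come from another archive (e.g. the NOMADS script above or a reanalysis product).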
I am trying to query data from http://www.census.gov using their API.
I want to get the population of a particular city in the US using the city name and the US state code.
Given that I already have a key, what other parameters do I add to the URL below so that I can get the population?
http://api.census.gov/data/2010/sf1?key=<my key>
Any assistance will be greatly appreciated.
Judging from your query URI, you wish to access population data from the 2010 Census Summary File 1. You would add the GET parameters get and for to your query. Example:
http://api.census.gov/data/2010/sf1?key=b48301d897146e8f8efd9bef3c6eb1fcb864cf&get=P0010001&for=state:06
Population tables, as given in the get parameter, are identified with a "P", and you can use the for parameter to further narrow down your scope. Examples of valid criteria formatted as URIs can be found here...
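For example, the query above can be reproduced in Python roughly as follows (the key is a placeholder, and the expected response shape is my understanding of the API's JSON output):

```python
import requests

# Sketch only: total population (P0010001) for California (FIPS state code 06)
# from the 2010 SF1 file, using the same base URL as above.
resp = requests.get(
    "http://api.census.gov/data/2010/sf1",
    params={"key": "<my key>", "get": "P0010001", "for": "state:06"},
)
print(resp.json())  # expected shape: [["P0010001", "state"], ["37253956", "06"]]
```

For the city-level figure in the question you would, as the edit below notes, swap the for clause for a FIPS place code, something like get=P0010001&for=place:<FIPS place>&in=state:<FIPS state>.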
EDIT: It seems that for a finer-grained search, such as cities, you're going to need to use the government's cumbersome FIPS (Federal Information Processing Standard) codes (after converting lat/lon regions to their coding system)... I've found this resource that should be helpful, specifically points 5 through 7, but it seems mega complex...
Another alternative I found is the USA Today census API; it seems that they mirror the data from the Census Bureau, and they do have endpoints with data granularity at the city level... Check it out here...
No need to use the API; the data is available as CSV here: http://www.census.gov/popest/data/cities/totals/2012/SUB-EST2012.html
I wonder how the hierarchical relationships between words in WordNet are built.
Is that done manually or via computer techniques?
If it's based on computer techniques, what are they?
From the FAQ:
q.1.2 Where do you get the definitions for WordNet? (short answer) Our
lexicographers write them.
Where do you get the definitions for WordNet? (long answer) From the
foreword to WordNet: An Electronic Lexical Database, pp. xviii-xix:
People sometimes ask, "Where did you get your words?" We began in 1985
with the words in Kučera and Francis's Standard Corpus of Present-Day
Edited English (familiarly known as the Brown Corpus), principally
because they provided frequencies for the different parts of speech.
We were well launched into that list when Henry Kučera warned us that,
although he and Francis owned the Brown Corpus, the syntactic tagging
data had been sold to Houghton Mifflin. We therefore dropped our plan
to use their frequency counts (in 1988 Richard Beckwith developed a
polysemy index that we use instead). We also incorporated all the
adjectives pairs that Charles Osgood had used to develop the semantic
differential. And since synonyms were critically important to us, we
looked words up in various thesauruses: for example, Laurence Urdang's
little "Basic Book of Synonyms and Antonyms" (1978), Urdang's revision
of Rodale's "The Synonym Finder" (1978), and Robert Chapman's 4th
edition of "Roget's International Thesaurus" (1977) -- in such works,
one word quickly leads on to others. Late in 1986 we received a list
of words compiled by Fred Chang at the Naval Personnel Research and
Development Center, which we compared with our own list; we were
dismayed to find only 15% overlap.
So Chang's list became input. And in 1993 we obtained the list of
39,143 words that Ralph Grishman and his colleagues at New York
University included in their common lexicon, COMLEX; this time we were
dismayed that WordNet contained only 74% of the COMLEX words. But that
list, too, became input. In short, a variety of sources have
contributed; we were not well disciplined in building our vocabulary.
The fact is that the English lexicon is very large, and we were lucky
that our sponsors were patient with us as we slowly crawled up the
mountain.
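So the hierarchy is hand-built by lexicographers rather than induced automatically. If you just want to browse those hand-built hypernym/hyponym links programmatically, one common way (my example, not from the FAQ) is NLTK's WordNet corpus reader:

```python
from nltk.corpus import wordnet as wn  # requires: import nltk; nltk.download('wordnet')

# The hypernym links queried here are the manually curated relations described
# above; nothing is inferred automatically.
dog = wn.synset("dog.n.01")
print(dog.hypernyms())          # e.g. [Synset('canine.n.02'), Synset('domestic_animal.n.01')]
print(dog.hypernym_paths()[0])  # one full path up to the root, entity.n.01
```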
I've gotten a request to show a person's local time based on their phone number. I know our local GMT offset, so I can handle USA phones via a database table we have that links US zip_code to GMT offset (e.g. -5). But I've no clue how to convert non-US phone numbers or country names (these people obviously don't have a zip code).
If you care: my employer, a college, wants to solicit our alumni for donations and do it during reasonable hours.
Sorry to all that I didn't clearly state that I was considering HOME phone numbers. So roaming isn't an issue. I'm looking for some reference table or Oracle application I can source this info from.
Florida has two time zones, but many countries have only one. You need this table: http://en.wikipedia.org/wiki/List_of_country_calling_codes . Parse the country code out of the phone number: look for 1 plus an area code for NANPA countries (those using the same 1 + area code allocation as the USA), or 7 for Russia and Kazakhstan. If neither matches, check whether the number starts with one of the 2-digit calling codes, and then the 3-digit ones (see the sketch after this answer).
Remember that the first few digits of the number may be the international dialing prefix, and are not properly part of the telephone number.
For countries that span more than one time zone, see if you can get allocation information from their national telecom regulator. For the USA and other NANPA countries, check out http://www.nanpa.com/ .
Of course your results will be far from perfect, but hopefully you will wake fewer customers from their night's sleep.
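A rough Python sketch of the prefix-matching step described above; the table is a tiny illustrative subset of the Wikipedia list, and the code assumes the international dialing prefix has already been stripped (i.e. the number is in +<country code><subscriber number> form):

```python
from typing import Optional

# Illustrative subset only; load the full country-calling-code list in practice.
COUNTRY_CODES = {
    "1": "NANPA (US/Canada/Caribbean)",
    "7": "Russia/Kazakhstan",
    "33": "France",
    "44": "United Kingdom",
    "47": "Norway",
    "358": "Finland",
}

def country_of(number: str) -> Optional[str]:
    """Return the country (or numbering plan) for an E.164-style number,
    trying 1-digit, then 2-digit, then 3-digit country codes."""
    digits = number.lstrip("+")
    for length in (1, 2, 3):
        match = COUNTRY_CODES.get(digits[:length])
        if match:
            return match
    return None

print(country_of("+4722334455"))   # Norway
print(country_of("+14165550199"))  # NANPA (US/Canada/Caribbean)
```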
Local time is one thing but, if you have worldwide customers, there are also local habits to take into account.
People typically go to bed earlier in Norway than in Spain (and they are in the same time zone).
You might be able to get the phone company to feed you location data (this info should exist for land lines and must exist for cells) but expect to pay.
Some nations are easy, since they are in a single time zone. Look at Europe: you can cover millions of people just by using the international dialing code, e.g. +47 for Norway.
Phone-number allocations are usually done by a national telecom authority, so you could probably get the information for free.
As you already know, this would only give you the default time zone, since people might be anywhere on the planet at the time. Also, number allocation might not distinguish between time zones at all, so the approach is imperfect but potentially useful for providing default settings.
Look in the phone book. Ours has quite a few pages mapping area codes onto countries/provinces/states. Then you have to map geographical locations onto time zones, but that is pretty straightforward.
Impossible. If I drive about 400 miles east (I'm on the west coast of the US), I'll break your algorithm by having an XXX number in a YYY time zone.
Now if this is a cell phone app, it does seem possible with something called NITZ.
I think Danie, Bortzmeyer, and others are overthinking the problem. The goal is not to maximize the calling window; it's to find an acceptable time.
Let's take the US and consider only the 4 major time zones. Say we define acceptable as 10 AM to 7 PM. I doubt even the Norwegian bachelor farmers go to bed before 7 PM.
So if you know that the phone is in the US, don't make any call before 1 PM Eastern; that way, whether they are in NYC or LA, it's 10 AM or later. And make no calls after 7 PM Eastern. Who cares if it's Florida's main time zone or its hour-later panhandle? Dallas or El Paso: same state, different time zones. For the US, just filter out AK and HI. The only seriously difficult country is Russia -- lots of time zones.
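To make the acceptable-window idea concrete, here is a small sketch (my own illustration; the offsets are hard-coded assumptions and DST is ignored) that computes the UTC hours during which it is between 10 AM and 7 PM everywhere in the destination country:

```python
ACCEPTABLE_START = 10  # 10 AM local
ACCEPTABLE_END = 19    # 7 PM local

def calling_window_utc(min_offset_hours, max_offset_hours):
    """Return (start, end) in UTC hours during which the local time is within
    the acceptable window in *every* zone of the destination country, or None
    if the country spans too many zones for such a window to exist."""
    # The westernmost zone (smallest offset) is the last to reach 10 AM;
    # the easternmost zone (largest offset) is the first to hit 7 PM.
    start_utc = ACCEPTABLE_START - min_offset_hours
    end_utc = ACCEPTABLE_END - max_offset_hours
    if start_utc >= end_utc:
        return None
    return start_utc, end_utc  # hours may exceed 24; normalize mod 24 if needed

# Continental US, standard time: UTC-8 (Pacific) to UTC-5 (Eastern)
print(calling_window_utc(-8, -5))   # (18, 24), i.e. 1 PM-7 PM Eastern
# Russia, roughly UTC+2 to UTC+12: no single window works
print(calling_window_utc(2, 12))    # None
```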