How to use HERE Maps Telematics toll cost with GPS coordinates

I am trying to calculate toll costs using the HERE API https://tce.api.here.com/2/tollcost.json, but I don't want to use link IDs. I only want to use GPS coordinates to map the path and find all of the toll costs that would be associated with it. So far it looks like I need to use link IDs. If this is not the case, can someone provide an example of the route value that is needed?

Using our Fleet Telematics API, you can get the total toll cost between/along a set of GPS coordinates (waypoints).
https://developer.here.com/documentation/fleet-telematics/dev_guide/topics/calculation-considerations.html
This example (insert your app_id and app_code) will return the total toll cost along the whole route:
https://fleet.api.here.com/2/calculateroute.json?mode=fastest;truck;traffic:disabled&currency=EUR&rollups=total&waypoint0=52.51,13.42&waypoint1=45.747353,11.733903&tollVehicleType=3&app_id=<app_id>&app_code=<app_code>&routeMatch=1
You can also get details of the toll cost per country or per link ID if necessary using the rollups parameter.
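As a sketch of how you might call this from code, here is a small Python helper that builds the same calculateroute.json request from a list of GPS waypoints. The credential values are placeholders, and the parameter names are taken directly from the example URL above:

```python
from urllib.parse import urlencode

BASE = "https://fleet.api.here.com/2/calculateroute.json"

def toll_cost_url(waypoints, app_id, app_code, currency="EUR", vehicle_type=3):
    """Build the request URL for a total-toll-cost query along GPS waypoints."""
    params = {
        "mode": "fastest;truck;traffic:disabled",
        "currency": currency,
        "rollups": "total",           # total cost; per-country/per-link via rollups
        "tollVehicleType": vehicle_type,
        "routeMatch": 1,              # map-match the raw GPS coordinates to links
        "app_id": app_id,
        "app_code": app_code,
    }
    for i, (lat, lon) in enumerate(waypoints):
        params[f"waypoint{i}"] = f"{lat},{lon}"
    return BASE + "?" + urlencode(params)

url = toll_cost_url([(52.51, 13.42), (45.747353, 11.733903)],
                    "<app_id>", "<app_code>")
print(url)
```

You would then fetch this URL with any HTTP client and read the toll cost from the JSON response.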

Related

How can pvlib-python retrieve year-long archived weather forecasts from the global model (GFS)?

I have seen how easily pvlib-python can obtain weather forecasts, as presented in this link: https://pvlib-python.readthedocs.io/en/latest/forecasts.html
In that link the example is just for illustration, and the retrieved weather data seem to be limited in length (no more than a month into the past). So I wonder whether the archived weather forecasts retrieved by pvlib for a practical implementation can be longer.
Can pvlib-python retrieve archived GFS weather forecasts for a year?
For example, I am looking for the temperature and solar irradiance (GHI) for the entire 2018. Can pvlib-python do that, and if so how?
This is not possible with pvlib-python. I think it's out-of-scope and I don't anticipate adding this feature in the future.
However, I wrote a Python script to download some archived point forecast data from the NOAA NOMADS server: https://github.com/wholmgren/get_nomads/ It's efficient in that it only downloads the data that you need, but it's still fairly slow and error-prone.
I wrote a small client for the CAMS radiation service: https://github.com/GiorgioBalestrieri/cams_radiation_python.
It contains a notebook showing how to combine this with pvlib.
From the website:
Copernicus Atmosphere Monitoring Service (CAMS) radiation service provides time series of Global, Direct, and Diffuse Irradiations on horizontal surface, and Direct Irradiation on normal plane (DNI) for the actual weather conditions as well as for clear-sky conditions. The geographical coverage is the field-of-view of the Meteosat satellite, roughly speaking Europe, Africa, Atlantic Ocean, Middle East (-66° to 66° in both latitudes and longitudes). Time coverage is 2004-02-01 up to 2 days ago. Data are available with a time step ranging from 1 min to 1 month. The number of automatic or manual requests is limited to 40 per day.
See the repo readme file for more information.
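Once you have a year of data downloaded (via CAMS, NOMADS, or any other archive), the year-long analysis itself is straightforward with pandas. A minimal sketch, using synthetic hourly GHI and temperature values for 2018 in place of real downloaded data:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for downloaded data: one value per hour for all of 2018.
idx = pd.date_range("2018-01-01", "2018-12-31 23:00", freq="h", tz="UTC")
hour = idx.hour.to_numpy()
ghi = np.clip(900 * np.sin(np.pi * (hour - 6) / 12), 0, None)  # crude diurnal GHI, W/m^2
temp = 10 + 10 * np.sin(np.pi * (hour - 8) / 12)               # crude air temperature, degC

df = pd.DataFrame({"ghi": ghi, "temp_air": temp}, index=idx)

# Monthly means over the whole of 2018
monthly = df.resample("MS").agg({"ghi": "mean", "temp_air": "mean"})
print(monthly.shape)  # 12 months x 2 columns
```

The same resampling works unchanged on a real downloaded time series, as long as it has a datetime index.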

OptaPlanner example for Capacitated Vehicle Routing with Time Windows?

I am new to OptaPlanner.
I want to build a solution where I will have a number of locations to deliver items to from one single location, and I also want to use OpenStreetMap distance data for calculating the distances.
Initially I used jsprit, but for more than 300 deliveries it takes more than 8 minutes with 20 threads. That's why I am trying OptaPlanner.
I want to map 1000 deliveries within 1 minute.
Does anyone know any reference code or reference material I can start with?
Thanks in advance :)
CVRPTW is a standard example: just open the examples app, choose vehicle routing, and then import one of the Belgium datasets with time windows. The code is in the zip too.
To scale to 1,000 deliveries and especially beyond, you'll want to use "nearby selection" (see the reference manual), which isn't on by default but makes a huge difference.
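To give an intuition for what nearby selection does (this is an illustrative Python sketch, not OptaPlanner code): the solver precomputes, for every location, its k nearest neighbours, and then restricts local-search moves to geographically close pairs instead of all pairs. The precomputation looks roughly like:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def nearby_map(locations, k):
    """For each location index, the indices of its k nearest neighbours."""
    out = {}
    for i, a in enumerate(locations):
        dists = sorted((haversine_km(a, b), j)
                       for j, b in enumerate(locations) if j != i)
        out[i] = [j for _, j in dists[:k]]
    return out

# Four illustrative Belgian cities: Brussels, Ghent, Antwerp, Liege
depots = [(50.85, 4.35), (51.05, 3.72), (51.22, 4.40), (50.63, 5.57)]
print(nearby_map(depots, 2))
```

With 1,000 deliveries, moves are then only generated among each location's neighbour list rather than across all ~500,000 pairs, which is why it matters at scale.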

A better way to handle Long Lat distances

OK, so I don't have an issue here, but I'm just wondering whether there's a more standardized way to handle what I'm doing.
Essentially I have a DB table full of locations, including longitude and latitude; there could potentially be thousands of locations. I also have some functionality to search by your postcode, so you can then see the closest x of the stored locations to you.
I've read about using the Google Maps API to do this, but I don't really want to send thousands of requests to the Google Maps API.
So here's what I'm doing. I have a stored procedure to which I pass the user's longitude and latitude. I then use these to compute a column called Distance, by which I order the data. I work out the Distance column using the logic below:
SQRT(SQUARE((CAST(USERSLAT AS decimal(9,6))) - Latitude) + SQUARE((CAST(USERSLONG AS decimal(9,6)))-(Longitude))) AS Distance
Essentially this is the classic Pythagorean theorem (c² = a² + b²) to find the distance between two coordinates, and using these results I can theoretically see the closest locations to the user. Once I have this data I can use the Google Maps API to find the exact distances. Is this an OK way to do things? I have a nagging feeling in the back of my head that I'm missing something.
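One thing the plain Pythagorean formula misses: it treats a degree of longitude as equal to a degree of latitude, but away from the equator a degree of longitude is shorter by a factor of cos(latitude). Scaling the longitude difference (the equirectangular approximation) keeps the cheap ordering trick while fixing most of the distortion. A Python sketch of the difference:

```python
import math

def planar(lat1, lon1, lat2, lon2):
    """The plain Pythagorean formula on raw degrees, as in the question."""
    return math.sqrt((lat1 - lat2) ** 2 + (lon1 - lon2) ** 2)

def equirectangular(lat1, lon1, lat2, lon2):
    """Same idea, but longitude scaled by cos(latitude)."""
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = (lon1 - lon2) * math.cos(mean_lat)
    dy = lat1 - lat2
    return math.sqrt(dx * dx + dy * dy)

# At ~51.5 deg N, 1 degree of longitude covers far less ground than 1 degree
# of latitude, but the planar formula scores both moves identically:
user = (51.5, -0.1)
north = (52.5, -0.1)   # 1 degree north
east = (51.5, 0.9)     # 1 degree east
print(planar(*user, *north), planar(*user, *east))             # both 1.0
print(equirectangular(*user, *north), equirectangular(*user, *east))
```

The same correction is easy in SQL: multiply the longitude delta by COS(RADIANS of the latitude) before squaring.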

How to use http://www.census.gov API to pull data

I am trying to query data from http://www.census.gov using their API.
I want to get the population of a particular city in the US, using the city name and the US state code.
Given that I already have a key, what other parameters do I add to the URL below to get the population?
http://api.census.gov/data/2010/sf1?key=<my key>
Any assistance will be greatly appreciated.
Judging from your query URI, you wish to access population data from the 2010 Census Summary File. You would add the GET parameters get and for to your query. Example:
http://api.census.gov/data/2010/sf1?key=<my key>&get=P0010001&for=state:06
The population tables given in the get parameter are identified with a "P", and you can use the for parameter to narrow down your scope further. Examples of valid criteria formatted as URIs can be found here...
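The API returns a JSON array whose first row is the column headers and whose remaining rows are values. A parsing sketch; the sample payload below mirrors the shape of a real response (P0010001 is total population, and 37,253,956 is the published 2010 count for California, state FIPS 06):

```python
import json

# Sample payload shaped like a Census API response: header row, then data rows.
sample = '[["P0010001","state"],["37253956","06"]]'

rows = json.loads(sample)
header, *records = rows
result = [dict(zip(header, r)) for r in records]
print(result)
```

The same header-then-rows parsing works for any combination of get and for parameters.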
EDIT: It seems that for a finer-grained search, such as cities, you're going to need to use the government's cumbersome FIPS (Federal Information Processing Standards) codes (after converting lat/lon regions to their coding system)... I've found this resource that should be helpful, specifically points 5 through 7, but it seems mega complex...
Another alternative I found is the USA Today census API; it seems that they mirror the data from the Census and they do have endpoints with data granularity at the city level... Check it out here...
No need to use the API; the data is available as CSV here: http://www.census.gov/popest/data/cities/totals/2012/SUB-EST2012.html

Determine "reach" by geographic distribution

I have a large collection of checkins for products manufactured at a distinct geographic location. I'd like to create a summary metric used to rank these products by how far, globally, they have traveled from their point of origin. For example, a product produced in Maine that is found in California, Florida, and Dublin, Ireland should rank higher than a product made in California that hasn't been seen outside of California.
What kind of algorithms should I be looking at? How would you approach this?
MS SQL Server (which I've just spotted may not be relevant to you) includes spatial data types that allow you to calculate (among other things) the distance between two points defined by their latitude and longitude. So this code:
DECLARE @p1 geography = geography::Point(@lat1, @long1, 4326);
SELECT @distance = @p1.STDistance(geography::Point(@lat2, @long2, 4326))
would load @distance with the distance in metres between the two points. I lifted the code from a scalar-valued inline function that I wrote, but it could also target table columns directly. The magic number 4326 is a reference to the Spatial Reference System Identifier (SRID) that provides answers in metres. This calculation doesn't take into account altitude or the distortion of the globe (other functions/SRIDs are available for that), but it's probably accurate enough for most purposes.
Unfortunately, if you are restricted to PostgreSQL, this answer is of no use (though it may point you in a direction for further investigation).
A reference for SQL Server can be found here: http://technet.microsoft.com/en-us/library/bb933790.aspx
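Coming back to the original ranking question: one possible "reach" metric (an illustrative sketch, not the only choice) is to score each product by the great-circle distances from its point of origin to its check-ins, e.g. the maximum or mean distance. A database-free Python sketch using the example from the question:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def reach(origin, checkins):
    """Max distance (km) from origin to any check-in; 0.0 with no check-ins."""
    return max((haversine_km(origin, c) for c in checkins), default=0.0)

maine = (44.3, -69.8)
california = (36.8, -119.4)
# Product A: made in Maine, seen in California, Florida and Dublin
a = reach(maine, [california, (27.7, -81.5), (53.35, -6.26)])
# Product B: made in California, only seen within California
b = reach(california, [(37.8, -122.4), (34.05, -118.24)])
print(a > b)  # True: the Maine product has travelled much further
```

Using the mean (or the mean of the top-k distances) instead of the max makes the score less sensitive to a single outlier check-in; which is appropriate depends on how you want one far-flung sighting to count.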