One-to-many travel time calculation and export

I have an Excel document with every Australian postcode, including longitude and latitude coordinates for each.
What I need to be able to do is get the travel time from one location in Australia to each of the locations in this Excel document, and end up with these times exported into my original document. I'm happy to do this manually with a separate export, using some Excel formulas to match the data.
Does anyone know of any tools/websites that can do this? I know there are some websites offering APIs to do a one-to-many matrix, but unless you're only doing a few locations it would be incredibly tedious and time-consuming to modify the API call by hand for about 1000 locations.
I am also not really good with API calls, so I would need some good documentation to assist if this was the path I had to take.
Appreciate any help or resources people can direct me to.
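For what it's worth, a short script makes the 1000-location case much less tedious than hand-editing API calls. Here is a minimal sketch in Python against Google's Distance Matrix web service, batching destinations per request; the file name, column names, origin, and API key are all assumptions you'd adapt:

    import csv
    import requests

    API_KEY = "YOUR_API_KEY"      # assumed: your own Distance Matrix key
    ORIGIN = "-33.8688,151.2093"  # assumed origin: Sydney CBD
    BATCH = 25                    # the service caps destinations per request

    # Assumed CSV export of the Excel sheet with columns: postcode,lat,long
    with open("postcodes.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    results = []
    for i in range(0, len(rows), BATCH):
        batch = rows[i:i + BATCH]
        dests = "|".join(f"{r['lat']},{r['long']}" for r in batch)
        resp = requests.get(
            "https://maps.googleapis.com/maps/api/distancematrix/json",
            params={"origins": ORIGIN, "destinations": dests, "key": API_KEY},
        ).json()
        # One origin, so all elements live in the first (and only) row.
        for r, elem in zip(batch, resp["rows"][0]["elements"]):
            seconds = elem["duration"]["value"] if elem["status"] == "OK" else None
            results.append((r["postcode"], seconds))

    # Write travel times out so a VLOOKUP can match them back to the sheet.
    with open("travel_times.csv", "w", newline="") as f:
        csv.writer(f).writerows([("postcode", "seconds"), *results])

With this batching, ~1000 destinations is only about 40 requests, though the service is rate-limited and billed per element, so check the provider's current quotas.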

Related

Automating the process of creating Word docs

I have a .doc template I use for building CVs for many friends.
I'm trying to automate this process using a simple library or program, for example, one that can accept data like name, email, phone number, and job title, and create the .doc automatically.
What framework could I use to do this as quickly as possible?
Thanks,
Tal
Where exactly are you keeping this template, and are your friends plugging in the data, or are you doing it all yourself?
No matter what, you're basically looking to do a data merge. An example of a data merge is a mail merge:
https://support.microsoft.com/en-us/help/294683/how-to-use-mail-merge-to-create-form-letters-in-word
The same approach really applies to what you're trying to accomplish.
You can take a template, specify the fields that require variable data (i.e. the information that changes), and then just use a spreadsheet to pull the data from and plug it in.
Now the question you'll probably be wondering about next is how data merges use spreadsheets. The way data merges work is that each column of data in the spreadsheet corresponds to one of the variable fields in your template. I strongly recommend you read up on this further - it's not that difficult once you get the hang of it.
The last question is probably how you'll compile the data into this spreadsheet. Are your friends going to fill out an online form? If so, you'll need some PHP and a database to store the form submissions, and then you can just export the table as a .csv file once you see enough data populated in it to do a data merge.
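For illustration, here is roughly what that form-to-spreadsheet pipeline can look like. This is a toy sketch in Python (Flask + sqlite) rather than the PHP the post suggests, and the field names are assumptions:

    import csv
    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)

    def db():
        conn = sqlite3.connect("cv_data.db")
        conn.execute("CREATE TABLE IF NOT EXISTS people "
                     "(name TEXT, email TEXT, phone TEXT, job_title TEXT)")
        return conn

    @app.route("/submit", methods=["POST"])
    def submit():
        # Store one form submission per row; assumed field names.
        conn = db()
        conn.execute("INSERT INTO people VALUES (?, ?, ?, ?)",
                     [request.form[k] for k in ("name", "email", "phone", "job_title")])
        conn.commit()
        return "thanks!"

    @app.route("/export.csv")
    def export():
        # Dump the table to a CSV that the data merge can consume.
        rows = db().execute("SELECT * FROM people").fetchall()
        with open("people.csv", "w", newline="") as f:
            w = csv.writer(f)
            w.writerow(["name", "email", "phone", "job_title"])
            w.writerows(rows)
        return "exported people.csv for the data merge"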
If you don't have access to MS Office, I'm sure you can accomplish this in OpenOffice.org instead (which is free/open-source).
Hope this helps.
At my job we do data merges all the time - for mail merges, for letters that need to be personally addressed to individual recipients, and for people who need to print dozens of different business cards for different employees. We take their business card template and just do a data merge from a spreadsheet, to save time on setting up individual files. P.S. You can also use Adobe InDesign for this, if you know how to use it.
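If you'd rather script the document generation than drive Word's mail merge UI, a short script can stamp out one file per row. Here is a minimal sketch using the python-docx library (note it writes .docx, not legacy .doc; the template path, CSV columns, and {placeholder} convention are assumptions):

    import csv
    from docx import Document  # pip install python-docx

    # Assumed CSV with columns matching the {placeholders} in the template:
    # name,email,phone,job_title
    with open("friends.csv", newline="") as f:
        for person in csv.DictReader(f):
            doc = Document("cv_template.docx")  # assumed .docx template
            # Body paragraphs only; tables would need a similar loop
            # over doc.tables.
            for paragraph in doc.paragraphs:
                for run in paragraph.runs:
                    for key, value in person.items():
                        run.text = run.text.replace("{" + key + "}", value)
            # One output file per person, named after them.
            doc.save(f"cv_{person['name']}.docx")

One caveat: Word sometimes splits text across runs, so a placeholder typed in one go, without mid-word formatting changes, is most likely to survive the run-level replace.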

Use VBA to pull store locations

I am trying to pull all the store locations from several different websites, for example Sephora (http://www.sephora.com/store-locations-events) and Freshpet (http://freshpet.com/storelocator/).
Every site constrains the radius for the search, so to find all locations, the only thing I have come up with is cycling through every zip code and then filtering out duplicates. Is there a way to use VBA to manipulate the search radius so it covers the entire United States instead of only up to 100 miles? That way, I could just do one search for each of these sites.
Thanks!
I don't think you can do it on those JavaScript sites. VBA works great with HTML, but not so great with JavaScript and Flash. Also, the code isn't even exposed, so the search results must be generated dynamically from the server. I just did a search on that Freshpet site, got the results, then right-clicked to view the source code, and nothing that I searched for was in the code.
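One common workaround for JavaScript-driven locators is to skip the page entirely: open the browser's developer tools, run a search, and look in the network tab for the JSON request the page makes, then call that endpoint directly. A sketch of the idea in Python - the URL and parameter names below are entirely hypothetical and would have to be copied from the real site's traffic:

    import requests

    # Entirely hypothetical endpoint and parameters - the real ones have
    # to be read out of the request the store-locator page makes (visible
    # in the browser dev tools' network tab while you run a search).
    url = "https://example.com/storelocator/api/search"
    params = {"lat": 40.7128, "lng": -74.0060, "radius": 100}

    data = requests.get(url, params=params).json()
    for store in data.get("stores", []):
        print(store.get("name"), store.get("address"))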

Get a list of files on S3 that have changed since X

I have about 40,000 images up on S3, which I've downloaded into my application/database and then sent out to other sites (like eBay or Magento).
This is to support a client who sells his products on a few sites - sites where you'd really rather keep a copy of the product image on their end (so they can resize it and such).
My issue right now is that I want to poke S3 every once in a while looking for new files or modified files.
I don't much like the idea of targeting each file one at a time, nor do I like the idea of bringing down all the file names and dates and then comparing them with dates I've stored. Both seem quite wasteful, especially if I want to run this every day (or every hour).
What I had hoped for, and what I'm looking for, is a way to say "give me the names of all the files that have changed since 2013-10-14 13:10:30". This would let me store just one value, and if nothing's changed, then I'd get back nothing (or something that indicated nothing).
Is there a way to get a list of changed files since X date?
I'm language agnostic, though Ruby/Rails would be cool.
Note: I've tried to figure it out with the WSDL, but it doesn't quite seem to help as much as I'd hoped.
Unfortunately S3 does not offer any support for this.
Currently your only options are to either list all the objects in the S3 bucket and check for changes, or to keep track of the changed objects separately from S3 (record the last-changed timestamp in some data store whenever you change an object).
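If you do go the listing route, the list response already includes each object's LastModified, so one pass over the keys is enough - no per-object requests. A minimal sketch with Python's boto3 (the bucket name is a placeholder):

    import boto3
    from datetime import datetime, timezone

    # Cutoff: only report objects modified after this stored timestamp.
    cutoff = datetime(2013, 10, 14, 13, 10, 30, tzinfo=timezone.utc)

    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")

    changed = []
    for page in paginator.paginate(Bucket="my-image-bucket"):  # placeholder bucket
        for obj in page.get("Contents", []):
            # LastModified comes back as a timezone-aware datetime.
            if obj["LastModified"] > cutoff:
                changed.append(obj["Key"])

    print(changed)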

How to import complex relational data into SQL Server from Excel

We have business users who are entering product information into Excel spreadsheets. I have been tasked with coming up with a way of entering this information into our SQL Server DB. The problem is that the Excel spreadsheets aren't just a flat table; they're hierarchical. They look something like this:
-[Product 1] [Other fields]...
    -[Maintenance item 1] [Other fields]...
        -[Maintenance task 1] [Other fields]...
    -[Maintenance item 2] [Other fields]...
        -[Maintenance task 2] [Other fields]...
        -[Maintenance task 3] [Other fields]...
-[Product 2] [Product Description] [Other fields]...
Etc.
So there can be 0-many maintenance items for a product, and 0-many maintenance tasks for a maintenance item. This is how the database is structured. I need to come up with a standard Excel template I can send out to our business users so they can input this information, and then figure out how to export it into SQL Server. The volume is going to be high, so I need the import to be somewhat automated. How should I do this?
Welcome to the worst possible way to store data and try to import it into a database. If at all possible, do not let them create garbage Excel spreadsheets like that. That method is bound to create many bugs in the data imports, and you will hate your life forever if you have to support this mess.
I can't believe I'm even suggesting this, but can you get them to use a simple Access database instead? It could even link directly to the SQL Server database and store the data correctly. By using Access forms, the users will find it relatively easy to add and maintain information, and you will have far fewer problems than trying to import Excel data in the form you described. It would be a far less expensive and far less error-prone solution to your problem.
If you are stuck with the format, the best way I have found to do something like this is to import it as-is into a staging table, add the IDs to every subordinate row (you may end up looping to do this), and then drag the information out to relational staging tables before importing into the production database.
You can create all this using SSIS, but it won't be easy, it won't be quick, and it will be very prone to bugs if users aren't disciplined about exactly how they enter data (and they never are without a set of forms to fill out). Make sure you reject the Excel spreadsheet completely and send it back to the user if it strays at all from the prescribed structure. Trust me on this.
I'd estimate the Access solution to take about a month and the Excel solution to take at least six months of development. Really, that's how bad this is going to be.
I don't believe you'll find an import tool that will do this for you. Instead, you're going to have to write a script to ETL the spreadsheet files. I do a lot of this in Python (I'm doing it today, in fact).
Make sure that you handle exceptions at the per-cell level, reporting to the user exactly which cell had unexpected information. With spreadsheets created by hand, it is guaranteed that you will have to handle this on a regular basis.
That said, if this is coming to you as XLSX, it might be possible to develop an XML transformation to convert it to some more tractable XML document.
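For what that kind of ETL script can look like, here is a rough sketch with openpyxl. The column positions and the blank-cell rule for telling a product row from an item or task row are assumptions about the template, not the poster's actual format:

    from openpyxl import load_workbook  # pip install openpyxl

    wb = load_workbook("products.xlsx")  # assumed workbook name
    ws = wb.active

    products, items, tasks = [], [], []
    product_id = item_id = 0

    for row in ws.iter_rows(min_row=2, values_only=True):
        # Assumed layout: col A = product, col B = maintenance item,
        # col C = maintenance task; exactly one of the three is filled
        # per row, mirroring the indented hierarchy in the example.
        product, item, task = row[0], row[1], row[2]
        if product:
            product_id += 1
            products.append((product_id, product, *row[3:]))
        elif item:
            item_id += 1
            items.append((item_id, product_id, item, *row[3:]))
        elif task:
            tasks.append((item_id, task, *row[3:]))
        else:
            # Reject malformed rows outright, per the advice above.
            raise ValueError(f"Row has no product, item, or task: {row!r}")

    # From here, bulk-insert each list into its relational staging table.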
It probably makes more sense to break it up into several Excel sheets: one for products, another for maintenance items, and another for maintenance tasks. For each one, they'll have to enter some kind of ID to link them back together (e.g. maintenance_task_id=1 links to maintenance_item_id=4). That can be a pain for business users to remember, but the only alternative is to enter lots of redundant data on each line.
Next, create a normalized database model (to avoid storing redundant data) and fill it by writing an app or script to parse through your Excel sheets. Vague and high-level, but that's how I'd do it.
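As a sketch of that multi-sheet approach, assuming sheet and column names like the IDs described above, the foreign keys can be validated and the hierarchy reassembled with a couple of pandas joins:

    import pandas as pd  # pip install pandas openpyxl

    # Assumed workbook with one sheet per entity and explicit ID columns.
    xlsx = "products.xlsx"
    products = pd.read_excel(xlsx, sheet_name="products")        # product_id, ...
    items = pd.read_excel(xlsx, sheet_name="maintenance_items")  # maintenance_item_id, product_id, ...
    tasks = pd.read_excel(xlsx, sheet_name="maintenance_tasks")  # maintenance_task_id, maintenance_item_id, ...

    # Validate the hand-entered links before loading: every task must
    # point at a real maintenance item, and every item at a real product.
    bad_items = items[~items["product_id"].isin(products["product_id"])]
    bad_tasks = tasks[~tasks["maintenance_item_id"].isin(items["maintenance_item_id"])]
    if not bad_items.empty or not bad_tasks.empty:
        raise ValueError("Orphaned rows found; send the sheet back for fixes")

    # Joined view, mostly useful for eyeballing the data before import.
    full = tasks.merge(items, on="maintenance_item_id").merge(products, on="product_id")
    print(full.head())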
I agree with previous posts in general...
My suggestion: avoid the spreadsheet entirely. Spend your time making a simple front-end form - preferably a web-based one - and catch the data as cleanly as possible (ANYTHING here will be cleaner than the spreadsheet, including just having named fields).
You will spend less time in the end.
I would add VBA code to the template to add as much structure and intelligence as possible to the user data entry and validation.
In the extreme case, you make the user enter all data via forms which put the validated data on the sheet, and then have an overall validation routine built into the Save or Close event.
Less extreme would be to add 3 command buttons driving code for:
- add product
- add maintenance item
- add maintenance task
and some overall validation code at save/close. This way you add as much smarts as possible to the data entry tasks.
Use named cells or other hidden metadata created by the VBA code as markers, so that your DB update routine can make better sense of the data. The last one I did like this took 3-4 man-weeks including the DB update routines, but I think it was probably more complicated than your example. If you are not experienced with VBA and the Excel object model and events, though, it would obviously take much longer.

Rails 3: build places database from google maps

I have a Ruby on Rails 3.0 application which will need a database containing the name and address of all the listed places in a certain area indexed by Google Maps. I don't need to display the map provided by Google Maps itself; I just need to be able to get the names and addresses of all places in a certain area from Google, store this on my server, and then match the address/name a user enters with the place in my database if it exists, or add it if it doesn't. I have some questions about this:
Are there any gems out there that would help with this? A quick Google search brought up gems which show places graphically on the Google map, but this isn't what I need.
Approximately how much space will I need to store the names and addresses of, say, every place in a city indexed by Google Maps?
I'd appreciate any feedback on how to go about building the places database using the Google Maps database as a source and making sure it's quickly accessible.
I just implemented an application using Google Places. There were about 4 million records in the Google Places DB, so the approach I used was: make a curl call for a Google place and store it in the database, so next time I serve the place from the DB. Don't store all the records in the DB, as this will make your DB too heavy. Also, Google Places allows only 1000 queries per day from a single IP. You also need ranking of the places within your DB - e.g. Paris in Canada and Paris in NYC will come up together when you search for Paris. Google does not provide data according to ranking; there is a different mechanism for finding the ranking of places.
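A sketch of that fetch-once-then-serve-from-DB pattern in Python, using the real Places Nearby Search endpoint but a placeholder API key and a throwaway sqlite cache (quotas and response fields may differ by API version):

    import sqlite3
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder - your own Places key
    db = sqlite3.connect("places.db")
    db.execute("CREATE TABLE IF NOT EXISTS places "
               "(place_id TEXT PRIMARY KEY, name TEXT, vicinity TEXT)")

    def fetch_and_cache(lat, lng, radius_m=5000):
        # One call to the Nearby Search web service; results get upserted
        # into the local table so later lookups never hit Google again.
        resp = requests.get(
            "https://maps.googleapis.com/maps/api/place/nearbysearch/json",
            params={"location": f"{lat},{lng}", "radius": radius_m, "key": API_KEY},
        ).json()
        for place in resp.get("results", []):
            db.execute("INSERT OR REPLACE INTO places VALUES (?, ?, ?)",
                       (place["place_id"], place["name"], place.get("vicinity", "")))
        db.commit()

    fetch_and_cache(48.8566, 2.3522)  # example: central Paris
    print(db.execute("SELECT name, vicinity FROM places LIMIT 5").fetchall())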