Building an app for an Internet site without its API and schema

I was asked to build a control system for huuto.net, an eBay-like Finnish auction site.
The system would reopen closed auctions according to specific rules. It would be completely external to the main site, running on a separate website.
The site, however, is unwilling to release its API and schema, and I know of no way to build such a system without knowing the API.
How do you build an application against an Internet site without its API and schema?

You could try some form of automated browsing, for example with mechanize.
Edit:
Examples here.

I think you're asking about building a site that interacts with another site without using a well-defined API. Is that right?
You can interact with an external site without using an official API - to do so, you need to imitate a normal site visitor and send your requests to the site's frontend (in much the same way a web crawler does). Tools like hpricot, mechanize and curl can help you send requests and parse the content of pages, but a system built this way will be quite brittle: any change to the target site can mean rewriting portions of your system.
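For illustration, here is a minimal sketch of that approach using Python's mechanize; the URLs, form index and field names are placeholders, since the target site's pages are not documented here.

# Minimal sketch of driving a site through its HTML frontend with mechanize.
# The URLs, form index and field names are placeholders, not the real site's.
import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)                      # behave like a normal visitor
br.addheaders = [("User-Agent", "Mozilla/5.0")]

br.open("https://example.com/login")             # placeholder login page
br.select_form(nr=0)                             # assume the first form is the login form
br["username"] = "my-user"
br["password"] = "my-password"
br.submit()

# Fetch a page whose HTML you then parse (e.g. with BeautifulSoup or hpricot).
html = br.open("https://example.com/my/auctions").read()

If the site renames a form field or reshuffles its markup, this code breaks; that is the brittleness mentioned above.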

It might be possible to get the data you need by screen scraping the site. You could perform the operations you want by POSTing data into their forms, or by using a WebClient-style API to make your program act like a web browser, but that is likely to be an extremely brittle solution.
Honestly though, without an API there really is no good solution.
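A bare-bones version of the form POST mentioned above, sketched with Python's requests; the URL and field names are again placeholders.

# Sketch of submitting a site's own HTML form directly; the URL and fields are assumptions.
import requests

session = requests.Session()                       # keeps cookies between requests
resp = session.post(
    "https://example.com/auctions/123/reopen",     # placeholder form action URL
    data={"confirm": "yes", "csrf_token": "..."},  # whatever hidden fields the form expects
)
resp.raise_for_status()
print(resp.status_code)

The csrf_token value is deliberately left as "...": you would first GET the form page and scrape the token out of it, which is exactly the kind of coupling that makes this approach fragile.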

You either need access to the database or an API; otherwise there is no point in even trying.

Related

How to automatically test an application's web pages looking for errors?

Is there an easy way to test web pages for errors? Usually when doing web testing we want to check the rendered display, which can be very difficult to maintain.
So I wanted to know whether, as an alternative strategy, there is an established practice for testing a website's URLs in bulk and simply looking for any kind of error, including JS errors in the console.
I think it is possible using a framework like Selenium, but isn't that a bit of overkill?
The idea would also be to run this against the production server, in addition to the test server.
The website has some authentication, so just hitting the URLs will not work.
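Since Selenium is mentioned as an option, here is a minimal sketch of that approach in Python: it logs in once through a form, visits a list of URLs and prints any severe entries from the browser console. The login URL, field names and URL list are placeholders, and it assumes Chrome, whose driver exposes the console log.

# Sketch: crawl a list of URLs with Selenium and report browser console errors.
# Assumes Chrome; the login URL, field names and URL list are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

options = webdriver.ChromeOptions()
options.set_capability("goog:loggingPrefs", {"browser": "SEVERE"})
driver = webdriver.Chrome(options=options)

# Authenticate once so the session cookies are reused for the rest of the crawl.
driver.get("https://example.com/login")
driver.find_element(By.NAME, "username").send_keys("test-user")
driver.find_element(By.NAME, "password").send_keys("test-password")
driver.find_element(By.CSS_SELECTOR, "form button[type=submit]").click()

for url in ["https://example.com/", "https://example.com/products"]:
    driver.get(url)
    for entry in driver.get_log("browser"):        # JS errors the page logged to the console
        print(url, entry["level"], entry["message"])

driver.quit()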

Chrome Extension: how to safely restrict the content and customise the user experience?

I'm enjoying developing cross-browser web extensions (the main target being Chrome) so much that I started thinking about developing one for my company. I find a Chrome extension quite a cheap and efficient way to deploy internal apps. The main purpose is to host a couple of dynamic dashboards that fetch data from various APIs using cross-domain AJAX in background scripts. I have finalized the app and was also able to implement authentication via chrome.identity and Azure AD.
However, I am struggling to find a safe way to customise the content.
I mean, when the extension is installed it requires logging in to Azure via the chrome.identity flow. I then get a token that I use to query MS Graph for the user's ID, name, email and basic info.
Until I have this information I want the browser action (popup), as well as any other extension pages, to be unavailable to the user. After a successful login I would like to show the content in the popup and let the user access the pages, but this is where I want to customize the experience.
I know how to use the user ID retrieved from the API call to customize the extension, but I don't think it is safe, because all the code is in the client.
If I code something like
if (user === logged) show something
it will be very easy for a malicious user to look at the code and bypass the check, or even impersonate another user. And Chrome extension code cannot be obfuscated.
Any help?
Thanks
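For reference, the Graph lookup described in the question (exchanging the token obtained via chrome.identity for the user's basic profile) boils down to one authenticated request to the /me endpoint; sketched here in Python purely for illustration, with the token value a placeholder.

# Sketch of the MS Graph call described in the question: the Azure AD token
# obtained through chrome.identity is sent as a bearer token to /me.
import requests

token = "ACCESS-TOKEN-FROM-CHROME-IDENTITY"        # placeholder
resp = requests.get(
    "https://graph.microsoft.com/v1.0/me",
    headers={"Authorization": "Bearer " + token},
)
resp.raise_for_status()
profile = resp.json()                              # contains id, displayName, mail, ...
print(profile["id"], profile.get("displayName"), profile.get("mail"))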

Dropbox - any API to cli_link?

I'm using the dropboxd service under Linux, which requires you to log into their website e.g. https://www.dropbox.com/cli_link?host_id=2173bf325f94beee3b1879d2c7b49e69 to link the machine to your account.
Is there any programmatic way to do this (ideally using Java)? To access the website above it seems you need to log in using forms (which seems tricky to do programmatically), and their basic REST API (https://www.dropbox.com/developers/core/docs) doesn't seem to cover the cli_link command.
I could write an app to do the sync using their full API, but that seems like overkill, since aside from the cli_link requirement the basic dropboxd does all that I need.
The official Dropbox desktop client is unrelated to the API, though both the API and the Linux CLI require user interaction on the Dropbox web site (once per link) to authorize the linking. Also, note that automating/scraping the site itself is not allowed by the terms:
https://www.dropbox.com/terms#acceptable_use
Not really a solution for Dropbox users, but in the end we just moved over to MediaFire instead. That has a full REST API and doesn't require any manual intervention.

Best practice for building a mobile site

I am about to start building a dynamic mobile site, working from a lot of dynamic content which must come from the database.
I have already written a REST API for the site, which the iOS and Android applications are using to interact with the information.
My question is: what would be the absolute best practice for building this site? Would it be:
1- Make the mobile classes an extension of the existing site functions
(The downside I see here is that the mobile site would be dependent on the main site library, meaning that any load or problems on the main site would also affect the mobile site)
2- Make the mobile site a completely standalone site running on its own
(The downside I see here is that any change to the main site library will need to be reflected here so in essence we would almost be writing code twice)
3- Make the mobile site run from the REST API and standalone
(The downside I see here is simply an increased number of HTTP requests for the information, rather than communicating with the server directly)
Each option would function normally and there wouldn't really be any problem there; the coding is not too difficult. However, if I make it standalone I would need to recreate a lot of the functions from the main site and adapt them for the mobile site, which isn't ideal.
Look forward to your comments! Thanks
I would go with option 3, but it needs to be architected well.
Prioritize the standalone application on top of the API; you can also have two-way communication, so that when content changes on the server the clients are told to fetch the update.
I would also suggest going with the Bootstrap framework; it's an excellent framework with responsive and adaptive design.
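As a rough sketch of what option 3 could look like, the mobile site becomes a thin standalone front end that reads everything through the existing REST API; the endpoint name and response fields below are hypothetical.

# Rough sketch of option 3: a standalone mobile front end that only talks to the
# existing REST API instead of reusing the main site's library. The endpoint and
# response fields are hypothetical.
import requests
from flask import Flask, render_template_string

API_BASE = "https://api.example.com"              # placeholder for the existing REST API
app = Flask(__name__)

PAGE = "<ul>{% for item in items %}<li>{{ item['title'] }}</li>{% endfor %}</ul>"

@app.route("/")
def home():
    items = requests.get(API_BASE + "/items", timeout=5).json()   # hypothetical endpoint
    return render_template_string(PAGE, items=items)

if __name__ == "__main__":
    app.run()

The extra HTTP hop noted as option 3's downside is the cost of this decoupling: the mobile site no longer depends on the main site's code.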

How to optimize Google Maps API integration

I've tried to evaluate my website with PageSpeed and I saw that most warnings (leverage browser caching, combine external JavaScript, ...) come from the Google Maps API integration.
So my question is: is there a way to use it the way PageSpeed would like?
Thanks.
Probably not: the external JS files can change, so if you combine them or host them locally with Expires headers you may encounter bugs when they change.
I suggest reading this article on the Google Maps API blog: https://maps-apis.googleblog.com/2015/09/map-tips-speeding-up-page-load-times.html
Asynchronously loading JavaScript on your pages can give you huge performance wins. We have just updated all of our JavaScript Maps API samples and you can make these changes to your site, too. Read on to find out what we did and how it works, or skip straight to Unblocking scripts to see what you can start to update on your site today.