In one of our apps we have to implement an online / offline feature. The caching is already done. What we still have to do, however, is implement a safe way to prevent the app from opening a network connection. My idea was to use CFNetwork to route every network call through an internal proxy that checks the app's status: if the app is allowed to go online, the proxy simply forwards the request; if not, it returns an HTTP error.
My question is: are there any open source proxies out there that can handle this, or do I have to implement the proxy myself?
Best regards,
Michael
How are you doing the caching / making the requests? Because NSURLConnection supports caching by default, and has a cache mode NSURLRequestReturnCacheDataDontLoad that will only load from the cache (and return nil if the resource you're requesting isn't available in the cache). If you're not using NSURLConnection, what are you using instead (since this will probably affect your solution)?
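For what it's worth, the gating idea in the question is simple to sketch: wrap every request in a function that checks an app-level online flag and, when offline, returns a synthetic HTTP error without ever opening a connection. The question targets iOS/CFNetwork, but here is the shape of the pattern in TypeScript for illustration only; the names (appIsOnline, gatedFetch) and the 503 status are assumptions, not part of any particular library.

```typescript
// Hypothetical sketch of the "internal gate" idea: every request goes through
// a wrapper that checks an app-level online flag first.
// The names (appIsOnline, gatedFetch) are made up for illustration.

let appIsOnline = true; // toggled by whatever controls the online/offline feature

async function gatedFetch(url: string, init?: RequestInit): Promise<Response> {
  if (!appIsOnline) {
    // No connection is ever opened; callers just see an HTTP-style error.
    return new Response(null, { status: 503, statusText: "Offline" });
  }
  return fetch(url, init); // forward the request unchanged
}

// Usage: callers never call fetch() directly, only the gate.
// const res = await gatedFetch("https://example.com/api/data");
```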
Trying out Cloudflare for DDoS protection of an SPA web app, using the free tier for testing.
Static content loads fine, but API calls have become very slow:
from under 50 ms per API call originally to around 450-500 ms each.
My APIs are called via a subdomain, e.g. apiXXX.mydomain.xyz.
Any idea what the problem is, or an alternative fast DDoS protection solution?
Cloudflare has published a page explaining the configuration you need when proxying APIs:
you have to create a new Page Rule that bypasses the cache and turns off the Always Online and Browser Integrity Check options.
If you do not configure this, you may see slow response times, because all of those options are enabled by default and apply to API calls.
Here's the link to create the configuration: Cloudflare Page Rule for API
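If you would rather script that rule than click through the dashboard, the Cloudflare v4 API exposes Page Rules as a resource. Something along these lines should work; the zone ID, token, and exact action IDs below are assumptions on my part, so check the current API docs before relying on them.

```typescript
// Sketch only: creating the same Page Rule through the Cloudflare v4 API
// instead of the dashboard. Zone ID, token, and the action IDs are assumptions.
const ZONE_ID = "your-zone-id";     // placeholder
const API_TOKEN = "your-api-token"; // placeholder

async function createApiPageRule(): Promise<void> {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/pagerules`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        targets: [
          {
            target: "url",
            constraint: { operator: "matches", value: "apiXXX.mydomain.xyz/*" },
          },
        ],
        actions: [
          { id: "cache_level", value: "bypass" },   // Bypass cache
          { id: "always_online", value: "off" },    // Turn off Always Online
          { id: "browser_check", value: "off" },    // Turn off Browser Integrity Check
        ],
        status: "active",
      }),
    }
  );
  console.log(await res.json());
}
```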
Information you get from the internet is increasingly tailored specifically for "you".
Google search result or YouTube comment ranking algorithms have way too much power not to be fully disclosed.
You can make a snapshot of a website using a web archiving service, and that service could testify to its authenticity.
How could one have a similar "provable" snapshot of your specific user session?
Of course, if you give your access tokens to the archiving service it could testify to the session, but how could that be done in a trustless way?
I thought about using a browser extension, but nothing prevents someone from using a tampered browser build.
Could a proxy record of the encrypted communications be used somehow?
Would a provable link between it and the decrypted version of the interaction be possible without exposing the user's private key?
Are there any methods already available?
If not...any ideas?
Thanks
edit:
TC "kind of" solves the "secured against its owner" that I'm assuming browser extension also solve but I don't think TC would help since you can just build your own version of the browser.
If that would be possible and a "man-in-the-middle attack" could be prevented then a simple "webRTC tab sharing" record would do it (or some other king of record and replay method)
I was thinking in more in some kind of method where you would log into the site and "only" after that all communications would go through a proxy.
You would record you interaction with the site and close the connection.
Then you would give the TLS key and all the saved interaction to the proxy and it would be able to replay that interaction with the cached network traffic.
Is something like that possible?
What are React Native's current default caching behaviors for fetch calls? The official FB guides simply say "look at Mozilla!", but we're not in a web browser. I would assume the cache behavior is custom here as a result of the middleware.
Let's say I do: fetch("https://exampleserver.com/myfile.json")
Are requests automatically cached once called?
Are the contents of myfile.json cached for the entire "session" (i.e. the app is running, active or backgrounded, but not force-closed by the user)?
Where is the request cached? I.e., is it using AsyncStorage?
Would fetching the URL again result in the app reading from the cache?
How "fast" is caching? If for some reason I have to request myfile.json multiple times in quick succession, is it going to basically ignore the cache at that point and make all of those separate calls? (I am seeing this behavior in the debugger.)
When I force close the app and reopen it, does this cache still exist?
If so, can I request that the cache persist?
Is any of this behavior different on iOS than on Android?
Does Expo affect this at all?
Knowing at least some of this would help me decide whether I need to write a custom caching solution with AsyncStorage, like so: https://gist.github.com/dslounge/18e555250a8df1f8218d702b21910eeb
React Native’s fetch API bridges to NSURLSession on iOS and okhttp3 on Android. Both of these libraries strictly follow the HTTP caching spec. The caching behavior will depend primarily on the Cache-Control and Expires headers in the HTTP response. Each of these libraries has its own configuration you can adjust, for example to control the cache size or to disable caching.
The cached files are not guaranteed to persist until they expire. The system can purge them whenever it wants.
If you make three requests in quick succession then, in general, you will indeed end up making three separate network calls, because the caching is neither immediate nor guaranteed.
In general: set your HTTP response headers appropriately, but don’t rely on HTTP caching to behave a certain way for the proper functioning of your app. If you want a guarantee that a second request won’t actually make a network connection, you need to write that yourself.
I don’t think Expo affects this.
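If you do need that guarantee, a small app-level cache in front of fetch (in the spirit of the gist linked in the question) is straightforward. A minimal in-memory sketch follows; the names (cachedFetchJson, TTL_MS) are made up, and a persistent version would store entries in AsyncStorage instead of a Map.

```typescript
// Minimal sketch of an app-level cache around fetch, since HTTP caching
// offers no guarantees. A persistent variant would write to AsyncStorage.
const TTL_MS = 60_000; // cache entries are considered fresh for one minute
const cache = new Map<string, { storedAt: number; data: unknown }>();

async function cachedFetchJson(url: string): Promise<unknown> {
  const hit = cache.get(url);
  if (hit && Date.now() - hit.storedAt < TTL_MS) {
    return hit.data; // guaranteed: no network connection is opened
  }
  const res = await fetch(url);
  const data = await res.json();
  cache.set(url, { storedAt: Date.now(), data });
  return data;
}

// Usage:
// const json = await cachedFetchJson("https://exampleserver.com/myfile.json");
```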
Are there any specific spec'd processes that a browser client can use to dynamically encourage a server to push additional requested items into the browser cache, using HTTP/2 server push, before the client actually needs them? (I'm not talking about server-sent events or WebSockets here, btw, but rather HTTP/2 server push.)
There is nothing (yet) specified formally for browsers to ask a server to push resources.
A browser could figure out which secondary resources it needs to render a primary resource, and it could send this information to the server opportunistically on a subsequent request in an HTTP header, but as I said, this is not specified yet.
[Disclaimer, I am the Jetty HTTP/2 maintainer]
Servers, on the other hand, may learn which resources browsers ask for, and may build a cache of correlated resources that they can push to clients.
Jetty provides a configurable PushCacheFilter that implements the strategy above, and has implemented an HTTP/2 Push Demo.
The objective of server push is for the server to send additional files (e.g. JavaScript, CSS) along with the requested URL (e.g. an HTML page) to the browser before the browser knows which related files are required, thus saving a round trip and improving page load speed. If the browser already knows which resources are needed, it can request them with normal HTTP calls.
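To make that mechanism concrete, here is a minimal sketch of a server deciding on its own to push a stylesheet alongside the HTML it was asked for, using Node's built-in http2 module rather than Jetty's PushCacheFilter; the paths, certificate files, and response bodies are placeholders.

```typescript
// Sketch of the push mechanism described above, using Node's http2 module
// (not Jetty's PushCacheFilter). Certificate paths are placeholders.
import { createSecureServer } from "node:http2";
import { readFileSync } from "node:fs";

const server = createSecureServer({
  key: readFileSync("server-key.pem"),   // placeholder
  cert: readFileSync("server-cert.pem"), // placeholder
});

server.on("stream", (stream, headers) => {
  if (headers[":path"] !== "/index.html") {
    // A real server would also serve /style.css for clients that decline the push.
    stream.respond({ ":status": 404 });
    stream.end();
    return;
  }
  if (stream.pushAllowed) {
    // Push the stylesheet before the browser has parsed the HTML and asked for it.
    stream.pushStream({ ":path": "/style.css" }, (err, pushStream) => {
      if (err) return;
      pushStream.respond({ ":status": 200, "content-type": "text/css" });
      pushStream.end("body { font-family: sans-serif; }");
    });
  }
  stream.respond({ ":status": 200, "content-type": "text/html" });
  stream.end('<link rel="stylesheet" href="/style.css"><h1>index.html</h1>');
});

server.listen(8443);
```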
We have recently fixed a nagging error on our website similar to the one described in How to stop javascript injection from vodafone proxy? Basically, the Vodafone mobile network was vandalizing our pages in transit, making edits to the JavaScript that broke our view models.
Adding a "Cache-Control: no-transform" header to the page that was experiencing the problem fixed it, which is great.
However, we are concerned that as we do more client-side development using JavaScript MVP techniques, we may see it again.
Is there any reason not to add this header to every page served up by our site?
Are there any useful transformations that this will prevent? Or is it basically just more of the same: carriers making ham-fisted attempts to minify things and potentially breaking them in the process?
The reason not to add this header is performance and data transfer.
Some proxy / CDN services transform (e.g. re-compress) media, so if your client is behind such a proxy or you are using a CDN service, the client may get higher speed and use less data. This header orders the proxy / CDN not to transform the media and to leave the data as is.
So, if you don't care about this, or your app doesn't use many files like images or music, or you don't want any transformation applied to your traffic, there is no reason not to do this (on the contrary, it is recommended).
See the RFC here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.5
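If you do decide to send the header site-wide, it is a one-line middleware in most stacks. A sketch in Express-style TypeScript (the site in the question may well be on a different stack; this only shows the shape of the change):

```typescript
// Sketch: add "Cache-Control: no-transform" to every response via middleware.
// Handlers that later set their own Cache-Control value should remember to
// include no-transform in it as well, since setHeader replaces the value.
import express from "express";

const app = express();

app.use((_req, res, next) => {
  res.setHeader("Cache-Control", "no-transform");
  next();
});

app.get("/", (_req, res) => res.send("Hello"));
app.listen(3000);
```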
Google has recently introduced the googleweblight service, so if your pages have the "Cache-Control: no-transform" header directive, you'll be opting out of having your page transcoded when the connection comes from a mobile device with a slow internet connection.
More info here:
https://support.google.com/webmasters/answer/6211428?hl=en