Is there a Plone package to serve images via CDN (Amazon CloudFront)? - amazon-s3

Is there a way to manage images uploaded into a Plone site, but have them synced to and automatically served from Amazon S3/CloudFront?
I've seen a reference to a project that doesn't look like it's been touched since early 2011 (http://plone.org/products/collective.cdn.core/), and it only has experimental image support, not necessarily for Amazon S3/CloudFront.

There is not yet an add-on product that makes this "point-and-click" easy. https://github.com/collective/collective.cdn.core suggests that collective.cdn.core has continued to be developed, although the authors haven't pushed their releases to plone.org (shame shame!). It still does not appear to include "native" Amazon/CloudFront support, but I suspect the add-on authors would welcome either code contributions or sponsorship to add that support.

Related

Efficient Nuxt-generated static site hosting: better on Amazon AWS or a cloud droplet?

Not sure if this belongs here or is well titled, but I'm about to finish my first Nuxt project and I'm not sure where to host it.
Usually I would use an Ionos or DigitalOcean droplet, but I was told that AWS Amplify or S3 (I have no idea about either) might be cheaper, or maybe cost nothing since it is a small project, because pricing depends on how intensive the processes are...
If true, would that still apply if I needed to run git pull and then the build/generate process once a day to pull in new content (via nuxt/content)?
Sorry if this is expressed poorly, and thanks in advance for any helpful suggestion.
This question doesn't really belong on Stack Overflow because it's essentially opinion-based.
By order of preference, I do personally recommend those:
Netlify
Vercel
DigitalOcean
GitHub Pages
Surge
More on the official documentation of Nuxt: https://nuxtjs.org/docs/2.x/deployment/netlify-deployment
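If you go the Netlify route, the linked Nuxt guide essentially boils down to two build settings (this assumes a Nuxt 2 static site; double-check the docs for your version):

Build command: npm run generate
Publish directory: dist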

Worklight 6.2: does Direct Update upload the full web part?

I'm testing Direct Update because of a requirement to make daily updates to an Android app with look & feel changes.
What I have seen is that all the web files are uploaded, not only the new or updated ones.
Is it possible to make a direct update of specific files?
For example, I have an application with images whose total size is 20 MB, and I make a change to a .css file. The Direct Update will contain my updated .css, but also the 20 MB of images that are already in the app and are exactly the same. Is it possible to upload only the .css?
This is not possible in the current releases of Worklight.
However, starting with Worklight 6.3, which will be publicly available in December 2014, the Direct Update feature is extended to support "Differential Direct Update".
With this feature it will no longer be necessary for the client application to download all of the web resources on every update. Instead, only the resources that were changed will be downloaded and updated.
More on that as 6.3 goes public and documentation becomes publicly available.

Is it a bad idea to call phaser.min.js directly from GitHub?

As a jQuery user, I link the remote library from Google using
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
which is very clever because it's already loaded in the browser cache if the reader visited another site that did the same.
As I'm discovering Phaser, I was hoping to see people on the web doing the same with
<script src="https://raw.githubusercontent.com/photonstorm/phaser/master/build/phaser.min.js"></script>
or any library host, but apparently no one does.
Is there a reason why Phaser users don't do that?
I would strongly advise against linking to the master release as it will absolutely break your games over time. Most of the 2.0.x updates have been non-API changing, but 2.1 and above will be altering some core aspects of Phaser. You should only ever link to specific versions.
For a service similar to the Google hosted APIs (they only host very specific libraries that they've selected), we use CDN.js, which offers the same thing. You can find details in the Phaser README or just go to http://cdnjs.com/ and search for Phaser.
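For example, a version-pinned script tag via cdnjs would look something like this (the version number here is just an illustration; check cdnjs for the release you actually want):
<script src="//cdnjs.cloudflare.com/ajax/libs/phaser/2.0.7/phaser.min.js"></script>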
There is no issue as such if you link to a specific version. But you don't want the latest build, as upstream changes could break your site.
I don't know GitHub's policies on referencing their site like this.

Storing files locally in Node Webkit App

Folks:
I'm creating an app using Node Webkit. The purpose of this app is to display images and pdfs. The app needs to download those files from a central repository, and cache them locally. When the app runs offline, the files should still be available, and displayed.
On the face of it, this sounds like appcache is the answer - and that indeed is where I was heading when this was a pure webapp in a browser. However, now I've discovered node-webkit, and here we are.
node-webkit's GitHub wiki states:
"However, application cache is designed for browser use, for apps using node-webkit, it's less useful than the other two method, read HTML5 Application Cache if you want to use it."
But doesn't say why.
I've also researched the node.js filesystem module - but that seems like an order of magnitude more complexity than I need.
Can anyone point me in a sensible direction?
Thanks.
It has to do with the nature of App Cache itself.
You specify a manifest file that lists all the static assets required for your app to run offline. You don't have any programmatic access to the cache to add and remove files via JS.
So for a node-webkit app, it'd make more sense to fetch these files and store them in the Application Support folder (Or AppData, depending on the platform). That's where the node.js part is really useful, the file IO stuff.
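To make that concrete, here is a rough sketch (not tested; the URL, file name, and element id are placeholders) of downloading a file once and serving it from the node-webkit data directory afterwards, assuming the old require('nw.gui') API and its App.dataPath property:

// Rough sketch: download a remote file once and serve it from the local
// cache on subsequent (possibly offline) runs.
var fs = require('fs');
var path = require('path');
var https = require('https');
var gui = require('nw.gui');

// node-webkit exposes a per-user data directory (AppData / Application Support)
var cacheDir = path.join(gui.App.dataPath, 'media-cache');
if (!fs.existsSync(cacheDir)) {
  fs.mkdirSync(cacheDir);
}

function getCachedFile(url, fileName, callback) {
  var localPath = path.join(cacheDir, fileName);

  // Offline (or already downloaded): just hand back the cached copy.
  if (fs.existsSync(localPath)) {
    return callback(null, localPath);
  }

  // Online: stream the file to disk, then hand back the path.
  var file = fs.createWriteStream(localPath);
  https.get(url, function (res) {
    res.pipe(file);
    file.on('finish', function () {
      file.close(function () {
        callback(null, localPath);
      });
    });
  }).on('error', function (err) {
    fs.unlink(localPath, function () {});   // don't keep a half-written file
    callback(err);
  });
}

// Usage: point an <img> at the cached copy (placeholder URL and element id).
getCachedFile('https://example.com/repo/picture.jpg', 'picture.jpg', function (err, localPath) {
  if (err) { return console.error(err); }
  document.getElementById('viewer').src = localPath;
});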

Eclipse RCP Target Platform: Updates & Backing Up

I've just created an eclipse target definition/platform for my application, opting to use software sites (rather than local files/installations) as recommended in the tutorial I followed and a later best practices post by the same author.
The software sites are all external sites (eclipse, sourceforge etc.)
Everything seems to be working well, though I have two concerns:
If a component is updated (by the software provider), will it also be updated automatically in the target definition file?
Is it possible to take a backup of the target platform, so that it can be configured (for example) on a computer without an internet connection, or used in the event a remote site becomes unavailable?
You can create a mirror of an Eclipse p2 repository. It's quite common to do this inside an organisation so that there's a copy of the repository that's quick to access, and isn't dependent on some third party continuing to host it. There's a guide on the Eclipse Wiki.
As far as I'm aware, your Target Definition can only reflect what's in the p2 repository it's pointing at. If the developer replaces a package with a newer version, it'll pick that up. If you need greater control over that, then selectively mirroring the content is probably the way to go.
From that wiki page, it looks like by default it won't delete content in your mirror (even if it's deleted in the remote) unless you specify -writeMode clean.
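For reference, the mirroring from that guide is done with the two p2 mirror applications; something along these lines, where the source repository URL and destination path are just placeholders:

eclipse -nosplash -application org.eclipse.equinox.p2.metadata.repository.mirrorApplication -source http://download.eclipse.org/releases/kepler -destination file:/path/to/local/mirror

eclipse -nosplash -application org.eclipse.equinox.p2.artifact.repository.mirrorApplication -source http://download.eclipse.org/releases/kepler -destination file:/path/to/local/mirror -writeMode clean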