Is it a bad idea to call phaser.min.js directly from GitHub? - browser-cache

As a jQuery user, I link the remote library from Google using
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
which is very handy because the file is often already in the browser cache if the reader has visited another site that did the same.
As I'm discovering Phaser, I was hoping to see people on the web doing the same with
<script src="https://raw.githubusercontent.com/photonstorm/phaser/master/build/phaser.min.js"></script>
or from any other library host, but apparently no one does.
Is there a reason why Phaser users don't do that?

I would strongly advise against linking to the master release as it will absolutely break your games over time. Most of the 2.0.x updates have been non-API changing, but 2.1 and above will be altering some core aspects of Phaser. You should only ever link to specific versions.
For a service similar to the Google hosted APIs (which only carry a small, hand-picked set of libraries), we use cdnjs, which offers the same thing. You can find details in the Phaser README or just go to http://cdnjs.com/ and search for Phaser.
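For example, a pinned include via cdnjs might look like this (the version number here is only illustrative; check cdnjs for the release you actually want to target):
<script src="https://cdnjs.cloudflare.com/ajax/libs/phaser/2.0.7/phaser.min.js"></script>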

As such, there is no issue if you link to a specific version. But you don't want the latest build, as changes to it can break your site.
I also don't know GitHub's policy on hotlinking files from their site like this.

Related

Can react-admin be deployed as a static site?

I wish to deploy a react-admin application as a static site - running without a server component. This data browsing app would use the ra-data-fakerest on a read-only slug of JSON.
Examples of deploying this would be to a GitHub Pages site, or a dumb memory stick.
Since react-admin is written in React/TypeScript, it must get transpiled before running, presumably into a build folder. So, am I right in thinking I could just deploy the build folder?
(I'd also be grateful for any lessons learned on this subject.)
Yes, you absolutely can. The react-admin demo (https://marmelab.com/react-admin-demo/) is indeed a static site. What is deployed is just the built version of https://github.com/marmelab/react-admin/tree/master/examples/demo.
The react-admin codesandbox is another example of that: https://codesandbox.io/s/github/marmelab/react-admin/tree/master/examples/simple
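As a rough sketch of the static setup (the data.json file name and the posts resource are illustrative), the whole app can run off ra-data-fakerest with no server at all:

import { Admin, Resource, ListGuesser } from 'react-admin';
import fakeDataProvider from 'ra-data-fakerest';
// Read-only JSON baked into the bundle at build time (illustrative file name).
import data from './data.json';

const dataProvider = fakeDataProvider(data);

const App = () => (
  <Admin dataProvider={dataProvider}>
    <Resource name="posts" list={ListGuesser} />
  </Admin>
);

export default App;

Running the standard build (e.g. npm run build) then produces a folder of plain static files you can push to GitHub Pages or copy onto a memory stick.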

Chrome manifest v3 - is there a viable workaround to use Google's File Picker in a Chrome extension?

My searches have turned up nothing concrete. My extension uses Google's file picker to allow the user to browse their sheets and choose a desired file to write some data to, which manifest v3 breaks because of some GAPI limitations. Unless I've missed something obvious, there does not seem to be a simple workaround or method for this to migrate to v3 -- it just seems to be disallowed.
I'm not asking if there's a way to do something that they intend to be impossible (even though I doubt such a thing would exist with Google), but I'm optimistically hoping that maybe there is some hacky/annoying workaround that still fits within their rules. If I absolutely have to, I'll just let users set a sheet URL manually... I'm just trying to avoid it.
Any tips or suggestions would be appreciated.
You may have to test it yourself to make sure there are no weird behaviors, but Google has some recommendations regarding this in their migration guide:
In Manifest V3, all of your extension's logic must be included in the extension. You can no longer load and execute a remotely hosted file. A number of alternative approaches are available, depending on your use case and the reason for remote hosting. Here are approaches to consider:
Configuration-driven features and logic
In this approach, your extension loads a remote configuration (for example a JSON file) at runtime and caches the configuration locally. The extension then uses this cached configuration to decide which features to enable.
Externalize logic with a remote service
Consider migrating application logic from the extension to a remote web service that your extension can call. (Essentially a form of message passing.) This provides you the ability to keep code private and change the code on demand while avoiding the extra overhead of resubmitting to the Chrome Web Store.
Bundle third-party libraries
If you are using a popular framework like React or Bootstrap, you can download the minified files, add them to your project and import them locally.
For your case, option #3 seems like the easiest. Looking at the Google Picker API documentation, it only uses two relatively small script files, https://apis.google.com/js/api.js and https://accounts.google.com/gsi/client. You could try to bundle these in your Chrome extension and call the methods locally.
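As a very rough sketch (assuming you have downloaded api.js into the extension bundle and load it from a local script tag, and that you obtain an OAuth token via chrome.identity; whether the Picker's own iframe behaves under MV3 is exactly the part you would have to test yourself):

// Get a token through the identity API declared in the manifest (illustrative setup).
chrome.identity.getAuthToken({ interactive: true }, (token) => {
  // gapi comes from the locally bundled copy of api.js, not a remote URL.
  gapi.load('picker', () => {
    const picker = new google.picker.PickerBuilder()
      .addView(google.picker.ViewId.SPREADSHEETS)
      .setOAuthToken(token)
      .setCallback((data) => {
        if (data.action === google.picker.Action.PICKED) {
          console.log('Picked sheet:', data.docs[0].id);
        }
      })
      .build();
    picker.setVisible(true);
  });
});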

Syncing a Route 53 static website to a GitHub website repository (allowing offline/desktop editing)

I have had a github.io website for quite some time, and want to move it over to its own domain name. Currently, I have transferred all the files to the right S3 bucket and I can access the website the same way I can my GitHub one. However, I want to be able to edit individual files in a desktop app (such as SubEthaEdit or Brackets), like you can with GitHub, which to my knowledge you can't do with S3.
Ideally, I would like to migrate the GitHub website over to the domain name, so that when I edit, save, and push a commit from GitHub Desktop using SubEthaEdit, the change is automatically reflected on the domain's website. Either that, or have them share a library that is still accessible from the GitHub desktop app. I really just hate editing files in S3's editor, or having to download and re-upload any document I am working on.
I'm quite new to any sort of coding or programming languages other than a little bit of JavaScript, so apologies if the language I used is, shall I say, sub-par.
Got it!
ceejayoz helped point me in the right direction.
After some fiddling, I got a CNAME record that points to my GitHub Pages site while maintaining HTTPS.
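For anyone following along, the moving parts were roughly these (all names are illustrative): a CNAME file in the root of the repository containing the custom domain, e.g.

www.example.com

and a matching record in Route 53 pointing at the GitHub Pages hostname:

www.example.com.  300  IN  CNAME  username.github.io.

GitHub Pages can then serve the custom domain over HTTPS.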

Is there a Plone package to serve images via CDN (Amazon cloudfront)

Is there a way to manage images uploaded into a Plone site, but have them synced and automatically served via Amazon S3/CloudFront?
I've seen reference to a project that doesn't look like it's been touched since early 2011, http://plone.org/products/collective.cdn.core/, and that only has experimental image support, and not necessarily for Amazon S3/CloudFront.
There is not yet an add-on product that makes this "point-and-click" easy. https://github.com/collective/collective.cdn.core suggests that collective.cdn.core has continued to be developed, although the authors haven't pushed their releases to plone.org (shame, shame!). It still does not appear to include "native" Amazon/CloudFront support, but I suspect the add-on authors would welcome either code contributions or sponsorship to that end.

Is there a decent, standalone, cross-platform webserver that will work in concert with Autorun on USB Jump Drives?

I'm trying to find a decent standalone webserver that I can load up on a jump drive.
My wife is a photographer, and I'd like to present clients with their images on a USB drive. When they plug it in, I'd like a web page to load up and run some jQuery magic to show them a nice carousel of all their images.
So far, this is all fine since it can all be done client side and doesn't need a server at all.
The problem I'm facing is that I'd like some server-side code to be able to read the images out of the directory so that once the interface is built, I don't need to manually create all of the <img /> tags.
If it were primarily going to be used in a Windows environment, I'd have no problem going with IIS Express, since I'm mainly a .NET MVC developer and this would be perfect for me... However, the fact of the matter is that a large portion of our client base is on OS X.
I did find this Java one, jlHttp, and I also found this thread here on SO, but I don't think I understand enough about either of them to accomplish what I'm looking for.
Thanks in advance for your suggestions.
I'm looking for the same thing, and the two best options I've found were the Flying Ant CD web server and Stunnix. Of the two, Flying Ant is cheaper, and I've tested it with success on my project.
I found Mongoose very convenient for this exact purpose. It's cross-platform, lightweight, and requires minimal configuration. You may be interested in this project that uses Mongoose to display pictures in a folder tree or FTP directory.
How about Node.js? It says it runs on Linux, OS X, and Windows.
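If you go that route, a minimal sketch of the directory-listing idea (the folder name, port, and file layout are all illustrative) could look like this:

// server.js - rough sketch, assuming the photos sit in an "images" folder next to this file.
const http = require('http');
const fs = require('fs');
const path = require('path');

const IMAGE_DIR = path.join(__dirname, 'images');

http.createServer((req, res) => {
  if (req.url === '/') {
    // Generate the <img /> tags from whatever happens to be in the folder.
    const tags = fs.readdirSync(IMAGE_DIR)
      .filter((name) => /\.(jpe?g|png|gif)$/i.test(name))
      .map((name) => '<img src="/images/' + name + '" />')
      .join('\n');
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<html><body>' + tags + '</body></html>');
  } else if (req.url.startsWith('/images/')) {
    // Serve the image files themselves (content type left to browser sniffing for brevity).
    const file = path.join(IMAGE_DIR, path.basename(decodeURIComponent(req.url)));
    fs.readFile(file, (err, data) => {
      if (err) { res.writeHead(404); res.end(); return; }
      res.writeHead(200);
      res.end(data);
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

The jQuery carousel can then just request / and receive a page with all the <img /> tags already filled in.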