Remove requests to any GAFAM service (Google Fonts API here) from Nuxt - Vue.js

Is there an easy way to remove every request to the Google Fonts API ("fonts.gstatic.com") in Nuxt.js? I would rather provide the font files myself.
So far I have tried removing every mention of fonts.gstatic.com from .nuxt/components/index.js, but the build command resets my modifications, so nothing changed.
My configuration is quite simple: I initialized the app with @nuxt/content-theme-docs.

Since the concern is aimed at GAFAM (avoiding the use of Google Fonts), the solution would be to fork the package from the Nuxt team and strip out the related module.
Here is where to find it: https://github.com/nuxt/content/search?q=fonts
This Nuxt module is aimed at fast, pain-free, easy-to-set-up documentation, which is probably why the Nuxt team used such a package (it is still the go-to way of using Google Fonts today).
You can follow this answer if you want to use a module at build time: https://stackoverflow.com/a/68166329/8816585
Otherwise, you can use this website to download your fonts locally (reference them from your CSS file and you should be fine): https://google-webfonts-helper.herokuapp.com/fonts
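If you go the local route, wiring the downloaded files up could look roughly like this (a minimal sketch, not the theme's actual configuration; the assets/css/fonts.css path is a hypothetical name, and that CSS file would hold the @font-face rules pointing at the downloaded font files):

    // nuxt.config.js -- sketch: load a local stylesheet containing
    // @font-face declarations for the fonts downloaded from
    // google-webfonts-helper, instead of hitting fonts.gstatic.com
    export default {
      css: ['~/assets/css/fonts.css'],
    }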

Is it possible to deploy Vue and Vite without a server to run?

I have a very particular question: I want to create a webpage that works without a server, and I'm trying to do it with Vite; my whole project uses Vue + Vite.
If I use the "vite build" command, the page deploys as blank, and the only way I can see it is with "vite preview".
Would it be possible, somehow, to load the content of the HTML page built by Vite without needing "vite preview", just by double-clicking index.html?
Using vue-cli, this is possible by setting the publicPath in the vue.config.js file to an empty string, see: https://cli.vuejs.org/config/#publicpath
I've personally only used it with Vue 2, but from what I read online it should also be possible with Vue 3, if you're okay with switching to vue-cli.
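For reference, that vue-cli setting is a one-liner (a minimal sketch):

    // vue.config.js
    module.exports = {
      // an empty publicPath makes asset URLs relative, so dist/index.html
      // can be opened straight from the file system
      publicPath: '',
    };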
Using Vite, I found this question and answer, which shows a way of bundling all the scripts, CSS and images into a single file:
How to open a static website in localhost but generated with Vite and without running a server?
I did try that and it mostly works, but not currently for SVG files, which I use a lot of in my application. It might work fine for your use case.
I did also need to add "type": "module", in my package.json to get rid of an error saying
"Error [ERR_REQUIRE_ESM]: require() of ES Module
/path/to/dist/index.js from /path/to/vite.config.inlined.ts not
supported."
If you open your page simply as an index.html file, you will be limited regarding some API calls and the like. Nowadays you need a light server to host it, for example via a simple vite preview as you saw. This is mainly because the files are processed by bundlers like Webpack and Vite.
I'm not sure there is a way of loading the whole thing with just an index.html, because .vue files are not natively supported by the browser; you need a specific loader.
One simple solution would be to load Vue from a CDN. It will limit your DX regarding SFC files, but you will be able to use Vue in a regular index.html file.
PS: your performance will also be worse overall (because of the required network requests).
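With the CDN approach, the page pulls Vue in with a script tag (something like <script src="https://unpkg.com/vue@3"></script>) and your own script uses the resulting global instead of ES imports; a minimal sketch:

    // main.js -- assumes index.html loaded the Vue CDN build
    // and contains a <div id="app"> mount point
    const { createApp, ref } = Vue; // global provided by the CDN build

    createApp({
      setup() {
        const count = ref(0);
        return { count };
      },
      template: '<button @click="count++">Clicked {{ count }} times</button>',
    }).mount('#app');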
If you want something really lightweight, you could of course also use petite-vue, which is perhaps better suited to super simple projects that only need a sprinkle of reactivity.
I still recommend using something like Netlify or Vercel to host your static site for free while keeping the whole Vue experience, thanks to a server running vite preview for you.

Dropwizard serve external images directory

I have a Dropwizard API app and I want one endpoint where I can run the call and also upload an image; these images have to be saved in a directory and then served through the same application context.
Is this possible with Dropwizard? I can only find static asset bundles.
There is a similar question already: Can DropWizard serve assets from outside the jar file?
The module mentioned there appears in Dropwizard's third-party modules list. There is also an official modules list. These two lists are hard to find, maybe because the main documentation doesn't reference them.
There is also dropwizard-file-assets, which seems new. I don't know which module will work best for your case; both are based on Dropwizard's AssetServlet.
If you don't like them, you could use them as examples of how to implement your own. I suspect that the resource-caching part may not be appropriate for your use case if someone replaces the same resource name with new content: https://github.com/dirkraft/dropwizard-file-assets/blob/master/src/main/java/com/github/dirkraft/dropwizard/fileassets/FileAssetServlet.java#L129-L141
Edit: this is a simple project that I've made using dropwizard-configurable-assets-bundle. Follow the instructions in the README.md. I think it does exactly what you want: put some files in a directory somewhere on the file system (outside the project source code) and serve them if they exist.

How do I share a Flowhub graph?

How can I share (or publish) a Flowhub graph, as is done in this answer?
I need to be able to post a publicly accessible project, and am willing to set up a server if needed.
Examples indeed only support a single graph for now. If your example uses subgraphs or custom components and is targeting NoFlo in the browser, another nice option is to make a public HTML build of it and host it somewhere (for example GitHub Pages).
The noflo-browser-app repository has build automation set up for this, including pushing to GitHub. To use it, you need to do the following steps:
Fork noflo-browser-app
Set your project to use your forked repository path in project settings on Flowhub
Push your graphs and components to GitHub
Share the live mode URL
To make the automatic publishing of app builds to GitHub Pages work, you need to enable the project in Travis CI and provide a GitHub access token via the GH_TOKEN secure env var.
Also remember to tweak the component.json file to include whatever custom component libraries you need.
noflo-browser-app bundles the WebRTC runtime, so it should be quite easy to access.
Sharing is somehow magically implemented through GitHub gists. This works with graphs using ONLY the built-in components. Here's how you do it:
create a GitHub gist
copy the JSON for the graph you want to share and paste it into the gist (my main.json, for example)
name the gist file noflo.json (not sure if this is required)
copy the gist's id from the url, in my case it is ecf36f449034209b8c2e
form your share link like this: https://app.flowhub.io/#example/<yourGistId> (here is mine)
This only works for projects which use standard components. The issue is tracked here.
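For reference, a minimal noflo.json graph using only built-in components looks roughly like this (a sketch from memory, so treat the exact component name as an assumption):

    {
      "properties": { "name": "main" },
      "processes": {
        "out": { "component": "core/Output" }
      },
      "connections": [
        { "data": "Hello Flowhub", "tgt": { "process": "out", "port": "in" } }
      ]
    }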

Testing Dojo with DOH without local Dojo installation

I'm trying to work out how to use DOH to test Dojo modules without installing Dojo locally in my project. I'm working in Eclipse and, ideally, I'd like something that I can run as part of a Maven build eventually. The Dojo package is 5-20 MB, and I don't want it stashed in my source control system for each project if possible.
I've tried a few options with the runner.html test runner, but DOH is going to need to find a Dojo somewhere, and then it seems that modules will be found relative to that installation.
Having Dojo installed on my system but not in the project gives me a problem in trying to find the project relative to the location of the dojo.js file. The cross-domain protection prevents me from serving up any kind of absolute path, as it strips ':' characters. It also stops me from using a Dojo installation served from a different domain over HTTP.
Is it necessary to have Dojo installed somewhere that I can then define a relative path from dojo.js to the roots of my modules? If not, how do I configure to get around it?
I've not tried this completely cross-domain, but yes, you can define paths, which may be enough to get you going.
We run our tests using a somewhat complicated deployment (to ensure we don't introduce accidental dojo/doh path dependencies), and our URL looks like this:
http://server/XXX/dev/dohpath/util/doh/runner.html?boot=../../../dojo/dojo.js&dojoUrl=../../../dojo/dojo.js&paths=doh,../dohpath/util/doh;mymodule,../../mymodule&testModule=full.test.module
That is, you fire up the runner, give it both 'boot' and 'dojoUrl' to tell it where Dojo itself lives, and use 'paths' to tell DOH where it lives and how to find your own modules.
blech
Whether those relative paths can be made absolute successfully, and whether it'll work cross-domain, is an entirely different matter, I'm afraid. We'll be hitting that problem ourselves in a couple of months.
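For context, the testModule referenced in such a URL is just a file that registers its tests with DOH; a minimal sketch from memory (my.test.example is a hypothetical module name):

    // my/test/example.js -- legacy-style DOH test module
    dojo.provide("my.test.example");

    doh.register("my.test.example", [
      {
        name: "sanity check",
        runTest: function () {
          doh.assertEqual(2, 1 + 1);
        }
      }
    ]);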
I've been able to do this with the runner located at http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html and a gist.
The trick seems to be to use a path alias in the URL, and to use a network-path reference URL (i.e. omit the URL scheme, so the URL starts with //).
I found this out while trying to answer this question without a local copy of DOH.
Here it is:
http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html?testModule=aa&paths=aa,//gist.github.com/gitgrimbo/5406688/raw/e6bc4469ce72dfd6d50e61e885889cb915a3f66b/gistfile1

How to keep synchronized, per-version documentation?

I am working on a small toy project that is getting more and more releases. Until now, the documentation was just a set of pages in the WordPress blog I set up for the project. However, as time passes, new releases come out and I should update the online documentation to match the most recent release.
Unfortunately, if I do so, the docs for the previous releases will "disappear" as my doc pages are updated to the most recent version, so I decided to include the documentation in the release package and to keep the most recent documentation available online as a web page as well.
A trivial idea would be to wget the current docs from the WordPress pages, save them into the SVN and therefore into the release package, repeating the procedure at every new release. Unfortunately, the HTML I get must be hacked by hand to fix the links (or I would have to hack WordPress to use BASE so that the HTML code is easily relocatable, something I don't want to do).
How should I handle the requirements of having at the same time:
user-browsable documentation for the proper version included in the downloadable package
most recent documentation available online (and properly styled with my web theme)
the SVN and the actual online contents kept synchronized (in WordPress, or something else that fits nicely with my WordPress setup)
easy to use
Thanks
Edit: I started a bounty to see if I can lure more answers. I think this is quite an important issue, and it would be nice to have multiple hints and opinions for future readers.
I would check your pages into SVN, and then have your webserver update from its local SVN working copy when you're ready to release. Put everything into SVN: WordPress, CSS, HTML, etc.
Wget can convert all the links in the document for you. See the convert-links option:
http://www.gnu.org/software/wget/manual/html_node/Advanced-Usage.html
Using this in conjunction with the other methods could yield a solution.
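For example, a mirror-and-fix-links invocation would look something like this (the URL is a placeholder):

    wget --mirror --convert-links --page-requisites --no-parent http://yoursite/project/docs/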
I think there are two problems to be solved here:
how and where to keep the documentation aligned with the code
where to publish the documentation
For 1, I think it's best to:
keep the documentation in a repository (SVN, git, or whatever you already use for the code) as a set of files instead of in a DB, as it is easier to keep a history of changes (and possibly to stay on par with the code releases)
use an approach where the documentation is generated from a set of source files (you'd keep the sources in the repository), from which the HTML files for the distribution package or for publishing on the web are generated. The two could possibly differ, as on the web you'd need to keep some version information (in the URL) that you don't need when packaging a single release.
To do "2" there are several tools that may generate a static site. One of them is Jekyll it's in ruby and looks quite complete and customizable.
Assuming that you use a tool like jekyll and keep the files and source in SVN you might setup your repo in this way:
repo/
  tags/
    rel1.0/
      source/
      documentation/
    rel2.0/
      source/
      documentation/
    rel3.0/
      source/
      documentation/
  trunk/
    source/
    documentation/
That is:
You keep the current documentation beside the source in the trunk
When you do a release, you create a tag for it
you configure your documentation generator to generate documentation for each of the repo/tags/relXX/documentation directories, such that the documentation for each release is put in a documentation_site/relXX/ directory
So to publish the documentation (point 2 above):
you copy the contents of the documentation_site directory onto the server, putting it in the same base dir as your WordPress install or linking it from there, such that each release's docs can be accessed as: http://yoursite/project/docs/relXX/
you create a link to the current release documentation such that it can always be reached as http://yoursite/project/docs/current
The trick here is to publish the documentation always under a proper release identifier (in the URL, on the filesystem) and use a link (or a redirect) to make sure that the "current documentation" on the web server points to the current release.
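On the server, that "current" pointer can be as simple as a symlink that you repoint at each release (the paths here are placeholders):

    ln -sfn /var/www/project/docs/rel3.0 /var/www/project/docs/current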
I have seen some programs use Help & Manual, but I am a Mac user and I have no experience with it to know if it's any good. I'm looking for a solution for Mac myself.
For my own projects, if that were a need, I would create a sub-dir for the documentation and have all the files use relative references from that known base. For example:
index.html         -- refers to images/example.jpg
README
-- subdirs....
images/example.jpg
section/index.html -- links back to '../index.html',
                   -- refers to ../images/example.jpg
If the docs are included in the SVN/tarball download, then they are readable as-is. If they are generated from some original files, they would be pre-generated for a downloadable version.
Archive versions of the documentation can be unpacked/generated and placed into named directories (e.g. docs/v1.05/).
A simple PHP script can be written to get the list of subdirs of the /docs/ directory from the local disk and display them, highlighting the most recent, for example.
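The same listing logic, sketched here in Node.js for illustration rather than the PHP the answer suggests (it assumes the docs/ layout described above):

    // list-docs.js -- print available doc versions, flagging the newest
    const fs = require('fs');
    const path = require('path');

    const docsDir = path.join(__dirname, 'docs');
    const versions = fs.readdirSync(docsDir, { withFileTypes: true })
      .filter((entry) => entry.isDirectory())
      .map((entry) => entry.name)
      .sort(); // lexicographic; fine for zero-padded names like v1.05

    const latest = versions[versions.length - 1];
    versions.forEach((v) => {
      console.log(v === latest ? v + '  <-- most recent' : v);
    });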