Dropwizard serve external images directory - assets

I have a Dropwizard API app and I want one endpoint where I can make a call and also upload an image; these images have to be saved in a directory and then served through the same application context.
Is it possible with dropwizard? I can only find static assets bundles.

There is a similar question already: Can DropWizard serve assets from outside the jar file?
The module mentioned there is listed in Dropwizard's third-party modules list. There is also an official modules list. These two lists are hard to find, maybe because the main documentation doesn't reference them.
There is also dropwizard-file-assets, which seems new. I don't know which module will work best for your case; both are based on Dropwizard's AssetServlet.
If you don't like them, you could use them as examples of how to implement your own. I suspect that the resource-caching part may not be appropriate for your use case if someone replaces the same resource name with new content: https://github.com/dirkraft/dropwizard-file-assets/blob/master/src/main/java/com/github/dirkraft/dropwizard/fileassets/FileAssetServlet.java#L129-L141
Edit: Here is a simple project that I've made using dropwizard-configurable-assets-bundle. Follow the instructions in the README.md. I think it does exactly what you want: put some files in a directory somewhere on the file system (outside the project source code) and serve them if they exist.
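As for the upload endpoint itself, that part is plain JAX-RS. Below is a rough sketch, assuming Jersey multipart support is registered (for example Dropwizard's dropwizard-forms MultiPartBundle) and that the resource is constructed with the same directory the assets bundle serves from. The class name, the /static URL prefix, and the uploadDir field are my own illustrative choices, not something from the linked project.

import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

import org.glassfish.jersey.media.multipart.FormDataContentDisposition;
import org.glassfish.jersey.media.multipart.FormDataParam;

// @Path is fully qualified here to avoid clashing with java.nio.file.Path
@javax.ws.rs.Path("/images")
public class ImageUploadResource {

    private final Path uploadDir; // the directory the assets servlet serves from

    public ImageUploadResource(Path uploadDir) {
        this.uploadDir = uploadDir;
    }

    @POST
    @Consumes(MediaType.MULTIPART_FORM_DATA)
    public Response upload(@FormDataParam("file") InputStream body,
                           @FormDataParam("file") FormDataContentDisposition details) throws IOException {
        // NOTE: in real code, sanitize the client-supplied file name before resolving it.
        Path target = uploadDir.resolve(details.getFileName());
        // Write the uploaded bytes into the served directory, overwriting any existing file.
        Files.copy(body, target, StandardCopyOption.REPLACE_EXISTING);
        // Tell the client where the assets servlet will serve the file from
        // (adjust the prefix to whatever mapping you configured for the bundle).
        return Response.created(URI.create("/static/" + details.getFileName())).build();
    }
}

Register it in your Application's run() method with something like environment.jersey().register(new ImageUploadResource(uploadDir)), and the uploaded file becomes available through the assets bundle on the next request.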

Related

how to use vue.js offline?

Hi, I received a web project with all the CSS, JS, and HTML code, directories, project structure, etc. already implemented.
I have to make changes using Vue.js, but I don't always have internet access while on the move, so is there a way to continue this project locally without changing my project structure?
I already have an existing web project that contains:
- an HTML page
- a CSS file
- a JS file
each placed in its own folder.
I want to use Vue.js on this project.
The problem: I don't always have internet access when I'm on the move.
So how do I use Vue?
Knowing that:
- the CDN is a script placed in the HTML that requires a connection to run Vue
- Vue CLI is a package that generates a new Vue project (certainly an excellent one), but then I would have to start over, because its predefined directory structure doesn't suit me.
How does it work?
How do I just add Vue and continue the project without starting from zero?
I have already installed Node.js (npm) on my PC, if that can help.
-- IN BRIEF:
If you still don't understand: imagine being handed a web project with the HTML, CSS, and JS already configured, and you must use Vue to make changes, knowing that on the move you don't always have a connection.
How do you do that?
Assuming (I can't tell 100% from your description) that it is an un-compiled implementation that uses the CDN, you can easily handle this by copying the Vue library locally and updating the HTML to use the local version instead of the CDN.
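For example, since you already have npm, something like the following should work; the js/ folder and the Vue 2 CDN URL below are just illustrations, so substitute whatever your project actually uses:

npm install vue
copy node_modules/vue/dist/vue.min.js into your existing js/ folder
(for Vue 3 the in-browser build is dist/vue.global.js instead; or simply download the file once from the CDN while you are online)

Then swap the script tag in the HTML:

<!-- before: needs a connection -->
<script src="https://cdn.jsdelivr.net/npm/vue@2"></script>
<!-- after: served from your own folder, works offline -->
<script src="js/vue.min.js"></script>

Everything else in the page stays exactly as it is.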
If you need to keep the HTML untouched, you could use a browser plugin like Requestly (there are many others). There you can select the URL that goes to the CDN and redirect it to the local one.
Another option for Chromium-based browsers is to use local overrides. Picture upload is not working for me at the moment, so I can't include a screenshot, but the option is available through the Sources tab in the developer tools. You need to enable overrides, select a folder, and then you can pick the resource that you want to serve from the local override.

How do I share a Flowhub graph?

How can I share (or publish) a Flowhub graph as is done in this answer?
I need to be able to post a publicly accessible project, and am willing to set up a server if needed.
Examples indeed only support a single graph for now. If your example uses subgraphs or custom components and is targeting NoFlo on the browser, another nice option is to make a public HTML build of it and host it somewhere (for example GitHub Pages).
The noflo-browser-app repository has build automation set up for this, including pushing to GitHub. To use it, you need to do the following steps:
Fork noflo-browser-app
Set your project to use your forked repository path in project settings on Flowhub
Push your graphs and components to GitHub
Share the live mode URL
To make the automatic publishing of app builds to GitHub Pages work, you need to enable the project in Travis CI and provide a GitHub access token via the GH_TOKEN secure environment variable.
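If it helps, a secure variable like that is usually added with the Travis CLI, along these lines (the token value is of course a placeholder):

travis encrypt GH_TOKEN=<your-github-access-token> --add env.global

That writes the encrypted value into .travis.yml so the build can push to GitHub Pages.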
Also remember to tweak the component.json file to include whatever custom component libraries you need.
noflo-browser-app bundles the WebRTC runtime, so it should be quite easy to access.
Sharing is somehow magically implemented through GitHub gists. This works with graphs using ONLY the built-in components. Here's how you do it:
create a GitHub gist
copy the JSON for the graph you want to share and paste it into the gist (my main.json, for example); a minimal sketch of such a graph file is shown after these steps
name the gist file noflo.json (not sure if this is required)
copy the gist's ID from the URL; in my case it is ecf36f449034209b8c2e
form your share link like this: https://app.flowhub.io/#example/<yourGistId> (here is mine)
This only works for projects which use standard components. This issue is tracked here.
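For reference, a graph file in the NoFlo JSON format looks roughly like the sketch below; the process names, the core/Repeat and core/Output components, and the "Hello, Flowhub" initial packet are only an illustration of the shape of the file, so adapt it to your own graph:

{
  "properties": { "name": "main" },
  "processes": {
    "Repeat": { "component": "core/Repeat" },
    "Output": { "component": "core/Output" }
  },
  "connections": [
    { "data": "Hello, Flowhub", "tgt": { "process": "Repeat", "port": "in" } },
    { "src": { "process": "Repeat", "port": "out" }, "tgt": { "process": "Output", "port": "in" } }
  ]
}

Paste something shaped like that into the gist as noflo.json and the share URL above should be able to load it.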

Custom Clickstart with Templates and Github

We are using the CloudBees DEV@cloud service, and are looking to create a number of applications based on an archetype stored in GitHub. I would like to create a custom ClickStart in order to streamline the process.
We are currently forking the archetype, then using a Folder Template that I have created to provision a build pipeline for the application.
While I have been able to create a simple ClickStart, I would like to create one that:
Forks or copies the ClickStart source into a GitHub repository, not CloudBees Forge. The GitHub API supports this.
Points to my folder template using the Jenkins XML API. Currently, not all attributes of a folder template are represented in the rendered XML.
Targets a specific folder to create my new folder job under.
The ClickStart API and JSON don't seem that well documented, and I have gotten about as far as I can go with trial and error.
Is what I am looking to accomplish possible with the current state of the Clickstart API?
Forks or copies the ClickStart source into a GitHub repository
I do not believe this is possible today. Certainly it has been proposed.
not all attributes of a folder template are represented in the rendered XML
Such as what? The config.xml of a folder, just like that of a job, should be definitive. (It does not include definitions of child items.)
Target a specific folder
Also not possible today that I know of. (Though the user of the ClickStart could always move the result into a subfolder after the fact.)

Testing Dojo with DOH without local Dojo installation

I'm trying to work out how to use DOH to test Dojo modules without installing Dojo locally in my project. I'm working in Eclipse, and ideally I'd like something that I can run as part of a Maven build eventually. The Dojo package is 5-20 MB and I don't want to have it stashed in my source control system with each project if possible.
I've tried a few options with the runner.html test runner, but DOH is going to need to find a Dojo somewhere, and then it seems that modules will be found relative to that installation.
Having Dojo installed on my system but not in the project gives me a problem in trying to find the project relative to the location of the dojo.js file. The cross-domain protection prevents me from serving up any kind of absolute path, as it strips ':' characters. It also stops me from using a Dojo installation served up on a different domain over HTTP.
Is it necessary to have Dojo installed somewhere that I can then define a relative path from dojo.js to the roots of my modules? If not, how do I configure to get around it?
I've not tried this completely cross domain, but yes, you can define paths which may be enough to get you going.
We run our tests using a somewhat complicated deployment (to ensure we don't introduce accidental dojo/doh path dependencies), and our URL looks like this:
http://server/XXX/dev/dohpath/util/doh/runner.html?boot=../../../dojo/dojo.js&dojoUrl=../../../dojo/dojo.js&paths=doh,../dohpath/util/doh;mymodule,../../mymodule&testModule=full.test.module
That is, you fire up the runner, give it both 'boot' and 'dojoUrl' to tell it where Dojo itself lives, and use 'paths' to tell DOH where it lives and how to find your own modules.
blech
Whether those relative paths can be made absolute successfully, and whether it'll work cross-domain, is an entirely different matter, I'm afraid. We'll be hitting that problem ourselves in a couple of months.
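For completeness, the module that testModule points at is plain DOH. A minimal sketch in the classic pre-AMD style, with the full.test.module name and the assertion made up purely for illustration, would look like:

dojo.provide("full.test.module");

doh.register("full.test.module", [
  function smokeTest() {
    // A trivial assertion, just to prove the module was found through the 'paths' mapping.
    doh.assertTrue(1 + 1 === 2);
  }
]);

The runner loads that module name via dojo.require, so it has to be resolvable through the path mappings in the URL above.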
I've been able to do this with the runner located at http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html and a gist.
The trick seems to be to use a path alias in the URL, and to use a network-path reference URL (i.e. omit the URL scheme so the URL starts with //).
I found this out while trying to answer this question without a local copy of DOH.
Here it is:
http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html?testModule=aa&paths=aa,//gist.github.com/gitgrimbo/5406688/raw/e6bc4469ce72dfd6d50e61e885889cb915a3f66b/gistfile1

Project organization in perforce

I created several web applications that use the same static files (CSS, JS, images).
When I used SVN for version control, I used an external repository (svn:externals) to add the files to the current project.
For example:
- Project_1
---- Webapp
-------- Static (external to static's repo)
- Project_2
---- Webapp
-------- Static (external to static's repo)
I could easily use them in the web pages by adding a link like /static/...
But now our company has moved to Perforce.
How can I support the current structure?
We also use Maven; I thought of packaging these files as a JAR and using it as a dependency, but then my editor (IDEA) does not see that this dependency contains JS scripts and styles.
And I would need to repackage and deploy the JAR file whenever I make minor changes.
How do I use Maven correctly?
Perforce has support for defining multiple mappings from the depot to your hard drive as part of the client spec. You could, for example, set the following:
Client Name: Sample_Maven
Client Root: c:\inetpub\wwwroot
//depot/Project_1/Webapp/... //Sample_Maven/Project_1/...
//depot/Project_2/Webapp/... //Sample_Maven/Project_2/...
//depot/Shared/static/... //Sample_Maven/static/...
... any other folder mappings you need to bring in and sync ...
Perforce won't handle the multiple-mapping-of-the-shared-static-folder situation by itself; you will have to use junctions/symlinks in your file system to get the behavior you want. A word of caution, though: make sure only one of the shared static folders is actually managed through Perforce. It can get slightly grumpy if resources get changed out from under it without it knowing about the changes.
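On Windows, with the client root from the spec above, the junctions could look something like the following; the exact paths are illustrative and depend on where your projects actually sync to:

mklink /J c:\inetpub\wwwroot\Project_1\static c:\inetpub\wwwroot\static
mklink /J c:\inetpub\wwwroot\Project_2\static c:\inetpub\wwwroot\static

On Linux or macOS the equivalent would be ln -s. Only c:\inetpub\wwwroot\static itself is synced by Perforce; the junctions just make it visible inside each project.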
Really, though, you are probably better off (if you can) having a single workspace/client spec per project: one for proj1 and one for proj2, each with its own mapping to the shared static folder. If you can structure things appropriately and just use Maven to build each "project", things will go more smoothly.
For a Maven-based solution, you could use WAR overlays; sharing common resources across multiple web applications is exactly what overlays are for.
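Roughly, the shared static files get their own small WAR module, and each webapp then declares it as a war-type dependency so the maven-war-plugin merges its contents at package time. The coordinates below are hypothetical:

<!-- in each webapp's pom.xml -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>shared-static</artifactId>
  <version>1.0.0</version>
  <type>war</type>
</dependency>

By default the war plugin overlays every war-type dependency onto the webapp, so the shared CSS/JS/images end up inside each built WAR without being duplicated in source control.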
It seems you have a couple of choices, both called overlays:
a) Maven overlays, as #Pascal suggests. Then you use a structure like #Goyuix suggests to check out the static content from Perforce.
b) Perforce overlays, which would allow you to have two different workspaces/client specs, one for each project, and in each import the static content into the expected place in the filesystem. This is the closest match to the subversion structure you were using before.