How can I format a gitlab-ci.yml file to HTML - gitlab-ci

The gitlab system has a CI file linter that lets you check gitlab-ci.yml files. It renders these files as a table with informative entries.
However, this rendering is not very robust. Does anyone know if there is a component that provides this service, i.e. a sensible HTML rendering of the CI file for documentation purposes?

Related

Is it possible to load a file from remote URL (non JS) to be used in WebPack build

I am attempting to load a file from a remote URL during build to be WebPacked. This file is an MDX file and I am using the MDX vue-loader to load this file for use within the Vue application.
The system I am deploying is tenanted with a headless CMS powering some pages across the system. I would like to explore the possibilities of loading the MDX files at build time from a remote URL.
I have placed the MDX files on GitHub Pages with the remote URL passed in as an environment variable at build time.
The result is something like this (the idea here is that I can swap the domain during build to satisfy the tenanted site requirement):
import('https://somedomain.com/content/home.mdx');
This fails with your typical error during build of:
dependencies not found please install them using npm --save https://somedomain.com/content/home.mdx
I can have WebPack ignore this import, which allows it to build, but then it fails to load in the browser, since browsers will only load external modules served with a JavaScript MIME type. Not to mention that the file hasn't been through the MDX loader, so I suspect that even if I could get the browser to load it, it would not have been parsed into something usable.
I realise I could copy these files in from the remote during the build stage, but I was hopeful that there might be a way to either allow the browser to pull this remote file or have WebPack download the remote file and pack it into the output.
Does anyone have any ideas if this might be possible? Many thanks in advance.
As MDX needs pre-processing during build I think integration with Webpack is the only way.
You can try the SaveRemoteFilePlugin webpack plugin, which allows you to download a file from a remote location to the local file system. But maybe it's not what you want, as it seems to push the downloaded files directly into the dist folder without passing them through the rest of the Webpack pipeline...
So probably the better option is val-loader, which allows executing your own Node scripts during the build - here you can find an example which does almost what you need - Fetching Remote data during build
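To make that suggestion concrete, here is a minimal, hypothetical sketch of the val-loader approach. The file names, the CONTENT_BASE_URL variable and the exact loader chain are assumptions, not the asker's actual setup; the idea is that val-loader executes the script at build time and hands the downloaded MDX source to the next loader in the chain.

// remote-mdx.build.js - executed by val-loader at build time (hypothetical file name)
const https = require('https');

// Small helper that downloads a URL into a string.
function download(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => resolve(body));
    }).on('error', reject);
  });
}

module.exports = function remoteMdx() {
  // CONTENT_BASE_URL is the tenant-specific domain passed in at build time.
  const url = process.env.CONTENT_BASE_URL + '/content/home.mdx';
  return download(url).then((mdxSource) => ({
    cacheable: false,  // re-fetch on every build
    code: mdxSource,   // handed to the next loader in the chain
  }));
};

// webpack.config.js - relevant rule only; reuse whatever MDX/Babel loaders the project already has
module.exports = {
  module: {
    rules: [
      {
        test: /remote-mdx\.build\.js$/,
        use: [
          'babel-loader',        // runs last: transpiles the JSX emitted by the MDX loader
          '@mdx-js/vue-loader',  // compiles the fetched MDX into a Vue component
          'val-loader',          // runs first: executes remote-mdx.build.js
        ],
      },
    ],
  },
};

The application would then import './remote-mdx.build.js' instead of the remote URL, and the domain can be swapped per tenant via CONTENT_BASE_URL at build time.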

export and maintain vue application

I have developed a vue application and did run npm run build
After that I uploaded the content of the dist folder to my web page, but it returned a blank page.
Since I did this for testing, I uploaded it to a folder at public_html/mypage.com/vueapplication. To get all the paths right I added a vue.config.js with this content:
// vue.config.js
module.exports = {
  publicPath: '/vueapplication/'
}
The application now works, but I wonder:
how do I best publish/upload the application to my site? Just by simply dragging the content into the right folder?
how can I best maintain my site? Do I need to build again and upload, overwriting my files, every time I make an update to my site?
And what is the difference between building and deploying your application?
Dragging and dropping your code should work. But as your app grows you may want to look into automating this. For instance, if you use an S3 bucket you can use the aws cli to automate the upload.
Yes, you should overwrite your deploy folder(s). You also need to take care when deploying changed files that keep the same name. An example is a global css file (main.css for instance): its content will probably change between deployments, but its name stays the same. Browsers may cache the file, so users who downloaded an older version will not pick up the new one. There are different techniques to handle this, but if you use webpack, it uses cache-busting techniques and you should be fine.
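For reference, a minimal sketch of that cache-busting idea in a plain webpack config (the Vue CLI build already applies content hashes like this by default, so this is illustration rather than something you need to add):

// webpack.config.js - content-hashed filenames force browsers to fetch new versions after each deploy
module.exports = {
  output: {
    filename: '[name].[contenthash].js',      // e.g. app.3f2a1b9c.js changes whenever the file content changes
    chunkFilename: '[name].[contenthash].js',
  },
};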
Build is the process of transforming source code into one or more artifacts. Exactly what this means differs from language to language and platform to platform. In the vuejs world this usually means a couple of js files, a couple of css files and some assets.
Deploying means taking the output of a build and making it available to your users. Again, this differs from project to project. In the vuejs world this usually means taking the artifacts from the build and uploading them to an HTTP-enabled web server.

How to work with css and js files in moodle plugin

I need to develop a plugin for Moodle, and I need to have some js and css files in the plugin. But I have the following problem: how do I work with them from the installed plugin? Of course, I can hardcode their paths according to the Moodle directory structure, but that is a very dirty and bad way. Also, I know that I can place all the js and css code inline, but I think that is a bad decision too. Is there a built-in way to serve assets from a plugin? I tried to find it in the documentation, but found nothing.
Thanks
I assume you want to know how to include CSS and JS files into your plugin.
You can include a JS file via the command:
$PAGE->requires->js('/relative/path/your_script.js');
You can then call a JS function once the page has been downloaded with the command:
$PAGE->requires->js_init_call('your_JS_function_name', array(/* parameters */), true /* run when the DOM is ready */);
For example:
$PAGE->requires->js_init_call('init', array($USER->lang), true);
Be sure to make $PAGE available first with global $PAGE;.
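To illustrate the JavaScript side, a minimal sketch of what your_script.js might contain, assuming Moodle's legacy YUI-based loader, which passes the YUI instance as the first argument followed by the values from the PHP array (the function body here is just a placeholder, not part of the answer):

// your_script.js - the function invoked by $PAGE->requires->js_init_call('init', array($USER->lang), true)
function init(Y, lang) {
    // Runs once the DOM is ready, because the third js_init_call argument was true.
    Y.one('body').addClass('myplugin-lang-' + lang);  // hypothetical example: tag the page with the user's language
}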
Your CSS file can be named styles.css and put into the root folder of your plugin. The file will be automatically read by the system and included, and it will take precedence over (override) the system CSS files. After adding it you will have to reload the theme caches.

Show HTML artifacts in bamboo without downloading

I've successfully created a small demo HTML report of test results from a build. Simply put, I'm doing numerical computations, and would like to give more detailed information on test results than a binary pass/fail. The HTML report consists of multiple HTML files with relative links between them.
However, linking from one file to the other sometimes leads to the file being opened in the browser, and sometimes a "download file" dialog opens. Any ideas what the rules are, so I can look at the whole report in the browser without resorting to downloading a zip file of the whole report, unzipping it, and so on?
Just a quick note here, if anyone should need it - as this was where I ended up in my search.
After upgrading our Bamboo to 6.8.1 build 60805 our code coverage artifacts started downloading, instead of being displayed inline.
This can be fixed by enabling the security and permissions setting "Allow artifacts to be embedded in Bamboo pages".
Be aware of the note about cross-site scripting vulnerabilities if this is enabled.
On our project we use this simple solution:
1. In the Stage, configure a final task script to copy the reports to some folder:
echo "Copy artifact report"
rm -rf ../artifacts
mkdir ../artifacts
cp -r functionalTests/build/html/behat/* ../artifacts/
2. On the Artifacts tab, edit the artifact definition and set the Copy pattern to artifacts/**
Then, when you navigate to the build artifact, the folder with the reports will open in the browser.
To have an embedded HTML page in Bamboo showing the coverage results, this page partially helped me make Bamboo cooperate with Python coverage:
Troubleshooting
The Clover tab shows the directory listing instead of the HTML report
Please check which artifact handler you use. The Amazon S3 Artifact Handler serves files on a one-by-one basis, instead of exposing all files as a static website. To change this, open Configure plan and on the Miscellaneous tab select the Use custom artifact handler settings check-box. Then select Server-Local Artifact Handler for shared and non-shared artifacts and finally re-run the build.
In my setup though "Server-Local Artifact Handler" failed completely, but choosing "Bamboo remote handler" did the job.

Is it possible to use Dojo build without modifying JS files?

Is it possible to use Dojo build without the need to modify JavaScript files?
The article dgrid and Dojo Nano Build provides instructions for creating the build, but it requires adding the following line to the JavaScript file that initializes the application:
require(['dgrid/dgrid'], function () { /* application code */ });
(replacing 'dgrid/dgrid' with your build module name).
However, this is very problematic when using a build for your own modules because, of course, in development mode the require for your own layer can't be included, otherwise the modifications made to your own modules wouldn't be visible. But in production mode this line must be added.
So either you must modify the file manually before the production build, or write a script that modifies the file during the build. Both approaches are very error-prone.
Is there a better way to achieve that result? Is it possible for Dojo to recognize that the build is provided and should be used, instead of loading each module separately?
The following line of code can be included in both development and production modes.
require(['dgrid/dgrid'], function () { /* application code */ });
I describe the reasons why in my answer here.
What you need to do is configure Dojo differently based on the environment.
In a blog post that I wrote, I describe this in more detail. The following summarizes the post:
I create three modes: Production, Uncompressed, and Development.
Development
When developing code, I have the js source linked into the web server, and Development mode points to the dojo.js file and the raw css file(s). The browser loads the modules I need using xhr, and I point to the top-level css files, which import other css files. The result is that a lot of requests are made to the server and the loading of the page is noticeably slow. The benefit is that you can see development changes without having to do a full build.
Production
Production mode points the main dojo file at the dojo.js that is built using the build tool. I also create <script> elements for the other layers that are needed in the page. I point the css to the built css files, in which the build tool has inlined the imported css. The page loads quickly, but it is difficult to debug.
Uncompressed
Similar to Production, but I point to the .uncompressed.js files. Production and Uncompressed are available in the released version of our software. I use Uncompressed when trying to troubleshoot an issue in a production environment. The value of this mode is dwindling as developer tools get better at supporting compressed javascript (source maps, etc.).
Server side
The default mode is Production, but I use a query parameter to switch modes. I also store the current mode in the session, so that I only have to set the mode once to change it. Subsequent pages will run in the changed mode, until I change it back.
Here is a java implementation of this code.
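As a rough client-side illustration of what the server-rendered page ends up doing in each mode (all paths, layer names and the app/main module below are hypothetical, not taken from the answer):

// Pick the Dojo loader script based on the mode chosen server-side.
var mode = 'production'; // 'development' | 'production' | 'uncompressed' - injected by the server

var dojoScriptByMode = {
  development: 'js/src/dojo/dojo.js',                        // raw source; modules fetched individually via xhr
  production: 'js/release/app/dojo/dojo.js',                 // built, minified layer
  uncompressed: 'js/release/app/dojo/dojo.uncompressed.js'   // built but readable, for troubleshooting
};

// dojoConfig must exist before dojo.js runs; deps kicks off the application's main module.
var dojoConfig = {
  async: true,
  deps: ['app/main']
};

var script = document.createElement('script');
script.src = dojoScriptByMode[mode];
document.head.appendChild(script);

In the built modes, additional <script> elements for the other layers needed on the page would be emitted the same way, as described under Production above.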