I have an Angular module with some configuration that I want all tests to have before running. I could go into every test file and include that module via angular.mock.module, but I was wondering if I could do it via Karma somehow.
Make a new file with a suitable name, let's say required.ts. In this file, include all the required modules/dependencies that you want available in your tests.
Now in karma.conf, add required.ts to the files array and preprocess it to get the desired result.
You could also decide to make a require.js instead, in which case you would not have to preprocess it in karma.conf.
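As a minimal sketch of that setup (the file locations, the preprocessor, and the module name myApp.config are assumptions, not from the original), required.ts can register the module with a global beforeEach, which Jasmine then applies to every suite loaded after it:

// required.ts -- assumes angular and angular-mocks are already loaded;
// 'myApp.config' is a hypothetical module name
beforeEach(angular.mock.module('myApp.config'));

karma.conf.js then only needs to list (and preprocess) that file ahead of the specs:

// karma.conf.js (excerpt)
module.exports = function (config) {
  config.set({
    files: [
      'test/required.ts', // loaded before the spec files below
      'src/**/*.spec.ts'
    ],
    preprocessors: {
      'test/required.ts': ['webpack'],
      'src/**/*.spec.ts': ['webpack']
    }
  });
};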
I want to create two versions of an app with slightly different content. Therefore I thought about having two "www" directories (let's say "www-foo" and "www-bar") and telling Capacitor in capacitor.config.json which one to use (with the "webDir" setting). The "appId" should also differ between the two versions.
So I guessed the easiest way would be to have two capacitor.config.json files with different "appId" and "webDir" settings and, when running the build script, to specify which config file to use (as webpack does with its --config flag). But I can't find any information on whether it's possible to specify which config file to use when building the app.
Is it just not possible (yet) or am I too stupid to find it? :)
Otherwise I would try to generate the capacitor.config.json file with webpack before running the Capacitor build script.
I used this article as a guide for my project.
I don't know whether Capacitor has any option for two or more appIds/webDirs.
But given your requirement, my suggestion is to create a custom build script in Node.js that changes the info in the Capacitor config (the appId), builds the chosen webDir (www / www-two), and then syncs & copies to the platform; see the sketch below.
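A minimal sketch of such a script, assuming a capacitor.config.json (as in the question) and placeholder variant names foo/bar:

// build-variant.js -- usage: node build-variant.js foo
const fs = require('fs');
const { execSync } = require('child_process');

const variant = process.argv[2]; // 'foo' or 'bar'
const config = JSON.parse(fs.readFileSync('capacitor.config.json', 'utf8'));

// Swap the per-variant settings before building.
config.appId = 'com.example.' + variant; // hypothetical app ids
config.webDir = 'www-' + variant;        // www-foo or www-bar
fs.writeFileSync('capacitor.config.json', JSON.stringify(config, null, 2));

// Build the web assets, then copy them into the native platforms.
execSync('npm run build', { stdio: 'inherit' });
execSync('npx cap sync', { stdio: 'inherit' });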
I am attempting to load a file from a remote URL during build so it can be processed by webpack. The file is an MDX file, and I am using the MDX vue-loader to load it for use within the Vue application.
The system I am deploying is tenanted, with a headless CMS powering some pages across the system. I would like to explore the possibility of loading the MDX files at build time from a remote URL.
I have placed the MDX files on GitHub Pages with the remote URL passed in as an environment variable at build time.
The result is something like this (the idea here is that I can swap the domain during build to satisfy the tenanted site requirement):
import('https://somedomain.com/content/home.mdx');
This fails during the build with the typical error:
dependency not found; please install it using npm install --save https://somedomain.com/content/home.mdx
I can have webpack ignore this import, which allows the build to succeed, but then it fails to load in the browser, since browsers will only load external modules served with a JavaScript MIME type. Not to mention that the file hasn't been through the MDX loader, so I suspect that even if I could get the browser to load it, it would not have been parsed into something usable.
I realise I could copy these files in from the remote during the build stage, but I was hopeful that there might be a way to either allow the browser to pull this remote file, or have webpack download it and pack it into the output.
Does anyone have any ideas if this might be possible? Many thanks in advance.
As MDX needs pre-processing during the build, I think integration with webpack is the only way.
You can try the SaveRemoteFilePlugin webpack plugin, which allows you to download files from a remote URL to the local file system. But it may not be what you want, as it seems to push the downloaded files directly into the dist folder without passing them through the rest of the webpack pipeline...
So probably the better option is val-loader, which allows executing your own Node scripts during the build; the example Fetching Remote data during build does almost what you need. See the sketch below.
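A rough sketch of the val-loader approach (the file name remote-mdx.js and the CONTENT_BASE_URL variable are made up for illustration); val-loader calls the exported function at build time and bundles whatever code it returns:

// remote-mdx.js -- executed by val-loader at build time
const fetch = require('node-fetch'); // v2, which still supports require()

module.exports = async function remoteMdx() {
  const res = await fetch(process.env.CONTENT_BASE_URL + '/content/home.mdx');
  const mdx = await res.text();
  // Return the downloaded source as this module's code. A real setup would
  // still need to chain the MDX loader afterwards so the content is compiled
  // rather than shipped as a raw string.
  return { code: 'module.exports = ' + JSON.stringify(mdx) + ';' };
};

In webpack.config.js you would then add a rule like { test: /remote-mdx\.js$/, use: ['val-loader'] } and import remote-mdx.js instead of the remote URL.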
Background:
Using VueJS, specifically in regards to PWA template https://github.com/vuejs-templates/pwa
There is a build step, npm run build, which bundles the project and transpiles the Vue code into distributable browser JS.
The files in /static/ are "static" and just copied into dist, but I am wondering if it's possible to template them at all, or to read in some dynamic values.
Question:
Is it possible to have static files that are served under /static in the URL, but that can also accept dynamic values during build?
More context:
The problem is Vue compiles everything into the dist directory.
All non-static assets are cached and get a unique URL on each build, whereas static files have absolute paths (I know this is configurable, but you arguably want your non-static assets to be cached).
Server routing that maps a file in /static/ to a cached dynamic file is outside of Vue. The question is about needing to host some absolute-pathed (static) files where, depending on which config is used (dev, prod, staging, ...), 1-2 URLs inside the files need to change; that is just one example of the use case.
The solution I found was to use CopyWebpackPlugin, which is already used natively inside build/webpack.prod.conf.js.
This is the plugin that copies files from static into dist/static.
You can use process.env.NODE_ENV to copy environment-specific files from static into dist, as in the sketch below.
I decided just to keep environment-specific copies of the files with their values changed, but you could easily add code to that file to parse and copy over whatever specific files you want.
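A minimal sketch of that idea in build/webpack.prod.conf.js (the static-development / static-production directory names are assumptions):

// build/webpack.prod.conf.js (excerpt)
const path = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');

new CopyWebpackPlugin([
  {
    // Pick the source directory for the current environment,
    // e.g. static-development/ or static-production/.
    from: path.resolve(__dirname, '../static-' + process.env.NODE_ENV),
    to: 'static',
    ignore: ['.*']
  }
]);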
I think most people put dynamic configuration values in a file under public/, then use the JavaScript fetch API to load those values in Vue components. Webpack will copy the files in public/ to the web root (dist/) and avoid compiling those config values into the minified JavaScript. If you put files in static/ and use import or require to load them into Vue components, then webpack will resolve those at build time and compile them into the minified JavaScript, which is probably not what you want.
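For example (a sketch; the file name config.json and the apiBaseUrl field are invented for illustration), a component could load the values at runtime:

// Any file in public/ is copied verbatim, so public/config.json
// ends up at dist/config.json and is fetchable at /config.json.
export default {
  data() {
    return { apiBaseUrl: null };
  },
  async created() {
    const res = await fetch('/config.json');
    const config = await res.json();
    this.apiBaseUrl = config.apiBaseUrl; // value differs per environment
  }
};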
I need to develop a plugin for Moodle, and I need to have some JS and CSS files in the plugin. But I have the following problem: how do I work with them from the installed plugin? Of course, I could hardcode their paths relative to the Moodle directory structure, but that is a very dirty and bad way. I also know that I could place all the JS and CSS code inline, but I think that is a bad decision too. Is there a built-in way to serve assets from a plugin? I tried to find one in the documentation, but found nothing.
Thanks
I assume you want to know how to include CSS and JS files into your plugin.
You can include a JS file via the command:
$PAGE->requires->js('/relative/path/your_script.js');
You can then call a JS function once the page has been downloaded with the command:
$PAGE->requires->js_init_call('your_js_function_name', $array_of_parameters, $on_dom_ready);
For example:
$PAGE->requires->js_init_call('init', array($USER->lang), true);
Be sure to make the $PAGE available with global $PAGE;, first.
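On the JavaScript side, the function named in the example might look like this (a sketch; as far as I know, Moodle's YUI wrapper passes the YUI instance as the first argument, followed by your parameters):

// your_script.js -- 'init' matches the name passed to js_init_call above
function init(Y, lang) {
    // 'lang' arrives from PHP via array($USER->lang).
    console.log('Current user language: ' + lang);
}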
Your CSS file can be named styles.css and put into the root folder of your plugin. The file will be automatically read by the system and included. It will take precedence over (will overwrite the settings of) the system CSS files. After that you will have to reload the theme caches.
I'm trying to work out how to use DOH to test Dojo modules if I don't install Dojo locally to my project. I'm working in Eclipse and ideally, I'd like something that I can run as part of a Maven build eventually. The Dojo package is 5-20Mb and I don't want to have it stashed in my source control system with each project if possible.
I've tried a few options with the runner.html test runner, but DOH is going to need to find a Dojo somewhere, and then it seems that modules will be found relative to that installation.
Having Dojo installed on my system but not in the project gives me a problem in trying to find the project relative to the location of the dojo.js file. The cross-domain protection prevents me from serving up any kind of absolute path, as it strips ':' characters. It also stops me from using a Dojo installation served up on a different domain over HTTP.
Is it necessary to have Dojo installed somewhere so that I can define a relative path from dojo.js to the roots of my modules? If not, how do I configure things to get around it?
I've not tried this completely cross-domain, but yes, you can define paths, which may be enough to get you going.
We run our tests using a somewhat complicated deployment (to ensure we don't introduce accidental dojo/doh path dependencies), and our URL looks like this:
http://server/XXX/dev/dohpath/util/doh/runner.html?boot=../../../dojo/dojo.js&dojoUrl=../../../dojo/dojo.js&paths=doh,../dohpath/util/doh;mymodule,../../mymodule&testModule=full.test.module
That is, you fire up the runner and give it both 'boot' and 'dojoUrl' to tell it where Dojo itself lives, then use 'paths' to tell DOH where it lives and how to find your own modules.
blech
Whether those relative paths can be made absolute successfully, and whether it will work cross-domain, is an entirely different matter, I'm afraid. We'll be hitting that problem ourselves in a couple of months.
I've been able to do this with the runner located at http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html and a gist.
The trick seems to be to use a path alias in the URL, and to use a network-path reference for the URL (i.e. omit the URL scheme, so the URL starts with //).
I found this out while trying to answer this question without a local copy of DOH.
Here it is:
http://archive.dojotoolkit.org/nightly/checkout/util/doh/runner.html?testModule=aa&paths=aa,//gist.github.com/gitgrimbo/5406688/raw/e6bc4469ce72dfd6d50e61e885889cb915a3f66b/gistfile1