I have a Vue application which uses Webpack 3 as its bundler. The application contains custom translation tags ({% trans %}...{% endtrans %}, as the templates used to be Jinja2 style) which get converted to $t(...) as part of the bundling (using replace-string-loader), which vuex-i18n requires.
Is there a way to extract all $t() strings into a file as part of the bundling, so that this file can be translated? Since the replace-string-loader needs to process the files first, I expect this translation extractor would have to be integrated into the webpack config (as part of the module.rules array). Is there anything already out there which does the job, or can someone please provide some help on how to write such a "loader"?
If $t() is a function invocation, you can write a webpack plugin that hooks the parser and traces all of the invocations, then uses the emit hook to write the collected data to a file.
compiler.hooks.normalModuleFactory.tap('MyPlugin', factory => {
  factory.hooks.parser.for('javascript/auto').tap('MyPlugin', (parser, options) => {
    parser.hooks.expression.for('$t').tap('MyPlugin', expr => {
      console.log(expr);
    });
  });
});
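Expanding on that idea, here is a minimal sketch of a complete plugin, assuming webpack 4's hooks API (webpack 3 still uses the older plugin()-style API). It uses the call hook instead of the expression hook above, since we care about invocations, and collects string-literal arguments into an emitted asset. The plugin name and the translations.json filename are placeholders, not something from the original answer:

// Sketch only, assuming webpack 4 hooks; ExtractTranslationsPlugin and
// translations.json are made-up names for illustration.
class ExtractTranslationsPlugin {
  apply(compiler) {
    const strings = new Set();

    compiler.hooks.normalModuleFactory.tap('ExtractTranslationsPlugin', factory => {
      factory.hooks.parser.for('javascript/auto').tap('ExtractTranslationsPlugin', parser => {
        // Fires for every call whose callee is the free variable $t.
        // Note: this does not see this.$t(...) calls inside components.
        parser.hooks.call.for('$t').tap('ExtractTranslationsPlugin', expr => {
          const arg = expr.arguments[0];
          if (arg && arg.type === 'Literal' && typeof arg.value === 'string') {
            strings.add(arg.value);
          }
        });
      });
    });

    // Write everything collected during parsing as an extra asset.
    compiler.hooks.emit.tapAsync('ExtractTranslationsPlugin', (compilation, callback) => {
      const source = JSON.stringify([...strings], null, 2);
      compilation.assets['translations.json'] = {
        source: () => source,
        size: () => source.length
      };
      callback();
    });
  }
}

module.exports = ExtractTranslationsPlugin;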
A good blog post about getting started with webpack plugins: https://medium.com/webpack/the-contributors-guide-to-webpack-part-2-9fd5e658e08c
You can also check the source code of ProvidePlugin as a reference for what it does.
I have a simple Vue / Nuxt project that I would like to serve from AWS Lambda.
For this, I'd like to group everything into a single file.
I see that Nuxt is splitting the files in order to only load what matters at a given time, but the app is a single page, is for internal use and loading time / memory usage is completely irrelevant.
My question is two-fold:
How can I disable the file splitting?
Is there a way to pack everything into a single index.html file? I didn't find a solution on the web because the moment I start researching, I keep finding posts about SSR, which is also totally irrelevant to my case.
For the splitting part, setting all of the splitChunks options to false should be enough: https://nuxtjs.org/docs/2.x/configuration-glossary/configuration-build#splitchunks
Like this
export default {
  build: {
    splitChunks: {
      layouts: false,
      pages: false,
      commons: false
    }
  }
}
This one should also help: https://github.com/nuxt/nuxt.js/issues/2363
You can also take full control of the webpack config here: https://nuxtjs.org/docs/2.x/configuration-glossary/configuration-build#optimization
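If the splitChunks flags alone are not enough, here is a hedged sketch of how that optimization key (plus build.extend) could be used to force a single client bundle; the exact combination of options is my own assumption, not an official single-file recipe:

// Rough sketch for nuxt.config.js, assuming Nuxt 2 / webpack 4.
import webpack from 'webpack'

export default {
  build: {
    // Keep layouts/pages/commons from being split into their own files.
    splitChunks: {
      layouts: false,
      pages: false,
      commons: false
    },
    optimization: {
      runtimeChunk: false,
      splitChunks: false // disable webpack's own chunk splitting entirely
    },
    extend(config, { isClient }) {
      if (isClient) {
        // Force webpack to emit a single client bundle.
        config.plugins.push(new webpack.optimize.LimitChunkCountPlugin({ maxChunks: 1 }))
      }
    }
  }
}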
As for hosting on Lambda, you can check these two articles:
https://www.serverless.com/examples/aws-node-vue-nuxt-ssr
https://medium.com/#fernalvarez/serverless-side-rendering-with-aws-lambda-nuxtjs-b94d15782af5
We're using the VueI18n plugin for internationalizing our Vue application, which is working fine so far. Text contents are managed and translated by our editorial staff using Zanata.
However there is a major drawback in our current approach. We're pulling the translations from Zanata during build time, i.e. "baking" them into the application. As a consequence, we need to rebuild and redeploy the application each time a text is edited.
Are there any approaches which pull the translations (or the translation files) when running the application so that the user is always presented the latest content?
You should be able to do this if you load the translation file at runtime and then use setLocaleMessage(locale, message) to apply it. I'm assuming you're using axios.
axios.get('/path/to/locale_de')
  .then((response) => {
    Vuei18nInstance.setLocaleMessage('de', response.data)
  });
The response body should be plain JSON.
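For context, a minimal sketch of wiring this up so the latest messages are fetched before the app mounts; the endpoint path, the locale file layout and the import names are assumptions for illustration:

// Sketch only: fetch fresh translations at startup instead of baking them in.
// '/i18n/de.json' and the i18n/App imports are hypothetical names.
import Vue from 'vue';
import axios from 'axios';
import App from './App.vue';
import { i18n } from './i18n'; // your configured VueI18n instance

async function loadLocale(locale) {
  const response = await axios.get(`/i18n/${locale}.json`);
  i18n.setLocaleMessage(locale, response.data);
  i18n.locale = locale;
}

// Pull the current locale from the server, then mount the app.
loadLocale('de').then(() => {
  new Vue({ i18n, render: h => h(App) }).$mount('#app');
});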
I studied the documentation for automatic global registration of base components but it is confusing. Can anyone please explain it in detail?
You can use require.context() to resolve a directory where components live during webpack's build process. This exposes to you, within the browser, the list of files in that directory. From that, you can use some magic to automatically register them; here's an example:
const files = require.context('./components', true, /\.vue$/i);
files.keys().forEach(key => {
  Vue.component(key.split('/').pop().split('.')[0], files(key).default);
});
So in the above, we've told webpack to create a context for us of all files in the './components' directory. From there, we can loop over all the keys (which represent the file names) and register each one with Vue.component(...).
I recommend watching this video:
https://www.vuemastery.com/courses/real-world-vue-js/global-components
First you can understand why you may want to use global components, then why you would want to automatically globally register those components, and then what the registration code (taken from the following link) actually does.
https://v2.vuejs.org/v2/guide/components-registration.html#Automatic-Global-Registration-of-Base-Components
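For reference, the registration code from that guide looks roughly like this (reproduced from memory and assuming lodash for the name normalization, so double-check it against the guide itself):

import Vue from 'vue';
import upperFirst from 'lodash/upperFirst';
import camelCase from 'lodash/camelCase';

// Grab all "Base*" components in ./components (non-recursive).
const requireComponent = require.context('./components', false, /Base[A-Z]\w+\.(vue|js)$/);

requireComponent.keys().forEach(fileName => {
  const componentConfig = requireComponent(fileName);
  // "./BaseButton.vue" -> "BaseButton"
  const componentName = upperFirst(
    camelCase(fileName.split('/').pop().replace(/\.\w+$/, ''))
  );
  Vue.component(componentName, componentConfig.default || componentConfig);
});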
I've been looking at using Aloha for a project but I'm completely stumped by the documentation. I'm trying to create a repository following the documentation and I have this so far:
requirejs.config({
  paths: {
    'jquery': "http://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min",
    'aloha': "http://cdn.aloha-editor.org/latest/lib/aloha",
  },
});
define(
  ['jquery'],
  function($) {
    "use strict";
    require(['aloha'], function(localAloha) {
      console.debug(localAloha);
      console.debug(Aloha.AbstractRepository);
    });
    return {};
  }
);
Now, this tries to pull Aloha and jQuery from an appropriate CDN, and it works fine. However, despite what the Aloha documentation tells me, localAloha is not defined (it appears Aloha doesn't return itself), but that's not a problem since by that point it's in the global namespace anyway.
More frustrating when trying to define a repository is the fact that Aloha.AbstractRepository is undefined, despite all the examples, and code from live projects like the Drupal Aloha plugin, telling me that all I need to do is extend Aloha.AbstractRepository.
Does anyone have any idea what's going on here? Aloha looks great, and is perfect for what I have in mind, but it's proven to be very difficult to actually get it working.
Here is some code which should help you.
A PHP script reads files from directories (e.g. the upload dir) and generates a JSON file with that information. That JSON is in the format which can be used by a JS file (repository API) to tell Aloha Editor what's in your repository: http://ge.tt/1VJqium/v/0?c
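On the JS side, a rough sketch of what such a repository could look like, based on the Aloha repository API; the repository id, the JSON endpoint and the item fields are assumptions, and the method names should be checked against your Aloha version:

// Hedged sketch only: a minimal Aloha repository fed from a pre-generated JSON
// file. '/uploads/repository.json' and the field names are hypothetical.
Aloha.ready(function() {
  var MyFileRepository = Aloha.AbstractRepository.extend({

    _constructor: function() {
      // Register the repository under a unique id.
      this._super('my-file-repository');
    },

    // Called by Aloha whenever it needs items matching the user's query.
    query: function(params, callback) {
      Aloha.jQuery.getJSON('/uploads/repository.json', function(items) {
        var term = (params.queryString || '').toLowerCase();
        var matches = items.filter(function(item) {
          return item.name.toLowerCase().indexOf(term) !== -1;
        });
        // Real repositories usually wrap results in Aloha repository item
        // objects; plain objects with id/name/type are shown here for brevity.
        callback.call(this, matches);
      });
    }
  });

  // Instantiating the class registers it with Aloha's repository manager.
  new MyFileRepository();
});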
I have created a multi-layer build using build.dojotoolkit.org (my first attempt) with 3 layers: dojo.js, dojox.js, dijit.js. Each JS file is uploaded in its own folder (dojo, dojox, dijit).
When I run the code, I would expect it to look in dijit.js to get the form modules like dijit.form.TextBox. But instead it tries to load dijit/form/TextBox.js and of course ends up with a 404 error.
What am I doing wrong?
The files are here if it helps:
http://usermanagedsolutions.com/Demos/Pages
Manually include each layer in a script tag on the page.
<script src="path/to/dojo.js" />
<script src="path/to/dojox.js" />
<script src="path/to/dijit.js" />
This will make available all modules that you have defined in the build. When you require the text box, Dojo will see that it has the code and will not make the XHR call.
Even though you do not have the intention of using the individual files, you may want to put them on the server as well. This way, if someone forgets to add a file to the build, the penalty incurred is an XHR request, as opposed to a JavaScript error.
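Putting the pieces together, the page setup could look roughly like this; the paths are placeholders, and the dojoConfig block only tells the loader where to fall back to if a module is missing from a layer:

<!-- Sketch only: adjust the paths to wherever your built layers live. -->
<script>
  var dojoConfig = {
    async: true,
    // Fallback locations, used only when a module is not already in a layer.
    packages: [
      { name: "dojo",  location: "path/to/dojo" },
      { name: "dijit", location: "path/to/dijit" },
      { name: "dojox", location: "path/to/dojox" }
    ]
  };
</script>
<script src="path/to/dojo.js"></script>
<script src="path/to/dojox.js"></script>
<script src="path/to/dijit.js"></script>
<script>
  // TextBox resolves from the dijit layer, so no extra XHR request is made.
  require(["dijit/form/TextBox", "dojo/domReady!"], function(TextBox) {
    new TextBox({ name: "firstname" }, "firstname");
  });
</script>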
Re: AMD
When you include your layers in the manner that I described above, you are not loading all the modules that you included in the build - you are just making the define functions available without having to make XHR requests.
If you look at the js file that is output from the build, the file contains a map of the module path to a function that when called will define the module.
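To illustrate, a built layer file looks roughly like this (simplified sketch; the exact structure depends on the Dojo version and build profile):

// Simplified sketch of a built layer: a cache that maps module ids to the
// functions that will define them when first required.
require({cache: {
  'dijit/form/TextBox': function() {
    define(["dojo/_base/declare" /* , ... */], function(declare) {
      /* module body */
    });
  },
  'dijit/form/Button': function() {
    define(["dojo/_base/declare" /* , ... */], function(declare) {
      /* module body */
    });
  }
  /* ...one entry per module included in the layer... */
}});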
So when you write the following code
require(["dijit/form/TextBox"], function(TextBox){
...
});
AMD will first determine if dijit/form/TextBox has already been defined. If so, it will just take the object and execute the callback.
If the module hasn't already been defined, then AMD will look in its cache to see if the define code is available. When you include your script files, you are providing a cache of define functions. AMD finds the code to define the module. It calls this define function and the result is the object that is passed into the callback. Subsequent requires for dijit/form/TextBox will also use this object as described above.
If the module hasn't already been defined and AMD does not find the define function in its cache, then AMD will make an XHR request back to the server to try to locate the specific module code. The result of the XHR call should provide the define function. AMD will call the function and use the result as the object to pass into the callback. Again, subsequent requires for dijit/form/TextBox will also use this object.
The Dojo build provides the ability to 1) minify the code and 2) combine it into fewer files that need to be requested from the server.
AMD allows you to write code that can run in either environment (using built files or the individual files) without having to make modifications.