Using Vue CLI 3 on XAMPP

A little background..
As mentioned in https://forum.vuejs.org/t/how-to-make-webpack-vue-work-on-xampp/33808, it works when I put my Vue project directly in htdocs like this:
htdocs/
| - css/
| - js/
| etc..
However, I'm using it differently; here's my current file structure in htdocs:
htdocs/
| - project1/
| | - css/
| | - some other stuff for project1
| - project2/
| | - css/
| | - some other stuff for project2
| - vue-project/ (where I want my Vue project to go)
| | - css/
| | - stuff..
The error I get when I put the dist/ of my Vue project directly into htdocs/vue-project/ is a 404, because the asset paths point back to the root (htdocs/), where the files required to launch the Vue project can't be found!
What I wanted
Anything that can launch the project from htdocs/vue-project/. I would accept any answer that configures this either on the Vue/webpack side OR in XAMPP itself. If you need additional information, please ask in the comment section.
And if it turns out there is no other way, then I would accept answers about configuring XAMPP to serve from different directories, e.g. starting one server in htdocs and another in a different folder.

For Vue CLI before 3.x
Try changing assetsPublicPath under the build object in config/index.js and append your folder name there. A similar issue that might help: https://forum.vuejs.org/t/vue-js-webpack-deployment-for-xaamp-testing/28970
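For example, a minimal sketch of the relevant part of config/index.js from the webpack template, assuming the build output is served from htdocs/vue-project/ (the folder name is taken from the question; other keys are omitted):
'use strict'
const path = require('path')

module.exports = {
  build: {
    assetsRoot: path.resolve(__dirname, '../dist'),
    assetsSubDirectory: 'static',
    // was '/'; append the folder the app is served from -- '/vue-project/' is taken from the question
    assetsPublicPath: '/vue-project/'
  }
}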
And if vue-router is used, then also set the router base (ROUTER_BASE):
https://router.vuejs.org/en/api/options.html#base
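A sketch of what that looks like with vue-router; the '/vue-project/' value is again taken from the question's folder layout:
// router/index.js -- a sketch; '/vue-project/' is assumed from the question's folder layout
import Vue from 'vue'
import Router from 'vue-router'

Vue.use(Router)

export default new Router({
  base: '/vue-project/', // base URL of the app, so routes resolve under htdocs/vue-project/
  routes: []
})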
For Vue CLI 3
Create a vue.config.js inside your vue-project and simply add the following:
module.exports = {
  publicPath: "/{path-to-your-vue-project}"
}
For more information, please refer to the docs: https://cli.vuejs.org/config/#publicpath
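For the folder layout in the question, that would presumably be:
// vue.config.js -- assuming the dist/ output is copied into htdocs/vue-project/
module.exports = {
  publicPath: "/vue-project/"
}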

How to use terraform to ignore previous execution(state) [duplicate]

I'm a bit of a newbie with Terraform and still working my way through the documentation. I have not yet been able to find a way to accommodate the setup I need for a specific solution, and I'm hoping that some kind soul may be able to give me a push in the right direction.
I'm trying to manage a single set of parameterised templates which deploy everything needed to support a new application we are working on in GCP. What I am trying to achieve is being able to deploy those templates to three different environments, each environment being its own distinct GCP project.
The plan is, as per recommendations, to run terraform and:
a) pass in the specific .tfvars file depending on the environment/project being deployed to (dev/test/prod);
b) use the -chdir parameter to tell Terraform to pick up all the templates from the 'infra-common' folder.
The tricky part is that we want each environment (GCP project) to host its own state file in GCS (Google Cloud Storage).
I had been looking at workspaces but it appears that workspaces will just create state subfolders on a single backend.
Question: Can this be done or is there a better way to do it?
Thanks!
You can use --backend-config for this. Here's how you can achieve the desired behavior:
Create a .config file for each environment (dev.config, test.config, prod.config), each containing the name of the GCS bucket (which must already exist) for the respective environment.
Specify the common backend in a single remote_state.tf file
Here's how it would look:
config/dev.config:
bucket = "tf-state-dev"
config/test.config:
bucket = "tf-state-test"
config/prod.config:
bucket = "tf-state-prod"
remote_state.tf:
terraform {
  backend "gcs" {
    prefix = "terraform/state"
  }
}
Then you can run the init. For dev, for example, this would look like:
$ terraform init --backend-config=config/dev.config
Then, you can create a workspace for the environment:
$ terraform workspace new dev
With this approach, you can use a single set of templates (you can in fact configure dynamic variables based on the current workspace).
What you could do (we have a project with a similar setup, using a different cloud provider) is:
use infra-common as a module
instead of working with .tfvars files per environment, use a separate root module per environment which invokes infra-common as a sub-module.
Your folder structure could look like:
project
|-- dev
|   `-- main.tf
|-- modules
|   `-- infra-common
|       |-- main.tf
|       `-- variables.tf
|-- test
|   `-- main.tf
`-- prod
    `-- main.tf
dev/main.tf
terraform {
  backend "gcs" {
    bucket = "tf-state-dev"
    prefix = "terraform/state"
  }
}

module "stage" {
  source   = "../modules/infra-common"
  env      = "dev"
  some_var = "value"
}
prod/main.tf
terraform {
  backend "gcs" {
    bucket = "tf-state-prod"
    prefix = "terraform/state"
  }
}

module "stage" {
  source   = "../modules/infra-common"
  env      = "prod"
  some_var = "value"
}

Docusaurus error during building the project

First of all, let me say that I'm new to Docusaurus and I'm using version 2.
I was building my documentation project and got the following error:
Docusaurus Node/SSR could not render static page with path=/docs/My_MDX_Page because of error: useBaseUrl is not defined
I'm using an MDX page that uses the base URL to retrieve static content.
The MDX page:
---
title: title
hide_title: true
sidebar_label: item
---

import useBaseUrl from '@docusaurus/useBaseUrl';

## Section

| Name | image |
| :--- | :--- |
| YL_GN3 | <img src={useBaseUrl('img/YL_GN3.png')} /> |
I think I'm doing something wrong, even though it works without building (via npm start).
npm install was executed.
Here is the directory structure of project:
+ .docusaurus
+ build
+ docs
+ node_modules
+ src
+ static
+ versioned_docs
+ versioned_sidebars
babel.config.js
docusaurus.config.js
package-lock.json
package.json
README.md
sidebars.js
versions.json
UPDATE
For reasons that I don't know, it has started working. I changed the Docusaurus config, setting baseUrl to something other than '/', and added an index.js in order to have the docs as the landing page.
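For reference, a minimal sketch of the kind of config change described above; the title, url, and baseUrl values here are placeholders, not taken from the actual project:
// docusaurus.config.js -- a sketch; all values below are placeholders
module.exports = {
  title: 'My Documentation',
  url: 'https://example.com',
  baseUrl: '/docs/', // anything other than '/', as described in the update
  // ...the rest of the usual configuration (presets, themes, etc.) stays as before
};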

Optional slug in dynamic route using a folder

I'm currently working on the internationalization of a website using Vue.js and the Nuxt framework.
I'm moving a page's URL from website.com/my-page to website.com/<lang>/my-page.
To do so, and following the documentation, I've moved all my pages in a _lang folder, giving me the following architecture:
pages/
│
├── _lang/
│   │
│   └── my-page.vue
│
...
This works exactly as intended, apart from the fact that the <lang> slug is now mandatory, so accessing website.com/my-page returns a 404.
I would like this URL to show the page in the default language declared in my application.
The only way I found to achieve this is to create another my-page.vue at the root of pages/ containing the following:
<script>
import Mypage from '~/pages/_lang/my-page'
export default Mypage
</script>
However, this means creating this kind of alias for every page of my website, giving me:
pages/
│
├── my-page.vue
│
├── _lang/
│   │
│   └── my-page.vue
│
...
Is there any way to automate this, as it is a very tedious process?
Thank you,
Side-note: I've been investigating extendRoutes without success.
You could use the router-extras-module module and declare an alias in the page:
<router>
  alias:
    - /my-page
</router>
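The <router> block above goes at the top of the page component itself (e.g. pages/_lang/my-page.vue), and the module has to be registered in nuxt.config.js; a hedged sketch, assuming the package is published on npm as @nuxtjs/router-extras:
// nuxt.config.js -- assuming the module's npm package name is @nuxtjs/router-extras
export default {
  modules: [
    '@nuxtjs/router-extras'
  ]
}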

How to use pdfjs with aurelia-cli

Has anyone got pdfjs (https://github.com/mozilla/pdf.js) working with the Aurelia CLI?
I'm getting nowhere fast with getting it up and running.
I followed the docs here (http://aurelia.io/docs/build-systems/aurelia-cli#adding-client-libraries-to-your-project) and guessed that main needed to be set to webpack, but the paths in that file seem to be interpreted incorrectly.
------- File not found or not accessible ------
| Location: /home/ubuntu/workspace/public/pdfjs-dist/build/pdf.js
| Requested by: /home/ubuntu/workspace/public/src/modules/admin/admin.js
| Is this a package? Make sure that it is configured in aurelia.json and that it is not a Node.js package
------- File not found or not accessible ------
| Location: /home/ubuntu/workspace/public/src/worker-loader.js
| Requested by: /home/ubuntu/workspace/public/src/modules/admin/admin.js
| Is this a package? Make sure that it is configured in aurelia.json and that it is not a Node.js package
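For what it's worth, with the default RequireJS-based aurelia-cli loader the usual direction is to register pdfjs-dist under build → bundles → vendor-bundle.js → dependencies in aurelia.json. A hedged sketch; the path and main values below are assumptions about the installed package's layout, not a verified configuration:
"dependencies": [
  {
    "name": "pdfjs-dist",
    "path": "../node_modules/pdfjs-dist/build",
    "main": "pdf"
  }
]
With an entry like this, the import in admin.js would typically reference the package name (pdfjs-dist) rather than a path into its build/ folder.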

How to require js files with node webkit

I'm structuring my app like this:
dev
|-js
| |-C
| | |-app.js
| |-M
| | |-folder.js
|-index.html
|-package.json
in index.html, I have
<script type="text/javascript" src="js/C/app.js"></script>
in app.js I have
var folders = require("js/M/folder.js")
When I run this app, I'm seeing:
Uncaught Error: Cannot find module 'js/M/folder.js' module.js:341
Module._resolveFilename module.js:341
Module._load module.js:280
Module.require module.js:367
require module.js:383
window.require
(anonymous function) app.js:6
I've tried using
"js/M/folder"
"js/M/folder.js"
"/js/M/folder"
"/js/M/folder.js"
"./js/M/folder"
"./js/M/folder.js"
"../../M/folder"
"../../M/folder.js"
And it never seems to find the file. I also noticed that the require function object doesn't seem to be the same as the one in the Node documentation, as it has no resolve method attached to it.
Is there something fundamental I'm missing about node-webkit?
Never mind. It looks like node-webkit doesn't like to run files located on a network share accessed via a UNC path.
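For completeness, a minimal sketch of the two files involved; the contents of folder.js are an assumption (the real file isn't shown), and per the self-answer the problem was the UNC network share rather than the path forms themselves:
// js/M/folder.js -- assumed shape; the real file isn't shown in the post
module.exports = {
  list: function () {
    return ['inbox', 'archive'] // placeholder data
  }
}

// js/C/app.js -- using one of the path forms already listed above; require()
// here is node-webkit's Node require exposed to the page context
var folders = require('./js/M/folder.js')
console.log(folders.list())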