In Magento 2, when multiple products are opened, their data is stored locally in the browser under window.localStorage.product_data_storage. In many browsers that data gets deleted automatically, but in some browsers it is never removed.
I should mention that I had opened about 200 products before this happened, but the storage was never cleared.
Clearing the browser cache removes this data, but end users obviously will not clear their cache every time, so how can we remove this content from their browsers?
Any help will be appreciated. Thank you!
As I understand it, you want to remove the content that Magento stores in every browser's local storage when someone opens your site. Magento uses this data in several places, for example to maintain Knockout.js components such as the checkout page and the mini-cart.
I have created a module to achieve this. With it you can manage how much content is stored in the browser's local storage: Magento keeps storing data there as usual, but once the number of entries reaches your chosen limit (10, 20, or whatever you wish), the module deletes the stored content and Magento starts again from 1 up to your limit.
Create a module at the location below:
Magento_root/app/code/{VendorName}/{ModuleName}
Create the following files in it, at the locations given:
registration.php
etc/module.xml
view/frontend/layout/catalog_product_view.xml
view/frontend/templates/product/view/removelocal.phtml
view/frontend/web/js/removelocal.js
Here I am not including the content of registration.php and module.xml, assuming they are already familiar to you. For this answer I am using the VendorName => Vendorname and the ModuleName => Removelocal. Here is the code of the custom module.
catalog_product_view.xml
<?xml version="1.0"?>
<page layout="1column" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="urn:magento:framework:View/Layout/etc/page_configuration.xsd">
<body>
<referenceContainer name="content">
<block class="Magento\Framework\View\Element\Template" name="vendorname.removelocal.content" template="Vendorname_Removelocal::product/view/removelocal.phtml" before="-" />
</referenceContainer>
</body>
</page>
removelocal.phtml
<div class="swatch-opt" data-role="remov-local-content"></div>
<script type="text/x-magento-init">
{
"[data-role=remov-local-content]": {
"Vendorname_Removelocal/js/removelocal": {
}
}
}
</script>
removelocal.js
define([
    'jquery'
], function ($) {
    'use strict';

    $.widget('mage.removelocal', {
        _init: function () {
            // Only act if Magento has already stored product data locally.
            if (window.localStorage.product_data_storage) {
                var myObject = JSON.parse(window.localStorage.product_data_storage);
                var count = Object.keys(myObject).length;

                // Once the number of stored products reaches the limit (10 here),
                // drop the whole entry; Magento will start filling it again.
                if (count >= 10) {
                    window.localStorage.removeItem('product_data_storage');
                }
            }
        }
    });

    return $.mage.removelocal;
});
Note: after completing the module, run the commands below.
php bin/magento module:enable Vendorname_Removelocal
php bin/magento setup:upgrade
php bin/magento setup:static-content:deploy -f
php bin/magento cache:flush
I am trying to play a video when developing locally with VueJS 2.
My code is the following:
<video class="back_video" :src="`../videos/Space${videoIndex}.mp4`" id="background-video"></video>
...
data :
function() {
return {
videoIndex:1
}
}
...
const vid = document.getElementById("background-video");
vid.crossOrigin = 'anonymous';
let playPromise = vid.play();
if (playPromise !== undefined) {
playPromise.then(function() {
console.log("video playing");
}).catch(function(error) {
console.error(error);
});
}
This code is causing the exception given in the title. I tried several browsers, always the same result.
If I change the src to:
:src="require(`../videos/Space${videoIndex}.mp4`)"
it works.
But in that case the build time is very long, as I have many different videos in my videos directory: adding require() forces all of them to be copied into the build directory at build time (vue-cli serve), which is really annoying. In other words, I want to reference videos that live outside the build directory, to avoid this (and also to keep the videos out of my git repository).
It is interesting to note that when I deploy server-side, it works perfectly with my original code:
:src="`../videos/Space${videoIndex}.mp4`"
Note also that if I replace my code with simply
src="../videos/Space1.mp4"
it works too, so the video itself, or its location, is not the source of the problem.
Any clue?
You can host your videos on a CDN to get something faster and easier to debug and work with.
Otherwise, the bundler has to package them locally, which can take some time.
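For illustration, this is roughly what the template could look like; the CDN base URL below is a placeholder for wherever the videos end up hosted:
<video class="back_video" :src="`https://cdn.example.com/videos/Space${videoIndex}.mp4`" id="background-video"></video>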
I am using GWT with Bootstrap3 and an OpenLayers map, and I have implemented my own OSM map server.
My application does not start without an internet connection. I need some guidance.
I followed the instructions in bootstrap3 v1.0.2 for offline applications.
However, I only got a blank screen.
Starting with the Firefox debugger I got the following message in the console:
Uncaught ReferenceError: OpenLayers is not defined
<anonymous> http://www.openstreetmap.org/openlayers/OpenStreetMap.js:7
Starting with Google Chrome I get the following warning:
[Deprecation] Application Cache API manifest selection is deprecated and will be removed in M85, around August 2020. See https://www.chromestatus.com/features/6192449487634432 for more details.
followed by
GET http://www.openlayers.org/api/OpenLayers.js net::ERR_INTERNET_DISCONNECTED
and
localhost/:1 Application Cache Error event: Invalid or missing manifest origin trial token: http://localhost:8090/simaso/simasoweb/appcache.manifest
Here is my basic setup:
SiMaSoWeb.gwt.xml:
...
<inherits name='com.google.gwt.json.JSON'/>
<inherits name="com.google.web.bindery.autobean.AutoBean"/>
<inherits name="org.gwtbootstrap3.extras.cachemanifest.Offline"/>
...
<add-linker name="offline" />
SiMaSoWeb.html:
<!doctype html>
<html manifest="simasoweb/appcache.manifest">
<head>
<title>Sirene</title>
<script type="text/javascript" language="javascript" src="simasoweb/simasoweb.nocache.js"></script>
<script src="http://www.openlayers.org/api/OpenLayers.js"></script>
<script src="http://www.openstreetmap.org/openlayers/OpenStreetMap.js"></script>
<link type="text/css" rel="stylesheet" href="SiMaSoWeb.css">
....
</html>
In ...\simasoweb\appcache.manifest I find:
CACHE MANIFEST
# Version: 1599380329409.0.6297069797290025
CACHE:
AF4477772D0DB53A10ABCF74A5AE0C4D.cache.js
fonts/fontawesome-webfont.woff
clear.cache.gif
fonts/FontAwesome.otf
css/bootstrap-notify-custom.min.cache.css
7192594CA2F468C2F793523022719FA0.cache.js
...
css/font-awesome-4.7.0.min.cache.css
NETWORK:
*
Finally, I compile all of this. The resources seem to be included in the war file.
Needless to say that with an internet connection, everything runs fine in the first 1-2 seconds of starting.
As per the Google Chrome warning you included, AppCache is a deprecated standard and is being removed; it has already been removed from non-secure contexts.
You should use Service Workers instead to cache resources for offline use. You may have to write your own linker, or you may be able to use gwt-serviceworker-linker.
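For orientation, a minimal sketch of what a service worker registration looks like in plain JavaScript (the 'sw.js' path is a placeholder; gwt-serviceworker-linker generates and registers its own worker script):
if ('serviceWorker' in navigator) {
    // register the worker; the browser will then serve cached resources offline
    navigator.serviceWorker.register('sw.js')
        .then(function (reg) { console.log('ServiceWorker registered, scope: ' + reg.scope); })
        .catch(function (err) { console.error('ServiceWorker registration failed: ', err); });
}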
Thanks to ELEVATE I managed to move from AppCache to ServiceWorker. However, the OpenLayers issue couldn't be fixed this way. So here is what solved the issues:
ServiceWorker
I am still using Java 8
I upgraded to GWT 2.9
I added to my .gwt.xml:
<inherits name="org.realityforge.gwt.serviceworker.Linker"/>
<inherits name="elemental2.dom.Dom"/>
<inherits name="elemental2.promise.Promise"/>
<inherits name="jsinterop.base.Base"/>
...
and
<add-linker name="serviceworker"/>
<extend-configuration-property name="serviceworker_static_files" value="./"/>
<extend-configuration-property name="serviceworker_static_files" value="../SiMaSoWeb.html"/>
In my entry Java routine, right at the start of my onModuleLoad, I added
import static elemental2.dom.DomGlobal.*;
import elemental2.dom.DomGlobal;
public void onModuleLoad() {
...
initStatic();
...
and later in that module
public void initStatic() {
    if ( null != navigator.serviceWorker ) {
        navigator.serviceWorker.register( "simasoweb/" + GWT.getModuleName() + "-sw.js" ).then( registration -> {
            console.log( "ServiceWorker registration successful with scope: " + registration.getScope() );
            // Every minute attempt to update the serviceWorker. If it does update
            // then the "controllerchange" event will fire.
            DomGlobal.setInterval( v -> registration.update(), 60000 );
            return null;
        }, error -> {
            console.log( "ServiceWorker registration failed: ", error );
            return null;
        } );
        navigator.serviceWorker.addEventListener( "controllerchange", e -> {
            // This fires when the service worker controlling this page
            // changes, e.g. a new worker has skipped waiting and become
            // the new active worker.
            console.log( "ServiceWorker updated ", e );
        } );
    }
}
I had issues with accessing the right files in my GWT war directory, so I updated the navigator.serviceWorker.register... call accordingly.
Very useful was the browser's built-in debugger (Ctrl+Shift+I). In the 'console' tab you find the messages: red means bad, so solve it!
As external jar libraries I had to include:
elemental2-core
elemental2-dom
elemental2-promise
base
Now the OpenLayers issues... Needless to say, there might be a much more elegant way. Furthermore, you need an offline map, which I have rendered myself and have available (150 GB for Germany!).
In the HTML file:
Note that I opened the OpenLayers and OpenStreetMap .js files in a browser, saved their contents to files, and copied those into the src subdirectory of my war directory. Again, the browser debugger can help find directory issues.
<script type="text/javascript" language="javascript" src="simasoweb/simasoweb.nocache.js"></script>
<script src="src/OpenLayers.js"></script>
<script src="src/OpenStreetMap.js"></script>
Copying by hand
I downloaded the gwt-openlayers demo project GWT-OpenLayers-master.zip
and copied all files from GWT-OpenLayers-master\gwt-openlayers-showcase\src\main\resources\org\gwtopenmaps\demo\openlayers\public\openlayers into my ...war\src\ directory, where my OpenLayers .js files lie.
Finally, I am not sure whether service-worker points 1-3 really helped.
I created an app with vue-cli and then built the dist folder for production.
The app is deployed on IIS with a Flask backend and works fine.
The problem occurs when I have to make some changes and redo the deployment. Afterwards, users call me because the app doesn't work, but if they clear the Chrome cache, the app works fine again.
How can I fix this problem? Is there a method to clear the Chrome cache automatically whenever I release a new application version?
Thanks
My dist folder:
Deployment: copy and paste the dist folder onto IIS.
If the files in the dist folder are correct, maybe the problem is in the axios cache? I have also made some changes to the REST APIs.
I had the same problem, and changing (incrementing) the version number in package.json before running the build command fixed it.
For example, by default the version number is set to "0.1.0".
package.json file:
{
"name": "project-name",
"version": "0.1.1",
"private": true,
...
}
If you use vue-cli, it has built-in webpack configs for building dist, and in fact it adds hashed names to the output files.
But if that was somehow removed, you can add it back to the webpack config like this:
output: {
filename: '[name].[hash].bundle.js'
}
Your output files will then carry a hash in their names.
And even better, you do not need to handle how all this stuff gets added to the HTML, because webpack will figure that out for you.
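For completeness, a small sketch of the vue-cli option that controls this behaviour, assuming a standard vue-cli project (hashing is on by default, so this only matters if it was switched off somewhere):
// vue.config.js
module.exports = {
    // vue-cli adds a content hash to output filenames when this is true (the default)
    filenameHashing: true
};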
You need to add a version query to your js file. This is how a browser can know if the file has changed and needs to download the new version.
So something like:
<script src="main.js?v=1.1"></script>
<script src="main.js?v=1.2"></script>
etc...
Assuming this has nothing to do with service workers/PWA, the solution can be implemented by returning the front-end version.
axiosConfig.js
axios.interceptors.response.use(
(resp) => {
let fe_version = resp.headers['fe-version'] || 'default'
if(fe_version !== localStorage.getItem('fe-version') && resp.config.method == 'get'){
localStorage.setItem('fe-version', fe_version)
window.location.reload() // For new version, simply reload on any get
}
return Promise.resolve(resp)
},
)
You can also ensure the fe-version is returned based on any sort of uniqueness; here I have used the commit SHA.
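As an illustration of the server side, a minimal sketch assuming an Express backend (your stack may differ; COMMIT_SHA is a placeholder for however the build id gets injected):
// return the current front-end build id on every response
const FE_VERSION = process.env.COMMIT_SHA || 'default';
app.use(function (req, res, next) {
    res.set('fe-version', FE_VERSION);
    next();
});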
Full Article here: https://blog.francium.tech/vue-js-cache-not-getting-cleared-in-production-on-deploy-656fcc5a85fe
You can't access the browser's cache; that would be a huge security flaw.
To fix it, you must send some headers with your Flask responses telling the browser not to cache your app.
This is an example for Express (where setHeaders is an option of express.static) for you to get the idea:
const express = require('express')
const app = express()
app.use(express.static('dist', {
    setHeaders: function (res, path, stat) {
        res.set('Cache-Control', 'no-cache, no-store, must-revalidate') // HTTP 1.1
        res.set('Pragma', 'no-cache') // HTTP 1.0
        res.set('Expires', '0') // Proxies
    }
}))
You can read a lot more about caching here.
This is an older post, but since I could not find the solution to this problem online, I'll just post it here in case someone else finds it useful.
I added the hash to the application chunk files via the webpack.mix.js file by adding:
mix.webpackConfig({
output: {
chunkFilename: 'js/[name].js?id=[chunkhash]',
},
})
This adds a fingerprint to the actual chunks and not just the app.js file. You can add a version name to the app.js file as well, either by adding version(['public/js/app.js']); at the end of the file or by adding filename: '[name].js?[hash]' to the output block.
My complete webpack.mix.js:
const mix = require('laravel-mix');
mix.webpackConfig({
output: {
chunkFilename: 'js/main/[name].js?id=[chunkhash]',
}
}).js('resources/js/app.js', 'public/js').vue()
.postCss('resources/css/app.css', 'public/css', [
//
]).version(['public/js/app.js']);
In my Laravel Blade file I used
<script src="{{ mix('js/app.js') }}"></script>
to load the app.js file with the correct version fingerprint.
The answer for me was caching at the DNS provider level.
Basically, I'm using the Cloudflare DNS proxy, and they were caching the website, so in development mode I was not getting the code updates.
I had to clear the cache many times to get anything to change, and I had to wait a significant period of time before anything updated.
Turning the proxy off stopped this behaviour.
The method I want to suggest:
<script src="{{ asset('js/app.js?time=') }}{{ time() }}" defer></script>
Add the script below in public/index.html:
<head>
...
<script type="text/javascript" language="javascript">
var timestamp = (new Date()).getTime();
var script = document.createElement("script");
script.type = "text/javascript";
script.src = "<%= BASE_URL %>sample.js?t=" + timestamp;
document.head.appendChild(script);
</script>
...
</head>
Could you try:
vm.$forceUpdate();
Also, it's possible that the component itself needs a unique key:
<my-component :key="unique" />
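As a rough sketch of how such a key could be driven (the property name unique is illustrative): bumping a data property used as :key makes Vue destroy and recreate the component:
// in the component that renders <my-component :key="unique" />
data() {
    return { unique: 0 };
},
methods: {
    refresh() {
        this.unique += 1; // a new key forces <my-component> to be recreated
    }
}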
The code below handles a GET request and will display the login page:
const routes = [
{
path: '/',
name: 'root',
component: Login
}
]
I can display data using the router with the GET method in Vue.js. Now I want to accept POST requests/methods from an external website. Is that possible? If it is, how should I do it, and if it is not, is there an alternative solution?
No, that's not possible. Your routes are not even GET requests. You can intercept any request in your own application, but you can't listen for external requests; that's what HTTP servers are for.
No, a client app in a browser cannot accept requests from other websites/services, no matter which HTTP method is going to be used.
If for some reason you want your Vue-based application to be accessible remotely, you can consider using SSR.
Pass data via the webpage holding the Vue instance
Yes you can, with a trick!
What is the first base file that loads your Vue.js app? Is it an HTML file like index.html?
Try loading your Vue.js app from a PHP file like index.php instead, and at the top of the PHP file write this:
<!DOCTYPE html>
<html>
<head>
...
<script>
window.postedData = '<?php echo json_encode($_POST) ?>';
</script>
...
</head>
Now you can use the postedData variable everywhere in your Vue.js code:
mounted() {
const arr=JSON.parse(postedData)
console.log(arr)
...
}
I use a dynamic source for vue-webpack images in Nuxt, :src="require('path/to/image' + dynamic.variable)", in my project navbar. If a user replaces their image through a form, which re-fetches their information and deletes their previous image, I get a webpack error: module (img) not found (it does not find the new one). Is there a way to solve this, like waiting for webpack HMR to finish?
I tried setting up a setTimeout() of one second before the user re-fetch and it works, but I don't like relying on an arbitrary wait; I'd rather use a promise or something synchronous, but the point is that webpack hot reload is not controlled by my functions. I also tried setting the dynamic path as a computed property, but that doesn't fix it.
My image tag:
<img v-if="this.$auth.user.image" class="userlogo m-2 rounded-circle" :src="require('#assets/images/users/' + this.$auth.user.image)" alt="usrimg">
My Useredit page methods:
...
methods: {
userEdit() {
//uploads the image
if (this.formImageFilename.name) {
let formImageData = new FormData()
formImageData.append('file', this.formImageFilename)
axios.post('/db/userimage', formImageData, { headers: { 'Content-Type': 'multipart/form-data' } })
// once it has uploaded the new image, it deletes the old one
.then(res=>{this.deleteOldImage()})
.catch(err=>{console.log(err)})
}else{
this.userUpdate() //if no new image has to be inserted, it proceeds to update the user information
}
},
deleteOldImage(){
if(this.$auth.user.image){axios.delete('/db/userimage', {data: {delimage: this.$auth.user.image}} )}
console.log(this.$auth.user.image + ' deleted')
this.userUpdate() // it has deleted the old image so it proceeds to update the user information
},
userUpdate(){
axios.put(
'/db/user', {
id: this.id,
name: this.formName,
surname: this.formSurname,
email: this.formEmail,
password: this.formPassword,
image: this.formImageFilename.name,
})
.then(() => { console.log('User updated'); this.userReload()}) // reloads the updated user information
.catch(err => {console.log(err)} )
},
userReload(){
console.log('User reloading..')
this.$auth.fetchUser()
.then(() => { console.log('User reloaded')})
.catch(err => {console.log(err)} )
},
}
...
The problem happens after "console.log('User reloading..')" and before "console.log('User reloaded')"; it is not related to the file upload or the server response. I broke a single function up into many little ones just to check the progression and its asynchronous dynamics, but the only part that is not manageable is the webpack hot reload. :/
I'd like the users to upload their images and see their logo in the navbar update after submitting the form.
First of all, as somebody told you in the comments, webpack HMR shouldn't be used for production.
In Nuxt, everything that you reference from the assets folder is optimized and bundled into the project package. The ideal use case for this folder is assets that can be packaged and optimized and that most likely won't change, like fonts, CSS, background images, icons, etc.
Then, require is resolved only once, by webpack, when it builds the site either for local development or for generating a production package. The problem in your case is that you delete the original file while in development, so webpack tries to read it and fails.
For these images that the user uploads, I think you should use the static folder instead, and rather than using require, change the :src to
:src="'/images/users/' + this.$auth.user.image"
Let me know if this helps.
Okay, I probably solved it.
HMR: you are of course right, thank you for pointing that out. I am sorry, I am a beginner and I try to understand stuff along the way.
Aldarund, thank you, your idea of not changing the path and caching it client side... I am too much of a noob to understand how I could implement it :), but it gave me a good hint: the solution was to keep the image name as the user id plus the '.png' extension and to process the image with jimp, so that the image name, extension and file type are always the same. With or without webpack compiling the new path, I always have a correct require().
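For what it's worth, a rough sketch of that normalisation step as I understand it (the upload handler and paths are illustrative, not my actual code):
const Jimp = require('jimp');
// always write the same name and format per user,
// so the require() path in the navbar never changes
async function saveUserImage(tmpPath, userId) {
    const image = await Jimp.read(tmpPath); // read whatever the user uploaded
    await image.writeAsync(`assets/images/users/${userId}.png`); // normalise name and format
}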
Jair, thank you for the help. I didn't follow that road, but I will keep it as a fallback in case my approach creates errors. Just to be specific: the error comes when it does not find, and asks for the name of, the NEW image, not the OLD one, as I wrote in my question: it happens because the fetchUser() function reloads the user information, including the new image name.
Do you guys see any future problems in my methodology?
Really, thank you for your answers. I am learning alone and it's great to receive support.