Access to image has been blocked by CORS only on Windows - vue.js

The Vue component has a photo block and an "Edit" button.
<template>
<div>
<tui-image-editor ref="editor" > </tui-image-editor>
<div class="">
<img :src="img">
<button @click="edit()">Edit</button>
</div>
</div>
</template>
<script>
export default {
data() {
return {
img: "cdn.domain.shop/eaa49b02e350627622904290a83599d6.png",
};
},
methods: {
edit() {
this.$refs.editor.invoke("loadImageFromURL", this.img, "Editable image");
},
},
};
</script>
I use TUI Image Editor as the photo editor. In the click handler, I pass the URL to the editor via the loadImageFromURL function.
When I click the "Edit" button in Chrome on Windows, I get this error:
Access to image at
'cdn.domain.shop/eaa49b02e350627622904290a83599d6.png' from origin
'example.org' has been blocked by CORS policy: No
'Access-Control-Allow-Origin' header is present on the requested
resource.
But when I do the same thing in Chrome on Ubuntu, everything works fine.
What am I doing wrong?

Just add a random string to the URL:
this.$refs.editor.invoke("loadImageFromURL", this.img + '?' + Math.random(), "Editable image");
The error was due to caching in the browser.

After that, you have to make sure that every URL you request from Chrome and Safari uses http:// instead of https://; HTTPS retrieval would not work in these browsers at all.
Some hosts allow both http and https requests, so I solved it with a small regular expression that replaced our https URL string with http.
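For example, a minimal sketch of that kind of replacement (the exact expression here is an assumption, not the one from my project):
// force plain http for a host that serves the image over both protocols
const httpUrl = this.img.replace(/^https:\/\//, 'http://');
this.$refs.editor.invoke("loadImageFromURL", httpUrl, "Editable image");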

What's the quick solution?
Add the attribute crossorigin="anonymous" to the <img> tag that displays the image before opening it in the editor.
i.e.: <img src="targetUri" crossorigin="anonymous" />
Explain the issue and solution
The main issue is related to caching and to how the browser sends the Origin header.
First, you have to know that by default the browser does not send the Origin header when it loads an image through an <img> tag that does not have the crossorigin="anonymous" attribute.
More info
What's happening is that the browser loads the image from the <img> tag before the image editor is opened, and then puts it into its cache.
So when you open the editor, it tries to load the image a second time, and you actually get the cached response from the first request, which was made without the Origin header. Without that header, the cached response does not contain the Access-Control-* headers necessary to pass the CORS check, which is why you get the error.
You can check this by opening Chrome's inspector with "Disable cache" checked; it should then work.
The previous posts that suggested adding a ?t=<random_number> parameter had the effect of bypassing the browser cache, but that solution is not possible when using pre-signed URLs.
So adding crossorigin="anonymous" to the img tag should solve the problem.
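Applied to the component from the question, a minimal sketch of the fix (only the crossorigin attribute is new; everything else is unchanged):
<!-- the image is now fetched with CORS enabled, so the cached response carries the CORS headers -->
<img :src="img" crossorigin="anonymous">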

How to redirect to router.base URL in NuxtJS

Let's say my nuxt app is running in a subfolder 'test'.
nuxt.config.js:
router: {
base: '/test/'
}
That means my application runs on localhost:3000/test
Now, when I go to localhost:3000/tes, all I get is a 404 error with the content Cannot GET /tes.
However, I want to redirect to /test, or show my own 404 page. But I couldn't figure out a way to handle that case.
I tried using a middleware, but that only worked for links within the subfolder.
Thanks for your help!
I found what I needed in the nuxt documentation:
Using a Hook to router.base when not on root:
https://nuxtjs.org/docs/configuration-glossary/configuration-hooks#redirect-to-routerbase-when-not-on-root
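Condensed from that documentation page, a minimal sketch of the idea (the middleware body below is my own simplification, assuming the base from the question, '/test/'):
// nuxt.config.js
export default {
  router: {
    base: '/test/'
  },
  hooks: {
    render: {
      // runs before Nuxt adds its own middleware; app is the underlying connect instance
      setupMiddleware(app) {
        app.use('/', (req, res, next) => {
          // redirect anything requested outside the base back to /test/
          if (!req.url.startsWith('/test')) {
            res.writeHead(302, { Location: '/test/' })
            return res.end()
          }
          next()
        })
      }
    }
  }
}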
You can create an error layout for the 404 page
Error page
Or you can use middleware to check the incoming URL and redirect it somewhere.
Middleware
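As a rough illustration of the middleware idea (the file name and logic are assumptions, and, as the question notes, route middleware only runs for URLs the Nuxt app itself serves under its base):
// middleware/redirect.js — enable it globally via router: { middleware: 'redirect' } in nuxt.config.js
export default function ({ route, redirect }) {
  // send any route the app doesn't recognise back to the root of the base
  if (!route.matched.length) {
    return redirect('/')
  }
}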
You can create your own error layout (error.vue in the layouts folder) and it will be shown whenever an error occurs:
<template>
<div>
<h1 v-if="error.statusCode === 404">Page not found</h1>
<h1 v-else>An error occurred</h1>
<NuxtLink to="/">Home page</NuxtLink>
</div>
</template>
<script>
export default {
props: ['error'],
layout: 'error' // you can set a custom layout for the error page
}
</script>
A link showing how to do that in practice: Medium

sending POST request to express route - after receiving form data, res.render is not triggered

I'm trying to create a simple app where a picture gets uploaded and drawn on an HTML canvas so that I can do some simple pixel manipulation.
Right now, the GET handler for the root route renders an EJS template with a FileReader and a canvas.
With code attached at the bottom of the EJS file through script tags, I draw the uploaded image onto the canvas so I can read each pixel's rgb values.
I then tried to send those rgb values to the POST route in the app (through fetch), but it's not working as expected.
app.post("/", (req, res)=>{
console.log("inside post");
console.log(req.body);
res.render("test", {result: req.body});
console.log("after res.render");
});
All three of the console logs print correctly in the terminal, including the request body, but the test template is not being rendered. It just stays on the same "index" view the app launches with.
Can someone give me some insight into why this is happening? I also included console logs inside the script tags in the EJS template, and those only show up in the browser, not in the terminal I launch the Express app from. How can I render the view inside the POST method?
First
If you use AJAX, like the Fetch API or XHR, the browser will not render the test page.
That's because it's asynchronous; see the AJAX entry in the MDN web docs.
You need to use a form post, with code like the following:
<form action="/" method="post">
<button type="submit">go to another page</button>
</form>
But if you use a form post, the page you are on (which might be "index.ejs") will be replaced with "test.ejs".
In other words:
The browser uses the response from the form's POST request to load a new page.
But the browser passes an AJAX request's response to a callback and triggers that callback in JS.
The browser handles these two request types (form POST and AJAX POST) in different ways.
What they have in common is that both send data to the server.
So, in your case, res.render is triggered successfully.
Let me show you an example. Here is my server code.
const express = require('express')
const app = express()
app.set("view engine", "ejs")
app.get("/", (req, res, next) => {
res.render("test")
})
app.post("/test", (req, res, next) => {
res.render("other-test")
})
app.listen(3000)
<!-- test.ejs -->
<h1>this is test pages.</h1>
<!-- other-test.ejs -->
<h1>this is other test pages.</h1>
When I open http://localhost:3000, the browser shows me the test page.
Then I open the console in Chrome and type the following code:
fetch("/test", {
method: 'POST', // or 'PUT'
body: JSON.stringify({}), // data can be `string` or {object}!
}).then(res => {console.log("trigger response")})
Then, in the Network tab in Chrome, you will see the request.
This request triggers the Express route handler.
But what is the response?
Well, it's HTML. That means res.render("other-test") was triggered correctly.
And you will find that the console shows "trigger response", because the callback in my fetch was triggered.
But the page still stays on "test.ejs".
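If you want to see that rendered HTML yourself, a small variation of the same fetch just reads the response body:
fetch("/test", {
  method: 'POST',
  body: JSON.stringify({})
}).then(res => res.text())
  .then(html => console.log(html)); // logs the markup produced by res.render("other-test")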
Next, I add the following code to my test.ejs:
<form action="/test" method="post">
<button type="submit">Go to other page</button>
</form>
The page will then show a "Go to other page" button.
After you click it, you will find that the browser shows you the "other-test" content.
That's the difference between a form POST and an AJAX POST.
Second
You put script tags into your EJS template.
Express uses the EJS engine to render the EJS template into an HTML page.
Once it is an HTML page, all of those scripts run in the browser, not in your Node.js terminal.
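So, to answer the last part of the question: if you keep using fetch to send the pixel data, the HTML produced by res.render only comes back as the response body, and you have to decide what to do with it yourself. A hedged sketch of one option (the payload shape is just a placeholder), swapping the returned markup into the current page:
fetch("/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ pixels: [] }) // placeholder for the rgb values
})
  .then(res => res.text())
  .then(html => {
    // replace the current document with the markup rendered by res.render("test", ...)
    document.open();
    document.write(html);
    document.close();
  });
The other option is the form post shown above, where the browser itself navigates to the rendered page.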

Show a loader while a Meteor CollectionFS and S3 image downloads?

Is there any function/hook for showing a loader while an image downloads from Amazon S3 (or any image from anywhere, for that matter)? I'm not currently using any CDN or CloudFront, so my downloads can sometimes be slow. I'd like to show a loader while the image is downloading. In my code I have:
{{#if uploadedCustomLogo}}
{{#with customLogo}}
{{#if isUploaded}}
<div class="img-wrapper">
<img src="{{this.url store='logos'}}" alt=""/>
</div>
{{else}}
{{> loading}}
{{/if}}
{{/with}}
{{/if}}
The issue is that the {{> loading}} loader template runs fine, but it only lasts a fraction of a second because the actual upload is really fast. It's the download that can then take several seconds (sometimes up to twenty or so, even for a small image). Is there any way to test/check whether an image has been downloaded?
I used the Firefox inspector to see whether there was a delay in the src getting set on the img tag, but it gets set immediately. So the wait is really on S3... nothing changes in the DOM once it finally loads.
I'm using CollectionFS and the S3 adapter (Meteor-cfs-s3).
I figured it out. I was searching for the wrong thing on Google. The question is really how to use jQuery to listen for when an image has loaded. So you can just add your loader in your template, then hide it once the load event fires on the image. This simple code works great in Meteor:
Template.myTemplate.events({
  'load #whateverImage': function (event) {
    event.preventDefault();
    // Hide your loader DIV (for example)
    hideLoader();
  },
});
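One caveat, as a hedged aside: if the image is already in the browser cache, the load event can fire before the handler is attached, so checking the element's complete property is a safer pattern (the selector and hideLoader are the same placeholders as above):
// plain-DOM sketch; run once the template has rendered
var img = document.querySelector('#whateverImage');
if (img && img.complete) {
  hideLoader(); // image was already cached and decoded
} else if (img) {
  img.addEventListener('load', hideLoader);
}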
The solution is to use the hasStored helper. I also had a problem figuring this one out: isUploaded only tells you the file reached the Meteor server, but if you want to wait for it to be uploaded to Amazon S3, use {{#if this.hasStored 'thumbs'}}.
edit:
Here are my custom helpers to check whether the file is uploaded & stored to Amazon S3 (so I can safely build an Amazon S3 URL and display it to the user).
In my case they look like this:
Template.myTemplate.helpers({ // template name is a placeholder
  uploadDoc: function () {
    var fileId = Template.instance().posterData.get('fileId'); // You can ignore this, it's just how I get the file document id.
    if (fileId)
      return Images.findOne(fileId); // We need the file document
  },
  isUploadedAndStored: function (storage) {
    if (this && this.copies && this.copies[storage])
      return true;
  },
  sUrl: function () {
    if (this && this.copies && this.copies.thumbs)
      return 'https://s3-us-west-2.amazonaws.com/NAME/' + this.copies.thumbs.key;
  }
});
using it like this:
{{#with uploadDoc}}
{{#if isUploadedAndStored 'thumbs'}}
<img class="attachment-thumbnail" src="{{sUrl}}">
{{else}}
{{>loading}}
{{/if}}
{{/with}}
How does it work? When we subscribe to the uploaded file collection, the document will not have copies at first; once the copies come back from the server, it means the file is actually saved on Amazon S3. The built-in this.hasStored does a similar check, but I found it re-runs too many times; maybe I need to report that on GitHub so they can fix it.

PhantomJS rewriting URL in backgroundImage property to local file system

I am using PhantomJS to do some rewriting of HTML. I'm adding a background-image property to an element. But when I write out the resulting DOM, the URL has been rewritten to a local URL. I've boiled this down to the following test case:
JS
var page = require('webpage').create();
page.open("test.html",function(){
setTimeout(function(){
page.evaluate(function(){
document.getElementById("test").style.backgroundImage="url(test.png)";
});
console.log(page.content);
phantom.exit();
},1000);
});
HTML
<html>
<body>
<div id="test"></div>
</body>
</html>
Output
$ phantomjs test.js
<html><head></head><body>
<div id="test" style="background-image: url(file:///C:/cygwin/tmp/test.png); ">
</div>
</body></html>
UPDATE
The problem remains if you specify ./test.png or //test.png. However, http://example.com/test.png is left unchanged, as might be expected.
If this HTML document is opened in Chrome and the background-image property is added to the div element in the style inspector, the URL is unmodified, whether the document is inspected in the Elements tab in DevTools, via document.body.innerHTML in the console, or by copying the HTML.
UPDATE 2
I just found out that if the document is loaded in Chrome, and the command elt.style.backgroundImage="url(test.png)"; is issued in the console, then the URL is rewritten. So at the end of the day it appears that this is not a PhantomJS issue, although I still don't understand the behavior.
Obviously, I don't want this URL to be rewritten in this fashion, and I don't understand why PhantomJS feels the need to do this. Ideas?

Google web fonts and SSL error

My site is working well with Google webfonts UNTIL the user hits the SSL portion of the site.
At that point, Chrome throws its partial-encryption (mixed content) warning, and my Cufón menu loses its kerning.
I'm including my web font with this CSS:
@font-face {
  src: local('Lusitana'), url(https://themes.googleusercontent.com/static/fonts/lusitana/v1/tAIvAkRzqMJf8Y4fM1R7PXYhjbSpvc47ee6xR_80Hnw.woff) format('woff');
}
My js console then gives me this error:
[blocked] The page at https://domain.com/ecommerce.php ran insecure content from
http://fonts.googleapis.com/css?family=Lusitana:regular,700&subset=latin.
Any ideas how I can get google fonts to force SSL?
Have you tried replacing https:// with // in the URL? The request should then use the correct protocol automatically.
Locate this line in your HTML page (or template):
<link href='http://fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
and change it to this:
<link href='//fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
This simple change makes the browser request the Google Fonts stylesheet using whichever protocol the page itself is served over (HTTP or HTTPS).
Enjoy!
To load Google fonts in a way that works in both non-secure and SSL mode, try the following in your page header (and remove what you currently have calling https:// inside the CSS):
<script type="text/javascript">
WebFontConfig = { google: { families: [ 'Droid+Serif::latin' ] } };
(function() {
var wf = document.createElement('script');
wf.src = ('https:' == document.location.protocol ? 'https' : 'http') +
'://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js';
wf.type = 'text/javascript';
wf.async = 'true';
var s = document.getElementsByTagName('script')[0];
s.parentNode.insertBefore(wf, s);
})();
</script>
In my example, I'm using Droid Serif font, so swap that with yours.
You can read more on this here.
I also had this error, caused by a WordPress theme. It led to slow page loading and the following error in the development console:
Mixed Content: The page at 'https://xxxxxxx.co.uk/' was loaded over HTTPS, but requested an insecure stylesheet 'http://fonts.googleapis.com/css?
family=Droid+Serif%3A400%2C700%2C400italic%2C700italic&ver=5.4.1'. This
request has been blocked; the content must be served over HTTPS.
The culprit was a WordPress theme called "Fresh and Clean". It inherits code written in 2014 that contains 'pre-SSL' coding practices.
To resolve the problem, all you need to do is make changes inside the following file in the theme:
/wp-content/themes/wpex-freshandclean/functions/scripts.php
Look inside for any occurrences of http:// and change each one to https://.