My site is working well with Google webfonts UNTIL the user hits the SSL portion of the site.
At that point, Chrome throws its partial-encryption (mixed content) warning, and my Cufon menu loses its kerning.
I'm including my webfont with this CSS:
@font-face {
  font-family: 'Lusitana';
  src: local('Lusitana'), url(https://themes.googleusercontent.com/static/fonts/lusitana/v1/tAIvAkRzqMJf8Y4fM1R7PXYhjbSpvc47ee6xR_80Hnw.woff) format('woff');
}
My js console then gives me this error:
[blocked] The page at https://domain.com/ecommerce.php ran insecure content from
http://fonts.googleapis.com/css?family=Lusitana:regular,700&subset=latin.
Any ideas how I can get google fonts to force SSL?
Have you tried replacing the protocol (http:// or https://) with just // in the URL? The request should then use the page's protocol automatically.
locate this line on your HTML page (or template):
<link href='http://fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
and change it to this:
<link href='//fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
This simple change makes the browser request the Google Fonts stylesheet over whichever protocol the page itself uses (HTTP or HTTPS).
Enjoy!
To load Google Fonts so they work in both non-secure and SSL mode, try the following in your page header (and remove the https:// URL you currently have inside the CSS):
<script type="text/javascript">
  WebFontConfig = { google: { families: [ 'Droid+Serif::latin' ] } };
  (function() {
    var wf = document.createElement('script');
    wf.src = ('https:' == document.location.protocol ? 'https' : 'http') +
      '://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js';
    wf.type = 'text/javascript';
    wf.async = true;
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(wf, s);
  })();
</script>
In my example, I'm using Droid Serif font, so swap that with yours.
You can read more on this here.
I also had this error caused by a theme in WordPress. It caused slow page loading and the following error reported by the development console:
Mixed Content: The page at 'https://xxxxxxx.co.uk/' was loaded over HTTPS, but requested an insecure stylesheet 'http://fonts.googleapis.com/css?
family=Droid+Serif%3A400%2C700%2C400italic%2C700italic&ver=5.4.1'. This
request has been blocked; the content must be served over HTTPS.
The culprit was a WordPress theme called "Fresh and Clean". It inherits code written in 2014 that contains 'pre-SSL' coding practices.
To resolve the problem, all you need to do is make changes inside the following file in the theme:
/wp-content/themes/wpex-freshandclean/functions/scripts.php
Look inside for any occurrences of http:// and change each one to https://.
My Vue component has a photo block and an "Edit" button.
<template>
  <div>
    <tui-image-editor ref="editor"></tui-image-editor>
    <div class="">
      <img :src="img">
      <button @click="edit()">Edit</button>
    </div>
  </div>
</template>
<script>
export default {
  data() {
    return {
      img: "cdn.domain.shop/eaa49b02e350627622904290a83599d6.png",
    };
  },
  methods: {
    edit() {
      this.$refs.editor.invoke("loadImageFromURL", this.img, "Editable image");
    },
  },
};
</script>
For photo editing I use the TUI Image Editor. In the click handler I pass the URL to the editor via the loadImageFromURL function.
When I click the "Edit" button in Chrome on Windows, I get an error:
Access to image at
'cdn.domain.shop/eaa49b02e350627622904290a83599d6.png' from origin
'example.org' has been blocked by CORS policy: No
'Access-Control-Allow-Origin' header is present on the requested
resource.
But when I do the same thing in Chrome on Ubuntu, everything works fine.
What am I doing wrong?
Just add a random string to the URL:
this.$refs.editor.invoke("loadImageFromURL", this.img + '?' + Math.random(), "Editable image");
The error was due to caching in the browser.
After that, you have to make sure that every URL you request from Chrome and Safari uses http:// instead of https://; HTTPS retrieval did not work in those browsers at all for me. Since some resources allow both http and https requests, I solved it with a small regular expression that replaced our https URL string with http.
What's the quick solution?
Add the crossorigin="anonymous" attribute to the <img> tag that displays the image before opening it in the editor, i.e.:
<img src="targetUri" crossorigin="anonymous" />
Explain the issue and solution
The main issue is related to caching and to how the browser sends the Origin header.
First, you have to know that by default the browser does not send the Origin header when it loads an image from an <img> tag that lacks the crossorigin="anonymous" attribute.
More info
What's happening is that the browser tries to load the image from the <img> tag before the image editor is opened, and then puts it into its cache.
So when you open the editor, it tries to load the image a second time and you actually get the cached response from the first request, which was made without the Origin header. Without that header, the cached response does not contain the Access-Control-* headers necessary to pass the CORS check, and that's why you get the error.
You can check this by opening Chrome's inspector with "Disable cache" checked; it should then work.
The previous posts that suggested adding a ?t=<random_number> parameter work by bypassing the browser cache, but that solution is not possible when using pre-signed URLs.
So adding crossorigin="anonymous" in the img tag should solve the problem.
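To make that concrete, here is a minimal sketch of the component from the question with the attribute applied (nothing new beyond the crossorigin attribute; the CDN still has to respond with an Access-Control-Allow-Origin header for the request to succeed):
<template>
  <div>
    <tui-image-editor ref="editor"></tui-image-editor>
    <div>
      <!-- crossorigin="anonymous" makes the browser send the Origin header,
           so the cached response also carries the CORS headers -->
      <img :src="img" crossorigin="anonymous">
      <button @click="edit()">Edit</button>
    </div>
  </div>
</template>

<script>
export default {
  data() {
    return {
      img: "cdn.domain.shop/eaa49b02e350627622904290a83599d6.png",
    };
  },
  methods: {
    edit() {
      // same call as before; the image now comes from a CORS-enabled cache entry
      this.$refs.editor.invoke("loadImageFromURL", this.img, "Editable image");
    },
  },
};
</script>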
I created an app with vue-cli and then built the dist folder for production.
The app is deployed on IIS with a Flask backend and works fine.
The problem occurs when I have to make some changes and redo the deployment. After this, users call me because the app doesn't work, but if I clear the Chrome cache, the app works fine again.
How can I fix this problem? Is there a method to clear chrome cache automatically when I release a new application version?
Thanks
My dist folder:
Deployment: copy and paste the dist folder onto IIS.
If the files in the dist folder are correct, maybe the problem is in the axios cache? I have also made some changes to the REST APIs.
I had the same problem and changing (incrementing) the version number in package.json before running the build command fixed it.
For example by default the version number is set to "0.1.0"
package.json file:
{
  "name": "project-name",
  "version": "0.1.1",
  "private": true,
  ...
}
If you use vue-cli, it has built-in webpack configs for building dist, and in fact it adds hashed names to the output files by default.
But if that was removed somehow, you can add it back to the webpack config like this:
output: {
  filename: '[name].[hash].bundle.js'
}
Your built files will then carry the hash in their names. Even better, you do not need to handle how all this gets added to the HTML, because webpack will figure it out for you.
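If the project was created with Vue CLI 3 or later, you normally don't edit the raw webpack config directly; a minimal vue.config.js sketch could look like the following (filenameHashing and configureWebpack are standard Vue CLI options, the exact filename pattern is just an example):
// vue.config.js — minimal sketch for a Vue CLI project
module.exports = {
  // Vue CLI hashes generated file names by default; make sure it stays on
  filenameHashing: true,
  // only needed if you want to control the pattern yourself
  configureWebpack: {
    output: {
      filename: 'js/[name].[contenthash:8].js',
      chunkFilename: 'js/[name].[contenthash:8].js',
    },
  },
};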
You need to add a version query to your js file. This is how a browser can know if the file has changed and needs to download the new version.
So something like:
<script src="main.js?v=1.1"></script>
<script src="main.js?v=1.2"></script>
etc...
Assuming this has nothing to do with a service worker/PWA, a solution can be implemented by having the backend return the front-end version.
axiosConfig.js
axios.interceptors.response.use(
  (resp) => {
    let fe_version = resp.headers['fe-version'] || 'default'
    if (fe_version !== localStorage.getItem('fe-version') && resp.config.method == 'get') {
      localStorage.setItem('fe-version', fe_version)
      window.location.reload() // For new version, simply reload on any get
    }
    return Promise.resolve(resp)
  },
)
You can base the fe-version on any sort of unique value; here I have used the commit SHA.
Full Article here: https://blog.francium.tech/vue-js-cache-not-getting-cleared-in-production-on-deploy-656fcc5a85fe
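The question's backend is Flask, but to illustrate the server half of this approach, here is a rough Express-style sketch; the FE_VERSION environment variable (e.g. a commit SHA set at deploy time) is an assumption:
// sketch: expose the front-end build version on every response
const express = require('express');
const app = express();

app.use((req, res, next) => {
  // 'fe-version' matches the header name read by the axios interceptor above
  res.set('fe-version', process.env.FE_VERSION || 'default');
  next();
});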
You can't access the browser's cache; that would be a huge security flaw.
To fix it, you must send some headers with your Flask responses telling the browser not to cache your app.
This is an example for express.js for you to get the idea:
const express = require('express')
const app = express()

app.use(express.static('dist', {
  setHeaders: function (res, path, stat) {
    res.set('Cache-Control', 'no-cache, no-store, must-revalidate') // HTTP 1.1
    res.set('Pragma', 'no-cache') // HTTP 1.0
    res.set('Expires', '0') // Proxies
  }
}))
You can read a lot more about caching in here.
This is an older post, but since I could not find the solution to this problem online, I'll just post it here in case someone else finds it useful.
I added the hash to the application chunk files via the webpack.mix.js file by adding:
mix.webpackConfig({
  output: {
    chunkFilename: 'js/[name].js?id=[chunkhash]',
  },
})
This adds a fingerprint to the actual chunks and not just the app.js file. You can add a version name to the app.js file as well by adding version(['public/js/app.js']); at the end of the file, or by adding filename: '[name].js?[hash]' to the output block.
My complete webpack.mix.js:
const mix = require('laravel-mix');

mix.webpackConfig({
  output: {
    chunkFilename: 'js/main/[name].js?id=[chunkhash]',
  }
}).js('resources/js/app.js', 'public/js').vue()
  .postCss('resources/css/app.css', 'public/css', [
    //
  ]).version(['public/js/app.js']);
In my laravel blade file I used
<script src="{{ mix('js/app.js') }}"></script>
to load the app.js file with the correct version fingerprint.
The answer for me was caching at my DNS provider level.
Basically, I'm using the Cloudflare DNS proxy, and it was caching the website, so in development mode I was not getting the code updates.
I had to clear the cache many times and wait a significant period of time before anything updated.
Once I turned the proxy off, the problem stopped.
The method I want to suggest:
<script src="{{ asset('js/app.js?time=') }}{{ time() }}" defer></script>
Add the script below in public/index.html:
<head>
  ...
  <script type="text/javascript" language="javascript">
    var timestamp = (new Date()).getTime();
    var script = document.createElement("script");
    script.type = "text/javascript";
    script.src = "<%= BASE_URL %>sample.js?t=" + timestamp;
    document.head.appendChild(script);
  </script>
  ...
</head>
Could you try:
vm.$forceUpdate();
Also, it's possible that the component itself needs a unique key:
<my-component :key="unique" />
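For example, a small sketch where appVersion is a hypothetical data property; changing its value makes Vue destroy and re-create the child component instead of patching it in place:
<template>
  <my-component :key="appVersion" />
</template>

<script>
export default {
  data() {
    return {
      // bump this (e.g. from a fetched version string) to force a re-render
      appVersion: '1.0.0',
    };
  },
};
</script>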
For example, this is the source of an HTML page, example.html:
<html>
<head></head>
<body>
  <script>
    for (let i = 0; i < 10; ++i) {
    }
    window.location = 'http://example.com';
  </script>
</body>
</html>
I execute this command to open the example.html page with CasperJS:
casper.start('example.html')
The result was not what I expected: CasperJS does not get redirected to http://example.com.
But if I edit the example.html page above and replace, in the for loop,
let i = 0 ~~> var i = 0
then everything works: CasperJS is redirected to http://example.com.
How can I fix this for every website?
CasperJS is based on PhantomJS, which is very outdated in terms of JavaScript capabilities and does not support ES6 at all (let is an ES6 feature), and it is not possible to polyfill a keyword.
Since PhantomJS development is suspended, it is strongly advised to move to another tool. I'd suggest Puppeteer, which is actively developed by Google and has great community support.
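If you do switch, a rough Puppeteer equivalent of the snippet above could look like this (the file path and the wait strategy are assumptions and may need tuning):
// rough Puppeteer equivalent of casper.start('example.html')
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Puppeteer drives an up-to-date Chromium, so the ES6 `for (let ...)` loop
  // runs fine; 'networkidle0' waits until the client-side redirect has settled.
  await page.goto('file:///path/to/example.html', { waitUntil: 'networkidle0' });

  console.log(page.url()); // expected: http://example.com/
  await browser.close();
})();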
The code below handles GET request and will display the login page:
const routes = [
  {
    path: '/',
    name: 'root',
    component: Login
  }
]
I can display data using the router with the GET method in Vue.js. Now I want to accept POST requests from an external website. Is that possible? If it is, how should I do it, and if not, is there an alternative solution?
No, that's not possible. Your routes are not even GET requests. You can intercept any request within your own application, but you can't listen for external requests; that's what HTTP servers are for.
No, a client app in a browser cannot accept requests from other websites/services, no matter which HTTP method is going to be used.
If for some reason you want your Vue-based application to be accessible remotely, you can consider using SSR.
Pass data via the webpage holding the Vue instance
Yes, you can, with a trick!
What is the first base file that loads your Vue.js app? Is it an HTML file like index.html?
Try loading your Vue.js app from a PHP file like index.php instead, and at the top of that PHP file write this:
<!DOCTYPE html>
<html>
<head>
  ...
  <script>
    window.postedData = '<?php echo json_encode($_POST)?>';
  </script>
  ...
</head>
Now you can use the postedData variable everywhere in your Vue.js code:
mounted() {
  const arr = JSON.parse(postedData)
  console.log(arr)
  ...
}
I just switched my site over to SSL and all of my social sharing button counts have reset to zero, which is expected, but apparently it's possible to tell those buttons to use the old http urls in order to bring back the old counts.
I just can't figure out how to do it for my setup, which is AddThis for the buttons and Php/Html for the code (Joomla actually, but that may be irrelevant).
The AddThis code is simple:
<div class="addthis_sharing_toolbox" data-url="THE URL"></div>
So my best guess is that I need to take the current URL, change it from https to http, and plug it into the above 'data-url'.
But looking at other threads here, there seems to be a lot of controversy about how to securely and correctly get the current URL, so that's where I'm getting stuck.
(And then on top of that, I'll need to make this switch only for past articles, not new ones, but that's another story.)
Any ideas?
Thanks very much,
Phil
The share counts are based on the exact URL, and unfortunately the APIs of the sharing services (Facebook, Pinterest, etc.) treat the two protocols as distinct URLs.
The only thing you can do to prevent losing the share counts on existing URLs is to override the shared URL to be the old HTTP URL.
Then you'd need to set up a 301 redirect on your site, so that a visitor who clicks the old URL from a shared link on Facebook (or any other service) is sent to the new HTTPS URL.
It looks like you already found the instructions for changing the URL that's shared (http://www.addthis.com/academy/setting-the-url-title-to-share/), so you would just set the data-url attribute to be the old (HTTP) URL.
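For example, with a hypothetical article URL (the point being that data-url keeps the original http:// form even though the page itself is now served over HTTPS):
<div class="addthis_sharing_toolbox" data-url="http://www.example.com/old-article.html"></div>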
Took me all day, but I finally figured this out! This gave me a lot of the answer, but I still had trouble tweaking it for AddThis.
Here's the code (the first line applies the fix only to articles published before Aug 1, 2016, because I don't need to make the change for newer articles):
<?php if (strtotime($this->item->publish_up) < 1470009600) : ?>
<script type="text/javascript">
  function buttons(){
    var kCanonical = document.querySelector("link[rel='canonical']").href;
    window.kCompositeSlug = kCanonical.replace('https://','http://');
    return;
  }
  buttons();
  var addthis_share = { url: '' + kCompositeSlug + '' };
</script>
<?php endif; ?>
<script type="text/javascript" src="//s7.addthis.com/js/300/addthis_widget.js#pubid=ID-GOES-HERE" async="async"></script>
Here is my code that worked on a client's Joomla / K2 site; this is part of my item template override:
<?php if (strtotime($this->item->publish_up) < 1503201600) : ?>
<!-- Non SSL Command for buttons here -->
<div class='shareaholic-canvas' data-app='share_buttons' data-app-id='YOURAPPIDHERE' data-link='http://www.yoursite.com<?php echo $this->item->link; ?>'></div>
<?php else: ?>
<!-- Regular SSL Command for buttons here -->
<div class='shareaholic-canvas' data-app='share_buttons' data-app-id='YOURAPPIDHERE'></div>
<?php endif; ?>
Details / analysis of solution here:
https://www.covingtoncreations.com/blog/solution-for-lost-share-count-after-moving-to-ssl-https
Does this work on a Joomla website which uses ShareThis rather than AddThis?
Currently I have the following code in the page:
<script type="text/javascript">var switchTo5x=true;</script>
<script type="text/javascript" src="https://ws.sharethis.com/button/buttons.js"></script>
<script type="text/javascript">stLight.options({publisher: "XXXXXXXXXXXX", doNotHash: false, doNotCopy: false, hashAddressBar: false});</script>
<meta property="fb:app_id" content="XXXXXXXXXXXX"/>
<div id="fb-root"></div>
<script>(function(d, s, id) {
var js, fjs = d.getElementsByTagName(s)[0];
if (d.getElementById(id)) return;
js = d.createElement(s); js.id = id;
js.src = "//connect.facebook.net/en_US/sdk.js#xfbml=1&appId=XXXXXXXXXXXX=v2.0";
fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));</script>