The error message from the console:
[Error] Unsafe attempt to load URL from origin . Domains, protocols and ports must match.
My VTT domain is different from my origin domain.
It seems Safari 14 on iOS and macOS is stricter than prior versions. I was able to fix the problem by adding a crossorigin attribute like so:
<video
:id="videoIdentifier"
crossorigin="anonymous"
/>
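Putting it together with the caption track, it looks roughly like this (a sketch; the vttUrl binding name is just a placeholder, not from the original post):
<video :id="videoIdentifier" crossorigin="anonymous">
  <!-- the cross-origin VTT is loaded through the track element -->
  <track :src="vttUrl" kind="captions" srclang="en" default />
</video>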
I initially tried Steve's solution. However, adding crossorigin="anonymous" created CORS issues when loading the videos, while the captions loaded fine.
I guess there are 2 solutions:
1- With crossorigin="anonymous", you now need to set up the appropriate CORS header on your server; otherwise it will create CORS issues (a rough sketch of the header follows the example below).
2- Or, if you cannot access the server, you can fetch the VTT yourself and serve it through a blob object URL instead:
<script>
fetch('https://bitflix-subs.herokuapp.com/sub.vtt')
  .then(response => {
    if (!response.ok)
      throw new Error('Network response was not ok')
    return response.blob()
  })
  .then(blob => {
    // Blob#type is read-only, so wrap the data in a new Blob with the VTT MIME type
    const vttBlob = new Blob([blob], { type: 'text/vtt' })
    const tmpUrl = URL.createObjectURL(vttBlob) // Use that URL in the track tag
  })
  .catch(error => {
    console.error('There has been a problem with your fetch operation:', error)
  });
</script>
And add the new URL to the track tag (the track element must be a child of the video element):
<video>
  <track src="tmpUrl">
</video>
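For option 1, here is a rough sketch of the header the subtitle server needs to send back; Express is used purely for illustration, and the origin value and folder name are placeholders:
const express = require('express')
const app = express()
// crossorigin="anonymous" makes the browser require this CORS header on the VTT responses
app.use(function (req, res, next) {
  res.set('Access-Control-Allow-Origin', 'https://your-video-site.example')
  next()
})
app.use(express.static('subs')) // serves the .vtt files
app.listen(3000)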
I created an app with vue-cli and then I built the dist folder for production.
The app is deployed on IIS with a Flask backend and works fine.
The problem occurs when I have to make some changes and redo the deployment. Afterwards, users call me because the app doesn't work, but if I clear the Chrome cache, the app works fine again.
How can I fix this problem? Is there a way to clear the Chrome cache automatically when I release a new application version?
Thanks
My dist folder:
Deployment: copy and paste the dist folder onto IIS.
If the files in the dist folder are correct, maybe the problem is in the axios cache? I have also made some changes to the REST APIs.
I had the same problem and changing (incrementing) the version number in package.json before running the build command fixed it.
For example, by default the version number is set to "0.1.0".
package.json file:
{
"name": "project-name",
"version": "0.1.1",
"private": true,
...
}
If you use vue-cli, it has built-in webpack configs for building dist, and in fact it adds hashed names to the output files.
But if that was removed somehow, you can add it back to the webpack config like this:
output: {
filename: '[name].[hash].bundle.js'
}
And your app will look like this:
What's more, you do not need to handle how all of this gets added to the HTML, because webpack will figure it out for you.
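For reference, with Vue CLI 3+ hashing is configured from vue.config.js rather than a raw webpack config. A minimal sketch (filenameHashing is a real Vue CLI option and is already true by default, so this only matters if it was switched off):
// vue.config.js
module.exports = {
  filenameHashing: true // emit dist files like app.<hash>.js so browsers fetch new builds
}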
You need to add a version query to your js file. This is how a browser can know if the file has changed and needs to download the new version.
So something like:
<script src="main.js?v=1.1"></script>
<script src="main.js?v=1.2"></script>
etc...
Assuming this has nothing to do with a service worker/PWA, the solution can be implemented by returning the front-end version with every response.
axiosConfig.js
import axios from 'axios'

axios.interceptors.response.use(
(resp) => {
let fe_version = resp.headers['fe-version'] || 'default'
if(fe_version !== localStorage.getItem('fe-version') && resp.config.method == 'get'){
localStorage.setItem('fe-version', fe_version)
window.location.reload() // For new version, simply reload on any get
}
return Promise.resolve(resp)
},
)
You can derive the fe-version from any sort of unique value; here I have used the commit SHA.
Full Article here: https://blog.francium.tech/vue-js-cache-not-getting-cleared-in-production-on-deploy-656fcc5a85fe
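For the interceptor above to see that header, the backend has to attach it to every response. A rough server-side sketch (Express purely for illustration; the environment variable name is an assumption):
const express = require('express')
const app = express()
// Expose the build/commit identifier on every response so the axios interceptor
// can compare it against the value kept in localStorage.
app.use(function (req, res, next) {
  res.set('fe-version', process.env.COMMIT_SHA || 'default')
  next()
})
app.listen(3000)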
You can't access the browser's cache; that would be a huge security flaw.
To fix it, you must send headers with your Flask responses telling the browser not to cache your app.
This is an example for express.js to give you the idea:
// For example as the setHeaders option of express.static (the folder name is illustrative):
app.use(express.static('dist', {
  setHeaders: function (res, path, stat) {
    res.set('Cache-Control', 'no-cache, no-store, must-revalidate') // HTTP 1.1
    res.set('Pragma', 'no-cache') // HTTP 1.0
    res.set('Expires', '0') // Proxies
  }
}))
You can read a lot more about caching here.
This is an older post, but since I could not find the solution for this problem online, I'll post it here in case someone else finds it useful.
I added a hash to the application chunk files via the webpack.mix.js file by adding:
mix.webpackConfig({
output: {
chunkFilename: 'js/[name].js?id=[chunkhash]',
},
})
This adds a fingerprint to the actual chunks and not just the app.js file. You can add a version fingerprint to the app.js file as well by adding version(['public/js/app.js']); at the end of the file, or by adding filename: '[name].js?[hash]' to the output block.
My complete webpack.mix.js:
const mix = require('laravel-mix');
mix.webpackConfig({
output: {
chunkFilename: 'js/main/[name].js?id=[chunkhash]',
}
}).js('resources/js/app.js', 'public/js').vue()
.postCss('resources/css/app.css', 'public/css', [
//
]).version(['public/js/app.js']);
In my Laravel Blade file I used
<script src="{{ mix('js/app.js') }}"></script>
to load the app.js file with the correct version fingerprint.
The answer for me was caching at the DNS provider level.
Basically, I'm using the Cloudflare DNS proxy and it was caching the website, so in development mode I was not getting the code updates.
I had to clear the cache many times to get anything to change, and even then had to wait a significant period of time before anything updated.
Turning the proxy off stopped the problem.
The method I want to suggest (Laravel Blade):
<script src="{{ asset('js/app.js?time=') }}{{ time() }}" defer></script>
Add the script below in public/index.html:
<head>
...
<script type="text/javascript" language="javascript">
var timestamp = (new Date()).getTime();
var script = document.createElement("script");
script.type = "text/javascript";
script.src = "<%= BASE_URL %>sample.js?t=" + timestamp;
document.head.appendChild(script);
</script>
...
</head>
Could you try:
vm.$forceUpdate();
Also, it's possible that the component itself needs a unique key:
<my-component :key="unique" />
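For example, a rough sketch of forcing a re-render by changing the key (the component and property names are just placeholders):
// Illustrative parent component: bumping `unique` makes Vue destroy and
// re-create <my-component :key="unique" /> instead of patching it in place.
export default {
  data() {
    return { unique: 0 }
  },
  methods: {
    refreshChild() {
      this.unique += 1
    }
  }
}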
I am trying to play back a video (currently hosted on S3 with public access) by creating a blob URL.
I have used Elastic Transcoder to encode the video, since it is supposed to move the MOOV atom to the top (beginning) of the file.
I am unable to get my code to work, but I also found a working example: link here
Here is my code:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
</head>
<body>
<video controls></video>
<script>
var video = document.querySelector('video');
var assetURL = 'https://ovation-blob-url-test.s3.amazonaws.com/AdobeStock_116640093_Video_WM_NEW.mp4';
// Need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
var mediaSource = new MediaSource;
//console.log(mediaSource.readyState); // closed
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
console.error('Unsupported MIME type or codec: ', mimeCodec);
}
function sourceOpen (_) {
//console.log(this.readyState); // open
var mediaSource = this;
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
fetchAB(assetURL, function (buf) {
sourceBuffer.addEventListener('updateend', function (_) {
mediaSource.endOfStream();
video.play();
//console.log(mediaSource.readyState); // ended
});
sourceBuffer.appendBuffer(buf);
});
};
function fetchAB (url, cb) {
console.log(url);
var xhr = new XMLHttpRequest;
xhr.open('get', url);
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
cb(xhr.response);
};
xhr.send();
};
</script>
</body>
</html>
What am I doing wrong? I looked at tools like MP4Box or qt-faststart, but they seem kind of old school. I would also be willing to change from MP4 to an M3U8 playlist, but then I don't know what MIME types to use.
At the end of the day I am trying to play back a video/stream and hide the URL (origin), potentially using a blob.
Thank you guys!
So, first, even though this code seems to be taken from the Mozilla documentation site, there are a few issues: you are not checking the readyState before calling endOfStream, so the error you get is valid; secondly, the play() call is blocked by the autoplay policy changes. If you add an error handler, you will actually see that the appendBuffer fails. Here is the updated snippet:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
</head>
<body>
<video controls></video>
<script>
var video = document.querySelector('video');
var assetURL = 'https://ovation-blob-url-test.s3.amazonaws.com/AdobeStock_116640093_Video_WM_NEW.mp4';
// Need to be specific for Blink regarding codecs
// ./mp4info frag_bunny.mp4 | grep Codec
var mimeCodec = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"';
if ('MediaSource' in window && MediaSource.isTypeSupported(mimeCodec)) {
var mediaSource = new MediaSource;
//console.log(mediaSource.readyState); // closed
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen);
} else {
console.error('Unsupported MIME type or codec: ', mimeCodec);
}
function sourceOpen (_) {
//console.log(this.readyState); // open
var mediaSource = this;
var sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
fetchAB(assetURL, function (buf) {
sourceBuffer.addEventListener('updateend', function (_) {
// console.log(mediaSource.readyState); // ended
if (mediaSource.readyState === "open") {
mediaSource.endOfStream();
video.play();
}
});
sourceBuffer.addEventListener('error', function (event) {
console.log('an error encountered while trying to append buffer');
});
sourceBuffer.appendBuffer(buf);
});
};
function fetchAB (url, cb) {
console.log(url);
var xhr = new XMLHttpRequest;
xhr.open('get', url);
xhr.responseType = 'arraybuffer';
xhr.onload = function () {
cb(xhr.response);
};
xhr.send();
};
</script>
</body>
</html>
So let's advance to the next issue: the actual error. Using chrome://media-internals/ we can see that the video actually fails to load due to incompatibility with the ISOBMFF format:
I am not familiar with Elastic Transcoder, but it seems that it is not producing an mp4 file suitable for live streaming. Also, when using MSE, putting the moov at the beginning is not enough; the video actually has to meet all of the ISOBMFF requirements - see chapters 3 and 4.
The working sample you mentioned is not a valid comparison since it uses the blob for the src, where the ISOBMFF rules do not apply. If it is fine for you to go that way, don't use MSE and put the blob directly in the src (a sketch follows below). If you need MSE, you have to mux it correctly.
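For illustration, a minimal sketch of the blob-in-src route (the URL is just a placeholder): fetch the whole file, wrap it in an object URL, and assign it straight to the video element; no ISOBMFF fragmentation is needed for this.
var video = document.querySelector('video');
fetch('https://example.com/video.mp4')
  .then(function (response) { return response.blob(); })
  .then(function (blob) {
    // The object URL hides the original location and plays like a regular file
    video.src = URL.createObjectURL(blob);
  });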
Ok, so I got the original code example to work by encoding my MP4 videos with ffmpeg:
ffmpeg -i input.mp4 -vf scale=1920:1080,setsar=1:1 -c:v libx264 -preset medium -c:a aac -movflags empty_moov+default_base_moof+frag_keyframe output.mp4 -hide_banner
The important part is: -movflags empty_moov+default_base_moof+frag_keyframe
This setup also scales the video to 1920x1080 (disregarding any aspect ratio of the input video).
However, based on the comments on the original post, I do believe there might be a more efficient way to generate the blob URL and ingest it into a video tag. This example was copied straight from https://developer.mozilla.org.
If anyone comes up with a better script (not over-engineered), please post it here.
Thank you @Rudolfs Bundulis for all your help!
I am using the blueimp fileupload basic plugin in my project. It all works well in Safari, Firefox, and Chrome, but there is a problem with Internet Explorer 9 and below:
The start callback gets called, and in the network tab of the developer tools I see the ajax call being executed. However, the file is never uploaded (I checked on the server, too) and the call eventually ends up in a 408 request timeout.
Any hints on what could be the reason?
Here are my relevant code parts:
<input class="input-file" id="fileupload" name="files[]" data-url="/app_dev.php/backend/ajax/upload/wish/1850cf918a43d42" type="file">
<script type="text/javascript" src="js/jquery/jquery-1.8.2.min.js"></script>
<script type="text/javascript" src="js/uploader/vendor/jquery.ui.widget.js"></script>
<script type="text/javascript" src="js/uploader/jquery.fileupload.js"></script>
<script type="text/javascript" src="js/uploader/jquery.iframe-transport.js"></script>
<script>
$(document).ready(function() {
$('#fileupload').fileupload({
dataType: 'json',
dropZone: null,
start: function (e, data){
console.log('start'); //fires in all browsers = fine
},
progress: function (e, data){
console.log('progress'); //fires in Safari, FF, Chrome = fine
},
done: function (e, data) {
console.log('done'); //never getting here in IE cause file doesn't get uploaded.
}
});
});
</script>
Problem fixed!
There were two issues. One had to do with local network settings.
The other was to implement the correct handling of content type negotiation. See https://github.com/blueimp/jQuery-File-Upload/wiki/Setup for more details.
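For anyone hitting the same wall, here is a rough sketch of what that negotiation can look like (Node/Express purely for illustration; the route and payload are assumptions). Old IE uploads through a hidden iframe, which cannot read application/json responses, so the server falls back to text/plain when JSON is not accepted:
var express = require('express')
var app = express()
app.post('/backend/ajax/upload', function (req, res) {
  var payload = JSON.stringify({ files: [] }) // fill with the uploaded file info
  var accepts = req.get('Accept') || ''
  res.type(accepts.indexOf('application/json') !== -1 ? 'application/json' : 'text/plain')
  res.send(payload)
})
app.listen(3000)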
Just my 5 ¢:
I had a very hard time trying to make it work with pretty links! The following dumps were totally empty!
var_dump($_FILES);
var_dump($_POST);
var_dump($_GET);
So:
$('#fileupload').fileupload({
url: 'http://code.dev/products/postUpload' // <--- remove trailing slash!!!
});
The following code works in Chrome (22.0) but not in Safari (6.0):
<!DOCTYPE html>
<html>
<head>
<script>
function onGo(e) {
var fr = new FileReader();
var file = document.getElementById("file").files[0];
fr.onload = function(e) {
var data = new Uint8Array(e.target.result);
var blob = new Blob([data], {type: 'audio/mpeg'});
var audio = document.createElement('audio');
audio.addEventListener('loadeddata', function(e) {
audio.play();
}, false);
audio.addEventListener('error', function(e) {
console.log('error!', e);
}, false);
audio.src = webkitURL.createObjectURL(blob);
};
fr.readAsArrayBuffer(file);
}
</script>
</head>
<body>
<input type="file" id="file" name="file" />
<input type="submit" id="go" onclick="onGo()" value="Go" />
</body>
</html>
In Safari, neither callback (loadeddata nor error) is called.
The content used is an mp3 file, which normally plays back fine with an audio tag.
Is there any special care needed for Safari?
Many years later, I believe the example in the OP should work just fine, as long as you somehow set the MIME type when creating the blob, like the OP does above with the type property of the options passed in:
new Blob([data], {type: 'audio/mpeg'});
You could also use a <source> element inside of an audio element and set the type attribute of the <source> element. I have an example of this here:
https://lastmjs.github.io/safari-object-url-test
And here is the code:
const response = await window.fetch('https://upload.wikimedia.org/wikipedia/commons/transcoded/a/ab/Alexander_Graham_Bell%27s_Voice.ogg/Alexander_Graham_Bell%27s_Voice.ogg.mp3');
const audioArrayBuffer = await response.arrayBuffer();
const audioBlob = new Blob([audioArrayBuffer]);
const audioObjectURL = window.URL.createObjectURL(audioBlob);
const audioElement = document.createElement('audio');
audioElement.setAttribute('controls', true);
document.body.appendChild(audioElement);
const sourceElement = document.createElement('source');
audioElement.appendChild(sourceElement);
sourceElement.src = audioObjectURL;
sourceElement.type = 'audio/mp3';
I prefer just setting the MIME type of the blob when creating it, since the <source> element's src attribute/property cannot be updated dynamically.
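For completeness, a sketch of the variant I prefer (same file as above, but with the MIME type passed to the Blob constructor so the object URL can go straight onto the audio element):
const response = await window.fetch('https://upload.wikimedia.org/wikipedia/commons/transcoded/a/ab/Alexander_Graham_Bell%27s_Voice.ogg/Alexander_Graham_Bell%27s_Voice.ogg.mp3');
const audioArrayBuffer = await response.arrayBuffer();
// Give the Blob its MIME type up front, so no <source> element is needed
const audioBlob = new Blob([audioArrayBuffer], { type: 'audio/mpeg' });
const audioElement = document.createElement('audio');
audioElement.setAttribute('controls', true);
audioElement.src = window.URL.createObjectURL(audioBlob);
document.body.appendChild(audioElement);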
I have the same problem, and I have already spent a couple of days troubleshooting it.
As pwray mentioned in this other post, Safari requires file extensions for media requests:
HTML5 Audio files fail to load in Safari
I tried saving my blob to a file named file.mp3 and Safari was able to load the audio that way, but after I renamed the file to have no extension (just "file"), it didn't load.
When I tried the URL created from the blob in another tab in Safari:
url = webkitURL.createObjectURL(blob);
it downloaded a file right away called "unknown", but when I tried the same thing in Chrome (also on Mac), it showed the content of the file in the browser (mp3 files start with ID3, then a bunch of non-readable characters).
I couldn't figure out how to force the URL made from the blob to have an extension, because it usually looks like this:
blob:https://example.com/a7e38943-559c-43ea-b6dd-6820b70ca1e2
so the end of it looks like a session variable.
This is where I got stuck and I would really like to see a solution from some smart people here.
Thanks,
Steven
Sometimes, HTML5 audio can just stop loading for no apparent reason.
If you take a look at the Media Events (http://www.w3schools.com/tags/ref_eventattributes.asp) you'll see an event attribute called "onstalled"; the definition is "Script to be run when the browser is unable to fetch the media data for whatever reason", and it seems it should be helpful for you.
Try listening for that event and reloading the file if necessary, with something like this (note that with addEventListener the event name is 'stalled', without the 'on' prefix):
audio.addEventListener('stalled', function(e) {
  audio.load();
}, false);
I hope it helps!
Just use a source tag inside the audio element.
<audio controls>
<source src="blob" type="blobType">
</audio>
My site works well with Google Web Fonts UNTIL the user hits the SSL portion of the site.
At that point, Chrome throws the partial encoding error, and my Cufon menu loses its kerning.
I'm including my webfont with this css:
@font-face {
  font-family: 'Lusitana';
  src: local('Lusitana'), url(https://themes.googleusercontent.com/static/fonts/lusitana/v1/tAIvAkRzqMJf8Y4fM1R7PXYhjbSpvc47ee6xR_80Hnw.woff) format('woff');
}
My js console then gives me this error:
[blocked] The page at https://domain.com/ecommerce.php ran insecure content from
http://fonts.googleapis.com/css?family=Lusitana:regular,700&subset=latin.
Any ideas how I can get Google Fonts to force SSL?
Have you tried replacing the protocol (http:// or https://) with just // in the URL? The request should then use the page's protocol automatically.
locate this line on your HTML page (or template):
<link href='http://fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
and change it to this:
<link href='//fonts.googleapis.com/css?family=Dosis:400,700' rel='stylesheet' type='text/css'>
This simple change will make your browser request the Google Fonts CSS over whichever protocol the page itself uses (HTTP vs HTTPS).
Enjoy!
To load Google fonts that work in both non-secure and SSL mode, try the following in your page header (and remove what you've got there calling https:// inside the CSS):
<script type="text/javascript">
WebFontConfig = { google: { families: [ 'Droid+Serif::latin' ] } };
(function() {
var wf = document.createElement('script');
wf.src = ('https:' == document.location.protocol ? 'https' : 'http') +
'://ajax.googleapis.com/ajax/libs/webfont/1/webfont.js';
wf.type = 'text/javascript';
wf.async = 'true';
var s = document.getElementsByTagName('script')[0];
s.parentNode.insertBefore(wf, s);
})();
</script>
In my example, I'm using Droid Serif font, so swap that with yours.
You can read more on this here.
I also had this error caused by a theme in WordPress. It caused slow page loading and the following error reported by the development console:
Mixed Content: The page at 'https://xxxxxxx.co.uk/' was loaded over HTTPS, but requested an insecure stylesheet 'http://fonts.googleapis.com/css?
family=Droid+Serif%3A400%2C700%2C400italic%2C700italic&ver=5.4.1'. This
request has been blocked; the content must be served over HTTPS.
The culprit was a WordPress theme called "Fresh and Clean". It inherits code written in 2014 which contains 'pre-SSL' coding practices.
To resolve the problem, all you need to do is make changes inside the following file in the theme:
/wp-content/themes/wpex-freshandclean/functions/scripts.php
Look inside for any occurrences of http:// and change each one to https://.