I'm having some problems getting this to work. I think I've done it right based on http://flowplayer.org/forum/5/14664#post-14830. The clip starts to play fine, but when I seek to a part of the clip that hasn't loaded yet, it just jumps back to the start of the video.
Sadly the browser used is IE6 for the most part :(
Apache 2 running on Redhat
I've created an mp4 file with ffmpeg.
Ran qt-faststart 1.mp4 1.qt.mp4
Installed mod_flvx.c
Added:
LoadModule flvx_module modules/mod_flvx.so
AddHandler flv-stream .flv
to Apache's httpd.conf.
Using the example page:
<script type="text/javascript">
    flowplayer("player", "flash/flowplayer-3.0.3.swf", {
        clip: {
            url: 'http://servername/player/media/1.qt.mp4',
            // default provider: 'h264streaming'
            provider: flashembed.isSupported([9, 115]) ? 'h264streaming' : 'lighttpd',
            scaling: 'fit',
            autoBuffering: true,
            autoplay: false,
            bufferLength: 3
        },
        log: {
            level: 'debug'
        },
        plugins: {
            h264streaming: {
                url: 'flash/flowplayer.h264streaming-3.0.5.swf'
            },
            controls: {
                url: 'flash/flowplayer.controls-3.0.3.swf',
                // which buttons are visible and which not?
                play: false,
                fullscreen: true,
                // scrubber is a well known nickname for the timeline/playhead combination
                scrubber: true
            }
        }
    });
</script>
Anyone have any suggestions?
Thanks
First off, you need to check whether your Apache is configured to seek within the video. You can do that by passing a GET parameter, e.g. my_video_path.mp4?start=10, which should make playback start 10 seconds in. If it does, Apache is set up correctly and you only need to make Flowplayer use it. If Apache isn't set up correctly, even the correct Flowplayer config won't help.
I doubt your Apache is set up correctly, though: you're telling Apache to handle .flv files, yet you're serving MP4s.
I've successfully gotten this to work with the Apache module from http://h264.code-shop.com/trac/wiki.
You'll need to load that module in Apache and tell Apache to handle .mp4 files with it.
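The directives look roughly like this (the module path may differ depending on how you built or installed it):
LoadModule h264_streaming_module modules/mod_h264_streaming.so
AddHandler h264-streaming.extensions .mp4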
The next step would be to test with that GET parameter (?start=10) to see whether it seeks correctly.
After that, all you need to do is:
<script type="text/javascript">
    flowplayer("player", "flash/flowplayer-3.0.3.swf", {
        clip: {
            url: 'http://servername/player/media/1.qt.mp4',
            provider: 'h264streaming'
        },
        plugins: {
            h264streaming: {
                url: 'flash/flowplayer.h264streaming-3.0.5.swf'
            }
        }
    });
</script>
I used the updated Flowplayer SWF and, instead of the h264 plugin's SWF, the Flowplayer pseudostreaming plugin, as it works with the newer version of Flowplayer and apparently the h264 one didn't.
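For reference, the pseudostreaming variant of the config looks roughly like this (the SWF file names and "3.2.x" versions are placeholders; use whichever builds you actually downloaded):
<script type="text/javascript">
    flowplayer("player", "flash/flowplayer-3.2.x.swf", {
        clip: {
            url: 'http://servername/player/media/1.qt.mp4',
            provider: 'pseudo'
        },
        plugins: {
            pseudo: {
                url: 'flash/flowplayer.pseudostreaming-3.2.x.swf'
            }
        }
    });
</script>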
I created an app with vue-cli and then built the dist folder for production.
The app is deployed on IIS with a Flask backend and works fine.
The problem occurs when I have to make some changes and redo the deployment. After this, users call me because the app doesn't work, but if I clear the Chrome cache, the app works fine again.
How can I fix this problem? Is there a way to clear the Chrome cache automatically when I release a new application version?
Thanks
my dist folder
Deployment: copy and paste the dist folder onto IIS.
If the files in the dist folder are correct, maybe the problem is the axios cache? I have also made some changes to the REST APIs.
I had the same problem and changing (incrementing) the version number in package.json before running the build command fixed it.
For example, by default the version number is set to "0.1.0".
package.json file:
{
    "name": "project-name",
    "version": "0.1.1",
    "private": true,
    ...
}
If you use vue-cli, it has built-in webpack configs for building dist, and in fact it adds hashed names to the output files.
But if that was somehow removed, you can add it back to the webpack config like this:
output: {
    filename: '[name].[hash].bundle.js'
}
Your built file names will then include the hash.
And even better, you do not need to handle how all of this gets added to the HTML, because webpack will figure it out for you.
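For completeness: if you are on Vue CLI and don't want to touch the raw webpack output options, the setting that controls this is filenameHashing in vue.config.js (it is already on by default); a minimal sketch:
// vue.config.js - minimal sketch; filenameHashing is true by default in Vue CLI
module.exports = {
    filenameHashing: true
};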
You need to add a version query to your js file. This is how a browser can know if the file has changed and needs to download the new version.
So something like:
<script src="main.js?v=1.1"></script>
<script src="main.js?v=1.2"></script>
etc...
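If your index.html is generated by webpack, you don't have to bump that query string by hand on every release; html-webpack-plugin can append the compilation hash for you via its hash option. A rough sketch, assuming that plugin is already part of your build:
// webpack.config.js - sketch, assumes html-webpack-plugin is installed
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
    // ...the rest of your config...
    plugins: [
        new HtmlWebpackPlugin({
            template: 'public/index.html',
            hash: true // appends ?<compilation hash> to the injected js/css URLs
        })
    ]
};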
Assuming this has nothing to do with a service worker/PWA, the solution can be implemented by returning the front-end version in a response header.
axiosConfig.js
import axios from 'axios'

axios.interceptors.response.use(
    (resp) => {
        let fe_version = resp.headers['fe-version'] || 'default'
        // On any GET, compare the deployed front-end version with the one stored locally
        if (fe_version !== localStorage.getItem('fe-version') && resp.config.method == 'get') {
            localStorage.setItem('fe-version', fe_version)
            window.location.reload() // For a new version, simply reload on any GET
        }
        return Promise.resolve(resp)
    },
)
You can base the fe-version on any source of uniqueness; here I have used the commit SHA.
Full Article here: https://blog.francium.tech/vue-js-cache-not-getting-cleared-in-production-on-deploy-656fcc5a85fe
You can't access the browser's cache programmatically; that would be a huge security flaw.
To fix it, you must send headers with your Flask responses telling the browser not to cache your app.
This is an example for express.js for you to get the idea:
setHeaders: function (res, path, stat) {
    res.set('Cache-Control', 'no-cache, no-store, must-revalidate') // HTTP 1.1
    res.set('Pragma', 'no-cache') // HTTP 1.0
    res.set('Expires', '0') // Proxies
}
You can read a lot more about caching here.
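Since the question's app is actually served by IIS rather than Express, the equivalent there is a web.config rule; a rough sketch (scoped to index.html so your hashed assets stay cacheable):
<!-- web.config - sketch; tells IIS not to let clients cache index.html -->
<configuration>
    <location path="index.html">
        <system.webServer>
            <staticContent>
                <clientCache cacheControlMode="DisableCache" />
            </staticContent>
        </system.webServer>
    </location>
</configuration>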
This is an older post, but since I could not find the solution for this problem online, I'll just post this here in case someone else finds it useful.
I added the hash to the application chunk files via the webpack.mix.js file by adding:
mix.webpackConfig({
    output: {
        chunkFilename: 'js/[name].js?id=[chunkhash]',
    },
})
This adds a fingerprint to the actual chunks and not just the app.js file. You can add a version fingerprint to the app.js file as well by adding version(['public/js/app.js']); at the end of the file, or by adding filename: '[name].js?[hash]' to the output block.
My complete webpack.mix.js:
const mix = require('laravel-mix');

mix.webpackConfig({
    output: {
        chunkFilename: 'js/main/[name].js?id=[chunkhash]',
    }
}).js('resources/js/app.js', 'public/js').vue()
    .postCss('resources/css/app.css', 'public/css', [
        //
    ]).version(['public/js/app.js']);
In my laravel blade file I used
<script src="{{ mix('js/app.js') }}"></script>
to load the app.js file with the correct version fingerprint.
The answer for me was caching at the DNS provider level.
Basically, I'm using Cloudflare's DNS proxy and it was caching the website, so in development I was not getting the code updates.
I had to clear the cache many times and wait a significant period of time before anything updated.
Turning the proxy off stopped this.
The method I want to suggest:
<script src="{{ asset('js/app.js?time=') }}{{ time() }}" defer></script>
Add the script below to public/index.html:
<head>
    ...
    <script type="text/javascript" language="javascript">
        var timestamp = (new Date()).getTime();
        var script = document.createElement("script");
        script.type = "text/javascript";
        script.src = "<%= BASE_URL %>sample.js?t=" + timestamp;
        document.head.appendChild(script);
    </script>
    ...
</head>
Could you try:
vm.$forceUpdate();
Also, it's possible that the component itself needs a unique key:
<my-component :key="unique" />
I am currently working on a DICOM-based web application. I have created my backend server using .NET Core and want to integrate it with the OHIF viewer. I read all the OHIF viewer documentation, configured my default.js file, and changed the routes of wadoUriRoot, qidoRoot and wadoRoot as follows:
window.config = {
    // default: '/'
    routerBasename: '/',
    extensions: [],
    showStudyList: true,
    filterQueryParam: false,
    servers: {
        dicomWeb: [
            {
                name: 'DCM4CHEE',
                wadoUriRoot: 'http://127.0.0.1:5000',
                qidoRoot: 'http://127.0.0.1:5000',
                wadoRoot: 'http://127.0.0.1:5000',
                qidoSupportsIncludeField: true,
                imageRendering: 'wadors',
                thumbnailRendering: 'wadors',
                enableStudyLazyLoad: true,
            },
        ],
    },
};
Now when I recompile and run it, I get a totally black screen. I have checked that data is reaching the browser. What are the possible reasons for this behavior? How can I make the OHIF viewer display my own DICOM images?
I was finally able to solve it. It was a browser problem: Firefox and Chrome were blocking cross-origin requests, so I had to add the CORS services and UseCors to Startup.cs in my server code to enable CORS.
https://learn.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-3.1
There it's mentioned how to do it.
I have a webpage where I've included RequireJS via a script tag like this:
<script data-main="/media/course-book-app/courses.require.main.js" src="/media/common/vendor/requirejs/require.js"></script>
On Safari, I'm getting a RequireJS "Script error" (the error links to https://requirejs.org/docs/errors.html#scripterror).
What is causing this issue?
The issue is very frequent on Safari, but much less frequent on Chrome.
Testing URL
From https://requirejs.org/docs/errors.html#scripterror (which is linked right there in the error). Follow the instructions and look at the script that caused the error:
This occurs when the script.onerror function is triggered in a
browser. This usually means there is a JavaScript syntax error or
other execution problem running the script. To fix it, examine the
script that generated the error in a script debugger.
This error may not show up in IE, just other browsers, and instead, in
IE you may see the No define call for ... error when you see "Script
error". This is due to IE's quirks in detecting script errors.
Here is the way to use RequireJS correctly. This ensures the configuration gets loaded before loading any module:
define('requireconfig', function () {
    require.config({
        paths: {
            "jquery": "/common/vendor/jquery/jquery-1.9.1.min",
            "backbone": "/common/vendor/backbone/backbone.min-1.1.2",
            // mediaPath is assumed to be a global defined elsewhere on the page
            "underscore": mediaPath + "/common/vendor/underscore/underscore.min-1.7.0"
        },
        shim: {
            backbone: {
                deps: ["jquery", "underscore"],
                exports: "Backbone"
            }
        }
    });
});

define('main', ['requireconfig'], function () {
    'use strict';
});

// loading the main module, which loads the RequireJS configuration first
requirejs(['main'], () => {
    requirejs(['jquery'], ($) => {
        // jquery loaded
    });
}, () => {
    // error loading module
});
I want to load Dojo 1.9 AMD modules from an ad-hoc server on the web, but I won't know where from until runtime (it comes in via URL parameters).
In essence, I want to do the equivalent of this:
require(['http://www.foo.com/SomeRandomModule'], function( SomeRandomModule ) {
// use SomeRandomModule
});
Quick and dirty way
Might have some unexpected quirks when it comes to the module system and relative paths; I haven't used it enough to say:
require([ "//host/myext/mod1/mod2.js" ], function(mod2){
    // If the current webpage is http:// or https:// or file://,
    // it tries to use the same protocol
});
Better way
Configure require() to treat all modules that start with a certain package name (e.g. foo) as coming from a particular URL. From your starter page, something like:
<script src="dojo/dojo.js"
data-dojo-config="packages:[{name:'myext',location:'//host/js/myext'}], async: 1, >
</script>
This lets you vastly improve your first example to:
require([ "myext/mod1/mod2" ],function(mod2){
});
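Since the question says the server is only known at runtime via URL parameters, you can also build that package entry dynamically; a rough sketch (the moduleBase parameter name is hypothetical):
// Read the package location from a ?moduleBase=... URL parameter (hypothetical name)
var match = /[?&]moduleBase=([^&]+)/.exec(window.location.search);
var base = match ? decodeURIComponent(match[1]) : '//host/js/myext';

// Dojo's AMD require() accepts extra configuration as its first argument
require({ packages: [{ name: 'myext', location: base }] },
    [ 'myext/mod1/mod2' ],
    function (mod2) {
        // mod2 was loaded from whatever server the URL parameter pointed at
    });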
If you are using a Dojo Bootstrap installation instead, you can avoid touching your data-dojo-config and instead put it inside the run.js startup file:
require({
    baseUrl: '',
    packages: [
        'dojo',
        'dijit',
        'dojox',
        'myapp',
        { name: 'myext', location: '//host/js/myext', map: {} }
    ]
}, [ 'myapp' ]);
I want to use dojo within a chrome extension's content script. I have this in my manifest.json:
"content_scripts":[
{
"js":["lib/dojo/dojo.js","main.js"],
"matches":["<all_urls>"],
"run_at": "document_idle"
}
]
I've already put a dojo folder under the "lib" folder at the root of this extension. However, the script paused execution and told me dojo is undefined, which means Dojo was not loaded.
Then I tried registering dojoConfig before dojo.js is loaded:
"content_scripts":[
{
"js":["env.js",
"lib/dojo/dojo.js",
"main.js"],
"matches":["<all_urls>"],
"run_at": "document_idle"
}
],
env.js contains:
dojoConfig = {
"baseUrl" : "/lib/dojo"
};
It still doesn't work, though. Has anyone else had success with this?
According to http://code.google.com/chrome/extensions/content_scripts.html
These are injected in the order they appear in this array.
Maybe you should change the order of the js files in that list.
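One more thing worth checking (just a guess, I haven't verified it): in the posted env.js, baseUrl is "/lib/dojo", which the loader will resolve against the visited page's origin rather than the extension. Something along these lines would point it back into the extension package:
// env.js - hypothetical sketch; chrome.extension.getURL resolves the path
// inside the extension package instead of against the current website
var dojoConfig = {
    "baseUrl": chrome.extension.getURL("lib/dojo")
};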