I created a new Blazor WebAssembly app (ASP.NET Core hosted) and implemented a file upload as described in:
https://learn.microsoft.com/en-us/aspnet/core/blazor/file-uploads?view=aspnetcore-6.0&pivots=webassembly
The upload itself works, but it is unusably slow. I picked a 2 KB file and it took 60 seconds for the request to arrive at the server. Anything larger than about 10 KB can't be uploaded at all because the request hits a timeout; even after raising the timeout to 5 minutes, I can't upload a 23 KB PNG file.
I am using VS 2022 (version 17.3.3) and created the project today with .NET 6.0, so it is completely new and clean.
Even when not debugging, the upload takes the same amount of time. Is there something wrong with my system?
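For reference, my client-side code follows the pattern from the linked docs fairly closely; a trimmed-down sketch is below (the api/filesave route, the 15 MB limit and the injected HttpClient are placeholders for my setup rather than the exact values from the docs):

    @using System.Net.Http
    @using System.Net.Http.Headers
    @inject HttpClient Http

    <InputFile OnChange="OnInputFileChange" multiple />

    @code {
        private async Task OnInputFileChange(InputFileChangeEventArgs e)
        {
            using var content = new MultipartFormDataContent();

            foreach (var file in e.GetMultipleFiles())
            {
                // OpenReadStream caps the read at maxAllowedSize (placeholder: 15 MB).
                var fileContent = new StreamContent(
                    file.OpenReadStream(maxAllowedSize: 15 * 1024 * 1024));
                fileContent.Headers.ContentType =
                    new MediaTypeHeaderValue(file.ContentType);
                content.Add(content: fileContent, name: "\"files\"", fileName: file.Name);
            }

            // Posts to a controller in the hosted Server project.
            var response = await Http.PostAsync("api/filesave", content);
        }
    }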
Related
I'm currently working on a POS system; I'm writing the backend in .NET Core (Web API), with Vue as the SPA and MSSQL as my database.
At first I thought of using a storage service like AWS S3 or Azure Storage and fetching the images from there, but I couldn't keep going down that path because of the extra cost, so I was asked to upload the images to the server instead. Now I'm running into an issue: the Vue app has to be precompiled by Babel when publishing, so I can't change the files in the dist folder while the app is running. In the dev stage (on my machine) it works as well as I could hope: the file uploads to the folder fine and, after compiling again, the image shows up. But how do I accomplish this in the prod stage?
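For reference, the upload itself is just a plain Web API action that writes the file into a folder next to the compiled SPA output; roughly a sketch like this (controller name, route and target folder are hypothetical placeholders, not the exact code from my project):

    // Hypothetical sketch of the upload action described above (names, route and
    // the target folder are placeholders).
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Hosting;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("api/[controller]")]
    public class ImagesController : ControllerBase
    {
        private readonly IWebHostEnvironment _env;

        public ImagesController(IWebHostEnvironment env) => _env = env;

        [HttpPost]
        public async Task<IActionResult> Upload(IFormFile file)
        {
            if (file is null || file.Length == 0)
                return BadRequest();

            // In dev the images land in a folder under the compiled SPA output
            // (dist), which is exactly what stops working after publishing.
            var folder = Path.Combine(_env.ContentRootPath, "dist", "uploads");
            Directory.CreateDirectory(folder);

            var path = Path.Combine(folder, Path.GetFileName(file.FileName));
            await using var stream = System.IO.File.Create(path);
            await file.CopyToAsync(stream);

            return Ok(new { url = "/uploads/" + Path.GetFileName(file.FileName) });
        }
    }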
I have configured my Drupal site so that all images/files/media etc. are handled by S3, using the S3 File System module.
Everything mostly works: the image/file field uploader is fine, but there is a huge performance issue when using the IMCE file browser from the WYSIWYG editor. It takes at least a minute for the browser to display its contents, even though the initial folder holds only 290 images totalling 78 MB, which should not cause such delays. This has a big impact on our editors, who lose several minutes just to upload a couple of images.
I tried various pagination patches and saw no difference at all in performance.
What are my options now?
After digging through many forums and discussions, it turns out that IMCE was not meant for the S3 file system, and I found this patch in PDF form (warning: it downloads rather than opens).
I followed the steps in that patch, which significantly improved performance.
I'm hosting an HLS stream with XAMPP / Apache, which basically means I have a folder in my document root that contains a couple of incrementally numbered 10-second video files.
Every 10 seconds, a new video file is saved into the folder and the oldest video file in the folder is deleted.
Apart from these video files, the document root also contains some other files, such as PHP scripts and playlist files.
My server has plenty of RAM and a pretty fast CPU, but is using a comparatively slow hard disk.
Given the fact that the constant downloading of these video files is likely what's going to make or break the server performance, it seems like a good idea to cache these files in memory.
If Apache kept every video file (with a .ts extension) that a user's video player downloads in memory for about 60 seconds, the next user would be able to download that file much faster. Apache could rely on the files not changing after they are first opened, and on the fact that they won't be requested anymore after those 60 seconds.
All other files do not (necessarily) have to be cached, since they're rather small and are regularly modified.
Is anyone able to give me directions on how to get started?
Modern operating systems already cache accessed files in memory. The whole process is managed by the kernel automatically.
Apache's in-memory caching won't help you here, since it needs the full list of files at start-up.
If you want some level of control over the caching, you could use vmtouch. Check its manual.
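For example, something along these lines reads the segment files into the page cache, or locks them there (the XAMPP paths are only illustrative):

    # Touch the current HLS segments so they are read into the page cache:
    vmtouch -vt /opt/lampp/htdocs/hls/*.ts

    # Or run vmtouch as a daemon and lock the files currently in the
    # directory into physical memory:
    vmtouch -dl /opt/lampp/htdocs/hls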
We have upgraded to ColdFusion 10 and I am testing large upload capability.
Using both an HTML form and the Flash multi-file upload (CFFILEUPLOAD), I can upload files of up to 2 GB.
With files over 2 GB the upload does not even start: it sits at 0%, both in the Flash uploader and in what the Chrome browser reports for the HTML form.
Technical services suggest the request does not even get as far as Apache, so Apache is not restricting the upload. ColdFusion is also set up to allow 4000 MB of post data, even with the throttle.
The upload happens across the network, so even a 1.7 GB test file doesn't take long, but a 2.5 GB file does not even begin.
Any suggestions to help diagnose the cause?
Thanks
I am working on a project where the team (and VS 2012) uses IIS 7.5 by default during development.
Very recently (yesterday), the sites I was debugging started loading very slowly.
Using Fiddler I could see that every single call being made takes about 30 seconds.
So "localhost:{port}" takes 30 seconds (2 KB); once that finally comes back and the page starts to load, it requests all the CSS and JS files, and each of those takes about 30 seconds to load.
As this is a large application it quickly became unusable.
This happens whether I am debugging or loading a page without debugging.
The same behavior occurs on all sites on my dev machine; even a plain, newly built site exhibits it, and it happens in all browsers.
I am able to "resolve" the problem in several ways.
1. Switch from IIS 7.5 to the Visual Studio dev server.
(The URL does not change; it stays localhost:{port}.)
2. In WebMatrix, if I change the URL from localhost:{port} to http://{machinename}:{port}, the site runs faster (at normal speed), but debugging becomes an issue.
Option 1 above does the trick, and I don't seem to be affected too much by the change.
I am very curious as to why this is all of a sudden happening though, and would really like to fix the problem.
Any thoughts would be greatly appreciated.