I'm looking for a third-party library that allows chunked upload.
I've implemented the upload mechanism towards a Web API without using libraries.
I've done the same thing using ng2-file-upload.
They work very well, but a big file is sent in its entirety in a single HTTP call.
I need a chunked upload.
I've found PrimeNG, but I don't see an option for chunked upload.
Same thing for ngx-uploader.
The only library that seems to allow a chunked upload is this one:
https://github.com/kukhariev/ngx-uploadx but the documentation is very poor and I cannot make it work.
Thank you
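For context, a hand-rolled chunked upload is essentially just slicing the File and posting each piece in its own request; below is a minimal sketch using Angular's HttpClient, where the endpoint, field names and chunk size are assumptions of mine rather than part of any of the libraries mentioned above.

```typescript
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { firstValueFrom } from 'rxjs';

const CHUNK_SIZE = 1024 * 1024;          // 1 MB per request; tune to what your Web API accepts
const UPLOAD_URL = '/api/upload/chunk';  // hypothetical endpoint

@Injectable({ providedIn: 'root' })
export class ChunkedUploadService {
  constructor(private http: HttpClient) {}

  async upload(file: File): Promise<void> {
    const totalChunks = Math.ceil(file.size / CHUNK_SIZE);

    for (let index = 0; index < totalChunks; index++) {
      // Blob.slice hands us one piece of the file without reading the rest into memory.
      const chunk = file.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);

      const form = new FormData();
      form.append('file', chunk, file.name);
      form.append('chunkIndex', String(index));
      form.append('totalChunks', String(totalChunks));

      // One HTTP call per chunk; the server is expected to reassemble the pieces.
      await firstValueFrom(this.http.post(UPLOAD_URL, form));
    }
  }
}
```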
I have a Nest.js-generated PDF that I am trying to download in a React frontend.
Everything works fine when I use Postman or any other API client to request the PDF directly from the endpoint, but when I make a fetch/axios request in my frontend with the same method as stated here (conversion to blob/arraybuffer), the overall content is bigger (fonts, assets) and the text is no longer selectable. I sense that it has to be due to the responseType: blob, but I have not found any other solution anywhere.
Does anybody have any idea how to overcome this issue?
The desired result (broken images are OK)
The fetch/axios result
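For reference, the usual arraybuffer-to-Blob download looks roughly like the sketch below; the endpoint URL and filename are placeholders, and it assumes the backend is really returning raw PDF bytes rather than, say, a JSON-wrapped body.

```typescript
import axios from 'axios';

// Hypothetical endpoint; replace with the Nest.js route that returns the PDF.
const PDF_URL = '/api/report/pdf';

export async function downloadPdf(): Promise<void> {
  // Ask axios for raw bytes so nothing tries to decode the body as text.
  const response = await axios.get<ArrayBuffer>(PDF_URL, {
    responseType: 'arraybuffer',
  });

  // Wrap the bytes in a Blob with an explicit PDF MIME type.
  const blob = new Blob([response.data], { type: 'application/pdf' });
  const url = URL.createObjectURL(blob);

  // Trigger the browser download from the object URL.
  const link = document.createElement('a');
  link.href = url;
  link.download = 'report.pdf';
  link.click();
  URL.revokeObjectURL(url);
}
```

If the bytes that arrive this way differ from what Postman saves, comparing the two files' sizes and first bytes is a quick way to tell whether the body was altered (for example, decoded as text) before it ever became a Blob.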
We are developing a web application. Inside that application, to download a file, I have created a WCF REST service that downloads files, based on this link: Download using WCF Rest. The purpose is to check for user authentication before downloading. I used streaming to download the file. I have now found out a few things.
When the user downloads the file, he is not able to see the file size or the time remaining. I analyzed this and found out that the reason is that the response uses the "Transfer-Encoding: chunked" header, so the file is downloaded in chunks. One of the advantages is that memory consumption on the server stays low even when many users are downloading files. So I thought of adding the "Content-Length" header, but I found out that you can use only one of the two headers, not both.
I then looked at how Hotmail and Gmail download attachments. From my investigation, I found out that Hotmail uses the chunked header whereas Gmail uses the Content-Length header. In Gmail's case, it also checks whether the session is active and then downloads the file accordingly. I want to achieve the following:
a) Like Gmail, I want to check whether the session is active and then download the file accordingly. How can I implement this?
b) When downloading the file, I want to use the Content-Length header instead of the chunked header, while keeping memory consumption low. Can this be achieved in WCF REST? If so, how?
c) Is it possible for me to add a header in WCF that will make the file size show up in the browser's Downloads window?
d) When downloading an inline image from WCF, I found out that the image is not cached on the local machine after loading. I was thinking that once an image is shown in an HTML page, it would be cached automatically, and the next time the user visits the page the image would load from the cache instead of from the server. I want the inline images to be cached; what option can I use for this? Are there any headers I need to specify when serving an inline image from the server?
e) When I download a zip file from WCF in Chrome on iPhone, it does not download at all, but the same link works in Chrome on Android. What could be the problem? Am I missing a header in WCF?
Are there any methods that will achieve the above?
Regards,
Jollyguy
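WCF specifics aside, point (b) above comes down to knowing the file size up front and still streaming the body. As an illustration of the idea only (this is Node/TypeScript, not WCF, and the path, port and session check are made up):

```typescript
import express from 'express';
import { createReadStream } from 'fs';
import { stat } from 'fs/promises';

const app = express();

app.get('/download', async (req, res) => {
  // Hypothetical session check, standing in for the Gmail-style behaviour in (a).
  if (!req.headers.cookie?.includes('session=')) {
    res.status(401).send('Not logged in');
    return;
  }

  const filePath = '/data/report.zip'; // placeholder path
  const { size } = await stat(filePath);

  res.set({
    'Content-Length': String(size), // lets the browser show size and time remaining
    'Content-Type': 'application/zip',
    'Content-Disposition': 'attachment; filename="report.zip"',
  });

  // Pipe the file from disk in chunks; the whole file is never held in memory,
  // yet the response carries Content-Length instead of chunked transfer encoding.
  createReadStream(filePath).pipe(res);
});

app.listen(3000);
```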
I'm working on an app which fetches JSON from a website. Everything is working properly and I'm using Alamofire.
But for some reason, when I post new content on the website and the JSON file changes, Alamofire doesn't get the new content. Instead, it loads the content from the cache rather than re-downloading the new version.
The only workaround is to clear the cache, which I would prefer to avoid, since the user would then have to download the content all over again on each view load.
So what I'm asking is: is there a way to notify the Alamofire request about the new content and have it load that, instead of me having to implement a method to clear the cache?
Alamofire uses the Foundation URL loading system, which relies on NSURLCache. The cache behavior for HTTP requests is determined by the contents of your HTTP response's Cache-Control headers. For example, you may wish to configure your server to specify must-revalidate:
Cache-Control: max-age=3600, must-revalidate
You should also make sure your server is specifying ETag and Content-Length headers to make it easy to tell when content has changed.
NSHipster's writeup on NSURLCache has a few good examples. If you're totally new to web caching, I recommend you read the very helpful section 13 of the HTTP 1.1 spec, and possibly also this caching tutorial.
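If the website serving the JSON happens to be a Node/Express app (an assumption on my part; the question doesn't say what the backend is), sending those headers could look roughly like this:

```typescript
import express from 'express';
import { createHash } from 'crypto';

const app = express();

app.get('/feed.json', (req, res) => {
  const body = JSON.stringify({ items: [] }); // placeholder payload

  // Validator derived from the body, so clients can cheaply ask "has this changed?".
  const etag = '"' + createHash('md5').update(body).digest('hex') + '"';

  if (req.headers['if-none-match'] === etag) {
    res.status(304).end(); // unchanged: the client (and NSURLCache) reuses its copy
    return;
  }

  res.set({
    'Cache-Control': 'max-age=3600, must-revalidate',
    'ETag': etag,
    'Content-Type': 'application/json',
  });
  res.send(body);
});

app.listen(3000);
```

With must-revalidate plus an ETag, the client keeps making cheap conditional requests and only re-downloads the JSON when it has actually changed.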
I'm building a Chromecast app, where I want to stream .m3u8 files (HLS) from a streaming provider. The streaming provider does not add CORS headers to the HTTP headers, which is a requirement for building Chromecast apps.
Is there any way to route the requests through a proxy, and have the proxy add the necessary headers for .m3u8 files? AFAICS, the .m3u8 files further point to the files for the different bandwidth streams, so it would be necessary to have the proxy add appropriate CORS headers to the responses for those files as well.
Here is an example of a link to a .m3u8 file that I want to be able to stream.
Hey, I realise I'm a bit late, but I thought I would post here in case others find it useful. I had the same problem when developing a Chromecast application. The simple solution I found was to include the TOMODOkorz library; this will pass all HTTP requests through its proxy.
You could host your own proxy and change the library to point to yours relatively easily.
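If you do host your own, the proxy itself only has to fetch the upstream URL and echo it back with the CORS header added. A rough Node/TypeScript sketch (the port, route and the permissive '*' origin are my assumptions, and it relies on the built-in fetch of Node 18+):

```typescript
import express from 'express';

const app = express();

// GET /proxy?url=<absolute URL of the .m3u8 playlist or segment to fetch>
app.get('/proxy', async (req, res) => {
  const targetUrl = String(req.query.url ?? '');
  if (!targetUrl.startsWith('http')) {
    res.status(400).send('Missing or invalid url parameter');
    return;
  }

  try {
    const upstream = await fetch(targetUrl);
    const body = Buffer.from(await upstream.arrayBuffer());

    res.set({
      'Access-Control-Allow-Origin': '*', // the header the Chromecast receiver needs
      'Content-Type':
        upstream.headers.get('content-type') ?? 'application/octet-stream',
    });
    res.status(upstream.status).send(body);
  } catch {
    res.status(502).send('Upstream fetch failed');
  }
});

app.listen(8080);
```

As the other answers here point out, the URLs inside the playlists still have to be rewritten to go back through the proxy; otherwise only the top-level .m3u8 gets the CORS headers.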
This is actually possible by rewriting the urls within Chromecast's Media Player Library and having these sub-playlists also proxy through a CORS proxy like http://www.corsproxy.com/.
To do this in your custom receiver, do not import the Google-hosted library:
<script type="text/javascript" src="//www.gstatic.com/cast/sdk/libs/mediaplayer/0.5.0/media_player.js"></script>
Instead, copy the obfuscated JavaScript directly into your receiver HTML page, and do the following:
Find+replace g.D.url=k with g.D.url='http://www.corsproxy.com/' + k.replace(/^(?:[a-z]+:)?\/\//i,'')
Find+replace url:k with url:('http://www.corsproxy.com/' + k.replace(/^(?:[a-z]+:)?\/\//i,''))
Now, if you send the initial contentId to the Chromecast as http://www.corsproxy.com/YOUR_M3U8_FILE_HERE, you should have a fully functional HLS-playing Chromecast app.
Most providers have the ability to set CORS for their customers. Akamai certainly does.
I've been able to stream HLS to Chromecast from an S3 bucket by adding a permissive CORS configuration to the bucket's permissions.
To answer my own question:
This is not possible without rebroadcasting the streams. .m3u8 files are playlists containing links to other files, which in the end point to the binary media. All of these responses, including the HTTP response containing the binary data, need the CORS headers for the Chromecast to display the contents.
If you're only looking to add CORS headers to textual responses, corsproxy.com is a good alternative, along with several available open-source projects.
If I get data from an external website in JSONP form, how do I access the HTTP response headers? I have heard this may be difficult, but my experience is that everything is possible.
Nope.
This is completely impossible.
The whole point of JSONP is to bypass the same-origin policy by passing the result through executable JavaScript code.
Other than the JS code generated by the remote server, you cannot get any information.
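To see why, here is roughly what a JSONP request amounts to on the client (URL and callback name are made up). The "response" is just a script being executed, so at no point is there a response object whose status line or headers the page could inspect:

```typescript
// Minimal JSONP helper: load a <script> whose body calls a global callback.
function jsonp<T>(url: string, callbackName = 'handleJsonp'): Promise<T> {
  return new Promise((resolve) => {
    const script = document.createElement('script');

    (window as any)[callbackName] = (data: T) => {
      resolve(data); // the callback argument is the only thing the page ever sees
      script.remove();
      delete (window as any)[callbackName];
    };

    script.src = `${url}?callback=${callbackName}`;
    document.head.appendChild(script); // the browser runs the response as JavaScript
  });
}

// Usage: the server replies with something like `handleJsonp({"items": []})`.
jsonp<{ items: unknown[] }>('https://example.com/feed').then((data) => {
  console.log(data.items);
});
```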