Icecast Icy headers

I'm using Icecast with a bunch of mountpoints to broadcast local radio stations for our customers. For some of them we fetch the currently playing song and update the metadata on Icecast every few seconds using the /admin/metadata URI.
However, I've noticed that some players like VLC or iTunes apparently go for the ICY headers in the HTTP response (ICY Info: StreamTitle=....)
I am now wondering if there is a way to set these headers on the Icecast server, similar to the metadata URI, or how anyone would set these headers. Icecast does not seem to set the ICY headers when I update the metadata of the mountpoint.
Thanks and regards

VLC and iTunes both want the metadata at regular intervals, just like any other SHOUTcast/Icecast client. The ICY Info: StreamTitle=... text is not a header you set separately; it is in-band metadata that Icecast interleaves into the stream, at the interval advertised by icy-metaint, for clients that request it with the Icy-MetaData: 1 request header. If this isn't working, whatever code you have updating the metadata on Icecast isn't working.
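For reference, a minimal sketch of the update call your script should be making, in PHP; the host, port, mount and admin credentials are placeholders that must match your icecast.xml:

<?php
// Minimal sketch: push a "now playing" update to Icecast's admin API.
// Host, port, mount and admin credentials are assumptions; they must
// match your icecast.xml (<admin-user> / <admin-password>).
$host  = 'icecast.example.com';
$port  = 8000;
$mount = '/radio1';
$song  = 'Artist - Title';

$url = 'http://' . $host . ':' . $port . '/admin/metadata'
     . '?mode=updinfo&mount=' . rawurlencode($mount)
     . '&song=' . rawurlencode($song);

$ctx = stream_context_create(array('http' => array(
    'header' => 'Authorization: Basic ' . base64_encode('admin:hackme'),
)));

// Icecast answers with a small XML document on success.
echo file_get_contents($url, false, $ctx);

If the call succeeds but players still show a stale title, check that the player actually requested in-band metadata (the Icy-MetaData: 1 request header) and that the mount name matches exactly.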

Related

Working with HLS and Push content to Apache

I have a customer that has an Envivio encoder. Today they push live HLS streams to Akamai.
Now they want to try to push the stream to us. But what kind of web server do I need, and how do I configure the server to save the pushed stream to web storage?
I believe that your encoder (as well as many others) is using HTTP PUT to publish the video. Apache web server can do the trick easily.
If you need help enabling the HTTP PUT functionality in Apache, try:
How to enable and use HTTP PUT and DELETE with Apache2 and PHP?
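To make the saving side concrete, here is a minimal sketch of a PHP handler that stores whatever the encoder PUTs (.m3u8 playlists and .ts segments). It assumes Apache has been configured to route PUT requests to this script as described in the linked question, and the storage path is hypothetical:

<?php
// Sketch of a PUT handler that saves pushed HLS files to local storage.
$storageDir = '/var/www/hls-storage'; // hypothetical, must be writable

if ($_SERVER['REQUEST_METHOD'] !== 'PUT') {
    header('HTTP/1.1 405 Method Not Allowed');
    exit;
}

// Keep only the file name from the request path to avoid path traversal.
$name = basename(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));

$in  = fopen('php://input', 'rb');            // raw request body
$out = fopen($storageDir . '/' . $name, 'wb');
stream_copy_to_stream($in, $out);             // stream to disk, don't buffer
fclose($in);
fclose($out);

header('HTTP/1.1 201 Created');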

$_SERVER["CONTENT_LENGTH"] not set

Is there any php.ini setting which could cause isset($_SERVER["CONTENT_LENGTH"]) to never be set on one server, but work fine on another, where the application code is the same and the php.ini upload settings are the same? In an uploader library I'm using, the content-length check always fails because of this issue. On PHP 5.3, CentOS and Apache. Thanks for any help.
EDIT: I should add that the request headers do include Content-Length: 33586, but when trying to read $_SERVER["CONTENT_LENGTH"], it isn't set.
Content-Length appears in both directions: a client sends it when the request has a body of known size, and the server sends it on the response. You should not be setting the response header yourself from within PHP, as PHP does this automatically.
If you're dealing with input from something like an upload, then you will only get a Content-Length if the HTTP request is not chunked. When a chunked request is sent, the data length is not known to the recipient until all the chunks have been received.
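Given that, it is safer to treat the header as optional. A defensive sketch (illustrative only, not tied to any particular uploader library):

<?php
// Sketch: never assume CONTENT_LENGTH exists. For a chunked request the
// total size is only known once the whole body has been consumed.
$declared = isset($_SERVER['CONTENT_LENGTH'])
    ? (int) $_SERVER['CONTENT_LENGTH']
    : null;

$received = 0;
$in = fopen('php://input', 'rb');
while (!feof($in)) {
    $buf = fread($in, 8192);
    if ($buf === false) {
        break;
    }
    $received += strlen($buf);
    // ... hand $buf to whatever is processing the upload ...
}
fclose($in);

// The comparison is only meaningful when a length was declared up front.
if ($declared !== null && $received !== $declared) {
    header('HTTP/1.1 400 Bad Request'); // truncated or inconsistent body
}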

Does apache commons fileupload support chunked uploads?

We have moved to using PLupload for file uploads and found that it can support "chunked" file uploads. The problem is that our server sees one large file upload as multiple smaller files coming in multiple POST requests.
Does anybody know if Apache Commons FileUpload supports chunked uploads?
FWIW, looking at the PLupload webpage (http://www.plupload.com/index.php), the "Chunking" they are talking about is not "HTTP Chunking".
Their marketing term "Chunking" is their concept of sending a large payload up in small, separate HTTP requests. The server is required to have logic to group, stitch up and verify all the small parts. You are better off getting help on their forum for this; there is no reason this logic cannot be implemented by you on the server side, and maybe they have example Java code implementing it.
Useful info and a pointer to their upload.php example (which you could port to Java on top of Apache Commons FileUpload):
http://www.plupload.com/punbb/viewtopic.php?id=1484
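For what it's worth, here is a sketch of that stitching logic in PHP, modelled on the parameters PLupload's example upload.php uses ("name", a 0-based "chunk" index, and a "chunks" total); the target directory is a placeholder:

<?php
// Each HTTP request carries one ordinary multipart upload holding one
// slice of the file; append the slices until the last one arrives.
$targetDir = '/tmp/uploads'; // hypothetical, must be writable

$name   = basename($_REQUEST['name']);
$chunk  = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
$chunks = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 1;

// Append this slice to the partial file ("wb" for the first chunk).
$out = fopen("$targetDir/$name.part", $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Last chunk received: rename the stitched file into place.
if ($chunk === $chunks - 1) {
    rename("$targetDir/$name.part", "$targetDir/$name");
}

Porting the same flow to a servlet on top of Apache Commons FileUpload should be straightforward, since each chunk arrives as a plain multipart POST.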
What you are observing (the small segments of a file arriving as if they were separate files) is exactly how the "PLupload Chunking" mechanism works. This technique is not defined in any standard, but it is also not an uncommon solution to the problems it addresses.
"HTTP Chunking", by contrast, is a standard defining how to transfer a single HTTP request (and/or HTTP response) between client and server using an HTTP transfer encoding. It is supported by all web servers and all browsers and has been around for a long time (since HTTP/1.1).
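To make the distinction concrete, this is roughly what chunked transfer encoding looks like on the wire, sketched with a raw socket in PHP (host and path are made up):

<?php
// Each chunk is preceded by its size in hex on its own line, and a
// zero-length chunk terminates the body. Host and path are hypothetical.
$fp = fsockopen('upload.example.com', 80);
fwrite($fp, "POST /upload HTTP/1.1\r\n");
fwrite($fp, "Host: upload.example.com\r\n");
fwrite($fp, "Transfer-Encoding: chunked\r\n");
fwrite($fp, "Content-Type: application/octet-stream\r\n\r\n");

foreach (array('first part', 'second part') as $chunk) {
    fwrite($fp, dechex(strlen($chunk)) . "\r\n" . $chunk . "\r\n");
}
fwrite($fp, "0\r\n\r\n"); // terminating zero-length chunk

echo fgets($fp);          // status line of the response
fclose($fp);

Note that no Content-Length header is sent; the recipient only learns the total size after the final chunk arrives.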

Adding decision logic to Apache's mod_proxy_balancer with Memcache

What I am trying to achieve is to have Apache's mod_proxy_balancer check if a request was already made using a Memcache store.
Basically:
Streaming media request comes in.
Check in Memcache whether that streaming media has already been served.
If it has, check whether that streaming media server can handle another request.
If it can, send the request to that streaming media server.
If not, send the request to the next streaming media server in line.
Store the key:value pair in Memcache.
My questions are:
Does mod_proxy_balancer already do this in some way?
Is there any way to make Apache a content-aware load balancer?
Any other suggestions would be greatly appreciated too, other software, other approach, etc.
Cheers.
Looking at mod_proxy_balancer.c, one could, as the comments in the file suggest, add additional lbmethods: something along the lines of "bymemcached_t" or "bymemcached_r", where the t and r endings denote the "bytraffic" and "byrequests" methods respectively. We would run the pseudocode above and, if nothing is found in Memcache, fall through to those methods and save the result in the memcached store.
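To make the intended decision flow concrete, here is the same logic sketched in PHP with the pecl/memcached extension. A real implementation would live inside the balancer and proxy rather than redirect, and isUnderCapacity() and pickNextServer() are hypothetical helpers standing in for the balancer's bookkeeping:

<?php
// Sketch of the routing decision only, not a real balancer.
$servers = array('stream1.example.com', 'stream2.example.com');

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$uri    = $_SERVER['REQUEST_URI'];
$cached = $mc->get($uri); // server that already served this media, if any

if ($cached !== false && isUnderCapacity($cached)) {
    $target = $cached;                  // sticky: reuse the same server
} else {
    $target = pickNextServer($servers); // hypothetical: next in line
}

$mc->set($uri, $target);                // remember the mapping
header('Location: http://' . $target . $uri, true, 302);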
In my research I came across HAProxy which, according to its documentation, does exactly what I want using the 'uri' balance algorithm, just without Memcached, which is fine for my purposes.

Can we detect if a site is on CDN?

Is there a way to detect if a site is on a Content Delivery Network and if yes, can we tell which service are they using?
A method that is achievable from the command line is using the 'host' command with the -a flag set to see the DNS records, e.g.
host -a www.visitbritain.com
Returns:
www.visitbritain.com. 0 IN CNAME d18sjq5nyxcof4.cloudfront.net.
Here you can see that the CNAME entry tells us that the site is using CloudFront as its CDN.
Just take a look at the URLs of the images (and other media) on the site.
Reverse-lookup the IPs of the hostnames you see there and you will see who owns them.
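A quick sketch of that lookup in PHP (the media hostname is made up):

<?php
// Forward-resolve a media hostname seen in the page, then reverse-lookup
// the IP; the PTR record often names the CDN (e.g. *.cloudfront.net or
// *.akamaitechnologies.com). The hostname below is hypothetical.
$mediaHost = 'images.example-site.com';

$ip  = gethostbyname($mediaHost); // forward (A record) lookup
$ptr = gethostbyaddr($ip);        // reverse (PTR) lookup

echo $mediaHost . ' -> ' . $ip . ' -> ' . $ptr . "\n";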
I built this little tool to identify the CDN used by a site or a domain, feel free to try it.
The URL: http://www.whatsmycdn.com/
You might also be able to tell from the HTTP headers of the media if the URL doesn't give it away. For example, media served by SimpleCDN has Server: SimpleCDN 5.6a4 in its headers.
CDN Planet now have their CDN Finder tool on GitHub:
http://www.cdnplanet.com/blog/better-cdn-finder/ The tool installs on the command line and lets you feed in hostnames and check whether they use a CDN.
If the website is using GCP's CDN, you can simply check with curl:
curl -I <https://site url>
In the response you will find headers like the following:
x-goog-metageneration: 2
x-goog-stored-content-encoding: identity
x-goog-stored-content-length: 17393
x-goog-meta-object-id: 11602
x-goog-meta-source-id: 013dea516b21eedfd422a05b96e2c3e4
x-goog-meta-file-hash: cf3690283997e18819b224c6c094f26c
Yes, you can find out with:
host -a www.website.com
Apart from some excellent answers already posted here, which include some direct methods that may or may not work for all the websites out there, there is also an indirect way to see if a CDN is there, especially if it is your own website and you want to know whether you are getting what you are paying for!
The promise of a CDN is that connections from your users are terminated closer to them, so that they incur less TCP/TLS connection-establishment overhead, and that static content is cached closest to them, so that it loads faster and puts less strain on your origin servers.
To verify this, you can take measurements of site load times across the globe and see if all the users get similar load times. No, you don't have to get a machine everywhere in the world to do that! Someone has already done that for you.
Head to https://prober.tech/ and enter the URL you wish to test for load times.
Because this site itself is on Cloudflare's CDN, you can put that link itself in the test box and use it as a baseline!
More information on using the tool can be found here.