Why is artwork not showing in the Sonos desktop controller for PC when it shows in the Sonos desktop controller for Mac?

My Sonos music service is being rejected because the artwork is blank in the desktop controller, even though I submitted PNG artwork with my service. When I test my Sonos music service on a Mac, the artwork shows up fine.
According to the digital asset guideline (http://musicpartners.sonos.com/?q=node/390):
The Windows and MacOS controllers, and the Sonos CONTROL (cr200) are considered legacy controllers. These controllers still use images in PNG format. Therefore, you must submit your logos in PNG format as well.
Does anyone know the reason why the artwork is showing in the Mac desktop controller but not on the PC desktop controller?
<PresentationMap type="BrowseIconSizeMap">
  <Match>
    <browseIconSizeMap>
      <sizeEntry size="0" substitution="_legacy.png"/>
      <sizeEntry size="40" substitution="_40.svg"/>
      <sizeEntry size="290" substitution="_290.svg"/>
    </browseIconSizeMap>
  </Match>
</PresentationMap>
<PresentationMap type="ArtWorkSizeMap">
  <Match>
    <browseIconSizeMap>
      <sizeEntry size="0" substitution="_legacy.png"/>
      <sizeEntry size="40" substitution="_40.svg"/>
      <sizeEntry size="290" substitution="_290.svg"/>
    </browseIconSizeMap>
  </Match>
</PresentationMap>
Browse icon URLs:
http://webservice.aristomusic.com/Sonos/static/images/menu/blue_legacy.png
http://webservice.aristomusic.com/Sonos/static/images/menu/blue_40.svg
http://webservice.aristomusic.com/Sonos/static/images/menu/blue_290.svg
Artwork URLs:
http://webservice.aristomusic.com/Sonos/static/images/menu/orange_track_legacy.png
http://webservice.aristomusic.com/Sonos/static/images/menu/orange_track_40.svg
http://webservice.aristomusic.com/Sonos/static/images/menu/orange_track_290.svg

Looking at your presentation map, you should be using <imageSizeMap> for the ArtWorkSizeMap instead of <browseIconSizeMap>.
Also note that ArtWorkSizeMap is used for album-art mapping, whereas BrowseIconSizeMap is used for browse-icon mapping; in your presentation map, they currently contain identical mappings.
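With that fix applied, the ArtWorkSizeMap would look like the sketch below (assuming you keep the same size/substitution entries; only the inner element name changes):

```xml
<PresentationMap type="ArtWorkSizeMap">
  <Match>
    <imageSizeMap>
      <sizeEntry size="0" substitution="_legacy.png"/>
      <sizeEntry size="40" substitution="_40.svg"/>
      <sizeEntry size="290" substitution="_290.svg"/>
    </imageSizeMap>
  </Match>
</PresentationMap>
```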
Implementing Album Art: http://musicpartners.sonos.com/?q=node/366
Implementing Custom Browse Icons: http://musicpartners.sonos.com/?q=node/365

Related

Accessing documents stored in Amazon's S3

We are building a system that will allow users to store documents (images or PDFs) in S3. These files can later be accessed via their URL, e.g. https://my-bucket.s3.amazonaws.com/person/2/provider/test1.png. We have no problem uploading and deleting documents in S3 using proper IAM keys and the AWS SDK. However, when we try to access documents using their URL, we get the following error:
<Error>
  <Code>AccessDenied</Code>
  <Message>Access Denied</Message>
  <RequestId>V0DTW6T3F6J3ZFDG</RequestId>
  <HostId>ZOiybrTAfx8t+NZQW2cpS4nw8vNhmQaemFfinQSBP41K2mZhDItF29156LTUwZh+SqZacfssLIE=</HostId>
</Error>
I understand the reason for the error but don't know how to resolve it.
Basically, we are building a health-related portal. Patients can upload their documents (health records). Later on, we want them to be able to view their documents while they are logged in. The idea was that the documents could be displayed via their URLs.
The only solution I can think of is to first download the document locally (whether to the browser's local filesystem or to the mobile device, if accessed through a mobile app) and then display it from there. That does not sound ideal. The other alternative is to make the bucket completely public, which is not acceptable because these are health care records.
Any help/advice will be greatly appreciated.

Securing Kentico Media Library file from direct URL access

I am trying to prevent unauthorized access to all files within some of my media libraries in Kentico v9.0. I intend to provide access to certain libraries via Kentico role membership and to global admins only.
I have followed the guide published here: Securing media libraries and it seems to work fine: Direct URL path access results in a 401.2 - Unauthorized message, regardless of user authentication state and role membership. And I have a media gallery web part set up correctly to provide the permanent URLs, which also work as I expect them to.
However, I have another requirement, which is to serve up a ~/googlesitemap.xml file. I followed the guide published here: Google Sitemaps, which instructs me to add an attribute on the system.webServer/modules node in the web.config:
<modules runAllManagedModulesForAllRequests="true">
...
</modules>
However, when I add the above, any authenticated user can gain access to my media library files if they enter the full URL. This defeats my attempt to restrict media library access by role membership, since I don't want users in other roles to be able to get at the files.
I have tried to exclude all media library files from the re-writing engine by adding my media folder root as an Excluded URL in Settings > URLs and SEO > URL Format > Excluded URLs... but this doesn't appear to help.
Any suggestions would be highly welcome!
As far as I know, the runAllManagedModulesForAllRequests attribute is mandatory only for extensions other than .aspx. If you are using IIS 7 or above, you can omit this attribute in your web.config (see source).
Note: In ASP.NET websites, the value of runAllManagedModulesForAllRequests previously had to be set to true to support routing. However, once IIS 7 has been updated with a Service Pack, the value of runAllManagedModulesForAllRequests can be set to false or omitted when working with ASP.NET routing. For more information, see ASP.NET Routing on the MSDN website.
So, the quick fix: do not add this attribute to your web.config, and your media gallery permissions should work as you wish.
EDIT: I think I have a solution for you. It seems that the runAllManagedModulesForAllRequests attribute kills the Anonymous Authentication setting, so Kentico serves the data after any successful authentication. I've found a workaround so you can forbid access to the media library. Try adding something like:
<location path="MySite/media/MyMediaLibrary">
  <system.web>
    <authorization>
      <deny users="*"/>
    </authorization>
  </system.web>
</location>
to your web.config, inside the configuration section.

Why do I need dash.js for streaming MPEG-DASH videos?

I'm new to HTML5 adaptive streaming, and the information out there is quite conflicting. I want to create a test environment on my Windows cloud server that streams a two-hour H.264 file, and play it on my local computer with an HTML5 player.
Question:
Why do I need dash.js to play MPEG-DASH video?
Is dash.js something I have to install on the server (sounds obvious) or on the client (sounds weird)?
DASH videos, like any other videos, involve two parts: a server serves the videos, and a player consumes them and presents them to the user. I will explain what is needed on both sides.
Serving DASH videos
Pieces of DASH videos can be delivered over HTTP or HTTPS by any modern web server - Apache, nginx, IIS and others. No plugin or additional software is needed on the server side to serve DASH videos - they are just files, and every web server knows how to serve files. You may need to do some configuration, however.
Most web servers have a list of MIME types they are allowed to serve - you usually need to add the DASH file types to this list, since the default settings tend to be restrictive for security reasons and do not allow DASH videos to be streamed.
Here is an example web.config for IIS that allows DASH videos to be served:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <staticContent>
      <remove fileExtension=".m4s" />
      <mimeMap fileExtension=".m4s" mimeType="video/mp4" />
      <remove fileExtension=".mpd" />
      <mimeMap fileExtension=".mpd" mimeType="application/dash+xml" />
      <remove fileExtension=".m4f" />
      <mimeMap fileExtension=".m4f" mimeType="video/mp4" />
      <remove fileExtension=".m4a" />
      <mimeMap fileExtension=".m4a" mimeType="video/mp4" />
    </staticContent>
  </system.webServer>
</configuration>
The different video/mp4 entries are there because different DASH encoders name their files differently.
Some DASH players, especially web-based ones, may also require the server to support cross-origin resource sharing (CORS). This is a security mechanism that lets you choose which sites may consume your content, helping prevent malicious websites from misusing it. The exact CORS headers your server needs to provide also depend on the player - in some situations, additional headers are used and must be explicitly enabled. I will leave the details of CORS out of scope of this answer. Here is a simple example IIS configuration that allows any website to consume the served videos:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="Access-Control-Allow-Origin" value="*" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
Playing DASH videos
You need a player, obviously. Different types of players exist: stand-alone desktop apps (e.g. VLC), player SDKs for Android/iOS apps (e.g. ExoPlayer and the Microsoft PlayReady Client SDK) and players for websites (e.g. dash.js and Bitdash). On Windows 10, Internet Explorer will even include a built-in player for DASH videos.
This is where dash.js comes in - it is a player. You put it in your website if you want your website to play videos; other web-based players are available as well.
Depending on how you wish to offer content to the end user, you choose a player and, if it is not a stand-alone player, embed it into your app or website. You provide the URL to the player and it does its thing. Simple.
Website-based players require the server to support CORS but stand-alone or app-hosted players do not require it.
Why you need dash.js for streaming MPEG-DASH videos
You need it because web browsers do not natively support DASH, as they are not required to do so. Web browsers are, however, required to support Media Source Extensions (MSE). For (newer) browser versions that do implement MSE, their 'basic' supported media sources like MP4 can be supplemented with DASH simply by including JavaScript libraries like dash.js. This is much more flexible (and future-proof) than the older routine of requiring users to install plugins like Flash Player to play non-basic media types.
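As a sketch of the kind of feature check a DASH player performs before relying on MSE (the function name is hypothetical; the `global` parameter stands in for the browser's window object so the logic can be exercised outside a browser):

```javascript
// Hypothetical feature check: is Media Source Extensions available?
// `global` stands in for the browser's window object.
function supportsMse(global) {
  return !!(global.MediaSource || global.WebKitMediaSource);
}

// In a real page you would call supportsMse(window) and fall back to a
// plain progressive MP4 when it returns false.
```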
Client-side setup
You also asked whether dash.js is something that needs to be installed server-side or client-side. Sander has written about any server-side setup that may be necessary to accommodate serving the files, so I'll add an explanation of how to implement it client-side.
From the dash.js GitHub page:
<script src="http://cdn.dashjs.org/latest/dash.all.min.js"></script>
...
<style>
  video {
    width: 640px;
    height: 360px;
  }
</style>
...
<body>
  <div>
    <video data-dashjs-player autoplay src="http://dash.edgesuite.net/envivio/EnvivioDash3/manifest.mpd" controls></video>
  </div>
</body>
Note that if you want to do Clear Key encryption too, you'll need to serve both the video file and dash.all.min.js from a secure context (e.g. TLS). And if you want to use XHTML rather than HTML, you'll need to add ="true" after each boolean attribute on the <video> element.

How can I identify web requests created when someone links to my site from Google+?

To comply with new EU legislation regarding cookies, we've had to implement cookie 'warning' banners in several places on our site. We're now seeing problems when users try to link/embed content from our site on Facebook, Google+, bit.ly and so on - the embedded page thumbnail shows the cookie notification banner instead of a preview of the actual page.
Most of these sites use a distinctive user agent string, so we can easily identify their incoming requests and disable the cookie banner - but Google+ seems to identify itself as
Mozilla/5.0 (Windows NT 6.1; rv:6.0) Gecko/20110814 Firefox/6.0
which makes it very difficult to distinguish between Google+ automated requests and browsing traffic from 'real' users. Is there any way at all I can identify these requests? Custom HTTP headers I can look for or anything?
G+ does have its own user agent. It contains the text: Google (+https://developers.google.com/+/web/snippet/).
Ref: https://developers.google.com/+/web/snippet/#faq-snippet-useragent
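A minimal sketch of using that substring to skip the cookie banner for the snippet fetcher (the helper name is hypothetical):

```javascript
// Hypothetical helper: detect the Google+ snippet fetcher by the substring
// documented in Google's snippet FAQ, so the cookie banner can be skipped.
var GOOGLE_SNIPPET_UA = 'Google (+https://developers.google.com/+/web/snippet/)';

function isGooglePlusFetcher(userAgent) {
  return userAgent.indexOf(GOOGLE_SNIPPET_UA) !== -1;
}
```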
There aren't any HTTP headers that you can depend on for detecting the +1 button page fetcher, but there is a feature request for it to use a distinctive user agent. In the meantime, I would not depend on it.
You can, however, use other means to configure the snippet of content that appears on Google+. If you add structured markup such as schema.org or OpenGraph to your pages, the +1 button will pull the snippet from those tags. Google+ provides a configuration tool and docs to help you design your markup.
If you add schema.org markup it might look something like this:
<body itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Shiny Trinket</h1>
  <img itemprop="image" src="http://example.com/trinket.jpg" />
  <p itemprop="description">Shiny trinkets are shiny.</p>
</body>

Silverlight cross-scheme access to jpegs denied

I've got a Silverlight 4 app that I'm running over HTTPS, deployed to Azure. Everything's working except for one small glitch. I've got content in the form of JPG thumbnails and associated ZIP files with a .gld extension. My app is supposed to display the thumbnails and allow the users to download the associated .gld/zip files.
The downloads work fine, but the thumbnails won't display in my UI. I get AG_E_NETWORK_ERROR from my ImageFailed handler. I do have a clientaccesspolicy.xml file in the root of my CDN domain. Fiddler does not show Silverlight accessing this file.
Here's the clientaccesspolicy.xml:
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from>
        <domain uri="*"/>
        <domain uri="http://*"/
        <domain uri="https://*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true" />
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
I'm stumped as to how Silverlight can access the downloadable content but not the image files. The app does pick up the thumbnails if I deploy them to blob storage in Azure (with the same clientaccesspolicy.xml file in the $root folder), but it would be ideal if I could keep the file structure that already exists on my CDN provider.
Everything displays perfectly if the Silverlight app runs in http rather than https.
Thanks in advance for any ideas!
Michael Conner
Thanks for all the suggestions - I have tried them all, but still no luck, sadly. The ImageFailed event for the bitmap doesn't give much detail, unfortunately. I think we may be up against the cross-scheme restriction on image files.
You are missing a ">" in line 7.
Check the URI used in Fiddler or a similar tool, and check that the images are being requested over HTTPS.
Try to implement more debug output - maybe like this:
You can trap the error: BitmapImage, Image, ImageBrush and MultiScaleImage all have an ImageFailed event. Just attach an event handler to it.
or try this:
<allow-from http-request-headers="*">
I know it's a very old question, but in case someone is still stuck in a similar situation:
Silverlight does not allow cross-scheme access. You will have to use either HTTP on both ends or HTTPS on both ends. From MSDN: URL Access Restrictions in Silverlight:
If you want to host your Silverlight application and store your images on different servers, the restrictions are as follows:
You cannot store your images on a site that uses the HTTPS scheme if you are hosting your application on an HTTP site (cross-scheme).
You can store your images on a cross-domain site as long as the scheme of that cross-domain site is the same as the scheme of the site hosting your application.
Silverlight applications running on Windows cannot store images on a server in the Local Intranet zone if the application was downloaded from the Internet zone (cross-zone), except if the target domain is localhost.
You can redirect to another image URL as long as the URL uses the same scheme.
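The cross-scheme rule boils down to a scheme comparison between the hosting app's URL and the image URL, which matches the symptom above (thumbnails work over HTTP but fail when the app runs over HTTPS against an HTTP CDN). A minimal sketch, with a hypothetical helper name:

```javascript
// Hypothetical helper illustrating Silverlight's cross-scheme rule: an image
// URL is only loadable when its scheme matches the hosting app's scheme.
function sameScheme(appUrl, imageUrl) {
  return new URL(appUrl).protocol === new URL(imageUrl).protocol;
}
```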