Apple News Preview -- can't load articles - macos-mojave

I am attempting to use News Preview to check out my Apple News articles, but I keep getting the error "This document couldn't be opened. For details, see the error console...". The error console seems to be completely blank.
I have tried this with article JSON taken directly from my server and also with JSON downloaded from iCloud News Publisher. The articles in question display fine on my test news channels in the Apple News app.
Has anyone had this issue before?
Edit: I have the latest version of Xcode installed. The same result occurs when I try to load articles downloaded from Apple's documentation.

For the next person who runs into this: my problem was that my local https URLs (links and images) weren't using a valid certificate. Replacing them with http fixed the issue.
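In case it helps, a rough Python sketch of that kind of rewrite might look like the following; the file name and local host are placeholders for your own setup, not something News Preview prescribes.

    import json
    import re

    # Rewrite local https:// URLs in an ANF article.json to http:// so News
    # Preview doesn't reject assets served with an untrusted certificate.
    # "article.json" and "local.dev" are placeholders for your own setup.
    def rewrite_local_urls(path="article.json", local_host="local.dev"):
        with open(path, "r", encoding="utf-8") as f:
            text = f.read()

        # Only touch URLs pointing at the local host; leave external links alone.
        fixed = re.sub(r"https://" + re.escape(local_host), "http://" + local_host, text)

        # Confirm the result is still valid JSON before writing it back.
        json.loads(fixed)

        with open(path, "w", encoding="utf-8") as f:
            f.write(fixed)

    rewrite_local_urls()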

Typically this happens when the JSON is malformed or when an ANF property is used incorrectly.
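A quick way to rule out the malformed-JSON case is simply to try parsing the file, for example with a small script like the one below (it only catches syntax errors, not misuse of an ANF property).

    import json
    import sys

    # Sanity check for the malformed-JSON case: parse article.json and report
    # where parsing fails. Misused ANF properties are not detected here.
    def check_article(path="article.json"):
        try:
            with open(path, "r", encoding="utf-8") as f:
                json.load(f)
        except json.JSONDecodeError as err:
            print(f"Invalid JSON at line {err.lineno}, column {err.colno}: {err.msg}")
            sys.exit(1)
        print("JSON parses cleanly; if loading still fails, suspect an ANF property.")

    check_article()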

Related

Google Search Console fails to fetch sitemaps | "Sitemap could not be read"

I generated a sitemap with an online generator. It seems to be working, and I even tested it with the old Google Search Console sitemap tester, where it passes. But when I submit it in either version of the console, it just displays an error message.
This is a known bug. See this Google support answer.
In my case, the sitemap itself had a syntax error.
Open the sitemap in Firefox; it will tell you if there is a syntax error.
Your sitemap's domain may have changed. If the site runs WordPress, use the Yoast plugin; Search Console will then pick up sitemap.xml automatically.
I had the same problem, and the solution was very simple: just provide the full path to your sitemap.
Where the console asks 'add new sitemap', instead of writing /sitemap.xml, write the full path, such as https://example.com/sitemap.xml.
That should fix the problem.
Using the Yoast SEO plugin, which built out 10 sitemaps, the index got read the first time but only one of the sub-sitemaps did. I manually visited the other sitemaps (I suspected they had taken too long to respond), then deleted the sitemap in Google Search Console and re-uploaded it. All were read that time.
I had this issue, and it was because I didn't set the Content-Type header to application/xml.
This sitemap validator notified me of the issue: https://www.xml-sitemaps.com/validate-xml-sitemap.html
Enter the full URL of your sitemap, e.g., https://example.com/sitemap.xml. Also, ensure your sitemap's file name does not include numbers or symbols.
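If you suspect the Content-Type issue mentioned above, a minimal sketch of serving the sitemap with an explicit application/xml header is shown below (Python/Flask, with a placeholder file path); your own server or framework will differ, but the idea is the same.

    from flask import Flask, Response

    app = Flask(__name__)

    # Serve the sitemap with an explicit Content-Type of application/xml so
    # Search Console doesn't reject it. "static/sitemap.xml" is a placeholder
    # for wherever your generator writes the file.
    @app.route("/sitemap.xml")
    def sitemap():
        with open("static/sitemap.xml", "rb") as f:
            return Response(f.read(), mimetype="application/xml")

    if __name__ == "__main__":
        app.run()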

SONOS Playback error "These Songs are not available for Streaming from APPNAME"

I've been preparing a POC to integrate our music service with SONOS, and I've written a simple service for testing purposes. I've implemented the three essential methods needed to play a URL: "getMetadata", "getMediaMetadata", and "getMediaURI".
First I tried media type "track" and returned a hard-coded song URL in .mp4 format from the "getMediaURI" method; it worked fine, as expected.
Later, when I tried a 7digital URL, playback failed with "These Songs are not available for Streaming from APPNAME". I've also tried changing the MIME type, but nothing seems to work. Type: audio/x-m4a
Note: the same 7digital URL plays fine in a browser.
Am I doing anything wrong here? Am I missing anything? Any help is really appreciated.
Thanks.
Looking at the documentation on Sonos' developer website, it doesn't seem that audio/x-m4a is a supported MIME type. Do you know the audio format of 7digital's track for sure? If it's MP4 or M4A, I would try setting the MIME type to one of these: audio/mp4, audio/aac, application/x-mpegURL, application/vnd.apple.mpegURL, audio/x-mpegurl.
Also make sure that your track's sampling is supported, as described in the table at the link below.
Link: http://musicpartners.sonos.com/node/464
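As a quick self-check, here is a small Python sketch that tests whether the MIME type your getMediaMetadata/getMediaURI response advertises is one of the candidates listed above; treat the set as a working assumption and the Sonos table at the link as the authoritative list.

    # MIME types suggested above as candidates for MP4/M4A audio on Sonos.
    # This set is only a working assumption; see the table at
    # http://musicpartners.sonos.com/node/464 for the authoritative list.
    SUGGESTED_SONOS_MIME_TYPES = {
        "audio/mp4",
        "audio/aac",
        "application/x-mpegURL",
        "application/vnd.apple.mpegURL",
        "audio/x-mpegurl",
    }

    def is_probably_supported(mime_type: str) -> bool:
        """True if the MIME type is in the suggested set (case-insensitive)."""
        return mime_type.lower() in {m.lower() for m in SUGGESTED_SONOS_MIME_TYPES}

    print(is_probably_supported("audio/x-m4a"))  # False -- likely the problem here
    print(is_probably_supported("audio/mp4"))    # True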

Truncated results when downloading general catalog

I have inherited a project built by another developer that attempts to download and process the general catalog from the Netflix API (REST endpoint: catalog/titles/full?v=2.0). My client is complaining that the results are truncated (maybe even by 50%). I do not receive any error message during the download. Why would the response contain a partial dataset and how might I fix it?
Side note: I couldn't find a NETFLIXAPI tag and don't have enough reputation points to add it. Maybe someone else can.
Two points:
1. Up until yesterday I downloaded the catalog every day and never saw it get truncated.
2. More importantly, Netflix has deprecated the full catalog, so you need to fetch catalog/dvd and/or catalog/streaming instead.
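It is also worth ruling out the connection simply being dropped partway through such a large response. A rough Python sketch of a more defensive download is below: it streams the response to disk instead of buffering it and compares the bytes written against Content-Length. The URL is a placeholder, since the real request has to be OAuth-signed, which is omitted here.

    import requests

    # Placeholder for the fully OAuth-signed URL of the catalog/streaming
    # (or catalog/dvd) feed mentioned above; signing is omitted in this sketch.
    SIGNED_URL = "https://example.invalid/signed-netflix-catalog-url"

    def download_catalog(url, out_path="catalog.xml"):
        """Stream the catalog to disk and compare bytes written to Content-Length."""
        written = 0
        with requests.get(url, stream=True, timeout=600) as resp:
            resp.raise_for_status()
            expected = resp.headers.get("Content-Length")
            with open(out_path, "wb") as f:
                for chunk in resp.iter_content(chunk_size=1 << 20):
                    f.write(chunk)
                    written += len(chunk)
        # A mismatch here points to the response being cut off mid-download.
        print(f"Content-Length: {expected}, bytes written: {written}")

    # download_catalog(SIGNED_URL)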

Google Contacts API 404 photo upload

Using the Contacts API v3, I had a working implementation for uploading a photo to an existing contact.
For a couple of weeks now, this has been failing with 404. The implementation was not changed when the API servers started sending back 404s, and I see no indication of what exactly changed that would now result in the 404s.
I'm using HTTP PUT + the photo URL of the contact.
One interesting observation I made is that the contact's self-URL changes with each request (the details provided are still always the same and correct).
Did anyone notice something similar?
Edit: Link to issue: http://code.google.com/a/google.com/p/apps-api-issues/issues/detail?id=3301&q=contact&colspec=API%20ID%20Type%20Status%20Priority%20Stars%20Opened%20Summary
I tried different photo formats and sizes, different content types, and even photos that had been uploaded previously (when it was still working). Nothing changed the behaviour of returning 404.
Regarding the changing contact IDs: the contact ID changes between API invocations. At first I thought the changing IDs could be related to reopened connections (no keep-alive). However, what speaks against this being the cause is that first retrieving a contact and then editing its address works without any issues.
Authentication does not seem to be the problem either; otherwise, editing a contact's address would not work.
PS: I'm using the JSON output format when retrieving the contact.
PS2: s/GET/PUT in step 3 (I tried changing PUT to GET to see if it still returns 404... which it does).
PS3: I am not using any client library but implement the protocol directly (which should not be relevant for the HTTP PUT on the photo link).
After hours of investigation I found out that this is specifically an issue with OAuth1. With OAuth2, the exact same photo links that had been returned when requesting a specific contact record via OAuth1 work and return the photo data on HTTP GET. I expect HTTP PUT to photo links using OAuth2 to succeed as well.
It remains open whether there is some kind of workaround for OAuth1.
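For anyone trying the OAuth2 route, a rough sketch of such a PUT in Python is below; the photo link, token, GData-Version and If-Match headers are assumptions from my reading of the v3 docs, so adjust them to whatever your contact entries actually return.

    import requests

    # Placeholders: the photo link comes from the contact entry's link with
    # rel "...#photo"; the token is an OAuth2 access token with the contacts
    # scope. If your entries carry a photo etag, use it instead of "*".
    PHOTO_LINK = "https://www.google.com/m8/feeds/photos/media/default/<contactId>"
    ACCESS_TOKEN = "<oauth2-access-token>"

    def upload_photo(path="photo.jpg"):
        with open(path, "rb") as f:
            data = f.read()
        resp = requests.put(
            PHOTO_LINK,
            data=data,
            headers={
                "Authorization": "Bearer " + ACCESS_TOKEN,
                "Content-Type": "image/jpeg",
                "GData-Version": "3.0",
                "If-Match": "*",
            },
        )
        print(resp.status_code, resp.text[:200])

    # upload_photo()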

Graph API not returning image/picture for community pages

The Graph API is not returning an image (the "picture" attribute) for objects corresponding to community pages, which it used to return earlier. For example, https://graph.facebook.com/178790412179919 does not have a picture attribute, whereas the corresponding page has an image.
Also, the FQL query on the "albums" connection for some objects does not return a "cover_pid" attribute for the album of type "profile", which again used to work earlier.
Does anybody know if anything related to this has changed in the Graph API in the last couple of weeks? (I am fairly confident it used to work as expected.) I looked through the Facebook API release notes but could not find any changes corresponding to this. Please let me know if this post is not appropriate for this forum.
https://developers.facebook.com/docs/reference/api/page/
picture is a connection, not an attribute. So ...
https://graph.facebook.com/178790412179919/picture
And as the docs say: Returns a HTTP 302 with the URL of the user's profile picture.
Kinda goofy? Yes, but it works exactly as the docs say it does. I suspect they implemented it this way so it could easily be used in an <IMG> tag.
UPDATE:
It still works via FQL. In your case:
https://api.facebook.com/method/fql.query?query=SELECT+page_id%2C+pic+FROM+page+WHERE+page_id+%3D+178790412179919&format=json
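If you want to resolve that 302 programmatically rather than dropping the URL into an <IMG> tag, a small Python sketch is below (the page ID is the one from the question; per the follow-up below, community pages may no longer return anything here).

    import requests

    # The /picture connection answers with a 302 rather than JSON, so read the
    # redirect target instead of the body. Page ID is the one from the question.
    PAGE_ID = "178790412179919"

    resp = requests.get(
        "https://graph.facebook.com/" + PAGE_ID + "/picture",
        allow_redirects=False,  # keep the 302 instead of following it
    )
    print(resp.status_code)              # expected: 302 when a picture exists
    print(resp.headers.get("Location"))  # URL of the page's picture, if any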
I can confirm that this PREVIOUSLY worked, but NO LONGER works. Facebook have removed the picture connection from Community Pages.
I suspect the reason is that most of these images are pulled from Wikipedia, and there was a licensing / attribution issue.
Unfortunately, Facebook is no longer a reliable source of images for entities (e.g. bands).