I have a site that uses web fonts. The site used to function properly until the client's IT team rolled out Windows 10 Pro across the organization. After the Windows upgrade, the web fonts stopped working in IE 11 with the error "CSS3111: @font-face encountered unknown error". However, on all other major browsers, including the new Edge, the site works without any issues.
I did some searching and came to know this is because of a recommended security feature called Untrusted Font Blocking, and that disabling it requires modifying certain registry keys. However, in my case that is not an option, as the feature is recommended by Microsoft for security, and the change would have to be made on each local machine.
While googling, I noticed some people suggesting encoding the font file and embedding it as a Base64 string. In fact, I could see that workaround mentioned on many Q&A sites and forums (e.g. here, here). But I failed when I attempted it: on my Win10 + IE11 setup I still get "CSS3111: @font-face encountered unknown error" (screenshot).
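For reference, this is roughly what I tried; the family name and the truncated Base64 payload are placeholders, not the real purchased font:
<style>
  @font-face {
    font-family: 'MyWebFont'; /* placeholder family name */
    /* the Base64 string is truncated here; the real value is the whole encoded .woff file */
    src: url('data:application/font-woff;base64,d09GRgABAAAAA...') format('woff');
    font-weight: normal;
    font-style: normal;
  }
  body { font-family: 'MyWebFont', Arial, sans-serif; }
</style>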
Further googling led me to this SO question, which actually addresses my problem. It says that even if I convert the font file to Base64, Win10 + IE11 will still block it when it is decoded and loaded into memory for execution.
Interestingly, I found that major font and icon vendors are not attempting to figure out a workaround for this either, as none of their websites show up properly on Win10 + IE11. Even Microsoft's own site (outlook.office.com) has this issue.
Now my questions are:
Is there any workaround that can help me fix the issue?
If there is no workaround, is it a good idea to use user-agent detection to show end users a warning popup asking them to switch to a better-supported browser?
Thanks in advance.
Related
Windows 10 has a feature that blocks any font outside the %windir%\Fonts directory from loading in a browser.
Our entire website (AEM + Apache web server) uses a proprietary purchased font, which is therefore blocked on all PCs. Is there any way to bypass this setting so that the font loads seamlessly for all users?
We need to do something on the AEM / Apache web server / Akamai CDN end, because asking end users to make local changes is not feasible.
I didn't find much help online for this.
The documentation says "By default, this feature is not turned on", so I would assume you are facing issues on company-issued laptops that have this security feature enabled.
Depending on your font's licensing, you could choose to do the following:
Use user-agent information to detect whether the OS is Windows 10
If Windows 10 is detected, show an overlay with a link to download and install the fonts (a rough sketch of this check follows below)
To make the font work with this setting enabled, the font has to be installed on each machine that has the Untrusted Font setting turned on, and the font should then be declared with a fallback.
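Here is a rough sketch of that detection and overlay; the element ID, the message text and the font download path are placeholders, not something your project already has. Windows 10 reports "Windows NT 10.0" in the user-agent string, and IE 11 reports "Trident/7.0".
<div id="font-warning" style="display: none; position: fixed; top: 0; left: 0; right: 0; padding: 10px; background: #fffae6;">
  Your browser is blocking our web fonts.
  <a href="/fonts/MyWebFont.ttf">Download and install the font</a> to see the site as intended.
</div>
<script>
  // Show the overlay only on Windows 10 + IE 11, where untrusted font blocking applies.
  var ua = window.navigator.userAgent;
  if (ua.indexOf('Windows NT 10.0') !== -1 && ua.indexOf('Trident/7.0') !== -1) {
    document.getElementById('font-warning').style.display = 'block';
  }
</script>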
The second option would be to whitelist the application, in this case the browser, as explained on the Microsoft website here: https://technet.microsoft.com/en-us/itpro/windows/keep-secure/block-untrusted-fonts-in-enterprise
I had a similar issue with Icomoon, but the concept is the same: install the font on the system and then declare a fallback font.
This might help you in the process:
http://maurizionapoleoni.com/blog/how-to-display-fontawesome-icomoon-and-font-icons-on-a-windows-10-with-blocked-untrusted-fonts/
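On the CSS side, one pattern that works once the font is installed locally is to list a local() source before the downloadable file and keep a fallback family in the stack; the names and path below are placeholders from my Icomoon-style setup, not something your project already defines:
<style>
  @font-face {
    font-family: 'icomoon'; /* placeholder family name */
    /* prefer the copy installed on the machine, then fall back to the downloadable file */
    src: local('icomoon'), url('/fonts/icomoon.woff') format('woff');
  }
  .icon { font-family: 'icomoon', Arial, sans-serif; }
</style>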
We are in the process of implementing Success Factors LMS, and we are trying to play and view SCORM-compatible files exported from Adobe Captivate 8 and 9 in it.
I get the message: 'ERROR – unable to acquire LMS API, content may not play properly and results may not be recorded. Please contact technical support'
I have tried SCORM versions 1.2 v3 and 2004 V2 and V4. We can view the content; however, it does not track, show as complete, etc.
We are also producing SCORM-compliant files using Skillcast and Articulate, but we still hit the same issue: we can view the content after closing the API error window, but it still does not track.
Anyone experienced this problem before? Or know of a fix?
Many thanks
Normally this issue comes up when the course is unable to get the SCORM API from the LMS... I have seen a ton of SCORM content running in Success Factors before, so I wonder if the issue is in the setup. Are you seeing any "Access Denied" type errors in the browser element inspector/developer tools? I wonder if the course just cannot find, or does not have access to, the player window. If the course is launching in a new window, you may want to try launching it in the frameset instead. I have seen folks get around this issue by making sure the player and the SCO are in the same window...
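For reference, this is roughly the standard SCORM 1.2 lookup a course runs at launch (just a sketch, not Captivate's exact code); if the player window cannot be found or reached, this is where the course gives up and shows that error:
<script>
  // Walk up the parent chain of the launch window looking for the SCORM 1.2 "API" object.
  // SCORM 2004 content looks for "API_1484_11" instead. If a parent frame is on a different
  // domain, reading a property on it throws an "Access Denied" style error rather than
  // returning the API.
  function findAPI(win) {
    var attempts = 0;
    while (!win.API && win.parent && win.parent !== win && attempts < 10) {
      attempts++;
      win = win.parent;
    }
    return win.API || null;
  }
  var api = findAPI(window) || (window.opener ? findAPI(window.opener) : null);
  if (api) {
    api.LMSInitialize(''); // start the tracking session
  } else {
    // this is the situation that produces the "unable to acquire LMS API" message
  }
</script>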
If you want to rule out the content being the issue, you can always test it in SCORM Cloud's free sandbox (https://cloud.scorm.com) to make sure the course is properly asking for the API...
If you have any other questions, we would be happy to help... you can just shoot us an email at support@scorm.com.
Thank you!
Joe
The error occurs because the content is not talking to the Learning Management System (LMS). The code that initializes the session never completes; there is no return "ping" from the LMS.
You will get this error when you publish to SCORM and run from your desktop, or from a web server that isn't connected to an LMS. If it occurs when you are launching from an LMS, it can mean either that the SCORM API isn't configured correctly, or that your content server is on a different domain (cross-domain) than your application servers.
To test, try launching your content in different browsers. Our system was configured in such a way that Firefox and Chrome treated our content as cross-domain and threw the SCORM API error, but Internet Explorer worked just fine.
In the end, it was determined that our server configuration, in tandem with our firewall and security settings, treated the content server as cross-domain, and we had to redeploy our content servers within the firewall.
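One quick way to confirm you are in the same situation (this is just a diagnostic sketch, not anything the LMS ships) is to drop a snippet like this into the launched content and watch the browser console:
<script>
  // Temporary diagnostic: on a same-domain launch this logs the player window's URL;
  // on a cross-domain launch it logs a security/access-denied error instead.
  try {
    console.log('Player window: ' + window.parent.location.href);
  } catch (e) {
    console.log('Cross-domain launch, cannot reach the player window: ' + e.message);
  }
</script>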
I'm in the process of developing a Mobile version of some websites using the MVC4 *.mobile.cshtml system.
Everything is working great except when I try to view the sites on my HTC 8X. I get an outrageously large viewport, no JavaScript executing, offline touch regions and an all-around incomplete page load, such that the site doesn't function at all. I'm trying to diagnose/debug this and not having any luck.
Using IE10 locally with a User Agent string for Windows Phone 8 doesn't show the same behavior. Further, using my exact UA string in any browser locally does not replicate the behavior. I've tried the various viewport workarounds posted on the internet and those have had no impact either.
I'm not on a Windows 8 machine, so I can't install the SDK/Emulator, but I suppose I could upgrade if no other options present themselves.
Anyone have any additional ideas as to how to test/diagnose/replicate this? I've been Googling for days and haven't been able to find any significant resource about this sort of thing.
This was a combination issue with Output Caching and the MVC DisplayModes bug.
I am using Google Doc Viewer to display local PDFs on our site (as an override for IE 7 and 8 because of a permissions issue): http://www.scad.edu/news/index.cfm?pageid=338423. The issue is that sometimes people have seen the PDF not being displayed, replaced instead with an iframed Google sign-in page. I am having a lot of difficulty reproducing these results: I have tried several browsers, cleared my entire browser cache, and used off-site browsers such as Adobe BrowserLabs and BrowserShots, etc. I know the error is occurring on an OS X 10.5 machine running Firefox 3.5.2 and on another machine with similar software. The not-so-technical personnel are claiming it happens after not having signed into Google for a week or so, but it displays fine for me when I am signed out on a fresh install. And yes, I have witnessed the issue several times on their machines but simply cannot reproduce it.
Please advise on how to hunt down this bug. I can't find anyone else with the same issue. I am considering just switching to Scribd PDF viewer.
The not-so-technical personnel may actually be on to something. This issue seems to occur when a Google user's session expires. The user is presented with the login form instead of the document, even when the viewer is embedded in an iframe. Apparently it's still an unresolved bug. Check out this question for more details:
Embedded Google Docs PDF viewer displays login page rather than PDF
I hope they fix this :)
This is related to permissions. Check your permissions and try again; I think the viewer can reach your files if you give it permission.
Just do this prior to loading the Google Docs viewer; it should solve the problem every time.
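<!-- Hidden iframe that signs the visitor out of Google before the viewer loads, so a stale session can't trigger the sign-in page -->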
<iframe id="google-signout"
src="https://www.google.com/accounts/Logout"
style="width: 943px; height: 1px; display: none;"
frameborder="0"></iframe>
I am part of the developer team for quite a large online system using ASP.NET (4).
ASP.NET Ajax completely breaks down in WebKit browsers, and we are getting full-page postbacks when we should be getting partial postbacks for the UpdatePanels only.
I am starting to believe it has something to do with my application configuration, mainly for the following reasons:
If I move the Ajax-enabled controls to a new project, they work as expected in all browsers, including WebKit.
I created a static .aspx file with nothing but an UpdatePanel, a ScriptManager and a button that makes a Literal visible on click.
I get no JavaScript errors in any browser, and I see an HTTP request for the ASP.NET Ajax script (ScriptResource.axd) in both Firebug and the Chrome developer tools.
I tried the old Safari fix from this highly referenced thread.
Edit: After a bit more testing and HTTP sniffing, I noticed a major difference between the test application and the actual application. The test application generates two additional .axd files which are not generated by the actual application. These WebResource.axd files seem to contain data related to the async postback. However, this is only the case for WebKit browsers; the WebResource.axd files are generated for Firefox, as I can see them in Firebug.
What I am asking the community for is any ideas or suggestions as to what could be the cause of this problem, and whether I am correct to assume that the problem is probably on the server side.
Thanks for any help
The problem was due to a deprecated config file used to limit the content that bots/spiders/crawlers receive, which was being loaded by mistake thanks to our lovely in-house CMS.
In short, if you get behavior similar to my case, check your configs.
I was having a similar issue; however, my problem was with all browsers and not just WebKit. I ended up going through and tearing apart the web.config file and found that the line <xhtmlConformance mode="Legacy"> was preventing WebResource.axd from working properly. The fix was to simply remove that line from my web.config file.
For a little more information on xhtmlConformance, visit http://technet.microsoft.com/en-us/librarY/ms228268(v=vs.85).aspx.
If you scroll all the way to the bottom, you'll notice it explicitly states that this setting causes issues with WebResource.axd and ScriptResource.axd.