I read this page, https://developers.google.com/youtube/youtube-api-list, titled "YouTube APIs Subject to the Deprecation Policy".
The YouTube IFrame Player API is listed on that page.
Does this mean that this API is deprecated?
I hope someone can shed some light on this.
Thanks
I think that section 7 of the YouTube API Terms of Service helps clarify things a bit:
Google will announce if it intends to discontinue or make backwards incompatible changes to this API or Service. Google will use commercially reasonable efforts to continue to operate those YouTube API versions and features identified at http://developers.google.com/youtube/youtube-api-list without these changes until the later of: (i) one year after the announcement or (ii) April 20, 2015, unless (as Google determines in its reasonable good faith judgment): required by law or third party relationship (including if there is a change in applicable law or relationship), or doing so could create a security risk or substantial economic or material technical burden.
So in other words, there's not an announced deprecation of those APIs on the list you pointed to, but they reserve the right to announce deprecations that fall under the policy above. Here are the API technologies that have been officially deprecated:
https://developers.google.com/youtube/2.0/developers_guide_protocol_deprecated
If we remove "powered by Google" while using the Google Translate API, does that violate Google's terms and conditions? It is given in the attribution requirements, but I am not sure whether it is legal to do so.
You are required to leave the "Powered by Google" attribution, which is part of the terms you accepted in order to use the API. According to the documentation,
Use of these APIs is governed by the Terms of Service. Among other things, these Terms require that you adhere to certain guidelines on how layout, Google attribution, and branding must be handled on your site. This document and the HTML Markup Requirements are intended to help you meet these requirements.
Since you signed/accepted these terms, you are required to follow them. In addition, in the documentation is also mentioned,
If you are uncomfortable with any of these branding guidelines, discontinue your use of the API, and contact us with your concerns.
For this reason, you should contact Google and discontinue use of the API if you do not wish to follow all the requirements.
Lastly, under Attribution and logos,
In addition to following Google's general Brand Features guidelines, you are also required to adopt certain branding elements when using the Cloud Translation API.
...
The "powered by Google Translate" graphic must always be displayed adjacent any translation results.
I'm currently working at an online payment company, and I need to implement an access control system. I used XACML for experimental purposes two years ago in a management system (based on Balana's XACML implementation). I noticed the XACML Version 3 specification hasn't been updated since January 2013, and I wonder whether this specification is still under maintenance. If not, does anyone know of an alternative?
What David says is correct. In addition, the OASIS XACML Technical Committee (TC) has just voted to hold a public review of Errata for XACML 3.0. The review should start within a few days. The corrections are minor, but it does show we are maintaining the documents and getting input from the field.
Although no one is currently working on them, there are several unfinished Profiles I would like to see completed. One is to extend the JSON format for XACML to cover the policy language; it currently covers only the decision request protocol. Another is the ALFA policy language, a more user-friendly, JSON-like language originally developed by Axiomatics and endorsed by the TC.
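To give a feel for that decision request protocol, here is a rough sketch of sending a JSON-profile request to a PDP. The PDP endpoint URL is made up, but the body follows the general shape defined by the JSON Profile of XACML 3.0 (a "Request" object with per-category "Attribute" arrays); treat it as illustrative rather than a reference implementation.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class XacmlJsonRequestExample {
    public static void main(String[] args) throws Exception {
        // Minimal XACML 3.0 JSON-profile decision request:
        // may subject "alice" perform action "view" on resource "record-123"?
        String requestBody = """
            {
              "Request": {
                "AccessSubject": {
                  "Attribute": [
                    { "AttributeId": "urn:oasis:names:tc:xacml:1.0:subject:subject-id",
                      "Value": "alice" }
                  ]
                },
                "Action": {
                  "Attribute": [
                    { "AttributeId": "urn:oasis:names:tc:xacml:1.0:action:action-id",
                      "Value": "view" }
                  ]
                },
                "Resource": {
                  "Attribute": [
                    { "AttributeId": "urn:oasis:names:tc:xacml:1.0:resource:resource-id",
                      "Value": "record-123" }
                  ]
                }
              }
            }
            """;

        // The PDP endpoint below is hypothetical; substitute your PDP's actual URL.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://pdp.example.com/authorize"))
                .header("Content-Type", "application/xacml+json")
                .POST(HttpRequest.BodyPublishers.ofString(requestBody))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response carries a "Decision" of Permit, Deny,
        // NotApplicable or Indeterminate.
        System.out.println(response.body());
    }
}
```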
For people who want to use XACML, in addition to several excellent commercial products, there are at least two other open source implementations besides the WSO2/Balana one mentioned above. ForgeRock has one, and there is another originally developed in-house at AT&T. The latter was contributed to the Apache Incubator, but failed to gain traction and was mothballed; however, the original code is still freely available under the Apache license.
Finally I should mention that I have proposed various ways to integrate XACML with token-based authorization schemes such as OAuth. However this has not gone past the research stage.
Yes, XACML is still very much active. The standard, at version 3.0, is mature, and right now no one is working on XACML 4.0. Given that XACML 3.0 is a standard, no changes will be made to 3.0 itself; either we go to 3.1 or 4.0. There are enhancements we are thinking of for a 4.0 version, but this is not the focus for now.
The focus is on profiles, both technical profiles (such as the JSON profile of XACML) and business profiles (such as the Export Control profile of XACML).
Disclaimer: I work for Axiomatics, the leading XACML implementation. I am also a member of the XACML Technical Committee.
We see more and more requests for Attribute Based Access Control and XACML in the marketplace, especially in financial services and healthcare.
I'm wondering about the additional complexities involved in integrating with Auth0, versus the plenty of code already available for password complexity rules, UI, etc. (including the snowflake starter app) for authentication/user creation with the open source Parse Server.
I am sure plenty of people have thought about this, and I was wondering what the consensus is. There is the requirement to keep profile/email in sync, the reliance on a third party, the inability to customize the view, and I am sure many other issues.
At first I thought this was great, since I would not need to worry about a lot of things, yet there are a lot of other things to worry about, along with not being able to customize.
What are the most convincing "PRO" answers?
Disclosure: I'm an Auth0 engineer.
TL;DR: I can talk about the pros and cons, but the definitive answer needs to be provided by you.
A bit about Auth0
Auth0 supports the authentication protocols in most widespread use (OAuth2/OIDC, SAML and WS-Federation), so integration into custom software that speaks these protocols, or can be made to speak them through available libraries, is relatively easy and friction-free. Side note: Parse Server does seem to support integration with OAuth-compliant identity providers.
It can be used as a standalone identity provider where your users register and authenticate with username/password credentials specific to your application, but it can also integrate with a lot of downstream identity providers such as Google, GitHub and Twitter. This makes it really easy to enable different methods of authenticating users just by configuration, instead of having to talk directly to each individual provider and deal with their implementation discrepancies.
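To make the protocol side concrete, here is a minimal sketch of the first step of a standard OAuth2/OIDC authorization-code flow against such a provider. The tenant domain, client ID, callback URL and state value are placeholders, and in practice you would usually let an SDK build this URL for you.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class AuthorizeUrlExample {
    public static void main(String[] args) {
        // Placeholder values -- replace with your own tenant and application settings.
        String domain = "https://your-tenant.example.com";
        String clientId = "YOUR_CLIENT_ID";
        String redirectUri = "https://app.example.com/callback";

        // Standard OAuth2/OIDC authorization-code request: the user is sent to the
        // identity provider, authenticates there, and comes back to redirectUri
        // with a one-time code that your backend exchanges for tokens.
        String authorizeUrl = domain + "/authorize"
                + "?response_type=code"
                + "&client_id=" + URLEncoder.encode(clientId, StandardCharsets.UTF_8)
                + "&redirect_uri=" + URLEncoder.encode(redirectUri, StandardCharsets.UTF_8)
                + "&scope=" + URLEncoder.encode("openid profile email", StandardCharsets.UTF_8)
                + "&state=" + URLEncoder.encode("opaque-anti-csrf-value", StandardCharsets.UTF_8);

        System.out.println(authorizeUrl);
    }
}
```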
Finally, it provides enough extensibility points for you to still have a significant degree of control over the authentication experience, for example:
Rules (JS code) allow you to customize the authentication process
Customization of the Auth0-provided authentication user interface allows you to keep your own branding
Customization of Lock allows you to have a custom authentication experience integrated into your own app really quickly and with very little effort
Of course, no matter how many extension points there are, there's always some stuff that you will not be able to control. This can be seen as bad, but sometimes it's actually a good thing; it depends on the perspective and your specific requirements.
Comparison - Roll your own (RYO) vs Third-party service
On one side you'll have:
cost of acquisition of the service
cost of integration of the service with your product
On the other side you'll have:
cost of implementing the features you need
cost of ownership of those features
cost of integration of the new features in your product
In both cases you'll need some integration work in order to make all the parts fit, no matter who created the parts. You could argue that if you are the author of everything it will be easier to fit a square peg in a round hole, so let's say RYO wins by a small margin on that point.
It then all comes down to comparing the cost of acquisition against the cost of implementation and ownership. I can't answer that one, but the cost of acquisition is generally easy to calculate, while the cost of implementing software is very hard to predict; on top of that, owning a custom authentication solution also takes a heavy toll... you know what they say, no one ever got fired for buying IBM. I won't go that far, but it's safe to say it's easier to find yourself in a pickle if you roll your own security. :)
Conclusions
Go through the Auth0 trial, play with it and see what it has to offer and how that matches your requirements.
If you find something you're not able to accomplish leave a question here tagged auth0 or on Auth0 Forums.
FastLink looks good stand-alone and in an 800x600 iFrame in the desktop browser, but I'm hoping there are some more mobile-friendly configurations that I just missed in the docs.
I see the access_type and displayMode parameters here, which would imply that what I'm hoping for is at least a possibility:
http://developer.yodlee.com/Aggregation_API/Aggregation_Services_Guide/FastLink_for_Aggregation/Yodlee_FastLink_Integration_Guide
I've been unable to find any other reference to those parameters in the docs, however, or more detail with regard to layout options.
Are there some other valid values for those parameters other than what's listed there in the Integration Guide, and/or some more detailed docs besides the integration guide and product guide?
FastLink looks like it has the potential to save significant unanticipated work on account setup, especially MFA -- I'm hoping we can get the FastLink UX to gel nicely enough with our own UX to not have to invest in rolling our own.
Currently there is no mobile/responsive version of FastLink available, although it is in the works and will be available in the near future (no specific ETA right now).
I'm currently working on the "API for developers" feature of our product.
The first version was released and it has a small number of users at the moment. While developing its second version, some parts were reworked and some parts were removed to make the API more elegant and clear.
But deploying the second version could be a pain for users of the old version.
Our marketing department is planning to enhance our API product a lot and add more features to it.
How should I build the system so that:
1) we wouldn't be constrained by the "old version" when adding new interesting features
2) current API users won't be dissatisfied because of the need to rework their systems in order to comply with the changed API
Or should the API product be tested in a sandbox for quite a long period of time before the public release, so that there wouldn't be any significant modifications to the specification?
When you have to make changes to the API which already has some users, probably the best route is to deprecate the old API calls and encourage use of the new calls.
Removing the old API calls outright would probably break old code, so that is likely to leave some developers using your "old" API somewhat dissatisfied.
If your language provides ways to indicate that certain functionality has been deprecated, it can serve as an indication for users to stop using old API calls and transition to new calls instead. In Java, the @deprecated javadoc tag can note in the documentation that a feature has been deprecated, and from Java 5 the @Deprecated annotation can be used to raise compile-time warnings on calls to deprecated API features.
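As a small illustration (the class and method names here are made up), this is what that looks like in Java: the javadoc tag documents the replacement, and the annotation makes the compiler warn anyone who still calls the old method.

```java
import java.util.List;

public class UserApi {

    /**
     * @deprecated As of API version 2.0, use {@link #listUsers()} instead.
     */
    @Deprecated
    public List<String> getUserList() {
        // Keep old callers working by delegating to the new call.
        return listUsers();
    }

    /** Preferred since 2.0. */
    public List<String> listUsers() {
        return List.of("alice", "bob");   // placeholder data for this sketch
    }
}
```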
Also, it would probably be a good idea to provide some tips and hints on migrating from the old API to the new API to encourage people to use the new way of interacting with it. With examples and sample code showing what to do and what not to do, users of the API will be able to write code according to the new, preferred way.
It's going to be difficult to change a public API, but with some care taken in the transition from old to new, I believe the amount of pain inflicted on the users of the API can be mitigated to a certain extent.
Here's an article on How and When to Deprecate APIs from Sun, which might provide some more information on when it would be appropriate to deprecate parts of APIs.
Also, thank you to David Schmitt, who added that the Obsolete attribute in .NET is similar to the @Deprecated annotation in Java. (Unfortunately his edit was overwritten by my edit, as we were both editing this answer at the same time.)
Microsoft is pretty famous for their insane backwards compatibility. One of the things they did was to keep all the old, obsolete calls and add new ones that new programs could use to access the enhanced features that could not be worked into the old API.
You did not specify which programming language you use, but both .NET and Java have a mechanism to mark certain API calls as obsolete. If backward compatibility is very important to you, you might want to take the same route.
It's a balance you will have to strike with your community:
Keep old functions for aeons and you'll end up with the Win32 API (30,000 public functions).
Rewrite the API all the time, and you get something similar to .NET, where a new revision comes out every so often (1.0, 2.0, 3.0, 3.5...) and breaks existing programs or introduces new and improved ways of doing UIs, etc.
If the community is tolerant of change and open to experimenting, you will strive for a lean, current API and know that some breakage, aka bit rot, will result. If, on the other hand, the community has tons of legacy code and no resources or desire to bring it up to the latest version, you must keep backward compatibility or all of their stuff will simply not work on the new API.
Note to one of the other answers: deprecating APIs is an often-used way of indicating which functions are "on the way out", but as long as they work, many developers will use them even in the new code because those are the functions they are used to. There are very few enlightened developers that have both the awareness to actually heed "Deprecated" warnings and the time to search the code for other instances of the old API and update them.
Backward compatibility should be the default. The only reason to compromise this principle is when the API is somehow insecure, forcing users to change to more secure methods.
Ideally, applications written against your original API will continue to work with the new version.
One way to add new features while at the same time making sure that old applications continue to run is to have two versions of an API call.
For example, suppose you currently have a function Foo that takes 2 parameters (arguments) in the API, but you decide the new version really should take 3 parameters. Keep Foo the way it is and add a new function Foo2 which takes 3 parameters.
That way users can continue to code against Foo for backward compatibility or use the new Foo2 function if they require the new features.
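A minimal sketch of that pattern (the names and types are just illustrative): the original two-parameter call stays exactly as it was, and the new call adds the third parameter, with the old one delegating to the new one using a sensible default.

```java
public class WidgetApi {

    /** Original v1 call: signature is left untouched so existing clients keep working. */
    public String foo(String id, int count) {
        // Delegate to the new call, supplying a default for the parameter added in v2.
        return foo2(id, count, "plain");
    }

    /** New v2 call with the extra format parameter. */
    public String foo2(String id, int count, String format) {
        return String.format("%d x %s rendered as %s", count, id, format);
    }
}
```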
This technique has been commonly used by Microsoft for the Windows APIs.