I have seen this error even when the API has been enabled for all the orgs in question. All I need is a read-only data dump, for which I figured the "Read your data via the web" permission is the one that needs to be granted during OAuth. When this permission was omitted from the app spec, the Salesforce API threw an "API_DISABLED_FOR_ORG" error. After changing the spec and asking for the web permission to be granted, the error goes away for some orgs.
Other orgs still throw the same error, which leads me to believe the error is misleading: either the first situation was really a different error, or the org has actually disabled API access (which I don't need; I just need read-only data access). Yet, when asked during OAuth, the user actually agreed to supply the data. My scope for the authenticated OAuth user is: "id api web refresh_token".
What gives? What am I missing here? I can't seem to get a consistent process flow, consistent data, or, in this case, a proper error message from Salesforce.
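To make the flow concrete, here is a rough Python sketch of what I am requesting and calling (web server flow assumed); the client ID, redirect URI, API version and query below are placeholders, not the real values from my app:

import requests

# Authorization request the user is sent to; scope matches what I described above.
AUTHORIZE_URL = (
    "https://login.salesforce.com/services/oauth2/authorize"
    "?response_type=code"
    "&client_id=MY_CLIENT_ID"
    "&redirect_uri=https%3A%2F%2Fexample.com%2Fcallback"
    "&scope=id%20api%20web%20refresh_token"
)

def read_only_dump(instance_url: str, access_token: str) -> dict:
    # Read-only query via the REST API after the token exchange (example version/query).
    resp = requests.get(
        f"{instance_url}/services/data/v52.0/query",
        params={"q": "SELECT Id, Name FROM Account"},
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()  # this call is where API_DISABLED_FOR_ORG comes back for some orgs
    return resp.json()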
I usually see this for editions that do not have API access, e.g. Group Edition and Professional Edition.
Yep, Salesforce support will do this for you. But if you want to use a Mac app such as OneSync to manage contacts on an ongoing basis, you are stuffed unless you pay for the Enterprise edition. Hmm, going from $60/year to $1,200? Don't think so.
I am trying to bring data from GCP BigQuery into Azure Data Lake using Azure Data Factory. I was able to set it up and bring the data into Azure, but my problem is that the GCP refresh token keeps expiring.
How do I avoid the GCP refresh token expiring?
How do I generate a new refresh token from ADF every time we load the data?
Any help is much appreciated.
I was dealing with the same issue. I followed the official documentation, which recommends the tutorial indicated by @Veikko. In that tutorial, and in my short research, everyone states that the refresh_token won't expire. In my test, the refresh_token gave me access for the rest of the day; once the day was over, access was revoked.
The solution to my issue was to use ServiceAuthentication. The catch is that it requires a Self-hosted Integration Runtime. You just need to configure a service account in BigQuery; this gives you an email address and a .p12 key file that you copy onto the machine where you installed the integration runtime, and you're done.
My suspicion is that the multi-factor authentication and VPN I use to get access may be the root cause, but that is still an open question for me. Any comment is welcome.
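For anyone trying the same switch, the linked service definition ends up looking roughly like this. The property names follow Microsoft's GoogleBigQuery connector documentation; the project, service account email, key file path and integration runtime name below are placeholders, and the JSON is wrapped in a small Python snippet only so it can be printed and pasted into ADF:

import json

# Rough shape of the ADF linked service after switching to ServiceAuthentication.
# Placeholder project, service account email, key path and IR name.
bigquery_linked_service = {
    "name": "GoogleBigQueryLinkedService",
    "properties": {
        "type": "GoogleBigQuery",
        "typeProperties": {
            "project": "my-gcp-project",
            "authenticationType": "ServiceAuthentication",
            "email": "my-sa@my-gcp-project.iam.gserviceaccount.com",
            "keyFilePath": "C:\\adf\\bigquery-key.p12",  # path on the self-hosted IR machine
            "useSystemTrustStore": True,
        },
        "connectVia": {
            "referenceName": "MySelfHostedIR",
            "type": "IntegrationRuntimeReference",
        },
    },
}

print(json.dumps(bigquery_linked_service, indent=2))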
The refresh token should not be expiring. Make sure you are using a refresh token, not an access token. Azure Data Factory handles obtaining and refreshing the access token internally, so you do not need to worry about it.
I think the Azure Data Factory BigQuery connector also accepts an access token in the refresh token field, and I would assume it then behaves exactly as in your situation and stops working once the access token has expired.
Good documentation on how to authenticate with the BigQuery connector can be found here: https://jpda.dev/getting-your-bigquery-refresh-token-for-azure-datafactory-f884ff815a59. In the phase "Getting our refresh token", make sure you use the value from the key refresh_token, not access_token.
More info on the authentication types can be found in the Microsoft documentation here: https://learn.microsoft.com/en-us/azure/data-factory/connector-google-bigquery#using-user-authentication
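As a quick sanity check, the token exchange response from Google contains both keys, and only one of them belongs in the ADF field. A minimal Python sketch of the exchange; the client ID, client secret, code and redirect URI are placeholders:

import requests

# Exchange the authorization code (obtained with access_type=offline) for tokens.
resp = requests.post(
    "https://oauth2.googleapis.com/token",
    data={
        "grant_type": "authorization_code",
        "code": "AUTHORIZATION_CODE",
        "client_id": "MY_CLIENT_ID.apps.googleusercontent.com",
        "client_secret": "MY_CLIENT_SECRET",
        "redirect_uri": "https://localhost/callback",
    },
)
tokens = resp.json()

# ADF wants the long-lived refresh token, not the short-lived access token.
print("use this in ADF ->", tokens.get("refresh_token"))
print("not this        ->", tokens.get("access_token"))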
This topic is very confusing given the wide variety of advice, especially generalized advice that doesn't apply to BigQuery/GA and ADF! @Veikko is incorrect in saying it shouldn't expire; Google says differently regarding their tokens.
Refresh token expiration is controlled by the resource, and according to the BigQuery documentation...
"Access tokens have limited lifetimes. If your application needs access to a Google API beyond the lifetime of a single access token, it can obtain a refresh token. A refresh token allows your application to obtain new access tokens."
I read that this used to support control of the timeout, but after Dec 30 2021 it is mandatory at 30 minutes.
Regardless, you have to get a refresh token every time you query BigQuery if you want to use OAuth 2.0. I've read a couple of methods of getting the refresh token, but they are pretty complicated (at least to me). My workaround is the same as @lalfab's: create a self-hosted IR and store the .p12 version of the credentials on the VM referenced by the IR. It works, but of course it means maintaining a VM and starting/stopping it when needed. I'd love to replace this with an OAuth 2.0 connection to the BigQuery client ID, and would love to find a simple way of doing that. But ADF's BigQuery linked service also doesn't support dynamic expressions to replace the refresh token at connect time, so it seems we need to wait until Microsoft makes the BigQuery connector more intelligent.
Well, all of these answers are actually correct.
After a lot of research, I think I found the issue.
The issue is that refresh tokens DO expire as long as the project has not been published yet. The button to publish it is under the OAuth consent screen.
Please read: https://developers.google.com/identity/protocols/oauth2#expiration
I haven't tested it yet, but I have a strong feeling that this is what is causing the issue for all of us. Hope it helps!
I am very new to MS Graph and Office 365 and have made good progress. I am an O365 Global Admin for my organisation (a school) and have app development experience. There is a lot of scope for using MS Access databases in our context for "globally" managing O365 content, e.g. contacts, distribution lists and Planner tasks. We want to manage these from an on-premises MS Access database or two, ideally with an admin person authenticating the MS Graph activity.
So, to test, I created a new db and have managed to get it to consume the following endpoint using VBA, but with no user authentication for now.
https://graph.microsoft.com/v1.0/groups
However, when I try
https://graph.microsoft.com/v1.0/planner/plans/with my plan id here
I get 401 - Unauthorized: Access is denied due to invalid credentials.
So clearly my application registration is wrong, or my authentication, or both! I have spent hours searching for examples and help, and because of the evolving nature of the ecosystem I am finding it pretty hard to work out what I should do now (as opposed to a year or two ago).
The authorisation request that generates the access_token which lets me access the groups is:
POST
https://login.microsoftonline.com/{my tenant id here}/oauth2/token
grant_type=client_credentials
client_id={my client id}
client_secret={my url encoded secret}
resource=https://graph.microsoft.com
but using that same access_token for the planner tasks throws the 401 error.
My app permissions look like this:
I presume this is because of the difference between the Application and Delegated permission types, but I have not fully grasped it all yet. And I suspect I am using the wrong authentication flow anyway. :-(
So, my questions are:
1. Do my permissions look right?
2. Is my authentication flow correct? Should I be using these endpoints instead, i.e. have I been working from old information?
https://login.microsoftonline.com/{my tenant id here}/oauth2/v2.0/authorize
https://login.microsoftonline.com/{my tenant id here}/oauth2/v2.0/token
As you can tell, I have become somewhat confused. If anyone can point me in the right overall direction given what I am attempting, that would be so helpful.
Thanks so much,
Murray
1. Do my permissions look right?
Yes, your Azure portal permissions look alright. You need delegated permissions for this, and you also need to grant admin consent, which you have done, as shown in your screenshot.
2. Is my authentication flow correct?
As you are using the Client Credentials grant flow, your request format looks alright. But I doubt this flow is suitable for the API you are trying to call, because this API requires delegated permissions.
3. Should I be using these instead?
Since this API needs delegated permissions, you could use the Authorization Code grant flow instead.
Follow the steps below to get your token using the Authorization Code grant flow.
Get Authorization Code:
https://login.microsoftonline.com/YourTenant.onmicrosoft.com/oauth2/v2.0/authorize?client_id={ClientId}&response_type=code&redirect_uri={redirectURI}&response_mode=query&scope=https://graph.microsoft.com/.default
Request a token from oauth2/v2.0/token with your code (a Python sketch of this request follows the field list below):
Request URL: https://login.microsoftonline.com/common/oauth2/v2.0/token or https://login.microsoftonline.com/YourTenant.onmicrosoft.com/oauth2/v2.0/token
Method: POST
Request Body Format
client_id:Your_Client_Id
scope:https://graph.microsoft.com/.default
redirect_uri:Your_Portal_Redirect_URI
grant_type:authorization_code
client_secret:Your_Client_Secret
code: Paste Code Here
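The same token request as a short Python sketch; the tenant, client ID, client secret, redirect URI, code and plan ID are placeholders:

import requests

TENANT = "YourTenant.onmicrosoft.com"  # or "common"

# Exchange the authorization code for a delegated access token.
resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "client_id": "Your_Client_Id",
        "client_secret": "Your_Client_Secret",
        "scope": "https://graph.microsoft.com/.default",
        "redirect_uri": "Your_Portal_Redirect_URI",
        "grant_type": "authorization_code",
        "code": "Paste_Code_Here",
    },
)
access_token = resp.json()["access_token"]

# Call the Planner API with the delegated token (placeholder plan id).
plans = requests.get(
    "https://graph.microsoft.com/v1.0/planner/plans/YOUR_PLAN_ID",
    headers={"Authorization": f"Bearer {access_token}"},
)
print(plans.status_code, plans.json())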
Decode the token:
You can decode your token at https://jwt.io/ and check that it carries the permissions you configured in the Azure portal.
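If you prefer to check locally, you can decode just the payload without verifying the signature (inspection only). In Microsoft identity platform tokens, delegated permissions show up in the scp claim and application permissions in roles:

import base64
import json

def jwt_payload(token: str) -> dict:
    # Decode the JWT payload only; no signature verification, inspection only.
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

claims = jwt_payload("PASTE_YOUR_TOKEN_HERE")
print("scp:  ", claims.get("scp"))    # delegated permissions
print("roles:", claims.get("roles"))  # application permissions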
4. Have I been working from old information?
No, the information you have been working from is fine, as far as I can tell.
Note: For a detailed implementation of the Authorization Code grant flow you could take a look at the official docs.
I/we at www.dr.dk are working on a Sonos integration with the bare minimum of functionality. This means that we wish to use anonymous access in this first version of our Sonos integration.
In the API documentation
https://musicpartners.sonos.com/node/289#toc0
it says 'Finally, you can decide not to use any authentication, also known as anonymous access. ...'
We read this as an option not to implement authentication endpoints like 'GetAppLink(...)' etc.
So now we have tested our service and it appears to work fine, as far as we know. We have therefore started to fill out the application registration form.
In the registration form we find the following required fields regarding authentication as depicted in the image below
Screenshot from the application registration form
As we see it, these fields are all related to authentication and seem somewhat confusing to us. By our logic, anonymous authentication means that no test accounts or customer care accounts are needed.
So the question is: what are we missing?
You can just mark those as N/A for each of the fields.
I was reading through the Windows Live developer docs here. In them I saw an authentication method something like this:
GET https://oauth.live.com/authorize?client_id=CLIENT_ID&scope=SCOPES&
response_type=RESPONSE_TYPE&redirect_uri=REDIRECT_URL
I understood everything except one thing: where do I give the username and password of the user?
I am planning to create an app (my first ever) to learn how this works.
I have also never used or coded anything over REST.
When using OAuth, your application never receives the user's username or password. Rather, the user logs in to Windows Live on the Windows Live servers and authorizes your application for access to their information. After they have authorized your application, you receive an access token from Windows Live on behalf of the user. You then use that access token with the Live API to retrieve user information.
Coding something using REST protocols isn't anything too terribly complicated. It has been my experience that you're just specifying parameters to the API using GET or POST as your request method. Adding OAuth on to your requests is a matter of specifying additional parameters.
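For example, once the user has authorized your app and you have an access token, retrieving their profile is a single REST call. A minimal Python sketch; the v5.0 Live Connect endpoint is the one documented for this API at the time, and the token value is a placeholder:

import requests

# Token handed back by Windows Live after the user signed in and authorized the app.
# Your app never sees the username or password, only this token.
access_token = "USER_ACCESS_TOKEN"

resp = requests.get(
    "https://apis.live.net/v5.0/me",  # Live Connect REST endpoint
    params={"access_token": access_token},
)
print(resp.json())  # the user's id, name and other profile fields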
Your task is to learn two things here, since you've never done REST or OAuth before. Spend time looking at both.
OAuth is hard to understand and hard to implement.
You should choose an off-the-shelf OAuth library; they exist for most languages.
(Then you do not have to worry about the details. On the other hand, you should still know how it works so you can set it up and fix it if something goes wrong.)
http://oauth.net/code/
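For instance, in Python the requests-oauthlib library wraps the whole dance. A minimal sketch with placeholder endpoints and credentials (generic, not the Windows Live specifics):

from requests_oauthlib import OAuth2Session

# Placeholder values; substitute the provider's real endpoints and your app's registration.
CLIENT_ID = "MY_CLIENT_ID"
CLIENT_SECRET = "MY_CLIENT_SECRET"
AUTHORIZE_URL = "https://provider.example.com/oauth2/authorize"
TOKEN_URL = "https://provider.example.com/oauth2/token"
REDIRECT_URI = "https://myapp.example.com/callback"

oauth = OAuth2Session(CLIENT_ID, redirect_uri=REDIRECT_URI, scope=["basic_profile"])

# 1. Send the user here; they log in and authorize the app (their password never reaches you).
authorization_url, state = oauth.authorization_url(AUTHORIZE_URL)
print("Open this URL in a browser:", authorization_url)

# 2. After the redirect back, exchange the returned code for an access token.
redirect_response = input("Paste the full redirect URL here: ")
token = oauth.fetch_token(TOKEN_URL, client_secret=CLIENT_SECRET,
                          authorization_response=redirect_response)

# 3. Authenticated REST calls now carry the token automatically.
resp = oauth.get("https://provider.example.com/api/me")
print(resp.json())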
The roadmap for OpenSSO said entitlements would be out in summer '09. Does anyone know whether it will be able to solve data-level security, for example "userA can only enter <500 in this field on the screen" or "userA can see only these values in the dropdown"?
How is this implemented in organisations: does each app control the data-level security itself, or do some have an enterprise repository for it?
Thanks
Yes, this is exactly the kind of problem that OpenSSO Entitlements are designed to solve. I did a demo at JavaOne last week showing a mobile phone account management system with three policies:
Each phone user can read their own permissions (e.g. can download music/video) and call log.
The account holder can read and write the permissions of all phones on the account, and can see the call logs for all phones on the account.
The account holder can read and write account-wide data (e.g. billing address).
I'll be posting the source code to the demo and explaining how to deploy it on my blog.
To answer the second part of your question, there is an enterprise repository for policy, but it is enforced on an app-by-app basis. In the demo, most enforcement is done by a servlet filter which makes entitlement calls for each requested URL. This worked well, since we used RESTful web services that express the requested resource in the URL. In one spot we made an explicit policy call since the URL pattern did not correspond to the policy - the client could navigate to the account resource via the phone URL. I expect I could have constructed another policy to handle this, but I actually wanted to show an explicit entitlement call.