Tableau extract refresh through API call from Python: getting unauthorized access (401002 response)

We are working on setting up Tableau extract refresh through API invocation, using personal access tokens from Tableau for authentication. While we are able to establish the communication and retrieve details from the Tableau site, we get a 401002 response when we try the extract refresh. Is there an additional privilege the access token needs in order to trigger the extract refresh?
Any pointers on this would be of great help!

Make sure that the user whose PAT you're using is the owner of the workbook (and hence of the extract refresh schedule). If not, the refresh request will fail. Alternatively, if the user cannot be the owner, they must be a server administrator or site administrator on your Tableau Server.
Also make sure you already have a schedule for the extract refresh. If one doesn't exist, you can create it with the Create Schedule method (via the API this can only be done by server administrators; in the browser the workbook owner can do it too).
From the Tableau API docs, also note that "A REST request to start a refresh task will fail if the task has been put in the task queue in any of these ways, or is already in progress". This might also be one reason why it fails.
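If the ownership and schedule are in place, the two calls involved can be sketched in Python as below. This is a minimal sketch assuming the requests library; the server URL, API version, PAT name/secret and task ID are placeholders to substitute with your own values (the task ID comes from the Get Extract Refresh Tasks method).

import requests

SERVER = "https://tableau.example.com"  # assumption: your Tableau Server URL
API_VER = "3.15"                        # assumption: a version your server supports
TASK_ID = "my-task-id"                  # placeholder: from Get Extract Refresh Tasks

# Sign in with a Personal Access Token. The PAT's owner must own the workbook
# (or be a site/server administrator), otherwise the runNow call returns 401002.
signin = requests.post(
    f"{SERVER}/api/{API_VER}/auth/signin",
    json={"credentials": {
        "personalAccessTokenName": "my-token-name",      # placeholder
        "personalAccessTokenSecret": "my-token-secret",  # placeholder
        "site": {"contentUrl": ""},                      # "" targets the Default site
    }},
    headers={"Accept": "application/json"},
)
signin.raise_for_status()
creds = signin.json()["credentials"]

# Run Extract Refresh Task ("runNow"); the empty tsRequest body is per the docs.
run = requests.post(
    f"{SERVER}/api/{API_VER}/sites/{creds['site']['id']}"
    f"/tasks/extractRefreshes/{TASK_ID}/runNow",
    data="<tsRequest />",
    headers={"X-Tableau-Auth": creds["token"], "Content-Type": "application/xml"},
)
print(run.status_code, run.text)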

Related

Is it possible to access the Xero API without user interaction

I am trying to come up with something which will be scheduled to run daily and would import newly created invoices from a database into Xero. To have this run daily, I want to avoid logging in manually, i.e. entering a username and password to log into Xero. Is this possible?
If you are reading and writing data to a Xero org on a customer's behalf, they will need to authenticate that connection a single time. From there you can use OAuth 2.0 access_tokens and refresh_tokens to programmatically run scripts that connect to their org via the Xero API. We are looking at ways to make this easier while maintaining security standards for use cases like this, but for now you will need to prompt a user login and save the credentials in your database/store.
A daily update can be performed without user interaction, but does need the user to authorise your application the first time.
After that, your application can use the 'refresh token' to automatically generate a new access token each day.
Two important things to remember:
you need to specify 'offline_access' in the scope to get refresh tokens in the response.
save the refresh token to a DB or file, and then use it each day to obtain a new set of tokens (without user interaction). When new tokens are obtained, use the access token to perform your updates, and save the refresh token for tomorrow.
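A minimal Python sketch of that daily refresh step, assuming the requests library; the client ID/secret are placeholders, and the file-based storage is just for illustration (a database is more robust):

import base64
import requests

CLIENT_ID = "your-xero-client-id"          # placeholder
CLIENT_SECRET = "your-xero-client-secret"  # placeholder
TOKEN_URL = "https://identity.xero.com/connect/token"

def refresh_xero_tokens(stored_refresh_token):
    # Exchange yesterday's refresh token for a new token set. This only
    # works if 'offline_access' was in the scope of the original consent.
    basic = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    resp = requests.post(
        TOKEN_URL,
        headers={"Authorization": f"Basic {basic}"},
        data={"grant_type": "refresh_token",
              "refresh_token": stored_refresh_token},
    )
    resp.raise_for_status()
    tokens = resp.json()
    # Xero rotates refresh tokens, so persist the new one for tomorrow's run.
    with open("xero_refresh_token.txt", "w") as f:
        f.write(tokens["refresh_token"])
    return tokens["access_token"]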

MS Graph API from MS Access VBA - Planner plans credentials issue

I am very new to MS Graph and Office 365 and have made good progress. I am an O365 Global Admin for my organisation (a school) and have app development experience. There is a lot of scope for using MS Access databases in our context for "globally" managing the O365 content, e.g. contacts, distribution lists and Planner tasks. We want to manage these from an on-premises MS Access database or two, ideally with an admin person authenticating the MS Graph activity.
So, to test, I created a new db and have managed to get it to consume the following endpoint using VBA but with no user authentication for now.
https://graph.microsoft.com/v1.0/groups
However, when I try
https://graph.microsoft.com/v1.0/planner/plans/{my plan id here}
I get 401 - Unauthorized: Access is denied due to invalid credentials.
So, clearly my Application registration is wrong or my authentication or both! I have spent hours searching for examples and help and because of the evolving nature of the ecosystem I am finding it pretty hard to work out what I should do now (as opposed to a year or two ago).
The authorisation that generates the access_token that works to allow me access to the groups is:
POST
https://login.microsoftonline.com/{my tenant id here}/oauth2/token
grant_type=client_credentials
client_id={my client id}
client_secret={my url encoded secret}
resource=https://graph.microsoft.com
but using that same access_token for the planner tasks throws the 401 error.
My app permissions look like this:
I presume this is because of the difference between the Application and Delegated types but have not fully grasped it all yet. And, I suspect I am using the wrong authentication flow anyway. :-(
So, my questions are:
1. Do my permissions look right?
2. Is my authentication flow correct? Should I be using these instead? ie have I been working from old information?
https://login.microsoftonline.com/{my tenant id here}/oauth2/v2.0/authorize
https://login.microsoftonline.com/{my tenant id here}/oauth2/v2.0/token
As you can tell I have become somewhat confused. If anyone can point me in the right overall direction given what I am attempting that would be so helpful.
Thanks so much,
Murray
1. Do my permissions look right?
Yes, your Azure portal permissions look alright. You need a delegated permission for that API, and you also need to grant admin consent, which you have done, as shown in the screenshot.
2. Is my authentication flow correct?
As you are using the Client Credentials grant flow, your request format looks alright. But that flow is not suitable for the API you are trying to call, because this API requires a delegated permission (client credentials only carry application permissions).
3. Should I be using these instead?
Since this API needs a delegated permission, you could use the Authorization Code grant flow instead.
Follow the steps below to get your token using the Authorization Code grant flow.
Get Authorization Code:
https://login.microsoftonline.com/YourTenant.onmicrosoft.com/oauth2/v2.0/authorize?client_id={ClientId}&response_type=code&redirect_uri={redirectURI}&response_mode=query&scope=https://graph.microsoft.com/.default
Request a token from oauth2/v2.0/token with your code:
Request URL: https://login.microsoftonline.com/common/oauth2/v2.0/token or https://login.microsoftonline.com/YourTenant.onmicrosoft.com/oauth2/v2.0/token
Method: POST
Request body format:
client_id: Your_Client_Id
scope: https://graph.microsoft.com/.default
redirect_uri: Your_Portal_Redirect_URI
grant_type: authorization_code
client_secret: Your_Client_Secret
code: paste code here
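For illustration, here is the same token request as a minimal Python sketch (the call translates directly to VBA's XMLHTTP; the client ID, secret, redirect URI, code and plan ID are placeholders):

import requests

TENANT = "YourTenant.onmicrosoft.com"  # placeholder: or your tenant GUID

resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "client_id": "Your_Client_Id",               # placeholder
        "scope": "https://graph.microsoft.com/.default",
        "redirect_uri": "Your_Portal_Redirect_URI",  # placeholder
        "grant_type": "authorization_code",
        "client_secret": "Your_Client_Secret",       # placeholder
        "code": "paste_the_code_from_the_redirect_here",
    },
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# The delegated token can now call the Planner endpoint that returned 401.
plan = requests.get(
    "https://graph.microsoft.com/v1.0/planner/plans/" + "my-plan-id",  # placeholder
    headers={"Authorization": f"Bearer {access_token}"},
)
print(plan.status_code)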
Decode Token:
You can decode your token at https://jwt.io/ and make sure it carries the permissions you granted in your Azure portal.
4. Have I been working from old information?
No, the information you have been working from is fine, as far as I can tell.
Note: for a detailed implementation of the Authorization Code grant flow, take a look at the official docs.

Web API / SSIS - A starting point

In the interest of transparency, this is work related, but I am most definitely not looking for 'the solution', simply a starting point.
The issue;
I've been asked to bring all Yammer data into a database. While I'm quite familiar with database creation, administration, and moving data to and from flat sources/databases using SSIS, I have virtually zero understanding of web APIs.
I have found that Yammer provides an API that allows for scheduled downloads of its information.
The Question;
Can Yammer be used as an SSIS data source to transform/import into database tables? And if so - how!? I keep getting unauthorised responses using my own admin credentials.
Thanks,
Yammer has a Data Export API which returns most of the data as a ZIP file containing multiple CSV files. The list of models and attributes is about half-way down the page I linked to.
This seems more aligned with an SSIS solution, but some data is only available via individual REST calls. Analyse what the data export provides to decide whether you need to make additional REST calls for extra metadata.
I'm not very familiar with SSIS, but the generic process you'd need to follow is:
Create a Verified Admin user in Yammer associated with a service account (O365 user with Yammer licence upgraded to Verified Admin in Network Admin.) For testing, you can use any verified admin account, but a service account is a best practice.
Log on with the Verified Admin account and register an application.
Acquire a token when logged on with a Verified Admin account. You can follow an OAuth flow, or get this from the application information page after registration. This token has the required privileges to export content.
Make requests to the export API specifying the correct parameters. Try a small time window without attachments to get started. Test this outside of SSIS, e.g. with PowerShell, before attempting it with SSIS (a sketch follows this list).
Expand the ZIP file to a directory on disk. Again, doing this outside SSIS first is going to be simpler initially.
Use SSIS to import the CSV files to your database.
The CSV files have API endpoints for getting additional metadata on messages, users, groups etc. You'll need to work out how best to call these from SSIS if you really need the metadata, but it's more a question of "how do I make many REST calls with SSIS?"
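As a sketch of steps 4 and 5 outside SSIS, in Python rather than PowerShell; only the 'since' parameter is shown, and you should check any further parameters (e.g. for excluding attachments) against the Data Export API page:

import io
import zipfile
import requests

TOKEN = "your-verified-admin-token"  # placeholder: from the registered app's info page
EXPORT_URL = "https://www.yammer.com/api/v1/export"

# Request a small time window first; large exports can take a long time.
resp = requests.get(
    EXPORT_URL,
    params={"since": "2021-01-01T00:00:00+00:00"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=600,
)
resp.raise_for_status()

# Expand the ZIP of CSV files to a directory for SSIS to import.
with zipfile.ZipFile(io.BytesIO(resp.content)) as zf:
    zf.extractall("yammer_export")
    print(zf.namelist())  # e.g. Messages.csv, Users.csv, Groups.csv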

How is the access_type=online oauth mode (no refresh token) supposed to work?

This question has a lot in common with the previous question Google OAuth: can't get refresh token with authorization code (and I won't be offended if it's considered a duplicate), but there are some differences: that question uses the JavaScript and PHP libraries, and I'm using neither of those. That question asks how to get a refresh token; I want to know whether I should want a refresh token, and how the mode with no refresh tokens is intended to work.
I'm following this guide:
https://developers.google.com/identity/protocols/OAuth2WebServer
The goal is to allow users to upload files from Google Drive to my web application.
I'm not using one of Google's favoured programming languages, so I don't have a library abstracting away all the interaction with Google. I need to know what the HTTP requests should actually look like.
One of the parameters in the authorization request is access_type. The description says
Set the value to offline if your application needs to refresh access tokens when the user is not present at the browser.
I won't need to do that (I'll only want to retrieve a file on my server immediately after the user selects it) so in the spirit of not asking for more privileges than you really need, I used access_type=online. This gives me an access token and no refresh token. I've successfully used the access token to make some requests to Google Drive.
The user comes back the next day and tries to upload another file. While processing this request from the user, I make a request to Google Drive. The access token is expired, so I get a 401. What's supposed to happen next?
My guess is I should pretend this is a completely new user and send them through the full authorization process again. That would mean I have to abort whatever the user was trying to do, redirect them to https://accounts.google.com/o/oauth2/auth with all the parameters (scope, client_id, etc.) and embed enough information in the state parameter that I can resume the original request when the user gets back from their detour.
This seems rather difficult (in particular the part about saving and resuming the state of my application at some arbitrary point). It's a big enough obstacle that it should be explained somewhere. But the description of the access_type parameter didn't say anything about needing to insert authorization redirects everywhere. It just said the user must be "present".
You are using the right implementation. You don't need offline access if you aren't going to make requests when the user is not using the application. The thing is that access tokens expire in one hour, so you need to generate new access tokens when a user leaves the application and comes back later.
If users have authorized your application, calling this URL with your configuration should return a new valid access token:
https://accounts.google.com/o/oauth2/v2/auth?
scope=scopes&
include_granted_scopes=true&
state=state_parameter_passthrough_value&
redirect_uri=http://oauth2.example.com/callback&
response_type=token&
client_id=client_id
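As a minimal Python sketch of building that URL when a Drive call comes back with a 401 (the scope, client ID and redirect URI are placeholders from your own configuration):

from urllib.parse import urlencode

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def build_reauth_url(state):
    # 'state' carries whatever you need to resume the interrupted request
    # once the user comes back from the redirect.
    params = {
        "scope": "https://www.googleapis.com/auth/drive.readonly",  # placeholder scope
        "include_granted_scopes": "true",
        "state": state,
        "redirect_uri": "http://oauth2.example.com/callback",  # placeholder
        "response_type": "token",
        "client_id": "your-client-id",  # placeholder
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

Note that with response_type=token the new access token comes back in the URL fragment, so it has to be captured client-side and passed to your server. If the user is still signed in to Google and has already granted the scopes, the round trip completes without a consent prompt.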

Remote access to Laravel models

Is it possible for one website to use the models of another Laravel website to access the database, without the first website having the SQL credentials hardcoded, but with the credentials to log into the second Laravel website hardcoded?
This way the first website doesn't have to have the SQL credentials on its FTP server, but can still access the database through the other website (with a personal login for that website).
If that is impossible, I am wondering: is there a way to access a database without having to hardcode the credentials anywhere?
UPDATE (the actual problem)
Only a part of the database should be visible to a particular user, so I can provide different users with different credentials and they each see something different in the database.
What you are talking about is an API. You'd build out the entire infrastructure on the first website; the second website would then make calls to the first one to get back the information it needs, usually authenticating with some kind of credentials or access token.
This way, you can allow anyone in the world to communicate with your website, kind of like how Facebook, or Twitter does.
As far as accessing your database goes, you need to tell your app the credentials to use somewhere, so technically you do have to hardcode them somewhere; an app can't just magically come up with credentials to access a database.
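To illustrate the shape of the consuming side, here is a hypothetical sketch in Python (the login endpoint, token field and invoice route are all placeholders for whatever API you build on the first site):

import requests

API_BASE = "https://first-site.example.com/api"  # placeholder: the site that owns the DB

session = requests.Session()

# The second site never sees the SQL credentials; it logs in with a per-user
# account and only sees what the first site's API exposes to that user.
login = session.post(f"{API_BASE}/login",  # placeholder endpoint
                     json={"email": "user@example.com", "password": "secret"})
login.raise_for_status()
session.headers["Authorization"] = f"Bearer {login.json()['token']}"

# The first site scopes this query to the authenticated user server-side.
invoices = session.get(f"{API_BASE}/invoices").json()  # placeholder route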
If your different users are defined:
use Laravel model/DB events to replicate the relevant data to a per-user database, or
sync each database with a cron job.
Both approaches help you avoid transport-security problems.