I have a database with all my employees' information. Should I build an API, or should I give direct access to the database?
The API could be built with Node.js as a REST API. Database access could be granted via different database users with different permission sets.
I'm confused because when I query an API, I usually get JSON data back, which is awesome. So I wonder whether I can just build an API that exposes all the data I want, and get my employees' info back in JSON format whenever I query it.
There are reasons to use a REST API. For example, if you have an app that will be available to the public, you don't want to write your SQL queries directly in your code, nor embed the credentials used to connect to your database server. Anyone with the knowledge could decompile your app, read your code, and obtain the credentials to your database server. The better approach is to write a REST API that handles the flow of data between your app and the database. Since a REST API is designed to be accessed publicly, the app only passes in data or calls the endpoint it needs; it never holds the server credentials. If you are staying local, and the app will be used only by your own clients, then your direct database approach could work.
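As a rough illustration of that separation, here is a minimal sketch of such an endpoint in TypeScript with Express and node-postgres. The table, column, and environment variable names are made up for the example; the point is simply that the credentials stay on the server while the client only ever receives JSON:

```typescript
// Minimal sketch (TypeScript + Express + node-postgres).
// Table/column names and the DATABASE_URL variable are illustrative.
import express from "express";
import { Pool } from "pg";

const app = express();

// Credentials live on the server (e.g. in environment variables),
// never inside the mobile/desktop client.
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// The client only ever calls this endpoint and gets JSON back.
app.get("/employees", async (_req, res) => {
  const { rows } = await pool.query(
    "SELECT id, name, department FROM employees"
  );
  res.json(rows);
});

app.listen(3000, () => console.log("API listening on :3000"));
```

The client then just calls GET /employees over HTTPS and parses the JSON, without ever knowing how or where the data is stored.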
I have a database in Google BigQuery that contains confidential client data. My team's recently partnered with a 3rd-party data provider who will feed real-time data into our BigQuery instance via an API I provided (source: https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-python)
We want to make sure this 3rd-party provider can only access/view the data they're sending us, and not the data in our client database. Is there a way to set permissions to control what they can see in our account?
If I understand what you are trying to do, check out Authorized Views in BigQuery. I believe this is what it was designed for.
Creating an authorized view
Essentially, you set up a filtered query that you give the third party access to. Since it runs whatever filter that you set in the View Query, they only see what you want them to see.
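If you prefer to script this rather than click through the console, a rough sketch using the Node.js client library (@google-cloud/bigquery) could look like the following. The project, dataset, table, and service-account names are placeholders, both datasets are assumed to exist already, and the exact shape of the access entries is worth double-checking against the library docs:

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

async function shareFilteredView() {
  // Hypothetical dataset/table names, for illustration only.
  const sourceDataset = bigquery.dataset("client_data");      // confidential data
  const sharedDataset = bigquery.dataset("partner_shared");   // dataset the partner can see

  // 1. Create a view that only exposes the partner's own rows.
  await sharedDataset.createTable("partner_feed_view", {
    view: {
      query:
        "SELECT * FROM `my-project.client_data.events` WHERE provider = 'partner'",
      useLegacySql: false,
    },
  });

  // 2. Authorize the view against the source dataset, so queries through
  //    the view succeed without granting access to the raw table.
  const [metadata] = await sourceDataset.getMetadata();
  metadata.access.push({
    view: {
      projectId: "my-project",
      datasetId: "partner_shared",
      tableId: "partner_feed_view",
    },
  });
  await sourceDataset.setMetadata(metadata);

  // 3. Give the partner's service account READER on the shared dataset only.
  const [sharedMeta] = await sharedDataset.getMetadata();
  sharedMeta.access.push({
    role: "READER",
    userByEmail: "partner-sa@example.iam.gserviceaccount.com",
  });
  await sharedDataset.setMetadata(sharedMeta);
}

shareFilteredView().catch(console.error);
```

The key point is step 2: authorizing the view against the source dataset is what lets the partner query the view even though they have no access to the underlying table.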
In the interest of transparency, this is work-related. But I am most definitely not looking for 'the solution', simply a starting point.
The issue:
I've been asked to bring all Yammer data into a database. While I'm quite familiar with database creation, administration, and moving data to and from flat sources/databases using SSIS, I have virtually zero understanding of web APIs.
I have found that Yammer provides an API that allows for scheduled downloads of information.
The question:
Can Yammer be used as an SSIS data source to transform/import into database tables? And if so, how? I keep getting unauthorised responses when using my own admin credentials.
Thanks,
Yammer has a Data Export API which returns most of the data as a ZIP file containing multiple CSV files. The list of models and attributes is about half-way down the page I linked to.
This aligns well with an SSIS solution, but some data is only available via individual REST calls. Analyse what the data export provides to decide whether you need additional REST calls for extra metadata.
I'm not very familiar with SSIS, but the generic process you'd need to follow is:
Create a Verified Admin user in Yammer associated with a service account (O365 user with Yammer licence upgraded to Verified Admin in Network Admin.) For testing, you can use any verified admin account, but a service account is a best practice.
Log on with the Verified Admin account and register an application.
Acquire a token when logged on with a Verified Admin account. You can follow an OAuth flow, or get this from the application information page after registration. This token has the required privileges to export content.
Make requests to the export API specifying the correct parameters. Try a small time window without attachments to get started. Test this outside of SSIS (with PowerShell or a small script; see the sketch after this list) before attempting it in SSIS.
Expand the ZIP file to a directory on disk. Again, doing this outside SSIS first is going to be simpler initially.
Use SSIS to import the CSV files to your database.
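Step 4 suggests testing with PowerShell first; if you are more comfortable with Node.js, a roughly equivalent sketch in TypeScript is below. The endpoint and the since/include parameters reflect my reading of the Data Export API docs, so treat them as assumptions and verify them against the documentation linked above:

```typescript
// Quick sanity check outside SSIS (Node.js 18+, built-in fetch).
// The export URL and query parameters below are assumptions based on the
// Data Export API docs; confirm the exact parameter names there.
import { writeFile } from "node:fs/promises";

const token = process.env.YAMMER_VERIFIED_ADMIN_TOKEN; // token from the registered app

async function downloadExport() {
  // Small time window, CSV only (no attachments), to keep the test fast.
  const url =
    "https://www.yammer.com/api/v1/export?since=2023-01-01T00:00:00Z&include=csv";

  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Export failed: ${res.status} ${res.statusText}`);

  // The response body is a ZIP of CSV files; save it, expand it to disk,
  // then point SSIS flat-file sources at the extracted CSVs.
  const zip = Buffer.from(await res.arrayBuffer());
  await writeFile("yammer-export.zip", zip);
}

downloadExport().catch(console.error);
```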
The CSV files have API endpoints for getting additional metadata on messages, users, groups etc. You'll need to work out how best to call these from SSIS if you really need the metadata, but it's more a question of "how do I make many REST calls with SSIS?"
Sorry if this is a simple question; I'm not experienced with developing server-based apps.
I've been studying Azure recently and created a simple mobile application that connects to an Azure database. It performs some trivial operations on tables, like adding items and running SQL SELECT queries. Now I want to add authorisation to the app and restrict some table operations based on it. What is the best way to do this? I think it would be a good idea to write a backend on an Azure server with authorisation-based rules, but I can't find anything about this in the Azure documentation. For example, this is what I want to achieve:
An unauthorised mobile app user is restricted from any modifying operations and can select only predefined columns.
An authorised user can perform add/update operations on some tables based on their user info (uid/login, etc.).
If I create database rules on the frontend (mobile app) side, it isn't difficult to write another app that can do anything with the database, bypassing my app. Isn't that so?
If I create database rules on the frontend (mobile app) side, it isn't difficult to write another app that can do anything with the database, bypassing my app. Isn't that so?
This is very true; security shouldn't be (just) in the frontend. Make sure your backend is set up in such a way that it checks the access rules each time someone tries to do something in the backend.
Now, as far as your question goes: please implement an API that connects to your database. With each and every client directly connecting to your database, you will lose all control. If you implement an API in front, you can do stuff like caching and asynchronous processing if you need to.
When implementing the API, you can have the GET methods be unsecured, while POST, PUT and DELETE use a (for instance) JWT token retrieved from Azure Active Directory. This repo and the presentation it links to might give you some reference.
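To make that concrete, here is a minimal sketch in TypeScript with Express: GET stays open, while modifying verbs go through a JWT check. For brevity it verifies against a shared secret via the jsonwebtoken package; with Azure Active Directory you would instead validate the token against the tenant's signing keys (for example with the jwks-rsa package). The routes and column choices are placeholders:

```typescript
// Minimal sketch: unsecured reads, JWT-protected writes.
// Shared-secret verification is for illustration only; for Azure AD,
// validate against the tenant's JWKS keys instead.
import express from "express";
import jwt from "jsonwebtoken";

const app = express();
app.use(express.json());

function requireJwt(
  req: express.Request,
  res: express.Response,
  next: express.NextFunction
) {
  const token = (req.headers.authorization ?? "").replace("Bearer ", "");
  try {
    // Attach the verified claims so handlers can apply per-user rules.
    (req as any).user = jwt.verify(token, process.env.JWT_SECRET!);
    next();
  } catch {
    res.status(401).json({ error: "invalid or missing token" });
  }
}

// Unauthenticated: read-only, predefined columns only.
app.get("/items", (_req, res) => {
  // e.g. SELECT id, title FROM items
  res.json([{ id: 1, title: "example" }]);
});

// Authenticated: writes allowed, scoped to the caller's identity.
app.post("/items", requireJwt, (req, res) => {
  const userId = (req as any).user.sub;
  // e.g. INSERT INTO items (owner, ...) VALUES (userId, ...)
  res.status(201).json({ owner: userId, ...req.body });
});

app.listen(3000);
```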
I've been creating an extension for VSTS, and so far I have stored some data in documents in collections (https://learn.microsoft.com/en-us/vsts/extend/develop/data-storage).
The problem I have now is that I need to GET these documents from an external application. I have looked into https://github.com/Microsoft/vsts-auth-samples/tree/master/ClientLibraryConsoleAppSample to get the authorization done, but then I am unable to retrieve the documents. If I go through the REST API, I have issues authorizing myself without a personal access token (the application is supposed to work for every user, and I cannot obtain and use every user's personal access token; that is not feasible for 350+ people), and I have also been unable to get the REST API working at all. The documentation on all of this is severely lacking.
Anyone able to help?
The documentation is lacking because the data storage is isolated to the extension, and there is no easy way to access the data from outside of the extension. If you need external access, you also need to store your data externally, for example in Azure Storage or in a TFVC/Git repo under the VSTS account.
As for per-user storage, access to that is also isolated and would indeed require either an account owner token or a user-specific OAuth or PAT token.
I have found the solution. The documentation states that there are two ways of working with the documents/collections: the REST API and the VSS wrappers. The URL required to get documents in a certain collection is as follows:
https://{account}.extmgmt.visualstudio.com/_apis/ExtensionManagement/InstalledExtensions/{publisherName}/{extensionName}/Data/Scopes/Default/Current/Collections/{collectionName}/Documents/{documentName}.
Using this in a browser works just fine. All that needs to be done in order to use this with an external application is authorization.
If you use SDK methods from the docs, like VSS.getService(VSS.ServiceIds.ExtensionData), you can inspect the request it makes (easiest in the browser dev tools).
It looks like:
https://extmgmt.dev.azure.com/{organization}/_apis/ExtensionManagement/InstalledExtensions/{publisher id}/{extension id}/Data/Scopes/Default/Current/Collections/{collections (by default 'MyCollection')}/Documents
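For the external-application side, authenticating that same endpoint with a personal access token (for example one belonging to a service account) typically uses Basic auth with an empty username. Here is a rough sketch in TypeScript (Node 18+, built-in fetch); the organization, publisher, and extension IDs are placeholders:

```typescript
// Sketch of calling the extension data REST endpoint from an external app.
// The URL shape comes from the answers above; the PAT handling is the usual
// Azure DevOps pattern (empty username + PAT, Base64-encoded, Basic auth).
const org = "my-organization";        // placeholder
const publisherId = "my-publisher";   // placeholder
const extensionId = "my-extension";   // placeholder
const collection = "MyCollection";
const pat = process.env.AZDO_PAT!;    // personal access token of a service account

async function getDocuments() {
  const url =
    `https://extmgmt.dev.azure.com/${org}/_apis/ExtensionManagement/InstalledExtensions/` +
    `${publisherId}/${extensionId}/Data/Scopes/Default/Current/Collections/${collection}/Documents`;

  const res = await fetch(url, {
    headers: {
      Authorization: "Basic " + Buffer.from(":" + pat).toString("base64"),
      Accept: "application/json",
    },
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

getDocuments().then(console.log).catch(console.error);
```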
Is it possible for a website to use the models of another Laravel website to access the database, without the first website having the SQL credentials hardcoded, but with the credentials to log into the second Laravel website hardcoded?
This way the first website doesn't have to have the SQL credentials on its FTP server, but can still access the databases through the other website (with its own login for that website).
If that is impossible, I am wondering: is there a way to access a database without having to hardcode the credentials anywhere?
UPDATE (the actual problem)
Only a part of the database should be visible to a particular user, so I can provide different users with different credentials and they each see something different in the database.
What you are talking about is an API. So you'd build out the entire infrastructure on the first website, then on the second website, it would make some kind of calls to the first website to get back the information it needs, usually using some kind of credentials or access token.
This way, you can allow anyone in the world to communicate with your website, kind of like how Facebook, or Twitter does.
As far as accessing your database goes, you need to tell your app somewhere which credentials to use, so technically you do need to store them somewhere; an app can't just magically make up credentials to access a database.
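As a sketch of what the consuming side could look like, the first website would hold only an API token for the second website and call its endpoints over HTTPS. The endpoint path and token name below are hypothetical, and the actual API would be implemented in the second Laravel site:

```typescript
// Sketch of the first website consuming the second website's API.
// Endpoint path and token name are hypothetical; the point is that only an
// API token is stored here, never the SQL credentials.
const API_BASE = "https://second-site.example.com/api";
const API_TOKEN = process.env.SECOND_SITE_API_TOKEN!; // per-user or per-site token

async function fetchVisibleRecords() {
  // The second site checks the token and only returns the rows/columns
  // this particular user is allowed to see.
  const res = await fetch(`${API_BASE}/records`, {
    headers: { Authorization: `Bearer ${API_TOKEN}` },
  });
  if (!res.ok) throw new Error(`API call failed: ${res.status}`);
  return res.json();
}

fetchVisibleRecords().then(console.log).catch(console.error);
```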
If your different users are defined:
Use Laravel model/DB events to replicate the data to a per-user database, or sync each database with a cron job.
This has the benefit of avoiding security problems when transporting the data.