setAuthConfig() vs setAuthConfigFile() in the Google OAuth2.0 API - google-oauth

While reading the online Google OAuth2.0 documentation, I have seen:
This:
$client->setAuthConfigFile('secret_keys.json');
And this:
$client->setAuthConfig('secret_keys.json');
They are being used in the same situations, and I would like to know whether there is any difference between them, or whether I should always stick to one of them.

Use setAuthConfig.
setAuthConfigFile is now just an alias for backwards compatibility and is deprecated.
setAuthConfigFile just calls setAuthConfig.
See https://github.com/google/google-api-php-client/blob/master/src/Google/Client.php and search for public function setAuthConfigFile($file).
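For example, a minimal usage sketch (the key file name and the Drive scope below are placeholders, not values from the question):

<?php
// Minimal sketch: configure the client from a downloaded client secret file.
// 'secret_keys.json' and the scope are placeholders for your own values.
require_once 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('secret_keys.json');         // preferred
// $client->setAuthConfigFile('secret_keys.json');  // deprecated alias, same effect
$client->addScope(Google_Service_Drive::DRIVE_METADATA_READONLY);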

Related

How to implement API versioning?

I have a web API application that will serve many clients at different stages of release, and now I need to implement versioning, because the API code will be constantly updated and API users will not be able to switch to the new API instantly. In other words, it is the standard situation in which you need to introduce versioning. I'm looking for a way to organize it inside my API. It's clear that it should not be separate folders with a copy of the application on the server, conventionally called app_v1, app_v2, app_v2.1, etc., because that is duplication, redundancy and bad practice.
It looks like there will be one application, and in the controllers there will be a division of the logic at the code level, like if (client_version == 1) do function1(), else if (client_version == 2) do function2(), etc. Git supports tags, which is something similar to versioning, but since all supported versions of the application need to be on the server at the same time, that is not the answer here. How can I design an architecture for this case?
There are many well-known ways to use API versioning so that code keeps working for older clients (backward compatibility). The general purpose of API versioning is to make sure that different clients can use different versions of an API at the same time. I've seen several ways to do API versioning, such as:
URL Path Versioning: In this method, the version number is part of the API endpoint's URL path. For instance:
https://api.example.com/v1/assets
https://api.example.com/v2/assets
URL Query String Parameter: In this method, the version number is added to the API endpoint's URL as a query string parameter. For instance:
https://api.example.com/assets?version=1
https://api.example.com/assets?version=2
HTTP Header: In this method, the version number is put in an HTTP header, like the Accept-Version header. For instance:
Accept-Version: 1
Accept-Version: 2
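Language aside, a minimal sketch of the header approach in plain PHP could look like this (the handler functions are made up for illustration and stand in for your real controllers):

<?php
// Minimal sketch of header-based versioning: dispatch to a different handler
// depending on the Accept-Version header. Handler names are placeholders.
function getAssetsV1() { return ['assets' => [], 'version' => 1]; }
function getAssetsV2() { return ['assets' => [], 'meta' => [], 'version' => 2]; }

$version = isset($_SERVER['HTTP_ACCEPT_VERSION'])
    ? (int) $_SERVER['HTTP_ACCEPT_VERSION']
    : 1; // default to the oldest supported version

header('Content-Type: application/json');
echo json_encode($version >= 2 ? getAssetsV2() : getAssetsV1());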
If you are using .NET for your project, I would recommend checking out the standard API versioning library. You can also find solid material on Web API versioning in the articles by Steve Smith.

Is there a (documented) way to get attributes from the data point library via API?

In the Cumulocity Cockpit you can specify data point attributes like a display name, unit, value range, etc. in a so-called "data point library". I wonder how I can use these attributes when developing custom dashboard widgets.
I figured out that there is a fragmentType c8y_Kpi for API requests and a class called c8yKpi in the JS client lib which provides all the necessary functions. It works fine in my custom widgets, but the API/JS class are not documented.
Is there any (official, documented, supported) way to request attributes from the "data point library" via API or JS client library?
These kinds of "internal" structures are not documented officially, but as you already did, you can of course use them in your code.
The risk of using undocumented structures is that they might change, and then you would need to adapt your code.
As you already found out yourself, the way to get them via the API is to query the inventory with fragmentType=c8y_Kpi:
/inventory/managedObjects?fragmentType=c8y_Kpi
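For example, a minimal sketch of that request with curl in PHP (the tenant URL and credentials are placeholders):

<?php
// Minimal sketch: fetch all managed objects carrying the c8y_Kpi fragment.
// The tenant URL and credentials are placeholders.
$ch = curl_init('https://YOUR_TENANT.cumulocity.com/inventory/managedObjects?fragmentType=c8y_Kpi');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, 'tenant/username:password');
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Accept: application/json']);
$kpis = json_decode(curl_exec($ch), true);
curl_close($ch);
// Each returned managed object exposes the data point library attributes
// (display name, unit, value range, ...) inside its c8y_Kpi fragment.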
No: currently there is no official, documented and supported API for accessing the data point library.

How to start ArangoDB-GraphQL-Express?

I looked at the ArangoDB support pages and searched Google, but that did not help me much. I am new to this topic (but a Polish proverb says that you should not be ashamed to ask questions).
My situation is as follows: I have quite an extensive database, which I created through the ArangoDB web GUI over HTTP (by importing hand-crafted JSONs as collections of vertices and edges). I would like to connect to this database and, depending on the query, dynamically display the result, but I do not know how to wire it up. There is a tutorial on the ArangoDB page for Node, but it does not say where and what to create; it only describes one command after another that does something.
I am looking for examples, or a step-by-step guide/tutorial.
I am asking you for help and support, and for pointers on how to find my way around this.
Well, there are two options I would use to connect Arango to GraphQL:
1. Use the Foxx microservices that live within ArangoDB to create a REST API. Then you can wrap the REST API in GraphQL. Here is the tutorial for creating Foxx microservices:
https://docs.arangodb.com/3.3/Manual/Foxx/GettingStarted.html
And here is the tutorial to wrap the REST API in GraphQL:
https://www.prisma.io/blog/how-to-wrap-a-rest-api-with-graphql-8bf3fb17547d/
2. Have the GraphQL server be part of the Foxx microservices instead of the REST API, as described here:
https://docs.arangodb.com/3.3/Manual/Foxx/GraphQL.html
And here:
https://mikewilliamson.wordpress.com/2017/03/24/arangodb-and-graphql/
Hope this helps!

Backend database used in the API

By going through this API documentation page, is it possible to tell which database is being used in the backend?
Zomato API
MySQL would require a PHP file on the server to handle the requests, make queries, pack the data in JSON format, then send it back to the device. But in this case the parameters are passed to .json files. Please advise.
There is no way to "see through" to what the backend service actually uses to provide the information you query for. Are you sure you want to continue using this product? The site notes that Zomato will no longer be available to individuals, and that your API key will be disabled if you don't use it monthly.
I haven't read the specs for that particular API. But in general, is it possible to tell what database is being used on the back end by studying an API? No. That's the whole point of an API: It's supposed to shield the API-user from implementation details.
It's probably true that in many cases you could make reasonable guesses about what tools are being used on the back end. Like if you see that the API gives you a syntax for doing comparisons that looks exactly like the proprietary compare function used in Foobar SQL and not found in any other database product, that would be a strong clue. But even something like that wouldn't be proof. Maybe originally they were using Foobar SQL, then they switched to another database, but to maintain compatibility they wrote code to translate the Foobar SQL compare to standard SQL syntax.

YouTube API - Querying by publish date

I'm writing a webapp that uses the YouTube Data API to do specific types of searches. In this case, I'm trying to search for all videos that match a query and which were uploaded between two dates. This document says I can use the published-min and published-max parameters, while this one says I can use updated-min and updated-max.
Both of these parameter sets cause YouTube to return an error:
published-min returns "This service does not support the 'published-min' parameter"
updated-min returns "This service does not support the 'updated-max' parameter"
Neither returns a correct result set.
How can I limit my result set to hits within a specified date range?
The Reference Guide for YouTube's Data API doesn't list anything that would suggest the possibility to filter on time interval in general.
The published-min argument is only advertised in the "User activity feeds" section which is something different and probably not the thing you wanted. Or is it?
The updated-min argument in your link is referenced in a generic gdata context. It looks like they intended to describe all the things common to all the specialized APIs, but somehow updated-min isn't available everywhere.
When it comes to your particular problem, I would suggest sorting on time (orderby=published) and doing the filtering on the client side. I know this is not the optimal way, but it is the only one I can see with what Google gives us.
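For illustration, a minimal sketch of that client-side filtering in PHP (assuming $entries is the already-decoded feed, with a 'published' timestamp per entry; the field name is illustrative):

<?php
// Minimal sketch: keep only entries whose published date falls in the range.
// $entries is assumed to be the decoded feed; field names are illustrative.
$min = strtotime('2014-09-21T00:00:00Z');
$max = strtotime('2014-09-22T02:00:00Z');

$inRange = array_filter($entries, function ($entry) use ($min, $max) {
    $published = strtotime($entry['published']);
    return $published >= $min && $published <= $max;
});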
The YouTube API v3 supports the publishedAfter and publishedBefore parameters for search results. For example:
https://www.googleapis.com/youtube/v3/search?key={{YOUKEY}}&channelId={{CHANNELID}}&part=snippet,id&order=date&maxResults=50&publishedAfter=2014-09-21T00:00:00Z&publishedBefore=2014-09-22T02:00:00Z
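A minimal sketch of building the same request from PHP (the API key and channel ID are placeholders):

<?php
// Minimal sketch: build the v3 search request with a publish-date window.
// The API key and channel ID are placeholders.
$params = http_build_query([
    'key'             => 'YOUR_API_KEY',
    'channelId'       => 'YOUR_CHANNEL_ID',
    'part'            => 'snippet,id',
    'order'           => 'date',
    'maxResults'      => 50,
    'publishedAfter'  => '2014-09-21T00:00:00Z',
    'publishedBefore' => '2014-09-22T02:00:00Z',
]);
$response = json_decode(file_get_contents('https://www.googleapis.com/youtube/v3/search?' . $params), true);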