Google Analytics API - ecommerce returns empty EcommerceData object

For some reason the ecommerce object remains empty, even though ecommerce and enhanced ecommerce are enabled in the view settings.
The request:
body = {
    "viewId": VIEW_ID,
    "user": {
        "type": "CLIENT_ID",
        "userId": "414884771.1598953392"
    },
    "activityTypes": "ECOMMERCE",
    "dateRange": {
        "startDate": "2020-09-01",
        "endDate": "2020-10-30",
    },
    "pageSize": 100,
}
And a row of the response:
{'sessions': [{'sessionId': '160032xxxxxx', 'deviceCategory': 'mobile', 'platform': 'iOS', 'dataSource': 'web', 'activities': [{'activityTime': '2020-09-17T05:22:33.257619Z', 'source': '(direct)', 'medium': '(none)', 'channelGrouping': 'Direct', 'campaign': '(not set)', 'keyword': '(not set)', 'hostname': 'xxxxxxx.xx', 'landingPagePath': '/', 'activityType': 'ECOMMERCE', 'ecommerce': {}}]
Any suggestions as to why 'ecommerce': {} remains empty are appreciated!

I have found that this is a reported bug. Unfortunately, it has been open for 1y+ :(
https://issuetracker.google.com/issues/139107430
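For reference, here is a minimal Node.js sketch of how this userActivity.search request can be sent with the googleapis client. The client setup and the auth object are my assumptions, not from the original post; even with a correct request, the ecommerce payload may still come back empty because of the bug above.
const { google } = require('googleapis');

// `auth` is assumed to be an authorized OAuth2/JWT client with the
// analytics.readonly scope; `body` is the request object from the question.
const analyticsreporting = google.analyticsreporting({ version: 'v4', auth });

async function searchUserActivity(body) {
  const res = await analyticsreporting.userActivity.search({ requestBody: body });
  // Each session contains activities; ECOMMERCE activities carry an `ecommerce`
  // object, which is the field the linked bug reports as empty.
  return res.data.sessions;
}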

Related

paypal-checkout-component for vue stays in sandbox mode

I'm struggling to go live. It's weird how something works in sandbox mode but then either there is no documented way to switch to production or it just does not work.
So here is what I got:
<template lang="pug">
.paypal
  Spinner(:size="8" :thickness="3")
  PaypalButtons.buttons(
    :env="env"
    :style="style"
    :createOrder="order"
    :onInit="init"
    :onClick="validate"
    :onApprove="approve"
    :onError="error"
  )
</template>
<script>
export default {
  name: 'PayPal',
  props: {
    env: {
      type: String,
      default: 'sandbox',
      validator: value => ['sandbox', 'production'].includes(value)
    },
  },
  // ...
}
</script>
I've tried setting the env prop to production and removing it completely. I can't find any documentation on how to set the environment. I must be missing something fundamental.
The error:
As I mentioned, sandbox mode works fine, but as soon as I go live (server side using PayPal's production URL and client side with the corresponding env prop), I get the following errors.
Request
URL: https://www.sandbox.paypal.com/graphql?UpdateClientConfig.
BODY:
{
  "query": "\n mutation UpdateClientConfig(\n $orderID : String!,\n $fundingSource : ButtonFundingSourceType!,\n $integrationArtifact : IntegrationArtifactType!,\n $userExperienceFlow : UserExperienceFlowType!,\n $productFlow : ProductFlowType!,\n $buttonSessionID : String\n ) {\n updateClientConfig(\n token: $orderID,\n fundingSource: $fundingSource,\n integrationArtifact: $integrationArtifact,\n userExperienceFlow: $userExperienceFlow,\n productFlow: $productFlow,\n buttonSessionID: $buttonSessionID\n )\n }\n ",
  "variables": {
    "orderID": "17884710UT885974F",
    "fundingSource": "paypal",
    "integrationArtifact": "PAYPAL_JS_SDK",
    "userExperienceFlow": "INCONTEXT",
    "productFlow": "SMART_PAYMENT_BUTTONS"
  }
}
Response:
{
  "data": {
    "updateClientConfig": null
  },
  "errors": [
    {
      "_name": "RESOURCE_NOT_FOUND",
      "checkpoints": [
        "patchClientConfig"
      ],
      "contingency": true,
      "data": {
        "message": "The specified resource does not exist."
      },
      "message": "RESOURCE_NOT_FOUND",
      "meta": {},
      "path": [
        "updateClientConfig"
      ],
      "statusCode": 200
    }
  ],
  "extensions": {
    "correlationId": "464b1d56d4581",
    "tracing": {
      "duration": 98157194,
      "endTime": "2021-09-06T16:29:45.133Z",
      "execution": {
        "resolvers": [
          {
            "duration": 96271082,
            "fieldName": "updateClientConfig",
            "parentType": "Mutation",
            "path": [
              "updateClientConfig"
            ],
            "returnType": "Boolean",
            "startOffset": 1222180
          }
        ]
      },
      "startTime": "2021-09-06T16:29:45.035Z",
      "version": 1
    }
  }
}
There is another request to https://www.sandbox.paypal.com/graphql?GetCheckoutDetails with a similar response.
As far as I can tell the request URL should not be www.sandbox.paypal...
I have also commented on an existing issue on GitHub, but I believe it will take too long to get an answer that way.
You are using a sandbox client ID.
Change to a live client ID, from an app in the "Live" tab of your Applications in developer.paypal.com
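For illustration, here is a minimal sketch of the current PayPal JS SDK approach, where the environment is implied by which client ID you load the SDK with rather than by an env prop. LIVE_CLIENT_ID is a placeholder for the client ID from the "Live" tab, and the order/approve handlers are assumed to be the ones from your component.
// Load the JS SDK with the live client ID; no sandbox URL or env flag is involved.
const script = document.createElement('script');
script.src = 'https://www.paypal.com/sdk/js?client-id=LIVE_CLIENT_ID';
script.onload = () => {
  window.paypal.Buttons({
    createOrder: order,   // your existing createOrder handler from the component
    onApprove: approve    // your existing onApprove handler from the component
  }).render('#paypal-buttons');
};
document.head.appendChild(script);
If your Vue wrapper loads the SDK for you, the same rule should apply: point it at the live client ID and the buttons run against the live environment.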

Docusign: 400 error "Unable to parse multipart body" when trying to create an envelope from a template ID from a UI5 application

We are trying to figure out whether Docusign can be used in production scenarios for our client requirements.
We have a UI5 application which will be used to sign documents. We have created a template in the demo instance of Docusign.
However, when we try to create an envelope from the application, we get a 400 error "Unable to parse multipart body". The same payload, when used in the Postman application, results in the envelope being created successfully. The headers passed are also the same.
In the UI5 app:
var settings = {
  "async": true,
  "crossDomain": true,
  "url": "/docusign/envelopes",
  "method": "POST",
  "timeout": 0,
  "headers": {
    "Authorization": "User DnVj27euWrCi4ANoMV5puvxVxYAcUCG3PlkUSpWpC08=, Organization 6ba64ce816dec995b17d04605e329a30, Element X4XuUq/T5UUh2o9xwaamZCCRwOKUCPr1Kv1Nj+qHPj0=",
    "Content-Type": "application/json"
  },
  "data": JSON.stringify({
    "status": "sent",
    "compositeTemplates": [{
      "compositeTemplateId": "1",
      "inlineTemplates": [{
        "recipients": {
          "signers": [{
            "email": "johndoe#testmail.com",
            "name": "John Doe",
            "recipientId": "1",
            "roleName": "Signer",
            "clientUserId": "12345",
            "tabs": {
              "textTabs": [{
                "tabLabel": "firstName",
                "value": "John"
              }, {
                "tabLabel": "lastName",
                "value": "Doe"
              }, {
                "tabLabel": "phoneNo",
                "value": "022-635363"
              }, {
                "tabLabel": "email",
                "value": "test#gmail.com"
              }]
            }
          }]
        },
        "sequence": "1"
      }],
      "serverTemplates": [{
        "sequence": "1",
        "templateId": "0bf97611-a457-4e8e-ac7e-1593c17ba3f6"
      }]
    }]
  })
};
var deferred = $.Deferred();
$.ajax(settings).done(function (response) {
  deferred.resolve(response);
}.bind(this)).fail(function (error) {
  deferred.reject(error);
}.bind(this));
In Postman, the same request and headers result in the envelope being created successfully.
Help would be greatly appreciated in resolving this issue.
Could you stringify outside of the JSON settings and perhaps break your call down a little before placing everything in settings?
I.e. try re-shaping your jQuery ajax call:
var headers = {
  "Authorization": "User DnVj27euWrCi4ANoMV5puvxVxYAcUCG3PlkUSpWpC08=, Organization 6ba64ce816dec995b17d04605e329a30, Element X4XuUq/T5UUh2o9xwaamZCCRwOKUCPr1Kv1Nj+qHPj0=",
  "Content-Type": "application/json"
};
var payload = JSON.stringify({
  "status": "sent",
  "compositeTemplates": [{
    "compositeTemplateId": "1",
    "inlineTemplates": [{
      "recipients": {
        "signers": [{
          "email": "johndoe#testmail.com",
          "name": "John Doe",
          "recipientId": "1",
          "roleName": "Signer",
          "clientUserId": "12345",
          "tabs": {
            "textTabs": [{
              "tabLabel": "firstName",
              "value": "John"
            }, {
              "tabLabel": "lastName",
              "value": "Doe"
            }, {
              "tabLabel": "phoneNo",
              "value": "022-635363"
            }, {
              "tabLabel": "email",
              "value": "test#gmail.com"
            }]
          }
        }]
      },
      "sequence": "1"
    }],
    "serverTemplates": [{
      "sequence": "1",
      "templateId": "0bf97611-a457-4e8e-ac7e-1593c17ba3f6"
    }]
  }]
});
$.ajax({
  "async": true,
  "crossDomain": true,
  "url": "/docusign/envelopes",
  "method": "POST",
  "timeout": 0,
  "headers": headers,
  "data": payload
});
I am sure this will lead you to your final "consolidated" answer.
If the exact same JSON is being sent from Postman and from the UI5 application, then you'll get the same results. But you aren't, so something is different.
Probably the UI5 system is sending the API request as a MIME multi-part message, but isn't setting the content type for the JSON request part correctly.
To verify: use the DocuSign API logger to see what is being received by DocuSign. Compare between the request being sent from UI5 and from Postman.
To fix: you'll need to set additional UI5 parameters so the request is NOT sent as a multi-part mime message. Or send the multi-part message with the needed settings. See the docs and see a multi-part example.
PS PLEASE post an answer to your question with the solution to your problem (once you've found it) to help others in the future. Thank you!!
I was able to fix the issue by directly using the Docusign API (https://demo.docusign.net/restapi/v2/accounts). I was earlier using the SAP Openconnector to connect to Docusign.
https://api.openconnectors.eu3.ext.hanatrial.ondemand.com/elements/api-v2
Thanks all for the help.
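For anyone following this route, here is a hedged sketch of what calling the eSignature REST API directly can look like. ACCOUNT_ID and ACCESS_TOKEN are placeholders, envelopeDefinition stands for the composite-template JSON from the question, and this is not the exact code the poster used.
// POST the envelope definition straight to the demo REST endpoint as plain JSON.
const url = `https://demo.docusign.net/restapi/v2/accounts/${ACCOUNT_ID}/envelopes`;
fetch(url, {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${ACCESS_TOKEN}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify(envelopeDefinition)
})
  .then(res => res.json())
  .then(result => console.log('Envelope created:', result.envelopeId))
  .catch(err => console.error(err));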
I have run into the very same issue recently, and almost decided to give up, but finally, I have managed to find a way to make it work!
The thing is that you need to execute the Ajax call in the following way:
_createEnvelops: function () {
  var deferred = $.Deferred();
  var oTemplateData = this._getTemplateData();
  var oFormData = new FormData();
  // Send the envelope definition as a form part named "envelope"
  oFormData.append('envelope', JSON.stringify(oTemplateData));
  var settings = {
    "async": true,
    "crossDomain": true,
    "url": '/docusign/envelopes',
    "method": "POST",
    "data": oFormData,
    // Let the browser build the multipart body and boundary itself
    processData: false,
    contentType: false,
    "headers": {
      "Authorization": sAuthToken // the Authorization header value shown earlier
    }
  };
  $.ajax(settings).done(function (response) {
    deferred.resolve(response);
  }.bind(this)).fail(function (error) {
    deferred.reject(error);
  }.bind(this));
  return deferred;
},
Maybe it will be useful for someone in the future ;)

Auth.currentAuthenticatedUser not loading name and family name attributes (and others) from Cognito

I'm using the Auth.currentAuthenticatedUser method to retrieve the attributes recorded for the logged-in user from AWS Cognito, but only basic attributes are showing. The ones I want are "name" and "family name", but they don't seem to be loaded in the Promise.
This is only the beginning, but I'm concerned because I will want to retrieve other attributes which are not showing up, like the user picture, for instance.
I tried using currentAuthenticatedUser and currentUserInfo with the same results.
async componentDidMount() {
  await Auth.currentAuthenticatedUser({ bypassCache: true })
    .then(user => this.setUserInfo(user))
    .catch(err => console.log(err))
}
CognitoUser {
"Session": null,
"attributes": Object {
"email": "r...#gmail.com",
"email_verified": true,
"phone_number": "+5...",
"phone_number_verified": false,
"sub": "246e9...",
},
"authenticationFlowType": "USER_SRP_AUTH",
"client": Client {
"endpoint": "https://cognito-idp.us-east-2.amazonaws.com/",
"userAgent": "aws-amplify/0.1.x react-native",
},
"deviceKey": undefined,
"keyPrefix": "CognitoIdentityServiceProvider.12ddetjn0c0jo0npi6lrec63a7",
"pool": CognitoUserPool {
"advancedSecurityDataCollectionFlag": true,
"client": Client {
"endpoint": "https://cognito-idp.us-east-2.amazonaws.com/",
"userAgent": "aws-amplify/0.1.x react-native",
},
"clientId": "12ddetjn0c0jo0npi6lrec63a7",
"storage": [Function MemoryStorage],
"userPoolId": "us-east...",
},
"preferredMFA": "NOMFA",
"signInUserSession": CognitoUserSession {
"accessToken": CognitoAccessToken {
"jwtToken": "e...oJPg",
"payload": Object {
"auth_time": 1565137817,
"client_id": "1...6lrec63a7",
"event_id": "c3...-4bd9-ad42-200f95f9921c",
"exp": 15...2,
"iat": 156...5872,
"iss": "https://cognito-idp.us-east-2.amazonaws.com/us-east-...",
"jti": "5483e...544149c42e58",
"scope": "aws.cognito.signin.user.admin",
"sub": "246e93...f4d8e6f4725b",
"token_use": "access",
"username": "r...f",
},
},
"clockDrift": -2,
"idToken": CognitoIdToken {
"jwtToken": "eyJraWQiOiJk...",
"payload": Object {
"aud": "12ddetjn0c0j..rec63a7",
"auth_time": 1565137817,
"cognito:username": "r..",
"email": "r..#gmail.com",
"email_verified": true,
"event_id": "c3ae..200f95f9921c",
"exp": ..2,
"iat": ..2,
"iss": "https://cognito-idp.us-east-2.amazonaws.com/us-east-..",
"phone_number": "+5...3",
"phone_number_verified": false,
"sub": "246e937..f4d8e6f4725b",
"token_use": "id",
},
},
"refreshToken": CognitoRefreshToken {
"token": "eyJjd...",
},
},
"storage": [Function MemoryStorage],
"userDataKey": "CognitoIdentityServiceProvider.12ddetjn0....userData",
"username": "r...ff",
}
To get all user attributes, you may be looking for the Auth.userAttributes() function. To use this you want something like this code:
const authUser = await Auth.currentAuthenticatedUser();
const attributes = await Auth.userAttributes(authUser);
// the next line is a convenience that moves the attributes into
// the authUser object
attributes.forEach((attr) => {
authUser.attributes[attr.Name] = attr.Value;
});
If you're still not getting the attributes you need, take a look here, and you can see that you can enable the reading of other attributes from the Amplify command line.
So, in the root of your project:
Type "amplify update auth" at the console.
Select "Walkthrough the auth configurations"
Step through making all the same selections as you've done before.
When it asks, "Do you want to specify the user attributes this app can read and write?" it's "Y", and then you select the attributes you want to be able to read.
When you finish the wizard, use "amplify push auth"
When that's completed, try re-running.
As an alternative to steps 1-4 above, you can also edit cli-inputs.json in the amplify\backend\auth<your auth config name> directory. It's in "userpoolClientReadAttributes". Simply add the attributes you would like to this array (e.g. "name").
This answer was verified with amplify CLI version 8.1.0.
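For illustration, the relevant fragment of cli-inputs.json might look like the following; the surrounding structure of the file is omitted, and the attribute names listed (name, family_name, picture) are the standard Cognito attribute keys relevant to this question:
"userpoolClientReadAttributes": [
  "email",
  "phone_number",
  "name",
  "family_name",
  "picture"
],
After editing the file, run "amplify push auth" (as in step 5) so the updated read permissions are applied to the user pool client.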

Error with IPFS CORS

When trying to use IPFS from my localhost I am having trouble accessing the IPFS service. I tried setting my config to accept localhost and all the server stuff, but nothing seems to work.
The error:
Failed to load http://127.0.0.1:5001/api/v0/files/stat?arg=0x6db883c6f3b2824d26f3b2e9c30256b490d125b10a3942f49a1ac715dd2def89&stream-channels=true: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:63342' is therefore not allowed access. The response had HTTP status code 403. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
IPFS Config:
{
"API": {
"HTTPHeaders": {
"Access-Control-Allow-Origin": [
"*"
]
}
},
"Addresses": {
"API": "/ip4/127.0.0.1/tcp/5001",
"Announce": [],
"Gateway": "/ip4/127.0.0.1/tcp/8080",
"NoAnnounce": [],
"Swarm": [
"/ip4/0.0.0.0/tcp/4001",
"/ip6/::/tcp/4001"
]
},
"Bootstrap": [
"/dnsaddr/bootstrap.libp2p.io/ipfs/QmNnooDu7bfjPFoTZYxMNLWUQJyrVwtbZg5gBMjTezGAJN",
"/dnsaddr/bootstrap.libp2p.io/ipfs/QmQCU2EcMqAqQPR2i9bChDtGNJchTbq5TbXJJ16u19uLTa",
"/dnsaddr/bootstrap.libp2p.io/ipfs/QmbLHAnMoJPWSCR5Zhtx6BHJX9KiKNN6tpvbUcqanj75Nb",
"/dnsaddr/bootstrap.libp2p.io/ipfs/QmcZf59bWwK5XFi76CZX8cbJ4BhTzzA3gU1ZjYZcYW3dwt",
"/ip4/104.131.131.82/tcp/4001/ipfs/QmaCpDMGvV2BGHeYERUEnRQAwe3N8SzbUtfsmvsqQLuvuJ",
"/ip4/104.236.179.241/tcp/4001/ipfs/QmSoLPppuBtQSGwKDZT2M73ULpjvfd3aZ6ha4oFGL1KrGM",
"/ip4/128.199.219.111/tcp/4001/ipfs/QmSoLSafTMBsPKadTEgaXctDQVcqN88CNLHXMkTNwMKPnu",
"/ip4/104.236.76.40/tcp/4001/ipfs/QmSoLV4Bbm51jM9C4gDYZQ9Cy3U6aXMJDAbzgu2fzaDs64",
"/ip4/178.62.158.247/tcp/4001/ipfs/QmSoLer265NRgSp2LA3dPaeykiS1J6DifTC88f5uVQKNAd",
"/ip6/2604:a880:1:20::203:d001/tcp/4001/ipfs/QmSoLPppuBtQSGwKDZT2M73ULpjvfd3aZ6ha4oFGL1KrGM",
"/ip6/2400:6180:0:d0::151:6001/tcp/4001/ipfs/QmSoLSafTMBsPKadTEgaXctDQVcqN88CNLHXMkTNwMKPnu",
"/ip6/2604:a880:800:10::4a:5001/tcp/4001/ipfs/QmSoLV4Bbm51jM9C4gDYZQ9Cy3U6aXMJDAbzgu2fzaDs64",
"/ip6/2a03:b0c0:0:1010::23:1001/tcp/4001/ipfs/QmSoLer265NRgSp2LA3dPaeykiS1J6DifTC88f5uVQKNAd"
],
"Datastore": {
"BloomFilterSize": 0,
"GCPeriod": "1h",
"HashOnRead": false,
"Spec": {
"mounts": [
{
"child": {
"path": "blocks",
"shardFunc": "/repo/flatfs/shard/v1/next-to-last/2",
"sync": true,
"type": "flatfs"
},
"mountpoint": "/blocks",
"prefix": "flatfs.datastore",
"type": "measure"
},
{
"child": {
"compression": "none",
"path": "datastore",
"type": "levelds"
},
"mountpoint": "/",
"prefix": "leveldb.datastore",
"type": "measure"
}
],
"type": "mount"
},
"StorageGCWatermark": 90,
"StorageMax": "10GB"
},
"Discovery": {
"MDNS": {
"Enabled": true,
"Interval": 10
}
},
"Experimental": {
"FilestoreEnabled": false,
"Libp2pStreamMounting": false,
"ShardingEnabled": false
},
"Gateway": {
"HTTPHeaders": {
"Access-Control-Allow-Headers": [
"X-Requested-With",
"Range"
],
"Access-Control-Allow-Methods": [
"GET"
],
"Access-Control-Allow-Origin": [
"localhost:63342"
]
},
"PathPrefixes": [],
"RootRedirect": "",
"Writable": false
},
"Identity": {
"PeerID": "QmRgQdig4Z4QNEqs5kp45bmq6gTtWi2qpN2WFBX7hFsenm"
},
"Ipns": {
"RecordLifetime": "",
"RepublishPeriod": "",
"ResolveCacheSize": 128
},
"Mounts": {
"FuseAllowOther": false,
"IPFS": "/ipfs",
"IPNS": "/ipns"
},
"Reprovider": {
"Interval": "12h",
"Strategy": "all"
},
"Swarm": {
"AddrFilters": null,
"ConnMgr": {
"GracePeriod": "20s",
"HighWater": 900,
"LowWater": 600,
"Type": "basic"
},
"DisableBandwidthMetrics": false,
"DisableNatPortMap": false,
"DisableRelay": false,
"EnableRelayHop": false
}
}
Ben, try replacing 127.0.0.1 with localhost. go-ipfs whitelists localhost only. Also check https://github.com/ipfs/js-ipfs-api/#cors
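A minimal sketch of that suggestion from the browser side, with the hash replaced by a placeholder (whether this is enough depends on your daemon's CORS config, as discussed below):
// Same files/stat call as in the error message, but aimed at "localhost"
// instead of "127.0.0.1" (replace <your-hash> with the real argument).
fetch('http://localhost:5001/api/v0/files/stat?arg=<your-hash>&stream-channels=true')
  .then(res => res.json())
  .then(stat => console.log(stat))
  .catch(err => console.error(err));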
My answer might come very late; however, I have been trying to solve some CORS issues with IPFS on my end, so I might have a solution for you.
By running:
# please update origin according to your setup...
origin=http://localhost:63342
ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin '["'"$origin"'", "http://127.0.0.1:8080","http://localhost:3000", "http://127.0.0.1:48084", "https://gateway.ipfs.io", "https://webui.ipfs.io"]'
ipfs config API.HTTPHeaders.Access-Control-Allow-Origin
and then restarting your ipfs daemon, it might be fixed.
If the "fetch" button on the following page works, you are all set: https://gateway.ipfs.io/ipfs/QmXkhGQNruk3XcGsidCzQbcNQ5a8oHWneHZXkPvWB26RbP/
This command works for me (you can allow requests from multiple origins):
ipfs config --json API.HTTPHeaders.Access-Control-Allow-Origin '["'"$origin"'", "http://127.0.0.1:8080", "http://localhost:3000"]'
As in the previous answer, $origin holds your own origin, e.g. http://localhost:63342.

Get video from tweet using Twitter API

I am trying to use the Twitter API to import a video from a given tweet. However, when I use the statuses/show endpoint, it doesn't return an extended entity for the video as it would for an image; instead it returns a url entity linking to a video container embed, whose video player contains an obscure link to the video.
Here is an example:
I am trying to import the tweet at https://twitter.com/NHL/status/633987786018717696
Using the Twitter API's statuses/show endpoint and the tweet id, I get this response:
{
"created_at": "Wed Aug 19 13:04:01 +0000 2015",
"id": 633987786018717700,
"id_str": "633987786018717696",
"text": "The offseason has us missing all of our fans, even the wacky ones... especially the wacky ones. #IsItOctoberYet?\nhttps://t.co/v4UGDQpa61",
"source": "Twitter Web Client",
"truncated": false,
"in_reply_to_status_id": null,
"in_reply_to_status_id_str": null,
"in_reply_to_user_id": null,
"in_reply_to_user_id_str": null,
"in_reply_to_screen_name": null,
"user": {
"id": 50004938,
"id_str": "50004938",
"name": "NHL",
"screen_name": "NHL",
"location": "30 cities across U.S. & Canada",
"description": "The official source of everything you need and want to know from the National Hockey League. Read before tweeting us: http://t.co/JlyVXSpqMn",
"url": "http://t.co/VI8RlwuVr9",
"entities": {
"url": {
"urls": [
{
"url": "http://t.co/VI8RlwuVr9",
"expanded_url": "http://www.NHL.com",
"display_url": "NHL.com",
"indices": [
0,
22
]
}
]
},
"description": {
"urls": [
{
"url": "http://t.co/JlyVXSpqMn",
"expanded_url": "http://nhl.com/socialmediapolicy",
"display_url": "nhl.com/socialmediapol…",
"indices": [
118,
140
]
}
]
}
},
"protected": false,
"followers_count": 4130811,
"friends_count": 2646,
"listed_count": 18479,
"created_at": "Tue Jun 23 15:24:18 +0000 2009",
"favourites_count": 909,
"utc_offset": -14400,
"time_zone": "Eastern Time (US & Canada)",
"geo_enabled": true,
"verified": true,
"statuses_count": 87436,
"lang": "en",
"contributors_enabled": false,
"is_translator": false,
"is_translation_enabled": true,
"profile_background_color": "000000",
"profile_background_image_url": "http://pbs.twimg.com/profile_background_images/378800000139631457/fd-xWa9G.jpeg",
"profile_background_image_url_https": "https://pbs.twimg.com/profile_background_images/378800000139631457/fd-xWa9G.jpeg",
"profile_background_tile": false,
"profile_image_url": "http://pbs.twimg.com/profile_images/534776558238437376/yxrm83O7_normal.jpeg",
"profile_image_url_https": "https://pbs.twimg.com/profile_images/534776558238437376/yxrm83O7_normal.jpeg",
"profile_banner_url": "https://pbs.twimg.com/profile_banners/50004938/1435502670",
"profile_link_color": "040CDE",
"profile_sidebar_border_color": "FFFFFF",
"profile_sidebar_fill_color": "2E2E2E",
"profile_text_color": "0F5A80",
"profile_use_background_image": true,
"has_extended_profile": false,
"default_profile": false,
"default_profile_image": false,
"following": true,
"follow_request_sent": false,
"notifications": false
},
"geo": null,
"coordinates": null,
"place": null,
"contributors": null,
"is_quote_status": false,
"retweet_count": 865,
"favorite_count": 1342,
"entities": {
"hashtags": [
{
"text": "IsItOctoberYet",
"indices": [
96,
111
]
}
],
"symbols": [],
"user_mentions": [],
"urls": [
{
"url": "https://t.co/v4UGDQpa61",
"expanded_url": "https://amp.twimg.com/v/2a0210d1-4d39-4665-a749-ea34f8efef08",
"display_url": "amp.twimg.com/v/2a0210d1-4d3…",
"indices": [
113,
136
]
}
]
},
"favorited": false,
"retweeted": false,
"possibly_sensitive": false,
"possibly_sensitive_appealable": false,
"lang": "en"
}
Upon following the URL, the source of the video tag is https://amp.twimg.com/amplify-web-player/prod/source.html?vmap_url=https%3A%2F%2Famp.twimg.com%2Fprod%2Fmultibr_v_1%2Fvmap%2F2015%2F08%2F20%2F13%2F609fc2af-1d06-4894-80be-1c231f97557a%2Fa69baa90-58de-4d1d-b2dc-2c3ef1ab9b35.vmap&duration=91.958&image_src=https%3A%2F%2Famp.twimg.com%2Fprod%2Fdefault%2F2015%2F08%2F20%2F13%2Fe8f0b317-ba48-4cec-bf2c-da4598e2b46b_poster-67227.jpg&content_id=609fc2af-1d06-4894-80be-1c231f97557a&page=amplify_card
How do I extract this video file from the tweet if they do not supply an extended entity for it?
The Twitter API has now changed and the videos are stored in the extended_entities object. There could be multiple sources depending on bitrate. This is how to retrieve the one with the highest bitrate:
var bitrate = 0;
var hq_video_url;
var variants = tweet.extended_entities.media[0].video_info.variants;
for (var j = 0; j < variants.length; j++) {
  if (variants[j].bitrate) {
    if (variants[j].bitrate > bitrate) {
      bitrate = variants[j].bitrate;
      hq_video_url = variants[j].url;
    }
  }
}
Workaround for GIFs here!
In Twitter API v2, it is not currently possible to fetch GIF and video URLs. I know, it is silly. But one workaround is to fetch the preview image of the content and construct the media URL by hand.
Let's say we want to get GIF URL of the following tweet via V2 API: https://twitter.com/FloodSocial/status/870042717589340160
When we fetch the tweet with the following URL
https://api.twitter.com/2/tweets/870042717589340160?tweet.fields=attachments,author_id,created_at,entities,id,text&media.fields=preview_image_url,url&expansions=attachments.media_keys (with your bearer token of course), you will see that the response includes a preview_image_url with https://pbs.twimg.com/tweet_video_thumb/DBMDLy_U0AAqUWP.jpg
So here we can extract the DBMDLy_U0AAqUWP part from the URL, and construct the real GIF URL manually where it should be https://video.twimg.com/tweet_video/DBMDLy_U0AAqUWP.mp4
There you go. You only need to write the extractor function; a sketch follows.
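Here is a small sketch of that extractor, using the example URLs from above (note that the tweet_video URL pattern is an observation about how these GIFs are currently served, not a documented API):
// Take the media ID from the preview image and build the mp4 URL by hand.
const previewImageUrl = 'https://pbs.twimg.com/tweet_video_thumb/DBMDLy_U0AAqUWP.jpg';
const mediaId = previewImageUrl.split('/').pop().replace(/\.(jpg|png)$/, '');
const gifVideoUrl = `https://video.twimg.com/tweet_video/${mediaId}.mp4`;
console.log(gifVideoUrl); // https://video.twimg.com/tweet_video/DBMDLy_U0AAqUWP.mp4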
When using the statuses/show endpoint, add the option tweet_mode: 'extended' to get the extended_entities object.
This path in the response will have the video URLs: extended_entities.media[0].video_info.variants
Example:
[
  {
    content_type: 'application/x-mpegURL',
    url: 'https://video.twimg.com/ext_tw_video/1358226.........'
  },
  {
    bitrate: 832000,
    content_type: 'video/mp4',
    url: 'https://video.twimg.com/ext_tw_video/1358226.........'
  },
  {
    bitrate: 256000,
    content_type: 'video/mp4',
    url: 'https://video.twimg.com/ext_tw_video/1358226.........'
  }
]
Well, depending on what platform you're using:
1. Connect to the URL directly.
2. Consume the binary video data.
3. Pump the binary data (e.g., a byte stream) through some widget that will display it.
The specifics on how to do this will vary greatly based on what platform & language(s) you are using; a minimal browser sketch is below.
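A minimal browser sketch of the steps above, assuming videoUrl holds one of the mp4 variant URLs from video_info.variants (in many cases you can also just assign that URL to the video element's src directly):
fetch(videoUrl)
  .then(res => res.blob())                      // consume the binary video data
  .then(blob => {
    const player = document.querySelector('video');
    player.src = URL.createObjectURL(blob);     // pump it into a widget that displays it
    player.play();
  })
  .catch(err => console.error(err));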