How do I get the traffic source for each video? (YouTube API)

TL;DR: I want to get the percentage of views from external traffic sources for each video (not for each day).
Long story short, I'm working on a project where I'm trying to understand what factors influence video views on a niche channel I have.
One factor I want to look at is the percent of views from external sources. To track this, I want to measure the traffic source types for each of my ~500 videos.
See the example screenshot from the built-in YouTube Analytics page:
(screenshot omitted)
I have been able to get views/title/tags/thumbnail URL easily using Python, but I have not been able to get the percentage of views from external sources for each video.
Is this possible? A link to a working page in the API Explorer would be great. I would like to avoid going through every single one of my 500 videos and getting this number by hand, although I suppose I could write a macro that does that...
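A note on what seems possible here: the YouTube Analytics API exposes a per-video traffic source breakdown via the insightTrafficSourceType dimension, and external traffic appears to be reported under the EXT_URL source type. A minimal Python sketch, assuming creds already holds OAuth credentials authorized for the yt-analytics.readonly scope:

```python
from googleapiclient.discovery import build

# Assumes `creds` holds OAuth credentials for the
# https://www.googleapis.com/auth/yt-analytics.readonly scope.
yt_analytics = build("youtubeAnalytics", "v2", credentials=creds)

def external_view_share(video_id, start="2015-01-01", end="2023-12-31"):
    """Fraction of a video's views that came from external sources."""
    response = yt_analytics.reports().query(
        ids="channel==MINE",
        startDate=start,
        endDate=end,
        metrics="views",
        dimensions="insightTrafficSourceType",
        filters=f"video=={video_id}",
    ).execute()
    rows = response.get("rows", [])
    total = sum(views for _, views in rows)
    external = sum(views for source, views in rows if source == "EXT_URL")
    return external / total if total else 0.0
```

Looping this over the ~500 video IDs (which the Data API can list for the channel) would avoid collecting the number by hand.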

Related

Get search terms keywords from traffic source of video using Youtube Data Api v3

I want to get the search keywords from a video's traffic sources using the YouTube Data API.
I can already see 8-10 of them on the dashboard, but most of the time, even when a video is around 1,000 views and the dashboard shows 7-8 keywords, the API returns an empty result or only one keyword. How can I solve this problem? I would be glad for any help.
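For reference, the search terms appear to come from the YouTube Analytics API (not the Data API) via the insightTrafficSourceDetail dimension, which as far as I know only returns the top terms, so low-volume keywords may be missing. A rough sketch of the query, with placeholder dates and video ID:

```python
from googleapiclient.discovery import build

# Assumes `creds` holds OAuth credentials for the
# https://www.googleapis.com/auth/yt-analytics.readonly scope.
yt_analytics = build("youtubeAnalytics", "v2", credentials=creds)

# VIDEO_ID is a placeholder for the video you want keywords for.
response = yt_analytics.reports().query(
    ids="channel==MINE",
    startDate="2015-01-01",
    endDate="2023-12-31",
    metrics="views",
    dimensions="insightTrafficSourceDetail",
    filters="video==VIDEO_ID;insightTrafficSourceType==YT_SEARCH",
    maxResults=25,
    sort="-views",
).execute()
print(response.get("rows", []))
```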

How can I get the Views Per Hour stat for a YouTube video?

I checked the YouTube API and it mainly deals with adding functionality related to the YouTube app rather than getting analytics data about videos.
There is a Chrome extension called VidIQ that shows the views per hour of a particular video when you go to the video's page on YouTube, so I tried reading its source code, but it is all compressed and I can't easily find what I'm looking for.
Could someone explain to me how the VidIQ Chrome extension gets the views per hour stat for YouTube? Maybe it's not an official stat from YouTube but a rough estimate calculated by VidIQ. How do they get this information?
I tried debugging the VidIQ Chrome extension to search through the source code, but adding a simple HTML tag corrupted the file and disabled the extension until I repaired it again. I'm having difficulty deciphering the source code.
Most of what VidIQ shows comes from the YouTube Analytics API rather than directly from the YouTube Data API, although I would bet they use some combination of both.
If you create a report that extracts views and run it every hour, you should get the results you are looking for.
However, I would be willing to bet that they cache a lot of the data and do some internal analytics on it. They would need to cache it, as the YouTube Analytics API only returns data for the last 90 days, last I checked.
If your intent is to reverse engineer VidIQ, you may need to accept that a lot of the data you are seeing is stored internally in their system and generated by them based upon the data that is available in the YouTube Analytics API and the YouTube Data API.
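As a concrete illustration of the polling approach, here is a minimal sketch using the Data API's videos.list, which exposes the running viewCount; sampling it twice, an hour apart, and differencing gives a rough views-per-hour figure (API_KEY and VIDEO_ID are placeholders):

```python
import time
from googleapiclient.discovery import build

# Placeholders: supply your own API key and video ID.
youtube = build("youtube", "v3", developerKey=API_KEY)

def view_count(video_id):
    """Return the current total view count for a video."""
    response = youtube.videos().list(part="statistics", id=video_id).execute()
    return int(response["items"][0]["statistics"]["viewCount"])

first = view_count(VIDEO_ID)
time.sleep(3600)  # wait one hour between samples
second = view_count(VIDEO_ID)
print(f"~{second - first} views in the last hour")
```

In practice you would run the sampling on a scheduler and store each sample, which is presumably close to what VidIQ's internal caching does.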

Shopify Trekkie loading extra tracking Pixels

This is by far the most frustrating issue I've run into with Shopify. I'm trying to optimize a client's site speed by wrapping all their tracking codes into Google Tag Manager to reduce the total number of outgoing requests. I removed all hardcoded tracking pixels from theme.liquid and placed them in GTM, and went through ALL the apps and sales channels and disconnected them from their accounts, but there are still extra codes being loaded by Trekkie.
I'm using the Shopify Facebook and Google Analytics integrations as recommended, so those are not represented in GTM. Even so, it's still somehow loading 2 Google Analytics, 2 Google Ads and 2 Facebook pixels.
As you can see in the source code, there are 2 Facebook pixel IDs contained within the Trekkie object, but how is this possible when there's only one place to add this information?
If I remove the Facebook pixel ID from this screen (Themes > Preferences), then the first pixel will not load; only the second, unwanted pixel loads. The same issue persists for Google Analytics and Google Ads, except that I cannot see multiple account IDs in the source code; I can only see them in the Network tab of DevTools and in Google Tag Assistant.
I would typically assume that these codes must be somewhere in the theme code or in an app, except I can actually see with DevTools that the code is being called by Trekkie.
This is driving me absolutely crazy and I've already spent lots of time trying to make what I thought should be a simple optimization. If anyone can help with this issue I'd be hugely appreciative.
Thanks!

Best practice for bulk uploading to Street View Publish API?

We're currently in the process of recording 1 fps time-lapsed 4K 360° photos of every island in the Bahamas, with embedded GPS EXIF data. An average hour of filming tends to produce around 600 image frames, which can easily expand to 2,000-10,000 images per day on bigger routes. 2,000 or so are approved on Google Maps already, but we're hitting a larger brick wall.
The Street View app is obviously the best way to upload when you have 50-100 image files, but it struggles once a batch hits 500+ uploads (publishing doesn't start, or the app crashes), so we're left manually submitting collections. Add that to the standard 4,000/day quota, and it's quite a challenge.
Having looked at the Publish API, it's rather tricky to leave a CLI tool running, as it's designed with the OAuth flow in mind and 1-hour access tokens. The service account route seems to be the way to go, but the PHP API client seems to have scant documentation for Street View publishing. Connecting photos is also tricky with that many images.
We ideally need a desktop uploader (like the backup tool), or a way to import directly from folders in Google Drive. The first seems discontinued, and there's no information on the second.
Can anyone explain the best practice for this kind of volume upload with the Street View publishing service?
Since you are only capturing 4K images, it will be much simpler to use video mode, depending on whether your camera supports it at the desired framerate. You may check this documentation for more information.
Additional information:
You may also check out some of the desktop utilities at the bottom of the Street View website which may be able to help.
You may want to consider one of the Street View ready cameras (also listed on the above webpage) that is capable of recording and uploading 360 videos to Street View. Upon publication, the 360 photo frames are extracted from the video and used to create an automatically connected Street View experience on Google Maps.
Check these pages to learn more about this option:
Set up and connect 360 cameras
Capture and publish in Video mode with the Street View app
Tips for capturing 360 videos for Street View
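If you do end up scripting against the Publish API directly, the per-photo flow is startUpload, then upload the image bytes, then create. A rough Python sketch, assuming creds holds OAuth credentials for the streetviewpublish scope (e.g. rebuilt from a stored refresh token so a CLI can run unattended) and that latitude/longitude come from your EXIF data:

```python
import requests
from googleapiclient.discovery import build

# Assumes `creds` holds OAuth credentials for the
# https://www.googleapis.com/auth/streetviewpublish scope.
service = build("streetviewpublish", "v1", credentials=creds)

def publish_photo(path, lat, lng):
    # 1. Request an upload URL for the photo.
    upload_ref = service.photo().startUpload(body={}).execute()
    upload_url = upload_ref["uploadUrl"]

    # 2. Send the raw image bytes to that URL.
    with open(path, "rb") as f:
        requests.post(
            upload_url,
            data=f.read(),
            headers={
                "Authorization": f"Bearer {creds.token}",
                "Content-Type": "image/jpeg",
            },
        ).raise_for_status()

    # 3. Create the photo record, pointing at the upload reference.
    body = {
        "uploadReference": {"uploadUrl": upload_url},
        "pose": {"latLngPair": {"latitude": lat, "longitude": lng}},
    }
    return service.photo().create(body=body).execute()
```

This doesn't solve the connection or quota problems, but it at least makes a long-running batch uploader feasible without re-authenticating every hour.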

Best streaming service for MP4 into a WebView

For the welcome screen of my app, we are trying to serve up a webpage in a WebView that consists of a video and some text. (We want to go this route so that we can quickly update the welcome screen and test changes on the fly, versus having to submit and get approval each time.)
The video is only 8.6 MB and is currently being played via HTML5, hosted on S3 and served via CloudFront. However, the playback still tends to be a bit choppy at times. Does anyone have any recommendations for an optimal way to host and serve the video so that it plays smoothly? Are there any specific settings for S3 or CloudFront anyone would recommend that could help?
Thanks in advance for any help anyone can provide.
The most common technique currently is to use ABR in parallel with a CDN to provide smooth playback.
ABR (Adaptive Bit Rate) involves making multiple copies of the video at different bit rates, from low to high, and hosting these on the server.
The client receives an index file for the videos, e.g. an m3u8 manifest file, and then chooses the best bit rate for the current conditions to allow smooth playback without buffering.
If network conditions improve the client will 'step up' bit rates and if it gets worse it will 'step down' bit rates.
Typically a low or medium bit rate is chosen as the first one to allow quick and smooth start up.
You can see this effect on services like Netflix as they start up, and you can also see it on YouTube if you right click the video and select 'Stats for Nerds'.
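As a concrete illustration, a simplified HLS master playlist for three renditions might look something like this (bandwidths, resolutions, and paths are placeholders):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2400000,RESOLUTION=1280x720
medium/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=4800000,RESOLUTION=1920x1080
high/index.m3u8
```

The player reads this index and then switches between the variant playlists as network conditions change.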
Some links for ABR in AWS Elastic Transcoder - you can set the bit rates you want; e.g., see the note below from their FAQ regarding HLS jobs:
Specify that the transcoding job create a playlist that references the outputs. You should order your bit rates from lowest to highest, with the audio only stream last, since this order will be maintained in the generated playlist file. Once your transcoding job has completed, the output bucket will contain a proper arrangement of your master and individual M3U8 playlists, and MPEG-2 TS media stream fragments.
Take a look at the sample request on the page below, which includes two different bit rates (video service providers will generally have more than two, but this gives you a feel for the approach):
http://docs.aws.amazon.com/elastictranscoder/latest/developerguide/create-job.html
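As a rough sketch, creating such an HLS job from Python with boto3 could look like the following (the pipeline ID, input key, and preset IDs are placeholders; substitute the HLS system presets you actually want):

```python
import boto3

# Placeholders: supply your own pipeline ID, input key, and HLS preset IDs.
transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

response = transcoder.create_job(
    PipelineId="YOUR_PIPELINE_ID",
    Input={"Key": "welcome.mp4"},
    OutputKeyPrefix="welcome/hls/",
    Outputs=[
        {"Key": "low", "PresetId": "HLS_LOW_PRESET_ID", "SegmentDuration": "10"},
        {"Key": "high", "PresetId": "HLS_HIGH_PRESET_ID", "SegmentDuration": "10"},
    ],
    Playlists=[
        {"Name": "master", "Format": "HLSv3", "OutputKeys": ["low", "high"]},
    ],
)
print(response["Job"]["Id"])
```

The resulting master .m3u8 in the output bucket is what you would point the HTML5 player at (via CloudFront).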
Azure Media Services has a built-in "Adaptive Streaming" preset that is content-adaptive and can adjust the encoding settings to suit the incoming content.
See the following - https://learn.microsoft.com/en-us/azure/media-services/media-services-autogen-bitrate-ladder-with-mes