I have code that performs a YouTube upload, which works fine. When the upload is done, it attempts to move the video to a specific playlist.
For testing, I have hardcoded the playlist ID (correct and re-verified) and the video ID (it does exist in my channel) into the code.
I cannot get the video added to the playlist I specify, and I get the output below from the NSLog of query1, but no errors.
Query1: GTLQueryYouTube 0x1700b0860: {method:youtube.playlistItems.insert params:(part) bodyObject:GTLYouTubePlaylistItem}
Any ideas what I am missing?
// Enter PlaylistItem Code
GTLYouTubePlaylistItem *playlistitem = [[GTLYouTubePlaylistItem alloc] init];
GTLYouTubePlaylistItemSnippet *playlistitemSnippet = [[GTLYouTubePlaylistItemSnippet alloc] init];
playlistitemSnippet.playlistId = @"PL4YcQc6s41BjKOoAPAQ_B-KNC2JBB3gl2";
playlistitemSnippet.resourceId.kind = @"youtube#video";
playlistitemSnippet.resourceId.videoId = @"4frmxoGMOcQ";
GTLQueryYouTube *query1 = [GTLQueryYouTube queryForPlaylistItemsInsertWithObject:playlistitem part:@"snippet"];
UIAlertView *waitIndicator1 = [Utils showWaitIndicator:@"Moving Video to Playlist"];
NSLog(@"Query1: %@", query1);
[service executeQuery:query1
    completionHandler:^(GTLServiceTicket *ticket,
                        GTLYouTubePlaylistItem *resource, NSError *error) {
        [waitIndicator1 dismissWithClickedButtonIndex:0 animated:YES];
I ended up scrapping this and used the playlist auto-add feature in YouTube instead. Way easier.
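For anyone hitting the same wall: the logged query shows `params:(part)` with no snippet data in the body, which suggests the snippet was never attached to the playlist item, and `resourceId` is nil at the point its properties are set. A sketch of the likely fix, assuming the GTL v3 YouTube classes used above (untested against the API):

```objectivec
// Build the resource id explicitly; playlistitemSnippet.resourceId starts out
// nil, so assigning to its properties does nothing.
GTLYouTubeResourceId *resourceId = [GTLYouTubeResourceId object];
resourceId.kind = @"youtube#video";
resourceId.videoId = @"4frmxoGMOcQ";

GTLYouTubePlaylistItemSnippet *snippet = [GTLYouTubePlaylistItemSnippet object];
snippet.playlistId = @"PL4YcQc6s41BjKOoAPAQ_B-KNC2JBB3gl2";
snippet.resourceId = resourceId;

// Attach the snippet to the playlist item before building the query;
// without this the insert body is empty.
GTLYouTubePlaylistItem *playlistItem = [GTLYouTubePlaylistItem object];
playlistItem.snippet = snippet;

GTLQueryYouTube *query = [GTLQueryYouTube queryForPlaylistItemsInsertWithObject:playlistItem
                                                                           part:@"snippet"];
```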
I have been exploring the Apple Music API to see what kind of functionality I can expect to be able to use in an iOS app. I have created a little test app that gains permission from the user and outputs the playlists I have (and songs) to NSLog.
MPMediaQuery *myPlaylistsQuery = [MPMediaQuery playlistsQuery];
[myPlaylistsQuery setGroupingType:MPMediaGroupingPlaylist];
NSArray *playlists = [myPlaylistsQuery collections];
for (MPMediaPlaylist *playlist in playlists) {
    NSLog(@"%@", [playlist valueForProperty:MPMediaPlaylistPropertyName]);
    NSArray *songs = [playlist items];
    for (MPMediaItem *song in songs) {
        NSString *songTitle = [song valueForProperty:MPMediaItemPropertyTitle];
        NSLog(@"\t\t%@", songTitle);
    }
}
From this, I have been able to deduce the following (but I'm not 100% certain):
the playlist (basic info: name, id) is stored locally on the device
the playlist's songs are also pulled from local storage, but if the playlist hasn't been downloaded to the device, it goes off to Apple to grab the song list.
So far, so good. What I want to know is:
is there a way of creating a playlist from my app (via the API)?
I know MPMediaPlaylist has addItem and add methods, but I can't seem to find a way of creating the new playlist itself.
According to this page it should be possible: https://affiliate.itunes.apple.com/resources/blog/apple-music-api-faq/
Can a developer create brand new playlists on the user’s device with the Apple Music API?
Yes. The API allows developers to create new playlists on the user’s device.
I've figured this out. If you use the following code you can generate a new playlist and perform an action on it.
NSUUID *uuid = [NSUUID UUID]; // uuid for the playlist
[[MPMediaLibrary defaultMediaLibrary] getPlaylistWithUUID:uuid
                                         creationMetadata:[[MPMediaPlaylistCreationMetadata alloc] initWithName:@"YOUR PLAYLIST NAME"]
                                        completionHandler:^(MPMediaPlaylist * _Nullable playlist, NSError * _Nullable error) {
    NSLog(@"%@", error);
    if (!error) {
        NSLog(@"All ok, let's do some stuff with the playlist!");
    }
}];
Apple's documentation on the whole API is severely lacking in terms of sample code and practical examples!
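To fill that gap slightly, here is one example of "doing stuff" with the playlist once the completion handler hands it back: adding a catalog track by its Apple Music product ID via `addItemWithProductID:completionHandler:` (the product ID below is a placeholder, not a real track):

```objectivec
// Inside the getPlaylistWithUUID: completion handler, once playlist is
// non-nil, a track can be appended by product ID. "203709340" is a
// placeholder value for illustration only.
[playlist addItemWithProductID:@"203709340"
             completionHandler:^(NSError * _Nullable error) {
    if (error) {
        NSLog(@"Failed to add item: %@", error);
    } else {
        NSLog(@"Item added to the playlist");
    }
}];
```

Note that this requires the user to have an active Apple Music membership and that your app has media library authorization.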
I have successfully implemented YouTube video upload. Since I am also dealing with text that can be used as subtitles, I'd like to upload these, as well.
GTLYouTubeVideoContentDetails’s header does not mention what format "caption" has to be in, so I tried SRT, but that didn't work. The SRT file itself seems to be valid, I uploaded it manually to a demo video and it looked fine.
My approach was this:
GTLYouTubeVideo *video = [GTLYouTubeVideo object];
if (nil != captionString)
{
    GTLYouTubeVideoContentDetails *details = [[GTLYouTubeVideoContentDetails alloc] init];
    details.caption = captionString;
    video.contentDetails = details;
    [details release];
}
This was the only alteration to the previously working code. The result now is HTTP status 501 when trying to upload.
Any ideas?
I am trying to upload simple .txt file as a test. It's only 4kb. Here is my code:
- (IBAction)uploadFile:(id)sender {
    if (![self.credentials canAuthorize]) {
        GTMOAuth2WindowController *windowController;
        windowController = [[[GTMOAuth2WindowController alloc] initWithScope:scope
                                                                    clientID:kClientID
                                                                clientSecret:kClientSecret
                                                            keychainItemName:kKeychainItemName
                                                              resourceBundle:nil] autorelease];
        [windowController signInSheetModalForWindow:self.window
                                           delegate:self
                                   finishedSelector:@selector(windowController:finishedWithAuth:error:)];
    } else {
        NSLog(@"Credentials already authorized.");
    }
    GTLDriveFile *driveFile = [GTLDriveFile object];
    driveFile.title = @"myfile";
    driveFile.descriptionProperty = @"Uploaded by Google Drive API test.";
    driveFile.mimeType = @"text/plain";
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:@"/Users/Blake/Downloads/glm/util/autoexp.txt"];
    GTLUploadParameters *params = [GTLUploadParameters uploadParametersWithData:data MIMEType:driveFile.mimeType];
    GTLQueryDrive *query = [GTLQueryDrive queryForFilesInsertWithObject:driveFile uploadParameters:params];
    NSLog(@"Starting Upload to Google Drive");
    [self.progressBar setHidden:NO];
    [self.progressBar startAnimation:sender];
    [self.driveService executeQuery:query completionHandler:^(GTLServiceTicket *st, GTLDriveFile *df, NSError *error) {
        [self.progressBar setHidden:YES];
        if (error == nil) {
            NSLog(@"File upload succeeded");
        } else {
            NSLog(@"Uh-oh: File upload failed");
        }
    }];
}
The GTMOAuth2Authentication is authorized, yet the completion handler block never executes.
Activity Monitor shows my application sent 54 packets at the start, so I assume something is being transmitted.
How can I tell what's going wrong if the block never finishes to give me an error?
Could someone please help me understand what I am doing wrong?
I recently ran into a problem where GTMOAuth2WindowController got stuck: it was unable to present its own window. I spent a lot of time in the debugger and found that I had forgotten to add WebKit.framework. This is not documented explicitly, and there were no messages at runtime (minus one in Google dev karma). So maybe your problem is the same? Check your set of linked frameworks and compare it with the examples shipped with GTL.
I created a 'mirror'-like view in my app that uses the front camera to show a 'mirror' to the user. The problem I'm having is that I have not touched this code in weeks (and it did work then) but now I'm testing it again and it's not working. The code is the same as before, there are no errors coming up, and the view in the storyboard is exactly the same as before. I have no idea what is going on, so I was hoping that this website would help.
Here is my code:
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    // If the front camera is available, show the camera
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureOutput *output = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:output];
    // Setup camera input
    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // You could check for front or back camera here, but for simplicity just grab the first device
    AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
    NSError *error = nil;
    // Create an input and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; // Handle errors
    // Set the session preset
    session.sessionPreset = AVCaptureSessionPresetHigh; // Or another preset supported by the input device
    [session addInput:input];
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    // Now you can add this layer to a view of your view controller
    [cameraView.layer addSublayer:previewLayer];
    previewLayer.frame = self.cameraView.bounds;
    [session startRunning];
    if ([session isRunning]) {
        NSLog(@"The session is running");
    }
    if ([session isInterrupted]) {
        NSLog(@"The session has been interrupted");
    }
} else {
    // Tell the user they don't have a front facing camera
}
Thank you in advance.
Not sure if this is the problem but there is an inconsistency between your code and the comments. The inconsistency is with the following line of code:
AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
In the comment above it says: "...for simplicity just grab the first device". However, the code is grabbing the second device, NSArray is indexed from 0. I believe the comment should be corrected as I think you are assuming the front camera will be the second device in the array.
If you are working on the assumption that the first device is the back camera and the second device is the front camera then this is a dangerous assumption. It would be much safer and more future proof to check the list of possibleDevices for the device that is the front camera.
The following code will enumerate the list of possibleDevices and create input using the front camera.
// Find the front camera and create an input and add it to the session
AVCaptureDeviceInput *input = nil;
for (AVCaptureDevice *device in possibleDevices) {
    if ([device position] == AVCaptureDevicePositionFront) {
        NSError *error = nil;
        input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                      error:&error]; // Handle errors
        break;
    }
}
Update: I have just cut and pasted the code exactly as it is in the question into a simple project and it is working fine for me. I am seeing the video from the front camera. You should probably look elsewhere for the issue. First, I would be inclined to check the cameraView and associated layers.
Right now I am trying to upload a photo to Picasa using the objective c client like below:
GDataServiceGooglePhotos *service = [self photoService];

// get the URL for the album
NSURL *albumURL = [GDataServiceGooglePhotos photoFeedURLForUserID:userEmailAdress
                                                          albumID:albumName
                                                        albumName:nil
                                                          photoID:nil
                                                             kind:@"album"
                                                           access:@"all"];

// set a title and description for the new photo
NSDateFormatter *df = [[NSDateFormatter alloc] init];
df.dateFormat = @"yyyyMMddHHmmssSSSS";
GDataTextConstruct *title, *desc;
title = [GDataTextConstruct textConstructWithString:[df stringFromDate:[NSDate date]]];
desc = [GDataTextConstruct textConstructWithString:[descriptionTextfield text]];
GDataEntryPhoto *newPhoto = [GDataEntryPhoto photoEntry];
[newPhoto setTitle:title];
[newPhoto setPhotoDescription:desc];

// attach the photo data
NSData *data = UIImageJPEGRepresentation(imageView.image, 1.0);
[newPhoto setPhotoData:data];
[newPhoto setPhotoMIMEType:@"image/jpeg"];
[newPhoto setUploadData:data];
[newPhoto setUploadMIMEType:@"image/jpeg"];
[newPhoto setUploadSlug:title.stringValue];

// now upload it
GDataServiceTicket *ticket;
ticket = [service fetchEntryByInsertingEntry:newPhoto
                                  forFeedURL:albumURL
                                    delegate:self
                           didFinishSelector:@selector(addPhotoTicket:finishedWithEntry:error:)];
[service setServiceUploadProgressSelector:nil];
And here is my addPhotoTicket:finishedWithEntry:error: method
if (error == nil) {
    NSLog(@"UPLOADED");
} else {
    NSLog(@"THERE WAS AN ERROR");
}
I keep getting "There was an error", and failedWithStatus:400 data:Photo data or source id must be included. Any help is greatly appreciated.
Thanks.
As shown in the GooglePhotosSample application, uploading should be done to the URL of an album's uploadLink. Don't try to make an album URL manually for uploading. The album ID is not the album's name.
UIImageJPEGRepresentation can easily fail; check that it is returning non-nil data.
setPhotoData: and setPhotoMIMEType: are synonyms for setUploadData: and setUploadMIMEType:; there is no need to call both.
setTitle: and setPhotoDescription: have "WithString" versions so there's no need to create a text construct explicitly to set those.
Enable the library's http logging to see the actual server requests and responses.
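Putting that advice together, the upload path might look like the sketch below. It assumes you have already fetched the album entry from the user's album feed (as the GooglePhotosSample application does); the album-entry variable and string values here are placeholders:

```objectivec
// albumEntry is assumed to be the GDataEntryPhotoAlbum fetched earlier
// from the user's album feed (placeholder shown as nil).
GDataEntryPhotoAlbum *albumEntry = /* your fetched album entry */ nil;

// Upload to the album's uploadLink rather than a hand-built album URL.
NSURL *uploadURL = [[albumEntry uploadLink] URL];

GDataEntryPhoto *newPhoto = [GDataEntryPhoto photoEntry];
[newPhoto setTitleWithString:@"my photo title"];

// UIImageJPEGRepresentation can return nil; bail out rather than
// uploading an empty body (which produces the 400 above).
NSData *data = UIImageJPEGRepresentation(imageView.image, 1.0);
if (data == nil) {
    NSLog(@"JPEG conversion failed; aborting upload");
    return;
}

// setPhotoData:/setPhotoMIMEType: are synonyms for these, so set only one pair.
[newPhoto setUploadData:data];
[newPhoto setUploadMIMEType:@"image/jpeg"];
[newPhoto setUploadSlug:@"my-photo.jpg"];

[service fetchEntryByInsertingEntry:newPhoto
                         forFeedURL:uploadURL
                           delegate:self
                  didFinishSelector:@selector(addPhotoTicket:finishedWithEntry:error:)];
```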