Attach array of photos with mailcore - objective-c

I need to attach an array of photos to an email using MailCore. I found the following code in another question; however, I'm not sure how to apply it to the photos taken with the camera in my app.
My guess is that I have to get the filenames of the photos, save them as strings in an array, and then attach them to the email using this snippet of code.
NSArray *allAttachments = [NSArray arrayWithObjects:@{@"FilePathOnDevice": @"/var/mobile/etc..", @"FileTitle": @"IMG_0522.JPG"}, nil];
for (int x = 0; x < allAttachments.count; x++) {
    NSString *attachmentPath = [[allAttachments objectAtIndex:x] valueForKey:@"FilePathOnDevice"];
    MCOAttachment *attachment = [MCOAttachment attachmentWithContentsOfFile:attachmentPath];
    [msgBuilder addAttachment:attachment];
}
This is how I'm getting the photo using the camera:
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]) {
    imagePicker.sourceType = .Camera
    imagePicker.dismissViewControllerAnimated(true, completion: nil)
    var photo = info[UIImagePickerControllerOriginalImage] as? UIImage
    bedroomCells[lastSelectedIndex!.row].image = photo
    tableView.reloadData()
}
Thank you for your help!

I ended up saving the image to the documents directory and using that file path to attach the image with MailCore.
Here is my code for the image picker:
func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]) {
    imagePicker.sourceType = .Camera
    imagePicker.dismissViewControllerAnimated(true, completion: nil)
    var photo = info[UIImagePickerControllerOriginalImage] as? UIImage
    bedroomCells[lastSelectedIndex!.row].image = photo
    var photoLabel = bedroomCells[lastSelectedIndex!.row].text
    tableView.reloadData()

    // Save the photo to the documents directory
    var imageData = UIImageJPEGRepresentation(photo, 1.0)
    var imageFilePath = fileDirectory[0].stringByAppendingPathComponent("\(photoLabel!).jpg")
    var imageFileURL = NSURL(fileURLWithPath: imageFilePath)
    imageData.writeToURL(imageFileURL!, atomically: false)
}
and the code for my MailCore method:
MCOMessageBuilder *builder = [[MCOMessageBuilder alloc] init];
NSString *documentsDirectory = [fileDirectory objectAtIndex:0];
NSArray *allAttachments = [[NSFileManager defaultManager] subpathsOfDirectoryAtPath:documentsDirectory error:nil];
for (int x = 0; x < allAttachments.count; x++) {
    NSString *attachmentPath = [documentsDirectory stringByAppendingPathComponent:[allAttachments objectAtIndex:x]];
    MCOAttachment *attachment = [MCOAttachment attachmentWithContentsOfFile:attachmentPath];
    [builder addAttachment:attachment];
}

In Swift code:
var dataImage: NSData?
dataImage = UIImageJPEGRepresentation(image, 0.6)!
var attachment = MCOAttachment()
attachment.mimeType = "image/jpg"
attachment.filename = "image.jpg"
attachment.data = dataImage
builder.addAttachment(attachment)
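To answer the original question more directly, here is a minimal sketch that loops over an array of camera photos and attaches each one as in-memory JPEG data, so no intermediate file in the documents directory is needed. The photos parameter, the helper name, and the filename scheme are illustrative assumptions rather than part of the original code; MailCore 2 is assumed to be available via import MailCore or a bridging header, depending on how it is integrated.

import UIKit
// import MailCore  // or expose MailCore 2 through your bridging header

/// Attach every UIImage in `photos` to the given MailCore message builder.
func attach(photos: [UIImage], to builder: MCOMessageBuilder) {
    for (index, photo) in photos.enumerated() {
        // 0.6 keeps the message size reasonable; use 1.0 for full quality.
        guard let data = photo.jpegData(compressionQuality: 0.6) else { continue }
        let attachment = MCOAttachment()
        attachment.mimeType = "image/jpeg"
        attachment.filename = "photo_\(index).jpg"   // hypothetical naming scheme
        attachment.data = data
        builder.addAttachment(attachment)
    }
}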

Related

Implementing Progress Bar While Uploading the Image

I am uploading an image and some strings to the server, and that is working fine. Now I want to implement a progress bar: if I am sending 5 images, I want to show the progress bar along with the count of images sent successfully, like 2/5. Please, can anyone help me with this?
The following method is for uploading the image, dictionary, and string:
- (void)uploadImage
{
    NSString *userCategory = self.UserCategory;
    NSDictionary *dict = [self.arrayWithImages objectAtIndex:self.currentIndex];
    NSString *notes = [dict objectForKey:@"string"];
    UIImage *sample = [dict objectForKey:@"image"];
    NSData *sampleData = UIImageJPEGRepresentation(sample, 1.0);
    NSMutableDictionary *FinalDict = [self.dictMetaData mutableCopy];
    [FinalDict setObject:userCategory forKey:@"user_category"];
    if (notes.length > 0) {
        [FinalDict setObject:notes forKey:@"note"];
    }
    for (int i = 0; i < self.arrayWithImages.count; i++) {
        [ServerUtility uploadImageWithAllDetails:FinalDict noteResource:sampleData andCompletion:^(NSError *error, id data)
        {
            if (!error) {
                NSString *strResType = [data objectForKey:@"res_type"];
                if ([strResType.lowercaseString isEqualToString:@"success"]) {
                    NSLog(@"Upload Successfully");
                    self.currentIndex++;
                }
                else if ([strResType.lowercaseString isEqualToString:@"error"])
                {
                    NSString *strMsg = [data objectForKey:@"msg"];
                    [self.view makeToast:strMsg duration:1.0 position:CSToastPositionCenter];
                }
            }
            else {
                [self.view makeToast:error.localizedDescription duration:1.0 position:CSToastPositionCenter];
            }
        }];
    }
}
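No complete answer appears in this thread, but one common approach is to upload the images one at a time and update the count after each completion, rather than firing all requests at once. Here is a rough Swift sketch under that assumption; uploadOne(_:completion:), progressView, and progressLabel are hypothetical names, not part of the code above.

import UIKit

final class UploadController: UIViewController {
    // Hypothetical UI outlets; wire these up in your storyboard or build them in code.
    @IBOutlet private var progressView: UIProgressView!
    @IBOutlet private var progressLabel: UILabel!

    private var images: [UIImage] = []

    /// Uploads images one after another so the "2 / 5" count stays accurate.
    func uploadAll() {
        uploadNext(index: 0)
    }

    private func uploadNext(index: Int) {
        guard index < images.count else { return }   // all done
        uploadOne(images[index]) { [weak self] success in
            guard let self = self else { return }
            let sent = index + (success ? 1 : 0)
            DispatchQueue.main.async {
                self.progressView.progress = Float(sent) / Float(self.images.count)
                self.progressLabel.text = "\(sent) / \(self.images.count)"
            }
            self.uploadNext(index: index + 1)        // start the next upload
        }
    }

    /// Placeholder for the real server call (e.g. the ServerUtility method above).
    private func uploadOne(_ image: UIImage, completion: @escaping (Bool) -> Void) {
        // ... perform the actual request, then call completion(true/false)
        completion(true)
    }
}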

objective c - AVAssetReader and Writer to overlay video

I am trying to overlay a recorded video with AVAssetReader and AVAssetWriter with some images. Following this tutorial, I am able to copy a video (and audio) into a new file. Now my objective is to overlay some of the initial video frames with some images using this code:
while ([assetWriterVideoInput isReadyForMoreMediaData] && !completedOrFailed)
{
    // Get the next video sample buffer, and append it to the output file.
    CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    CIContext *ciContext = [CIContext contextWithEAGLContext:eaglContext options:@{kCIContextWorkingColorSpace : [NSNull null]}];
    UIFont *font = [UIFont fontWithName:@"Helvetica" size:40];
    NSDictionary *attributes = @{NSFontAttributeName: font, NSForegroundColorAttributeName: [UIColor lightTextColor]};
    UIImage *img = [self imageFromText:@"test" :attributes];
    CIImage *filteredImage = [[CIImage alloc] initWithCGImage:img.CGImage];
    [ciContext render:filteredImage toCVPixelBuffer:pixelBuffer bounds:[filteredImage extent] colorSpace:CGColorSpaceCreateDeviceRGB()];
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    if (sampleBuffer != NULL)
    {
        BOOL success = [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
        sampleBuffer = NULL;
        completedOrFailed = !success;
    }
    else
    {
        completedOrFailed = YES;
    }
}
And to create an image from text:
- (UIImage *)imageFromText:(NSString *)text :(NSDictionary *)attributes {
    CGSize size = [text sizeWithAttributes:attributes];
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [text drawAtPoint:CGPointMake(0.0, 0.0) withAttributes:attributes];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}
The video and audio are copied, but there is no text on my video.
Question 1: Why is this code not working?
Moreover, I want to be able to check the timecode of the frame currently being read. For example, I would like to insert text with the current timecode into the video.
I tried this code, following this tutorial:
AVAsset *localAsset = [AVAsset assetWithURL:mURL];
NSError *localError;
AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:localAsset error:&localError];
BOOL success = (assetReader != nil);
// Create asset reader output for the first timecode track of the asset
if (success) {
    AVAssetTrack *timecodeTrack = nil;
    // Grab first timecode track, if the asset has them
    NSArray *timecodeTracks = [localAsset tracksWithMediaType:AVMediaTypeTimecode];
    if ([timecodeTracks count] > 0)
        timecodeTrack = [timecodeTracks objectAtIndex:0];
    if (timecodeTrack) {
        AVAssetReaderTrackOutput *timecodeOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:timecodeTrack outputSettings:nil];
        [assetReader addOutput:timecodeOutput];
    } else {
        NSLog(@"%@ has no timecode tracks", localAsset);
    }
}
But I get the log:
[...] has no timecode tracks
Question 2: Why doesn't my video have any AVMediaTypeTimecode track? And how can I get the current frame's timecode?
Thanks for your help
I found the solutions:
To overlay video frames, you need to fix the decompression settings:
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* decompressionVideoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
// If there is a video track to read, set the decompression settings (here 32BGRA) and create the asset reader output.
assetReaderVideoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:assetVideoTrack outputSettings:decompressionVideoSettings];
To get the frame timestamp, you have to read the video information and then use a counter to increment the current timestamp:
durationSeconds = CMTimeGetSeconds(asset.duration);
timePerFrame = 1.0 / (Float64)assetVideoTrack.nominalFrameRate;
totalFrames = durationSeconds * assetVideoTrack.nominalFrameRate;
Then in this loop
while ([assetWriterVideoInput isReadyForMoreMediaData] && !completedOrFailed)
you can find the timestamp:
CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
if (sampleBuffer != NULL) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer) {
        Float64 secondsIn = ((float)counter / totalFrames) * durationSeconds;
        CMTime imageTimeEstimate = CMTimeMakeWithSeconds(secondsIn, 600);
        mergeTime = CMTimeGetSeconds(imageTimeEstimate);
        counter++;
    }
}
I hope it helps!

iOS: How to get the MPMediaQuery songsQuery artwork

I'm using this code to get the artwork, but it's not working for me. What's wrong with this code? Please suggest a fix.
Thanks.
MPMediaQuery *mySongsQuery = [MPMediaQuery songsQuery];
NSArray *SongsList = [mySongsQuery collections];
for (MPMediaItemCollection *SongsArt in SongsList) {
    NSArray *songs = [SongsArt items];
    for (MPMediaItem *song in songs) {
        if ([song valueForProperty:MPMediaItemPropertyAssetURL] != nil) {
            CGSize artworkImageViewSize = CGSizeMake(40, 40);
            MPMediaItemArtwork *artwork = [song valueForProperty:MPMediaItemPropertyArtwork];
            UIImage *image = [artwork imageWithSize:artworkImageViewSize];
            if (image != nil)
            {
                imgv_songImageView.image = image;
            }
            else
            {
                imgv_songImageView.image = [UIImage imageNamed:@"musicD-jpeg.png"];
            }
        }
    }
}
I assume you just want to loop through all songs in the music library so I don't see a need for collections:
MPMediaQuery *mySongsQuery = [MPMediaQuery songsQuery];
for (MPMediaItem *item in mySongsQuery.items) {
    if (![[item valueForProperty:MPMediaItemPropertyIsCloudItem] boolValue]) {
        CGSize artworkImageViewSize = CGSizeMake(40, 40);
        MPMediaItemArtwork *artwork = [item valueForProperty:MPMediaItemPropertyArtwork];
        UIImage *image = [artwork imageWithSize:artworkImageViewSize];
        if (image) {
            imgv_songImageView.image = image;
        } else {
            imgv_songImageView.image = [UIImage imageNamed:@"musicD-jpeg.png"];
        }
    }
}
I'm not sure why you want to check for the Asset URL but I've left it in.
Here I am posting code to get the tracks and sort them alphabetically. It's written in Swift 3.
/// Get all the songs in the device and display in the tableview
///
func getAllSongs() {
    let query: MPMediaQuery = MPMediaQuery.songs()
    let allSongs = query.items
    allSongItems?.removeAll()
    guard allSongs != nil else {
        return
    }
    var index = 0
    for item in allSongs! {
        let pathURL: URL? = item.value(forProperty: MPMediaItemPropertyAssetURL) as? URL
        if pathURL == nil {
            print("#Warning!!! Track : \(item) is not playable.")
        } else {
            let trackInfo = SongItem()
            trackInfo.index = index
            trackInfo.mediaItem = item
            let title = item.value(forProperty: MPMediaItemPropertyTitle) as? String ?? "<Unknown>"
            let artistName = item.value(forProperty: MPMediaItemPropertyArtist) as? String ?? "<Unknown>"
            trackInfo.songName = title
            trackInfo.artistName = artistName
            trackInfo.isSelected = false
            trackInfo.songURL = item.value(forProperty: MPMediaItemPropertyAssetURL) as? URL
            allSongItems?.append(trackInfo)
            index += 1
        }
    }

    // Sort the songs alphabetically
    let sortedArray: [SongItem]? = allSongItems?.sorted {
        $0.songName!.localizedCompare($1.songName!) == .orderedAscending
    }
    allSongItems?.removeAll()
    if let arr = sortedArray {
        allSongItems?.append(contentsOf: arr)
    }
}

Create a thumbnail or image of an AVPlayer at current time

I have implemented an AVPlayer, and I want to take an image or thumbnail when clicking on a toolbar button and open it in a new UIViewController with a UIImageView. The image should be scaled exactly like the AVPlayer.
The segue is already working; I just have to implement getting the image at the current play time.
Thanks!
Objective-C
AVAsset *asset = [AVAsset assetWithURL:sourceURL];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = CMTimeMake(1, 1);
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef); // CGImageRef won't be released by ARC
Swift
var asset = AVAsset.assetWithURL(sourceURL)
var imageGenerator = AVAssetImageGenerator(asset: asset!)
var time = CMTimeMake(1, 1)
var imageRef = try! imageGenerator!.copyCGImageAtTime(time, actualTime: nil)
var thumbnail = UIImage.imageWithCGImage(imageRef)
CGImageRelease(imageRef) // CGImageRef won't be released by ARC
Swift 3.0
var sourceURL = URL(string: "Your Asset URL")
var asset = AVAsset(url: sourceURL!)
var imageGenerator = AVAssetImageGenerator(asset: asset)
var time = CMTimeMake(1, 1)
var imageRef = try! imageGenerator.copyCGImage(at: time, actualTime: nil)
var thumbnail = UIImage(cgImage:imageRef)
Note : Interpret Swift code according to your swift version.
Try this
- (UIImage *)takeScreenShot {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60); // time at which you want the screenshot
    CGImageRef imgRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:&err];
    return [[UIImage alloc] initWithCGImage:imgRef];
}
Hope this helps !!!
Swift 2.x:
let asset = AVAsset(...)
let imageGenerator = AVAssetImageGenerator(asset: asset)
let screenshotTime = CMTime(seconds: 1, preferredTimescale: 1)
if let imageRef = try? imageGenerator.copyCGImageAtTime(screenshotTime, actualTime: nil) {
    let image = UIImage(CGImage: imageRef)
    // do something with your image
}
Add below code to generate thumbnail from video.
AVURLAsset *assetURL = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];
AVAssetImageGenerator *assetGenerator = [[AVAssetImageGenerator alloc] initWithAsset:assetURL];
assetGenerator.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime time = CMTimeMake(1, 2);
CGImageRef imgRef = [assetGenerator copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:imgRef];
This is how I get a shot of the current visible frame on the scene in Swift:
The key is to:
1. get the current time of the player, which is of type CMTime
2. convert that time into seconds of type Float64
3. switch the seconds back to CMTime using CMTimeMake. The first parameter, which is where the seconds go, should be cast to Int64
Code:
var myImage: UIImage?
guard let player = player else { return }
let currentTime: CMTime = player.currentTime() // step 1.
let currentTimeInSecs: Float64 = CMTimeGetSeconds(currentTime) // step 2.
let actionTime: CMTime = CMTimeMake(Int64(currentTimeInSecs), 1) // step 3.
let asset = AVAsset(url: fileUrl)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.appliesPreferredTrackTransform = true // prevent image rotation
do {
    let imageRef = try imageGenerator.copyCGImage(at: actionTime, actualTime: nil)
    myImage = UIImage(cgImage: imageRef)
} catch let err as NSError {
    print(err.localizedDescription)
}
Swift extension for generating thumbnails from video
extension AVPlayer {
    func generateThumbnail(time: CMTime) -> UIImage? {
        guard let asset = currentItem?.asset else { return nil }
        let imageGenerator = AVAssetImageGenerator(asset: asset)
        do {
            let cgImage = try imageGenerator.copyCGImage(at: time, actualTime: nil)
            return UIImage(cgImage: cgImage)
        } catch {
            print(error.localizedDescription)
        }
        return nil
    }
}
When you need to create multiple thumbnails at once the class AVAssetImageGenerator is golden, as it provides an async way.
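For completeness, here is a small sketch of that asynchronous batch API, AVAssetImageGenerator.generateCGImagesAsynchronously(forTimes:completionHandler:). The helper name and the time values are illustrative, not taken from the answers above.

import AVFoundation
import UIKit

/// Generate thumbnails for several points in time with one asynchronous call.
func generateThumbnails(for asset: AVAsset, seconds: [Double],
                        completion: @escaping (CMTime, UIImage?) -> Void) {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    let times = seconds.map { NSValue(time: CMTime(seconds: $0, preferredTimescale: 600)) }
    generator.generateCGImagesAsynchronously(forTimes: times) { requestedTime, cgImage, _, result, error in
        if result == .succeeded, let cgImage = cgImage {
            completion(requestedTime, UIImage(cgImage: cgImage))
        } else {
            print("thumbnail failed at \(CMTimeGetSeconds(requestedTime)): \(String(describing: error))")
            completion(requestedTime, nil)
        }
    }
}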
If you need a thumbnail image of the player's current frame, simply render its view (platform specific) or its layer (platform independent):
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGSize frameSize = _playerLayer.frame.size;
CGContextRef thumbnailContext = CGBitmapContextCreate(nil, frameSize.width, frameSize.height, 8, 0, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);
[_playerLayer renderInContext:thumbnailContext];
CGImageRef playerThumbnail = CGBitmapContextCreateImage(thumbnailContext);
CGContextRelease(thumbnailContext);
This is super fast and works synchronously.
Code for 2022:
seconds = .. the normal human-meaning desired time position in the video
guard let pl = .. your player ..
guard let ite = pl.currentItem ..
let testGen = AVAssetImageGenerator(asset: ite.asset)
testGen.maximumSize = CGSize(width: 0, height: .. height of your preview box)
testGen.requestedTimeToleranceBefore = .zero // during development
// or something like ... CMTime(value: .. your tolerance .., timescale: 600)
testGen.requestedTimeToleranceAfter = .zero // during development
// ditto
if #available(tvOS 16, *) {
Task { [weak self] ..
do {
let ct = CMTime(value: CMTimeValue(seconds), timescale: 1)
// NOTE THE "1"
let (foundImage, foundTime) = try await testGen.image(at: ct)
let foundAsSecs = CMTimeGetSeconds(foundTime)
print("tried gen at \(seconds) found as \(foundAsSecs) \n")
self. .. your preview .image = UIImage(cgImage: foundImage)
} catch {
print("gen err \(error)")
}
}
}
Setting the two tolerances is a sophisticated issue; google it.
Watch out for the gotcha where a timescale of 1 is needed for the CMTime.

Unable to parse the result returned by NSAppleScript method executeAndReturnError

This is the code which I am using:
NSDictionary *errorInfo = nil;
NSString *source = @"tell application \"Mail\"\nget name of mailbox of every account\nend tell";
NSAppleScript *run = [[NSAppleScript alloc] initWithSource:source];
NSAppleEventDescriptor *aDescriptor = [[NSAppleEventDescriptor alloc] init];
aDescriptor = [run executeAndReturnError:&errorInfo];
[aDescriptor coerceToDescriptorType:'utxt'];
NSLog(@"result:%@", [aDescriptor stringValue]);
Output which I got:
result:(null)
Please, can anyone help me with this? Thanks in advance :)
IIRC that will return a list descriptor filled with list descriptors. You need to iterate over them and pull out the info you want. You're also initializing a descriptor and then immediately overwriting its pointer. Do something like (untested):
NSDictionary *errorInfo = nil;
NSString *source = @"tell application \"Mail\"\nget name of mailbox of every account\nend tell";
NSAppleScript *run = [[NSAppleScript alloc] initWithSource:source];
NSAppleEventDescriptor *aDescriptor = [run executeAndReturnError:&errorInfo];
NSInteger num = [aDescriptor numberOfItems];
// Apple event descriptor list indexes are one-based!
for (NSInteger idx = 1; idx <= num; ++idx) {
    NSAppleEventDescriptor *innerListDescriptor = [aDescriptor descriptorAtIndex:idx];
    NSInteger innerNum = [innerListDescriptor numberOfItems];
    for (NSInteger innerIdx = 1; innerIdx <= innerNum; ++innerIdx) {
        NSString *str = [[innerListDescriptor descriptorAtIndex:innerIdx] stringValue];
        // Do something with str here
    }
}
Swift version, tested in 2022:
func run(appleScript: String) {
    var error: NSDictionary? = nil
    if let scriptObject = NSAppleScript(source: appleScript) {
        let output = scriptObject.executeAndReturnError(&error)
        // Print all values (NSAppleEventDescriptor list indexes are one-based)
        let numberOfItems = output.numberOfItems
        print("numberOfItems: \(numberOfItems)")
        for i in 0..<numberOfItems {
            let innerDescriptor = output.atIndex(i + 1)
            print("\(i + 1): " + (innerDescriptor?.stringValue ?? "nil"))
        }
        // Catch error
        if let error = error {
            print("Error: '\(error)'")
        }
    } else {
        print("Error: Unable to init NSAppleScript")
    }
}
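For example, calling it with the Mail script from the question (the multi-line string below mirrors the Objective-C source string above):

let mailScript = """
tell application "Mail"
    get name of mailbox of every account
end tell
"""
run(appleScript: mailScript)

Note that Mail returns a list of lists (one inner list of mailbox names per account), so the top-level items may print as "nil"; to read the individual names, iterate one level deeper, as the Objective-C answer above does.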