What exactly is the hostname for sending Microsoft Exchange email with mailcore - mailcore2

I cannot send MS Exchange email with mailcore. It always returns the error "A stable connection to the server could not be established".
Here is my code:

smtpSession.hostname = @"smtp.outlook.office365.co.uk";
smtpSession.username = [currentUser objectForKey:@"email_account"];
smtpSession.password = [currentUser objectForKey:@"email_password"];
smtpSession.port = 25;
smtpSession.connectionType = MCOConnectionTypeClear;

I think it is because of the hostname. Could anybody tell me what exactly the hostname for MS Exchange is in this case?

Finally, I found the answer:

smtpSession.hostname = @"smtp.office365.com";
smtpSession.username = [currentUser objectForKey:@"email_account"];
smtpSession.password = [currentUser objectForKey:@"email_password"];
smtpSession.port = 25;
smtpSession.connectionType = MCOConnectionTypeStartTLS;

For future reference, here's a complete example using MailCore2 added via CocoaPods:

#import <MailCore/MailCore.h>

MCOSMTPSession *smtpSession = [[MCOSMTPSession alloc] init];
smtpSession.hostname = @"smtp.outlook.office365.com";
smtpSession.username = @"user@domain.com";
smtpSession.password = @"NotMyPass";
smtpSession.port = 587;
smtpSession.connectionType = MCOConnectionTypeStartTLS;

MCOMessageBuilder *builder = [[MCOMessageBuilder alloc] init];
[[builder header] setFrom:[MCOAddress addressWithDisplayName:@"Me" mailbox:@"user@domain.com"]];
[[builder header] setTo:@[[MCOAddress addressWithDisplayName:@"To you" mailbox:@"user@domain.com"]]];
[[builder header] setSubject:@"Mailcore test"];
[builder setTextBody:@"Message received"];

NSData *rfc822Data = [builder data];
MCOSMTPSendOperation *sendOperation = [smtpSession sendOperationWithData:rfc822Data];
[sendOperation start:^(NSError *error) {
    if (error) {
        NSLog(@"Error sending email: %@", error);
    } else {
        NSLog(@"Successfully sent email!");
    }
}];

Related

Cannot export AVMutableComposition with AVAssetExportSession

Here is part of the composition creation:

AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
for (NSDictionary *track in tracks) {
    NSURL *url = [[NSURL alloc] initWithString:track[@"url"]];
    AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration)
                        ofTrack:[[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:nil];
}
[self persist:mixComposition for:songId];

Then I wish to persist the composition in a directory so I do not have to download it each time.
The output of the composition looks like this:

"AVMutableCompositionTrack: 0x1c4a276a0 trackID = 1, mediaType = soun, editCount = 1",
"AVMutableCompositionTrack: 0x1c4a28560 trackID = 2, mediaType = soun, editCount = 1"...,

- (void)persist:(AVMutableComposition *)composition
            for:(NSString *)songId {
    NSLog(@"%@", composition);
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                           initWithAsset:composition
                                           presetName:AVAssetExportPresetAppleM4A];
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:songId];
    NSURL *url = [NSURL fileURLWithPath:path];
    exportSession.outputURL = url;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    // perform the export
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusCompleted");
            NSLog(@"Path: %@", url);
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            // a failure may happen because of an event out of your control,
            // for example an interruption like a phone call coming in;
            // make sure to handle this case appropriately
            NSLog(@"%@", exportSession.error);
        } else {
            NSLog(@"Export Session Status: %ld", (long)exportSession.status);
        }
    }];
}
The error I get:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x1c0a409f0 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}
The path you use must include the content type (file extension) of the file you need to export. It should look like:

let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first
let audioTypeOutput = ".m4a" // extension for your file type, e.g. .mp3 or .m4a for audio
let exportPath = directory?.appendingPathComponent(name + audioTypeOutput)

Hope this helps!
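Applied to the Objective-C code in the question, the same fix might look like the sketch below. `songId` and `exportSession` are the names from the snippet above; writing into the Documents directory is an assumption, and one common cause of the -11800/-12780 pair is an output path whose extension does not match `outputFileType`.

```objectivec
// Sketch: give the output URL an extension that matches outputFileType
// (AVFileTypeAppleM4A expects a .m4a path).
NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                           NSUserDomainMask, YES) firstObject];
NSString *fileName = [songId stringByAppendingPathExtension:@"m4a"];
exportSession.outputURL = [NSURL fileURLWithPath:
                           [documents stringByAppendingPathComponent:fileName]];
```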

How can I get rid of page margins using UIPrintInteractionController for AirPrint

I am trying to AirPrint a UIImage using UIPrintInteractionController.
This code works except that a margin is added to the print preview, which is not desirable for me. How can I set the left, right, top, and bottom margins to zero or other values? I am aware of printableRect, but how do I modify it? Thanks for any help in advance :)
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
NSData *imageData = UIImageJPEGRepresentation(imageforPrint, 1.0);
if (pic && [UIPrintInteractionController canPrintData:imageData]) {
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = @"Job 1";
    printInfo.duplex = UIPrintInfoDuplexLongEdge;
    pic.printInfo = printInfo;

    UIPrintPageRenderer *renderer = [[UIPrintPageRenderer alloc] init];
    UIPrintFormatter *formatter = pic.printFormatter;
    formatter.startPage = 0;
    formatter.perPageContentInsets = UIEdgeInsetsMake(0.0, 0.0, 0.0, 0.0);
    [renderer addPrintFormatter:formatter startingAtPageAtIndex:0];

    pic.showsPaperSelectionForLoadedPapers = YES;
    pic.printPageRenderer = renderer;
    pic.printFormatter = formatter;
    pic.printingItem = imageData;
    pic.showsPageRange = YES;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *pic, BOOL completed, NSError *error) {
            //self.content = nil;
            if (!completed && error) {
                NSLog(@"FAILED! due to error in domain %@ with error code %ld", error.domain, (long)error.code);
            }
        };
    [pic presentAnimated:YES completionHandler:completionHandler];
}

UIViewReportBrokenSuperviewChain

The application crashes during barcode scanning using AVFoundation. Following is my code:
_session = [[AVCaptureSession alloc] init];
_device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;

_input = [AVCaptureDeviceInput deviceInputWithDevice:_device error:&error];
if (_input) {
    [_session addInput:_input];
} else {
    NSLog(@"Error: %@", error);
}

_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];
_output.metadataObjectTypes = [_output availableMetadataObjectTypes];

_prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_prevLayer.frame = _previewView.bounds;
_prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
//[self.view.layer addSublayer:_prevLayer];
[_previewView.layer addSublayer:_prevLayer];
//[self.view];
//[_session startRunning];
[_previewView bringSubviewToFront:_highlightView];
/* code ends */
It shows a bad access error at:

[_previewView.layer addSublayer:_prevLayer];

This line occurs after the frame is set. Try adding the layer first and then setting the frame. I'm sure you've moved on, but this answer could benefit someone else.
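A sketch of the suggested reordering, using the variables from the question (untested; the only change is that `addSublayer:` runs before the frame assignment):

```objectivec
_prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:_session];
_prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
// add the preview layer first...
[_previewView.layer addSublayer:_prevLayer];
// ...then set its frame
_prevLayer.frame = _previewView.bounds;
```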

How can I use MailCore to fetch emails using the token I got from gmail?

I am new to iOS development. I have used OAuth 2.0 for authentication and obtained an access token. Now I want to fetch emails using that token with mailcore2. How can I do it? I have googled a lot of stuff but nothing has worked, so please help me solve this problem.
Thanks in advance. Here is my code:
- (void)viewDidLoad {
    [super viewDidLoad];
    NSUserDefaults *def = [NSUserDefaults standardUserDefaults];
    NSString *accessToken = [def objectForKey:@"sessionToken"];
    NSLog(@"Access token: %@", accessToken); // token from Google API authentication

    session = [[MCOIMAPSession alloc] init];
    session.hostname = @"imap.gmail.com";
    session.port = 993;
    session.authType = MCOAuthTypeXOAuth2;
    session.OAuth2Token = accessToken;
    session.username = emailId;
    session.connectionType = MCOConnectionTypeTLS;
    session.password = nil;
    [session setConnectionLogger:^(void *connectionID, MCOConnectionLogType type, NSData *data) {
        NSLog(@"MCOIMAPSession: [%i] %@", type, [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding]);
    }];

    MCOIMAPMessagesRequestKind requestKind = (MCOIMAPMessagesRequestKind)(MCOIMAPMessagesRequestKindHeaders | MCOIMAPMessagesRequestKindStructure | MCOIMAPMessagesRequestKindInternalDate | MCOIMAPMessagesRequestKindHeaderSubject | MCOIMAPMessagesRequestKindFlags);

    MCOIMAPFolderInfoOperation *inboxFolderInfo = [session folderInfoOperation:Folder];
    [inboxFolderInfo start:^(NSError *error, MCOIMAPFolderInfo *info) {
        NSLog(@"INFO: %@", info);
    }];
}
I get INFO:(null). Correct me if I am doing something wrong.
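No answer was posted to this one, but for completeness: `folderInfoOperation:` takes a folder path string, so `Folder` would need to be something like `@"INBOX"`. A sketch of actually fetching message headers with mailcore2 once the session authenticates (the `MCORangeMake(1, UINT64_MAX)` range meaning "all UIDs" follows the mailcore2 examples; `session` and `requestKind` are from the code above):

```objectivec
// Sketch: fetch all message headers from the inbox with the session above.
MCOIMAPFetchMessagesOperation *fetchOp =
    [session fetchMessagesOperationWithFolder:@"INBOX"
                                  requestKind:requestKind
                                         uids:[MCOIndexSet indexSetWithRange:MCORangeMake(1, UINT64_MAX)]];
[fetchOp start:^(NSError *error, NSArray *messages, MCOIndexSet *vanishedMessages) {
    if (error) {
        NSLog(@"fetch error: %@", error); // e.g. expired token or IMAP disabled
        return;
    }
    for (MCOIMAPMessage *message in messages) {
        NSLog(@"subject: %@", message.header.subject);
    }
}];
```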

Switch camera between front and back on button click in iOS [duplicate]

I am developing an iPhone app in which there is a requirement to pause and resume the camera, so I used AVFoundation for that instead of UIImagePickerController.
My code is:
- (void)startup:(BOOL)isFrontCamera
{
    if (_session == nil)
    {
        NSLog(@"Starting up server");
        self.isCapturing = NO;
        self.isPaused = NO;
        _currentFile = 0;
        _discont = NO;

        // create capture device with video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (isFrontCamera)
        {
            NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
            AVCaptureDevice *captureDevice = nil;
            for (AVCaptureDevice *device in videoDevices)
            {
                if (device.position == AVCaptureDevicePositionFront)
                {
                    captureDevice = device;
                    break;
                }
            }
            cameraDevice = captureDevice;
        }
        cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];
        [_session addInput:input];

        // audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];

        // create an output for YUV output with self as delegate
        _captureQueue = dispatch_queue_create("uk.co.gdcl.cameraengine.capture", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        NSDictionary *actual = videoout.videoSettings;
        _cy = [[actual objectForKey:@"Width"] integerValue];
        _cx = [[actual objectForKey:@"Height"] integerValue];

        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];

        [_session startRunning];
        _preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
        _preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
}
Here I am facing a problem when I change to the front camera: when I call the above method with the front camera, the preview layer gets stuck and no preview appears. My doubt is: can we change the capture device in the middle of a capture session? Please guide me where I am going wrong, or suggest a solution for switching between the front and back camera while recording.
Thanks in advance.
Yes, you can. There are just a few things you need to take care of:
1. You need to be using AVCaptureVideoDataOutput and its delegate for recording.
2. Make sure you remove the previous deviceInput before adding the new deviceInput.
3. You must remove and recreate the AVCaptureVideoDataOutput as well.
I am using these two functions for it right now, and they work while the session is running.
- (void)configureVideoWithDevice:(AVCaptureDevice *)camera {
    [_session beginConfiguration];

    [_session removeInput:_videoInputDevice];
    _videoInputDevice = nil;
    _videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([_session canAddInput:_videoInputDevice]) {
        [_session addInput:_videoInputDevice];
    }

    [_session removeOutput:_videoDataOutput];
    _videoDataOutput = nil;
    _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [_videoDataOutput setSampleBufferDelegate:self queue:_outputQueueVideo];
    NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                    nil];
    _videoDataOutput.videoSettings = setcapSettings;
    [_session addOutput:_videoDataOutput];

    _videoConnection = [_videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([_videoConnection isVideoOrientationSupported]) {
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    }

    [_session commitConfiguration];
}

- (void)configureAudioWithDevice:(AVCaptureDevice *)microphone {
    [_session beginConfiguration];

    _audioInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil];
    if ([_session canAddInput:_audioInputDevice]) {
        [_session addInput:_audioInputDevice];
    }

    [_session removeOutput:_audioDataOutput];
    _audioDataOutput = nil;
    _audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
    [_audioDataOutput setSampleBufferDelegate:self queue:_outputQueueAudio];
    [_session addOutput:_audioDataOutput];
    _audioConnection = [_audioDataOutput connectionWithMediaType:AVMediaTypeAudio];

    [_session commitConfiguration];
}
You can't change the captureDevice mid-session, and you can only have one capture session running at a time. You could end the current session and create a new one; there will be a slight lag (maybe a second or two, depending on your CPU load).
I wish Apple would allow multiple sessions, or at least multiple devices per session, but they do not... yet.
Have you considered having multiple sessions and then afterwards processing the video files to join them together into one?