Function that sends an image to AirPrint - Objective-C

I'm trying to find a function that lets me print using AirPrint.
I have a button, btnPrint, that when pressed should print myPic.jpg to the default AirPrint device, but I cannot figure out whether such a function even exists.
I cannot find much documentation on AirPrint in Xcode.

Apple's printing documentation would probably benefit you.
The following is Objective-C code for AirPrint.
Check whether printing is available:
if ([UIPrintInteractionController isPrintingAvailable]) {
    // Printing is available
} else {
    // Printing is not available
}
Print after button click:
- (IBAction)buttonClicked:(id)sender
{
    NSMutableString *printBody = [NSMutableString stringWithFormat:@"%@, %@", self.encoded.text, self.decoded.text];
    [printBody appendFormat:@"\n\n\n\nPrinted From *myapp*"];

    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = self.titleLabel.text;
    pic.printInfo = printInfo;

    UISimpleTextPrintFormatter *textFormatter = [[UISimpleTextPrintFormatter alloc] initWithText:printBody];
    textFormatter.startPage = 0;
    textFormatter.contentInsets = UIEdgeInsetsMake(72.0, 72.0, 72.0, 72.0); // 1-inch margins
    textFormatter.maximumContentWidth = 6 * 72.0;
    pic.printFormatter = textFormatter; // under ARC the formatter needs no explicit release
    pic.showsPageRange = YES;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"Printing could not complete because of error: %@", error);
            }
        };
    [pic presentFromBarButtonItem:self.rightButton animated:YES completionHandler:completionHandler];
}
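Since the question asks about printing an image rather than text: UIImage is one of the types that can be assigned directly to printingItem, so a minimal sketch for the btnPrint case looks like this (the file name myPic.jpg comes from the question; error handling kept short):

- (IBAction)btnPrintTapped:(id)sender
{
    UIImage *image = [UIImage imageNamed:@"myPic.jpg"]; // assumes the image is bundled
    if (!image || ![UIPrintInteractionController isPrintingAvailable]) {
        return;
    }

    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputPhoto; // photo output suits image content
    printInfo.jobName = @"myPic.jpg";
    pic.printInfo = printInfo;

    pic.printingItem = image; // images need no print formatter

    [pic presentAnimated:YES completionHandler:^(UIPrintInteractionController *controller, BOOL completed, NSError *error) {
        if (!completed && error) {
            NSLog(@"Printing failed: %@", error);
        }
    }];
}

Note that iOS does not expose a silent "default printer"; presenting the controller lets the user pick one. If you want to reuse a saved choice without the sheet, -printToPrinter:completionHandler: with a UIPrinter obtained from UIPrinterPickerController is the closest option.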

Related

AVCapturePhotoOutput not returning proper image

My requirement in the app is to capture an image without presenting a UIImagePickerController preview, so I have used the following code to capture an image without presenting one.
- (void)clickImage
{
    AVCaptureDevice *rearCamera = [self checkIfRearCameraAvailable];
    if (rearCamera != nil)
    {
        photoSession = [[AVCaptureSession alloc] init];
        NSError *error;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:rearCamera error:&error];
        if (!error && [photoSession canAddInput:input])
        {
            [photoSession addInput:input];
            AVCapturePhotoOutput *output = [[AVCapturePhotoOutput alloc] init];
            if ([photoSession canAddOutput:output])
            {
                [photoSession addOutput:output];
                // Find the video connection on the photo output
                AVCaptureConnection *videoConnection = nil;
                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo])
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection)
                    {
                        break;
                    }
                }
                if (videoConnection)
                {
                    [photoSession startRunning];
                    [output capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];
                }
            }
        }
    }
}

- (AVCaptureDevice *)checkIfRearCameraAvailable
{
    AVCaptureDevice *rearCamera = nil;
    AVCaptureDeviceDiscoverySession *captureDeviceDiscoverySession =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionBack];
    NSArray *allCameras = [captureDeviceDiscoverySession devices];
    for (AVCaptureDevice *camera in allCameras)
    {
        if (camera.position == AVCaptureDevicePositionBack)
        {
            rearCamera = camera;
        }
    }
    return rearCamera;
}

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
     resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
      bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
                error:(NSError *)error
{
    if (error)
    {
        NSLog(@"error : %@", error.localizedDescription);
    }
    if (photoSampleBuffer)
    {
        NSData *data = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                   previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        UIImage *image = [UIImage imageWithData:data];
        _imgView.image = image;
    }
}
I have used the code above to capture an image, but the output image looks as though it was taken in night mode, or like a negative image.
Would you please review the code and point out my mistake?

I have found the following sample code on the Apple Developer site:
https://developer.apple.com/library/content/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112
The sample code above helped me get a proper image. I have added this answer to help others as well.
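For anyone who cannot adopt the full AVCam sample, one plausible cause (an assumption on my part, not confirmed above) is that the photo is captured immediately after -startRunning, before auto-exposure and white balance have settled, which can produce a very dark or strangely colored first frame. A minimal sketch that simply defers the capture:

if (videoConnection)
{
    [photoSession startRunning];
    // Give auto-exposure/white balance a moment to settle before capturing.
    // The 0.5 s delay is an arbitrary illustrative value.
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        [output capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];
    });
}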

AVAudioEngine (iOS 8 Obj-C): How to construct the graph for Recording & Playback?

I need to create the render graph for basic recording to a file, and playback from that file, using AVAudioFile and AVAudioInputNode.
Below are the main setup methods I have so far, but the graph is only connected for the player.
How do I construct the graph so that [Input], [Player], and [Mixers] are connected to achieve record/play?
#pragma mark - Setup
- (void)setupAudioEngine
{
    // create
    _audioEngine = [[AVAudioEngine alloc] init];
    _player = [[AVAudioPlayerNode alloc] init];
    _inputMixer = [[AVAudioMixerNode alloc] init];
    _playerMixer = [[AVAudioMixerNode alloc] init];

    // attach nodes to the engine
    [_audioEngine attachNode:_player];
    [_audioEngine attachNode:_inputMixer];
    [_audioEngine attachNode:_playerMixer];
    _input = [_audioEngine inputNode];
    _mainMixer = [_audioEngine mainMixerNode];

    // connect the render graph
    NSError *error = nil;
    [_audioEngine connect:_input to:_mainMixer format:[_input inputFormatForBus:0]];
    [_audioEngine connect:_player to:_mainMixer format:_audioFile.processingFormat];

    // start the engine
    [_audioEngine startAndReturnError:&error];
    if (error)
    {
        NSLog(@"error: %@", [error localizedDescription]);
    }
}

- (IBAction)recordAudio:(UIButton *)sender {
    NSError *error = nil;
    // set up a file to write to
    NSArray *pathComponents = [NSArray arrayWithObjects:
                               [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                               @"MyAudioMemo.m4a",
                               nil];
    NSURL *inputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
    NSDictionary *recordSettings = @{
        AVFormatIDKey : @(kAudioFormatMPEG4AAC),
        AVSampleRateKey : @44100.0f,
        AVNumberOfChannelsKey : @1,
        AVEncoderBitDepthHintKey : @16,
        AVEncoderAudioQualityKey : @(AVAudioQualityMedium)
    };
    _audioFile = [[AVAudioFile alloc] initForWriting:inputFileURL settings:recordSettings error:&error];
    [_mainMixer installTapOnBus:0 bufferSize:4096 format:[_mainMixer outputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        // the tap block is empty -- nothing is written to _audioFile yet
    }];
}

- (IBAction)stopRec:(id)sender {
    [_audioEngine disconnectNodeOutput:_input];
}

- (IBAction)playAudio:(UIButton *)sender {
    // schedule playback
    [_player scheduleFile:_audioFile atTime:nil completionHandler:nil];
    [_player play];
}
Rule #1: not all bitrate/audio-format/sample-rate combinations work. Most of them will crash, so stick to the ones that do (I discovered a few empirically).
This code records and plays the .m4a format:
func directoryURL() -> NSURL {
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    filepath = urls[0]
    let documentDirectory = urls[0] as NSURL
    print("STORAGE DIR: " + documentDirectory.description)
    let soundURL = documentDirectory.URLByAppendingPathComponent(m4aName) // .m4a
    print("SAVING FILE: " + soundURL!.description)
    return soundURL!
}

// INIT AUDIO RECORDER
func initializeAudioSession() {
    let recordSettings = [AVSampleRateKey: NSNumber(float: Float(16000.0)),
                          AVFormatIDKey: NSNumber(int: Int32(kAudioFormatMPEG4AAC)), // .m4a
                          // AVFormatIDKey: NSNumber(int: Int32(kAudioFileMP3Type)), // mp3 crashes!
                          AVNumberOfChannelsKey: NSNumber(int: 1),
                          AVEncoderAudioQualityKey: NSNumber(int: Int32(AVAudioQuality.Low.rawValue))]
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try audioRecorder = AVAudioRecorder(URL: self.directoryURL(),
                                            settings: recordSettings)
        audioRecorder.delegate = self
        audioRecorder.meteringEnabled = true
        audioRecorder.prepareToRecord()
        print("M4a Recorder Initialized - OK...")
        recordSpeechM4A()
        stopRecordingCapio()
    } catch let error1 as NSError {
        print("Init Error: " + error1.description)
    }
}

// RECORD SPEECH AUDIO IN .M4A FORMAT
func recordSpeechM4A() {
    if !audioRecorder.recording {
        isRecording = true // optional variable for tracking
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setActive(true)
            print("**** RECORDING ****")
            audioRecorder.record()
            // Animate a microphone waveform while the user speaks
            self.meterTimer = NSTimer.scheduledTimerWithTimeInterval(0.03,
                                                                     target: self,
                                                                     selector: #selector(ViewController.updateWaveview2(_:)),
                                                                     userInfo: nil,
                                                                     repeats: false)
        } catch {
            print("RECORDING ERROR")
        }
    }
}

// STOP RECORDING
func stopRecording() {
    audioRecorder.stop()
}

// PLAY
func playRecordedAudio() {
    if !audioRecorder.recording {
        do {
            try audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder.url)
            audioPlayer.play()
            print("PLAYING AUDIO...: " + audioRecorder.url.description)
            print("Audio duration: " + audioPlayer.duration.description)
        } catch {
            print("AUDIO PLAYBACK ERROR")
        }
    }
}
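Returning to the original AVAudioEngine question: the missing piece in the posted code is that nothing is ever written inside the tap block, and stopping by disconnecting the input node tears down the graph rather than ending the recording. Below is a minimal sketch of the record path under the question's setup; note that I create the file with the tap's own format settings (writing plain PCM to a .caf here for simplicity, an assumption on my part) so that writeFromBuffer:error: does not hit a format mismatch:

- (IBAction)recordAudio:(UIButton *)sender {
    NSError *error = nil;
    NSURL *fileURL = [NSURL fileURLWithPathComponents:@[
        [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
        @"MyAudioMemo.caf"]];
    // Match the file to the tap's format so buffers can be written directly.
    AVAudioFormat *tapFormat = [_mainMixer outputFormatForBus:0];
    _audioFile = [[AVAudioFile alloc] initForWriting:fileURL
                                            settings:tapFormat.settings
                                               error:&error];
    [_mainMixer installTapOnBus:0
                     bufferSize:4096
                         format:tapFormat
                          block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        NSError *writeError = nil;
        [_audioFile writeFromBuffer:buffer error:&writeError]; // append each tapped buffer
    }];
}

- (IBAction)stopRec:(id)sender {
    [_mainMixer removeTapOnBus:0]; // end the recording without tearing down the graph
    _audioFile = nil;              // releasing the file closes and finalizes it
}

The player methods from the question then work unchanged: schedule _audioFile on _player and call play.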

Capturing iSight image using AVFoundation on Mac

I previously had this code to capture a single image from a Mac's iSight camera using QTKit:
- (NSError*)takePicture
{
    BOOL success;
    NSError* error;
    captureSession = [QTCaptureSession new];
    QTCaptureDevice* device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];

    success = [device open: &error];
    if (!success) { return error; }

    QTCaptureDeviceInput* captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice: device];
    success = [captureSession addInput: captureDeviceInput error: &error];
    if (!success) { return error; }

    QTCaptureDecompressedVideoOutput* captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
    [captureVideoOutput setDelegate: self];
    success = [captureSession addOutput: captureVideoOutput error: &error];
    if (!success) { return error; }

    [captureSession startRunning];
    return nil;
}

- (void)captureOutput: (QTCaptureOutput*)captureOutput
  didOutputVideoFrame: (CVImageBufferRef)imageBuffer
     withSampleBuffer: (QTSampleBuffer*)sampleBuffer
       fromConnection: (QTCaptureConnection*)connection
{
    CVBufferRetain(imageBuffer);
    if (imageBuffer) {
        [captureSession removeOutput: captureOutput];
        [captureSession stopRunning];
        NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
        _result = [[NSImage alloc] initWithSize: [imageRep size]];
        [_result addRepresentation: imageRep];
        CVBufferRelease(imageBuffer);
        _done = YES;
    }
}
However, I found out today that QTKit has been deprecated, so we must now use AVFoundation.
Can anyone help me convert this code to its AVFoundation equivalent? It seems as though many methods have the same name, but at the same time a lot is different, and I'm at a complete loss here. Any help?
Alright, I found the solution!! Here it is:
- (void)takePicture
{
    NSError* error;
    AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }

    AVCaptureStillImageOutput* output = [AVCaptureStillImageOutput new];
    [output setOutputSettings: @{(id)kCVPixelBufferPixelFormatTypeKey: @(k32BGRAPixelFormat)}];

    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    [captureSession addInput: input];
    [captureSession addOutput: output];
    [captureSession startRunning];

    AVCaptureConnection* connection = [output connectionWithMediaType: AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection: connection completionHandler: ^(CMSampleBufferRef sampleBuffer, NSError* error) {
        if (error) {
            _error = error;
            _result = nil;
        }
        else {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            if (imageBuffer) {
                CVBufferRetain(imageBuffer);
                NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];
                _result = [[NSImage alloc] initWithSize: [imageRep size]];
                [_result addRepresentation: imageRep];
                CVBufferRelease(imageBuffer);
            }
        }
        _done = YES;
    }];
}
I hope this helps anyone who runs into the same problem.
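Note that AVCaptureStillImageOutput was itself later deprecated; on macOS 10.15 and newer, AVCapturePhotoOutput is available on the Mac. A hedged sketch of the replacement capture path, assuming the same session and input setup as above:

// Replace the still-image output with a photo output (macOS 10.15+).
AVCapturePhotoOutput* photoOutput = [AVCapturePhotoOutput new];
[captureSession addOutput: photoOutput];
[captureSession startRunning];
[photoOutput capturePhotoWithSettings: [AVCapturePhotoSettings photoSettings] delegate: self];

// Delegate callback: turn the captured photo into an NSImage.
- (void)captureOutput: (AVCapturePhotoOutput*)output
didFinishProcessingPhoto: (AVCapturePhoto*)photo
                error: (NSError*)error
{
    if (error) {
        _error = error;
        _result = nil;
    }
    else {
        _result = [[NSImage alloc] initWithData: [photo fileDataRepresentation]];
    }
    _done = YES;
}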

AirPrint action not doing anything unless file type is PNG

Wrangling with AirPrint not doing anything from a button press; I'm trying to print a file from the screen (my first time implementing this).
I've added the UIPrintInteractionControllerDelegate in the .h.
In the .m I've hooked up my button, which has been checked as working correctly.
Button handler code:
NSString *path = [[NSBundle mainBundle] pathForResource:@"mylocalhtmlfile" ofType:@"HTML"];
NSData *dataFromPath = [NSData dataWithContentsOfFile:path];
UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];
if (printController && [UIPrintInteractionController canPrintData:dataFromPath]) {
    printController.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = [path lastPathComponent];
    printInfo.duplex = UIPrintInfoDuplexLongEdge;
    printController.printInfo = printInfo;
    printController.showsPageRange = YES;
    printController.printingItem = dataFromPath;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
        if (!completed && error) {
            NSLog(@"FAILED! due to error in domain %@ with error code %ld", error.domain, (long)error.code);
        }
    };
    [printController presentAnimated:YES completionHandler:completionHandler];
}
I get no action whatsoever, no presentation of the AirPrint controls or anything, unless the file type is set to PNG.
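One likely cause, offered as an assumption: printingItem (and therefore canPrintData:) only accepts data that is directly printable, i.e. image or PDF data, so raw HTML data never passes the check and the body of the if never runs; it works for PNG because PNG is an image format. (Also worth checking: pathForResource:ofType: is case-sensitive on device, so @"HTML" must match the file's actual extension.) A minimal sketch that prints HTML through UIMarkupTextPrintFormatter instead:

// Sketch: print HTML by handing the markup to a print formatter.
NSString *path = [[NSBundle mainBundle] pathForResource:@"mylocalhtmlfile" ofType:@"html"];
NSString *html = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:nil];
UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];

UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = [path lastPathComponent];
printController.printInfo = printInfo;

// HTML is rendered by the formatter rather than passed as printingItem.
UIMarkupTextPrintFormatter *formatter = [[UIMarkupTextPrintFormatter alloc] initWithMarkupText:html];
printController.printFormatter = formatter;

[printController presentAnimated:YES completionHandler:nil];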

Sending data to a Wi-Fi printer from an iPad programmatically

I am creating an iPad application in which I want to send data to a Wi-Fi printer programmatically. Is there any API or sample code available to achieve this?
Thanks.
I think the normal printing APIs will accomplish this, using AirPrint. http://developer.apple.com/library/IOs/documentation/2DDrawing/Conceptual/DrawingPrintingiOS/Printing/Printing.html
In addition, there's a great app called Printopia that allows your Mac to serve as an AirPrint host: http://www.ecamm.com/mac/printopia/
UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
if (pic && [UIPrintInteractionController canPrintData:self.myPDFData]) {
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = @"PrintPdf";
    printInfo.duplex = UIPrintInfoDuplexLongEdge;
    pic.printInfo = printInfo;
    pic.showsPageRange = YES;
    pic.printingItem = self.myPDFData;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *pic, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"FAILED! due to error in domain %@ with error code %ld",
                      error.domain, (long)error.code);
            }
        };

    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        // On iPad the controller must be anchored to a rect or bar button item.
        [pic presentFromRect:self.printButton.frame inView:self.view animated:YES completionHandler:completionHandler];
    } else {
        [pic presentAnimated:YES completionHandler:completionHandler];
    }
}