Sending data to a Wi-Fi printer from an iPad programmatically - Objective-C

I am creating an iPad application in which I want to send data to a Wi-Fi printer programmatically. Is there any API or sample code available to achieve this?
Thanks.

I think the normal printing APIs will accomplish this, using AirPrint. http://developer.apple.com/library/IOs/documentation/2DDrawing/Conceptual/DrawingPrintingiOS/Printing/Printing.html
In addition, there's a great app called Printopia that allows your Mac to serve as an AirPrint host: http://www.ecamm.com/mac/printopia/

UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
if (pic && [UIPrintInteractionController canPrintData:self.myPDFData]) {
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = @"PrintPdf";
    printInfo.duplex = UIPrintInfoDuplexLongEdge;
    pic.printInfo = printInfo;
    pic.showsPageRange = YES;
    pic.printingItem = self.myPDFData; // NSData containing PDF (or image) data

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *pic, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"FAILED! due to error in domain %@ with error code %ld",
                      error.domain, (long)error.code);
            }
        };

    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad) {
        // On iPad the print controller must be presented from a rect or bar button item
        [pic presentFromRect:self.printButton.frame inView:self.view animated:YES completionHandler:completionHandler];
    } else {
        [pic presentAnimated:YES completionHandler:completionHandler];
    }
}
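For context, self.myPDFData above is assumed to be an NSData object containing PDF (or image) bytes, which is what canPrintData: and printingItem expect. A minimal sketch of loading it from the app bundle, using a hypothetical resource named sample.pdf:

// Hypothetical bundled file named "sample.pdf"; any PDF or image data works here.
NSURL *pdfURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"pdf"];
self.myPDFData = [NSData dataWithContentsOfURL:pdfURL];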

Related

Could not send command to XPrinter

I have a problem sending commands to an XPrinter. I have its iOS SDK and followed the instructions in the framework to successfully connect my iOS app to the printer over a Wi-Fi connection.
NSString *ip = @"IPAddress"; // the printer's IP address
[manager MConnectWithHost:ip port:9100 completion:^(BOOL result) {
}];
--> The manager is created and the connection succeeds.
But when I send a command with the MWriteCommandWithData: method of the MWIFIManager class, the printer always prints the string itself instead of executing the command.
Example:
I want to cut the paper, so I prepare the command with TscCommand:
NSData *data = [TscCommand cut];
then send it to MWIFIManager with:
[manager MWriteCommandWithData:data];
Result: the printer prints 'CUT' on the paper instead of cutting it.
I downloaded the SDK from here: https://www.xprintertech.com/sdk
It is written in Objective-C, and my iOS app is written in Swift.
I just want to implement a method to connect to and print on the XPrinter (a thermal printer).
If anyone has experience with this, please advise. Thanks so much!
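If the printer prints 'CUT' as literal text, one possible explanation (an assumption, not something confirmed in the question) is that this model is interpreting incoming bytes as ESC/POS print data rather than TSC label commands, so the TSC text command is simply rendered. A minimal sketch that sends a raw ESC/POS full-cut command (GS V 0) through the same SDK call from the question:

// Assumption: the printer is in ESC/POS mode, so TSC text commands get printed verbatim.
// ESC/POS full cut is GS V 0  ->  bytes 0x1D 0x56 0x00.
const unsigned char cutCommand[] = {0x1D, 0x56, 0x00};
NSData *cutData = [NSData dataWithBytes:cutCommand length:sizeof(cutCommand)];
[manager MWriteCommandWithData:cutData]; // same MWIFIManager call as above

If the raw bytes do cut the paper, the printer expects ESC/POS commands and the TscCommand helpers are the wrong command set for this model.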
Not sure what you're trying to do with the IP address...
Here is all you need to print in iOS:
Class printInteractionController = NSClassFromString(@"UIPrintInteractionController");
if ((printInteractionController != nil) && [printInteractionController isPrintingAvailable])
{
    NSURL *fileURL = document.fileURL; // Document file URL

    printInteraction = [printInteractionController sharedPrintController];

    if ([printInteractionController canPrintURL:fileURL] == YES) // Check first
    {
        UIPrintInfo *printInfo = [NSClassFromString(@"UIPrintInfo") printInfo];
        printInfo.duplex = UIPrintInfoDuplexLongEdge;
        printInfo.outputType = UIPrintInfoOutputGeneral;
        printInfo.jobName = document.fileName;

        printInteraction.printInfo = printInfo;
        printInteraction.printingItem = fileURL;
        printInteraction.showsPageRange = YES;

        if ([UIDevice currentDevice].userInterfaceIdiom == UIUserInterfaceIdiomPad)
        {
            [printInteraction presentFromRect:button.bounds inView:button animated:YES completionHandler:
                ^(UIPrintInteractionController *pic, BOOL completed, NSError *error)
                {
                    #ifdef DEBUG
                    if ((completed == NO) && (error != nil)) NSLog(@"%s %@", __FUNCTION__, error);
                    #endif
                }
            ];
        }
        else // Presume UIUserInterfaceIdiomPhone
        {
            [printInteraction presentAnimated:YES completionHandler:
                ^(UIPrintInteractionController *pic, BOOL completed, NSError *error)
                {
                    #ifdef DEBUG
                    if ((completed == NO) && (error != nil)) NSLog(@"%s %@", __FUNCTION__, error);
                    #endif
                }
            ];
        }
    }
}

Simple Speech into Text in iPhone

I am trying to develop an app that converts speech into text in a text field. I searched Google and found some sample code, but it was not useful. I found a raywenderlich.com link in which they mention an API used for speech recognition, but I am not able to get it working.
Please can anyone share a tutorial with a sample project? It would be very useful to me.
Thanks in advance!!!
- (void)viewDidAppear:(BOOL)animated {
    _recognizer = [[SFSpeechRecognizer alloc] initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US"]];
    [_recognizer setDelegate:self];

    [SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
        switch (authStatus) {
            case SFSpeechRecognizerAuthorizationStatusAuthorized:
                // User gave access to speech recognition
                NSLog(@"Authorized");
                break;
            case SFSpeechRecognizerAuthorizationStatusDenied:
                // User denied access to speech recognition
                NSLog(@"SFSpeechRecognizerAuthorizationStatusDenied");
                break;
            case SFSpeechRecognizerAuthorizationStatusRestricted:
                // Speech recognition restricted on this device
                NSLog(@"SFSpeechRecognizerAuthorizationStatusRestricted");
                break;
            case SFSpeechRecognizerAuthorizationStatusNotDetermined:
                // Speech recognition not yet authorized
                break;
            default:
                NSLog(@"Default");
                break;
        }
    }];

    audioEngine = [[AVAudioEngine alloc] init];
    _speechSynthesizer = [[AVSpeechSynthesizer alloc] init];
    [_speechSynthesizer setDelegate:self];
}
- (void)startRecording
{
    [self clearLogs:nil];

    NSError *outError;

    // Configure the audio session for recording
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:&outError];
    [audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
    [audioSession setActive:true withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];

    request2 = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
    inputNode = [audioEngine inputNode];

    if (request2 == nil) {
        NSLog(@"Unable to create a SFSpeechAudioBufferRecognitionRequest object");
    }
    if (inputNode == nil) {
        NSLog(@"Unable to create an inputNode object");
    }

    request2.shouldReportPartialResults = true;
    _currentTask = [_recognizer recognitionTaskWithRequest:request2 delegate:self];

    // Feed microphone buffers into the recognition request
    [inputNode installTapOnBus:0 bufferSize:4096 format:[inputNode outputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        NSLog(@"Block tap!");
        [request2 appendAudioPCMBuffer:buffer];
    }];

    [audioEngine prepare];
    [audioEngine startAndReturnError:&outError];
    NSLog(@"Error %@", outError);
}
- (void)speechRecognitionTask:(SFSpeechRecognitionTask *)task didFinishRecognition:(SFSpeechRecognitionResult *)result {
    NSLog(@"speechRecognitionTask:(SFSpeechRecognitionTask *)task didFinishRecognition");

    NSString *translatedString = [[[result bestTranscription] formattedString] stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
    [self log:translatedString];

    if ([result isFinal]) {
        // Final result received: stop the engine and release the request/task
        [audioEngine stop];
        [inputNode removeTapOnBus:0];
        _currentTask = nil;
        request2 = nil;
    }
}
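The snippets above rely on a few instance variables and delegate conformances that are not shown. A minimal sketch of what the interface is assumed to look like (the class name ViewController is hypothetical; the ivar and property names are taken from the code above):

#import <UIKit/UIKit.h>
#import <Speech/Speech.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <SFSpeechRecognizerDelegate, SFSpeechRecognitionTaskDelegate, AVSpeechSynthesizerDelegate>
{
    // Ivars referenced by the snippets above
    AVAudioEngine *audioEngine;
    SFSpeechAudioBufferRecognitionRequest *request2;
    AVAudioInputNode *inputNode;
}
@property (nonatomic, strong) SFSpeechRecognizer *recognizer;
@property (nonatomic, strong) SFSpeechRecognitionTask *currentTask;
@property (nonatomic, strong) AVSpeechSynthesizer *speechSynthesizer;
@end

Also remember that speech recognition and microphone access require the NSSpeechRecognitionUsageDescription and NSMicrophoneUsageDescription keys in the app's Info.plist; without them the authorization request will not succeed.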

Capturing iSight image using AVFoundation on Mac

I previously had this code to capture a single image from a Mac's iSight camera using QTKit:
- (NSError *)takePicture
{
    BOOL success;
    NSError *error;

    captureSession = [QTCaptureSession new];
    QTCaptureDevice *device = [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];

    success = [device open:&error];
    if (!success) { return error; }

    QTCaptureDeviceInput *captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
    success = [captureSession addInput:captureDeviceInput error:&error];
    if (!success) { return error; }

    QTCaptureDecompressedVideoOutput *captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
    [captureVideoOutput setDelegate:self];
    success = [captureSession addOutput:captureVideoOutput error:&error];
    if (!success) { return error; }

    [captureSession startRunning];
    return nil;
}

- (void)captureOutput:(QTCaptureOutput *)captureOutput
  didOutputVideoFrame:(CVImageBufferRef)imageBuffer
     withSampleBuffer:(QTSampleBuffer *)sampleBuffer
       fromConnection:(QTCaptureConnection *)connection
{
    CVBufferRetain(imageBuffer);
    if (imageBuffer) {
        // Take the first frame received, then tear down the session
        [captureSession removeOutput:captureOutput];
        [captureSession stopRunning];

        NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
        _result = [[NSImage alloc] initWithSize:[imageRep size]];
        [_result addRepresentation:imageRep];
        CVBufferRelease(imageBuffer);

        _done = YES;
    }
}
However, I found today that QTKit has been deprecated and so we must now use AVFoundation.
Can anyone help me convert this code to its AVFoundation equivalent? It seems as though many methods have the same name, but at the same time, a lot is different and I'm at a complete loss here... Any help?
Alright, I found the solution!! Here it is:
- (void)takePicture
{
    NSError *error;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }

    AVCaptureStillImageOutput *output = [AVCaptureStillImageOutput new];
    [output setOutputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey: @(k32BGRAPixelFormat) }];

    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    [captureSession addInput:input];
    [captureSession addOutput:output];
    [captureSession startRunning];

    AVCaptureConnection *connection = [output connectionWithMediaType:AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (error) {
            _error = error;
            _result = nil;
        }
        else {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            if (imageBuffer) {
                CVBufferRetain(imageBuffer);
                NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
                _result = [[NSImage alloc] initWithSize:[imageRep size]];
                [_result addRepresentation:imageRep];
                CVBufferRelease(imageBuffer);
            }
        }
        _done = YES;
    }];
}
I hope this helps whoever has any problems in doing this same thing.
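Both versions assume a few ivars on the capturing class that are not shown above. A minimal sketch of the declarations for the AVFoundation version (the class name ImageCapturer is hypothetical; the ivar names match the snippets):

#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical class name; the ivars are the ones referenced in takePicture above.
@interface ImageCapturer : NSObject
{
    AVCaptureSession *captureSession;
    NSImage *_result;   // the captured frame
    NSError *_error;    // any capture error
    BOOL _done;         // polled by the caller to know when the capture finished
}
@end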

AirPrint action not doing anything unless file type is png

I'm wrangling with AirPrint not doing anything from a button press; I'm trying to print a file from the screen (my first time implementing this).
I've added UIPrintInteractionControllerDelegate in the .h.
In the .m I hooked up my button, which has been verified as working correctly.
Button handler code:
NSString *path = [[NSBundle mainBundle] pathForResource:@"mylocalhtmlfile" ofType:@"HTML"];
NSData *dataFromPath = [NSData dataWithContentsOfFile:path];

UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];
if (printController && [UIPrintInteractionController canPrintData:dataFromPath]) {
    printController.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = [path lastPathComponent];
    printInfo.duplex = UIPrintInfoDuplexLongEdge;
    printController.printInfo = printInfo;
    printController.showsPageRange = YES;
    printController.printingItem = dataFromPath;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) = ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
        if (!completed && error) {
            NSLog(@"FAILED! due to error in domain %@ with error code %ld", error.domain, (long)error.code);
        }
    };

    [printController presentAnimated:YES completionHandler:completionHandler];
}
I get no action whatsoever, no presentation of the AirPrint controls or anything, unless the file type is set to png.
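One likely cause (an assumption, based only on the snippet above): printingItem and canPrintData: only accept image or PDF content, so raw HTML data makes canPrintData: return NO and the whole if block is skipped silently, which is why only png works. HTML is usually printed through a print formatter instead; a minimal sketch using UIMarkupTextPrintFormatter with the same bundled file:

// Sketch: print bundled HTML via a markup formatter instead of printingItem.
// The extension passed to pathForResource:ofType: should match the bundled
// file's actual extension ("html" vs "HTML" matters on a case-sensitive file system).
NSString *path = [[NSBundle mainBundle] pathForResource:@"mylocalhtmlfile" ofType:@"html"];
NSString *html = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:NULL];

UIPrintInteractionController *printController = [UIPrintInteractionController sharedPrintController];

UIPrintInfo *printInfo = [UIPrintInfo printInfo];
printInfo.outputType = UIPrintInfoOutputGeneral;
printInfo.jobName = [path lastPathComponent];
printController.printInfo = printInfo;

UIMarkupTextPrintFormatter *formatter = [[UIMarkupTextPrintFormatter alloc] initWithMarkupText:html];
printController.printFormatter = formatter;

[printController presentAnimated:YES completionHandler:nil];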

Function that sends an image to AirPrint

I'm trying to find a function that lets me print using AirPrint.
I have a button, btnPrint, that when pressed should print myPic.jpg to the default AirPrint device. But I cannot figure out whether such a function even exists.
I cannot find much documentation on AirPrint in Xcode.
Apple has documentation on printing that would probably benefit you.
And the following is from Objective-C code for AirPrint:
Check whether printing is available:
if ([UIPrintInteractionController isPrintingAvailable])
{
    // Available
} else {
    // Not Available
}
Print after button click:
- (IBAction)buttonClicked:(id)sender
{
    NSMutableString *printBody = [NSMutableString stringWithFormat:@"%@, %@", self.encoded.text, self.decoded.text];
    [printBody appendFormat:@"\n\n\n\nPrinted From *myapp*"];

    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = self.titleLabel.text;
    pic.printInfo = printInfo;

    UISimpleTextPrintFormatter *textFormatter = [[UISimpleTextPrintFormatter alloc] initWithText:printBody];
    textFormatter.startPage = 0;
    textFormatter.contentInsets = UIEdgeInsetsMake(72.0, 72.0, 72.0, 72.0); // 1 inch margins
    textFormatter.maximumContentWidth = 6 * 72.0;
    pic.printFormatter = textFormatter;
    [textFormatter release]; // pre-ARC (manual retain/release) code

    pic.showsPageRange = YES;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
        ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
            if (!completed && error) {
                NSLog(@"Printing could not complete because of error: %@", error);
            }
        };

    [pic presentFromBarButtonItem:self.rightButton animated:YES completionHandler:completionHandler];
}