I am trying to record the primary and secondary monitor screens into two separate files using the code below.
const uint32_t MAX_DISPLAY = 2;
CGDirectDisplayID displays[MAX_DISPLAY] = {0};
CGGetActiveDisplayList(MAX_DISPLAY, displays, &m_nDisplays);
NSString* dest_file[2] = {0};
NSURL* dest_path[2] = {0};
AVCaptureConnection *CaptureConnection[2] = {0};
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
@2048, AVVideoCleanApertureWidthKey,
@1152, AVVideoCleanApertureHeightKey,
@0, AVVideoCleanApertureHorizontalOffsetKey,
@0, AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
@3, AVVideoPixelAspectRatioHorizontalSpacingKey,
@3, AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSNumber* bitsPerSecond = [NSNumber numberWithDouble:1024*1000];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
bitsPerSecond, AVVideoAverageBitRateKey,
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings,AVVideoCompressionPropertiesKey,
AVVideoScalingModeResize,AVVideoScalingModeKey,
@2048, AVVideoWidthKey,
@1152, AVVideoHeightKey,
nil];
for( int nIdx = 0; nIdx < m_nDisplays; ++nIdx )
{
m_session[nIdx] = [[AVCaptureSession alloc] init];
dest_file[nIdx] = [NSString stringWithFormat:@"%@_%d.MOV", destination_path, nIdx];
dest_path[nIdx] = [NSURL fileURLWithPath: dest_file[nIdx] ];
// Create a ScreenInput with the display and add it to the session
m_movie_file_input[nIdx] = [[[AVCaptureScreenInput alloc] initWithDisplayID:displays[nIdx]] autorelease];
[m_movie_file_input[nIdx] setRemovesDuplicateFrames:YES];
if ([m_session[nIdx] canAddInput:m_movie_file_input[nIdx]])
{
[m_session[nIdx] addInput:m_movie_file_input[nIdx]];
}
// Create a MovieFileOutput and add it to the session
m_movie_file_output[nIdx] = [[[AVCaptureMovieFileOutput alloc] init] autorelease];
if ([m_session[nIdx] canAddOutput:m_movie_file_output[nIdx]])
{
[m_session[nIdx] addOutput:m_movie_file_output[nIdx]];
}
CaptureConnection[nIdx] = [m_movie_file_output[nIdx] connectionWithMediaType:AVMediaTypeVideo];
[m_movie_file_output[nIdx] setOutputSettings : videoSettings forConnection : CaptureConnection[nIdx]];
// Start running the session
[m_session[nIdx] startRunning];
[m_movie_file_output[nIdx] startRecordingToOutputFileURL:dest_path[nIdx] recordingDelegate:self];
}
I am getting both screens saved into two separate files. But when calling the startRecordingToOutputFileURL API for the secondary monitor, i.e. on the second pass of the loop, I get the error shown below:
VTCompressionSessionCreate signalled err=-8973 (err)
(VTVideoEncoderStartSession failed) at
/SourceCache/CoreMedia_frameworks/CoreMedia-1562.19/Sources/VideoToolbox/VTCompressionSession.c
line 897
Also, the compression parameters (bitrate) are not applied properly for the secondary monitor; it takes values other than the ones I have specified in the program.
Can somebody please help me with this? Also, please let me know whether this is the proper way of doing it.
Thanks in Advance
George
I have these functions, called on a thread, that draw into an NSView:
+(NSFont *)customFontWithName:(NSString *)fontName AndSize:(float)fontSize
{
NSData *data = [[[NSDataAsset alloc]initWithName:fontName] data];
CGDataProviderRef fontProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGFontRef cgFont = CGFontCreateWithDataProvider(fontProvider);
CGDataProviderRelease(fontProvider);
NSDictionary *fontsizeAttr=[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:fontSize], NSFontSizeAttribute,
nil];
CTFontDescriptorRef fontDescriptor = CTFontDescriptorCreateWithAttributes((__bridge CFDictionaryRef)fontsizeAttr);
CTFontRef font = CTFontCreateWithGraphicsFont(cgFont, 0, NULL, fontDescriptor);
CFRelease(fontDescriptor);
CGFontRelease(cgFont);
NSFont* retval= (__bridge NSFont*)font;
CFRelease(font);
return retval;
}
and this:
+(NSAttributedString*) createCurrentTextWithString:(NSString *)string AndMaxLength:(float)maxLength AndMaxHeight:(float)maxHeight AndColor:(NSColor *)color AndFontName:(NSString*)fontName
{
float dim=0.1;
NSDictionary *dictionary=[NSDictionary dictionaryWithObjectsAndKeys:[CustomFont customFontWithName:fontName AndSize:dim], NSFontAttributeName,color, NSForegroundColorAttributeName, nil];
NSAttributedString * currentText=[[NSAttributedString alloc] initWithString:string attributes: dictionary];
while ([currentText size].width < maxLength && [currentText size].height < maxHeight)
{
dictionary=[NSDictionary dictionaryWithObjectsAndKeys:[CustomFont customFontWithName:fontName AndSize:dim], NSFontAttributeName,color, NSForegroundColorAttributeName, nil];
currentText=[[NSAttributedString alloc] initWithString:string attributes: dictionary];
dim+=0.1;
}
return currentText;
}
All the objects created in these functions are correctly deallocated and I can't find any memory leaks, but this code causes huge memory use (many gigabytes) and I can't understand why. Please help.
I found a solution. For some reason that I don't know, the code:
NSData *data = [[[NSDataAsset alloc]initWithName:fontName] data];
CGDataProviderRef fontProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGFontRef cgFont = CGFontCreateWithDataProvider(fontProvider);
allocates a large amount of memory that is never deallocated. So I created a static CGFontRef variable for each custom font I have. This is the only way I found:
static CGFontRef font1;
....
static CGFontRef font;
+(CGFontRef) getFontWithValue: (int) value
{
switch (value)
{
case 1:
return font1;
break;
...
case n:
return fontn;
default:
return NULL;
}
}
And
+(NSFont*) customFontWithName:(int)fontName AndSize:(float)fontSize
{
NSDictionary *fontsizeAttr=[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithFloat:fontSize], NSFontSizeAttribute,
nil];
CTFontDescriptorRef fontDescriptor = CTFontDescriptorCreateWithAttributes((__bridge CFDictionaryRef)fontsizeAttr);
CTFontRef font = CTFontCreateWithGraphicsFont([CustomFont getFontWithValue:fontName], 0, NULL, fontDescriptor);
CFRelease(fontDescriptor);
NSFont* retval= (__bridge NSFont*)font;
CFRelease(font);
return retval;
}
I still don't understand why there is this memory leak, and this is not a solution, only a trick, but it works.
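For reference, each static CGFontRef could be loaded once like this (a minimal sketch, assuming the same NSDataAsset loading as in the original function; the asset name "MyFont1" is illustrative):
// Sketch: one-time loading of a static CGFontRef; the asset name is illustrative
+(void) initialize
{
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        NSData *data = [[[NSDataAsset alloc] initWithName:@"MyFont1"] data];
        CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
        font1 = CGFontCreateWithDataProvider(provider); // kept alive for the app's lifetime
        CGDataProviderRelease(provider);
    });
}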
I want to get an NSArray with all the UIImages from a Live Photo, to create a GIF from it. I tried taking screenshots while the Live Photo animates, but that doesn't work.
Can anyone help me?
Thanks!
First step: you need to convert the Live Photo to a video, using this:
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
    // Video file has been written to the path specified via fileURL
})
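The assetRes here is the Live Photo's paired-video resource; one way to locate it (a sketch in Objective-C, assuming an already-fetched livePhotoAsset):
// Sketch: find the paired-video resource of a Live Photo (livePhotoAsset is assumed)
PHAssetResource *assetRes = nil;
for (PHAssetResource *res in [PHAssetResource assetResourcesForAsset:livePhotoAsset]) {
    if (res.type == PHAssetResourceTypePairedVideo) {
        assetRes = res;
        break;
    }
}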
Finally, use this library to convert the video to a GIF, or you can search Google for another way: https://github.com/NSRare/NSGIF
Hope this will help you.
This is what I did to achieve what you requested.
PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:NO]];
// Assigning options.predicate twice would overwrite the first predicate,
// so both conditions are combined into one.
options.predicate = [NSPredicate predicateWithFormat:@"mediaType == %d AND (mediaSubtype & %d) != 0",
                     PHAssetMediaTypeImage, PHAssetMediaSubtypePhotoLive];
options.includeAllBurstAssets = NO;
PHFetchResult *allLivePhotos = [PHAsset fetchAssetsWithOptions:options];
NSLog(@"Get total live count : %ld", (unsigned long)allLivePhotos.count);
NSMutableArray *arrAllLiveImagesGroups = [NSMutableArray array];
for (PHAsset *asset in allLivePhotos) {
[asset requestContentEditingInputWithOptions:nil
completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
NSURL *urlMov = [contentEditingInput.livePhoto valueForKey:@"videoURL"];
NSMutableArray *arrLive = [NSMutableArray array];
NSMutableArray *arrSingleLiveImagesGroup = [NSMutableArray array];
// Use a distinct name so the outer PHAsset variable isn't shadowed
AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:urlMov options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:avAsset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(avAsset.duration) * 5; i++) { // sample at 5 frames per second
#autoreleasepool {
CMTime time = CMTimeMake(i, 5);
NSError *err;
CMTime actualTime;
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image scale:1.0 orientation:UIImageOrientationDown];
[arrLive addObject:generatedImage];
CGImageRelease(image);
}
}
[arrSingleLiveImagesGroup addObject:arrLive];
[arrAllLiveImagesGroups addObject:arrSingleLiveImagesGroup];
}];
}
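To turn one of those frame arrays into an actual GIF, ImageIO can write the frames out directly. A sketch, assuming the arrLive array built above and a hypothetical destination gifURL; the 0.2 s delay matches the 5 fps sampling:
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h>

// Sketch: write sampled frames to an animated GIF (gifURL is a hypothetical NSURL)
NSDictionary *fileProps = @{ (__bridge id)kCGImagePropertyGIFDictionary:
                                 @{ (__bridge id)kCGImagePropertyGIFLoopCount: @0 } }; // 0 = loop forever
NSDictionary *frameProps = @{ (__bridge id)kCGImagePropertyGIFDictionary:
                                  @{ (__bridge id)kCGImagePropertyGIFDelayTime: @0.2 } };
CGImageDestinationRef dest = CGImageDestinationCreateWithURL((__bridge CFURLRef)gifURL,
                                                             kUTTypeGIF, arrLive.count, NULL);
CGImageDestinationSetProperties(dest, (__bridge CFDictionaryRef)fileProps);
for (UIImage *frame in arrLive) {
    CGImageDestinationAddImage(dest, frame.CGImage, (__bridge CFDictionaryRef)frameProps);
}
CGImageDestinationFinalize(dest);
CFRelease(dest);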
Hello, I need to know how to send two objects through a withParameters: method.
Here is my code:
NSDictionary *numberparam = [NSDictionary dictionaryWithObject:phoneNumber forKey:@"number"];
NSDictionary *messageparam = [NSDictionary dictionaryWithObject:message forKey:@"message"];
[PFCloud callFunctionInBackground:@"inviteWithTwilio" withParameters:numberparam messageparam block:^(id object, NSError *error) {
NSString *message1 = @"";
Everything works fine if I take messageparam out of the PFCloud call, but I need to include it. How do I do this?
You'd put them in a dictionary together:
NSMutableDictionary * params = [NSMutableDictionary new];
params[@"number"] = phoneNumber;
params[@"message"] = message;
[PFCloud callFunctionInBackground:@"inviteWithTwilio" withParameters:params block:^(id object, NSError *error) {
NSString *message1 = @"";
}];
Then, in Cloud Code, use:
var phoneNumber = request.params.number; // parameters arrive via request.params
var message = request.params.message;
How can I do something like "share to mail", like in NSSharingServices when selecting mail? For example, I have an NSImage and I want to achieve a result like in Image2. How can I do it? Any pointers?
To create a message from text only, I can do:
NSURL * url;
url = [NSURL URLWithString:@"mailto:"
"?subject="
"&body=text"
];
(void) [[NSWorkspace sharedWorkspace] openURL:url];
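Note that any non-trivial subject or body text has to be percent-encoded before it goes into the mailto: URL. A small sketch (the text is illustrative):
// Sketch: percent-encode subject and body before building the mailto: URL
NSString *subject = [@"My subject" stringByAddingPercentEncodingWithAllowedCharacters:
                     [NSCharacterSet alphanumericCharacterSet]];
NSString *body = [@"Some body text & more" stringByAddingPercentEncodingWithAllowedCharacters:
                  [NSCharacterSet alphanumericCharacterSet]];
NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"mailto:?subject=%@&body=%@", subject, body]];
(void) [[NSWorkspace sharedWorkspace] openURL:url];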
but I don't know how to create a message with an image.
I found a way to add an attachment using the ScriptingBridge framework. Code:
MailApplication *mail = [SBApplication
applicationWithBundleIdentifier:@"com.apple.Mail"];
MailOutgoingMessage *emailMessage =
[[[mail classForScriptingClass:@"outgoing message"] alloc]
initWithProperties:
[NSDictionary dictionaryWithObjectsAndKeys:
@"this is my subject", @"subject",
@"this is my content", @"content",
nil]];
[[mail outgoingMessages] addObject: emailMessage];
emailMessage.visible = YES;
NSString *attachmentFilePath = [NSString stringWithUTF8String:"<my provided file path>"];
if ( [attachmentFilePath length] > 0 ) {
MailAttachment *theAttachment = [[[mail
classForScriptingClass:@"attachment"] alloc]
initWithProperties:
[NSDictionary dictionaryWithObjectsAndKeys:
attachmentFilePath, @"fileName",
nil]];
[[emailMessage.content attachments] addObject: theAttachment];
}
It works. But how do I add an NSImage as an attachment? Maybe I have to write the NSImage to a temporary file, then add it as an attachment and delete the temporary file? Or what? Or maybe I should somehow add the NSImage to the body?
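The temporary-file route mentioned above could look roughly like this (a sketch, assuming an existing NSImage named image; PNG via NSBitmapImageRep):
// Sketch: write an NSImage to a temporary PNG so it can be attached by file path
NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:[image TIFFRepresentation]];
NSData *pngData = [rep representationUsingType:NSPNGFileType properties:@{}];
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"attachment.png"];
[pngData writeToFile:tmpPath atomically:YES];
// tmpPath can now be used as attachmentFilePath above; delete the file after the mail is sent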
like this:
NSString *text = @"sometext";
NSImage *image = [[NSImage alloc] initWithContentsOfURL:[NSURL fileURLWithPath:@"/logo57.png"]]; // fileURLWithPath for a local path
NSArray *shareItems = [NSArray arrayWithObjects:text, image, nil];
NSSharingService *service = [NSSharingService sharingServiceNamed:NSSharingServiceNameComposeEmail];
service.delegate = self;
[service performWithItems:shareItems];
Also make sure you declare NSSharingServiceDelegate conformance in the header file of your delegate class.
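That declaration might look like this (a sketch; the class name is illustrative):
// Sketch: adopting NSSharingServiceDelegate (class name is illustrative)
@interface ShareController : NSObject <NSSharingServiceDelegate>
@end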
Please consider this code:
NSString *jsonreturn = [[NSString alloc] initWithContentsOfURL:[NSURL URLWithString:url]]; // Pulls the URL
NSLog(@"jsonreturn#########=%@", jsonreturn); // Look at the console and you can see what the results are
NSData *jsonData = [jsonreturn dataUsingEncoding:NSUTF32BigEndianStringEncoding];
CJSONDeserializer *theDeserializer = [CJSONDeserializer deserializer];
theDeserializer.nullObject = NULL;
NSError *error = nil;
NSDictionary *dictt = [theDeserializer deserializeAsDictionary:jsonData error:&error]; // use the configured deserializer
NSLog(@"###dict=%@", dictt);
if (dictt) {
rowsForQuestion = [[dictt objectForKey:#"faqdetails"] retain];// NSArray rowsForQuestion
}
[jsonreturn release];
// I have got this data in the console. Now I want to print it in a UITextView, but how can I do it?
faqdetails = (
{
faqAns = "Yes, Jack Kalis is the best crickter";
faqQues = "who is the best cricketer in present year?";
}
);
}
Your question isn't particularly clear about what you want to show where, but dropping text into a UITextView couldn't be easier:
[yourTextView setText: [[rowsForQuestion objectAtIndex: 0] objectForKey: @"faqQues"]];
The above code grabs the first dict from rowsForQuestion, and puts its value for @"faqQues" into a UITextView.
Assuming you have a UITextView instance called myTextView, made either programmatically or through IB, try this:
[myTextView setText:faqdetails];
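Since faqdetails, as logged above, is an array of dictionaries rather than a plain string, it may need flattening into text first (a sketch, using the keys from the logged output):
// Sketch: flatten the faqdetails array into a single string for the text view
NSMutableString *displayText = [NSMutableString string];
for (NSDictionary *entry in rowsForQuestion) {
    [displayText appendFormat:@"Q: %@\nA: %@\n\n",
        [entry objectForKey:@"faqQues"], [entry objectForKey:@"faqAns"]];
}
[myTextView setText:displayText];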