Shader doesn't work - OpenGL ES - Objective-C

I have created a program that allows me to display 3D objects, and now I want to make a cut in one object relative to another. Here is my result: screen.
The screenshot shows that we can see through the cut portion, so I decided to use a shader to fill the cut part.
I tried to load a shader and then use it with glUniform3f, but that doesn't work. I did several searches on the internet, without results.
Here is my class to load a shader:
- (id)initWithVertexShaderFilename:(NSString *)vShaderFilename
            fragmentShaderFilename:(NSString *)fShaderFilename
{
    NSLog(@"%@", fShaderFilename); // always pass NSLog a format string, never raw input
    if (self = [super init])
    {
        attributes = [[NSMutableArray alloc] init];
        NSString *vertShaderPathname, *fragShaderPathname;
        program = glCreateProgram();

        vertShaderPathname = [[NSBundle mainBundle] pathForResource:vShaderFilename
                                                             ofType:@"vsh"
                                                        inDirectory:@"Shader"];
        if (![self compileShader:&vertShader
                            type:GL_VERTEX_SHADER
                            file:vertShaderPathname])
            NSLog(@"Failed to compile vertex shader");
        else
            NSLog(@"Vertex shader OK");

        fragShaderPathname = [[NSBundle mainBundle] pathForResource:fShaderFilename
                                                             ofType:@"fsh"
                                                        inDirectory:@"Shader"];
        if (![self compileShader:&fragShader
                            type:GL_FRAGMENT_SHADER
                            file:fragShaderPathname])
            NSLog(@"Failed to compile fragment shader");
        else
            NSLog(@"Fragment shader OK");

        glAttachShader(program, vertShader);
        glAttachShader(program, fragShader);
    }
    return self;
}
- (BOOL)compileShader:(GLuint *)shader
                 type:(GLenum)type
                 file:(NSString *)file
{
    GLint status;
    const GLchar *source;
    source = (GLchar *)[[NSString stringWithContentsOfFile:file
                                                  encoding:NSUTF8StringEncoding
                                                     error:nil] UTF8String];
    if (!source)
    {
        NSLog(@"Failed to load shader source");
        return NO;
    }

    *shader = glCreateShader(type);
    glShaderSource(*shader, 1, &source, NULL);
    glCompileShader(*shader);
    glGetShaderiv(*shader, GL_COMPILE_STATUS, &status);
    NSLog(@"Compile status: %d", status);
    return status == GL_TRUE; // '==', not '=': the assignment in the original made this always return YES
}
#pragma mark -

- (void)addAttribute:(NSString *)attributeName
{
    if (![attributes containsObject:attributeName])
    {
        [attributes addObject:attributeName];
        glBindAttribLocation(program,
                             [attributes indexOfObject:attributeName],
                             [attributeName UTF8String]);
    }
}

- (GLuint)attributeIndex:(NSString *)attributeName
{
    return [attributes indexOfObject:attributeName];
}

- (GLuint)uniformIndex:(NSString *)uniformName
{
    return glGetUniformLocation(program, [uniformName UTF8String]);
}

#pragma mark -

- (BOOL)link
{
    GLint status;
    glLinkProgram(program);
    glValidateProgram(program);

    glGetProgramiv(program, GL_LINK_STATUS, &status);
    if (status == GL_FALSE)
        return NO;

    if (vertShader)
        glDeleteShader(vertShader);
    if (fragShader)
        glDeleteShader(fragShader);

    return YES;
}

- (void)use
{
    glUseProgram(program);
}
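Note that the setup method below calls programLog, fragmentShaderLog and vertexShaderLog, which are not shown in the class above. A minimal sketch of what the program log accessor might look like (the shader variants would use glGetShaderiv/glGetShaderInfoLog on the stored shader handles):

- (NSString *)programLog
{
    GLint length = 0;
    glGetProgramiv(program, GL_INFO_LOG_LENGTH, &length);
    if (length < 1)
        return nil;

    GLchar *logBytes = malloc(length);
    glGetProgramInfoLog(program, length, NULL, logBytes);
    NSString *log = [NSString stringWithUTF8String:logBytes];
    free(logBytes);
    return log;
}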
Here is the function to initialize my shaders in the main class:
- (void)setup
{
    GLShader *theProgram = [[GLShader alloc] initWithVertexShaderFilename:@"shader"
                                                   fragmentShaderFilename:@"shader"];
    self.program = theProgram;
    [self.program addAttribute:@"position"];
    [self.program addAttribute:@"textureCoordinates"];

    if (![self.program link])
    {
        NSLog(@"Link failed");
        NSString *progLog = [self.program programLog];
        NSLog(@"Program Log: %@", progLog);
        NSString *fragLog = [self.program fragmentShaderLog];
        NSLog(@"Frag Log: %@", fragLog);
        NSString *vertLog = [self.program vertexShaderLog];
        NSLog(@"Vert Log: %@", vertLog);
        //[(GLView *)self.view stopAnimation];
        self.program = nil;
    }

    textureCoordinateAttribute = [program attributeIndex:@"textureCoordinates"];
    colorUniform = [program uniformIndex:@"uColor"];
    textureUniform = [program uniformIndex:@"texture"];
    //colorUniform = glGetUniformLocation(self.program, "uColor");

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);
    glEnable(GL_TEXTURE_2D); // note: not valid in OpenGL ES 2.0; this generates GL_INVALID_ENUM
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ZERO);
}
Here is the function where I want to use my shader:
-(void) draw
{
    [self.program use];
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
    glDepthMask(true);
    glUniform3f(colorUniform, 1.0, 1.0, 0.5); // was "lUniform3f", a typo that cannot compile
    [self drawSubtraction:image1 with:image2];
}
Here is my fragment shader:
precision highp float;
uniform vec3 uColor;
uniform vec3 uLight;
varying vec3 vNormal;
const float alpha = 0.7;

void main(void) {
    float val = dot( vNormal, uLight ) * 0.4 + 0.6;
    gl_FragColor = vec4( uColor * val, alpha);
}
Am I doing something wrong? Does anyone have an idea that could help me?
Thanks.

Your shader code looks OK. Make sure you are passing the "textureCoordinates" attribute to the vertex shader correctly: you will need to call glEnableVertexAttribArray() and glVertexAttribPointer() at some point during initialization. Depending on how you are rendering the models, you might need to ensure you are passing in vertices correctly as well.
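For example, here is a minimal sketch of that attribute setup (the interleaved Vertex struct and the vertexBuffer VBO are hypothetical, since your buffer layout isn't shown):

typedef struct {
    GLfloat position[3];
    GLfloat texCoord[2];
} Vertex; // hypothetical layout

glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);

GLuint positionAttribute = [self.program attributeIndex:@"position"];
glEnableVertexAttribArray(positionAttribute);
glVertexAttribPointer(positionAttribute, 3, GL_FLOAT, GL_FALSE,
                      sizeof(Vertex), (const void *)offsetof(Vertex, position));

glEnableVertexAttribArray(textureCoordinateAttribute);
glVertexAttribPointer(textureCoordinateAttribute, 2, GL_FLOAT, GL_FALSE,
                      sizeof(Vertex), (const void *)offsetof(Vertex, texCoord));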
Also, I'm assuming [self drawSubtraction:image1 with:image2] calls glDrawArrays() or glDrawElements() at some point.
The default Xcode OpenGL Game project template is also a great place to look at shader loading if you want a project to compare against.

Related

How to play a PCM audio buffer from a socket server using an audio unit circular buffer

I hope someone can help me. I am new to Objective-C and OS X, and I am trying to play audio data I receive via a socket in my audio queue. I found this link https://stackoverflow.com/a/30318859/4274654 which in a way addresses my issue with a circular buffer.
However, when I try to run my project, the call below returns an error, (OSStatus) -10865. That is why the code logs "Error enabling AudioUnit output bus":
status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
Here is my code:
Test.h
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import "TPCircularBuffer.h"

@interface Test : Communicator

@property (nonatomic) AudioComponentInstance audioUnit;
@property (nonatomic) TPCircularBuffer circularBuffer;

-(TPCircularBuffer *) outputShouldUseCircularBuffer;
-(void) start;

@end
Test.m
#import "Test.h"
#define kOutputBus 0
#define kInputBus 1
#implementation Test{
BOOL stopped;
}
static OSStatus OutputRenderCallback(void *inRefCon,
                                     AudioUnitRenderActionFlags *ioActionFlags,
                                     const AudioTimeStamp *inTimeStamp,
                                     UInt32 inBusNumber,
                                     UInt32 inNumberFrames,
                                     AudioBufferList *ioData){

    Test *output = (__bridge Test*)inRefCon;

    TPCircularBuffer *circularBuffer = [output outputShouldUseCircularBuffer];
    if( !circularBuffer ){
        // No buffer yet: output silence. The stream format set below is
        // 16-bit mono, so SInt16 matches it (the original used SInt32).
        SInt16 *left = (SInt16 *)ioData->mBuffers[0].mData;
        for(int i = 0; i < inNumberFrames; i++ ){
            left[ i ] = 0;
        }
        return noErr;
    }

    int32_t bytesToCopy = ioData->mBuffers[0].mDataByteSize;
    SInt16* outputBuffer = ioData->mBuffers[0].mData;

    uint32_t availableBytes;
    SInt16 *sourceBuffer = TPCircularBufferTail(circularBuffer, &availableBytes);

    int32_t amount = MIN(bytesToCopy, availableBytes);
    memcpy(outputBuffer, sourceBuffer, amount);
    TPCircularBufferConsume(circularBuffer, amount);

    return noErr;
}
-(void) start
{
    [self circularBuffer:&_circularBuffer withSize:24576*5];
    stopped = NO;
    [self setupAudioUnit];
    // [super setup:@"http://localhost" port:5321];
}
-(void) setupAudioUnit
{
    AudioComponentDescription desc;
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_VoiceProcessingIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;
    desc.componentFlags        = 0;
    desc.componentFlagsMask    = 0;

    AudioComponent comp = AudioComponentFindNext(NULL, &desc);

    OSStatus status;
    status = AudioComponentInstanceNew(comp, &_audioUnit);
    if(status != noErr)
    {
        NSLog(@"Error creating AudioUnit instance");
    }

    // Output is enabled on the output scope of the output element
    UInt32 one = 1;
    status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, kOutputBus, &one, sizeof(one));
    if(status != noErr)
    {
        NSLog(@"Error enabling AudioUnit output bus");
    }

    // Explicitly set the client format:
    // sample rate = 44100, num channels = 1, format = 16-bit signed integer
    AudioStreamBasicDescription audioFormat = [self getAudioDescription];
    status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
    if(status != noErr)
    {
        NSLog(@"Error setting audio format");
    }

    AURenderCallbackStruct renderCallback;
    renderCallback.inputProc = OutputRenderCallback;
    renderCallback.inputProcRefCon = (__bridge void *)(self);

    status = AudioUnitSetProperty(_audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &renderCallback, sizeof(renderCallback));
    if(status != noErr)
    {
        NSLog(@"Error setting rendering callback");
    }

    // Initialize the audio unit instance
    status = AudioUnitInitialize(_audioUnit);
    if(status != noErr)
    {
        NSLog(@"Error initializing audio unit");
    }
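
    // Sketch (an addition, not in the original question): the unit is
    // initialized above but never started anywhere in the code shown.
    // Playback also needs an explicit start call:
    status = AudioOutputUnitStart(_audioUnit);
    if(status != noErr)
    {
        NSLog(@"Error starting audio unit");
    }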
}
- (AudioStreamBasicDescription)getAudioDescription {
    AudioStreamBasicDescription audioDescription = {0};
    audioDescription.mFormatID = kAudioFormatLinearPCM;
    audioDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked | kAudioFormatFlagsNativeEndian;
    audioDescription.mChannelsPerFrame = 1;
    audioDescription.mBytesPerPacket = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
    audioDescription.mFramesPerPacket = 1;
    audioDescription.mBytesPerFrame = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
    audioDescription.mBitsPerChannel = 8 * sizeof(SInt16);
    audioDescription.mSampleRate = 44100.0;
    return audioDescription;
}
-(void)circularBuffer:(TPCircularBuffer *)circularBuffer withSize:(int)size {
    TPCircularBufferInit(circularBuffer, size);
}

-(void)appendDataToCircularBuffer:(TPCircularBuffer*)circularBuffer
              fromAudioBufferList:(AudioBufferList*)audioBufferList {
    TPCircularBufferProduceBytes(circularBuffer,
                                 audioBufferList->mBuffers[0].mData,
                                 audioBufferList->mBuffers[0].mDataByteSize);
}

-(void)freeCircularBuffer:(TPCircularBuffer *)circularBuffer {
    TPCircularBufferClear(circularBuffer);
    TPCircularBufferCleanup(circularBuffer);
}

-(TPCircularBuffer *) outputShouldUseCircularBuffer
{
    return &_circularBuffer;
}
-(void) stop
{
    OSStatus status = AudioOutputUnitStop(_audioUnit);
    if(status != noErr)
    {
        NSLog(@"Error stopping audio unit");
    }

    TPCircularBufferClear(&_circularBuffer);
    // AudioComponentInstance is not an Objective-C object; dispose of it
    // rather than assigning nil.
    AudioComponentInstanceDispose(_audioUnit);
    _audioUnit = NULL;
    stopped = YES;
}
-(void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)event{
    switch (event) {
        case NSStreamEventOpenCompleted:
            NSLog(@"Stream opened");
            break;
        case NSStreamEventHasBytesAvailable:
            if (stream == [super inputStream]) {
                NSLog(@"NSStreamEventHasBytesAvailable");
                uint8_t buffer[1024];
                NSUInteger len;
                while ([[super inputStream] hasBytesAvailable]) {
                    len = [[super inputStream] read:buffer maxLength:sizeof(buffer)];
                    if (len > 0) {
                        // converting buffer to byte data
                        NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        if (nil != output) {
                            //NSLog(@"server said: %@", output);
                        }
                        NSData *data0 = [[NSData alloc] initWithBytes:buffer length:len];
                        if (nil != data0) {
                            SInt16* byteData = (SInt16*)malloc(len);
                            memcpy(byteData, [data0 bytes], len);
                            double sum = 0.0;
                            for(int i = 0; i < len/2; i++) {
                                sum += byteData[i] * byteData[i];
                            }
                            Byte* soundData = (Byte*)malloc(len);
                            memcpy(soundData, [data0 bytes], len);
                            if(soundData)
                            {
                                AudioBufferList *theDataBuffer = (AudioBufferList*) malloc(sizeof(AudioBufferList) * 1);
                                theDataBuffer->mNumberBuffers = 1;
                                theDataBuffer->mBuffers[0].mDataByteSize = (UInt32)len;
                                theDataBuffer->mBuffers[0].mNumberChannels = 1;
                                theDataBuffer->mBuffers[0].mData = (SInt16*)soundData;
                                NSLog(@"soundData here");
                                [self appendDataToCircularBuffer:&_circularBuffer fromAudioBufferList:theDataBuffer];
                            }
                        }
                    }
                }
            }
            break;
        case NSStreamEventErrorOccurred:
            NSLog(@"Can't connect to server");
            break;
        case NSStreamEventEndEncountered:
            [stream close];
            [stream removeFromRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
            break;
        default:
            NSLog(@"Unknown event");
    }
    [super stream:stream handleEvent:event];
}

@end
I would highly appreciate it if anyone has an example of playing buffers returned from a socket server through an audio unit, so that I can listen to the sound as it comes from the socket server.
Thanks
Your code seems to be asking for a kAudioUnitSubType_VoiceProcessingIO audio unit, but kAudioUnitSubType_RemoteIO would be a more suitable iOS audio unit for just playing buffers of audio samples.
Also, your code does not seem to first select an appropriate audio session category and activate it before playing audio. See Apple's documentation for doing this: https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/Introduction/Introduction.html
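For example, a minimal sketch of selecting and activating a playback session on iOS before starting the unit (AVAudioSession lives in AVFoundation):

#import <AVFoundation/AVFoundation.h>

NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryPlayback error:&sessionError]) {
    NSLog(@"Error setting audio session category: %@", sessionError);
}
if (![session setActive:YES error:&sessionError]) {
    NSLog(@"Error activating audio session: %@", sessionError);
}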

Error cocos3d + Storyboard?

I made an example like CC3DemoMultiScene in iOS Objective-C, following the same code as in the example.
My AppDelegate follows:
#import "AppDelegate.h"
#interface AppDelegate ()
#end
#implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{return YES;
}
-(void) applicationWillResignActive: (UIApplication*) application {
[CCDirector.sharedDirector pause];
}
-(void) applicationDidBecomeActive: (UIApplication*) application {
[CCDirector.sharedDirector resume];
}
-(void) applicationDidReceiveMemoryWarning: (UIApplication*) application {
}
-(void) applicationDidEnterBackground: (UIApplication*) application {
[CCDirector.sharedDirector stopAnimation];
}
-(void) applicationWillEnterForeground: (UIApplication*) application {
[CCDirector.sharedDirector startAnimation];
}
-(void)applicationWillTerminate: (UIApplication*) application {
[CC3OpenGL terminateOpenGL];
}
-(void) applicationSignificantTimeChange: (UIApplication*) application {
[CCDirector.sharedDirector setNextDeltaTimeZero: YES];
}
#end
There are no compile errors, but when I run it in the iOS Simulator the app crashes and prints these issues:
2015-08-14 16:43:26.446 ExampleCocos3d[9063:116836] *** Assertion failure in GLint CompileShader(GLenum, const char *)(), /Users/Dennis/Desktop/ExampleCocos3d/Libraries/DennisCocos3D/cocos2d/cocos2d/cocos2d/CCShader.m:173
2015-08-14 16:43:26.462 ExampleCocos3d[9063:116836] *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Error compiling shader'
*** First throw call stack:
(
0 CoreFoundation 0x000000010edc6c65 __exceptionPreprocess + 165
1 libobjc.A.dylib 0x000000010e642bb7 objc_exception_throw + 45
2 CoreFoundation 0x000000010edc6aca +[NSException raise:format:arguments:] + 106
3 Foundation 0x000000010e149a57 -[NSAssertionHandler handleFailureInFunction:file:lineNumber:description:] + 169
4 ExampleCocos3d 0x000000010aab3d0c CompileShader + 364
5 ExampleCocos3d 0x000000010aab3964 -[CCShader initWithVertexShaderSource:fragmentShaderSource:] + 212
6 ExampleCocos3d 0x000000010aab3ebd -[CCShader initWithFragmentShaderSource:] + 77
7 ExampleCocos3d 0x000000010aab40d4 +[CCShader initialize] + 180
8 libobjc.A.dylib 0x000000010e6434d6 _class_initialize + 648
9 libobjc.A.dylib 0x000000010e64c6e1 lookUpImpOrForward + 351
10 libobjc.A.dylib 0x000000010e6590d3 objc_msgSend + 211
11 ExampleCocos3d 0x000000010aabd331 -[CCRenderer init] + 353
12 ExampleCocos3d 0x000000010ade7c1b -[CCDirector init] + 875
13 ExampleCocos3d 0x000000010ad6e266 -[CCDirectorIOS init] + 54
14 ExampleCocos3d 0x000000010ade7760 +[CCDirector sharedDirector] + 144
15 ExampleCocos3d 0x000000010ace34bd -[AppDelegate applicationDidBecomeActive:] + 61
16 UIKit 0x000000010c2b92ce -[UIApplication _stopDeactivatingForReason:] + 313
17 UIKit 0x000000010c2ce417 -[UIApplication _handleNonLaunchSpecificActions:forScene:withTransitionContext:] + 2589
18 FrontBoardServices 0x0000000112a845e5 __31-[FBSSerialQueue performAsync:]_block_invoke_2 + 21
19 CoreFoundation 0x000000010ecfa41c __CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 12
20 CoreFoundation 0x000000010ecf0165 __CFRunLoopDoBlocks + 341
21 CoreFoundation 0x000000010eceff25 __CFRunLoopRun + 2389
22 CoreFoundation 0x000000010ecef366 CFRunLoopRunSpecific + 470
23 GraphicsServices 0x000000011071ca3e GSEventRunModal + 161
24 UIKit 0x000000010c2be900 UIApplicationMain + 1282
25 Tripsland 0x000000010ad63b4f main + 111
26 libdyld.dylib 0x000000010f465145 start + 1
27 ??? 0x0000000000000001 0x0 + 1
)
libc++abi.dylib: terminating with uncaught exception of type NSException
(lldb)
//------------------------------------------//
//               THANKS GUYS                 //
//         GREETINGS FROM BOLIVIA            //
//            ROCK ON!!!! n_n'               //
//------------------------------------------//
Please help!! Or is there an error in the CCShader.m file from the Cocos2D library? The error is generated in:
static GLint
CompileShader(GLenum type, const char *source)
{
    GLint shader = glCreateShader(type);

    const GLchar *sources[] = {
        CCShaderHeader,
        CCShaderTypeHeader(type),
        source,
    };

    glShaderSource(shader, 3, sources, NULL);
    glCompileShader(shader);

    NSCAssert(CCCheckShaderError(shader, GL_COMPILE_STATUS, glGetShaderiv, glGetShaderInfoLog), @"Error compiling shader");
    return shader;
}
FIRST EDIT: Thanks for the recommendation, user2242300.
The .h and .m files for CCShader are at:
cocos2d-library-iOS.xcodeproj/cocos2d/cocos2d/CCShader.h
cocos2d-library-iOS.xcodeproj/cocos2d/cocos2d/CCShader.m
The CCShader.h file contains:
#import <Foundation/Foundation.h>
#import "ccTypes.h"
#import "ccMacros.h"
#import "Platforms/CCGL.h"

/// Macro to embed GLSL source
#define CC_GLSL(x) @#x

extern const NSString *CCShaderUniformProjection;
extern const NSString *CCShaderUniformProjectionInv;
extern const NSString *CCShaderUniformViewSize;
extern const NSString *CCShaderUniformViewSizeInPixels;
extern const NSString *CCShaderUniformTime;
extern const NSString *CCShaderUniformSinTime;
extern const NSString *CCShaderUniformCosTime;
extern const NSString *CCShaderUniformRandom01;
extern const NSString *CCShaderUniformMainTexture;
extern const NSString *CCShaderUniformNormalMapTexture;
extern const NSString *CCShaderUniformAlphaTestValue;

@interface CCShader : NSObject<NSCopying>

@property(nonatomic, copy) NSString *debugName;

+(instancetype)shaderNamed:(NSString *)shaderName;

-(instancetype)initWithVertexShaderSource:(NSString *)vertexSource fragmentShaderSource:(NSString *)fragmentSource;
-(instancetype)initWithFragmentShaderSource:(NSString *)source;

+(instancetype)positionColorShader;
+(instancetype)positionTextureColorShader;
+(instancetype)positionTextureColorAlphaTestShader;
+(instancetype)positionTextureA8ColorShader;

@end
and CCShader.m contains:
#import "CCShader_private.h"
#import "ccMacros.h"
#import "Support/CCFileUtils.h"
#import "Support/uthash.h"
#import "CCRenderer_private.h"
#import "CCTexture_private.h"
#import "CCDirector.h"
#import "CCCache.h"
#import "CCGL.h"
enum {
CCAttributePosition,
CCAttributeTexCoord1,
CCAttributeTexCoord2,
CCAttributeColor,
};
const NSString *CCShaderUniformProjection = #"cc_Projection";
const NSString *CCShaderUniformProjectionInv = #"cc_ProjectionInv";
const NSString *CCShaderUniformViewSize = #"cc_ViewSize";
const NSString *CCShaderUniformViewSizeInPixels = #"cc_ViewSizeInPixels";
const NSString *CCShaderUniformTime = #"cc_Time";
const NSString *CCShaderUniformSinTime = #"cc_SinTime";
const NSString *CCShaderUniformCosTime = #"cc_CosTime";
const NSString *CCShaderUniformRandom01 = #"cc_Random01";
const NSString *CCShaderUniformMainTexture = #"cc_MainTexture";
const NSString *CCShaderUniformNormalMapTexture = #"cc_NormalMapTexture";
const NSString *CCShaderUniformAlphaTestValue = #"cc_AlphaTestValue";
// Stringify macros
#define STR(s) #s
#define XSTR(s) STR(s)
/*
main texture size points/pixels?
*/
static const GLchar *CCShaderHeader =
"#ifndef GL_ES\n"
"#define lowp\n"
"#define mediump\n"
"#define highp\n"
"#endif\n\n"
"uniform highp mat4 cc_Projection;\n"
"uniform highp mat4 cc_ProjectionInv;\n"
"uniform highp vec2 cc_ViewSize;\n"
"uniform highp vec2 cc_ViewSizeInPixels;\n"
"uniform highp vec4 cc_Time;\n"
"uniform highp vec4 cc_SinTime;\n"
"uniform highp vec4 cc_CosTime;\n"
"uniform highp vec4 cc_Random01;\n\n"
"uniform " XSTR(CC_SHADER_COLOR_PRECISION) " sampler2D cc_MainTexture;\n\n"
"uniform " XSTR(CC_SHADER_COLOR_PRECISION) " sampler2D cc_NormalMapTexture;\n\n"
"varying " XSTR(CC_SHADER_COLOR_PRECISION) " vec4 cc_FragColor;\n"
"varying highp vec2 cc_FragTexCoord1;\n"
"varying highp vec2 cc_FragTexCoord2;\n\n"
"// End Cocos2D shader header.\n\n";
static const GLchar *CCVertexShaderHeader =
"#ifdef GL_ES\n"
"precision highp float;\n\n"
"#endif\n\n"
"#define CC_NODE_RENDER_SUBPIXEL " XSTR(CC_NODE_RENDER_SUBPIXEL) "\n"
"attribute highp vec4 cc_Position;\n"
"attribute highp vec2 cc_TexCoord1;\n"
"attribute highp vec2 cc_TexCoord2;\n"
"attribute highp vec4 cc_Color;\n\n"
"// End Cocos2D vertex shader header.\n\n";
static const GLchar *CCFragmentShaderHeader =
"#ifdef GL_ES\n"
"precision " XSTR(CC_SHADER_DEFAULT_FRAGMENT_PRECISION) " float;\n"
"#endif\n\n"
"// End Cocos2D fragment shader header.\n\n";
static NSString *CCDefaultVShader =
    @"void main(){\n"
    @"  gl_Position = cc_Position;\n"
    @"#if !CC_NODE_RENDER_SUBPIXEL\n"
    @"  vec2 pixelPos = (0.5*gl_Position.xy/gl_Position.w + 0.5)*cc_ViewSizeInPixels;\n"
    @"  gl_Position.xy = (2.0*floor(pixelPos)/cc_ViewSizeInPixels - 1.0)*gl_Position.w;\n"
    @"#endif\n\n"
    @"  cc_FragColor = clamp(cc_Color, 0.0, 1.0);\n"
    @"  cc_FragTexCoord1 = cc_TexCoord1;\n"
    @"  cc_FragTexCoord2 = cc_TexCoord2;\n"
    @"}\n";
typedef void (* GetShaderivFunc) (GLuint shader, GLenum pname, GLint* param);
typedef void (* GetShaderInfoLogFunc) (GLuint shader, GLsizei bufSize, GLsizei* length, GLchar* infoLog);

static BOOL
CCCheckShaderError(GLint obj, GLenum status, GetShaderivFunc getiv, GetShaderInfoLogFunc getInfoLog)
{
    GLint success;
    getiv(obj, status, &success);

    if(!success){
        GLint length;
        getiv(obj, GL_INFO_LOG_LENGTH, &length);

        char *log = (char *)alloca(length);
        getInfoLog(obj, length, NULL, log);

        fprintf(stderr, "Shader compile error for 0x%04X: %s\n", status, log);
        return NO;
    } else {
        return YES;
    }
}
static const GLchar *
CCShaderTypeHeader(GLenum type)
{
    switch(type){
        case GL_VERTEX_SHADER: return CCVertexShaderHeader;
        case GL_FRAGMENT_SHADER: return CCFragmentShaderHeader;
        default: NSCAssert(NO, @"Bad shader type enumeration."); return NULL;
    }
}

static GLint
CompileShader(GLenum type, const char *source)
{
    GLint shader = glCreateShader(type);

    const GLchar *sources[] = {
        CCShaderHeader,
        CCShaderTypeHeader(type),
        source,
    };

    glShaderSource(shader, 3, sources, NULL);
    glCompileShader(shader);

    NSCAssert(CCCheckShaderError(shader, GL_COMPILE_STATUS, glGetShaderiv, glGetShaderInfoLog), @"Error compiling shader");
    return shader;
}
@interface CCShaderCache : CCCache @end

@implementation CCShaderCache

-(id)createSharedDataForKey:(id<NSCopying>)key
{
    NSString *shaderName = (NSString *)key;

    NSString *fragmentName = [shaderName stringByAppendingPathExtension:@"fsh"];
    NSString *fragmentPath = [[CCFileUtils sharedFileUtils] fullPathForFilename:fragmentName];
    NSAssert(fragmentPath, @"Failed to find '%@'.", fragmentName);
    NSString *fragmentSource = [NSString stringWithContentsOfFile:fragmentPath encoding:NSUTF8StringEncoding error:nil];

    NSString *vertexName = [shaderName stringByAppendingPathExtension:@"vsh"];
    NSString *vertexPath = [[CCFileUtils sharedFileUtils] fullPathForFilename:vertexName];
    NSString *vertexSource = (vertexPath ? [NSString stringWithContentsOfFile:vertexPath encoding:NSUTF8StringEncoding error:nil] : CCDefaultVShader);

    CCShader *shader = [[CCShader alloc] initWithVertexShaderSource:vertexSource fragmentShaderSource:fragmentSource];
    shader.debugName = shaderName;

    return shader;
}

-(id)createPublicObjectForSharedData:(id)data
{
    return [data copy];
}

@end
@implementation CCShader {
    BOOL _ownsProgram;
}

+(GLuint)createVAOforCCVertexBuffer:(GLuint)vbo elementBuffer:(GLuint)ebo
{
    glPushGroupMarkerEXT(0, "CCShader: Creating vertex buffer");

    GLuint vao = 0;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    glEnableVertexAttribArray(CCAttributePosition);
    glEnableVertexAttribArray(CCAttributeTexCoord1);
    glEnableVertexAttribArray(CCAttributeTexCoord2);
    glEnableVertexAttribArray(CCAttributeColor);

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(CCAttributePosition, 4, GL_FLOAT, GL_FALSE, sizeof(CCVertex), (void *)offsetof(CCVertex, position));
    glVertexAttribPointer(CCAttributeTexCoord1, 2, GL_FLOAT, GL_FALSE, sizeof(CCVertex), (void *)offsetof(CCVertex, texCoord1));
    glVertexAttribPointer(CCAttributeTexCoord2, 2, GL_FLOAT, GL_FALSE, sizeof(CCVertex), (void *)offsetof(CCVertex, texCoord2));
    glVertexAttribPointer(CCAttributeColor, 4, GL_FLOAT, GL_FALSE, sizeof(CCVertex), (void *)offsetof(CCVertex, color));

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);

    glBindVertexArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

    glPopGroupMarkerEXT();

    return vao;
}
//MARK: Uniform Setters:

static CCUniformSetter
SetFloat(NSString *name, GLint location)
{
    return ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
        NSNumber *value = shaderUniforms[name] ?: globalShaderUniforms[name] ?: @(0.0);
        NSCAssert([value isKindOfClass:[NSNumber class]], @"Shader uniform '%@' value must be wrapped in a NSNumber.", name);

        glUniform1f(location, value.floatValue);
    };
}

static CCUniformSetter
SetVec2(NSString *name, GLint location)
{
    NSString *textureName = nil;
    bool pixelSize = [name hasSuffix:@"PixelSize"];
    if(pixelSize){
        textureName = [name substringToIndex:name.length - @"PixelSize".length];
    } else if([name hasSuffix:@"Size"]){
        textureName = [name substringToIndex:name.length - @"Size".length];
    }

    return ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
        NSValue *value = shaderUniforms[name] ?: globalShaderUniforms[name];

        // Fall back on looking up the actual texture size if the name matches a texture.
        if(value == nil && textureName){
            CCTexture *texture = shaderUniforms[textureName] ?: globalShaderUniforms[textureName];
            GLKVector2 sizeInPixels = GLKVector2Make(texture.pixelWidth, texture.pixelHeight);
            GLKVector2 size = GLKVector2MultiplyScalar(sizeInPixels, pixelSize ? 1.0 : 1.0/texture.contentScale);
            value = [NSValue valueWithGLKVector2:size];
        }

        // Finally fall back on 0.
        if(value == nil) value = [NSValue valueWithGLKVector2:GLKVector2Make(0.0f, 0.0f)];

        NSCAssert([value isKindOfClass:[NSValue class]], @"Shader uniform '%@' value must be wrapped in a NSValue.", name);

        if(strcmp(value.objCType, @encode(GLKVector2)) == 0){
            GLKVector2 v; [value getValue:&v];
            glUniform2f(location, v.x, v.y);
        } else if(strcmp(value.objCType, @encode(CGPoint)) == 0){
            CGPoint v = {}; [value getValue:&v];
            glUniform2f(location, v.x, v.y);
        } else if(strcmp(value.objCType, @encode(CGSize)) == 0){
            CGSize v = {}; [value getValue:&v];
            glUniform2f(location, v.width, v.height);
        } else {
            NSCAssert(NO, @"Shader uniform 'vec2 %@' value must be passed using [NSValue valueWithGLKVector2:], [NSValue valueWithCGPoint:], or [NSValue valueWithCGSize:]", name);
        }
    };
}

static CCUniformSetter
SetVec3(NSString *name, GLint location)
{
    return ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
        NSValue *value = shaderUniforms[name] ?: globalShaderUniforms[name] ?: [NSValue valueWithGLKVector3:GLKVector3Make(0.0f, 0.0f, 0.0f)];
        NSCAssert([value isKindOfClass:[NSValue class]], @"Shader uniform '%@' value must be wrapped in a NSValue.", name);
        NSCAssert(strcmp(value.objCType, @encode(GLKVector3)) == 0, @"Shader uniform 'vec3 %@' value must be passed using [NSValue valueWithGLKVector3:]", name);

        GLKVector3 v; [value getValue:&v];
        glUniform3f(location, v.x, v.y, v.z);
    };
}

static CCUniformSetter
SetVec4(NSString *name, GLint location)
{
    return ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
        NSValue *value = shaderUniforms[name] ?: globalShaderUniforms[name] ?: [NSValue valueWithGLKVector4:GLKVector4Make(0.0f, 0.0f, 0.0f, 1.0f)];

        if([value isKindOfClass:[NSValue class]]){
            NSCAssert(strcmp([(NSValue *)value objCType], @encode(GLKVector4)) == 0, @"Shader uniform 'vec4 %@' value must be passed using [NSValue valueWithGLKVector4:].", name);

            GLKVector4 v; [value getValue:&v];
            glUniform4f(location, v.x, v.y, v.z, v.w);
        } else if([value isKindOfClass:[CCColor class]]){
            GLKVector4 v = [(CCColor *)value glkVector4];
            glUniform4f(location, v.x, v.y, v.z, v.w);
        } else {
            NSCAssert(NO, @"Shader uniform 'vec4 %@' value must be passed using [NSValue valueWithGLKVector4:] or a CCColor object.", name);
        }
    };
}

static CCUniformSetter
SetMat4(NSString *name, GLint location)
{
    return ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
        NSValue *value = shaderUniforms[name] ?: globalShaderUniforms[name] ?: [NSValue valueWithGLKMatrix4:GLKMatrix4Identity];
        NSCAssert([value isKindOfClass:[NSValue class]], @"Shader uniform '%@' value must be wrapped in a NSValue.", name);
        NSCAssert(strcmp(value.objCType, @encode(GLKMatrix4)) == 0, @"Shader uniform 'mat4 %@' value must be passed using [NSValue valueWithGLKMatrix4:]", name);

        GLKMatrix4 m; [value getValue:&m];
        glUniformMatrix4fv(location, 1, GL_FALSE, m.m);
    };
}
-(NSDictionary *)uniformSettersForProgram:(GLuint)program
{
    NSMutableDictionary *uniformSetters = [NSMutableDictionary dictionary];

    glUseProgram(program);

    GLint count = 0;
    glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);

    int textureUnit = 0;
    for(int i=0; i<count; i++){
        GLchar cname[256];
        GLsizei length = 0;
        GLsizei size = 0;
        GLenum type = 0;

        glGetActiveUniform(program, i, sizeof(cname), &length, &size, &type, cname);
        NSAssert(size == 1, @"Uniform arrays not supported. (yet?)");

        NSString *name = @(cname);
        GLint location = glGetUniformLocation(program, cname);

        // Setup a block that is responsible for binding that uniform variable's value.
        switch(type){
            default: NSAssert(NO, @"Uniform type not supported. (yet?)");
            case GL_FLOAT: uniformSetters[name] = SetFloat(name, location); break;
            case GL_FLOAT_VEC2: uniformSetters[name] = SetVec2(name, location); break;
            case GL_FLOAT_VEC3: uniformSetters[name] = SetVec3(name, location); break;
            case GL_FLOAT_VEC4: uniformSetters[name] = SetVec4(name, location); break;
            case GL_FLOAT_MAT4: uniformSetters[name] = SetMat4(name, location); break;
            case GL_SAMPLER_2D: {
                // Sampler setters are handled differently since the real work is binding the texture and not setting the uniform value.
                uniformSetters[name] = ^(CCRenderer *renderer, NSDictionary *shaderUniforms, NSDictionary *globalShaderUniforms){
                    CCTexture *texture = shaderUniforms[name] ?: globalShaderUniforms[name] ?: [CCTexture none];
                    NSAssert([texture isKindOfClass:[CCTexture class]], @"Shader uniform '%@' value must be a CCTexture object.", name);

                    // Bind the texture to the texture unit for the uniform.
                    glActiveTexture(GL_TEXTURE0 + textureUnit);
                    glBindTexture(GL_TEXTURE_2D, texture.name);
                };

                // Bind the texture unit at init time.
                glUniform1i(location, textureUnit);
                textureUnit++;
            }
        }
    }

    return uniformSetters;
}
//MARK: Init Methods:

-(instancetype)initWithProgram:(GLuint)program uniformSetters:(NSDictionary *)uniformSetters ownsProgram:(BOOL)ownsProgram
{
    if((self = [super init])){
        _program = program;
        _uniformSetters = uniformSetters;
        _ownsProgram = ownsProgram;
    }

    return self;
}

-(instancetype)initWithVertexShaderSource:(NSString *)vertexSource fragmentShaderSource:(NSString *)fragmentSource
{
    glPushGroupMarkerEXT(0, "CCShader: Init");

    GLuint program = glCreateProgram();
    glBindAttribLocation(program, CCAttributePosition, "cc_Position");
    glBindAttribLocation(program, CCAttributeTexCoord1, "cc_TexCoord1");
    glBindAttribLocation(program, CCAttributeTexCoord2, "cc_TexCoord2");
    glBindAttribLocation(program, CCAttributeColor, "cc_Color");

    GLint vshader = CompileShader(GL_VERTEX_SHADER, vertexSource.UTF8String);
    glAttachShader(program, vshader);

    GLint fshader = CompileShader(GL_FRAGMENT_SHADER, fragmentSource.UTF8String);
    glAttachShader(program, fshader);

    glLinkProgram(program);
    NSCAssert(CCCheckShaderError(program, GL_LINK_STATUS, glGetProgramiv, glGetProgramInfoLog), @"Error linking shader program");

    glDeleteShader(vshader);
    glDeleteShader(fshader);

    glPopGroupMarkerEXT();

    return [self initWithProgram:program uniformSetters:[self uniformSettersForProgram:program] ownsProgram:YES];
}

-(instancetype)initWithFragmentShaderSource:(NSString *)source
{
    return [self initWithVertexShaderSource:CCDefaultVShader fragmentShaderSource:source];
}

- (void)dealloc
{
    CCLOGINFO( @"cocos2d: deallocing %@", self);

    if(_ownsProgram && _program) glDeleteProgram(_program);
}

-(instancetype)copyWithZone:(NSZone *)zone
{
    return [[CCShader allocWithZone:zone] initWithProgram:_program uniformSetters:_uniformSetters ownsProgram:NO];
}

static CCShaderCache *CC_SHADER_CACHE = nil;
static CCShader *CC_SHADER_POS_COLOR = nil;
static CCShader *CC_SHADER_POS_TEX_COLOR = nil;
static CCShader *CC_SHADER_POS_TEXA8_COLOR = nil;
static CCShader *CC_SHADER_POS_TEX_COLOR_ALPHA_TEST = nil;

+(void)initialize
{
    // +initialize may be called due to loading a subclass.
    if(self != [CCShader class]) return;

    CC_SHADER_CACHE = [[CCShaderCache alloc] init];

    // Setup the builtin shaders.
    CC_SHADER_POS_COLOR = [[self alloc] initWithFragmentShaderSource:@"void main(){gl_FragColor = cc_FragColor;}"];
    CC_SHADER_POS_COLOR.debugName = @"CCPositionColorShader";

    CC_SHADER_POS_TEX_COLOR = [[self alloc] initWithFragmentShaderSource:@"void main(){gl_FragColor = cc_FragColor*texture2D(cc_MainTexture, cc_FragTexCoord1);}"];
    CC_SHADER_POS_TEX_COLOR.debugName = @"CCPositionTextureColorShader";

    CC_SHADER_POS_TEXA8_COLOR = [[self alloc] initWithFragmentShaderSource:@"void main(){gl_FragColor = cc_FragColor*texture2D(cc_MainTexture, cc_FragTexCoord1).a;}"];
    CC_SHADER_POS_TEXA8_COLOR.debugName = @"CCPositionTextureA8ColorShader";

    CC_SHADER_POS_TEX_COLOR_ALPHA_TEST = [[self alloc] initWithFragmentShaderSource:CC_GLSL(
        uniform float cc_AlphaTestValue;
        void main(){
            vec4 tex = texture2D(cc_MainTexture, cc_FragTexCoord1);
            if(tex.a <= cc_AlphaTestValue) discard;
            gl_FragColor = cc_FragColor*tex;
        }
    )];
    CC_SHADER_POS_TEX_COLOR_ALPHA_TEST.debugName = @"CCPositionTextureColorAlphaTestShader";
}

+(instancetype)positionColorShader
{
    return CC_SHADER_POS_COLOR;
}

+(instancetype)positionTextureColorShader
{
    return CC_SHADER_POS_TEX_COLOR;
}

+(instancetype)positionTextureColorAlphaTestShader
{
    return CC_SHADER_POS_TEX_COLOR_ALPHA_TEST;
}

+(instancetype)positionTextureA8ColorShader
{
    return CC_SHADER_POS_TEXA8_COLOR;
}

+(instancetype)shaderNamed:(NSString *)shaderName
{
    return [CC_SHADER_CACHE objectForKey:shaderName];
}

@end
You can get the example code from cocos3d: the CC3DemoMultiScene in the Projects folder.
SECOND EDIT:
I read several responses on Stack Overflow and in this forum: Cocos 2D Forum. I had to reinstall Xcode (I don't know what happened), but now it no longer shows me the CCShader error; instead it shows:
OpenGL error GL_INVALID_OPERATION detected at -[CCRenderer(NoARCPrivate) setRenderState:] 232
What is the best way to show one Cocos3D scene in two different view controllers?
Thanks again!!
After many searches and investigations: the reason this causes trouble is that the AppDelegate applicationDidBecomeActive: method (which references the CCDirector.sharedDirector singleton) is invoked before the MainViewController createGLView method (which initializes the CCDirector.sharedDirector singleton).
The reference to the uninitialized CCDirector.sharedDirector singleton in AppDelegate applicationDidBecomeActive: triggers a whole bunch of Cocos2D OpenGL ES initialization before the OpenGL context has been set up correctly, causing a bunch of shader loading activity to fail.
The best way to fix this is to ensure that the code found in the MainViewController createGLView method runs before any references to the CCDirector.sharedDirector singleton are made elsewhere.
You can also side-step the problem in the short term by commenting out the code in the AppDelegate applicationDidBecomeActive: method, but that might affect the behaviour of your app as it moves in and out of the background.
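For example, a minimal sketch of guarding the AppDelegate method (the directorIsReady flag is hypothetical; MainViewController would set it once createGLView has run):

-(void) applicationDidBecomeActive: (UIApplication*) application {
    // Don't touch CCDirector.sharedDirector until the GL view exists,
    // otherwise its lazy initialization runs without a valid OpenGL context.
    if (self.directorIsReady) {
        [CCDirector.sharedDirector resume];
    }
}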
This matches the answers from @Bill Hollings to these questions:
Error CompileShader in Xcode 7.1.1 on iOS 9? and
Error cocos3d + Storyboard?
I configured and tried an example, DennisCocos3D Prueba (Dropbox).
Or try this in the MainViewController .m file:
-(CCGLView*) createGLView {
    // Create the view first, since it creates the GL context, which CCDirector expects during init.
    CCGLView* glView = [CCGLView viewWithFrame: _cc3FrameView.bounds
                                   pixelFormat: kEAGLColorFormatRGBA8
                                   depthFormat: GL_DEPTH24_STENCIL8   // Shadow volumes require a stencil
                            preserveBackbuffer: NO
                               numberOfSamples: 1];

    CCDirector* director = CCDirector.sharedDirector;
    director.animationInterval = (1.0f / kAnimationFrameRate);
    director.displayStats = YES;
    director.view = glView;

    // Run the initial static 2D intro scene, or replace the scene if one is
    // already running (calling runWithScene: twice raises an assertion, so
    // the unconditional call the original made here has been removed).
    if(![[CCDirector sharedDirector] runningScene])
    {
        [[CCDirector sharedDirector] runWithScene:[[self makeIntroScene] asCCScene]];
    }
    else
    {
        [[CCDirector sharedDirector] startAnimation];
        [[CCDirector sharedDirector] replaceScene:[[self makeIntroScene] asCCScene]];
    }
    return glView;
}

AVAssetWriter sometimes fails with status AVAssetWriterStatusFailed. Seems random

I'm writing an MP4 video file with an AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor.
The source is a video from a UIImagePickerController, either freshly captured with the camera or taken from the asset library. Quality right now is UIImagePickerControllerQualityTypeMedium.
Sometimes the writer fails. Its status is AVAssetWriterStatusFailed and the AVAssetWriter object's error property is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210),
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)",
NSLocalizedDescription=The operation could not be completed
The error occurs approximately 20% of the times the code is run. It seems to fail more frequently on iPhone 4/4S than on iPhone 5.
It also occurs more frequently if the source video quality is higher:
using UIImagePickerControllerQualityTypeLow the error doesn't happen so often, while using UIImagePickerControllerQualityTypeHigh it happens a little more frequently.
I have also noticed something else:
it seems to come in waves. When it fails, the following runs will often fail too, even though I delete the app and reinstall it. That leaves me wondering whether my program leaks some memory, and whether that memory could stay alive even after the app gets killed (is that even possible?).
Here is the code I use to render my video:
- (void)writeVideo
{
    offlineRenderingInProgress = YES;

    /* --- Writer Setup --- */

    [locationQueue cancelAllOperations];

    [self stopWithoutRewinding];

    NSError *writerError = nil;

    BOOL success;
    success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];
    // DLog(@"Url: %@, success: %i, error: %@", self.outputURL, success, fileError);

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
    //writer.shouldOptimizeForNetworkUse = NO;

    if (writerError) {
        DLog(@"Writer error: %@", writerError);
        return;
    }

    float bitsPerPixel;
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
    int numPixels = dimensions.width * dimensions.height;
    int bitsPerSecond;

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
    if ( numPixels < (640 * 480) )
        bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
    else
        bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.

    bitsPerSecond = numPixels * bitsPerPixel;

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey,
                                              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
                                              [NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
                                               nil], AVVideoCompressionPropertiesKey,
                                              nil];
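    // Note (a sketch, not in the original question): bitsPerSecond computed
    // above is never actually used. Presumably it was meant to be added to
    // the compression properties dictionary, e.g.:
    //   [NSNumber numberWithInt:bitsPerSecond], AVVideoAverageBitRateKey,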
    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    writerVideoInput.transform = movie.preferredTransform;
    writerVideoInput.expectsMediaDataInRealTime = YES;
    [writer addInput:writerVideoInput];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
                                                                                          sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    BOOL couldStart = [writer startWriting];
    if (!couldStart) {
        DLog(@"Could not start AVAssetWriter!");
        abort = YES;
        [locationQueue cancelAllOperations];
        return;
    }

    [self configureFilters];

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    if (!self.canEdit) {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
    } else {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
    }

    CMTime startOffset = reader.timeRange.start;
    DLog(@"startOffset: %llu", startOffset.value);

    [self.thumbnailEditView removeFromSuperview];
    // self.thumbnailEditView = nil;

    [glLayer removeFromSuperlayer];
    glLayer = nil;

    [playerView removeFromSuperview];
    playerView = nil;

    glContext = nil;
    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{
        @try {
            BOOL didWriteSomething = NO;

            DLog(@"Preparing to write...");

            while ([writerVideoInput isReadyForMoreMediaData]) {
                if (abort) {
                    NSLog(@"Abort == YES");
                    [locationQueue cancelAllOperations];
                    [writerVideoInput markAsFinished];
                    videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                }

                if (writer.status == AVAssetWriterStatusFailed) {
                    DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error);
                    [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"];
                    [[NSUserDefaults standardUserDefaults] synchronize];

                    abort = YES;
                    [locationQueue cancelAllOperations];
                    videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                    return;
                    DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]); // unreachable after the return above
                }

                DLog(@"Writing started...");

                CMSampleBufferRef buffer = nil;

                if (reader.status != AVAssetReaderStatusUnknown) {
                    if (reader.status == AVAssetReaderStatusReading) {
                        buffer = [readerVideoOutput copyNextSampleBuffer];
                        if (didWriteSomething == NO) {
                            DLog(@"Copying sample buffers...");
                        }
                    }

                    if (!buffer) {
                        [writerVideoInput markAsFinished];

                        DLog(@"Finished...");

                        CGColorSpaceRelease(colorSpace);

                        [self offlineRenderingDidFinish];

                        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                            [writer finishWriting];
                            if (writer.error != nil) {
                                DLog(@"Error: %@", writer.error);
                            } else {
                                DLog(@"Success!");
                            }

                            if (writer.status == AVAssetWriterStatusCompleted) {
                                videoConvertCompletionBlock(YES, nil);
                            }
                            else {
                                abort = YES;
                                videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                            }
                        });

                        return;
                    }

                    didWriteSomething = YES;
                }
                else {
                    DLog(@"Still waiting...");
                    // Reader just needs a moment to get ready...
                    continue;
                }

                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
                if (pixelBuffer == NULL) {
                    DLog(@"Pixelbuffer == NULL");
                    continue;
                }

                //DLog(@"Sample callback! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));
                //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];
                CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];
                CIImage *outputImage = [self filteredImageWithImage:ciimage];

                CVPixelBufferRef outPixelBuffer = NULL;
                CVReturn status;

                CFDictionaryRef empty; // empty value for attr value.
                CFMutableDictionaryRef attrs;
                empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                                           NULL,
                                           NULL,
                                           0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);

                attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                  1,
                                                  &kCFTypeDictionaryKeyCallBacks,
                                                  &kCFTypeDictionaryValueCallBacks);

                CFDictionarySetValue(attrs,
                                     kCVPixelBufferIOSurfacePropertiesKey,
                                     empty);

                CFDictionarySetValue(attrs,
                                     kCVPixelBufferCGImageCompatibilityKey,
                                     (__bridge const void *)([NSNumber numberWithBool:YES]));

                CFDictionarySetValue(attrs,
                                     kCVPixelBufferCGBitmapContextCompatibilityKey,
                                     (__bridge const void *)([NSNumber numberWithBool:YES]));

                status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);

                //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));
                if (status != kCVReturnSuccess) {
                    DLog(@"Couldn't allocate output pixelBufferRef!");
                    continue;
                }

                [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];

                CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
                CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
                CMTime duration = reader.timeRange.duration;
                if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
                    duration = movie.duration;
                }
                CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);

                float durationFloat = (float)durationConverted.value;
                float progress = ((float) currentTime.value) / durationFloat;

                //DLog(@"duration: %f, progress: %f", durationFloat, progress);

                [self updateOfflineRenderProgress:progress];

                if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
                    [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
                } else {
                    continue;
                }

                if (writer.status == AVAssetWriterStatusWriting) {
                    DLog(@"Writer.status: AVAssetWriterStatusWriting");
                }

                CFRelease(buffer);
                CVPixelBufferRelease(outPixelBuffer);
            }
        }
        @catch (NSException *exception) {
            DLog(@"Catching exception: %@", exception);
        }
    }];
}
OK, I think I solved it myself. The bad guy was this line:
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ....
The global queue I was passing is a concurrent queue, which allows a new callback to be made before the previous one has finished. The asset writer is not designed to be written to from more than one thread at a time.
Creating and using a new serial queue seems to remedy the problem:
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...

Function that sends an image to AirPrint

I'm trying to find a function that lets me print using AirPrint.
I have a button, btnPrint, that when pressed should print myPic.jpg to the default AirPrint device. But I cannot figure out whether there even is such a function.
I cannot find a lot of documentation on AirPrint in Xcode.
Apple has documentation on printing that would probably benefit you.
The following is from Objective-C code for AirPrint:
Check whether printing is available:
if ([UIPrintInteractionController isPrintingAvailable])
{
    // Available
} else {
    // Not Available
}
Print after button click:
-(IBAction) buttonClicked: (id) sender
{
    NSMutableString *printBody = [NSMutableString stringWithFormat:@"%@, %@", self.encoded.text, self.decoded.text];
    [printBody appendFormat:@"\n\n\n\nPrinted From *myapp*"];

    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
    pic.delegate = self;

    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputGeneral;
    printInfo.jobName = self.titleLabel.text;
    pic.printInfo = printInfo;

    UISimpleTextPrintFormatter *textFormatter = [[UISimpleTextPrintFormatter alloc] initWithText:printBody];
    textFormatter.startPage = 0;
    textFormatter.contentInsets = UIEdgeInsetsMake(72.0, 72.0, 72.0, 72.0); // 1 inch margins
    textFormatter.maximumContentWidth = 6 * 72.0;
    pic.printFormatter = textFormatter;
    [textFormatter release];
    pic.showsPageRange = YES;

    void (^completionHandler)(UIPrintInteractionController *, BOOL, NSError *) =
    ^(UIPrintInteractionController *printController, BOOL completed, NSError *error) {
        if (!completed && error) {
            NSLog(@"Printing could not complete because of error: %@", error);
        }
    };

    [pic presentFromBarButtonItem:self.rightButton animated:YES completionHandler:completionHandler];
}
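Since the question asks about printing an image rather than text, here is a minimal sketch of that variant (the btnPrintPressed action name is hypothetical; the printingItem property accepts a UIImage directly):

-(IBAction) btnPrintPressed: (id) sender
{
    UIImage *image = [UIImage imageNamed:@"myPic.jpg"];

    UIPrintInteractionController *pic = [UIPrintInteractionController sharedPrintController];
    UIPrintInfo *printInfo = [UIPrintInfo printInfo];
    printInfo.outputType = UIPrintInfoOutputPhoto;
    printInfo.jobName = @"myPic";
    pic.printInfo = printInfo;
    pic.printingItem = image; // UIImage, NSData, or an NSURL of printable content

    [pic presentAnimated:YES completionHandler:^(UIPrintInteractionController *pc, BOOL completed, NSError *error) {
        if (!completed && error) {
            NSLog(@"Printing failed: %@", error);
        }
    }];
}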

Create an array of UIImages from camera roll

I would like to get all of the images from the camera roll and create an array of UIImages from them.
I have been trying to figure out how to do this for about a day now and I've gotten nowhere. I can't seem to figure out how to retrieve only items from the Camera Roll; all of the samples I've seen enumerate over all of the photo albums. I might be wrong about that, though.
Any help would be appreciated. Thanks!
Have you tried ALAssetsLibrary? Like this:

assets = [[NSMutableArray alloc] init]; // Prepare array to hold the images retrieved by the Assets Library.

void (^assetEnumerator)(ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *asset, NSUInteger index, BOOL *stop) {
    if(asset != NULL) {
        [assets addObject:asset];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self insertArray];
        });
    }
};

void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
    if(group != nil) {
        [group enumerateAssetsUsingBlock:assetEnumerator];
    }
};

// Create an instance of the Assets Library.
library = [[ALAssetsLibrary alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos // Retrieve only the images saved in the Camera Roll.
                       usingBlock:assetGroupEnumerator
                     failureBlock: ^(NSError *error) {
                         NSLog(@"Failed.");
                     }];
That'll nab them. Then do this to render them (this is wicked hacky; you'll want to import them as needed, not all at once like this, or you'll run into memory issues and crash):
-(void) insertArray {
    int i = assetCount++;
    if (i > 20) {
        return;
    }

    ALAssetRepresentation *rep = [[assets objectAtIndex:i] defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    CGSize cSize = CGSizeMake(75, 75);

    UIImage *largeimage = [UIImage imageWithCGImage:iref];
    UIImage *resizedimage = (UIImage *)[largeimage resizedImage:cSize interpolationQuality:kCGInterpolationHigh];

    UIImageView *newView = [[UIImageView alloc] initWithImage:resizedimage];
    if((i > 0) && (i % 4 == 0)){
        rowCount++;
    }
    colCount = i % 4;

    newView.frame = CGRectMake(4+(colCount*(75+4)), 4+(rowCount*(75+4)), 75, 75);
    [sv addSubview:newView];
    [sv setContentSize:CGSizeMake(320, 85+(rowCount*(75+4)))];
    NSLog(@"sv frame size is %@ and i is %i", NSStringFromCGRect(sv.frame), i);
}