How to use CFDictionaryCreate in Swift 3? - core-graphics

I want to draw an attributed string with a clipping rect, and I found a function written in Objective-C:
CFDictionaryRef PINCHFrameAttributesCreateWithClippingRect(CGRect clippingRect, CGAffineTransform transform)
{
    CGPathRef clipPath = CGPathCreateWithRect(clippingRect, &transform);
    CFDictionaryRef options;

    CFStringRef keys[] = {kCTFramePathClippingPathAttributeName};
    CFTypeRef values[] = {clipPath};
    CFDictionaryRef clippingPathDict = CFDictionaryCreate(NULL,
                                                          (const void **)&keys, (const void **)&values,
                                                          sizeof(keys) / sizeof(keys[0]),
                                                          &kCFTypeDictionaryKeyCallBacks,
                                                          &kCFTypeDictionaryValueCallBacks);

    CFTypeRef clippingArrayValues[] = { clippingPathDict };
    CFArrayRef clippingPaths = CFArrayCreate(NULL, (const void **)clippingArrayValues,
                                             sizeof(clippingArrayValues) / sizeof(clippingArrayValues[0]),
                                             &kCFTypeArrayCallBacks);

    CFStringRef optionsKeys[] = {kCTFrameClippingPathsAttributeName};
    CFTypeRef optionsValues[] = {clippingPaths};
    options = CFDictionaryCreate(NULL, (const void **)&optionsKeys, (const void **)&optionsValues,
                                 sizeof(optionsKeys) / sizeof(optionsKeys[0]),
                                 &kCFTypeDictionaryKeyCallBacks,
                                 &kCFTypeDictionaryValueCallBacks);

    CFRelease(clippingPathDict);
    CFRelease(clippingPaths);
    CGPathRelease(clipPath);

    return options;
}
But I failed to translate it into Swift 3. Can anybody help me?

I tried the code below; it compiles and runs, but the text doesn't get clipped:
let keys: [CFString] = [kCTFrameClippingPathsAttributeName]
let values: [CFTypeRef] = [clipPath]
let keysPointer = UnsafeMutablePointer<UnsafeRawPointer?>.allocate(capacity: keys.count)
keysPointer.initialize(to: keys)
let valuesPointer = UnsafeMutablePointer<UnsafeRawPointer?>.allocate(capacity: values.count)
valuesPointer.initialize(to: values)
let options = CFDictionaryCreate(kCFAllocatorDefault, keysPointer, valuesPointer, 1, nil, nil)
let framesetter = CTFramesetterCreateWithAttributedString(attributedSring)
let frame = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, attributedSring.length), rectPath.cgPath, options)
CTFrameDraw(frame, context)

Construct a Swift dictionary and then cast it to CFDictionary. Example:
let d = [
    kCGImageSourceShouldAllowFloat as String : true as NSNumber,
    kCGImageSourceCreateThumbnailWithTransform as String : true as NSNumber,
    kCGImageSourceCreateThumbnailFromImageAlways as String : true as NSNumber,
    kCGImageSourceThumbnailMaxPixelSize as String : w as NSNumber
] as CFDictionary
More details here: https://bugs.swift.org/browse/SR-2388
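Applied to the clipping case in the question, a rough Swift 3 sketch might look like the following (untested; `clippingRect`, `attributedString` (an NSAttributedString), `rectPath` (a UIBezierPath) and `context` are assumed to exist, matching the names in the question's snippet). The important part is the nesting: the value for kCTFrameClippingPathsAttributeName is an array of dictionaries, each holding one CGPath under kCTFramePathClippingPathAttributeName, just like the Objective-C version.
var transform = CGAffineTransform.identity
let clipPath = CGPath(rect: clippingRect, transform: &transform)

// One dictionary per clipping path...
let clippingPathDict = [kCTFramePathClippingPathAttributeName as String: clipPath] as CFDictionary
// ...wrapped in an array under kCTFrameClippingPathsAttributeName.
let options = [kCTFrameClippingPathsAttributeName as String: [clippingPathDict]] as CFDictionary

let framesetter = CTFramesetterCreateWithAttributedString(attributedString as CFAttributedString)
let frame = CTFramesetterCreateFrame(framesetter,
                                     CFRangeMake(0, attributedString.length),
                                     rectPath.cgPath,
                                     options)
CTFrameDraw(frame, context)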

Related

Converting SHA1 string encryption code from objective-c to swift

So this is the original Objective-C code I have:
- (NSString *)calculateSignaturewithAPIKey:(NSString *)apiKey apiKeyPrivate:(NSString *)apiKeyPrivate httpMethod:(NSString *)httpMethod route:(NSString *)theRoute andExpiresIn:(NSString *)expireTime {
    NSString *string_to_sign = [NSString stringWithFormat:@"%@:%@:%@:%@", apiKey, httpMethod, theRoute, expireTime];
    const char *cKey = [apiKeyPrivate cStringUsingEncoding:NSASCIIStringEncoding];
    const char *cData = [string_to_sign cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *signature = [HMAC base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
    return signature;
}
In Swift 3 I managed to get as far as:
func calculateSignature(withPublicApiKey publicApiKey: String, andApiPrivateKey privateApiKey: String, withHttpMethod httpMethod: String, andRoute route: String, exiresIn expireTime: String) -> String {
let string_to_sign = "\(publicApiKey):\(httpMethod):\(route):\(expireTime)"
let cKey = privateApiKey.cString(using: String.Encoding.ascii)
let cData = Data.base64EncodedString(Data.init)
var cHMAC = [CUnsignedChar](repeating: 0, count: Int(CC_SHA1_DIGEST_LENGTH))
But I don't know how to proceed from here. I have been able to import the Crypto-related things into my Swift project. Please assist.
Try this:
func calculateSignature(withPublicApiKey publicApiKey: String, andApiPrivateKey privateApiKey: String, withHttpMethod httpMethod: String, andRoute route: String, exiresIn expireTime: String) -> String {
    let string_to_sign = "\(publicApiKey):\(httpMethod):\(route):\(expireTime)"
    let cKey = privateApiKey.cString(using: .ascii)!
    let cData = string_to_sign.cString(using: .ascii)!
    var cHMAC = [CUnsignedChar](repeating: 0, count: Int(CC_SHA1_DIGEST_LENGTH))
    // cString(using:) includes the trailing NUL byte, hence the "- 1"
    CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1), cKey, cKey.count - 1, cData, cData.count - 1, &cHMAC)
    let HMAC = Data(bytes: &cHMAC, count: Int(CC_SHA1_DIGEST_LENGTH))
    return HMAC.base64EncodedString(options: .lineLength64Characters)
}
My personal experience is that Swift really hates pointers. If your code makes heavy use of pointers, it's easier to write it in C/Obj-C.
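If you want to avoid the C-string/NUL-terminator bookkeeping altogether, here is a rough Data-based sketch of the same HMAC-SHA1 signature (untested; it assumes CommonCrypto is already visible to Swift, e.g. via a bridging header, and force-unwraps for brevity). It should produce the same Base64 string as the version above.
func calculateSignatureWithData(publicApiKey: String, privateApiKey: String,
                                httpMethod: String, route: String,
                                expiresIn expireTime: String) -> String {
    let stringToSign = "\(publicApiKey):\(httpMethod):\(route):\(expireTime)"
    let keyData = privateApiKey.data(using: .ascii)!
    let messageData = stringToSign.data(using: .ascii)!
    var digest = [UInt8](repeating: 0, count: Int(CC_SHA1_DIGEST_LENGTH))

    // Data's count is already the byte count, so there is no "- 1" to remember.
    keyData.withUnsafeBytes { (keyBytes: UnsafePointer<UInt8>) -> Void in
        messageData.withUnsafeBytes { (messageBytes: UnsafePointer<UInt8>) -> Void in
            CCHmac(CCHmacAlgorithm(kCCHmacAlgSHA1),
                   keyBytes, keyData.count,
                   messageBytes, messageData.count,
                   &digest)
        }
    }
    return Data(bytes: &digest, count: digest.count).base64EncodedString(options: .lineLength64Characters)
}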

How to pass const void *bytes to an Objective-C block from Swift

I'm writing some Swift (1.2) code that calls an Objective-C library (transcribing some earlier Objective-C code). It's not clear how it's supposed to work. I've tried a bunch of variations, and nothing satisfies the compiler. Here's the Objective-C code I started with:
// Objective-C
NSData *fileData = //
const void *bytes = fileData.bytes;
unsigned int bufferSize = 1024; // Arbitrary

[object writeDataWithBlock:
    ^BOOL(BOOL(^writeData)(const void *bytes, unsigned int length), NSError **actionError) {
        for (NSUInteger offset = 0; offset <= fileData.length; offset += bufferSize) {
            unsigned int size = (unsigned int)MIN(fileData.length - offset, bufferSize);
            writeData(&bytes[offset], size);
        }
        return YES;
    }];
So far, I've gotten as far as this Swift code:
// Swift
let fileData = //
let ptr = UnsafePointer<UInt8>(fileData.bytes)
let bufferSize = 1024 // Arbitrary

object.writeDataWithBlock() { (writeData, actionError) -> Bool in
    for var offset = 0; offset <= fileData.length; offset += bufferSize {
        let size = CUnsignedInt(min(fileData.length - offset, bufferSize))
        writeData(UnsafePointer<Void>(ptr[offset]), size)
    }
    return true
}
That's giving me this error:
Cannot find an initializer for type 'UnsafePointer' that accepts an arguments list of type '(UInt8)'
When I remove the UnsafePointer<Void> conversion to do a more direct translation, yielding this line:
writeData(&ptr[offset], size)
I get this error, pointing at the & character:
Type of expression is ambiguous without more context
Without the &, it yields a UInt8, giving me this error:
Cannot invoke 'writeData' with an argument list of type '(Uint8, UInt32)'
What do I need to do to read the bytes out of the NSData sequentially and pass them onto another method?
Pointer arithmetic should (and does) get you the results you're looking for.
let fileData = //
let bufferSize = 1024 // Arbitrary

object.writeDataWithBlock() { (writeData, actionError) -> Bool in
    for var offset = 0; offset <= fileData.length; offset += bufferSize {
        let size = CUnsignedInt(min(fileData.length - offset, bufferSize))
        writeData(fileData.bytes + offset, size)
    }
    return true
}
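For later Swift versions (3+), where the C-style for loop was removed, the same loop can be written with stride — a sketch only, assuming the same `object`, `fileData` and `bufferSize` as above:
object.writeDataWithBlock() { (writeData, actionError) -> Bool in
    for offset in stride(from: 0, to: fileData.length, by: bufferSize) {
        let size = CUnsignedInt(min(fileData.length - offset, bufferSize))
        writeData(fileData.bytes + offset, size)
    }
    return true
}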

obj-c to Swift convert code

So I'm trying to convert some Obj-C code into Swift, but I'm stuck on part of it.
How can I do something like this in Swift?
- (void)someMethod:(NSString *)s {
    NSString *crlfString = [s stringByAppendingString:@"\r\n"];
    uint8_t *buf = (uint8_t *)[crlfString UTF8String];
}
The real problem is this line
uint8_t *buf = (uint8_t *)[crlfString UTF8String];
What do you want to do with the UTF-8 buffer? Generally, it is more convenient to convert the string directly to an NSData object. But you can also translate your Obj-C code one by one to Swift. Here are both variants:
import Foundation

func appendCRLFAndConvertToUTF8_1(s: String) -> NSData {
    let crlfString: NSString = s.stringByAppendingString("\r\n")
    let buffer = crlfString.UTF8String
    let bufferLength = crlfString.lengthOfBytesUsingEncoding(NSUTF8StringEncoding)
    let data = NSData(bytes: buffer, length: bufferLength)
    return data
}

func appendCRLFAndConvertToUTF8_2(s: String) -> NSData {
    let crlfString = s + "\r\n"
    return crlfString.dataUsingEncoding(NSUTF8StringEncoding)!
}
let s = "Hello 😄"
let data1 = appendCRLFAndConvertToUTF8_1(s)
data1.description
let data2 = appendCRLFAndConvertToUTF8_2(s)
data2.description
data1 == data2
And if you want to iterate over the UTF-8 code units and not deal with a buffer, use something like:
for codeUnit in s.utf8 {
    println(codeUnit)
}
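In current Swift (3+) syntax the same conversion shrinks considerably — a sketch:
func appendCRLFAndConvertToUTF8(_ s: String) -> Data {
    return (s + "\r\n").data(using: .utf8)!
}

for codeUnit in s.utf8 {
    print(codeUnit)
}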

Handling File System Events in OSX

So I'm using an FSEventStream to watch a folder for changes. It all works fine and I can see a log callback when I alter files in the folder, but I can't seem to call my folderWatch method; it gives the error "use of undeclared identifier 'self'". I can use this method everywhere else, just not in fsEventsCallback. Any help would be appreciated!
void fsEventsCallback(ConstFSEventStreamRef streamRef,
                      void *clientCallBackInfo,
                      size_t numEvents,
                      void *eventPaths,
                      const FSEventStreamEventFlags eventFlags[],
                      const FSEventStreamEventId eventIds[]) {
    [self folderWatch];
    NSLog(@"2");
}
The reason is that fsEventsCallback is a C function and not an Objective-C instance method, so fsEventsCallback does not know anything about self.
You can use the info field in the FSEventStreamContext to pass self to the callback function. The following example assumes that your class is called Watcher.
(If you don't use ARC, you can omit all the __bridge casts.)
- (void)folderWatch
{
}

void fsEventsCallback(ConstFSEventStreamRef streamRef,
                      void *info,
                      size_t numEvents,
                      void *eventPaths,
                      const FSEventStreamEventFlags eventFlags[],
                      const FSEventStreamEventId eventIds[])
{
    Watcher *watcher = (__bridge Watcher *)info;
    [watcher folderWatch];
}

- (void)startWatching
{
    FSEventStreamContext context;
    context.info = (__bridge void *)(self);
    context.version = 0;
    context.retain = NULL;
    context.release = NULL;
    context.copyDescription = NULL;

    NSArray *pathsToWatch = @[@"/path/to/watch"];
    self.eventStream = FSEventStreamCreate(NULL,
                                           &fsEventsCallback,
                                           &context,
                                           (__bridge CFArrayRef)(pathsToWatch),
                                           kFSEventStreamEventIdSinceNow,
                                           1.0,
                                           kFSEventStreamCreateFlagFileEvents);
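    // Note: the stream still has to be scheduled on a run loop and started
    // (FSEventStreamScheduleWithRunLoop / FSEventStreamStart) before the
    // callback fires; the question implies that part already exists elsewhere.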
}

Create a CVPixelBufferRef with OpenGLCompatibility

I am trying to create a CVPixelBuffer to allocate a bitmap in it and bind it to an OpenGL texture under iOS 5, but I'm having some problems with it. I can generate the pixel buffer, but the IOSurface is always null, so I cannot use it with CVOpenGLESTextureCacheCreateTextureFromImage. Since I am not using a camera or a video but a self-generated bitmap, I cannot use a CVSampleBufferRef to get the pixel buffer from it.
Here is my code, in case some of you know how to solve this issue:
CVPixelBufferRef renderTarget = nil;
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;

empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);

attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                  2,
                                  &kCFTypeDictionaryKeyCallBacks,
                                  &kCFTypeDictionaryValueCallBacks);

CFDictionarySetValue(attrs,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);
CFDictionarySetValue(attrs, kCVPixelBufferOpenGLCompatibilityKey, [NSNumber numberWithBool:YES]);

CVReturn err = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, tmpSize.width, tmpSize.height, kCVPixelFormatType_32ARGB, imageData, tmpSize.width * 4, 0, 0, attrs, &renderTarget);
If I dump the new PixelBufferRef I get:
1 : <CFString 0x3e43024c [0x3f8f8650]>{contents = "OpenGLCompatibility"} = <CFBoolean 0x3f8f8a10 [0x3f8f8650]>{value = true}
2 : <CFString 0x3e43026c [0x3f8f8650]>{contents = "IOSurfaceProperties"} = <CFBasicHash 0xa655750 [0x3f8f8650]>{type = immutable dict, count = 0,
entries =>
}
}
And I get a -6683 error (kCVReturnPixelBufferNotOpenGLCompatible) if I try to use it with a texture cache, this way:
CVOpenGLESTextureCacheCreateTextureFromImage (
kCFAllocatorDefault,
self.eagViewController.textureCache,
renderTarget,
nil, // texture attributes
GL_TEXTURE_2D,
GL_RGBA, // opengl format
tmpSize.width,
tmpSize.height,
GL_BGRA, // native iOS format
GL_UNSIGNED_BYTE,
0,
&renderTextureTemp);