How to pass a Float in a Swift Dictionary into an Objective-C method as an NSDictionary without it turning into a String?

ObjC method:
- (void)postRequestToURN:(NSString *)URN
parameters:(NSDictionary *)parameters
Swift code calling it:
var paramTest:NSDictionary = NSMutableDictionary()
paramTest.setValue( Int(amount), forKey:"autopayAmt" )
paramTest.setValue( Float(amount), forKey:"autopayAmt1" )
paramTest.setValue( Double(amount), forKey:"autopayAmt2" )
paramTest.setValue( (amount as NSNumber), forKey:"autopayAmt3" )
paramTest.setValue( true, forKey:"autopayAmt4" )
Note: this must be some Swift mapping thing. I'm on Swift 3 as of now. When I pass the NSDictionary as the parameter I get the compile error message below, which doesn't really make sense since the parameter is declared as NSDictionary * in the Objective-C code.
Cannot convert value of type 'NSDictionary' to expected argument type '[AnyHashable : Any]!'
Instead of numbers in the JSON, I get strings. I need numbers.
{
autopayAmt = 125;
autopayAmt1 = "125.43";
autopayAmt2 = "125.4300003051758";
autopayAmt3 = "125.43";
autopayAmt4 = 1;
}
In the Objective-C code the method's parameters argument shows up as:
parameters _TtGCs26_SwiftDeferredNSDictionaryVs11AnyHashableP__ * 0x608000038260 0x0000608000038260
Looks like it's goofed up when it goes from the Swift code to the Objective-C code. Here's what I see when I print it in the Objective-C code:
(lldb) po parameters
{
autopayAmt = 125;
autopayAmt1 = "125.43";
autopayAmt2 = "125.4300003051758";
autopayAmt3 = "125.43";
autopayAmt4 = 1;
}
versus if I click up in the stack to the Swift code and print then I see this:
(lldb) po paramTest
▿ 5 elements
▿ 0 : 2 elements
- key : autopayAmt
- value : 125
▿ 1 : 2 elements
- key : autopayAmt2
- value : 125.4300003051758
▿ 2 : 2 elements
- key : autopayAmt1
- value : 125.43
▿ 3 : 2 elements
- key : autopayAmt4
- value : 1
▿ 4 : 2 elements
- key : autopayAmt3
- value : 125.43
I just want to know how to pass floats as numbers so they are sent that way to the JSON server. Values of type Int do come out as numbers...
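A quick way to see what actually arrives on the Objective-C side is to serialize the received dictionary to JSON inside postRequestToURN:parameters: and log it; in raw JSON, numbers are unquoted while strings are quoted, so NSNumber and NSString values are easy to tell apart. A minimal diagnostic sketch (parameters and the key name come from the snippets above; it does not change the request itself):
// Diagnostic only: serialize the bridged dictionary and log the raw JSON.
NSError *error = nil;
NSData *json = [NSJSONSerialization dataWithJSONObject:parameters
                                               options:NSJSONWritingPrettyPrinted
                                                 error:&error];
if (json) {
    NSLog(@"raw JSON: %@", [[NSString alloc] initWithData:json encoding:NSUTF8StringEncoding]);
} else {
    NSLog(@"could not serialize parameters: %@", error);
}
// A direct class check on one of the suspect values:
NSLog(@"autopayAmt1 is a %@", [parameters[@"autopayAmt1"] class]);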

Related

F# type constraints indexable

I'm trying to make a type that should represent a "slice" of some indexable collection.
I know that there are some similar types in F# but not one that specifies the criteria that I need.
To do this it needs to carry a reference to the collection of type 'content, and the content needs to be indexable. Since a type only needs to have the member Item (get/set), I tried this constraint:
type Slice<'a, 'content when 'content: (member Item: int -> 'a)>
This still throws the usual error.
So is it possible to keep the type generic but still constrain it to be indexable?
I think something like this should work:
type Slice<'a, 'content when 'content: (member get_Item: int -> 'a)> =
    {
        Content : 'content
        Start : int
        Stop : int
    }
    with
        member inline slice.get_Item(i) =
            slice.Content.get_Item(slice.Start + i)
I've implemented get_Item on Slice as well, so you can take a slice of a slice. Here are some values of this type:
let strSlice =
    {
        Content = "hello"
        Start = 1
        Stop = 2
    }

let arraySlice =
    {
        Content = [| 2; 4; 6; 8 |]
        Start = 0
        Stop = 3
    }

let strSliceSlice =
    {
        Content = strSlice
        Start = 0
        Stop = 1
    }
Alternatively, you can make the constraint nominal by defining an interface for indexable collections and storing that in the slice:
open System

[<Interface>]
type Indexable<'a> =
    abstract member Item: int -> 'a with get

[<Struct>]
type Slice<'a> =
    {
        content: Indexable<'a>
        start: int
        len: int
    }
    with
        interface Indexable<'a> with
            // Forward the interface indexer to the struct's own Item member.
            member I.Item with get(i) = I.[i]
        member S.Item with get(idx) =
            if idx >= S.len
            then raise (IndexOutOfRangeException())
            else S.content.[S.start + idx]
This works.

Converting String using specific encoding to get just one character

I'm on this frustrating journey trying to get a specific character from a Swift string. I have an Objective-C function, something like
- ( NSString * ) doIt: ( char ) c
that I want to call from Swift.
This c is eventually passed to a C function in the back that does the heavy lifting here, but that function trips over when c is a non-breaking space (0xA0).
Now I have two questions (apologies SO).
I am trying to use different encodings, especially the ASCII variants, hoping one would convert 0xA0 to a space (0x20, decimal 32). The verdict seems to be that I need to hardcode this, but if there is a failsafe, non-hardcoded way I'd like to hear about it!
I am really struggling with the conversion itself. How do I access a specific character using a specific encoding in Swift?
a) I can use
s.utf8CString[ i ]
but then I am bound to UTF8.
b) I can use something like
let s = "\u{a0}"
let p = UnsafeMutablePointer < CChar >.allocate ( capacity : n )
defer
{
p.deallocate()
}
// Convert to ASCII
NSString ( string : s ).getCString ( p,
maxLength : n,
encoding : CFStringConvertEncodingToNSStringEncoding ( CFStringBuiltInEncodings.ASCII.rawValue ) )
// Hope for 32
let c = p[ i ]
but this seems overkill. The string is converted to NSString to apply the encoding and I need to allocate a pointer, all just to get a single character.
c) Here it seems Swift String's withCString is the man for the job, but I cannot even get it to compile. Below is what Xcode's completion gives, but even after fiddling with it for a long time I am still stuck.
// How do I use this
// ??
s.withCString ( encodedAs : _UnicodeEncoding.Protocol ) { ( UnsafePointer < FixedWidthInteger & UnsignedInteger > ) -> Result in
// ??
}
TIA
There are two withCString() methods: withCString(_:) calls the given closure with a pointer to the contents of the string, represented as a null-terminated sequence of UTF-8 code units. Example:
// An emulation of your Objective-C method.
func doit(_ c: CChar) {
    print(c, terminator: " ")
}

let s = "a\u{A0}b"
s.withCString { ptr in
    var p = ptr
    while p.pointee != 0 {
        doit(p.pointee)
        p += 1
    }
}
print()
// Output: 97 -62 -96 98
Here -62 -96 is the signed character representation of the UTF-8 sequence C2 A0 of the NO-BREAK SPACE character U+00A0.
If you just want to iterate over all UTF-8 characters of the string sequentially then you can simply use the .utf8 view. The (unsigned) UInt8 bytes must be converted to the corresponding (signed) CChar:
let s = "a\u{A0}b"
for c in s.utf8 {
    doit(CChar(bitPattern: c))
}
print()
I am not aware of a method which transforms U+00A0 to a “normal” space character, so you have to do that manually. With
let s = "a\u{A0}b".replacingOccurrences(of: "\u{A0}", with: " ")
the output of the above program would be 97 32 98.
The withCString(encodedAs:_:) method calls the given closure with a pointer to the contents of the string, represented as a null-terminated sequence of code units. Example:
let s = "a\u{A0}b€"
s.withCString(encodedAs: UTF16.self) { ptr in
    var p = ptr
    while p.pointee != 0 {
        print(p.pointee, terminator: " ")
        p += 1
    }
}
print()
// Output: 97 160 98 8364
This method is probably of limited use for your purpose because it can only be used with UTF8, UTF16 and UTF32.
For other encodings you can use the data(using:) method. It produces a Data value which is a sequence of UInt8 (an unsigned type). As above, these must be converted to the corresponding signed character:
let s = "a\u{A0}b"
if let data = s.data(using: .isoLatin1) {
    data.forEach {
        doit(CChar(bitPattern: $0))
    }
}
print()
// Output: 97 -96 98
Of course this may fail if the string is not representable in the given encoding.

Get exif data in mac OS development

I'm brand new to Objective-C and I'm trying to get EXIF data from some images in my macOS app. It looks like I need a C library to read the EXIF data, so I googled around and found some C libraries, but I'm struggling with how to implement them.
1. Do I need to get a C library in order to read EXIF data from an image taken with a DSLR camera (things like the date taken)?
I tried this library http://libexif.sourceforge.net/ and I dug around on the site and downloaded from here: http://www.hmug.org/pub/MacOS_X/BSD/Libraries/Graphics/libexif/, which goes to this link: http://www.hmug.org/pub/MacOS_X/BSD/Libraries/Graphics/libexif/libexif-0.6.21-1-osx8.tar.gz
I dragged these files into Xcode, and it looks like the files were added to my project correctly, I think.
I'm not sure how to actually use these C headers now. I tried including the files like this:
#include "exif-data.h"
#include "exif-loader.h"
Is this right? Should I be doing it a different way?
Then I'm really confused about the usage. The documentation page at http://libexif.sourceforge.net/api/ says:
An application using libexif would typically first create an ExifLoader to load EXIF data into memory. From there, it would extract that data as an ExifData to start manipulating it. Each IFD is represented by its own ExifContent within that ExifData, which contains all the tag data in ExifEntry form. If the MakerNote data is required, an ExifMnoteData can be extracted from the ExifData and manipulated with the MakerNote functions.
What is the syntax for "creating an ExifLoader"?
Sorry for the noob questions! Any help is appreciated.
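For reference, the "create an ExifLoader" step the quoted documentation describes looks roughly like this; this is an untested sketch against the libexif C API (the function and tag names are from the libexif headers included above, the path is whatever file you want to inspect):
#include "exif-data.h"
#include "exif-loader.h"

// Sketch of the loader workflow from the libexif docs: create a loader,
// feed it a file, pull out the ExifData, then release both.
void dumpDateTaken(const char *path)
{
    ExifLoader *loader = exif_loader_new();
    exif_loader_write_file(loader, path);          // read the EXIF block from the file
    ExifData *data = exif_loader_get_data(loader); // NULL if the file has no EXIF
    exif_loader_unref(loader);

    if (data) {
        ExifEntry *entry = exif_data_get_entry(data, EXIF_TAG_DATE_TIME_ORIGINAL);
        if (entry) {
            char value[64];
            exif_entry_get_value(entry, value, sizeof(value));
            printf("Date taken: %s\n", value);
        }
        exif_data_unref(data);
    }
}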
You can use Apple's own APIs to get image EXIF data.
Here is the CGImageSource reference,
and CGImageProperties.
Here is a quick example:
NSURL *imageFileURL = [NSURL fileURLWithPath:@"/Users/USERNAME/Documents/tasting_menu_004.jpg"];
CGImageSourceRef imageSource = CGImageSourceCreateWithURL((CFURLRef)imageFileURL, NULL);

NSDictionary *treeDict;
NSMutableString *exifData;

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:NO], (NSString *)kCGImageSourceShouldCache,
                         nil];
CFDictionaryRef imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, (CFDictionaryRef)options);
CFRelease(imageSource);

if (imageProperties) {
    treeDict = [NSDictionary dictionaryWithDictionary:(NSDictionary *)(imageProperties)];
    id exifTree = [treeDict objectForKey:@"{Exif}"];
    exifData = [NSMutableString stringWithString:@""];
    for (NSString *key in [[exifTree allKeys] sortedArrayUsingSelector:@selector(compare:)])
    {
        NSString *locKey = [[NSBundle bundleWithIdentifier:@"com.apple.ImageIO.framework"] localizedStringForKey:key value:key table:@"CGImageSource"];
        id value = [exifTree valueForKey:key];
        [exifData appendFormat:@"key =%@ ; Value = %@ \n", locKey, value];
    }
    NSLog(@" exifData %@", exifData);
    CFRelease(imageProperties);
}
Log -->exifData
key =Aperture Value ; Value = 4.643856
key =Color Space ; Value = 65535
key =Custom Rendered ; Value = 0
key =Date Time Digitized ; Value = 2013:06:13 08:35:07
key =Date Time Original ; Value = 2013:06:13 08:35:07
key =Exif Version ; Value = (
2,
2,
1
)
key =Exposure Bias Value ; Value = 0
key =Exposure Mode ; Value = 1
key =Exposure Program ; Value = 1
key =Exposure Time ; Value = 0.0125
key =FNumber ; Value = 5
key =Flash ; Value = 9
key =Focal Length ; Value = 17
key =Focal Plane Resolution Unit ; Value = 2
key =Focal Plane X Resolution ; Value = 3849.211788896504
key =Focal Plane Y Resolution ; Value = 3908.141962421712
key =ISO Speed Ratings ; Value = (
800
)
key =Max Aperture Value ; Value = 4
key =Metering Mode ; Value = 5
key =Pixel X Dimension ; Value = 5181
key =Pixel Y Dimension ; Value = 3454
key =Scene Capture Type ; Value = 0
key =Shutter Speed Value ; Value = 6.321928
key =Subject Distance ; Value = 1.22
key =Sub-second Time Digitized ; Value = 25
key =Sub-second Time Original ; Value = 25
key =White Balance ; Value = 0
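Since the original question is specifically after the date taken, that single value can also be read straight out of the same property tree using the ImageIO keys; a small sketch building on treeDict from the code above (kCGImagePropertyExifDictionary is the same dictionary that shows up there as @"{Exif}"):
// Requires ImageIO, which the CGImageSource calls above already need.
NSDictionary *exif = [treeDict objectForKey:(NSString *)kCGImagePropertyExifDictionary];
NSString *dateTaken = [exif objectForKey:(NSString *)kCGImagePropertyExifDateTimeOriginal];
NSLog(@"Date taken: %@", dateTaken); // e.g. 2013:06:13 08:35:07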
This post did it: http://devmacosx.blogspot.com/2011/07/nsimage-exif-metadata.html. I can now read all the exif data that I'm looking for.

Structs data scrambled upon retrieval from NSValue

I'm putting the values into a struct here:
[self createEntityWithX:20 andY:20 withType:0 withWidth:20 andLength:20 atSpeed:5];
...
...
-(void)createEntityWithX:(int)newEntityX andY:(int)newEntityY withType:(int)newEntityType withWidth:(int)newEntityWidth andLength:(int)newEntityLength atSpeed:(int)newEntitySpeed
{
    Entity tmpEntity;
    tmpEntity.entityX = newEntityX;
    tmpEntity.entityY = newEntityY;
    tmpEntity.entityLength = newEntityLength;
    tmpEntity.entityWidth = newEntityWidth;
    tmpEntity.entityType = newEntityType;
    tmpEntity.entitySpeed = newEntitySpeed;

    NSValue *tmp = [NSValue valueWithBytes:&tmpEntity objCType:@encode(struct Entity)];
    [entityArray addObject:tmp];
}

-(Entity)retreiveEntityFromValue:(NSValue *)value
{
    Entity entity1;
    [value getValue:&entity1];
    return entity1;
}
And the data is scrambled when I get it back. I created the entity with X: 20, Y: 20, Width: 20, and I'm getting back X: 1712736, Y: 0, Width: 0.
X is completely random, but the rest remain 0.
Where have I gone wrong?
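One way to narrow down where the bytes go wrong is to ask the NSValue what type encoding it is carrying and compare it with what the retrieval site expects; a small diagnostic sketch (Entity and entityArray are the struct and array from the code above):
#include <string.h>

// Diagnostic only: if the two encoding strings differ (for example because
// Entity is defined differently where it is unpacked), the raw bytes get
// reinterpreted against the wrong layout and the fields come back scrambled.
NSValue *stored = [entityArray lastObject];
NSLog(@"encoding stored in NSValue: %s", [stored objCType]);
NSLog(@"encoding expected here: %s (size %lu)", @encode(Entity), (unsigned long)sizeof(Entity));
if (strcmp([stored objCType], @encode(Entity)) != 0) {
    NSLog(@"Encodings differ - the struct layouts do not match.");
}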
Instead of using integers, try using floats (1), or you could pass the data as a point (2).
1)http://www.techotopia.com/index.php/Objective-C_2.0_Data_Types#float_Data_Type
2)http://mobiledevelopertips.com/c/cgrect-cgsize-and-cgpoint.html
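For the "pass the data as a point" suggestion, the boxing round-trip could look like this; a minimal sketch assuming UIKit is available (NSValue's valueWithCGPoint:/CGPointValue additions; on macOS/AppKit the equivalents are valueWithPoint:/pointValue with NSPoint), with entityArray being the mutable array from the question:
// Box a CGPoint in an NSValue, store it, and read it back.
// (Needs UIKit for the CGPoint additions on NSValue.)
CGPoint position = CGPointMake(20.0f, 20.0f);
NSValue *boxed = [NSValue valueWithCGPoint:position];
[entityArray addObject:boxed];

CGPoint restored = [[entityArray lastObject] CGPointValue];
NSLog(@"x = %f, y = %f", restored.x, restored.y);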

Objective-C: Getting the highest enum type from a collection

If I have the following enum type:
typedef enum {Type1=0, Type2, Type3} EnumType;
And the following code (which will work fine if converted to Java):
NSArray *allTypes = [NSArray arrayWithObjects:[NSNumber numberWithInt:Type1], [NSNumber numberWithInt:Type2], [NSNumber numberWithInt:Type3], nil];
EnumType aType = -1;
NSLog(@"aType is %d", aType); // I expect -1

// Trying to assign the highest type in the array to aType
for (NSNumber *typeNum in allTypes) {
    EnumType type = [typeNum intValue];
    NSLog(@"type is: %d", type);
    if (type > aType) {
        aType = type;
    }
}
NSLog(@"aType is %d", aType); // I expect 2
The resulted logs are:
TestEnums[11461:b303] aType is: -1
TestEnums[11461:b303] type is: 0
TestEnums[11461:b303] type is: 1
TestEnums[11461:b303] type is: 2
TestEnums[11461:b303] aType is: -1
And when I inspect the value of aType using a breakpoint, I see:
aType = (EnumType) 4294967295
Which, according to Wikipedia, is the maximum unsigned long int value on 32-bit systems.
1. Does this mean that I cannot assign a value to an enum type that is not in the valid range of the type's values?
2. Why does the value in the log (-1) differ from the real value (4294967295)? Does it have something to do with the format specifier (%d)?
3. How can I achieve what I'm trying to do here without adding a new type to represent an invalid value? Note that the collection may sometimes be empty, which is why I'm using -1 at the beginning to indicate that there is no type if the collection was empty.
Note: I'm new to Objective-C/ANSI-C.
Thanks,
Mota
EDIT:
Here is something weird I've found. If I change the condition inside the loop to:
if (type > aType || aType == -1)
I get the following logs:
TestEnums[1980:b303] aType is -1
TestEnums[1980:b303] type is: 0
TestEnums[1980:b303] type is: 1
TestEnums[1980:b303] type is: 2
TestEnums[1980:b303] aType is 2
Which is exactly what I'm looking for! The weird part is: how is (aType == -1) true, while (Type1 > -1), (Type2 > -1) and (Type3 > -1) are not?!
It seems like EnumType is defined as an unsigned type. When you assign -1 to it, the value actually wraps around to the highest possible value for an unsigned 32-bit integer (as you found). So, by starting at -1, you are ensuring that no other value you compare it to could possibly be higher, because it holds the maximum value for the data type (4294967295).
I'd suggest just starting the counter at 0 instead, as it is the lowest possible value for an EnumType.
EnumType aType = 0;
If you want to check whether any value was chosen at all, you can first check the count of the collection.
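A minimal sketch of that check, reusing allTypes and EnumType from the question:
// Start at the lowest enum value and only trust the result if the
// collection actually contained something.
if ([allTypes count] > 0) {
    EnumType aType = Type1; // 0, the lowest possible value
    for (NSNumber *typeNum in allTypes) {
        EnumType type = [typeNum intValue];
        if (type > aType) {
            aType = type;
        }
    }
    NSLog(@"highest type is %d", aType);
} else {
    NSLog(@"collection is empty, no type chosen");
}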