enum acting like an unsigned int in Xcode 4.6 even when enum is defined as a signed int - objective-c

I have only been able to recreate this bug using Xcode 4.6; everything works as expected using Xcode 4.5.
The issue is that myVal has the correct bit structure to represent an int value of -1. However, it is showing a value of 4294967295, which is what the same bit structure means if interpreted as an unsigned int. You'll notice that if I cast myVal to an int it shows the correct value. This is strange, because the enum should be an int to begin with.
Here is a screenshot showing the value of all of my variables in the debugger at the end of main: http://cl.ly/image/190s0a1P1b1t
typedef enum : int {
    BTEnumValueNegOne = -1,
    BTEnumValueZero = 0,
    BTEnumValueOne = 1,
} BTEnumValue;
int main(int argc, const char * argv[])
{
    @autoreleasepool {
        // on this line of code myVal is 0
        BTEnumValue myVal = BTEnumValueZero;
        // we are adding -1 to the value of zero
        myVal += BTEnumValueNegOne;
        // at this moment myVal has the exact bit structure
        // of a signed int at -1, but it is displaying it
        // as an unsigned int, so its value is 4294967295
        // however, if I cast the enum (which should already
        // be a signed int) to an int, it displays
        // the correct value of -1
        int myIntVal = (int)myVal;
    }
    return 0;
}

The new, preferred way to declare enum types is with the NS_ENUM macro as explained in this post: http://nshipster.com/ns_enum-ns_options/
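For illustration, here is a minimal sketch of the question's enum redeclared with NS_ENUM and an explicitly signed base type; the main body mirrors the question, everything else is my own wrapping:
#import <Foundation/Foundation.h>

// Sketch only: BTEnumValue from the question, redeclared so the underlying
// type is explicitly NSInteger (signed).
typedef NS_ENUM(NSInteger, BTEnumValue) {
    BTEnumValueNegOne = -1,
    BTEnumValueZero   = 0,
    BTEnumValueOne    = 1,
};

int main(int argc, const char *argv[])
{
    @autoreleasepool {
        BTEnumValue myVal = BTEnumValueZero;
        myVal += BTEnumValueNegOne;
        // With a signed underlying type, NSLog prints -1; the debugger
        // display was the issue reported in the question.
        NSLog(@"%ld", (long)myVal);
    }
    return 0;
}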

Related

Passing dynamic array to struct in c++

Every example I have seen that tries to use a dynamic size for an array in a struct relies on global constants at some point. What I am trying to do is pass an integer chosen by the user to a structure I create, so that it stores an array of that size, thus dynamic. Obviously the code below doesn't work, but it gives you an idea of what I plan on accomplishing:
struct Node {
    char input;
    int playingBoard[size];
    Node* pNext;
};

int main(){
    cout << "enter board size" << endl;
    cin >> size;
    int playingBoard[size];
}
You can give the struct a flexible array member and size it when you allocate:
struct Node
{
    int countr;
    int playingBoard[];
};

int countr;
...
struct Node *p = malloc(offsetof(Node, playingBoard) +
                        countr * sizeof *p->playingBoard);
p->countr = countr;
...
or an independent dynamically-allocated array:
struct Node
{
    int countr;
    int *playingBoard;
};

Node holder;
...
holder.playingBoard =
    malloc(holder.countr * sizeof *holder.playingBoard);
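For completeness, a plain C sketch (not from the answer) that puts the flexible-array-member approach together end to end; the prompt and bounds check are my own additions:
#include <stdio.h>
#include <stdlib.h>
#include <stddef.h>

struct Node
{
    int countr;
    int playingBoard[];   /* flexible array member: sized at allocation time */
};

int main(void)
{
    int size = 0;
    printf("enter board size\n");
    if (scanf("%d", &size) != 1 || size <= 0)
        return 1;

    /* allocate the struct header plus room for 'size' ints */
    struct Node *p = malloc(offsetof(struct Node, playingBoard)
                            + (size_t)size * sizeof *p->playingBoard);
    if (!p)
        return 1;
    p->countr = size;
    for (int i = 0; i < size; i++)
        p->playingBoard[i] = 0;

    free(p);
    return 0;
}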

How to interpret objective-c type specifier (e.g. returned by method_copyReturnType())?

Given I have a type specifier as returned by method_copyReturnType(). In the GNU runtime delivered with GCC there are various functions to work with such a type specifier, like objc_sizeof_type(), objc_alignof_type() and others.
When using the Apple runtime there are no such functions.
How can I interpret a type specifier string (e.g. get the size of a type) using the Apple runtime without implementing an if/else or case switch myself?
[update]
I am not able to use the Apple Foundation.
I believe that you're looking for NSGetSizeAndAlignment:
Obtains the actual size and the aligned size of an encoded type.
const char * NSGetSizeAndAlignment (
    const char *typePtr,
    NSUInteger *sizep,
    NSUInteger *alignp
);
Discussion
Obtains the actual size and the aligned size of the first data type represented by typePtr and returns a pointer to the position of the next data type in typePtr.
This is a Foundation function, not part of the base runtime, which is probably why you didn't find it.
UPDATE: Although you didn't initially mention that you're using Cocotron, it is also available there. You can find it in Cocotron's Foundation, in NSObjCRuntime.m.
Obviously, this is much better than rolling your own, since you can trust it to always correctly handle strings generated by its own runtime in the unlikely event that the encoding characters should change.
For some reason, however, it's unable to handle the digit elements of a method signature string (which presumably have something to do with offsets in memory). This improved version, by Mike Ash, will do so:
static const char *SizeAndAlignment(const char *str, NSUInteger *sizep, NSUInteger *alignp, int *len)
{
    const char *out = NSGetSizeAndAlignment(str, sizep, alignp);
    if (len)
        *len = out - str;
    while (isdigit(*out))
        out++;
    return out;
}
AFAIK, you'll need to bake that info into your binary. Just create a function which returns the sizeof and alignof in a struct, supports the types you must support, and then call that function (or class method) for the info.
The program below shows you that many of the primitives are just one character. So the bulk of the function's implementation could be a switch.
static void test(SEL sel) {
    Method method = class_getInstanceMethod([NSString class], sel);
    const char* const type = method_copyReturnType(method);
    printf("%s : %s\n", NSStringFromSelector(sel).UTF8String, type);
    free((void*)type);
}

int main(int argc, char *argv[]) {
    @autoreleasepool {
        test(@selector(init));
        test(@selector(superclass));
        test(@selector(isEqual:));
        test(@selector(length));
        return 0;
    }
}
and you could then use this as a starting point:
typedef struct t_pair_alignof_sizeof {
    size_t align;
    size_t size;
} t_pair_alignof_sizeof;

static t_pair_alignof_sizeof MakeAlignOfSizeOf(size_t align, size_t size) {
    t_pair_alignof_sizeof ret = {align, size};
    return ret;
}

static t_pair_alignof_sizeof test2(SEL sel) {
    Method method = class_getInstanceMethod([NSString class], sel);
    const char* const type = method_copyReturnType(method);
    const size_t length = strlen(type);
    if (1U == length) {
        switch (type[0]) {
            case '@' :
                return MakeAlignOfSizeOf(__alignof__(id), sizeof(id));
            case '#' :
                return MakeAlignOfSizeOf(__alignof__(Class), sizeof(Class));
            case 'c' :
                return MakeAlignOfSizeOf(__alignof__(signed char), sizeof(signed char));
            ...

enum values: NSInteger or int?

tl;dr Version
How are the data types of an enum's constants guaranteed to be NSUInteger instead of unsigned int when declaring an enum thusly:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The typedef to NSUInteger does not appear to be tied to the enum declaration in any way.
Full Version
I was reading through Apple's 64-Bit Transition Guide for Cocoa for some guidance on enum values and I came away with a question. Here's a (lengthy) quote from the Enumeration Constants section, emphasis mine:
A problem with enumeration (enum) constants is that their data types are frequently indeterminate. In other words, enum constants are not predictably unsigned int. With conventionally constructed enumerations, the compiler actually sets the underlying type based on what it finds. The underlying type can be (signed) int or even long. Take the following example:
typedef enum {
    MyFlagError = -1,
    MyFlagLow = 0,
    MyFlagMiddle = 1,
    MyFlagHigh = 2
} MyFlagType;
The compiler looks at this declaration and, finding a negative value assigned to one of the member constants, declares the underlying type of the enumeration int. If the range of values for the members does not fit into an int or unsigned int, then the base type silently becomes 64-bit (long). The base type of quantities defined as enumerations can thus change silently size to accord with the values in the enumeration. This can happen whether you're compiling 32-bit or 64-bit. Needless to say, this situation presents obstacles for binary compatibility.
As a remedy for this problem, Apple has decided to be more explicit about the enumeration type in the Cocoa API. Instead of declaring arguments in terms of the enumeration, the header files now separately declare a type for the enumeration whose size can be specified. The members of the enumeration and their values are declared and assigned as before. For example, instead of this:
typedef enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
} NSCellType;
there is now this:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The enumeration type is defined in terms of NSInteger or NSUInteger to make the base enumeration type 64-bit capable on 64-bit architectures.
My question is this: given that the typedef doesn't appear to be tied explicitly to the enum declaration, how does one know if their data types are unsigned int or NSUInteger?
Starting with Xcode 4.5 there is NS_ENUM:
typedef NS_ENUM(NSUInteger, NSCellType) {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
And you can consider NS_OPTIONS if you work with binary flags:
typedef NS_OPTIONS(NSUInteger, MyCellFlag) {
    MyTextCellFlag = 1 << 0,
    MyImageCellFlag = 1 << 1,
};
I ran a test on the simulator; the intention of the test was to check the size of the different integer types. For that, the result of sizeof was printed to the console. So I tested these enum values:
typedef enum {
    TLEnumCero = 0,
    TLEnumOne = 1,
    TLEnumTwo = 2
} TLEnum;

typedef enum {
    TLEnumNegativeMinusOne = -1,
    TLEnumNegativeCero = 0,
    TLEnumNegativeOne = 1,
    TLEnumNegativeTwo = 2
} TLEnumNegative;

typedef NS_ENUM(NSUInteger, TLUIntegerEnum) {
    TLUIntegerEnumZero = 0,
    TLUIntegerEnumOne = 1,
    TLUIntegerEnumTwo = 2
};

typedef NS_ENUM(NSInteger, TLIntegerEnum) {
    TLIntegerEnumMinusOne = -1,
    TLIntegerEnumZero = 0,
    TLIntegerEnumOne = 1,
    TLIntegerEnumTwo = 2
};
Test Code:
NSLog(@"sizeof enum: %ld", sizeof(TLEnum));
NSLog(@"sizeof enum negative: %ld", sizeof(TLEnumNegative));
NSLog(@"sizeof enum NSUInteger: %ld", sizeof(TLUIntegerEnum));
NSLog(@"sizeof enum NSInteger: %ld", sizeof(TLIntegerEnum));
Result for iPhone Retina (4-inch) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 4
sizeof enum NSInteger: 4
Result for iPhone Retina (4-inch 64 bit) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 8
sizeof enum NSInteger: 8
Conclusion
A generic enum is an int or unsigned int of 4 bytes, whether compiling for 32-bit or 64-bit.
As we already know, NSUInteger and NSInteger are 4 bytes on 32-bit and 8 bytes on 64-bit when compiling for iOS.
These are two separate declarations. The typedef guarantees that, when you use that type, you always get an NSUInteger.
The problem with an enum is not that it's not large enough to hold the value. In fact, the only guarantee you get for an enum is that sizeof(enum Foo) is large enough to hold whatever values you've currently defined in that enum. But its size may change if you add another constant. That's why Apple do the separate typedef, to maintain binary stability of the API.
The data types of the enum's constants are not guaranteed to be NSUInteger, but they are guaranteed to be cast to NSUInteger every time you use them through NSCellType.
In other words, the declaration decrees that although the enum's values would currently fit into an unsigned int, the storage reserved for them when accessed through NSCellType should be an NSUInteger.
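To make the "size may change" point concrete, here is a small illustration of my own; Clang accepts enumerator values outside the int range as an extension, which is exactly the silent widening the Apple guide describes:
#include <stdio.h>

// Adding one constant that doesn't fit in 32 bits silently widens the enum.
enum SmallRange { SmallZero = 0, SmallOne = 1 };
enum WideRange  { WideZero  = 0, WideHuge = 0x1FFFFFFFFLL };

int main(void)
{
    printf("%zu %zu\n", sizeof(enum SmallRange), sizeof(enum WideRange));  // typically "4 8"
    return 0;
}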

Expected identifier or '(' before '.' token

I'm new to Objective-C so I'm using a book to get to grips with it. I'm at a bit where it's explaining structs and I can't for the life of me get them to work.
I have the following code:
int main (int argc, char *argv[])
{
    struct node
    {
        int nodeID;
        int x;
        int y;
        BOOL isActive;
    };
    typedef struct node myNode;
    myNode.nodeID = 1;
}
and I'm getting the error written in the title. Every time I search for this error online I find different variations such as 'before '>' token' or 'before '}' token', but I can't find anything with the '.' token, which is really frustrating. I assume it's something ridiculously trivial and basic. Any help would be appreciated.
I believe you're trying to modify the actual type itself. After the typedef, myNode is the name of the struct type, much like int. You need to declare a variable of that type, e.g. myNode nodeA, and then you can perform nodeA.nodeID = 1 without error.
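For illustration, a minimal sketch of that fix applied to the question's struct (the variable name nodeA is just an example):
#import <Foundation/Foundation.h>   // for BOOL

int main(int argc, char *argv[])
{
    struct node
    {
        int  nodeID;
        int  x;
        int  y;
        BOOL isActive;
    };
    typedef struct node myNode;   // myNode now names the type, like int

    myNode nodeA;                 // declare a variable of that type...
    nodeA.nodeID = 1;             // ...then member access works
    nodeA.isActive = YES;
    return 0;
}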
I've got it sorted now; I used the following and it works:
int main (int argc, char *argv[])
{
    struct node
    {
        int nodeID;
        int x;
        int y;
        BOOL isActive;
    };
    struct node myNode;
    myNode.nodeID = 1;
    myNode.x = 100;
    myNode.y = 200;
    myNode.isActive = TRUE;
}
Thanks for all your help Darth! :)
I think the problem with the original code was that it was trying to make myNode a type name using typedef. Thus, myNode is NOT a variable that assignment can happen to; rather, it is another alias for struct node.

How do I get the int value from object_getIvar(self, myIntVar) as it returns a pointer

If the variable passed to object_getIvar is a basic data type (e.g. float, int, BOOL), how do I get the value, given that the function returns a pointer (id) according to the documentation? I've tried casting to an int and an int*, but when I try to pass the result to NSLog I get an error about an incompatible pointer type.
Getting:
myFloat = 2.34f;
float myFloatValue;
object_getInstanceVariable(self, "myFloat", (void*)&myFloatValue);
NSLog(@"%f", myFloatValue);
Outputs:
2.340000
Setting:
float newValue = 2.34f;
unsigned int addr = (unsigned int)&newValue;
object_setInstanceVariable(self, "myFloat", *(float**)addr);
NSLog(@"%f", myFloat);
Outputs:
2.340000
For ARC:
Inspired by this answer: object_getIvar fails to read the value of BOOL iVar.
You have to cast the object_getIvar function call to get basic-type ivars.
typedef int (*XYIntGetVariableFunction)(id object, Ivar ivar);
XYIntGetVariableFunction intVariableFunction = (XYIntGetVariableFunction)object_getIvar;
int result = intVariableFunction(object, intVarIvar);
I have made a small useful macro for fast definition of such function pointers:
#define GET_IVAR_OF_TYPE_DEFINITION(type, capitalized_type) \
    typedef type (*XY ## capitalized_type ## GetVariableFunctionType)(id object, Ivar ivar); \
    XY ## capitalized_type ## GetVariableFunctionType XY ## capitalized_type ## GetVariableFunction = (XY ## capitalized_type ## GetVariableFunctionType)object_getIvar;
Then, for basic types, you need to write calls to the macro (params such as (long long, LongLong) will fit):
GET_IVAR_OF_TYPE_DEFINITION(int, Int)
After that, a function for reading an int (or whichever type you specified) ivar becomes available:
int result = XYIntGetVariableFunction(object, ivar);
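Here is a self-contained sketch of that function-pointer trick; the Widget class and its count ivar are hypothetical, invented for the example:
#import <Foundation/Foundation.h>
#import <objc/runtime.h>

@interface Widget : NSObject
{
    @public
    int count;   // hypothetical int ivar read via the runtime below
}
@end

@implementation Widget
@end

// Same idea as the macro above: call object_getIvar through a pointer whose
// return type is int, so ARC never tries to retain the non-object result.
typedef int (*IntIvarGetter)(id object, Ivar ivar);

int main(void)
{
    @autoreleasepool {
        Widget *w = [Widget new];
        w->count = 42;

        Ivar ivar = class_getInstanceVariable([Widget class], "count");
        IntIvarGetter getIntIvar = (IntIvarGetter)object_getIvar;
        int result = getIntIvar(w, ivar);
        NSLog(@"count = %d", result);   // count = 42
    }
    return 0;
}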
The value that is returned is the value from the right place in the object, just not the right type. For int and BOOL (but not float), you could just cast the returned pointer to an int or BOOL: the ivar's bits are carried in the pointer value, and the integer cast recovers them (on 32-bit, pointers and ints are even the same size):
(int)object_getIvar(obj, myIntVar)
It's probably boxing the value in an NSNumber. You can verify this by NSLogging the returned id's className, like so:
id returnedValue = object_getIvar(self, myIntVar);
NSLog(@"Class: %@", [returnedValue className]);
Edit: I found another question just like this one here: Handling the return value of object_getIvar(id object, Ivar ivar)
From my own experimentation, it would appear that my original assumption was incorrect. int and float and other primitives appear to be returned as the actual value. However, it would be appropriate to use ivar_getTypeEncoding to verify that the returned value is the type that you're expecting it to be.
You can use object_getInstanceVariable directly (haven't tested it):
void *result;
object_getInstanceVariable(obj, "intvarname", &result);
int value = (int)(intptr_t)result;   // the ivar's bytes are copied into result itself