Objective-C / C: enum without tag or identifier

I'm learning cocos2d (an OpenGL wrapper for Objective-C on iPhone), and while playing with sprites I found this in an example:
enum {
    easySprite = 0x0000000a,
    mediumSprite = 0x0000000b,
    hardSprite = 0x0000000c,
    backButton = 0x0000000d,
    magneticSprite = 0x0000000e,
    magneticSprite2 = 0x0000000f
};
...
-(id) init
{
    ...
    // second sprite
    TSprite *med = [TSprite spriteWithFile:@"butonB.png"]; // blue
    [med SetCanTrack:YES];
    [self addChild:med z:1 tag:mediumSprite];
    med.position = ccp(299,230);
    [TSprite track:med];
So the value defined in the enum is used as the tag of the created sprite object, but I don't understand:
why give the tags hexadecimal values?
why use the enum without a tag or typedef name?
The enums I knew in Objective-C and C look like this:
typedef enum {
    JPG,
    PNG,
    GIF,
    PVR
} kImageType;
thanks!

Usually, when you are creating an enum, you want to use it as a type (for variables, method parameters, etc.).
In this case, it's just a way to declare integer constants. Since they don't want to use the enum as a type, the name is not necessary.
Edit:
Hexadecimal numbers are commonly used when the integer is a binary mask. You won't see arithmetic operators like +, -, *, / used with such a number; you'll see bitwise operators (~, &, |, ^).
Every digit in a hexadecimal number represents 4 bits. The whole number is a 32-bit integer, and by writing it in hexadecimal in this case you are saying that only the last four bits are used and the other bits are free for something else. This wouldn't be obvious from a decimal number.
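For instance, here is a generic sketch of mask-style constants (hypothetical names, not from the cocos2d example) where hex makes the bit layout visible:

enum {
    kFlagA = 0x01, // binary 0001
    kFlagB = 0x02, // binary 0010
    kFlagC = 0x04, // binary 0100
    kFlagD = 0x08  // binary 1000
};
unsigned int flags = kFlagA | kFlagC; // set two flags at once
if (flags & kFlagC) {
    // kFlagC is set
}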

Enum members are automatically assigned values incrementing from 0, but you can assign your own values.
If you don't specify any values, they start from 0, as in:
typedef enum {
    JPG,
    PNG,
    GIF,
    PVR
} kImageType;
But you could assign them values:
typedef enum {
    JPG = 0,
    PNG = 1,
    GIF = 2,
    PVR = 3
} kImageType;
or even
typedef enum {
    JPG = 100,
    PNG = 0x01,
    GIF = 100,
    PVR = 0xff
} kImageType;
Anything you want; repeated values are OK as well.
I'm not sure why they were given those specific values, but they might have some meaning related to their use.

Well, you seem to be working off a terrible example. :)
At least as far as enums are concerned. It's up to anyone to define the actual value of an enum entry, but there's no gain in using hex numbers, and in particular there's no point in having the hex values run from a through f (10 to 15). The example will also work with this enum:
enum {
    easySprite = 10,
    mediumSprite,
    hardSprite,
    backButton,
    magneticSprite,
    magneticSprite2
};
And unless there's some point in having the enumeration start with value 10, it will probably work without specifying any concrete values.
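For example, the tag only has to be unique among the node's children, so auto-assigned values serve just as well when you look the sprite up later; a minimal sketch using cocos2d's getChildByTag: accessor:

TSprite *med = (TSprite *)[self getChildByTag:mediumSprite];
// works the same whether mediumSprite is 0x0000000b, 11, or auto-assigned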

Related

How to represent ObjC enum AVAudioSessionPortOverride which has declaration of int and string using Dart ffi?

I'm working on a cross-platform sound API for Flutter.
We're trying to stop using Objective-C/Swift for the iOS portion of the API, and we're using Dart FFI as a replacement.
FFI (foreign function interface) allows Dart to call into an Obj-C API.
This means we need to create a Dart library which wraps the Obj-C audio library.
Whilst doing this we encountered the AVAudioSessionPortOverride enum, which has two declarations: AVAudioSessionPortOverrideSpeaker = 'spkr' and AVAudioSessionPortOverrideNone = 0.
I'm confused as to what's going on here, as one of these declarations is an int whilst the other is a string.
I note that AVAudioSessionPortOverride extends an NSUInteger, so how is the string being handled? Is it somehow being converted to an int? If so, any ideas on how I would do this in Dart?
Here's what we have so far:
class AVAudioSessionPortOverride extends NSUInteger {
  const AVAudioSessionPortOverride(int value) : super(value);
  static AVAudioSessionPortOverride None = AVAudioSessionPortOverride(0);
  static const AVAudioSessionPortOverride Speaker =
      AVAudioSessionPortOverride('spkr');
}
'spkr' is in fact an int. See e.g. How to convert multi-character constant to integer in C? for an explanation of how this obscure feature in C works.
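A quick sketch of the arithmetic: each character contributes one byte, highest-order first:

// 's' = 0x73, 'p' = 0x70, 'k' = 0x6B, 'r' = 0x72
// ('s' << 24) | ('p' << 16) | ('k' << 8) | 'r' == 0x73706B72 == 1936747378
unsigned int packed = ('s' << 24) | ('p' << 16) | ('k' << 8) | 'r';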
That said, if you look at the Swift representation of the PortOverride enum, you'll see this:
/// For use with overrideOutputAudioPort:error:
public enum PortOverride : UInt {
    /// No override. Return audio routing to the default state for the current audio category.
    case none = 0
    /// Route audio output to speaker. Use this override with AVAudioSessionCategoryPlayAndRecord,
    /// which by default routes the output to the receiver.
    case speaker = 1936747378
}
Also, see https://developer.apple.com/documentation/avfoundation/avaudiosession/portoverride/speaker
Accordingly, 0 and 1936747378 are the values you should use.
Look at this
NSLog(@"spkr = %x s = %x p = %x k = %x r = %x", 'spkr', 's', 'p', 'k', 'r' );
Apple is doing everything your lecturer warned you against. You can get away with this since the string is 4 chars (bytes) long. If you make it longer you'll get a warning. The string gets converted to an int as illustrated in the code snippet above. You could reverse it by accessing the four bytes one by one and printing them as a character.
Spoiler - it will print
spkr = 73706b72 s = 73 p = 70 k = 6b r = 72
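And a minimal sketch of the reverse direction mentioned above, pulling the bytes back out one at a time:

unsigned int v = 'spkr';
char text[5];
for (int i = 0; i < 4; i++) {
    text[i] = (char)((v >> (8 * (3 - i))) & 0xFF); // highest-order byte first
}
text[4] = '\0';
NSLog(@"%s", text); // prints "spkr"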

Different Objective-C enums with the same literals

I wish to have two different enums, but they might have the same literal; for example:
typedef enum {ONE,TWO,THREE,FOUR,FIVE,SIX} NumbersEnum;
typedef enum {ONE,TWO,THREE,FIVE,EIGHT} FibonacciEnum;
This will raise a compile error because ONE, TWO, THREE, FIVE are repeated in both enums.
Is there a way to make this work as-is (not changing the literals' names or adding a prefix or suffix)?
Is there any way my code using the literals can look like this: int num = NumbersEnum.SIX; and not like this int num = SIX;?
No. That's part of the C and Objective-C language from the beginning of time. You're not going to change it, and nobody is going to change it for you.
You cannot do this with enums; their members are global and the names must be unique. There is, however, a neat technique you can use to make pseudo-namespaces for constants with structs.
Declare your "namespace" in the appropriate header:
extern const struct _FibonacciNumbers
{
    int one;
    int two;
    int three;
    int five;
} FibonacciNumbers;
Then initialize the values in an implementation file:
const struct _FibonacciNumbers FibonacciNumbers = {
    .one = 1,
    .two = 2,
    .three = 3,
    .five = 5
};
You now access a constant as, e.g., FibonacciNumbers.one, and other struct types can use the same names since the names are private to each of them.
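For example, a second "namespace" can reuse the member name one without any collision (the Numbers struct here is a hypothetical counterpart to the question's first enum):

const struct _NumbersNamespace { int one; int six; } Numbers = { .one = 1, .six = 6 };

int a = FibonacciNumbers.one; // 1 -- no clash with Numbers.one
int num = Numbers.six;        // reads like NumbersEnum.SIX from the question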
So that's "No" for your first option, but "Yes" to the second.

Using C style unsigned char array and bitwise operators in Swift

I'm working on changing some Objective-C Code over to Swift, and I cannot figure out for the life of me how to take care of unsigned char arrays and bitwise operations in this specific instance of code.
Specifically, I'm working on converting the following Objective-C code (which deals with CoreBluetooth) to Swift:
unsigned char advertisementBytes[21] = {0};
[self.proximityUUID getUUIDBytes:(unsigned char *)&advertisementBytes];
advertisementBytes[16] = (unsigned char)(self.major >> 8);
advertisementBytes[17] = (unsigned char)(self.major & 255);
I've tried the following in Swift:
var advertisementBytes: CMutablePointer<CUnsignedChar>
self.proximityUUID.getUUIDBytes(advertisementBytes)
advertisementBytes[16] = (CUnsignedChar)(self.major >> 8)
The problems I'm running into are that getUUIDBytes in Swift seems to only take a CMutablePointer<CUnsignedChar> object as an argument, rather than an array of CUnsignedChars, so I have no idea how to do the later bitwise operations on advertisementBytes, as it seems it would need to be an unsignedChar array to do so.
Additionally, CMutablePointer<CUnsignedChar[21]> throws an error saying that fixed length arrays are not supported in CMutablePointers in Swift.
Could anyone please advise on potential work-arounds or solutions? Many thanks.
Have a look at Interacting with C APIs. Mostly this:

C Mutable Pointers
When a function is declared as taking a CMutablePointer<Type> argument, it can accept any of the following:
- nil, which is passed as a null pointer
- A CMutablePointer<Type> value
- An in-out expression whose operand is a stored lvalue of type Type, which is passed as the address of the lvalue
- An in-out Type[] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call

If you have declared a function like this one:
func takesAMutablePointer(x: CMutablePointer<Float>) { /*...*/ }
you can call it in any of the following ways:
var x: Float = 0.0
var p: CMutablePointer<Float> = nil
var a: Float[] = [1.0, 2.0, 3.0]
takesAMutablePointer(nil)
takesAMutablePointer(p)
takesAMutablePointer(&x)
takesAMutablePointer(&a)
So your code becomes:
var advertisementBytes = CUnsignedChar[](count: 21, repeatedValue: 0) // buffer must be sized before indexing into it
self.proximityUUID.getUUIDBytes(&advertisementBytes)
advertisementBytes[16] = CUnsignedChar(self.major >> 8)
advertisementBytes[17] = CUnsignedChar(self.major & 255)

how to get the size of the following array

int first[] = {1, 4};
int second[] = {2, 3, 7};
arrayOfCPointers[0] = first;
arrayOfCPointers[1] = second;
NSLog(@"size of %lu", sizeof(arrayOfCPointers[0]) / sizeof(int));
I want an array of sub-arrays, where each sub-array is a different size, but I need to be able to find out the size of each sub-array.
The log keeps returning 1.
You need to store the size somewhere. The language does not do so for bare C arrays. All you have is the address of the first element.
I'd write a wrapper class or struct to hold the array and its metadata (like the length).
typedef struct tag_arrayholder
{
    int* pArray;
    int iLen;
} ArrayHolder;
int first[] = {1, 4};
ArrayHolder holderFirst;
holderFirst.pArray = first;
holderFirst.iLen = sizeof(first) / sizeof(int);
arrayOfCPointers[0] = holderFirst;
NSLog(@"size of %d", arrayOfCPointers[0].iLen);
Or, like trojanfoe said, store a special value marking the last position (exactly the approach a zero-terminated string uses).
The "sizeof" instruction could be used to know the amount of bytes used by the array, but it works only with static array, with dynamics one it returns the pointer size. So with static array you could use this formula : sizeof(tab)/sizeof(tab[0]) to know the size of your array because the first part give you the tab size in bytes and the second the size of an element, so the result is your amount of element in your array ! But with a dynamic array the only way is to store the size somewhere or place a "sentinal value" at the end of your array and write a loop which count elements for you !
(Sorry for my English i'm french :/)
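A minimal sketch of the difference described above:

int tab[] = {1, 4, 2, 3};
size_t n = sizeof(tab) / sizeof(tab[0]); // 4: tab is a real (static) array here

int *p = tab; // access the same data through a pointer instead
size_t m = sizeof(p) / sizeof(p[0]);     // sizeof(int *) / sizeof(int), NOT 4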
The NSLog statement is printing the value 1 because the expression you're using is dividing the size of the first element of the array (which is the size of an int) by the size of an int.
So what you currently have is this:
NSLog(@"size of %lu", sizeof(arrayOfCPointers[0]) / sizeof(int));
If you remove the array brackets, you'll get the value you're looking for:
NSLog(@"size of %lu", sizeof(arrayOfCPointers) / sizeof(int));
As other answers have pointed out, this won't work if you pass the array to another method or function, since all that's passed in that case is an address. The only reason the above works is because the array's definition is in the local scope, so the compiler can use the type information to compute the size.
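For instance (a hedged sketch; printCount is a hypothetical function): as soon as the array is passed to a function, the parameter decays to a pointer and the size information is gone:

void printCount(int arr[]) // arr is really an int *
{
    NSLog(@"count? %lu", sizeof(arr) / sizeof(int)); // sizeof(int *) / sizeof(int), not the element count
}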

enum values: NSInteger or int?

tl;dr Version
How are the data types of an enum's constants guaranteed to be NSUInteger instead of unsigned int when declaring an enum thusly:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The typedef to NSUInteger does not appear to be tied to the enum declaration in any way.
Full Version
I was reading through Apple's 64-Bit Transition Guide for Cocoa for some guidance on enum values and I came away with a question. Here's a (lengthy) quote from the Enumeration Constants section, emphasis mine:
A problem with enumeration (enum) constants is that their data types are frequently indeterminate. In other words, enum constants are not predictably unsigned int. With conventionally constructed enumerations, the compiler actually sets the underlying type based on what it finds. The underlying type can be (signed) int or even long. Take the following example:
typedef enum {
    MyFlagError = -1,
    MyFlagLow = 0,
    MyFlagMiddle = 1,
    MyFlagHigh = 2
} MyFlagType;
The compiler looks at this declaration and, finding a negative value assigned to one of the member constants, declares the underlying type of the enumeration int. If the range of values for the members does not fit into an int or unsigned int, then the base type silently becomes 64-bit (long). The base type of quantities defined as enumerations can thus silently change size to accord with the values in the enumeration. This can happen whether you're compiling 32-bit or 64-bit. Needless to say, this situation presents obstacles for binary compatibility.
As a remedy for this problem, Apple has decided to be more explicit about the enumeration type in the Cocoa API. Instead of declaring arguments in terms of the enumeration, the header files now separately declare a type for the enumeration whose size can be specified. The members of the enumeration and their values are declared and assigned as before. For example, instead of this:
typedef enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
} NSCellType;
there is now this:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The enumeration type is defined in terms of NSInteger or NSUInteger to make the base enumeration type 64-bit capable on 64-bit architectures.
My question is this: given that the typedef doesn't appear to be tied explicitly to the enum declaration, how does one know if their data types are unsigned int or NSUInteger?
There is now NS_ENUM, starting with Xcode 4.5:
typedef NS_ENUM(NSUInteger, NSCellType) {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
And you can consider NS_OPTIONS if you work with binary flags:
typedef NS_OPTIONS(NSUInteger, MyCellFlag) {
    MyTextCellFlag = 1 << 0,
    MyImageCellFlag = 1 << 1,
};
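Values declared with NS_OPTIONS are meant to be combined and tested with the bitwise operators, for example:

MyCellFlag flags = MyTextCellFlag | MyImageCellFlag; // combine flags
if (flags & MyImageCellFlag) {
    // the image flag is set
}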
I ran a test on the simulator; the intention of the test was to check the size of different integer types. For that, the result of sizeof was printed to the console. I tested these enum values:
typedef enum {
    TLEnumCero = 0,
    TLEnumOne = 1,
    TLEnumTwo = 2
} TLEnum;

typedef enum {
    TLEnumNegativeMinusOne = -1,
    TLEnumNegativeCero = 0,
    TLEnumNegativeOne = 1,
    TLEnumNegativeTwo = 2
} TLEnumNegative;

typedef NS_ENUM(NSUInteger, TLUIntegerEnum) {
    TLUIntegerEnumZero = 0,
    TLUIntegerEnumOne = 1,
    TLUIntegerEnumTwo = 2
};

typedef NS_ENUM(NSInteger, TLIntegerEnum) {
    TLIntegerEnumMinusOne = -1,
    TLIntegerEnumZero = 0,
    TLIntegerEnumOne = 1,
    TLIntegerEnumTwo = 2
};
Test Code:
NSLog(@"sizeof enum: %ld", sizeof(TLEnum));
NSLog(@"sizeof enum negative: %ld", sizeof(TLEnumNegative));
NSLog(@"sizeof enum NSUInteger: %ld", sizeof(TLUIntegerEnum));
NSLog(@"sizeof enum NSInteger: %ld", sizeof(TLIntegerEnum));
Result for iPhone Retina (4-inch) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 4
sizeof enum NSInteger: 4
Result for iPhone Retina (4-inch 64 bit) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 8
sizeof enum NSInteger: 8
Conclusion
A generic enum is an int or unsigned int, 4 bytes on both 32-bit and 64-bit.
As we already know, NSUInteger and NSInteger are 4 bytes on the 32-bit compiler and 8 bytes on the 64-bit compiler for iOS.
These are two separate declarations. The typedef guarantees that, when you use that type, you always get an NSUInteger.
The problem with an enum is not that it's not large enough to hold the value. In fact, the only guarantee you get for an enum is that sizeof(enum Foo) is large enough to hold whatever values you've currently defined in that enum. But its size may change if you add another constant. That's why Apple does the separate typedef, to maintain binary stability of the API.
The data types of the enum's constants are not guaranteed to be NSUInteger, but they are guaranteed to be cast to NSUInteger every time you use them through NSCellType.
In other words, the declaration decrees that although the enum's values would currently fit into an unsigned int, the storage reserved for them when accessed through NSCellType should be an NSUInteger.
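A short sketch of the net effect, using the NSCellType declaration from the question:

NSCellType type = NSTextCellType; // the constant itself is a plain integer...
// ...but the variable's storage size is fixed by the typedef:
NSLog(@"%lu", (unsigned long)sizeof(type)); // 8 on 64-bit, 4 on 32-bit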