What is the difference between these two enums? (Objective-C)

So in my travels I've always seen enums defined like this (when a bitmask is desired):
enum {
    UIControlStateNormal = 0,
    UIControlStateHighlighted = 1 << 0, // used when UIControl isHighlighted is set
    UIControlStateDisabled = 1 << 1,
    UIControlStateSelected = 1 << 2, // flag usable by app (see below)
};
However, I've just recently looked at the NSJSONSerialization class and came across an enum defined like so:
enum {
    NSJSONReadingMutableContainers = (1UL << 0),
    NSJSONReadingMutableLeaves = (1UL << 1),
    NSJSONReadingAllowFragments = (1UL << 2)
};
typedef NSUInteger NSJSONReadingOptions;
So I guess my question is: what does the UL do? What is the difference between 1 << 1 and 1UL << 1?

In C (and C++), the UL suffix just means the literal has unsigned long type. An unsuffixed integer literal defaults to int.

There's no difference between 1 << 1 and 1UL << 1, but there can be a difference between 1 << 33 and 1UL << 33. Depending on the platform, an unsigned long can be wider than an int, so if the enum has many values, an int might not be safe to use.
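To make that concrete, here is a small sketch (the MyWideOptions names are hypothetical, not from the question) of where the suffix starts to matter:

#import <Foundation/Foundation.h>

// Hypothetical option set that needs more than 32 bits of flags.
// A plain 1 << 33 shifts an int past its width, which is undefined
// behavior; 1UL << 33 is evaluated as unsigned long (64 bits on modern
// Apple platforms), so the high bit is well defined.
typedef NSUInteger MyWideOptions;
static const MyWideOptions MyWideOptionLow  = 1UL << 0;
static const MyWideOptions MyWideOptionHigh = 1UL << 33; // needs the UL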

Practically, there is no difference in your code.
The type of 1 in the first is int, and the type of 1UL in the second is unsigned long.


Related

Inline Documentation Comment Block for Typedef Enum

Problem: How can a typedef enum in Objective-C (iOS) include Documentation Comment Blocks?
Context: I'm building a .Framework and need to ensure that my Parser is well documented internally for 3rd Party Developers to enjoy Much More Better. :)
Code:
/*!
 @typedef SCElementTypes
 @brief Types of Element SCParser may find and attempt to define
 @constant kCharacters Not a Tag.
 @constant kOpenTag Tag Opens
 @constant kCloseTag Tag Closes
 @constant kSingleTag Tag Is Single
 */
typedef enum SCElementTypes : NSUInteger {
    kCharacters = (1 << 0),
    kOpenTag = (1 << 1),
    kCloseTag = (1 << 2),
    kSingleTag = (1 << 3)
} SCElementTypes;
Note: I know how to make Documentation Comment Blocks work for a typedef (among many things), but not typedef enum...
Like so:
typedef enum SCElementTypes : NSUInteger
{
    /** Character description */
    kCharacters = (1 << 0),
    /** OpenTag description */
    kOpenTag = (1 << 1),
    /** ... */
    kCloseTag = (1 << 2),
    kSingleTag = (1 << 3)
} SCElementTypes;

How are bitwise operators being used in this code?

I was looking at PSPDFKit sample code and saw this:
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes :
                          @(PSPDFAnnotationTypeNone & ~PSPDFAnnotationTypeLink)
};
The constants PSPDFAnnotationTypeNone and PSPDFAnnotationTypeLink are defined below:
// Available keys for options. kPSPDFProcessorAnnotationDict in
// form of pageIndex -> annotations.
// ..
extern NSString *const kPSPDFProcessorAnnotationTypes;
// Annotations defined after the PDF standard.
typedef NS_OPTIONS(NSUInteger, PSPDFAnnotationType) {
    PSPDFAnnotationTypeNone      = 0,
    PSPDFAnnotationTypeLink      = 1 << 1,  // Links and multimedia extensions
    PSPDFAnnotationTypeHighlight = 1 << 2,  // (Highlight, Underline, StrikeOut)
    PSPDFAnnotationTypeText      = 1 << 3,  // FreeText
    PSPDFAnnotationTypeInk       = 1 << 4,
    PSPDFAnnotationTypeShape     = 1 << 5,  // Square, Circle
    PSPDFAnnotationTypeLine      = 1 << 6,
    PSPDFAnnotationTypeNote      = 1 << 7,
    PSPDFAnnotationTypeStamp     = 1 << 8,
    PSPDFAnnotationTypeRichMedia = 1 << 10, // Embedded PDF videos
    PSPDFAnnotationTypeScreen    = 1 << 11, // Embedded PDF videos
    PSPDFAnnotationTypeUndefined = 1 << 31, // any annotation whose type is not recognized
    PSPDFAnnotationTypeAll       = UINT_MAX
};
I understand that ~ is the bitwise not operator and & the bitwise and operator, but what is the purpose of their application in this code?
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes :
                          @(PSPDFAnnotationTypeNone & ~PSPDFAnnotationTypeLink)
};
Based on comments below, the above could have been written simply as
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes : @(PSPDFAnnotationTypeNone)};
Since it is the same as (0 & ~2) => 0. What's the point of adding the & ~PSPDFAnnotationTypeLink part?
"~" is the bitwise not-operator.
As "&" the bitwise and.
These are usually used for bitmask (like in your example) or other binary operations (as the name lets suggest). More info on wiki - Operators in C and C++.
They are in no relationship to literals.
First of all, I don't know obj-c, only C, but I guess the '&' is 'bitwise AND' and the '~' is bitwise NOT.
It's the bitwise NOT operator (the same as in many C-based languages), which inverts all bits in the underlying value.
So, for example, the eight-bit value 0x57 (binary 0101 0111) becomes 1010 1000, or 0xA8.
See here for a more complete description of the various bitwise operators.
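For what it's worth, the & ~mask idiom is the standard way to clear a bit. With PSPDFAnnotationTypeNone (zero) on the left it is indeed a no-op, but the same expression becomes meaningful with a non-empty left operand; a small sketch using the constants quoted above:

// Keep every annotation type except links: ~PSPDFAnnotationTypeLink
// flips the link bit to 0 (and every other bit to 1), so the AND
// clears only that one flag.
PSPDFAnnotationType everythingButLinks =
    PSPDFAnnotationTypeAll & ~PSPDFAnnotationTypeLink;

BOOL hasLink = (everythingButLinks & PSPDFAnnotationTypeLink) != 0; // NO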

enum values: NSInteger or int?

tl;dr Version
How are the data types of an enum's constants guaranteed to be NSUInteger instead of unsigned int when declaring an enum thusly:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The typedef to NSUInteger does not appear to be tied to the enum declaration in any way.
Full Version
I was reading through Apple's 64-Bit Transition Guide for Cocoa for some guidance on enum values and I came away with a question. Here's a (lengthy) quote from the Enumeration Constants section, emphasis mine:
A problem with enumeration (enum) constants is that their data types are frequently indeterminate. In other words, enum constants are not predictably unsigned int. With conventionally constructed enumerations, the compiler actually sets the underlying type based on what it finds. The underlying type can be (signed) int or even long. Take the following example:
typedef enum {
    MyFlagError = -1,
    MyFlagLow = 0,
    MyFlagMiddle = 1,
    MyFlagHigh = 2
} MyFlagType;
The compiler looks at this declaration and, finding a negative value assigned to one of the member constants, declares the underlying type of the enumeration int. If the range of values for the members does not fit into an int or unsigned int, then the base type silently becomes 64-bit (long). The base type of quantities defined as enumerations can thus silently change size to accord with the values in the enumeration. This can happen whether you're compiling 32-bit or 64-bit. Needless to say, this situation presents obstacles for binary compatibility.
As a remedy for this problem, Apple has decided to be more explicit about the enumeration type in the Cocoa API. Instead of declaring arguments in terms of the enumeration, the header files now separately declare a type for the enumeration whose size can be specified. The members of the enumeration and their values are declared and assigned as before. For example, instead of this:
typedef enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
} NSCellType;
there is now this:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The enumeration type is defined in terms of NSInteger or NSUInteger to make the base enumeration type 64-bit capable on 64-bit architectures.
My question is this: given that the typedef doesn't appear to be tied explicitly to the enum declaration, how does one know if their data types are unsigned int or NSUInteger?
There is now NS_ENUM, starting with Xcode 4.5:
typedef NS_ENUM(NSUInteger, NSCellType) {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
And you can consider NS_OPTIONS if you work with binary flags:
typedef NS_OPTIONS(NSUInteger, MyCellFlag) {
    MyTextCellFlag = 1 << 0,
    MyImageCellFlag = 1 << 1,
};
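A short usage sketch for the NS_OPTIONS type declared above (the variable name is just for illustration):

MyCellFlag flags = MyTextCellFlag | MyImageCellFlag; // combine with |

if (flags & MyImageCellFlag) {
    // the image flag is set
}

flags &= ~MyTextCellFlag; // clear the text flag again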
I ran a test on the simulator; the intention of the test was to check the size of different integer types. For that, the result of sizeof was printed to the console. I tested these enum values:
typedef enum {
    TLEnumCero = 0,
    TLEnumOne = 1,
    TLEnumTwo = 2
} TLEnum;

typedef enum {
    TLEnumNegativeMinusOne = -1,
    TLEnumNegativeCero = 0,
    TLEnumNegativeOne = 1,
    TLEnumNegativeTwo = 2
} TLEnumNegative;

typedef NS_ENUM(NSUInteger, TLUIntegerEnum) {
    TLUIntegerEnumZero = 0,
    TLUIntegerEnumOne = 1,
    TLUIntegerEnumTwo = 2
};

typedef NS_ENUM(NSInteger, TLIntegerEnum) {
    TLIntegerEnumMinusOne = -1,
    TLIntegerEnumZero = 0,
    TLIntegerEnumOne = 1,
    TLIntegerEnumTwo = 2
};
Test Code:
NSLog(#"sizeof enum: %ld", sizeof(TLEnum));
NSLog(#"sizeof enum negative: %ld", sizeof(TLEnumNegative));
NSLog(#"sizeof enum NSUInteger: %ld", sizeof(TLUIntegerEnum));
NSLog(#"sizeof enum NSInteger: %ld", sizeof(TLIntegerEnum));
Result for iPhone Retina (4-inch) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 4
sizeof enum NSInteger: 4
Result for iPhone Retina (4-inch 64 bit) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 8
sizeof enum NSInteger: 8
Conclusion
A generic enum is an int or unsigned int type of 4 bytes, whether compiling for 32-bit or 64-bit.
As we already know, NSUInteger and NSInteger are 4 bytes on a 32-bit compiler and 8 bytes on a 64-bit compiler for iOS.
These are two separate declarations. The typedef guarantees that, when you use that type, you always get an NSUInteger.
The problem with an enum is not that it's not large enough to hold the value. In fact, the only guarantee you get for an enum is that sizeof(enum Foo) is large enough to hold whatever values you've currently defined in that enum. But its size may change if you add another constant. That's why Apple does the separate typedef, to maintain binary stability of the API.
The data types of the enum's constants are not guaranteed to be NSUInteger, but they are guaranteed to be cast to NSUInteger every time you use them through NSCellType.
In other words, the declaration decrees that although the enum's values would currently fit into an unsigned int, the storage reserved for them when accessed through NSCellType should be an NSUInteger.
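A small sketch of what that means in practice (the My* names are stand-ins to avoid redeclaring the Cocoa types; sizes assume a 64-bit build):

enum {
    MyNullCellType = 0,
    MyTextCellType = 1,
    MyImageCellType = 2
};
typedef NSUInteger MyCellType;

MyCellType cellType = MyTextCellType;
// sizeof(cellType) == sizeof(NSUInteger) == 8 on 64-bit: the typedef,
// not the anonymous enum, dictates the storage width of the variable.
// sizeof(MyTextCellType) may still be 4, because the bare constant is
// int-sized.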

Bit shifting coding efficiency (i.e. neat tricks)

I'm working on an Objective-C program where I'm getting bitfields over the network and need to set boolean variables based on those bits.
Currently I'm representing the bitfields as ints, and then using bit shifting, similar to this (all the self properties are BOOL):
typedef enum {
    deleteFlagIndex = 0,
    uploadFlagIndex = 1,
    moveFlagIndex = 2,
    renameFlagIndex = 3
} PrivilegeFlagIndex;
int userFlag = 5; //for example
// this would be YES
self.delete = ((userFlag & (1 << deleteFlagIndex)) == (1 << deleteFlagIndex));
// this would be NO
self.upload = ((userFlag & (1 << uploadFlagIndex)) == (1 << uploadFlagIndex));
And this works (to the best of my knowledge), but I'm wondering - is there a more concise way to code all the bit twiddling using a fancy trick/hack? I ask because I'll be doing this for a lot of flags (more than 30).
I did realize I could use this method as well:
self.move = ((userFlag & (1 << moveFlagIndex)) > 0)
...which does reduce the amount of typing, but I don't know if there's a good reason to not do it that way.
Edit: Revised to say concise rather than efficient - I wasn't worried about execution performance, but rather tips and best practices for doing this in a smart way.
Try:
typedef enum {
    deleteFlag = 1 << 0,
    uploadFlag = 1 << 1,
    moveFlag = 1 << 2,
    renameFlag = 1 << 3
} PrivilegeFlags;
Now you can combine them using | directly.
Usually, it suffices to check against 0:
if (userFlag & deleteFlag) {
    // delete...
}
You may want to try to use bitfields and let the compiler do the optimization itself.
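If you do go the bitfield route, a minimal sketch (hypothetical field names) might look like this; note that bit order within the struct is implementation-defined, so don't map it directly onto a wire format:

// The compiler packs each flag into a single bit and generates the
// masking code for you.
typedef struct {
    unsigned int canDelete : 1;
    unsigned int canUpload : 1;
    unsigned int canMove   : 1;
    unsigned int canRename : 1;
} PrivilegeBits;

PrivilegeBits bits = {0};
bits.canDelete = (userFlag >> deleteFlagIndex) & 1; // extract one bit
bits.canUpload = (userFlag >> uploadFlagIndex) & 1;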

How to use enums with bit flags

I have an enum declaration using bit flags and I can't exactly figure out how to use it.
enum
{
    kWhite = 0,
    kBlue = 1 << 0,
    kRed = 1 << 1,
    kYellow = 1 << 2,
    kBrown = 1 << 3,
};
typedef char ColorType;
I suppose that to store multiple colors in one ColorType I should OR the bits together?
ColorType pinkColor = kWhite | kRed;
But suppose I would want to check if pinkColor contains kRed, how would I do this?
Anyone care to give me an example using the provided ColorType?
Yes, use bitwise OR (|) to set multiple flags:
ColorType pinkColor = kWhite | kRed;
Then use bitwise AND (&) to test if a flag is set:
if ( pinkColor & kRed )
{
    // do something
}
The result of & has a given bit set only if the same bit is set in both operands. Since the only bit set in kRed is bit 1, the result will be 0 if the other operand doesn't have that bit set too.
If you need to get whether a particular flag is set as a BOOL rather than just testing it in an if condition immediately, compare the result of the bitwise AND to the tested bit:
BOOL hasRed = ((pinkColor & kRed) == kRed);
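Putting the common operations together for the ColorType example (a quick sketch):

ColorType color = kWhite;

color |= kRed;                       // set a flag
color |= (kBlue | kYellow);          // set several at once
color &= ~kRed;                      // clear a flag
color ^= kBrown;                     // toggle a flag
BOOL hasBlue = (color & kBlue) != 0; // test a flag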