Forward-declare NS_OPTIONS in Objective-C

How to forward declare NS_OPTIONS in Objective-C?
Related SO question for NS_ENUM: Forward-declare enum in Objective-C
Unanswered question on Apple Dev Forum: https://forums.developer.apple.com/thread/16305
typedef NS_OPTIONS(NSInteger, MSSOption) {
    MSSOptionNone = 0,
    MSSOptionName = 1 << 0,
    MSSOptionEmail = 1 << 1,
    MSSOptionTelephone = 1 << 2
};

It works strictly the same as for NS_ENUM, so the answers from Forward-declare enum in Objective-C are all valid.
To forward declare your NS_OPTIONS, you have two solutions:
solution 1
typedef NS_ENUM(NSInteger, MSSOption);
solution 2
typedef NS_OPTIONS(NSInteger, MSSOption);
Both solutions work fine. Tested with Xcode 9.3.1 and Xcode 10.1.
Demonstration at https://github.com/Coeur/StackOverflow50499172.
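For illustration, here is a minimal sketch of how such a forward declaration can be used in a header that only needs the type name (the file and class names are made up for this example):

// MSSContact.h
#import <Foundation/Foundation.h>

typedef NS_OPTIONS(NSInteger, MSSOption); // forward declaration only

@interface MSSContact : NSObject
@property (nonatomic) MSSOption options; // usable without the full definition
- (BOOL)matchesOptions:(MSSOption)options;
@end

The full typedef NS_OPTIONS(NSInteger, MSSOption) { ... } definition can then live in another header that only the implementation imports.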

Related

Enums in Nim for a wrapper of a C library

I am trying to create a simple Nim wrapper around the Clever Audio Plugin (CLAP) C library.
In C there is an enum of format flags that can be combined using bitwise operations.
Summary of the C code:
// definitions
enum clap_note_dialect {
    CLAP_NOTE_DIALECT_CLAP = 1 << 0,
    CLAP_NOTE_DIALECT_MIDI = 1 << 1,
    CLAP_NOTE_DIALECT_MIDI_MPE = 1 << 2,
    CLAP_NOTE_DIALECT_MIDI2 = 1 << 3,
};

typedef struct clap_note_port_info {
    ...
    uint32_t supported_dialects; // bitfield, see clap_note_dialect
    ...
} clap_note_port_info_t;

// implementation
info->supported_dialects =
    CLAP_NOTE_DIALECT_CLAP | CLAP_NOTE_DIALECT_MIDI_MPE | CLAP_NOTE_DIALECT_MIDI2;
Using c2nim I get the following Nim code:
type
  clap_note_dialect* = enum
    CLAP_NOTE_DIALECT_CLAP = 1 shl 0,
    CLAP_NOTE_DIALECT_MIDI = 1 shl 1,
    CLAP_NOTE_DIALECT_MIDI_MPE = 1 shl 2,
    CLAP_NOTE_DIALECT_MIDI2 = 1 shl 3
  clap_note_port_info* {.bycopy.} = object
    ...
    supported_dialects*: uint32 ## bitfield, see clap_note_dialect

# implementation:
info.supported_dialects = CLAP_NOTE_DIALECT_CLAP or CLAP_NOTE_DIALECT_MIDI_MPE or
    CLAP_NOTE_DIALECT_MIDI2
When compiling I get a mismatch error with the message "expression 'CLAP_NOTE_DIALECT_CLAP' is of type: clap_note_dialect".
How can I let Nim know that my enum should be uint32 values?
Note that you may also use an enum set in Nim when you have to wrap C enums that are used as ORed bits. I did that in the GTK wrapper. You can find an example at the end of the "Sets" section here: https://ssalewski.de/nimprogramming.html#_sets
But some care is necessary, so for plain and ugly wrappers, or for inexperienced people, using distinct ints may be another solution.
This fix came from user Vindaar on the Nim #main Discord channel:
"in order to or the enum values you'll want to wrap them in an ord, so:"
info.supported_dialects = ord(CLAP_NOTE_DIALECT_CLAP) or ord(CLAP_NOTE_DIALECT_MIDI_MPE) or ord(CLAP_NOTE_DIALECT_MIDI2)

Setting a bitmask enum declared in Objective-C from Swift

I'm trying to use the Parse SDK for iOS in my new project. It has a view controller with an enum property:
typedef enum {
    PFLogInFieldsNone = 0,
    PFLogInFieldsUsernameAndPassword = 1 << 0,
    PFLogInFieldsPasswordForgotten = 1 << 1,
    PFLogInFieldsLogInButton = 1 << 2,
    PFLogInFieldsFacebook = 1 << 3,
    PFLogInFieldsTwitter = 1 << 4,
    PFLogInFieldsSignUpButton = 1 << 5,
    PFLogInFieldsDismissButton = 1 << 6,
    PFLogInFieldsDefault = PFLogInFieldsUsernameAndPassword | PFLogInFieldsLogInButton | PFLogInFieldsSignUpButton | PFLogInFieldsPasswordForgotten | PFLogInFieldsDismissButton
} PFLogInFields;
According to the tutorial, in Objective-C I should set it this way:
[logInViewController setFields: PFLogInFieldsTwitter | PFLogInFieldsFacebook | PFLogInFieldsDismissButton];
I'm trying to do it this way (using Swift):
loginViewController.fields = PFLogInFieldsTwitter | PFLogInFieldsFacebook | PFLogInFieldsDismissButton
But I get the error "'PFLogInFields' is not convertible to 'Bool'".
So, what is the correct way to set such kind of properties?
Consecutive enums in Objective-C should be refactored to use NS_ENUM, and bit-field enums should be refactored to use NS_OPTIONS.
You should change
typedef enum {
    //...
} PFLogInFields;
to
typedef NS_OPTIONS(NSInteger, PFLogInFields) {
    //...
};
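For concreteness, here is a sketch of what that refactor could look like using the constants from the question (this is not Parse's actual header, just the same values rewritten with NS_OPTIONS):

typedef NS_OPTIONS(NSInteger, PFLogInFields) {
    PFLogInFieldsNone = 0,
    PFLogInFieldsUsernameAndPassword = 1 << 0,
    PFLogInFieldsPasswordForgotten = 1 << 1,
    PFLogInFieldsLogInButton = 1 << 2,
    PFLogInFieldsFacebook = 1 << 3,
    PFLogInFieldsTwitter = 1 << 4,
    PFLogInFieldsSignUpButton = 1 << 5,
    PFLogInFieldsDismissButton = 1 << 6,
    PFLogInFieldsDefault = PFLogInFieldsUsernameAndPassword | PFLogInFieldsLogInButton | PFLogInFieldsSignUpButton | PFLogInFieldsPasswordForgotten | PFLogInFieldsDismissButton
};

Declared this way, Swift imports the type as an option-set type rather than a plain C enum, which is what makes combining flags from Swift straightforward.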
I had the same problem as you. See this answer for how to set PFLogInFields in Swift. It worked for me!!
In Swift you have to prefix the enums with the Type. I am not sure if this works automatically with Objective-C imports, but it might:
logInViewController.fields = PFLogInFields.PFLogInFieldsTwitter | ...
If the library were ported following Swift conventions, the fields property would already expect PFLogInFields and the enum cases would be defined so that you could write
logInViewController.fields = .Twitter | .Facebook ...

Large number of bitwise enums

I've got a question regarding bitwise enums that I just can't seem to resolve. I've got a number of flags that are represented by a bitwise enum as in the following example:
enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3
};
typedef NSUInteger MyEnum;
All is fine with the above example. Based on my research and various helpful posts on Stack Overflow (this one, for example), I've concluded that the above example gives me 32 options (or shifts, if you will), each option representing 1 bit in a 32-bit series, which tells me I can go all the way to EnumThirtyTwo = 1 << 31.
My question is this:
Suppose I've more than 32, say 75 flags for example, to represent using a bitwise enum. How would that best be represented?
enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3,
    ...
    ...
    EnumSeventyFive = 1 << 75
};
typedef NSUInteger MyEnum;
Would it be a simple matter of changing the declaration of my enum type, say, to: typedef long int MyEnum; or typedef long MyEnum?
You can use a few simple macros/functions and a struct containing a char array of sufficient size; this gives you call-by-value semantics, i.e. just like real enums. E.g. something along the lines of (typed directly into the answer):
typedef struct
{
    char bits[10]; // enough for 80 bits...
} SeventyFiveFlags;

typedef enum
{
    EnumOne = 0,
    ...
    EnumSeventyFive = 74
} SeventyFiveFlagNames;

NS_INLINE BOOL testFlag(SeventyFiveFlags flags, SeventyFiveFlagNames bit)
{
    return (flags.bits[bit >> 3] & (1 << (bit & 0x7))) != 0;
}
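For completeness, here is a hedged sketch of a matching setter in the same call-by-value style (not part of the original answer; it assumes the SeventyFiveFlags struct and bit numbering above):

// Returns a copy of `flags` with the given bit set.
NS_INLINE SeventyFiveFlags setFlag(SeventyFiveFlags flags, SeventyFiveFlagNames bit)
{
    flags.bits[bit >> 3] |= (1 << (bit & 0x7)); // byte index = bit / 8, bit within byte = bit % 8
    return flags;
}

// Usage:
// SeventyFiveFlags f = { {0} };          // all bits cleared
// f = setFlag(f, EnumOne);
// BOOL isSet = testFlag(f, EnumOne);     // YES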
However you can also use the bitstring(3) functions/macros if you are OK with call-by-reference semantics. These create (heap or stack) bit strings of any length. Use your enum to provide symbolic names for the bit numbers rather than masks, e.g.:
#include <bitstring.h>

typedef enum
{
    EnumOne = 0,
    ...
    EnumSeventyFive = 74,
    SeventyFiveFlagsSize = 75
} SeventyFiveFlagNames;

typedef bitstr_t *SeventyFiveFlags;

// local (stack) declaration & use
SeventyFiveFlags seventyFive;
bit_decl(seventyFive, SeventyFiveFlagsSize);       // declare
bit_nclear(seventyFive, EnumOne, EnumSeventyFive); // set all false
if( bit_test(seventyFive, EnumFortyTwo) )          // test
You can always wrap this up as a class if heap allocation only is OK.
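That last suggestion could look something like the sketch below (not from the original answer; the class and method names are made up, it assumes ARC, and it relies on the bitstring(3) API and the SeventyFiveFlagNames enum shown above):

#include <stdlib.h>
#include <bitstring.h>
#import <Foundation/Foundation.h>

@interface SeventyFiveFlagSet : NSObject
- (BOOL)testFlag:(SeventyFiveFlagNames)flag;
- (void)setFlag:(SeventyFiveFlagNames)flag;
@end

@implementation SeventyFiveFlagSet {
    bitstr_t *_bits; // heap-allocated bit string
}

- (instancetype)init {
    if ((self = [super init])) {
        _bits = bit_alloc(SeventyFiveFlagsSize); // calloc'd, so all bits start cleared
    }
    return self;
}

- (void)dealloc {
    free(_bits);
}

- (BOOL)testFlag:(SeventyFiveFlagNames)flag {
    return bit_test(_bits, flag) != 0;
}

- (void)setFlag:(SeventyFiveFlagNames)flag {
    bit_set(_bits, flag);
}

@end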
Maybe I am talking about something irrelevant, but I think having that many flags in one enum is not good practice. With such a large number of flags, there must be ways to group them, for example:
enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3
};
typedef NSUInteger widthRelated;

enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3
};
typedef NSUInteger heightRelated;

Enum values: NSInteger or int?

tl;dr Version
How are the data types of an enum's constants guaranteed to be NSUInteger instead of unsigned int when declaring an enum thusly:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The typedef to NSUInteger does not appear to be tied to the enum declaration in any way.
Full Version
I was reading through Apple's 64-Bit Transition Guide for Cocoa for some guidance on enum values and I came away with a question. Here's a (lengthy) quote from the Enumeration Constants section, emphasis mine:
A problem with enumeration (enum) constants is that their data types are frequently indeterminate. In other words, enum constants are not predictably unsigned int. With conventionally constructed enumerations, the compiler actually sets the underlying type based on what it finds. The underlying type can be (signed) int or even long. Take the following example:
typedef enum {
    MyFlagError = -1,
    MyFlagLow = 0,
    MyFlagMiddle = 1,
    MyFlagHigh = 2
} MyFlagType;
The compiler looks at this declaration and, finding a negative value assigned to one of the member constants, declares the underlying type of the enumeration int. If the range of values for the members does not fit into an int or unsigned int, then the base type silently becomes 64-bit (long). The base type of quantities defined as enumerations can thus silently change size to accord with the values in the enumeration. This can happen whether you're compiling 32-bit or 64-bit. Needless to say, this situation presents obstacles for binary compatibility.
As a remedy for this problem, Apple has decided to be more explicit about the enumeration type in the Cocoa API. Instead of declaring arguments in terms of the enumeration, the header files now separately declare a type for the enumeration whose size can be specified. The members of the enumeration and their values are declared and assigned as before. For example, instead of this:
typedef enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
} NSCellType;
there is now this:
enum {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
typedef NSUInteger NSCellType;
The enumeration type is defined in terms of NSInteger or NSUInteger to make the base enumeration type 64-bit capable on 64-bit architectures.
My question is this: given that the typedef doesn't appear to be tied explicitly to the enum declaration, how does one know if their data types are unsigned int or NSUInteger?
There is now NS_ENUM, starting with Xcode 4.5:
typedef NS_ENUM(NSUInteger, NSCellType) {
    NSNullCellType = 0,
    NSTextCellType = 1,
    NSImageCellType = 2
};
And you can consider NS_OPTIONS if you work with binary flags:
typedef NS_OPTIONS(NSUInteger, MyCellFlag) {
    MyTextCellFlag = 1 << 0,
    MyImageCellFlag = 1 << 1,
};
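As a small usage sketch (not from the original answer), values of an NS_OPTIONS type are then combined and tested with the bitwise operators:

MyCellFlag flags = MyTextCellFlag | MyImageCellFlag;
if (flags & MyImageCellFlag) {
    // the image flag is set
}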
I ran a test on the simulator; the intention was to check the size of different integer types. For that, the result of sizeof was printed to the console. I tested these enum values:
typedef enum {
    TLEnumCero = 0,
    TLEnumOne = 1,
    TLEnumTwo = 2
} TLEnum;

typedef enum {
    TLEnumNegativeMinusOne = -1,
    TLEnumNegativeCero = 0,
    TLEnumNegativeOne = 1,
    TLEnumNegativeTwo = 2
} TLEnumNegative;

typedef NS_ENUM(NSUInteger, TLUIntegerEnum) {
    TLUIntegerEnumZero = 0,
    TLUIntegerEnumOne = 1,
    TLUIntegerEnumTwo = 2
};

typedef NS_ENUM(NSInteger, TLIntegerEnum) {
    TLIntegerEnumMinusOne = -1,
    TLIntegerEnumZero = 0,
    TLIntegerEnumOne = 1,
    TLIntegerEnumTwo = 2
};
Test Code:
NSLog(#"sizeof enum: %ld", sizeof(TLEnum));
NSLog(#"sizeof enum negative: %ld", sizeof(TLEnumNegative));
NSLog(#"sizeof enum NSUInteger: %ld", sizeof(TLUIntegerEnum));
NSLog(#"sizeof enum NSInteger: %ld", sizeof(TLIntegerEnum));
Result for iPhone Retina (4-inch) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 4
sizeof enum NSInteger: 4
Result for iPhone Retina (4-inch 64 bit) Simulator:
sizeof enum: 4
sizeof enum negative: 4
sizeof enum NSUInteger: 8
sizeof enum NSInteger: 8
Conclusion
A generic enum can be int or unsigned int, which is 4 bytes on both 32-bit and 64-bit.
As we already know, NSUInteger and NSInteger are 4 bytes on a 32-bit compile and 8 bytes on a 64-bit compile for iOS.
These are two separate declarations. The typedef guarantees that, when you use that type, you always get an NSUInteger.
The problem with an enum is not that it's not large enough to hold the value. In fact, the only guarantee you get for an enum is that sizeof(enum Foo) is large enough to hold whatever values you've currently defined in that enum. But its size may change if you add another constant. That's why Apple does the separate typedef, to maintain binary stability of the API.
The data types of the enum's constants are not guaranteed to be NSUInteger, but they are guaranteed to be cast to NSUInteger every time you use them through NSCellType.
In other words, the declaration decrees that although the enum's values would currently fit into an unsigned int, the storage reserved for them when accessed through NSCellType should be an NSUInteger.
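To make that concrete, here is a small sketch (MyCellType is a made-up stand-in for the NSCellType pattern above, so the example does not collide with the real AppKit type):

#import <Foundation/Foundation.h>

enum {
    MyNullCellType = 0,
    MyTextCellType = 1,
    MyImageCellType = 2
};
typedef NSUInteger MyCellType;

int main(void) {
    @autoreleasepool {
        MyCellType cellType = MyTextCellType;
        // The bare constant fits in an int, but the variable is an NSUInteger,
        // so on a 64-bit build the two sizes differ (typically 4 vs 8).
        NSLog(@"sizeof constant: %zu", sizeof(MyTextCellType));
        NSLog(@"sizeof variable: %zu", sizeof(cellType));
    }
    return 0;
}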

Why are enums with negative values causing problems in Objective-C/C?

For various implementation reasons, I've defined the following enum:
typedef enum HBSnakeMovementDirection
{
    HBSnakeMovementDirectionUp = 1,
    HBSnakeMovementDirectionDown = -1,
    HBSnakeMovementDirectionRight = 2,
    HBSnakeMovementDirectionLeft = -2
}
HBSnakeMovementDirection;
However, if I try to use HBSnakeMovementDirectionRight, I get the following warning:
Implicit conversion changes signedness: 'int' to 'HBSnakeMovementDirection'
It has no problem with any of the other enum values. What's the problem here? I thought it might have to do with mixing negative and positive enum values, but I can't find anything definitive about this.
(I was able to come up with all positive enum values that allow me to work around this issue, but it still stumped me, so I thought I'd ask about it.)
I should state that, as with all my projects, I enable almost every warning—hence, -Wconversion's complaints—and treat them as errors. (I like to be as strict as possible at compile time.) I'm using LLVM 1.6.
UPDATE 1: Literally any use of HBSnakeMovementDirectionRight results in the preceding warning:
HBSnakeMovementDirection movementDirectionRight = HBSnakeMovementDirectionRight;
I have to cast HBSnakeMovementDirectionRight to HBSnakeMovementDirection to silence the warning.
UPDATE 2: As requested, here is the entire build command that's being issued on my machine:
http://pastie.org/1580957
UPDATE 3: Here is the exact project I'm working on hosted on GitHub:
https://github.com/LucasTizma/Hebi
Specifically, the following tree:
https://github.com/LucasTizma/Hebi/tree/89262e2e53881584daf029e3dd5f1e99dfbd6f96
As Darren said, it does look like a compiler bug, and Dave said it doesn’t happen with Clang 2.0.
I’ve found that the following type definition makes the OP code compile with Clang 1.6:
typedef enum HBSnakeMovementDirection
{
    HBSnakeMovementDirectionUp = 1, // Default movement direction upon initialization via -init
    HBSnakeMovementDirectionDown = -1,
    HBSnakeMovementDirectionLeft = -2,
    HBSnakeMovementDirectionRight = 2,
    NBSnakeMovementDirectionNone = -3
}
HBSnakeMovementDirection;
(note the additional NBSnakeMovementDirectionNone)
This could be related to LLVM bug 1884, which has been fixed.
I can reproduce this. It certainly looks like a compiler bug to me. The presence of negative values in the enum causes the compiler to mistakenly think that the value of "2" is outside of the range of the enum, hence the warning.
The behavior is the same whether you specify "2" or "HBSnakeMovementDirectionRight": It accepts 1 and rejects 2.
Edit: I tested this in an existing iPhone project, setting the compiler LLVM 1.6 and setting the -Wconversion flag.
typedef enum HBSnakeMovementDirection
{
    neg1 = -1,
    pos1 = 1,
    pos2 = 2,
} HBSnakeMovementDirection;

HBSnakeMovementDirection d = -3; // Warning: Can't convert int to HBSnakeMovementDirection
HBSnakeMovementDirection d = -2; // OK
HBSnakeMovementDirection d = -1; // OK
HBSnakeMovementDirection d = 0; // OK
HBSnakeMovementDirection d = 1; // OK
HBSnakeMovementDirection d = 2; // Warning: Can't convert int to HBSnakeMovementDirection
HBSnakeMovementDirection d = pos2; // Warning: Can't convert int to HBSnakeMovementDirection
Definitely looks like a compiler bug. I opened the project in Xcode 3 and compiled, and got the error. When I opened the project in Xcode 4 and used the clang2.0 compiler, I got no warnings.