Enums in Nim for a wrapper of a C library

I am trying to create a simple Nim wrapper around the Clever Audio Plugin (CLAP) C library.
In C there is an enum of format flags that can be activated using bitwise operations.
Summary of the C code:
// definitions
enum clap_note_dialect {
   CLAP_NOTE_DIALECT_CLAP = 1 << 0,
   CLAP_NOTE_DIALECT_MIDI = 1 << 1,
   CLAP_NOTE_DIALECT_MIDI_MPE = 1 << 2,
   CLAP_NOTE_DIALECT_MIDI2 = 1 << 3,
};

typedef struct clap_note_port_info {
   ...
   uint32_t supported_dialects; // bitfield, see clap_note_dialect
   ...
} clap_note_port_info_t;

// implementation
info->supported_dialects =
   CLAP_NOTE_DIALECT_CLAP | CLAP_NOTE_DIALECT_MIDI_MPE | CLAP_NOTE_DIALECT_MIDI2;
Using c2nim I get the following Nim code:
type
  clap_note_dialect* = enum
    CLAP_NOTE_DIALECT_CLAP = 1 shl 0,
    CLAP_NOTE_DIALECT_MIDI = 1 shl 1,
    CLAP_NOTE_DIALECT_MIDI_MPE = 1 shl 2,
    CLAP_NOTE_DIALECT_MIDI2 = 1 shl 3

  clap_note_port_info* {.bycopy.} = object
    ...
    supported_dialects*: uint32 ## bitfield, see clap_note_dialect

# implementation:
info.supported_dialects = CLAP_NOTE_DIALECT_CLAP or CLAP_NOTE_DIALECT_MIDI_MPE or
                          CLAP_NOTE_DIALECT_MIDI2
When compiling I get a type mismatch error with the message "expression 'CLAP_NOTE_DIALECT_CLAP' is of type: clap_note_dialect".
How can I let Nim know that my enum values should be treated as uint32?

Note that you may also use an enum set in Nim when you have to wrap C enums that are used as OR'ed bits. I did that in the GTK wrapper; you can find an example at the end of the "Sets" section here: https://ssalewski.de/nimprogramming.html#_sets
But some care is necessary, so for plain and ugly wrappers, or for inexperienced people, using distinct ints may be another solution.
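As a rough illustration of that set-based idea applied to the example above (a minimal sketch only, not the actual CLAP or GTK wrapper code; the NoteDialect names and the toBitfield helper are invented here, info is the clap_note_port_info object from the question, and it assumes the dialect flags fit in 32 bits):

type
  NoteDialect = enum        # bit positions 0..3, not masks
    ndClap, ndMidi, ndMidiMpe, ndMidi2
  NoteDialects = set[NoteDialect]

# Fold a Nim set into the uint32 bitfield layout the C API expects.
proc toBitfield(dialects: NoteDialects): uint32 =
  for d in dialects:
    result = result or (1'u32 shl ord(d))

info.supported_dialects = toBitfield({ndClap, ndMidiMpe, ndMidi2})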

This fix came from user Vindaar on the Nim #main Discord channel:
"in order to or the enum values you'll want to wrap them in an ord, so:"
info.supported_dialects = ord(CLAP_NOTE_DIALECT_CLAP) or ord(CLAP_NOTE_DIALECT_MIDI_MPE) or ord(CLAP_NOTE_DIALECT_MIDI2)
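If the compiler still complains about assigning the resulting int expression to the uint32 field, an explicit conversion in a small helper keeps call sites tidy. This is just a sketch under that assumption; the toDialectMask name is my own, not part of the CLAP headers:

# Hypothetical helper: fold any number of dialect flags into a uint32 mask.
proc toDialectMask(dialects: varargs[clap_note_dialect]): uint32 =
  for d in dialects:
    result = result or uint32(ord(d))

info.supported_dialects = toDialectMask(CLAP_NOTE_DIALECT_CLAP,
                                        CLAP_NOTE_DIALECT_MIDI_MPE,
                                        CLAP_NOTE_DIALECT_MIDI2)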

Related

Setting bitmask enum declared in Objective-C from Swift

I'm trying to use the Parse SDK for iOS in my new project. It has a view controller with an enum property:
typedef enum {
    PFLogInFieldsNone = 0,
    PFLogInFieldsUsernameAndPassword = 1 << 0,
    PFLogInFieldsPasswordForgotten = 1 << 1,
    PFLogInFieldsLogInButton = 1 << 2,
    PFLogInFieldsFacebook = 1 << 3,
    PFLogInFieldsTwitter = 1 << 4,
    PFLogInFieldsSignUpButton = 1 << 5,
    PFLogInFieldsDismissButton = 1 << 6,
    PFLogInFieldsDefault = PFLogInFieldsUsernameAndPassword | PFLogInFieldsLogInButton | PFLogInFieldsSignUpButton | PFLogInFieldsPasswordForgotten | PFLogInFieldsDismissButton
} PFLogInFields;
According to the tutorial, in Objective-C I should set it this way:
[logInViewController setFields: PFLogInFieldsTwitter | PFLogInFieldsFacebook | PFLogInFieldsDismissButton];
I'm trying to do it this way (using Swift):
loginViewController.fields = PFLogInFieldsTwitter | PFLogInFieldsFacebook | PFLogInFieldsDismissButton
But I get the error: "'PFLogInFields' is not convertible to 'Bool'"
So, what is the correct way to set such kind of properties?
Consecutive enums in Objective-C should be refactored to use NS_ENUM, and bit-field enums should be refactored to use NS_OPTIONS.
You should change
typedef enum {
    //...
} PFLogInFields;
to
typedef NS_OPTIONS(NSInteger, PFLogInFields) {
    //...
};
I had the same problem as you. See this answer for how to set PFLogInFields in Swift. It worked for me!!
In Swift you have to prefix the enum values with the type name. I am not sure if this works automatically with Objective-C imports, but it might:
logInViewController.fields = PFLogInFields.PFLogInFieldsTwitter | ...
If the library were ported to Swift conventions, the fields property would already expect PFLogInFields and the enum cases would be defined in a manner so that you could write
logInViewController.fields = .Twitter | .Facebook ...

How are bitwise operators being used in this code?

I was looking at PSPDFkit sample code and saw this:
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes :
    @(PSPDFAnnotationTypeNone & ~PSPDFAnnotationTypeLink)
};
The constants PSPDFAnnotationTypeNone and PSPDFAnnotationTypeLink are defined below:
// Available keys for options. kPSPDFProcessorAnnotationDict in
// form of pageIndex -> annotations.
// ..
extern NSString *const kPSPDFProcessorAnnotationTypes;
// Annotations defined after the PDF standard.
typedef NS_OPTIONS(NSUInteger, PSPDFAnnotationType) {
    PSPDFAnnotationTypeNone = 0,
    PSPDFAnnotationTypeLink = 1 << 1,      // Links and multimedia extensions
    PSPDFAnnotationTypeHighlight = 1 << 2, // (Highlight, Underline, StrikeOut) -
    PSPDFAnnotationTypeText = 1 << 3,      // FreeText
    PSPDFAnnotationTypeInk = 1 << 4,
    PSPDFAnnotationTypeShape = 1 << 5,     // Square, Circle
    PSPDFAnnotationTypeLine = 1 << 6,
    PSPDFAnnotationTypeNote = 1 << 7,
    PSPDFAnnotationTypeStamp = 1 << 8,
    PSPDFAnnotationTypeRichMedia = 1 << 10, // Embedded PDF videos
    PSPDFAnnotationTypeScreen = 1 << 11,    // Embedded PDF videos
    PSPDFAnnotationTypeUndefined = 1 << 31, // any annotation whose type not recognized
    PSPDFAnnotationTypeAll = UINT_MAX
};
I understand that ~ is the bitwise not operator and & the bitwise and operator, but what is the purpose of their application in this code?
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes :
    @(PSPDFAnnotationTypeNone & ~PSPDFAnnotationTypeLink)
};
Based on comments below, the above could have been written simply as
NSDictionary *options = @{kPSPDFProcessorAnnotationTypes : @(PSPDFAnnotationTypeNone)};
Since it is the same as (0 & ~2) => 0. What's the point of adding the & ~PSPDFAnnotationTypeLink part?
"~" is the bitwise not-operator.
As "&" the bitwise and.
These are usually used for bitmask (like in your example) or other binary operations (as the name lets suggest). More info on wiki - Operators in C and C++.
They are in no relationship to literals.
First of all, I don't know obj-c, only C, but I guess the '&' is 'bitwise AND' and the '~' is bitwise NOT.
It's the bitwise NOT operator (same as many C-based languages), which inverts all bits in the underlying value.
So, for example, the eight bit value 0x57 (binary 0101 0111) becomes 1010 1000 or 0xa8.
See here for a more complete description of the various bitwise operators.
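For readers following along from the Nim question at the top of this thread, the same mask-clearing pattern translates directly; this is a minimal sketch with made-up constant values, not PSPDFKit's actual API:

const
  AnnotationNone = 0'u32        # hypothetical stand-ins for the PSPDF constants
  AnnotationLink = 1'u32 shl 1

# 'and not mask' clears the Link bit; starting from None (0) the result is still 0.
let options = AnnotationNone and not AnnotationLink
echo options   # prints 0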

Large number of bitwise enums

I've got a question regarding bitwise enums that I just can't seem to resolve. I've got a number of flags that are represented by a bitwise enum as in the following example:
enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3
};
typedef NSUInteger MyEnum;
All is fine with the above example. Based on my research and various helpful posts on Stack Overflow (this one, for example), I've concluded that, using the above example, I'm essentially given 32 options (or shifts, if you will), each option representing 1 bit in a 32-bit series of options, which basically tells me that I can go all the way to EnumThirtyTwo = 1 << 31.
My question is this:
Suppose I've more than 32, say 75 flags for example, to represent using a bitwise enum. How would that best be represented?
enum
{
    EnumNone = 0,
    EnumOne = 1 << 0,
    EnumTwo = 1 << 1,
    EnumThree = 1 << 2,
    EnumFour = 1 << 3,
    ...
    EnumSeventyFive = 1 << 75
};
typedef NSUInteger MyEnum;
Would it be a simple matter of changing the declaration of my enum type, say, to: typedef long int MyEnum; or typedef long MyEnum?
You can use a few simple macros/functions and a struct containing a char array of sufficient size, which gives you call-by-value semantics, i.e. just like real enums. E.g. something along the lines of (typed directly into the answer):
typedef struct
{
    char bits[10]; // enough for 80 bits...
} SeventyFiveFlags;

typedef enum
{
    EnumOne = 0,
    ...
    EnumSeventyFive = 74
} SeventyFiveFlagNames;

NS_INLINE BOOL testFlag(SeventyFiveFlags flags, SeventyFiveFlagNames bit)
{
    return (flags.bits[bit >> 3] & (1 << (bit & 0x7))) != 0;
}
However you can also use the bitstring(3) functions/macros if you are OK with call-by-reference semantics. These create (heap or stack) bit strings of any length. Use your enum to provide symbolic names for the bit numbers rather than masks, e.g.:
#include <bitstring.h>

typedef enum
{
    EnumOne = 0,
    ...
    EnumSeventyFive = 74,
    SeventyFiveFlagsSize = 75
} SeventyFiveFlagNames;

typedef bitstr_t *SeventyFiveFlags;

// local (stack) declaration & use
bitstr_t bit_decl(seventyFive, SeventyFiveFlagsSize);  // declare on the stack
bit_nclear(seventyFive, EnumOne, EnumSeventyFive);     // set all bits to false
if ( bit_test(seventyFive, EnumFortyTwo) )             // test a bit
You can always wrap this up as a class if heap allocation only is OK.
Maybe I am talking about something irrelevant.
I think having too many flags in one enum is not good practice. With such a large number of flags, there must be ways to group them up, like:
enum
{
    // enumerator names must differ between the groups, since C enum constants share one namespace
    WidthEnumNone = 0,
    WidthEnumOne = 1 << 0,
    WidthEnumTwo = 1 << 1,
    WidthEnumThree = 1 << 2,
    WidthEnumFour = 1 << 3
};
typedef NSUInteger widthRelated;

enum
{
    HeightEnumNone = 0,
    HeightEnumOne = 1 << 0,
    HeightEnumTwo = 1 << 1,
    HeightEnumThree = 1 << 2,
    HeightEnumFour = 1 << 3
};
typedef NSUInteger heightRelated;

Bit shifting coding efficiency (i.e. neat tricks)

I'm working on an Objective-C program where I'm getting bitfields over the network and need to set boolean variables based on those bits.
Currently I'm representing the bitfields as ints, and then using bit shifting, similar to this (all the self properties are BOOL):
typedef enum {
    deleteFlagIndex = 0,
    uploadFlagIndex = 1,
    moveFlagIndex = 2,
    renameFlagIndex = 3
} PrivilegeFlagIndex;

int userFlag = 5; // for example

// this would be YES
self.delete = ((userFlag & (1 << deleteFlagIndex)) == (1 << deleteFlagIndex));
// this would be NO
self.upload = ((userFlag & (1 << uploadFlagIndex)) == (1 << uploadFlagIndex));
And this works (to the best of my knowledge), but I'm wondering: is there a more concise way to code all the bit twiddling using a fancy trick/hack? I ask because I'll be doing this for a lot of flags (more than 30).
I did realize I could use this method as well:
self.move = ((userFlag & (1 << moveFlagIndex)) > 0)
...which does reduce the amount of typing, but I don't know if there's a good reason to not do it that way.
Edit: Revised to say concise rather than efficient - I wasn't worried about execution performance, but rather tips and best practices for doing this in a smart way.
Try:
typedef enum {
    deleteFlag = 1 << 0,
    uploadFlag = 1 << 1,
    moveFlag = 1 << 2,
    renameFlag = 1 << 3
} PrivilegeFlags;
Now you can combine them using | directly.
Usually, it suffices to check against 0:
if (userFlag & deleteFlag) {
    // delete...
}
You may want to try to use bitfields and let the compiler do the optimization itself.
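As a Nim-flavoured aside (keeping the examples in the language of the original question at the top of this thread), the analogous feature is the bitsize pragma, which asks the compiler to lay fields out as C bit fields. A minimal sketch, with field names invented for this example:

type
  Privileges = object
    # each field occupies a single bit in the generated C struct
    canDelete {.bitsize: 1.}: cuint
    canUpload {.bitsize: 1.}: cuint
    canMove   {.bitsize: 1.}: cuint
    canRename {.bitsize: 1.}: cuint

var p: Privileges
p.canUpload = 1
if p.canUpload == 1:
  echo "upload allowed"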

How to use enums with bit flags

I have an enum declaration using bit flags and I can't exactly figure out how to use it.
enum
{
    kWhite = 0,
    kBlue = 1 << 0,
    kRed = 1 << 1,
    kYellow = 1 << 2,
    kBrown = 1 << 3,
};
typedef char ColorType;
I suppose that to store multiple colors in one ColorType I should OR the bits together?
ColorType pinkColor = kWhite | kRed;
But suppose I want to check whether pinkColor contains kRed; how would I do this?
Anyone care to give me an example using the provided ColorType?
Yes, use bitwise OR (|) to set multiple flags:
ColorType pinkColor = kWhite | kRed;
Then use bitwise AND (&) to test if a flag is set:
if ( pinkColor & kRed )
{
    // do something
}
The result of & has a bit set only if that bit is set in both operands. Since the only bit set in kRed is bit 1, the result will be 0 if the other operand doesn't have that bit set too.
If you need to get whether a particular flag is set as a BOOL rather than just testing it in an if condition immediately, compare the result of the bitwise AND to the tested bit:
BOOL hasRed = ((pinkColor & kRed) == kRed);