I need to write a method like this:
-(void)doStuff:(int)options;
Based on a typedef enum like this:
typedef enum
{
    FirstOption,
    SecondOption,
    ThirdOption
} MyOptions;
What do I need to do in order to be able to call the method in this way (i.e., calling the method with more than one option "enabled")?
[self doStuff:(FirstOption | ThirdOption)];
Do I need to set up the typedef enum differently? And how do I check the options received in the method, with a simple if (options == ...)?
What you're asking for is a bit mask. Define your options like this:
typedef enum
{
    FirstOption  = 0x01,  // first bit
    SecondOption = 0x02,  // second bit
    ThirdOption  = 0x04,  // third bit
    // the next option would be 0x08, and so on
} MyOptions;
Combine the options like you suggested, with |, and test for them with & (options & SecondOption).
- (void) doStuff: (MyOptions) options
{
    if ( options & FirstOption )
    {
        // do something fancy
    }

    if ( options & SecondOption )
    {
        // do something awesome
    }

    if ( ( options & SecondOption ) && ( options & ThirdOption ) )
    {
        // do something sublime
    }
}
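With the options defined as distinct bits, the call from your question works exactly as written, since each option occupies its own bit:

[self doStuff:(FirstOption | ThirdOption)];  // options == 0x01 | 0x04 == 0x05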
I'm no expert at Obj-C, but in other languages I would use flags. If each value is a power of two, then it will have only one bit set for that value. You can use that bit as a bool for a particular option.
FirstOption  = 1
SecondOption = 2
ThirdOption  = 4

FirstAndThird = FirstOption | ThirdOption;  // binary OR sets the bits

// binary AND checks the bits (note the parentheses: == binds more
// tightly than &, so the masking must be parenthesized)
IsFirst  = (FirstAndThird & FirstOption)  == FirstOption;   // is true
IsSecond = (FirstAndThird & SecondOption) == SecondOption;  // is false
IsThird  = (FirstAndThird & ThirdOption)  == ThirdOption;   // is true
This question may also be of use.
In a qmk_firmware configuration, I can map a single modifier key to a new keymap layer using MO(x).
How can I do this such that two modifier keys must be pressed simultaneously?
Example: I would like to map [Fn][Ctrl]-[d] to generate a Unicode delta, [Fn][Ctrl]-[e] to generate a Unicode epsilon, etc.
You have to put your specific code into the custom Quantum function process_record_user:
bool process_record_user(uint16_t keycode, keyrecord_t *record) {
    // enter your code for [Fn][Ctrl]-[α] here
    return true;
}
You can try the Quantum tab in the Firmware Builder. Even though it is end-of-life, you'll see what is meant.
You can also set UNICODEMAP_ENABLE = yes and use the Unicode Map.
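For the Unicode Map route, a minimal sketch might look like this (the U_DELTA/U_EPSILON names are hypothetical):

enum unicode_names { U_DELTA, U_EPSILON };

const uint32_t unicode_map[] PROGMEM = {
    [U_DELTA]   = 0x03B4,  // δ
    [U_EPSILON] = 0x03B5,  // ε
};

// Then place X(U_DELTA) and X(U_EPSILON) on the desired layer of the keymap.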
Something like this should do (half) the trick:
In function layer_state_set_user, check for layer Fn (some bit arithmetic similar to update_tri_layer_state) and the Ctrl state (get_mods() & MOD_MASK_CTRL). When in Fn layer and Ctrl is down, deactivate Ctrl (del_mods(MOD_MASK_CTRL)) and return state | x; otherwise return state & ~x.
Unfortunately, this will probably require Ctrl to be pressed before Fn.
At least that's what happens in my similar shift + layer _L4 => layer _S_L4 implementation in my keymap. It requires shift to be pressed before LT(_L4,…).
For the other direction (_L4 + shift => _S_L4), I have MO(_S_L4) on the shift keys in layer _L4, but that is somehow disabled by my layer_state_set_user.
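A minimal sketch of that idea, assuming hypothetical layer numbers _FN and _FN_CTRL (the latter playing the role of x above):

layer_state_t layer_state_set_user(layer_state_t state) {
    // When the Fn layer is active and Ctrl is held, trade the Ctrl mod
    // for the combined layer; otherwise make sure that layer is off.
    if (IS_LAYER_ON_STATE(state, _FN) && (get_mods() & MOD_MASK_CTRL)) {
        del_mods(MOD_MASK_CTRL);
        return state | ((layer_state_t)1 << _FN_CTRL);
    }
    return state & ~((layer_state_t)1 << _FN_CTRL);
}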
What should "Fn + E" and "Fn + D" do (when Ctrl is not held)? For the sake of example, I'll assume you want them to do PgUp and PgDn.
Here is how I would go about implementing this:
Enable Unicode input: In rules.mk, add UNICODEMAP_ENABLE = yes. In config.h, add #define UNICODE_SELECTED_MODES UC_WINC if you are on Windows. See the Unicode documentation for other OSs and options.
Define an "Fn" layer in the keymap.
Define a couple custom keycodes, and place them in the Fn layer at the d and e positions.
Handle the custom keys in process_record_user(), using get_mods() to test whether Ctrl is held. See also macros that respond to mods. Then use send_unicode_string() to type the Unicode symbol itself.
Code sketch:
// Copyright 2022 Google LLC.
// SPDX-License-Identifier: Apache-2.0

enum layers { BASE, FN, /* Other layers... */ };

enum custom_keycodes {
    UPDIR = SAFE_RANGE,
    FN_E,
    FN_D,
    // Other custom keys...
};

const uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] PROGMEM = {
    [BASE] = LAYOUT(...),
    [FN] = LAYOUT(..., FN_E, ..., FN_D, ...),
    ...
};

const uint32_t unicode_map[] PROGMEM = {};

static void symbol_key(uint16_t* registered_key,
                       keyrecord_t* record,
                       uint16_t default_keycode,
                       const char* symbol,
                       const char* uppercase_symbol) {
    if (record->event.pressed) {  // On key press.
        const uint8_t mods = get_mods();
        const bool ctrl_held = (mods & MOD_MASK_CTRL) != 0;
        const bool shifted = (mods & MOD_MASK_SHIFT) != 0;
        if (ctrl_held) {  // Is the Ctrl mod held?
            unregister_mods(MOD_MASK_CTRL);  // Clear the Ctrl mod.
            // Type the Unicode symbol according to whether Shift is active.
            send_unicode_string(shifted ? uppercase_symbol : symbol);
            *registered_key = KC_NO;
        } else {
            *registered_key = default_keycode;
            register_code16(*registered_key);
        }
    } else {  // On key release.
        unregister_code16(*registered_key);
        *registered_key = KC_NO;
    }
}

bool process_record_user(uint16_t keycode, keyrecord_t* record) {
    switch (keycode) {
        case FN_E: {
            static uint16_t registered_key = KC_NO;
            symbol_key(&registered_key, record, KC_PGUP, "ε", "Ε");
        } return false;

        case FN_D: {
            static uint16_t registered_key = KC_NO;
            symbol_key(&registered_key, record, KC_PGDN, "δ", "Δ");
        } return false;
    }
    return true;
}
OK. I looked at the various suggested answers while I typed out the question, and I don't see the answer listed.
This seems such a basic, fundamental issue that I MUST be doing something wrong, but I am being driven crazy (actually, not much of a "drive." More of a short putt) trying to figure out what I am getting wrong.
I have created a test project that can be downloaded here (small project).
In any case, I encountered this, when working on importing a large Objective-C SDK that I wrote into Swift.
Everything works great. Until... I try to do comparisons on enums declared in C.
Obviously, C enums are not turned into Swift enums, but I am having trouble figuring out how to use them.
Here's a sample of what you will see in the test project:
I have a couple of C files that declare enums:
#import <Foundation/Foundation.h>

typedef enum
{
    StandardEnumType_Undef  = 0xFF,
    StandardEnumType_Value0 = 0x00,
    StandardEnumType_Value1 = 0x01,
    StandardEnumType_Value2 = 0x02
} StandardEnumType;

typedef NS_ENUM ( unsigned char, AppleEnumType )
{
    AppleEnumType_Undef  = 0xFF,
    AppleEnumType_Value0 = 0x00,
    AppleEnumType_Value1 = 0x01,
    AppleEnumType_Value2 = 0x02
};

StandardEnumType translateIntToEnumValue ( int inIntValue );
int translateEnumValueToInt ( StandardEnumType inValue );
AppleEnumType translateIntToAppleEnumValue ( int inIntValue );
int translateAppleEnumValueToInt ( AppleEnumType inValue );
The functions mentioned do pretty much what it says on the tin. I won't include them.
I did the bridging header and all that.
I am trying to use them in the initial load of the Swift app:
func application(application: UIApplication!, didFinishLaunchingWithOptions launchOptions: NSDictionary!) -> Bool
{
    let testStandardConst: StandardEnumType = translateIntToEnumValue ( 0 )

    // Nope.
    // if ( testStandardConst.toRaw() == 0 )
    // {
    // }

    // This don't work.
    // if ( testStandardConst == StandardEnumType_Value0 )
    // {
    // }

    // Neither does this.
    // if ( testStandardConst == StandardEnumType_Value0.toRaw() )
    // {
    // }

    // Not this one, either.
    // if ( testStandardConst.toRaw() == StandardEnumType_Value0 )
    // {
    // }

    let testAppleConst: AppleEnumType = translateIntToAppleEnumValue ( 0 )

    // This don't work.
    // if ( testAppleConst == AppleEnumType.AppleEnumType_Value0 )
    // {
    // }

    // Neither does this.
    // if ( testAppleConst == AppleEnumType.AppleEnumType_Value0.toRaw() )
    // {
    // }

    // Nor this.
    // if ( testAppleConst == .AppleEnumType_Value0 )
    // {
    // }

    return true
}
I can't seem to get the $##!! enums to compare to either ints or anything else.
I am often humbled by the stupid errors I make, and sometimes need them pointed out to me, so please humble me.
Thanks!
When working with typedef NS_ENUM enums in Swift, you should omit the enum name in comparisons and assignments. Here's how you can compare it:
if ( testAppleConst == .Value0 ) { ... }
You can read more about this in the Apple docs for Swift here.
There is a global enum defined in Objective-C:
typedef enum {
    UMSocialSnsTypeNone = 0,
    UMSocialSnsTypeQzone = 10,
    UMSocialSnsTypeSina = 11, // Sina Weibo
} UMSocialSnsType;
This code sets the sharetype of a platform:
snsPlatform.shareToType = UMSocialSnsTypeDouban;
In Swift, I want to get the sharetype of the platform:
var snstype = snsPlatform!.shareToType
println(snstype)
Result: UMSocialSnsType (has 1 child)
snstype.toRaw()
Error: UMSocialSnsType does not have a member named "toRaw"
From what I can tell, UMSocialSNSType was declared in Objective-C without using the NS_ENUM macro, so it wasn't imported as a Swift enum. That means that instead of being able to use .toRaw() or UMSocialSNSType.Douban you have to use the different enumeration values as constant structs. Unfortunately the type also doesn't have the appropriate operators (== or ~=) set up, so you have to compare the value property.
var snstype = snsPlatform!.shareToType
switch snstype.value {
case UMSocialSnsTypeDouban.value:
    println("douban")
case UMSocialSnsTypeEmail.value:
    println("email")
default:
    println("other")
}

if snstype.value == UMSocialSnsTypeDouban.value {
    println("douban")
}
The good news is that it looks like all the constants autocomplete in Xcode, so you should be able to find the comparisons you need that way.
It looks like the Swift version of the bridged typedef...enum must be something like:
struct UMSocialSnsType {
    var value: Int
    init(_ val: Int) {
        value = val
    }
}

let UMSocialSnsTypeNone  = UMSocialSnsType(0)
let UMSocialSnsTypeQzone = UMSocialSnsType(10)
let UMSocialSnsTypeSina  = UMSocialSnsType(11)
// etc.
Whereas if it had been declared in Objective-C with the NS_ENUM macro, it would look like:
enum UMSocialSnsType: Int {
    case UMSocialSnsTypeNone = 0
    case UMSocialSnsTypeQzone = 10, UMSocialSnsTypeSina // etc.
}
I am trying to implement the following typedef
typedef NS_OPTIONS (NSInteger, MyCellCorners) {
    MyCellCornerTopLeft,
    MyCellCornerTopRight,
    MyCellCornerBottomLeft,
    MyCellCornerBottomRight,
};
and correctly assign a value with
MyCellCorners cellCorners = (MyCellCornerTopLeft | MyCellCornerTopRight);
when drawing my cell, how can I check which of the options match so I can correctly draw it.
Use bit masking:
typedef NS_OPTIONS (NSInteger, MyCellCorners) {
    MyCellCornerTopLeft     = 1 << 0,
    MyCellCornerTopRight    = 1 << 1,
    MyCellCornerBottomLeft  = 1 << 2,
    MyCellCornerBottomRight = 1 << 3,
};

MyCellCorners cellCorners = MyCellCornerTopLeft | MyCellCornerTopRight;

if (cellCorners & MyCellCornerTopLeft) {
    // top left corner set
}

if (etc...) {
}
The correct way to check for this value is to first bitwise AND the values and then check for equality to the required value.
MyCellCorners cellCorners = MyCellCornerTopLeft | MyCellCornerTopRight;
if ((cellCorners & MyCellCornerTopLeft) == MyCellCornerTopLeft) {
    // top left corner set
}
The following reference explains why this is correct and provides other insights into enumerated types.
Reference: checking-for-a-value-in-a-bit-mask
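The difference matters once the mask has more than one bit. A small illustration (values as defined in the answer above):

MyCellCorners corners = MyCellCornerTopLeft;                        // 0b0001
MyCellCorners both    = MyCellCornerTopLeft | MyCellCornerTopRight; // 0b0011

// Truthiness test: passes if ANY bit of the mask is set.
if (corners & both) { /* entered, even though only one corner is set */ }

// Equality test: passes only if ALL bits of the mask are set.
if ((corners & both) == both) { /* not entered */ }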
I agree with NSWill. I recently had a similar issue with a wrong comparison.
The right if statement should be:
if ((cellCorners & MyCellCornerTopLeft) == MyCellCornerTopLeft){
I've got 5 states in my app, and I use BOOL flags to mark them. But it isn't straightforward, because I have to write 5 lines to change all flags when I want to change state.
Can you write some ideas or simple code to solve this problem?
code:
//need to choose second state
flag1 = false;
flag2 = true;
flag3 = false;
flag4 = false;
flag5 = false;
Also, it's bad because I can accidentally select two states at the same time.
P.S.
I found a modern, more Apple-like way. Answer below.
Use typedef enum to define all possible states using bitmasks.
Note this will give you a maximum of 64 different state flags (on most platforms). If you need more possible states, this solution will not work.
Handling this scheme will require you to fully understand and safely handle boolean algebra.
// define all possible states
typedef enum
{
    stateOne   = 1 << 0,  // = 1
    stateTwo   = 1 << 1,  // = 2
    stateThree = 1 << 2,  // = 4
    stateFour  = 1 << 3,  // = 8
    stateFive  = 1 << 4   // = 16
} FiveStateMask;

// declare a state
FiveStateMask state;

// select a single state
state = stateOne; // = 1

// select a mixture of two states
state = stateTwo | stateFive; // 2 | 16 = 18

// add a state
state |= stateOne; // 18 | 1 = 19

// remove stateTwo from our state (if set)
if ((state & stateTwo) == stateTwo)
{
    state ^= stateTwo; // 19 ^ 2 = 17
}

// check for a single state (while others might also be selected)
if ((state & stateOne) == stateOne)
{
    // stateOne is selected, do something
}

// check for a combination of states (while others might also be selected);
// note the parentheses around (stateOne | stateTwo): == binds more tightly than |
if ((state & (stateOne | stateTwo)) == (stateOne | stateTwo))
{
    // stateOne and stateTwo are selected, do something
}

// the previous check is a lot nicer to read when using a mask (again)
FiveStateMask checkMask = stateOne | stateTwo;
if ((state & checkMask) == checkMask)
{
    // stateOne and stateTwo are selected, do something
}
You can always use a byte-sized (unsigned char) variable, using its bits as flags (each bit acts as one BOOL flag).
Good instructions for setting/clearing/toggling/checking a bit are here.
Of course you'd want to give these flags human-readable names, e.g.:
#define flag1 1
#define flag2 2
#define flag3 4
#define flag4 8
#define flag5 16
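A minimal sketch of the four basic operations on such a byte, using the defines above:

unsigned char flags = 0;

flags |= flag2;       // set: select the second state
flags &= ~flag2;      // clear: deselect it again
flags ^= flag3;       // toggle: flip the third state
if (flags & flag4) {  // check: is the fourth state set?
    // ...
}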
Nowadays we have another option for this: NS_ENUM.
typedef NS_ENUM(NSInteger, UITableViewCellStyle) {
    UITableViewCellStyleDefault,
    UITableViewCellStyleValue1,
    UITableViewCellStyleValue2,
    UITableViewCellStyleSubtitle
};
The first argument is the underlying type and the second is the name.
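A minimal sketch of how this replaces the five BOOL flags (the AppState name is hypothetical); since the variable holds exactly one value, selecting a state is a single assignment and two states can never be active at once:

typedef NS_ENUM(NSInteger, AppState) {
    AppStateFirst,
    AppStateSecond,
    AppStateThird,
    AppStateFourth,
    AppStateFifth
};

AppState state = AppStateSecond;  // one line instead of five flag writes

switch (state) {
    case AppStateSecond:
        // handle the second state
        break;
    default:
        break;
}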