My current understanding of enum is that I can use it to make constants that correspond to numbers. So,
typedef enum
{
number0 = 0,
number1 = 1,
.
.
.
} Numbers;
would allow me to refer to 0 as number0 in every part of the code. This seems to work fine. However, I'm having trouble figuring out how to use this in an Xcode Project. For example, say I write one class, NumberCounter, and include this code in the header file. Then, I write another class, numberCalculator. If I want to use the same definitions in the second class, do I have to A) write the classes in the same source file, or B) include the above code in every file I want to use the numbers?
If I include the code in one class and exclude it in the second, I get a "Parse Issue: Expected a type" error (when trying to have a function return something of type Numbers), but if I include the code in both, I get a "Redefinition of enumerator" error. Currently, my workaround is to include the code in every file, then use the preprocessor to make sure it is only processed once - i.e:
#ifndef NumberDef
#define NumberDef
typedef enum
{
number0 = 0,
number1 = 1,
.
.
.
} Numbers;
#endif
This works, but I feel like there should be a nice simple way of doing this. What am I missing here?
Include the header it's defined in. Put the typedef in its own header file and #import that header from every source file that needs the type; the definition then lives in exactly one place and there is nothing to redefine.
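For example, a minimal sketch reusing the names from the question (the Numbers.h file name is an assumption):
// Numbers.h -- the one and only definition of the enum
#ifndef NumberDef
#define NumberDef
typedef enum
{
    number0 = 0,
    number1 = 1
} Numbers;
#endif

// NumberCounter.h
#import "Numbers.h"
// ... methods here can take or return Numbers ...

// numberCalculator.h
#import "Numbers.h"
// ... the same type is visible here, with no redefinition, because the definition lives in one header ...
Since #import already guarantees a header is only pulled into a given file once, the #ifndef guard is belt and braces; it only matters if the header is ever pulled in with a plain C #include.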
I have an error that used to be declared like this in Objective-C:
NSString * const JKConfigurationErrorDomain;
typedef NS_ENUM(NSInteger, JKConfigurationCode) {
JKConfigurationCodeUnknown,
JKConfigurationCodeSomethingBad,
JKConfigurationCodeParsing,
};
Now, this is ugly to use in Swift. But since Swift 4, we can use NSErrorDomain and NS_ERROR_ENUM to make the imported error much nicer in Swift:
NSErrorDomain const JKConfigurationErrorDomain;
typedef NS_ERROR_ENUM(JKConfigurationErrorDomain, JKConfigurationCode) {
JKConfigurationCodeUnknown,
JKConfigurationErrorSomethingBad,
JKConfigurationErrorParsing,
};
This means I can now do stuff in Swift like this:
if let myError = error as? JKConfigurationError, myError.code == .somethingBad {
// handle it
}
instead of having to cast error to NSError, check its .domain, and then look at its .code, which is just an integer, etc.
So far, so good. But my library is called JKConfiguration and there is already a JKConfiguration object (the centerpiece of the library) in there, and as soon as I start using JKConfiguration anywhere in the library code I get an error:
'JKConfiguration' is ambiguous for type lookup in this context
I don't get it, why? What does NSErrorDomain or NS_ERROR_ENUM do such that the type lookup becomes ambiguous and how can I fix it?
What I tried already:
Use NS_SWIFT_NAME on the NS_ERROR_ENUM typedef to rename it to something else. Looking at the generated Swift header, the rename works, but it doesn't solve the issue.
Change the name of the error domain (and thus of the generated error type in Swift). This also seems to work according to the generated Swift header, but the issue still persists. Why is that?
The issue is not, as I initially thought, in the name of the error domain. Nor is it a problem with the library name. It's a problem with the error enum's name, which in the example above is JKConfigurationCode.
What the compiler does for the enum cases of an NS_ERROR_ENUM is twofold:
use the name of the enum and remove that prefix from all enum cases before importing them to Swift
create an enum with the given name to hold those cases; if the given name ends with Code, remove that suffix
So that last part is the issue. It means that NS_ERROR_ENUM(AnyDomainName, JKConfigurationCode) generates an enum in Swift to hold the error codes with the name JKConfiguration (without the Code suffix). But that type already exists in my example, which leads to the ambiguity.
So the solution is to change
NS_ERROR_ENUM(JKConfigurationErrorDomain, JKConfigurationCode)
to
NS_ERROR_ENUM(JKConfigurationErrorDomain, JKConfigurationSomethingCode)
Or similar.
Don’t forget to update all the prefixes of your enum cases though, as it seems the compiler won’t find them if the prefixes don’t match the enum name.
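For example, the renamed declaration might look like this (Something is just a placeholder; pick whatever fits your library):
typedef NS_ERROR_ENUM(JKConfigurationErrorDomain, JKConfigurationSomethingCode) {
    JKConfigurationSomethingCodeUnknown,
    JKConfigurationSomethingCodeSomethingBad,
    JKConfigurationSomethingCodeParsing,
};
Swift then generates a JKConfigurationSomething type to hold these codes, which no longer collides with the existing JKConfiguration class.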
Why doesn’t NS_SWIFT_NAME work to rename the enum?
Best I can tell, NS_SWIFT_NAME causes the type to be renamed but not the cases. This leads to an empty type (Swift chooses a struct in that case) being generated for the error code as Swift doesn’t seem to find the cases. And the original container for the enum cases still has the offending name.
In my game I have a header file that contains properties and functions for the seasons. These properties are all static and include a float representing the current season and another float representing the current point in the transition between seasons, which is zero when no transition is in progress.
Several functions throughout my game rely on the transition value (two at this point), and one of them works perfectly. In the other, though, it isn't working at all.
In the class responsible for controlling the background of my game, whenever the SeasonTransition variable is referenced it just comes up as zero. But in the other class, where the variable is referenced in exactly the same way, it comes up with the real value.
This is a picture taken at a breakpoint that was hit after the game had updated a few frames:
Once again, these variables are declared in a C header file:
#import "somestuff.h"
static float SeasonTransition;
etc...
This shouldn't be happening, right? How can I fix it?
EDIT:
The Season.h file is as follows:
//GL.h contains different functions and global variables to be used anywhere in the project.
//That file, like Season.h, is a single header file with static declarations, and is set up
//the same way. I have been developing it from the start of the project and haven't had any
//problems with it.
#import "GL.h"
static float currentSeason;
static float SeasonTransition;
static void UpdateSeason(){
    currentSeason += 0.0002f;
    float TransitionLength = 0.15f;
    float SeasonDepth = Clamp(currentSeason - floorf(currentSeason), 0, TransitionLength);
    float bigTL = TransitionLength / 4;
    float endTL = TransitionLength;
    float Speed2 = 0;
    float Speed1 = 1;
    float bRatio = SeasonDepth / bigTL;
    float eRatio = SeasonDepth / endTL;
    SeasonTransition = (SeasonDepth < TransitionLength) ?
        ((SeasonDepth < bigTL) ?
            (Speed1 * bRatio) + (Speed2 * (1.0f - bRatio)) :
            (Speed1 * (1.0f - eRatio)) + (Speed2 * eRatio))
        :
        Speed2;
}
If you put static float SeasonTransition; into two separate C files (or one header file included by two separate C files), each C file will have its own independent copy of the variable.
If one of those C files then modifies the variable, it will modify its copy. It will not touch the one in the other C file. That sounds like the situation you're in.
The normal way to do this is to define the variable in one and declare it external in the other, something like:
file1.c:
int myVar; // it exists here.
file2.c:
extern int myVar; // it exists, but elsewhere.
You don't want to mark it static in the first since that effectively makes it invisible to the second. And you mark it extern in the second so that it knows the variable exists elsewhere (in the first).
You would actually see the effect if it weren't static. When the linker came to link those two files together, it would complain about having two variables with the same name.
There are many variations on how to do that, I've shown the simplest. It's probably better to have something like:
file1.h:
extern int myVar; // so everyone knows about the variable
// just by including this.
file1.c:
#include "file1.h" // or import for ObjC.
int myVar; // the actual variable.
file2.c:
#include "file1.h" // now we know about it, in the OTHER C file.
I could be wrong, but I think the problem might be that you don't quite understand how include/import work. These are not really language features but preprocessor features. When you #include a file, you are saying: take the entire contents of that other file and paste it in here before compilation starts. So if you include the same header file in several other files, you end up with a separate copy of that static variable in each of them; without the static, you would get a linker error because you would have defined the same variable multiple times. #import works almost the same, except that if the preprocessor determines the file has already been included into the destination file (possibly indirectly, through another include), it will not include it again.

Once you see this, it becomes clear that declaring a static variable in a header is quite strange, because you end up with a separate version of that variable everywhere the header is included. Normally you either want the variable to be global, in which case you define it in a .c or .m file and declare it extern in the header, or you want the variable to be private, in which case you declare it static in the .c or .m file.
What static does is hide the variable from the linker, so the linker cannot recognise that the different definitions of the same name should be treated as one and the same variable.
#ifndef Season_h
#define Season_h
... your header stuff
#endif
Also, if you don't call UpdateSeason, SeasonTransition will be zero.
I have a .h file that defines a few hundred constants. Let's assume this to be one of them:
#define KDSomeItem 1
I know that the Objective-C runtime API can be used to retrieve a list of instance variable names: like detailed in this question: How do I list all fields of an object in Objective-C?
I also know that the getter [object valueForKey:theName] can be used to access the ivars as found in an earlier question of mine: How to access a property/variable using a String holding its name
My question is: can something similar be done with constants? That is, can I:
a) get a list of all constants programmatically
b) access a constant if I have a string holding its name
e.g. if I had a string like this: NSString *theName = @"KDSomeItem"; could I evaluate my constant using this string?
You would not be able to do this: unlike instance variables, #define-d constants do not leave a trace after the preprocessor is done with them. They leave no metadata behind - all instances of KDSomeItem in the body of your program will be replaced with 1 even before the Objective C compiler proper gets to analyze your code.
If you need the constant names to be available at run time, you would need to build all the necessary metadata yourself. In order to do that, you may want to look into "stringizing" operator of the preprocessor:
#define STR(X) #X
This macro can be applied to a preprocessor constant, and produce a C string literal with its name:
const char *nameOfKDSomeItem = STR(KDSomeItem); // same as "KDSomeItem"
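If you do need name-to-value lookups at run time, you can build that metadata yourself with the stringizing operator. A rough sketch, assuming plain int constants (the table and function names here are made up):
#include <string.h>

#define KDSomeItem 1
#define STR(X) #X

// Hand-built metadata: one entry per constant, pairing its name with its value.
typedef struct { const char *name; int value; } KDConstantEntry;

static const KDConstantEntry KDConstantTable[] = {
    { STR(KDSomeItem), KDSomeItem },
    // ... one entry for each of the other constants ...
};

// Returns 1 and fills *outValue if a constant with that name exists, 0 otherwise.
static int KDValueForName(const char *name, int *outValue)
{
    for (size_t i = 0; i < sizeof KDConstantTable / sizeof KDConstantTable[0]; i++) {
        if (strcmp(KDConstantTable[i].name, name) == 0) {
            *outValue = KDConstantTable[i].value;
            return 1;
        }
    }
    return 0;
}
An NSString such as the one in the question could be passed through its -UTF8String method before the lookup.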
Nope. Your constant is a preprocessor macro. It is textually substituted into your source code where you use it.
I'm new to Objective-C, and I have a few questions regarding const and the preprocessing directive #define.
First, I found that it's not possible to define the type of the constant using #define. Why is that?
Second, are there any advantages to using one of them over the other?
Finally, which way is more efficient and/or more secure?
First, I found that it's not possible to define the type of the constant using #define. Why is that?
Why is what? It's not true:
#define MY_INT_CONSTANT ((int) 12345)
Second, are there any advantages to using one of them over the other?
Yes. #define defines a macro which is replaced even before compilation starts. const merely modifies a variable so that the compiler will flag an error if you try to change it. There are contexts in which you can use a #define but you can't use a const (although I'm struggling to find one using the latest clang). In theory, a const takes up space in the executable and requires a reference to memory, but in practice this is insignificant and may be optimised away by the compiler.
consts are much more compiler and debugger friendly than #defines. In most cases, this is the overriding point you should consider when making a decision on which one to use.
Just thought of a context in which you can use #define but not const. If you have a constant that you want to use in lots of .c files, with a #define you just stick it in a header. With a const you have to put the definition in one C file and an extern declaration in a header:
// in a C file
const int MY_INT_CONST = 12345;
// in a header
extern const int MY_INT_CONST;
MY_INT_CONST can't be used as the size of a static or global-scope array in any C file except the one it is defined in.
However, for integer constants you can use an enum. In fact that is what Apple does almost invariably. This has all the advantages of both #defines and consts but only works for integer constants.
// In a header
enum
{
MY_INT_CONST = 12345,
};
Finally, which way is more efficient and/or more secure?
#define is more efficient in theory although, as I said, modern compilers probably ensure there is little difference in practice. #define is more secure in that it is always a compiler error to try to assign to it:
#define FOO 5
// ....
FOO = 6; // Always a syntax error
consts can be tricked into being assigned to although the compiler might issue warnings:
const int FOO = 5;
// ...
*(int *)&FOO = 6; // Can make this compile
Depending on the platform, the assignment might still fail at run time if the constant is placed in a read-only segment, and it's officially undefined behaviour according to the C standard.
Personally, for integer constants I always use enums; for constants of other types, I use const unless I have a very good reason not to.
From a C coder:
A const is simply a variable whose content cannot be changed.
#define name value, however, is a preprocessor command that replaces all instances of the name with value.
For instance, if you #define defTest 5, all instances of defTest in your code will be replaced with 5 when you compile.
It is important to understand the difference between the #define and const instructions, which are not meant for the same things.
const
const is used to create an object of the requested type that will be, once initialised, constant. It means that it is an object in the program's memory and can be used as read-only.
The object is generated every time the program is launched.
#define
#define is used to ease code readability and future modifications. When using a define, you only mask a value behind a name. For instance, when working with a rectangle you can define the width and height with their corresponding values. The code then becomes easier to read, since there are names instead of raw numbers.
If you later decide to change the value for the width, you only have to change it in the define instead of doing a tedious and dangerous find/replace in your whole file.
When compiling, the preprocessor replaces every defined name with its value in the code, so there is no run-time cost to using them.
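A tiny sketch of that rectangle idea (the names and values are made up):
#define RECT_WIDTH  320
#define RECT_HEIGHT 480

// Reads better than bare numbers, and changing a value later means editing one line.
float area = RECT_WIDTH * RECT_HEIGHT;   // the preprocessor substitutes 320 * 480 here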
In addition to other people's comments, errors involving #define are notoriously difficult to debug, as the pre-processor gets hold of them before the compiler does.
Since pre-processor directives are frowned upon, I suggest using a const. You can't specify a type with a pre-processor because a pre-processor directive is resolved before compilation. Well, you can, but something like:
#define DEFINE_INT(name,value) const int name = value;
and use it as
DEFINE_INT(x,42)
which would be seen by the compiler as
const int x = 42;
First, I found that it's not possible to define the type of the constant using #define. Why is that?
You can, see my first snippet.
Second, are there any advantages to using one of them over the other?
Generally having a const instead of a pre-processor directive helps with debugging, not as much in this case (but still does).
Finally, which way is more efficient and/or more secure?
Both are equally efficient. I'd say the macro can potentially be more secure, as it can't be changed at run time, whereas a variable could be.
I have used #define before to create several method-like variants out of one method. For example, say I have something like:
// This method takes up to 4 numbers; we don't care what the method does with these numbers.
- (void)doSomeCalculationWithMultipleNumbers:(NSNumber *)num1 Number2:(NSNumber *)num2 Number3:(NSNumber *)num3 Number4:(NSNumber *)num4;
But I also want to have a method that takes only 3 numbers, and one that takes 2, so instead of writing two new methods I am going to reuse the same one via #define, like so:
#define doCalculationWithFourNumbers(num1, num2, num3, num4) \
    [self doSomeCalculationWithMultipleNumbers:(num1) Number2:(num2) Number3:(num3) Number4:(num4)]
#define doCalculationWithThreeNumbers(num1, num2, num3) \
    [self doSomeCalculationWithMultipleNumbers:(num1) Number2:(num2) Number3:(num3) Number4:nil]
#define doCalculationWithTwoNumbers(num1, num2) \
    [self doSomeCalculationWithMultipleNumbers:(num1) Number2:(num2) Number3:nil Number4:nil]
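Call sites can then look like plain function calls. A sketch, assuming the macros above are in scope and that you are inside an instance method of the class that declares the original method:
// All three expand to the same four-argument message send.
doCalculationWithFourNumbers(@1, @2, @3, @4);
doCalculationWithThreeNumbers(@1, @2, @3);
doCalculationWithTwoNumbers(@1, @2);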
I think this is a pretty cool thing to have. I know you can go straight to the method and just pass nil for the arguments you don't want, but if you are building a library it is very useful. This is also how
NSLocalizedString(<#key#>, <#comment#>)
NSLocalizedStringFromTable(<#key#>, <#tbl#>, <#comment#>)
NSLocalizedStringFromTableInBundle(<#key#>, <#tbl#>, <#bundle#>, <#comment#>)
are done.
I don't believe you can do this with constants. But constants do have their benefits over #define: you can't specify a type with a #define, because it is a pre-processor directive that is resolved before compilation, and errors involving a #define are harder to debug than ones involving a constant. Both have their benefits and downsides, but I would say it depends on the programmer which one you decide to use. I have written a library that uses both: #define to do what I have shown above, and constants for declaring constant values that need a specific type.
Why do constants in all examples I've seen always start with k? And should I #define constants in header or .m file?
I'm new to Objective-C, and I don't know C. Is there some tutorial somewhere that explains these sorts of things without assuming knowledge of C?
Starting constants with a "k" is a legacy of the pre-Mac OS X days. In fact, I think the practice might even come from way back in the day, when the Mac OS was written mostly in Pascal, and the predominant development language was Pascal. In C, #define'd constants are typically written in ALL CAPS, rather than prefixing with a "k".
As for where to #define constants: #define them where you're going to use them. If you expect people who #import your code to use the constants, put them in the header file; if the constants are only going to be used internally, put them in the .m file.
Current recommendations from Apple for naming constants don't include the 'k' prefix, but many organizations adopted that convention and still use it, so you still see it quite a lot.
The question of what the "k" means is answered in this question.
And if you intend for files other than that particular .m to use these constants, you have to put the constants in the header, since they can't import the .m file.
You might be interested in Cocoa Dev Central's C tutorial for Cocoa programmers. It explains a lot of the core concepts.
The k prefix comes from a time where many developers loved to use Hungarian notation in their code. In Hungarian notation, every variable has a prefix that tells you what type it is. pSize would be a pointer named "size" whereas iSize would be an integer named "size". Just looking at the name, you know the type of a variable. This can be pretty helpful in absence of modern IDEs that can show you the type of any variable at any time, otherwise you'd always have to search the declaration to know it. Following the trend of the time, Apple wanted to have a common prefix for all constants.
Okay, why not c then, as in "constant"? Because c was already taken: in Hungarian notation, c is for "counter" (cApple means "count of apples"). There's a similar problem with class, which is a keyword in many languages, so how do you name a variable that points to a class? You will find tons of code naming this variable klass, and thus k was chosen, k as in "konstant". In many languages that word actually does start with a k.
Regarding your second question: you should not use #define for constants at all if you can avoid it, as #define is typeless.
const int x = 10; // Type is int
const short y = 20; // Type is short
const uint64_t z = 30; // Type is for sure UInt64
const double d = 5000; // Type is for sure double
const char * str = "Hello"; // Type is for sure char *
#define FOO 90
What type is FOO? It's some kind of number. But what kind of number? So far any type or no type at all. Type will depend on how and where you use FOO in your code.
Also if you have a fixed set of numbers, use an enum as then the compiler can verify you are using a valid value and enum values are always constant.
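For example, a small sketch (the type and its values are made up):
typedef enum
{
    ColorRed,
    ColorGreen,
    ColorBlue
} Color;

// Enum values are compile-time constants, and a switch over Color can warn about unhandled cases.
Color c = ColorGreen;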
If you have to use a define, it doesn't much matter where you define it. Header files are files you share among multiple code files, so if you need the same define in more than one place, you put it in a header file and include that header wherever the define is needed. What you write in a code file is only visible within that code file, except for non-static functions and Obj-C classes, which are both globally visible by default.

Still, unless a function is declared in a header file and that header is included into the code file where you want to use the function, the compiler does not know what the function looks like (what parameters it expects, what result it returns), so it cannot check any of that and has to trust that you are calling it correctly (usually it will warn about this). An Obj-C class cannot be used at all unless you at least tell the current code file that the name is the name of a class, and if you want to actually do something with that class (other than just passing it around), the compiler needs to know its interface. That is why interfaces go into header files (if a class is only used within the current code file, writing the interface and implementation into that file is legal and works, too).
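As a sketch of that header/code file split (all names here are made up):
// Thing.h -- what other code files need to know.
#import <Foundation/Foundation.h>

extern const double kThingScale;     // declared here, defined once in Thing.m
void ThingLog(NSString *message);    // prototype, so callers can be checked

@interface Thing : NSObject
- (void)doWork;
@end

// Thing.m -- the actual definitions.
#import "Thing.h"

const double kThingScale = 2.0;

void ThingLog(NSString *message)
{
    NSLog(@"%@", message);
}

@implementation Thing
- (void)doWork
{
    ThingLog(@"working");
}
@end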
k for "konvention". Seriously; it is just convention.
You can put a #define wherever you like; in a header, in the .m at the top, in the .m right next to where you use it. Just put it before any code that uses it.
The "intro to objective-c" documentation provided with the Xcode tool suite is actually quite good. Read it a few times (I like to re-read it once every 2 to 5 years).
However, neither it nor any of the C books that I'm aware of will answer these particular questions. The answers sort of become obvious through experience.
I believe it is because of the former prevalence of Hungarian Notation, so k was chosen because c stood for character. ( http://en.wikipedia.org/wiki/Hungarian_notation )