Is it possible to use something like this in Objective-C:
#define number_of_items 10
and then using it as:
int arr[number_of_items];
Yes, assuming you mean Objective-C. It's pretty much a superset of "proper" C, so this is perfectly okay. It's also okay in both C and C++.
You can see that it works in the following transcript:
pax> cat qq.m
#import <objc/Object.h>
// First method.
#define number_of_items 10
int arr[number_of_items];
// Second method.
#define NUMBER_OF_ROWS 10
@interface test : Object { int xyzzy[NUMBER_OF_ROWS]; }
@end
pax> vi qq.m ; gcc -o qq.o -c qq.m -lobjc
pax> # no errors occurred
And, now that we've finally seen what you're actually using:
#define IS_IPAD (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
#define NUMBER_OF_ROWS_ (IS_IPAD? 18: 18)
NUMBER_OF_ROWS_ is not a constant, since it depends on the return value of the function UI_USER_INTERFACE_IDIOM().
In other words, it cannot be calculated at compile time. That's why you're getting the error. You can see this by compiling the following code:
#define IS_IPAD (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
#define NUMBER_OF_ROWS_ (IS_IPAD ? 18: 20)
int UI_USER_INTERFACE_IDIOM(void) {return 20;}
int UIUserInterfaceIdiomPad;
int main (void) {
    int arr[NUMBER_OF_ROWS_];
    return 0;
}
Under gcc --pedantic, you get:
qq.m: In function ‘main’:
qq.m:8: warning: ISO C90 forbids variable length array ‘arr’
You either need to use a dynamically adjustable collection like NSMutableArray, or declare an array of the maximum size you will ever need and only use the part of it you require.
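For illustration, here is a minimal sketch of the second approach. It keeps the question's IS_IPAD macro; the other macro names, the function name and the value 20 are made up for the example. The point is only that the array size itself is a genuine compile-time constant:
#define NUMBER_OF_ROWS_IPAD   18
#define NUMBER_OF_ROWS_IPHONE 20
#define MAX_NUMBER_OF_ROWS    20   /* the larger of the two: a real compile-time constant */

static void fillRows(void)
{
    int arr[MAX_NUMBER_OF_ROWS];                    /* fixed size, so no VLA error */
    int rowsInUse = IS_IPAD ? NUMBER_OF_ROWS_IPAD   /* decided at run time */
                            : NUMBER_OF_ROWS_IPHONE;
    for (int i = 0; i < rowsInUse; i++) {
        arr[i] = 0;                                 /* only touch the rows you need */
    }
}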
This code isn't compiling, and I'm getting errors that I don't understand:
Expected ‘)’
type specifier missing, defaults to ‘int’
Expected parameter declaration
conflicting types for ‘NSLog’
These errors only appear on line 4.
Please help, thanks.
#import <Foundation/Foundation.h>
#define SYS
#ifdef SYS
NSLog (#"SYS is Define ");
#endif
#define minimum(x,y) (x < y ? x:y)
#define Lower_case(x) ((x>'a') && (x<'z'))
#define ToUper_case(x) ((x-'a')+'A')
#define Uper_case(x) (Lower_case(x) ? (x-'a')+'A':x)
@interface NewDef : NSObject
@end
You have a code statement:
NSLog (#"SYS is Define ");
which is not inside any method/function. This is not allowed in (Objective-)C(++).
You can probably achieve what you wish using:
#pragma message "SYS is Define "
This is a compile-time instruction to the compiler, just as #define is, telling it to (somehow) present a message. In Xcode, if this line is reached, it will be marked with a warning icon.
If you comment out the #define SYS, the mark will go away, as the line is no longer reached.
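Putting it together, a minimal sketch of the top of the question's file might look like this (only the NSLog line changes; everything else is as posted):
#import <Foundation/Foundation.h>

#define SYS

#ifdef SYS
// compile-time diagnostic instead of a run-time NSLog statement
#pragma message "SYS is Define "
#endif

@interface NewDef : NSObject
@end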
HTH
Now, I know that this is a simple question for MacOS, but when I compile some code with 'arc4random % n' in it, I just get an error log in Terminal saying:
main.m:9: error: ‘arc4random’ undeclared (first use in this function)
main.m:9: error: (Each undeclared identifier is reported only once
main.m:9: error: for each function it appears in.)
and I use:
gcc `gnustep-config --objc-flags` -lgnustep-base main.m -o main
to compile it.
And here's my code (if it helps):
#import <Foundation/Foundation.h>
int main (int argc, const char * argv[])
{
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
    int number, guess;
    number = arc4random() % 101;
    while (!guess == number) {
        NSLog (@"Please guess a number between 1 and 100");
        scanf ("%i", &guess);
        if (guess < number) {
            NSLog (@"Sorry, guessed too low!");
        }
        else if (guess > number) {
            NSLog (@"Sorry, guessed too high!");
        }
    }
    NSLog (@"You guessed correct!");
    [pool drain];
    return 0;
}
You may consider using clang instead of gcc
Use
clang -fno-objc-arc main.m -framework Foundation -o main
Also I'd use arc4random_uniform(101) instead of arc4random() % 101, since the former is bias-free.
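As a small sketch, the relevant line from the question would then become something like this (arc4random_uniform is declared in <stdlib.h> on macOS):
// arc4random_uniform(N) returns a uniformly distributed value in [0, N-1],
// which avoids the modulo bias of arc4random() % N
number = arc4random_uniform(101);   // yields 0 ... 100; add 1 if you want 1 ... 100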
A few things:
Your use of >> and <<: these are not valid comparison operators. They will compile, but won't do what you expect. You need to use > (greater than), >= (greater than or equal to), < (less than) or <= (less than or equal to).
Your compile error is due to your use of arc4random. This is a function, but you've not used it as such. You need to change your line to
number = arc4random() % 101;
Not 100% sure on this, but %i in your scanf looks like it should be %d
So I have an issue with using a constant variable in the following switch statement in Objective-C.
I have Constants.h with the following:
// Constants.h
extern NSInteger const TXT_NAME;
And Constants.m as:
// Constants.m
#import "Constants.h"
NSInteger const TXT_NAME = 1;
Then in TabBasic.m I am trying to use this constant in a switch-case statement:
// TabBasic.m
#import "TabBasic.h"
#import "Constants.h"
... code ...
- (IBAction)saveValue:(id)sender {
    if ([sender isKindOfClass: [UITextField class]]) {
        UITextField *txtField = (UITextField *) sender;
        switch (txtField.tag) {
            case TXT_NAME:
                NSLog(@"Set property name to: %@", txtField.text);
                break;
        }
    }
}
But unfortunately it is giving me the following two errors on the "case TXT_NAME:" line:
Expression is not an integer constant expression
Case label does not reduce to an integer constant
Does anyone know what I'm doing wrong? The "tag" variable of a UITextField returns an NSInteger, so I don't see the issue...
Thanks for your help!
Quick solution: place NSInteger const TXT_NAME = 1; in Constants.h, and then you don't need anything in Constants.m.
Reason: if you set the value of the constant in the .m file, it is not visible to other translation units that only include the .h file. The value of the constant must be known at compile time for it to be used in a case label within a switch.
Update:
The above works when compiling in Objective-C++. You need to have your files end in .mm instead of .m for them to be compiled in Objective-C++ instead of Objective-C.
In order to work in Objective-C, you should define your constant either like this:
#define TXT_NAME 1
Or even better, like this:
enum {TXT_NAME = 1};
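As a minimal sketch (the file layout is taken from the question), Constants.h would then contain only the enum, and Constants.m would need no definition at all:
// Constants.h
enum {
    TXT_NAME = 1   // an enumerator is an integer constant expression,
                   // so "case TXT_NAME:" compiles in plain Objective-C
};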
I would normally follow what Apple seem to do and define a typedef enum in the .h file like this.
typedef NS_ENUM(NSInteger, PSOption) {
    PSOption1,
    PSOption2,
    PSOption3,
    PSOption4,
};
You can then use it in your case statements, and you can even pass it into methods as a type, e.g.
- (void)myMethod:(PSOption)option;
A further advantage of doing this over a #define is code completion and compiler checking.
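For illustration, a rough sketch of how the question's switch could then look; PSOption and its values are the placeholder names from the enum above, and the cast simply makes the intent explicit:
switch ((PSOption)txtField.tag) {
    case PSOption1:
        NSLog(@"Set property name to: %@", txtField.text);
        break;
    case PSOption2:
    case PSOption3:
    case PSOption4:
        // handle the remaining options as needed
        break;
}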
Update: I'm working with the GNU runtime on Linux. The problem does not occur on MacOS with the Apple runtime.
Update 2: I compiled the GNU runtime on MacOS and built the example with it. The error does not occur on MacOS with the GNU runtime. I would say the problem is in glibc (since backtrace and backtrace_symbols are glibc extensions).
When printing a backtrace in a GCC-compiled Objective-C app using backtrace and backtrace_symbols, I don't get any Objective-C symbols. Only the filenames, addresses and C symbols appear.
I compiled with -g and linked with -rdynamic.
My test app:
#include <execinfo.h>   /* backtrace, backtrace_symbols */
#include <stdio.h>      /* printf */
#include <stdlib.h>     /* free */

void _printTrace()
{
    void *addr[1024];
    int aCount = backtrace(addr, 1024);
    char **frameStrings = backtrace_symbols(addr, aCount);
    for (int i = 0; i < aCount; i++) {
        printf("%s\n", frameStrings[i]);
    }
    free(frameStrings);
}
@interface TheObject
+ (void)_printTrace;
+ (void)printTrace;
@end
@implementation TheObject
+ (void)_printTrace
{
    _printTrace();
}
+ (void)printTrace
{
    [self _printTrace];
}
@end
void printTrace()
{
    [TheObject printTrace];
}
int main(int argc, char **argv)
{
    printTrace();
    return 0;
}
and its output:
./test.bin(_printTrace+0x1f) [0x8048e05]
./test.bin() [0x8048e60]
./test.bin() [0x8048e8b]
./test.bin(printTrace+0x34) [0x8048ec5]
./test.bin(main+0xf) [0x8048eda]
/lib/libc.so.6(__libc_start_main+0xe5) [0xb7643bb5]
./test.bin() [0x8048b51]
Is there a way to let the Objective-C symbols appear in this backtrace?
dladdr() only reports global and weak symbols. But all Objective-C function symbols are local:
$ readelf -s so_backtrace
Symbol table '.dynsym' contains 29 entries:
…
Symbol table '.symtab' contains 121 entries:
Num: Value Size Type Bind Vis Ndx Name
…
49: 08048a01 13 FUNC LOCAL DEFAULT 14 _c_TheObject___printTrace
50: 08048a0e 47 FUNC LOCAL DEFAULT 14 _c_TheObject__printTrace
…
You can verify that local symbols are never returned by looking at the GNU libc source code yourself. backtrace_symbols() is defined in sysdeps/generic/elf/backtracesyms.c. It relies on _dl_addr(), which is defined in elf/dl-addr.c, to provide it with the symbol names. That ultimately calls determine_info(). If it can, it uses the GNU hash table, which does not include local symbols by design:
49 /* We look at all symbol table entries referenced by the hash
50 table. */
…
60 /* The hash table never references local symbols so
61 we can omit that test here. */
If the GNU hash table isn't present, it falls back to the standard hash table. This includes all the symbols, but the determine_info() code filters out all but the global and weak symbols:
90 if ((ELFW(ST_BIND) (symtab->st_info) == STB_GLOBAL
91 || ELFW(ST_BIND) (symtab->st_info) == STB_WEAK)
To symbolicate the Objective-C function addresses, you would have to perform the look-up yourself and not filter out the local function symbols. Further, you would have to demangle the Objective-C function symbols to restore _c_TheObject___printTrace to +[TheObject _printTrace].
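As an illustration of that last step only, here is a rough, simplified sketch of demangling such a symbol. It assumes the GNU runtime's _c_/_i_ prefix and a double underscore between class and selector, and it does not handle class names containing underscores or selectors with arguments (where ':' is mangled to '_'):
#include <stdio.h>
#include <string.h>

/* Simplified sketch: turn "_c_TheObject___printTrace" into "+[TheObject _printTrace]". */
static void demangle_objc(const char *sym, char *out, size_t outlen)
{
    char kind;
    if (strncmp(sym, "_c_", 3) == 0)      kind = '+';   /* class method    */
    else if (strncmp(sym, "_i_", 3) == 0) kind = '-';   /* instance method */
    else { snprintf(out, outlen, "%s", sym); return; }  /* not an ObjC symbol */

    const char *rest = sym + 3;
    const char *sep  = strstr(rest, "__");              /* class/selector split */
    if (sep == NULL) { snprintf(out, outlen, "%s", sym); return; }

    snprintf(out, outlen, "%c[%.*s %s]",
             kind, (int)(sep - rest), rest, sep + 2);
}

int main(void)
{
    char buf[256];
    demangle_objc("_c_TheObject___printTrace", buf, sizeof buf);
    printf("%s\n", buf);   /* prints: +[TheObject _printTrace] */
    return 0;
}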
GNUstep's NSException implementation doesn't use backtrace; instead, it uses libbfd (binary file descriptor). I think the function that actually does the work is the static function find_address, which you can view here. Using the trivial example below, I get the results that follow.
#include <Foundation/Foundation.h>
@interface Test : NSObject {}
+ (void) test;
@end
@implementation Test
+ (void) test
{
    Class GSStackTrace = objc_getClass("GSStackTrace");
    id stack = [GSStackTrace currentStack];
    for (int i = 0; i < [stack frameCount]; i++)
    {
        NSLog (@"%@", [[stack frameAt:i] function]);
    }
}
@end
int main(int argc, char **argv)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    [Test test];
    [pool release];
    return 0;
}
Output (when compiled with debug symbols):
2010-10-18 14:14:46.188 a.out[29091] +[GSStackTrace currentStack]
2010-10-18 14:14:46.190 a.out[29091] +[Test test]
2010-10-18 14:14:46.190 a.out[29091] main
2010-10-18 14:14:46.190 a.out[29091] __libc_start_main
You may be able to pick apart GSStackTrace. It is a “private” class (that's why I need to use objc_getClass; you'll also get lots of unrecognised selector warnings), but it seems to contain all the code necessary to read Objective-C class names.
Tested on Ubuntu 9.04 with GNUstep configured with --enable-debug (so that GSFunctionInfo is included in the build).
I expect you'll need to ask the ObjC run time about the addresses to get symbol information. The addresses returned from backtrace() could probably be passed to something like object_getClass() to get the class, for example. I haven't tried any of this but it's where I'd look next in this case.
(Question updated after first comment)
int max_size = 20;
int h[max_size];
Debugging gives a value of [-1] for h when I use max_size as the array size.
If instead I use an integer literal, so the code is int h[20], it works fine.
This was with GCC 4.2 on Mac OS X 10.6.
I just compiled and ran the following program incorporating your code:
#import <Foundation/Foundation.h>
int main() {
    int max_size = 20;
    int h[max_size];
    h[0] = 5;
    NSLog(@"It is %d", h[0]);
    return 0;
}
It worked fine. The problem is something besides simply declaring an array.
This was with GCC 4.0.1 on Mac OS X 10.4.
If I recall correctly, some compilers need to know the size of stack-allocated arrays explicitly at compile time. This (possibly) being the case, you could make your max_size variable const or a #define macro (or use an integer literal, as you've done). Alternatively, you could dynamically allocate the array, and then the size can be any ordinary variable,
ex:
int *array = calloc(max_size, sizeof(int));
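For completeness, a small self-contained sketch of that alternative, with the allocation checked and freed (the names follow the question):
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int max_size = 20;                       /* an ordinary run-time variable */
    int *h = calloc(max_size, sizeof(int));  /* heap allocation, zero-filled */
    if (h == NULL) {
        return 1;                            /* allocation failed */
    }
    h[0] = 5;
    printf("It is %d\n", h[0]);
    free(h);                                 /* release the buffer when done */
    return 0;
}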