Reason for getting random garbage value in Xcode - objective-c

#import <Foundation/Foundation.h>
int main(int argc, const char * argv[])
{
@autoreleasepool {
NSLog(@"HELLo %i");
}
return 0;
}
I tried to print an integer value in Objective-C in Xcode, but I forgot to pass the variable. When I executed it, I got some garbage value: 4144 for an integer, 98489866930523080936567411769317361312251531363217687183360.000000 for a float, and similar values for other data types.
I'm just really interested in knowing the reason behind this garbage output.

This is because the values to print are passed in registers or on the stack/frame to the function that will print the results.
The printf() function (or in your case NSLog), and functions like it, take 1 or more parameters. The maximum number of parameters is not specified by the function header.
This function first gets a pointer to the string to print, and parses it. Every time it encounters %i, %d, %f etc., it starts pulling the values to print off of the stack, one by one.
In your case, no value was passed at all, so the function pulls whatever happens to be in the next register or stack slot; those leftover bits happened to read as 4144 when interpreted as an integer.
There are some good answers here:
How do vararg functions find out the number of arguments in machine code?

Related

Is an int in Objective-C automatically initialized to 0? [duplicate]

This question already has answers here:
Does an int in Objective-C have a default value of 1?
(4 answers)
Closed 8 years ago.
I am reading a book called "Programming in Objective-C", Sixth Edition, by Stephen G. Kochan. It has the following statement on page 144 which is confusing me:
Local variables that are basic C data types have no default initial value, so you must set them to some value before using them.
Yet when I have the following code, it still works, and displays 0:
#import <Foundation/Foundation.h>
int main(int argc, const char * argv[])
{
int number;
NSLog(@"%i", number);
return 0;
}
Isn't int a basic C data type?
"Basic C data types have no default initial value" does not mean they will have no value if you do not initialize them; it means you will not know in advance what that value will be.
In your case, number just happened to hold zero, but it could have held any other value.
Local variables are allocated on the stack. The initial value of a local variable has no guaranteed value. Instead the value of the local variable depends entirely on whatever random values were left by the previous function that used that particular region of the stack.
In the case of the main function, the initial values of local variables may seem predictable since main is the first function to run and use that region of the stack. However, the compiler makes no effort, and the language specification has no requirement, to guarantee the initial value of the local variables.
In summary, always explicitly initialize local variables before using them.

Why cStringUsingEncoding: returns const char * instead of char *?

cStringUsingEncoding: returns a "const char *" even though, according to its documentation, it returns a dynamically allocated C string. So what is the purpose of const here? We could simply modify the returned C string by casting it to char *.
cStringUsingEncoding:
The returned C string is guaranteed to be valid only until either the
receiver is freed, or until the current autorelease pool is emptied,
whichever occurs first.
I think the library is following the common practice of pointer-to-const: the result is not expected to be modified or released.
From the Objective-C runtime:
const char * object_getClassName(id obj) -- Nothing is specified about the returned string.
char * method_copyArgumentType(Method method, unsigned int index) -- You must free the string with free(). (Perhaps the docs say this because it returns a copy.)
The common pattern is that you should not modify buffers that you don't own. const documents and (somewhat) enforces this.
As for cStringUsingEncoding:, The documentation is saying that the returned buffer is only valid as long as the NSString from which you received it, or for the duration of the current autorelease pool. This implies that you do not own the returned buffer, because you're not expected to release it.
Your last two examples from the runtime follow the same convention:
const char * object_getClassName(id obj)
Doesn't inform you that you should release the buffer, and the name doesn't contain any indication that you own the buffer. Therefore you don't free() it, and you don't modify it.
char * method_copyArgumentType(Method method, unsigned int index)
The docs explicitly tell you that you should free the buffer, and the function name contains the tell-tale copy which also implies that you own the buffer. Therefore you can modify it all you want, and must free() it.
The thing is, the result is const because:
modifying it would not change the string itself; the C string is really just a different representation of the string, and
it will probably return the same C string "over and over again", as long as the string doesn't change.
Other than that, declaring a result const even when the implementation doesn't enforce or require it is something an interface designer can do, perhaps because they want the result to be treated that way. It also leaves the path open to optimizations for which the const is useful.

How to make objective C string manipulations generate the name of a particular constant?

I've created a Constants.h file with a list of:
#define kw00 @"foo"
#define kw01 @"bar"
...
I also #import "Constants.h" in my .h file. In my newQuote method, I'm trying to randomly select one of the kw strings, but I'm having difficulty figuring out how to reference the constant whose name is built up in kwString.
-(IBAction)newQuote
{
int rNumber = arc4random() % kwTotal;
if (rNumber < 9)
{
NSString *kwString = [@"kw0" stringByAppendingString:[NSString stringWithFormat:@"%d", rNumber]];
}
}
Thoughts and suggestions would be most appreciated.
It simply isn't possible to access things this way. Those "constants" don't even exist at runtime, or when the compiler sees your code — they're translated by the preprocessor into literal strings.
You should instead create an array, and then you can just get the element at a given index.
(In general, any time you're naming things with sequential numbers on the end, the answer to any problems you might have is "Use an array.")

Objective-C pointer values

I'm compiling an application using Xcode 3.2.6 (64-bit). The application is compiled against the 10.5 SDK for the 32-bit Intel architecture.
I've declared a character array as:
char iptmp[ARRAY_SIZE];
so I'm calling a function thus:
myfunc(&iptmp);
Where myfunc is declared:
void myfunc(char** value)
{
...
};
The intention is to load the character array with the contents of another string using strncpy. When you see what's below you might appreciate why I don't simply do something like strcpy(iptmp, myfunc()); but here is the problem:
Value of iptmp prior to function call: 0xb0206f5a
Value of *value in function: 0xffffb020
I've tried various things to resolve this problem, but the only thing that seems to stick is to receive a UINT32 value and cast:
myfunc((UINT32) &iptmp);
void myfunc(UINT32 value)
{
char* target = (char*) value;
...
}
This is causing havoc in my code. What is going on with the pointer value?
What happens here is that iptmp is a location in memory. If you write iptmp, you get the address of the array; you also get the same address if you write &iptmp. However, you assumed you would get a pointer to a pointer to the array.
The best way to handle this is simply doing:
void myfunc(char * value)
{
...
};
The pointer value will point to the array, which you can modify any way you like.
When you derefence *value, you're saying "take the pointer stored in value, and load the bytes at that location as if they were a char *". But the bytes at the location pointed to by value aren't a char * - they're the first bytes of iptmp[] itself (in your case, the first 4 bytes).
The root cause is that you're passing &iptmp, which has type char (*)[ARRAY_SIZE], to a function that expects a char ** parameter. These types are not interchangeable, as you've found. The correct declaration for the function would be:
void myfunc(char (*value)[ARRAY_SIZE])
{
/* ... */
}
You can then pass &iptmp, and you will find that *value has the value that you expect.
Why not just
void myfunc(char *value)
{
strncpy(value, ...);
}
and
myfunc(iptmp);
Remember, arrays and pointers in C are not the same things, although you may have heard the opposite many times. An array is an object whose size is equal to its length multiplied by the size of each of its elements, while a pointer is a single object that holds an address, with its own semantics.
Hence, the two expressions iptmp and &iptmp yield the same result, namely the starting address of the array. iptmp yields a pointer value for convenience, but that doesn't mean that the object iptmp is a pointer itself.
By attempting to get the address of the address of the array, you really intend to perform &(&iptmp), which is a meaningless, erroneous operation.

Quick Multiplication Question - Cocoa

I'm still learning, and I'm just stuck. I want the user to enter any number and in result, my program will do this equation:
x = 5*y
(y is the number the user adds, x is outcome)
How would I do this? I'm not sure if I'm supposed to use an int or an NSString. Which should I use, and should I enter anything in the header files?
I'm not sure if I'm supposed to use an int or NSString.
Well, one of these is a numeric type and the other is a text type. How do you multiply text? (Aside from repeating it.)
You need a numeric type.
I would caution against int, since it can only hold integers. The user wouldn't be able to enter “0.5” and get 2.5; when you converted the “0.5” to an int, the fractional part would get lopped off, leaving only the integral part, which is 0. Then you'd multiply 5 by 0, and the result you return to the user would be 0.
Use double. That's a floating-point type; as such, it can hold fractional values.
… should I enter anything in the header files?
Yes, but what you enter depends on whether you want to use Bindings or not (assuming that you really are talking about Cocoa and not Cocoa Touch).
Without Bindings, declare an outlet to the text field you're going to retrieve the multiplier from, and another to the text field you're going to put the product into. Send the input text field a doubleValue message to get the multiplier, and send the output text field a setDoubleValue: message with the product.
With Bindings, declare two instance variables holding double values—again, one for the multiplier and one for the product—along with properties exposing the instance variables, then synthesize the properties, and, finally, bind the text fields' value bindings to those properties.
If you're retrieving the NSString from a UI, then it's pretty simple to do:
NSString *answer = [NSString stringWithFormat:@"%ld", (long)[userInputString integerValue] * 5];
This can be done without any Objective-C. That is, since Objective-C is a superset of C, the problem can be solved in pure C.
#include <stdio.h>
int main(void)
{
int i;
fscanf(stdin, "%d", &i);
printf("%d\n", i * 5);
}
In the above, fscanf takes care of converting the character(s) read from standard input into a number and storing it in i.
However, if you had characters from some other source in a char * and needed to convert them to an int, you could create an NSString * with initWithCString:encoding: and then use its intValue method, but in this particular problem that simply isn't needed.