I have an integer, with a value of 2. I append that to an NSMutableData object with:
[data appendBytes:&intVal length:2];
The number 2 is the number of bytes I want from the int. When I log the data, what I want to see is <0002> (a zero byte followed by the value byte), but what I get is <0200>.
Am I missing something? The order and length of the bytes needs to be very specific. This is for a direct socket connection API. I'm not really sure what I'm doing wrong here. Maybe I'm just reading it wrong.
Thanks for the help.
Am I missing something?
Yes, the endianness of your system doesn't match what you need. Convert the value to big-endian (network byte order) with htons(), declared in the <arpa/inet.h> header (historically also <netinet/in.h>).
NSData's description method prints its bytes in hexadecimal, so it needs 4 hex digits to represent 2 bytes: each byte can take 2^8 = 256 different values, each hexadecimal digit can represent 16 values, and 16 x 16 x 16 x 16 = 2^16, which is exactly the range of 2 bytes.
Here is the answer; it works great!
uint16_t intVal = 2;
Byte *byteData = (Byte *)malloc(sizeof(uint16_t));
byteData[0] = (intVal & 0xff00) >> 8; // high byte first (big-endian)
byteData[1] = intVal & 0xff;          // low byte last
NSData *result = [NSData dataWithBytes:byteData length:sizeof(uint16_t)];
free(byteData);
NSLog(@"result=%@", result);
Related
I'm trying to read in the first four bytes of a file. I know that this works correctly with the following C code:
FILE *file = fopen(URL.path.UTF8String, "rb");
uint data;
fread(&data, 4, 1, file);
NSLog(@"%u", data);
This prints out: 205
I'm trying to find the equivalent way of doing this in Objective-C/with Cocoa functions. I've tried a number of things. I feel like the following is close:
NSFileHandle *fileHandle = [NSFileHandle fileHandleForReadingFromURL:URL error:nil];
NSData *data2 = [fileHandle readDataOfLength:4];
NSLog(@"%@", data2);
NSLog(@"%u", (uint)data2.bytes);
This prints out: < cd000000 >
and: 1703552
As expected, the first four bytes of the file are indeed CD000000.
I'm assuming there's one of two things causing the difference (or both):
fread is not counting the 0s following the CD. I've confirmed this by only reading in 1 byte with the fileHandle, but sometimes this number will extend greater than one byte, so I can't restrict it like this. Do I need to manually check that the bytes coming in aren't 00?
This has something to do with endianness. I have tried a number of functions such as CFSwapInt32BigToHost but have not been able to get back the right value. It would be great if anyone could enlighten me as to how endianness works and affects this.
You are not dereferencing the data.
NSLog(@"%u", (uint)data2.bytes); // wrong
The "quick hack" version is like this:
NSLog(@"%u", *(uint *) data2.bytes); // hack
A more robust solution is to copy the bytes into a variable first, which gets the alignment right; misaligned access happens to work on some platforms (like x86) but not on all:
uint value;
[data2 getBytes:&value length:sizeof(value)];
NSLog(@"%u", value);
Another solution is to explicitly read the data byte-by-byte, which is most portable, has no alignment issues on any platform, and has no byte-order issues on any platform:
unsigned char *p = data.bytes;
uint value = (unsigned) p[0] | ((unsigned) p[1] << 8) |
((unsigned) p[2] << 16) | ((unsigned) p[3] << 24);
NSLog(@"%u", value);
As you can see, there are good reasons why we avoid putting binary data in files ourselves, and leave it to libraries or use text formats.
This can't be an issue with byte order, because fread() is working correctly. The fread() function and the -readDataOfLength: method will both give you the same result: a chunk of bytes.
You attempt to reinterpret a sequence of 4 bytes as an unsigned int. This is not guaranteed to work on all platforms: it works only if sizeof(unsigned int) is 4, and only if the byte order is the same when reading and writing.
Furthermore, you are not printing the scalars correctly with NSLog.
fread() in binary mode won't do anything to your data, you'll get the bytes as they are in the file.
It's absolutely byte ordering that is causing this, but I don't know anything about Apple's Objective-C APIs. I don't understand why you don't need pointer access to the data2 object (why doesn't data2.bytes fail, and why isn't data2->bytes needed?).
Also, the documentation for NSData doesn't say anything about byte order that I could find.
Using NSMethodSignature I can get a method's argument types via getArgumentTypeAtIndex:, which returns a C string based on this documentation. So, for example, "i" for int and "I" for unsigned int.
Is there a function somewhere that takes in this encoding and returns the types size in bytes?
Something like this:
int paramSize = typeEncodingSize("i");
NSLog(@"%s is %d bytes", "i", paramSize);
//this would be the encoding for a struct that has three fields: an id, a pointer and an int.
paramSize = typeEncodingSize("{example=@*i}"); //two 8 byte pointers & one 4 byte int
NSLog(@"%s is %d bytes", "{example=@*i}", paramSize);
which would output:
i is 4 bytes
{example=@*i} is 20 bytes
I figure there must be an api function for this somewhere since the docs for [NSInvocation setArgument:atIndex:] say
The number of bytes copied is determined by the argument size.
I understand that this is old, but I've hit the same wall.
The solution seems to be the function NSGetSizeAndAlignment.
https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Miscellaneous/Foundation_Functions/#//apple_ref/c/func/NSGetSizeAndAlignment
Have you tried sizeof()? That's the usual way to determine the size of a struct or other type.
Caleb's sizeof() is correct, because basically you can pass something to @encode only if it's also accepted by sizeof(). There's nothing like @decode. You can download the class-dump source code and look at the CDTypeParser.m file for an example of how to parse it.
I've been trying to send packets to a Minecraft server from my custom Cocoa application (written in Objective-C, of course). I am a little confused as to how to do that, though. I did it in Java; that was very easy. Doing this in Objective-C, though, is proving to be a bit more challenging.
This is the code that I am using:
- (void)handshake
{
PacketHandshake *packet = [PacketHandshake packetString:[NSString stringWithFormat:@"%@;%@:%i", username, IP, PORT]];
[packet writeData:dataOut];
}
Which calls:
- (void)writeData:(NSOutputStream *)dataOut
{
[super writeData:dataOut]; //Writes the "header" which is a char with the value of 0x02 (char packetID = 0x02)
NSUInteger len = [string lengthOfBytesUsingEncoding:NSUTF16BigEndianStringEncoding]; //Getting the length of the string i guess?
NSData *data = [string dataUsingEncoding:NSUTF16BigEndianStringEncoding]; //Getting string bytes?
[dataOut write:(uint8_t*)len maxLength:2]; //Send the length?
[dataOut write:[data bytes] maxLength:[data length]]; //Send the actual string?
}
I have established a successful connection to the server beforehand, but I don't really know whether or not I am sending the packets correctly. Could somebody please explain how I should send various data types and objects. (int, byte/char, short, double, NSString, BOOL/bool)
Also, is there any specific or universal way to send packets like the ones required by Minecraft?
Ok, I guess the question is now: how do data types, mainly strings, relate in Java and Objective-C?
Any help is appreciated, thank you!
Nobody knows?
Maybe you're running into a network/host byte order problem? I know very little about Minecraft, but I note that it's mentioned here that shorts in the Minecraft protocol use network byte order, which is big-endian (all the other data types are 1 byte long, so endianness is not relevant for them).
All x86 machines use little-endian.
I don't know whether your PacketHandshake class is converting the data before sending it; if not, you could use the C library functions ntohs() and htons(), for which you'd need to include sys/types.h and netinet/in.h.
The link also mentions that strings are a 64-byte array of standard ASCII chars, padded with 0x20s. You can get the ASCII bytes out of an NSString by calling [string UTF8String], which returns a const char* (i.e. your standard C string, ending with 0x0), and then pad it yourself. But if it just works in Java, then maybe you don't need to.
I need to put a short and an integer at the beginning of a message that I am sending to a Java server. The server expects to read a short (message id) and then an integer (message length). I've read on Stack Overflow that NSMutableData is similar to Java's ByteBuffer.
I am trying to pack the message into NSMutableData then send it.
So this is what I have, but it is not working!
NSMutableData *data = [NSMutableData dataWithLength:(sizeof(short) + sizeof(int))];
short msg_id = 2;
int length = 198;
[data appendBytes:&msg_id length:sizeof(short)];
[data appendBytes:&length length:sizeof(int)];
send(sock, data, 6, 0);
The server is using Java ByteBuffer to read in the received data. So the bytes coming in is:
32,120,31,0,2,0
which is invalid.
The correct values, so that the ByteBuffer can read them with .getShort() and .getInt(), would be:
0,2,0,0,0,-66
You're basically putting stuff into the NSData object correctly, but you're not using it with the send function correctly. First off, as dreamlax suggests, use NSMutableData's capacity-based initializer (dataWithCapacity:) to reserve room, rather than creating zeroed bytes.
Your data pointer is a pointer to an Objective-C (NSData) object, not the actual raw byte buffer. The send function is a classic UNIX-y C function and doesn't know anything about Objective-C objects; it expects a pointer to the actual bytes:
send(sock, [data bytes], [data length], 0);
Also, FWIW, note that endianness matters here if you're expecting to recover the multibyte fields on the server. Consider using htons() and htonl() on the short and int values before putting them in the NSData buffer, assuming the server expects "network" byte order for its packet format (though maybe you control that).
I think your use of dataWithLength: gives you an NSMutableData object with 6 bytes all initialised to 0, and then you append 6 more bytes with actual values, so you end up with 12 bytes in total. (I'm assuming here that short is 2 bytes and int is 4.) I believe you want dataWithCapacity:, which only hints how much memory to reserve for the data you are packing.
As quixoto has pointed out, you need to use the bytes method, which returns a pointer to the first byte of the actual data. The length method will return the number of bytes you have.
Another thing you need to watch out for is endianness. The position of the most significant byte is dependent on the underlying architecture.
I have an int value which needs to be converted into a byte array.
How do you go about doing this in Objective-C? Are there methods to do this?
Thank you,
Converted in what way? Do you want it in little endian, big endian, or native byte order? If you want it in native byte order, then all you need to do is:
int val = //... initialize integer somehow
char* bytes = (char*) &val;
int len = sizeof(int);
That said, the best way to manipulate the bytes of an integer is to do bitwise operations. For example, to get the lowest order byte, you can use val&0xFF, to get the next you use (val>>8)&0xFF, then (val>>16)&0xFF, then (val>>24)&0xFF, etc.
Of course, it depends on the size of your data type. For these kinds of things you really should include <stdint.h> (or <inttypes.h>) and use uint8_t, uint16_t, uint32_t, or uint64_t, depending on how large an integer you want; otherwise you cannot reliably play around with larger numbers of bytes.
I suspect that what you want to do is to pass this byte array somewhere, possibly to an NSMutableData object. You just pass the address &val.
Example:
[myData appendBytes:&myInteger length:sizeof(myInteger)];
This link is more complete and deals with endianness:
Append NSInteger to NSMutableData