Good evening,
I wrote the code below in C to read a fixed-length string of bytes from a binary file and print it as hexadecimal, given a known byte marker and the string's location relative to that marker.
#include <stdio.h>
#include <stdlib.h>
#include <string.h> /* memmem is declared here on macOS/BSD; glibc needs _GNU_SOURCE */
FILE *f = NULL;
unsigned char *buffer = NULL;
unsigned long fileLen;
unsigned char bytes[] = {0x22, 0x22, 0x22, 0x22};
f = fopen(argv[1], "rb");
if (!f)
{
fprintf(stderr, "Unable to open %s\n", argv[1]);
return -1;
}
fseek(f, 0, SEEK_END);
fileLen = ftell(f);
fseek(f, 0, SEEK_SET);
buffer = malloc(fileLen);
if (!buffer)
{
fprintf(stderr, "Memory error!\n");
fclose(f);
return -1;
}
if (fread(buffer, 1, fileLen, f) != fileLen)
{
fprintf(stderr, "Read error!\n");
free(buffer);
fclose(f);
return -1;
}
fclose(f);
unsigned char *p = memmem(buffer, fileLen, bytes, 4);
if (!p) {
free(buffer);
return -1;
}
unsigned long off_to_string = 4 + 0x12 + (unsigned long)(p - buffer);
for (unsigned long c = off_to_string; c < off_to_string+0x30; c++)
{
printf("%.2X", (int)buffer[c]);
}
printf("\n");
free(buffer);
I would like to use this code in a Cocoa app, so I tried something like:
NSMutableString *tehString = [[NSMutableString alloc] init];
for (unsigned long c = off_to_string; c < off_to_string+0x30; c++)
{
[tehString appendString:...]
}
The problem is that appendString: takes an NSString, and I don't even know how to build one holding the hexadecimal representation of a byte.
Thanks !
PS: Feel free to improve the code of the "hexadecimal string reader" :)
NSStrings can be created using the same format specifiers, so you could do this:
[tehString appendString:[NSString stringWithFormat:@"%.2X", (int)buffer[c]]];
Caveat: I've just typed that in so it might be a little off, but you get the idea.
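For what it's worth, NSMutableString also has an appendFormat: method, which skips the intermediate NSString entirely (standard API, though equally untested here):
[tehString appendFormat:@"%.2X", (int)buffer[c]];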
Related
In my program, I receive an NSString like this one: AA010158AA7D385002, and I need to pass it to a method which accepts a char byte array, as below:
char bytes[9] = {0xAA, 0x01, 0x01, 0x58, 0xAA, 0x7D, 0x38, 0x50, 0x02};
How do I convert my NSString to a char byte array like this one?
Thanks!
NSString *strCharData = @"AA010158AA7D385002";
const char *characterRes = [strCharData cStringUsingEncoding:NSUTF8StringEncoding];
or
NSString *strCharData = @"AA010158AA7D385002";
const char *characterRes = [strCharData UTF8String];
Use this answer if I am correct; I did a little coding, but there may also be simpler solutions, like @user3182143's.
NSString *inputStr = @"AA010158AA7D385002";
NSMutableArray *charByteArray = [[NSMutableArray alloc]initWithCapacity:1];
int i = 0;
for (i = 0; i+2 <= inputStr.length; i+=2) {
NSRange range = NSMakeRange(i, 2);
NSString* charStr = [inputStr substringWithRange:range];
[charByteArray addObject:[NSString stringWithFormat:@"0x%@", charStr]];
}
Output:
char[9] = (
0xAA,
0x01,
0x01,
0x58,
0xAA,
0x7D,
0x38,
0x50,
0x02
)
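Note that this produces an NSArray of NSString objects such as @"0xAA", not actual bytes. If the receiving method really needs a char array, each element still has to be parsed, for example with strtol, which accepts the 0x prefix in base 16 (a sketch, not part of the original answer):
char bytes[9];
for (NSUInteger i = 0; i < [charByteArray count] && i < sizeof(bytes); i++) {
    bytes[i] = (char)strtol([[charByteArray objectAtIndex:i] UTF8String], NULL, 16);
}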
Since your text is hex and you want actual bytes out (which each correspond to two hex characters), you'll have to manually convert each character into the corresponding number, and then add them together into the correct numerical value.
To do this, I'm taking advantage of the fact that, in ASCII characters, a...z are in a row, as are 0...9. I'm also taking advantage of the fact that hexadecimal is valid ASCII, and that Unicode characters from 0...127 are identical to their corresponding ASCII characters.
Below is a program that does this and prints out the original string's characters as well as the calculated bytes (as hex again):
#import <Foundation/Foundation.h>
int main(int argc, char *argv[])
{
@autoreleasepool
{
NSString *hexStr = @"1234567890abcdef12";
unsigned char theBytes[9] = {};
for( NSUInteger x = 0; x < sizeof(theBytes); x++ )
{
unsigned char digitOne = [hexStr characterAtIndex: x * 2];
if( digitOne >= 'a' )
digitOne -= 'a' - 10;
else
digitOne -= '0';
unsigned char digitTwo = [hexStr characterAtIndex: (x * 2) + 1];
if( digitTwo >= 'a' )
digitTwo -= 'a' - 10;
else
digitTwo -= '0';
printf("%01x%01x",digitOne,digitTwo);
theBytes[x] = (digitOne << 4) | digitTwo;
}
printf("\n");
for( int x = 0; x < sizeof(theBytes); x++ )
printf("%02x",theBytes[x]);
printf("\n");
}
}
Note: This code naïvely assumes that you are providing a correct string. I.e. your input has to consist of lowercase characters and numbers only, and exactly 18 characters. Anything else and you get a wrong result or an exception.
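If you also need to accept uppercase hex digits, the per-digit decoding can be pulled into a small helper along the same lines (my own sketch, not from the answer above):
// Decodes one hex character; returns 0xFF for anything that is not a hex digit.
static unsigned char hexDigit(unichar c) {
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return 0xFF;
}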
I finally managed to find the answer to my own question. I am posting it here in case it helps someone else.
So I use a category method that converts a hex NSString to bytes:
@implementation NSString (HexToBytes)
- (NSData *)hexToBytes
{
NSMutableData *data = [NSMutableData data];
int idx;
for (idx = 0; idx + 2 <= self.length; idx += 2) {
NSRange range = NSMakeRange(idx, 2);
NSString *hexStr = [self substringWithRange:range];
NSScanner *scanner = [NSScanner scannerWithString:hexStr];
unsigned int intValue;
[scanner scanHexInt:&intValue];
[data appendBytes:&intValue length:1];
}
return data;
}
@end
And then I simply use it like this:
NSString *str = @"AA010158AA7D385002";
NSData *databytes = [str hexToBytes];
char *bytePtr = (char *)[databytes bytes];
And I finally get my char array. Hope it helps someone else.
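As a quick sanity check, a round trip through the category should give back 9 bytes for the example string (hypothetical snippet):
NSData *check = [@"AA010158AA7D385002" hexToBytes];
NSLog(@"%lu bytes", (unsigned long)[check length]); // prints "9 bytes"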
NSString *strVal = @"BAAA";
How do I convert the above string into its bit representation? Should I do a byte conversion first?
I need help on this. I have previously checked this question for integer conversion.
Whether you convert the string to UTF-8 first is up to you; it depends what you want. Internally NSString stores characters as UTF-16 (might be UCS-2 actually) using the unichar type, so you need to decide whether you want the binary of the internal representation or of some other external encoding.
I expect you want the binary of the UTF-8 encoding, so try this (tested):
#import <Foundation/Foundation.h>
@interface BinFuncs : NSObject
+ (NSString *)binaryOfString:(NSString *)str;
@end
@implementation BinFuncs
+ (NSString *)binaryOfString:(NSString *)str {
NSMutableString *binStr = [[NSMutableString alloc] init];
const char *cstr = [str UTF8String];
size_t len = strlen(cstr);
for (size_t i = 0; i < len; i++) {
uint8_t c = cstr[i];
for (int j = 0; j < 8; j++) {
[binStr appendString:((c & 0x80) ? @"1" : @"0")];
c <<= 1;
}
}
return binStr;
}
@end
int main(int argc, const char **argv) {
@autoreleasepool {
for (int i = 1; i < argc; i++) {
NSString *binStr = [BinFuncs binaryOfString:[NSString stringWithUTF8String:argv[i]]];
NSLog(@"%@", binStr);
}
}
return 0;
}
$ clang -o binstr binstr.m -framework Foundation
$ ./binstr 'the quick brown fox jumps over the lazy dog'
2013-09-30 09:19:49.674 binstr[58474:707] 01110100011010000110010100100000011100010111010101101001011000110110101100100000011000100111001001101111011101110110111000100000011001100110111101111000001000000110101001110101011011010111000001110011001000000110111101110110011001010111001000100000011101000110100001100101001000000110110001100001011110100111100100100000011001000110111101100111
I'm trying to develop a simple application where I can encrypt a message. The algorithm is the Caesar cipher: for example, 'Hello World' becomes 'KHOOR ZRUOG' if the shift is 3 (the standard).
My problem is how to take each single character and increment it...
I've tried this:
NSString *text = @"hello";
int q, increment = 3;
NSString *string;
for (q = 0; q < [text length]; q++) {
string = [text substringWithRange:NSMakeRange(q, 1)];
const char *c = [string UTF8String] + increment;
NSLog(#"%#", [NSString stringWithUTF8String:c]);
}
It's very simple, but it doesn't work. My theory was: take each single character, convert it to a C string, increment it, then go back to NSString and print it. But Xcode prints nothing; even if I print the char c I can't see the result in the console. Where is the problem?
First of all, incrementing byte by byte only works for ASCII strings. If you use UTF-8, you will get garbage for glyphs that have multi-byte representations.
With that in mind, this should work (and work faster than characterAtIndex: and similar methods):
NSString *foo = @"FOOBAR";
int increment = 3;
NSUInteger bufferSize = [foo length] + 1;
char *buffer = (char *)calloc(bufferSize, sizeof(char));
if ([foo getCString:buffer maxLength:bufferSize encoding:NSASCIIStringEncoding]) {
int bufferLen = (int)strlen(buffer);
for (int i = 0; i < bufferLen; i++) {
buffer[i] += increment;
if (buffer[i] > 'Z') {
buffer[i] -= 26;
}
}
NSString *encoded = [NSString stringWithCString:buffer
encoding:NSASCIIStringEncoding];
}
free(buffer);
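If the input can also contain lowercase letters (the snippet above assumes uppercase only), one option is to wrap within each alphabet range separately; a minimal sketch, with rot as a hypothetical helper name:
// Rotates a single ASCII letter by `shift` places, leaving other bytes alone.
static char rot(char ch, int shift) {
    if (ch >= 'A' && ch <= 'Z') return 'A' + (ch - 'A' + shift) % 26;
    if (ch >= 'a' && ch <= 'z') return 'a' + (ch - 'a' + shift) % 26;
    return ch;
}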
First of all, replace your code with this:
for (q = 0; q < [text length]; q++) {
string = [text substringWithRange:NSMakeRange(q, 1)];
const char *c = [string UTF8String];
NSLog(@"Addr: %p", c);
c = c + increment;
NSLog(@"Addr: %p", c);
NSLog(@"%@", [NSString stringWithUTF8String:c]);
}
Now you can figure out your problem: const char *c is a pointer, and a pointer stores a memory address.
When I run this code the first log output is this:
Addr: 0x711DD10
That means the char 'h' from the NSString named string, with the value @"h", is saved at address 0x711DD10 in memory.
Now we increment this address by 3. Next output is this:
Addr: 0x711DD13
In my case at this address there is a '0x00'. But it doesn't matter what is actually there because a 'k' won't be there (unless you are very lucky).
If you are lucky, there is a 0x00 there too, because then nothing bad will happen. If you are unlucky, there is something other than 0x00 (the string delimiter, the "end of string" marker), and NSString will try to convert it. It might crash while trying, or it might open a huge security hole.
So instead of manipulating pointers, you have to manipulate the values they point to.
You can do this like this:
for (q = 0; q < [text length]; q++) {
string = [text substringWithRange:NSMakeRange(q, 1)];
const char *c = [string UTF8String]; // get the pointer
char character = *c; // get the character from this pointer address
character = character + 3; // add 3 to the letter
char cString[2] = {0, 0}; // create a cstring with length of 1. The second char is \0, the delimiter (the "end marker") of the string
cString[0] = character; // assign our changed character to the first character of the cstring
NSLog(@"%@", [NSString stringWithUTF8String:cString]); // profit...
}
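An alternative that avoids C strings entirely is to work with unichar values directly (an untested sketch, ASCII letters assumed, no wraparound handling):
NSMutableString *result = [NSMutableString string];
for (NSUInteger q = 0; q < [text length]; q++) {
    unichar ch = [text characterAtIndex:q] + increment; // shift the character
    [result appendString:[NSString stringWithCharacters:&ch length:1]];
}
NSLog(@"%@", result); // "khoor" for @"hello"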
I've been doing a lot of reading on how to convert a string to a hex value. Here is what I found to accomplish this:
NSString *hexString = [NSString stringWithFormat:@"%x", midiValue];
This returned some "interesting" results and upon reading a little further I found a post that mentioned this
"You're passing a pointer to an object representing a numeric value, not the numeric value proper."
So I substituted 192 instead of "midiValue" and it did what I expected.
How would I pass the string value and not the pointer?
Declaration of midiValue:
NSString *dMidiInfo = [object valueForKey:@"midiInformation"];
int midiValue = dMidiInfo;
You probably need to do something like this:
NSNumberFormatter *numberFormatter = [[NSNumberFormatter alloc] init];
int anInt = [[numberFormatter numberFromString:string] intValue];
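Putting that together with the declaration from the question (an untested sketch; the object/key names come from the question):
NSString *dMidiInfo = [object valueForKey:@"midiInformation"];
NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
int midiValue = [[formatter numberFromString:dMidiInfo] intValue];
NSString *hexString = [NSString stringWithFormat:@"%x", midiValue]; // e.g. "c0" for 192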
Also, I think there is some example code in the Xcode documentation for converting to and from a hex value, in the QTMetadataEditor sample, in the MyValueFormatter class.
+ (NSString *)hexStringFromData:(NSData*) dataValue{
UInt32 byteLength = [dataValue length], byteCounter = 0;
UInt32 stringLength = (byteLength*2) + 1, stringCounter = 0;
unsigned char dstBuffer[stringLength];
unsigned char srcBuffer[byteLength];
unsigned char *srcPtr = srcBuffer;
[dataValue getBytes:srcBuffer];
const unsigned char t[16] = "0123456789ABCDEF";
for (; byteCounter < byteLength; byteCounter++){
unsigned src = *srcPtr;
dstBuffer[stringCounter++] = t[src>>4];
dstBuffer[stringCounter++] = t[src & 15];
srcPtr++;
}
dstBuffer[stringCounter] = '\0';
return [NSString stringWithUTF8String:(char*)dstBuffer];
}
+ (NSData *)dataFromHexString:(NSString*) dataValue{
UInt32 stringLength = [dataValue length];
UInt32 byteLength = stringLength/2;
UInt32 byteCounter = 0;
unsigned char srcBuffer[stringLength + 1]; // +1 for the terminating NUL
[dataValue getCString:(char *)srcBuffer maxLength:stringLength + 1 encoding:NSASCIIStringEncoding];
unsigned char *srcPtr = srcBuffer;
Byte dstBuffer[byteLength];
Byte *dst = dstBuffer;
for(; byteCounter < byteLength;){
unsigned char c = *srcPtr++;
unsigned char d = *srcPtr++;
unsigned hi = 0, lo = 0;
hi = charTo4Bits(c); // charTo4Bits (defined elsewhere in the sample) maps a hex character to 0-15, or 255 on error
lo = charTo4Bits(d);
if (hi== 255 || lo == 255){
//errorCase
return nil;
}
dstBuffer[byteCounter++] = ((hi << 4) | lo);
}
return [NSData dataWithBytes:dst length:byteLength];
}
Hopefully this helps.
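Assuming those two methods live on the MyValueFormatter class mentioned above, usage might look like this (hypothetical example):
NSData *data = [MyValueFormatter dataFromHexString:@"c0ffee"];
NSString *hex = [MyValueFormatter hexStringFromData:data]; // @"C0FFEE" (the lookup table above is uppercase)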
If you are messing with a simple iPhone app in Xcode using the iPhone 5.1 Simulator, then this is useful:
//========================================================
// This action is executed whenever the hex button is
// tapped by the user.
//========================================================
- (IBAction)hexPressed:(UIButton *)sender
{
// Change the current base to hex.
self.currentBase = @"hex";
// Acquire the string from the text field and store it in an NSString object
NSString *temp = self.labelDisplay.text;
// Convert the NSString to an int and then, using the NSString method stringWithFormat:
// (which is similar to C's printf formatting), output it as hexadecimal and return
// the NSString with the hexadecimal value back to the text field for display
self.labelDisplay.text = [NSString stringWithFormat:@"%x", [temp intValue]];
}
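Going the other direction (reading hex text back into an integer), NSScanner works; a minimal sketch, again assuming the label holds the value:
unsigned int value = 0;
NSScanner *scanner = [NSScanner scannerWithString:self.labelDisplay.text];
[scanner scanHexInt:&value];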
I have a long hex value stored as an NSString, something like:
c12761787e93534f6c443be73be31312cbe343816c062a278f3818cb8363c701
How do I convert it back into a binary value stored as a char*?
This is a little sloppy, but should get you on the right track:
NSString *hexString = @"c12761787e93534f6c443be73be31312cbe343816c062a278f3818cb8363c701";
char binChars[128]; // I'm being sloppy and just making some room
const char *hexChars = [hexString UTF8String];
NSUInteger hexLen = strlen(hexChars);
const char *nextHex = hexChars;
char *nextChar = binChars;
unsigned int byte;
for (NSUInteger i = 0; i < hexLen / 2; i++)
{
sscanf(nextHex, "%2x", &byte); // %2x expects an unsigned int, not a char
*nextChar = (char)byte;
nextHex += 2;
nextChar++;
}
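A variant that avoids sscanf altogether, using strtol on two-character chunks (same assumptions about even-length, valid input):
for (NSUInteger i = 0; i + 1 < hexLen; i += 2) {
    char pair[3] = { hexChars[i], hexChars[i + 1], 0 };
    binChars[i / 2] = (char)strtol(pair, NULL, 16);
}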
There was a thread on this (or on a very similar) hexadecimal conversion topic a couple of weeks back over on one of the Cocoa mailing lists.
I can't reasonably reproduce the full discussion here (long thread), but the message that starts off the thread is here:
http://www.cocoabuilder.com/archive/message/cocoa/2009/5/9/236391
I do wish there were a Cocoa method for this task, but (pending finding that or pending its implementation) the code (by Mr Gecko, posted at http://www.cocoabuilder.com/archive/message/cocoa/2009/5/10/236424) looks like it would work here.
static NSString* hexval(NSData *data) {
NSMutableString *hex = [NSMutableString string];
unsigned char *bytes = (unsigned char *)[data bytes];
char temp[3];
NSUInteger i = 0;
for (i = 0; i < [data length]; i++) {
temp[0] = temp[1] = temp[2] = 0;
(void)sprintf(temp, "%02x", bytes[i]);
[hex appendString:[NSString stringWithUTF8String: temp]];
}
return hex;
}
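For example (hypothetical input):
const unsigned char raw[] = {0xC1, 0x27, 0x61};
NSString *hex = hexval([NSData dataWithBytes:raw length:sizeof(raw)]); // @"c12761"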