How to convert an "initWithBytes" NSString to unsigned char*?
Here's how the NSString is produced:
NSString *byte = [[[NSString alloc] initWithBytes:inputbytes length:inputlength encoding:NSASCIIStringEncoding] autorelease];
which contains byte data that must not be changed at the bit level (it will be the input to a Base64 conversion).
The data in byte is 0x01 0x00 0x01 (three bytes),
and I want to convert it into an unsigned char *.
How should I do it?
Thanks
Generally speaking, you should work with NSString directly rather than attempting to move in and out of Objective-C. But there are reasons to drop down to C (e.g. using a C-only API) so I'll humour you. ;)
If you just need to pass the data to another function, you can do this:
const unsigned char *cString = (const unsigned char *)[byte cStringUsingEncoding: NSASCIIStringEncoding];
If you need to modify the C string (i.e. make it non-const), you will need to make a copy:
unsigned char *cString = (unsigned char *)strdup([byte cStringUsingEncoding: NSASCIIStringEncoding]);
And then free() it when you're done with it:
free(cString);
Note that this will not modify the original C string or the NSString you made from it. If you need an NSString you can modify, use NSMutableString.
Try the cStringUsingEncoding: method of NSString:
https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSString_Class/Reference/NSString.html#//apple_ref/doc/uid/20000154-BAJHJHHD
Related
How can I convert an iOS (Objective-C) NSString value to a byte value?
Java uses .getBytes(), but I do not know how to do it on iOS.
uint8_t *dbytes = (uint8_t *)[Value bytes];
I expected it to be something like this… but it is not what I want.
You can call -[NSString UTF8String] to convert an NSString into a const char *.
Example: const char *bytes = [Value UTF8String];
Reference: Apple documentation
It is not clear in what encoding you expect your results to be.
Java's getBytes() returns the string's bytes using the default encoding, which is platform dependent; use getBytes(charsetName) to specify the charset explicitly. Both sides should agree on a specific encoding to avoid interoperability issues.
On the iOS side, you can use -[NSString dataUsingEncoding:],
e.g.
NSData *bytes = [@"Hello" dataUsingEncoding:NSUTF8StringEncoding];
First, convert with UTF8String and parse it with strtoul into an unsigned long, then mask with 0xff to get a Byte, as below:
NSString *str = @"xxx";
unsigned long red = strtoul([str UTF8String], 0, 16);
Byte bt = (Byte)(0xff & red);
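The strtoul-and-mask step is plain C, so it can be sketched without NSString (a minimal sketch; the input is assumed to hold a hex number):

```c
#include <stdlib.h>

// Parse a hex string and keep only its low byte, as the answer above does.
static unsigned char low_byte_of_hex(const char *hex) {
    unsigned long value = strtoul(hex, NULL, 16); // base 16, end pointer unused
    return (unsigned char)(0xff & value);         // mask down to a single byte
}
```

Note that the mask silently discards everything above the low byte; for "1a2b" the result is 0x2b.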
I know that this question is a possible duplicate, but even after looking at some tutorials on Google and at questions on this forum, none of them gives me a decent answer on this subject.
I have:
NSString *str = @"text";
And I would like to do something like:
char cstring [512] = str;
(this only illustrates what I want to do; even after looking at Apple's NSString Class Reference I couldn't figure out how to do it).
Up to now I have:
char command [512] = [[NSString stringWithFormat:@"text"] cStringUsingEncoding:NSUTF8StringEncoding];
Still, with that I get errors.
Any solution?
Try: const char *command = [str UTF8String];
A C string is returned as a pointer, not as an array of characters, so to use it you can change your variable to a pointer.
const char *command = [theString cStringUsingEncoding:NSUTF8StringEncoding];
Since you want the UTF-8 encoding, you can use the UTF8String convenience method.
const char *command = [theString UTF8String];
If you need the data to be stored in a character array, you can use the getCString:maxLength:encoding: method, passing the array as the buffer. This will allow you to store the string directly to the buffer, and will tell you if the buffer is too small.
char command[512];
if(![theString getCString:command maxLength:sizeof(command)/sizeof(*command) encoding:NSUTF8StringEncoding]) {
NSLog(@"Command buffer too small");
}
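The same store-into-a-fixed-buffer-and-report-overflow idea can be sketched in plain C using snprintf's return value, which plays the role of getCString:'s NO result (a minimal sketch):

```c
#include <stdio.h>

// Copy src into a fixed-size buffer; return 1 on success, 0 if it didn't fit.
static int copy_to_buffer(char *buf, size_t bufsize, const char *src) {
    int needed = snprintf(buf, bufsize, "%s", src); // always NUL-terminates
    return needed >= 0 && (size_t)needed < bufsize; // needed >= bufsize: truncated
}
```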
In Objective-C, how does the xmlChar data type work? I can't seem to find any API documentation. Specifically, my declaration looks like:
const xmlChar **attributes
Am I classifying this correctly by saying it's an Objective-C data type, or is it specific to Cocoa, or just C?
You can see the declaration of xmlChar in xmlstring.h in the libxml2 library:
typedef unsigned char xmlChar;
Now for xmlChar to NSString conversion
xmlChar *str = (unsigned char *)"I am to be converted";
NSString *convertedString = [NSString stringWithUTF8String:(const char *)str];
from NSString to xmlChar conversion;
NSString *str = @"I am to be converted";
xmlChar *convertedString = (xmlChar *)[str UTF8String];
I don't think that Objective-C has any sort of xmlChar data type. In Cocoa, you will generally use an NSXMLParser, which just uses NSString. Are you perhaps thinking of libxml2? In that case, it's simply defined as an unsigned char representing a UTF-8 octet (to help the compiler give warnings if you try to cast it to a char). You can generally treat an xmlChar * like a regular char *, as long as you keep in mind that it's UTF-8 encoded rather than in ASCII (so, truncating it may give invalid characters, comparing values to sort them won't work unless you implement locale-aware comparisons, etc).
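Since xmlChar is just an unsigned char holding UTF-8 octets, the byte-count-vs-character-count caveat can be shown in plain C (a minimal sketch; the typedef is repeated here rather than pulled in from libxml2):

```c
#include <stddef.h>

typedef unsigned char xmlChar; // same typedef libxml2 uses

// Count UTF-8 code points by skipping continuation bytes (10xxxxxx).
static size_t utf8_codepoints(const xmlChar *s) {
    size_t count = 0;
    for (; *s != 0; s++)
        if ((*s & 0xC0) != 0x80) // a byte that starts a new code point
            count++;
    return count;
}
```

For the UTF-8 string "h\xC3\xA9llo" ("héllo"), strlen reports 6 bytes while the function above counts 5 characters, which is why truncating an xmlChar buffer at an arbitrary byte can split a character.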
I want to cast my NSString to a constant char.
the code is shown below :
NSString *date = @"12/9/2009";
char datechar = [date UTF8String];
NSLog(@"%@", datechar);
However, it returns the warning
assignment makes integer from pointer without a cast
and cannot print the char properly. Can somebody tell me what the problem is?
Try something more like this:
NSString* date = @"12/9/2009";
const char* datechar = [date UTF8String];
NSLog(@"%s", datechar);
You need to use char* instead of char, and you have to print C strings using %s, not %@ (%@ is for Objective-C object types only).
I think you want to use:
const char *datechar = [date UTF8String];
(note the * in there)
Your code has 2 problems:
1) "char datechar ..." declares a single character, which holds only one char / byte and cannot hold the whole string your date object produces. The variable therefore needs a (*) in front of it, so it points to multiple characters rather than holding just one.
2) After the above fix, you would still get a warning about (char *) vs (const char *), so you need a cast. Change the line
char datechar = [date UTF8String];
into
char *datechar = (char *)[date UTF8String];
The (char *) after the = sign tells the compiler to treat the expression as a (char *) rather than its default (const char *).
I know you have already accepted an answer, but I thought I could contribute by explaining the issues and the fixes in more detail.
I hope this helps.
Kind Regards
Heider
I would add a * between char and datechar (and use %s instead of %@):
NSString *date = @"12/9/2009";
const char *datechar = [date UTF8String];
NSLog(@"%s", datechar);
I struggled for a long time to convert an NSString to a char to use with this function:
-(void)gpSendDTMF:(char) digit callID: (int)cid;
I have tried every answer of this question/many things from Google search but it did not work for me.
Finally I have got the solution.
Solution:
NSString *digit = @"5";
char dtmf;
char buf[2];
snprintf(buf, sizeof(buf), "%d", [digit intValue]);
dtmf = buf[0];
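For a single decimal digit, the sprintf detour can be avoided entirely: in C the characters '0'..'9' are contiguous, so adding the digit's value to '0' yields its character (a minimal sketch of the same conversion):

```c
// Convert a digit value 0..9 to its ASCII character; '\0' signals out of range.
static char digit_to_char(int digit) {
    if (digit < 0 || digit > 9)
        return '\0';
    return (char)('0' + digit); // '0'..'9' are contiguous in ASCII
}
```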
I'm trying to use a function that has the following signature to sign an HTTP request:
extern void hmac_sha1(const unsigned char *inText, int inTextLength, unsigned char* inKey, const unsigned int inKeyLength, unsigned char *outDigest);
And this is the method I wrote to use it:
- (NSString *)sign: (NSString *)stringToSign {
NSString *secretKey = @"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx";
const unsigned char *inText = (unsigned char *)[stringToSign UTF8String];
int inTextLength = [stringToSign length];
unsigned char *inKey = (unsigned char *)[secretKey UTF8String];
const unsigned int inKeyLength = (unsigned int)[secretKey length];
unsigned char *outDigest;
hmac_sha1(inText, inTextLength, inKey, inKeyLength, outDigest);
NSString *output = [NSString stringWithUTF8String:(const char *)outDigest];
return output;
}
The problem is I'm sure this is not the way I'm supposed to do this casting, as inside this hmac_sha1 function I get a EXC_BAD_ACCESS exception.
Since I am new to Objective-C and have close to no experience in C (surprise!) I don't really know what to search for. Any tips on how I can start solving this?
Thanks in advance!
BTW, I got the reference for this function here in stackoverflow.
It looks like the problem is not with the casting, but with outDigest. The fifth argument to hmac_sha1 should point to an already allocated buffer of size 20 bytes (I think).
If you change the line that says
unsigned char *outDigest;
to say
#define HMACSHA1_DIGEST_SIZE 20
unsigned char *outDigest = malloc(HMACSHA1_DIGEST_SIZE);
That should get you past the crash inside hmac_sha1.
Then you've got the problem of converting the data at outDigest into an NSString. It looks like hmac_sha1 will put 20 bytes of random-looking data at outDigest, and not a null terminated UTF-8 string, so stringWithUTF8String: won't work. You might want to use something like this instead if you have to return an NSString:
NSString *output = [[NSString alloc] initWithBytesNoCopy:outDigest
length:HMACSHA1_DIGEST_SIZE
encoding:NSASCIIStringEncoding
freeWhenDone:YES];
I don't think NSString is really the right type for the digest, so it might be worth changing your method to return an NSData if you can.
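If a printable string is still required, the conventional approach is to hex-encode the digest bytes instead of treating them as text; a minimal C sketch (the 20-byte size matches the answer's assumption about hmac_sha1):

```c
#include <stdio.h>
#include <stddef.h>

// Write len digest bytes as lowercase hex into out (needs 2*len + 1 bytes).
static void digest_to_hex(const unsigned char *digest, size_t len, char *out) {
    for (size_t i = 0; i < len; i++)
        sprintf(out + 2 * i, "%02x", digest[i]); // two hex chars per byte
    out[2 * len] = '\0';
}
```

The resulting string can then safely be wrapped with +[NSString stringWithUTF8String:], since hex output is plain ASCII.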
This wasn't part of your question, but it's a bug nonetheless: you shouldn't use -length to get the byte count of a UTF-8 string. That method returns the number of Unicode characters in the string, not the number of bytes. What you want is -lengthOfBytesUsingEncoding:.
NSUInteger byteCount = [stringToSign lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
Also be aware that the result does not account for a terminating NUL character.
Are you sure you don't need to allocate some memory for outDigest before calling hmac_sha1? Since you pass in a pointer, rather than a pointer to a pointer, there's no way that the memory can be allocated inside the routine.