A noob question here.
I am trying to build an automatic search-and-replace process for characters' ASCII values in a string.
So, I have a string constructed from the contents of a UITextField:
NSString *searchText = mmText.text;
Then I do a little loop and check all the entered characters for their ASCII values. If they're not in the allowed range, I want to search for them and replace them with something else (? for now).
So let's say I am in the loop and I get to an ASCII 45 character (it's a minus sign):
int asciiCode = 45;
Now I would like to find the ASCII 45 character in the string and replace it with a question mark.
This is what I am doing at the moment:
NSString *ascStr = [NSString stringWithFormat:@"%c", asciiCode];
NSRange matchSpace = [searchText rangeOfString:ascStr];
if (matchSpace.location == NSNotFound)
{
    // character not present - nothing to do
}
else
{
    NSMutableString *searchandReplace = [NSMutableString stringWithString:searchText];
    [searchandReplace replaceCharactersInRange:[searchandReplace rangeOfString:ascStr] withString:@"?"];
    mmText.text = searchandReplace;
}
This works fine for regular ASCII values (0-255), but it doesn't seem to work for characters coming from foreign languages. For example, in the Korean language mode one of the main characters looks like a double-crossed W (the won sign), but when printed via NSLog it looks like a copyright sign. This is probably the reason the search-and-replace procedure doesn't work for it. It has a character value of 8361.
Any ideas? Thank you!
It turns out it was as simple as changing
NSString *ascStr = [NSString stringWithFormat:@"%c", asciiCode];
to
NSString *ascStr = [NSString stringWithFormat:@"%C", asciiCode];
%c
8-bit unsigned character (unsigned char), printed by NSLog() as an ASCII character, or, if not an ASCII character, in the octal format \ddd or the Unicode hexadecimal format \udddd, where d is a digit
%C
16-bit Unicode character (unichar), printed by NSLog() as an ASCII character, or, if not an ASCII character, in the octal format \ddd or the Unicode hexadecimal format \udddd, where d is a digit
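For completeness, here is a minimal sketch of the whole filtering loop built on %C and characterAtIndex:. Assumptions for illustration only: the allowed range is printable ASCII (32-126), mmText is the text field from the question, and characters outside the Basic Multilingual Plane would need extra surrogate-pair handling.
NSString *source = mmText.text;
NSMutableString *filtered = [NSMutableString stringWithCapacity:[source length]];
for (NSUInteger i = 0; i < [source length]; i++) {
    unichar ch = [source characterAtIndex:i];   // 16-bit Unicode value, not just 0-255
    if (ch >= 32 && ch <= 126) {
        [filtered appendFormat:@"%C", ch];      // keep characters in the allowed range
    } else {
        [filtered appendString:@"?"];           // replace everything else with ?
    }
}
mmText.text = filtered;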
Related
I just updated to the iOS 7 SDK, and I would like to trim/replace the whitespace between the characters of a string, where the numbers are taken out of ABAddressBook.
I have tried using the replace " " with "" code below, but this code doesn't seem to work in the iOS 7 SDK; it works fine in the iOS 6 SDK, by the way.
NSString *TrimmedNumberField = [self.numberField.text
    stringByReplacingOccurrencesOfString:@" " withString:@""];
Is there any other way I could do it in iOS 7?
EDIT:
It's a phone number that I'm trying this with.
Input: "+65 12 345 6789"
The output I got from NSLog is " 12 345 6789"
I realized that when I added it into an NSDictionary and viewed it in NSLog, it contains the Unicode representation \u00a0, which looks like a "dot in the middle" but is not the same as a full stop.
Thanks in advance.
Found the answer from here
phoneNumber = [phoneNumber stringByReplacingOccurrencesOfString:@"\u00a0" withString:@""];
// @"\u00a0" is the no-break space - the character you get by typing Option+Spacebar
The number is extracted from ABAddressBook.
You can loop over the string and remove whitespace as long as there is any:
NSString *someString = @"A string with multiple spaces and other whitespace.";
NSMutableString *mutableCopy = [someString mutableCopy];
// get the first occurrence of whitespace
NSRange range = [mutableCopy rangeOfCharacterFromSet:[NSCharacterSet whitespaceCharacterSet]];
// If there is a match for the whitespace ...
while (range.location != NSNotFound) {
    // ... delete it
    [mutableCopy deleteCharactersInRange:range];
    // and get the next whitespace
    range = [mutableCopy rangeOfCharacterFromSet:[NSCharacterSet whitespaceCharacterSet]];
}
// no more whitespace. You can get back to an immutable string
someString = [mutableCopy copy];
The result with the string above is Astringwithmultiplespacesandotherwhitespace.
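The same result can be had without an explicit loop, by splitting on the whitespace set and re-joining the pieces. A hedged sketch, using the same sample string; note that whitespaceCharacterSet also covers the no-break space (U+00A0) from the question:
NSString *someString = @"A string with multiple spaces and other whitespace.";
NSArray *pieces = [someString componentsSeparatedByCharactersInSet:
                       [NSCharacterSet whitespaceCharacterSet]];
NSString *joined = [pieces componentsJoinedByString:@""];
NSLog(@"%@", joined);   // Astringwithmultiplespacesandotherwhitespace.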
Try This:
NSString *str = @" untrimmed string ";
NSString *trimmed = [str stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]];
Try This
[yourString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]];
The Apple documentation for NSCharacterSet's bitmapRepresentation says
Returns an NSData object encoding the receiver in binary format.
- (NSData *)bitmapRepresentation
Return Value
An NSData object encoding the receiver in binary format.
Discussion
This format is suitable for saving to a file or otherwise transmitting or archiving.
A raw bitmap representation of a character set is a byte array of 2^16 bits (that is, 8192 bytes). The value of the bit at position n represents the presence in the character set of the character with decimal Unicode value n.
So Try This
NSString *testString = @" Eek! There are leading and trailing spaces ";
NSString *trimmedString = [testString stringByTrimmingCharactersInSet:
                              [NSCharacterSet whitespaceAndNewlineCharacterSet]];
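One caveat for the original question: stringByTrimmingCharactersInSet: only removes characters at the very start and end of the string, so spaces between the digits survive. A quick illustration with the number from the question:
NSString *number = @"+65 12 345 6789";
NSString *onlyEndsTrimmed = [number stringByTrimmingCharactersInSet:
                                 [NSCharacterSet whitespaceAndNewlineCharacterSet]];
NSLog(@"%@", onlyEndsTrimmed);   // still "+65 12 345 6789" - the interior spaces remain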
When presented with an @"a", I'd like to be able to get its ASCII value of 97.
I thought this would do it:
NSString *c = [[NSString alloc] initWithString:@"a"];
NSLog(@"%d", [c intValue]); // Prints 0, expected 97
But ... you guessed it (or knew it :)) ... it does not.
How can I get the ASCII value of an NSString* pointing to a single character?
NSString *str = @"a";
unichar chr = [str characterAtIndex:0];
NSLog(@"ascii value %d", chr);
And the reason your method does not work is that you are operating on a STRING, remember? Not a single character. It's still an NSString.
NSLog(@"%d", [c characterAtIndex:0]);
NSString class reference: The integer value of the receiver’s text, assuming a decimal representation and skipping whitespace at the beginning of the string. Returns INT_MAX or INT_MIN on overflow. Returns 0 if the receiver doesn’t begin with a valid decimal text representation of a number.
So it returned 0 because you called intValue on a string that doesn't begin with a valid decimal text representation of a number.
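To make the difference concrete, a small sketch:
NSLog(@"%d", [@"97" intValue]);           // 97 - parsed as a decimal number
NSLog(@"%d", [@"a" intValue]);            // 0  - "a" is not a decimal number
NSLog(@"%d", [@"a" characterAtIndex:0]);  // 97 - the character's code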
How do I convert a char to an NSString in Objective-C?
Not a null-terminated C string, just a simple char c = 'a'.
You can use stringWithFormat:, passing in a format of %c to represent a character, like this:
char c = 'a';
NSString *s = [NSString stringWithFormat:@"%c", c];
You can make a C-string out of one character like this:
char cs[2] = {c, 0}; //c is the character to convert
NSString *s = [[NSString alloc] initWithCString:cs encoding:SomeEncoding];
Alternatively, if the character is known to be an ASCII character (i.e. a Latin letter, number, or punctuation sign), here's another way:
unichar uc = (unichar)c; //Just extend to 16 bits
NSString *s = [NSString stringWithCharacters:&uc length:1];
The latter snippet will surely fail (not crash, but produce a wrong string) with national characters. For those, a simple extension to 16 bits is not a correct conversion to Unicode; that's why the encoding parameter is needed.
Also note that the two snippets above produce strings with different deallocation requirements: the latter makes an autoreleased string, while the former makes a string that needs a [release] call.
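A quick usage check with a plain ASCII character, where all three routes produce the same one-character string. NSASCIIStringEncoding is assumed for the C-string route, which only holds for 7-bit input:
char c = 'a';
NSString *viaFormat = [NSString stringWithFormat:@"%c", c];
char cs[2] = {c, 0};
NSString *viaCString = [[NSString alloc] initWithCString:cs encoding:NSASCIIStringEncoding];
// (under manual reference counting, viaCString would still need a release)
unichar uc = (unichar)c;
NSString *viaUnichar = [NSString stringWithCharacters:&uc length:1];
NSLog(@"%@ %@ %@", viaFormat, viaCString, viaUnichar);   // a a a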
I have an NSString object and want to convert it into a unichar.
int decimal = [[temp substringFromIndex:2] intValue]; // decimal = 12298
NSString *hex = [NSString stringWithFormat:@"0x%x", decimal]; // hex = 0x300a
NSString *chineseChar = [NSString stringWithFormat:@"%C", hex];
// This statement logs a different Chinese char every time I run this code
NSLog(@"%@", chineseChar);
When I look at the log, it gives a different character every time I run my code.
Am I missing something?
The %C format specifier takes a 16-bit Unicode character (unichar) as input, not an NSString. You're passing in an NSString, which is getting reinterpreted as an integer character; since the string can be stored at a different address in memory each time you run, you get that address as an integer, which is why you get a different Chinese character every time you run your code.
Just pass in the character as an integer:
unichar decimal = 12298;
NSString *charStr = [NSString stringWithFormat:@"%C", decimal];
// charStr is now a string containing the single character U+300A,
// LEFT DOUBLE ANGLE BRACKET
How about -[NSString characterAtIndex:]? It wants a character index and returns a unichar.
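A small round trip of that suggestion, going from the decimal code to a string and back (12298 is 0x300A, the character from the question):
unichar code = 12298;                                   // 0x300A
NSString *charStr = [NSString stringWithFormat:@"%C", code];
unichar back = [charStr characterAtIndex:0];
NSLog(@"U+%04X %@", back, charStr);                     // U+300A 《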
How would I, in Objective-C, make it so only strings with a-z characters were allowed? i.e. no & characters, no - characters, etc.
Thanks!
Christian Stewart
NSCharacterSets are going to be the key here. First you'll need a character set of the lowercase letters a-z:
NSCharacterSet* letters = [NSCharacterSet characterSetWithRange:NSMakeRange('a', 26)];
And then, if you want to check if the string contains a character that's not a letter, you can use this set's inverse:
NSCharacterSet* notLetters = [letters invertedSet];
Then use NSString's rangeOfCharacterFromSet: with notLetters; if the returned range's location is not NSNotFound, there are forbidden characters in your string.
NSRange badCharacterRange = [myString rangeOfCharacterFromSet:notLetters];
if (badCharacterRange.location != NSNotFound) // found bad characters
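To make the check reusable, a hedged helper along those lines (the function name is just for illustration; note that an empty string passes the check as written):
static BOOL containsOnlyLowercaseLetters(NSString *string)
{
    NSCharacterSet *letters = [NSCharacterSet characterSetWithRange:NSMakeRange('a', 26)];
    NSCharacterSet *notLetters = [letters invertedSet];
    return [string rangeOfCharacterFromSet:notLetters].location == NSNotFound;
}
// containsOnlyLowercaseLetters(@"hello")  -> YES
// containsOnlyLowercaseLetters(@"he-llo") -> NO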