How do I convert an int to a char and also back from char to int?
e.g. 12345 == abcde
Right now I'm using a whole bunch of case statements; I wonder if there is a smarter way of doing it?
Thanks,
Tee
I would recommend using the ASCII values and just typecasting.
In most cases it is best to just use the ASCII values to encode letters; however, if you wanted to use 1 2 3 4 to represent 'a' 'b' 'c' 'd', then you could use the following.
For example, if you wanted to convert the number 1 to 'a' you could do:
char letter = (char) (1 + 96);
since in ASCII 97 corresponds to the character 'a'. Likewise, you can convert the character 'a' to the integer 1 as follows:
int num = (int) 'a' - 96;
Of course it is just easier to use ASCII values to start with and avoid adding or subtracting as shown above. :-D
If you just want to map 'a' -> 1, 'b' -> 2, ..., 'i' -> 9, you can simply do the following:
int convert(const char* s)
{
    if (!s) return -1; // error: null pointer
    int result = 0;
    while (*s)
    {
        int digit = *s - 'a' + 1; // 'a' -> 1, ..., 'i' -> 9
        if (digit < 1 || digit > 9)
            return -1; // error: character outside 'a'..'i'
        result = 10 * result + digit;
        s++; // move to the next character
    }
    return result;
}
However, you still need to decide what to do about 0 (which letter do you want to map to 0?) and about overflow (my code doesn't check for it).
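For the opposite direction (the 12345 -> "abcde" half of the question), here is a minimal sketch under the same 'a' -> 1 ... 'i' -> 9 mapping; the name unconvert and the buffer handling are just illustrative:
int unconvert(int n, char *buf, int bufsize)
{
    if (!buf || n <= 0) return -1;    // error: bad input
    char tmp[16];                     // enough for any positive int
    int len = 0;
    while (n > 0)
    {
        int digit = n % 10;
        if (digit == 0) return -1;    // no letter maps to 0 in this scheme
        tmp[len++] = (char) ('a' + digit - 1);
        n /= 10;
    }
    if (len + 1 > bufsize) return -1; // error: output would not fit
    for (int i = 0; i < len; i++)     // digits came out in reverse order
        buf[i] = tmp[len - 1 - i];
    buf[len] = '\0';
    return 0;
}
With char buf[8];, unconvert(12345, buf, sizeof buf) leaves "abcde" in buf.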
unsigned char encryPt[1];
encryPt[0] = (char)1;
I'm trying to find the simplest way to convert a digit (0..9) into the respective character '0'..'9' in Kotlin.
My initial attempt was to write the following code:
fun convertToCharacter() {
    val number = 0
    val character = number.toChar()
    println(character)
}
Of course, after running it, I quickly saw that this produces \u0000, and not '0' like I expected. Then, remembering how this is done in Java, I modified the code to add '0', but then it would not compile.
fun convertToCharacter() {
    val number = 0
    val character = number.toChar() + '0'
    println(character)
}
What is the appropriate way to convert a number into its respective character counterpart in Kotlin? Ideally, I'm trying to avoid pulling up the ASCII table to accomplish this (I know I can add 48 to the number since 48 -> '0' in ASCII).
val character = '0' + number
is the shortest way, given that the number is in the range 0..9.
The Kotlin stdlib provides this function since 1.5.0.
fun Int.digitToChar(): Char
Returns the Char that represents this decimal digit. Throws an exception if this value is not in the range 0..9.
If this value is in 0..9, the decimal digit Char with code '0'.code + this is returned. (The radix examples below use the digitToChar(radix: Int) overload.)
Example
println(5.digitToChar()) // 5
println(3.digitToChar(radix = 8)) // 3
println(10.digitToChar(radix = 16)) // A
println(20.digitToChar(radix = 36)) // K
Like you said, probably the easiest way to convert an Int to the Char representation of that same digit is to add 48 and call toChar():
val number = 3
val character = (number + 48).toChar()
println(character) // prints 3
If you don't want the magic number 48 in your program, you could first convert the number to a String and then use toCharArray()[0] to get the Char representation:
val number = 3
val character = number.toString().toCharArray()[0]
println(character) // prints 3
Edit: in the spirit of the attempt in your question, you can do the math with '0'.toInt() and get the result you were expecting:
val number = 7
val character = (number + '0'.toInt()).toChar()
println(character) // prints 7
How about 0.toString() instead of 0.toChar()? If you are specifically after single digits, then 0.toString()[0] will give you a Char.
You can use an extension like this:
fun Int.toReadableChar(): Char {
    return ('0'.toInt() + this).toChar()
}
You can apply this to any other class you want :)
Example:
println(7.toReadableChar())
>> 7
I am a newbie to golang and want to find a way to define a single byte variable.
It's a demo program from the Effective Go reference.
package main

import (
    "fmt"
)

func unhex(c byte) byte {
    switch {
    case '0' <= c && c <= '9':
        return c - '0'
    case 'a' <= c && c <= 'f':
        return c - 'a' + 10
    case 'A' <= c && c <= 'F':
        return c - 'A' + 10
    }
    return 0
}

func main() {
    // It works fine here, as I wrap the byte in an array.
    c := []byte{'A'}
    fmt.Println(unhex(c[0]))
    // c := byte{'A'} // **Error**: invalid type for composite literal: byte
    // fmt.Println(unhex(c))
}
As you can see, if I wrap the byte in an array, things go fine. But how can I define a single byte without using an array? Thanks.
In your example, this would work, using the conversion syntax T(x):
c := byte('A')
Conversions are expressions of the form T(x) where T is a type and x is an expression that can be converted to type T.
See this playground example.
cb := byte('A')
fmt.Println(unhex(cb))
Output:
10
If you don't want to use the := syntax, you can still use a var statement, which lets you explicitly specify the type, e.g.:
var c byte = 'A'
I'm developing a word game and basically I want to assign an integer value to each character of the alphabet.
Currently, I have a helper to return the value for each char but I'm wondering how I should construct the initial data structure.
At the moment it is a dictionary containing each letter of the alphabet as the key, and I want the points to be the object for that key. What is the best practice for setting up the points object?
I want to avoid things like this:
if (_letter == 'a' || _letter == 'A') _points = 1;
else if (_letter == 'b' || _letter == 'B') _points = 4;
else if (_letter == 'c' || _letter == 'C') _points = 3;
Many Thanks
You could use a 26-element C array of integers, where each integer is the point value of that letter, with the 0th element being A, the 1st element being B, etc. You then just lowercase the letter before subtracting 'a' from it and use that as the index into the array. At this point, the only thing left is to prevent non-ASCII-alphabetic characters from being considered, which can be done with a simple range check after lowercasing (if (c >= 'a' && c <= 'z')).
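A minimal sketch of that approach (the point values are made up; tolower is from <ctype.h>, and pointsForLetter is just an illustrative name):
#include <ctype.h>

static const int pointValues[26] = { 1, 4, 3 /* ...one entry per letter, a through z... */ };

int pointsForLetter(char letter)
{
    char c = (char) tolower((unsigned char) letter); // fold 'A'..'Z' onto 'a'..'z'
    if (c < 'a' || c > 'z') return -1;               // reject non-ASCII-alphabetic input
    return pointValues[c - 'a'];
}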
You can take advantage of the fact that chars are integers in C (and therefore in Objective-C as well), and simply have an array of ints, indexed by the lowercase version of the char minus 'a', like so:
int letterValues[] = {1, 4, 3}; // a = 1, b = 4, etc...
char thisChar = 'B';
int thisCharVal = letterValues[tolower(thisChar) - 'a'];
Note that this uses tolower, which is declared in <ctype.h> (the C standard library), and that subtracting 'a' from a lowercase alpha char is essentially deducing its "index" in the alphabet: 'a' - 'a' = 0, the "first" item, 'b' - 'a' = 1, etc. Therefore, your array initializer ({1, 4, 3}) is simply the values you want to assign to the chars in order (or use designated initializers if you wish, where you can use chars as well):
int letterValues[] = {1, 4, 3, ['z' - 'a'] = 4};
If it's just the standard Latin alphabet, with no umlauts or other special characters, the easiest way is to just make an array of 26 ints for the "point" values:
int LookupPoints(char c)
{
    static const unsigned char Points[26] = {1, 4, 3 /* ... */};
    c = c | 0x20; /* force ASCII lower case */
    assert(c >= 'a' && c <= 'z'); /* assert is from <assert.h> */
    return Points[c - 'a'];
}
If you want to include other characters you could check for them specifically, or make an alphabetically sorted array of structs (with the character and the value) to search with bsearch.
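Here is a sketch of that bsearch variant, assuming the table stays sorted by character; the struct name, the point values, and lookupPoints are illustrative:
#include <stdlib.h>

struct LetterPoints { char letter; int points; };

static const struct LetterPoints table[] = {
    {'a', 1}, {'b', 4}, {'c', 3}, /* ... */ {'z', 4}, // must stay sorted by letter
};

static int compareByLetter(const void *key, const void *elem)
{
    char k = *(const char *) key;
    char e = ((const struct LetterPoints *) elem)->letter;
    return (k > e) - (k < e); // -1, 0 or 1 without overflow
}

int lookupPoints(char c)
{
    const struct LetterPoints *hit = bsearch(&c, table,
        sizeof table / sizeof table[0], sizeof table[0], compareByLetter);
    return hit ? hit->points : -1; // -1 if the character has no entry
}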
EDIT: of course, a dictionary will work too. Just use whatever feels like the easiest solution for your specific case.
I currently have code in Objective-C that can pull out an integer's most significant digit value. My only question is whether there is a better way to do it than what I have below. It gets the job done, but it feels like a cheap hack.
The code takes the number passed in and divides it by a growing power of ten until the quotient is a single digit. The reason I am doing this is for an educational app that splits a number up by place value and shows the values added together to produce the final output (1234 = 1000 + 200 + 30 + 4).
int test = 1;
int result = 0;
int value = 0;
do {
    value = input / test;   // ends up as the most significant digit
    result = test;          // ends up as that digit's place value (1, 10, 100, ...)
    test = [[NSString stringWithFormat:@"%d0", test] intValue]; // appends a '0', i.e. multiplies test by 10 via a string round-trip
} while (value >= 10);
Any advice is always greatly appreciated.
Will this do the trick?
int sigDigit(int input)
{
    int digits = (int) log10(input); /* log10 and pow are from <math.h>; input must be > 0 */
    return input / pow(10, digits);
}
Basically it does the following:
Finds the position of the most significant digit ((int) log10(input), which is one less than the number of digits in input) and stores it in digits.
Divides input by 10 ^ digits.
You should now have the most significant digit.
EDIT: in case you need a function that gets the digit at a specific index, check this function out:
int digitAtIndex(int input, int index)
{
    int trimmedLower = input / (int) pow(10, index); // drop the `index` least significant digits
    int trimmedUpper = trimmedLower % 10;            // keep only the last remaining digit
    return trimmedUpper;
}
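To tie this back to the expanded form in the question (1234 = 1000 + 200 + 30 + 4), here is a rough sketch that walks the digits from most significant to least using digitAtIndex above; log10 and pow are from <math.h>, printf from <stdio.h>, and casting pow to int assumes it returns exact powers of ten (true on typical platforms; an integer-power loop is safer):
void printExpandedForm(int input) // assumes input > 0
{
    int digits = (int) log10(input); // index of the most significant digit
    printf("%d = ", input);
    for (int i = digits; i >= 0; i--)
    {
        int part = digitAtIndex(input, i) * (int) pow(10, i);
        printf("%d%s", part, i > 0 ? " + " : "\n");
    }
}
printExpandedForm(1234) prints "1234 = 1000 + 200 + 30 + 4".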
I am not sure about my English, but I need to get the unit digit of an integer, WITHOUT a complex algorithm; some API call or another trick would be fine.
For example:
int a = 53;
int b = 76;
(I am adding this note because I almost never "meet the quality standards" to post; it drives me crazy! Please fix it. It took me ten tries to post this, among other issues.)
I need to get a = 3 and b = 6 in a simple, smart way.
The same goes for the other digits.
Thanks a lot.
Here is how to split the number into parts:
int unitDigit = a % 10;          // 53 % 10 is 3
int tens = (a - unitDigit) / 10; // (53 - 3) / 10 = 5
You're looking for the % operator:
a = a % 10; // divides 'a' by 10 and assigns the remainder to 'a'
WARNING
Here is how to divide the number into parts:
int unitDigit = a % 10;          // is 3
int tens = (a - unitDigit) / 10; // is (53 - 3) / 10 = 5
This answer is incorrect in general; it only works in a limited number of cases. For example, try to get the first digit of 503 this way: you get 50, not 5.
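If what you actually want is the first (most significant) digit of a number of any length, a simple loop does it; a minimal sketch, assuming you handle negative input yourself:
int firstDigit(int n)
{
    while (n >= 10)
        n /= 10; // strip the last digit until one remains
    return n;    // 503 -> 50 -> 5
}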
It seems the simplest answer (though not very good performance-wise):
int a = ...;
int digit = [[[NSString stringWithFormat:@"%d", a] substringToIndex:1] intValue]; // or use substringWithRange: to get ANY digit
The modulo operator will help you (the units digit is the remainder when the number is divided by 10):
int unitDigit = a % 10;
The following code "gets" the digits of a given number and counts how many of them divide the number exactly.
int findDigits(long long N)
{
    int count = 0;
    long long newN = N;
    while (newN) // kinda like a right shift, one decimal digit at a time
    {
        int div = newN % 10;
        if (div != 0)              // skip zero digits to avoid dividing by zero
            if (N % div == 0) count++;
        newN = newN / 10;
    }
    return count;
}
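A quick usage check (printf is from <stdio.h>; 1012 is an arbitrary example):
int main(void)
{
    // 1012 has digits 1, 0, 1, 2: the 0 is skipped, and 1, 1 and 2 all divide 1012
    printf("%d\n", findDigits(1012)); // prints 3
    return 0;
}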