Get a random alphabet character without creating a list of all letters - Kotlin

I want to get a random alphabet character without creating a list of all letters and picking a random index. I tried this:
import java.util.Random

fun getChar(): Char {
    val alphabets = ('a'..'z').toList()
    return alphabets[Random().nextInt(alphabets.size)]
}

You can use ASCII character codes: generate a random number from 97 (representing 'a') to 122 (representing 'z'), then call toChar() on it to get the corresponding letter, e.g. (97..122).random().toChar().
Also see: ASCII Table
Also, as Madhu Bhat said in the comments, you can use ('a'..'z').random() for a more concise way.

Related

How do I reverse each value in a column bit wise for a hex number?

I have a DataFrame with a column called hexa which has hex values like this (they are of dtype object):
hexa
0 00802259AA8D6204
1 00802259AA7F4504
2 00802259AA8D5A04
I would like to remove the first and last bytes and reverse the remaining bytes as follows:
hexa-rev
0 628DAA592280
1 457FAA592280
2 5A8DAA592280
Please help
I'll show you the complete solution up here and then explain its parts below:
def reverse_bits(bits):
    trimmed_bits = bits[2:-2]
    list_of_bits = [i+j for i, j in zip(trimmed_bits[::2], trimmed_bits[1::2])]
    reversed_bits = [list_of_bits[-i] for i in range(1, len(list_of_bits)+1)]
    return ''.join(reversed_bits)

df['hexa-rev'] = df['hexa'].apply(lambda x: reverse_bits(x))
There are possibly a couple of ways of doing it, but this way should solve your problem. The general strategy is to define a function and then use the apply() method to apply it to all values in the column. It should look something like this:
df['hexa-rev'] = df['hexa'].apply(lambda x: reverse_bits(x))
Now we need to define the function we're going to apply. Breaking it down into its parts: first we strip the first and last byte (the first two and the last two hex characters) by slicing. Because of how negative indexes work, this removes them regardless of the string's length. The result is a string of hex characters that we will process and join back together.
def reverse_bits(bits):
    trimmed_bits = bits[2:-2]
The next line walks through the trimmed string, pairing the first and second hex character of each byte and concatenating them into a single two-character string representing that byte.
def reverse_bits(bits):
    trimmed_bits = bits[2:-2]
    list_of_bits = [i+j for i, j in zip(trimmed_bits[::2], trimmed_bits[1::2])]
The second-to-last line rebuilds that list in reverse order. Lastly, the function joins it back into a single hex string.
def reverse_bits(bits):
    trimmed_bits = bits[2:-2]
    list_of_bits = [i+j for i, j in zip(trimmed_bits[::2], trimmed_bits[1::2])]
    reversed_bits = [list_of_bits[-i] for i in range(1, len(list_of_bits)+1)]
    return ''.join(reversed_bits)
I explained it in reverse order, but the idea is to define the function you want applied to your column and then use apply() to make it happen.
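To see it end to end, here is a short usage sketch on the sample values from the question (the DataFrame construction is illustrative; reverse_bits is the same function defined above):

import pandas as pd

def reverse_bits(bits):
    # identical to the function above
    trimmed_bits = bits[2:-2]
    list_of_bits = [i+j for i, j in zip(trimmed_bits[::2], trimmed_bits[1::2])]
    reversed_bits = [list_of_bits[-i] for i in range(1, len(list_of_bits)+1)]
    return ''.join(reversed_bits)

df = pd.DataFrame({'hexa': ['00802259AA8D6204', '00802259AA7F4504', '00802259AA8D5A04']})
df['hexa-rev'] = df['hexa'].apply(reverse_bits)
print(df['hexa-rev'].tolist())
# ['628DAA592280', '457FAA592280', '5A8DAA592280']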

Dividing the input in Python

I'm doing a project where I need to enter coordinates in the console to get back a place in a grid. My grid is 10*10 and has numbers for the rows and letters for the columns.
I want to be able to input something like A1 and have it interpreted as "column 1, row 1".
So far I have got:
def get_coor():
    user_input = input("Please enter coordinates (row,col) ? ")
    coor = user_input.split(" ")
    return coor
But I'm only able to split if I have a space. Is there any other function to help me in this situation?
Strings are iterable in Python.
If you write:
user_input = input("Please enter coordinates (row,col)?")
<input A1>
Then user_input[0] will be A and user_input[1] will be 1.
Therefore, no need for the split :)
Split is used precisely for the use case when there is a space: it returns a list of all the strings between the occurrences of the character given as an argument (in your case a space).
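Building on that, here is a minimal sketch of get_coor using indexing instead of split (the upper-casing, the int() conversion and the handling of a two-digit row like A10 are illustrative assumptions, not part of the original answer):

def get_coor():
    user_input = input("Please enter coordinates (e.g. A1): ").strip()
    col = user_input[0].upper()   # first character is the column letter, e.g. 'A'
    row = int(user_input[1:])     # the rest is the row number, also handles '10'
    return col, row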

How to shrink a 10-digit number into 2 characters

I have input comprising five upper-case English letters, e.g. ABCDE, and I need to convert this into a unique two-character ASCII output.
e.g. ABCDE and ZZZZZ should give two different outputs.
I have converted ABCDE into hex, which gives me 4142434445, but from this can I get to the two-character output value I require?
Example:
INPUT1 = ABCDE
Converted to hex = 4142434445
INPUT2 = 4142434445
OUTPUT = ?? Any 2 ASCII Characters
Other examples of INPUT1 =
BIRAL
BRMAL
KLAAX
So you're starting with a 5-digit base-26 number, and you want to squeeze that into some 2-digit scheme with base n?
All possible 1-5 digit base-26 numbers give you a number space of 26^5 = 11,881,376.
So you want the minimum n where n^2 >= 11,881,376.
Which gives you 3447 (3446^2 = 11,874,916 falls just short; 3447^2 = 11,881,809 covers it).
Now it's up to you to go and find a suitable glyph block somewhere in UTF where you can reliably block out 3447 separate characters to act as your new base/alphabet, and construct a mapping from your 5-char base-26 ABCDE type number onto your 2-char base-3447 weird-glyph number. Good luck with that.
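As a quick sanity check of that arithmetic, and a sketch of what the mapping could look like, here is a short example (the helper name and the A=0..Z=25 convention are illustrative assumptions):

import math

SPACE = 26 ** 5                    # 11,881,376 possible 5-letter codes
BASE = math.isqrt(SPACE - 1) + 1   # smallest n with n*n >= 26**5, i.e. 3447

def letters_to_int(code):
    # Treat e.g. 'ABCDE' as a base-26 number with A=0 .. Z=25.
    value = 0
    for ch in code:
        value = value * 26 + (ord(ch) - ord('A'))
    return value

value = letters_to_int('ABCDE')
hi_digit, lo_digit = divmod(value, BASE)   # the two "digits" in base 3447
print(BASE, value, (hi_digit, lo_digit))   # 3447 19010 (5, 1775)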
There's not enough variety in ASCII to do this, since it has only 128 characters in total (and only about 95 of them are printable). Even using all 128, limiting yourself to 2 ASCII characters means you can only address a number space of 128^2 = 16,384.

How to choose options in a while loop

My program asks the user to enter a number, and if the number is not in a chosen sequence of numbers (I picked 1, 2, 3), the user has to enter a number again until the number they enter is in the sequence:
a = (1,2,3)
option = int(input(''))
while option != a:
    print('Enter a number between 1 and 3 !!')
    option = int(input(''))
So as you can see I used a tuple for the variable, but I don't know how to make the check work.. =(
Assuming the use of a tuple is obligatory, you will need to read the input as a string, because a string is an iterable type. That lets you easily convert it to ints, character by character, through a list comprehension. Then you have a list of ints, which you simply convert to a tuple. The final option variable looks like:
option = tuple([int(sign) for sign in str(input(''))])
But consider keeping your target as an int instead of a tuple. An int is just as unambiguous when the order matters: in Python, 123 == 132 returns False. That way, you only need to replace:
a = (1,2,3)
with:
a = 123
and the script will work.
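For completeness, here is a minimal sketch wiring the tuple conversion above into the original loop (it assumes the goal is that the whole input must match the sequence (1, 2, 3), and that the user types digits only):

a = (1, 2, 3)
option = tuple(int(sign) for sign in input(''))
while option != a:
    print('Enter the sequence 1 2 3 !!')
    option = tuple(int(sign) for sign in input(''))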

SQL - Create Unique AlphaNumeric based on a 10-digit integer stored as VARCHAR

I'm trying to emulate a function in SQL that a client has produced in Excel. In effect, they have a unique, 10-digit numeric value (VARCHAR) as the primary key in one of their enterprise database systems. Within another database, they require a unique, 5-digit alphanumeric identifier. They want that 5-digit alphanumeric value to be a representation of the 10-digit number. So what they did in excel was to split the 10-digit number into pairs, then convert each of those pairs into a hexadecimal value, then stitch them back together.
The EXCEL equation is:
=IF(VALUE(MID(A2,1,4))>0,DEC2HEX(VALUE(MID(A2,3,2)))&DEC2HEX(VALUE(MID(A2,5,2)))&DEC2HEX(VALUE(MID(A2,7,2)))&DEC2HEX(VALUE(MID(A2,9,2))),DEC2HEX(VALUE(MID(A2,5,2)))&DEC2HEX(VALUE(MID(A2,7,2)))&DEC2HEX((VALUE(MID(A2,9,2)))))
I need the SQL equivalent of this. Of course, should someone out there know a better way to accomplish their goal of "a 5-digit alphanumeric identifier" based off the 10-digit number, I'm all ears.
ADDED 8/2/2011
First of all, thank you to everyone for the replies. Nice to see folks willing to help and even enjoying it! Based on all the responses, I'm apt to tell my client their intent is sound, only their method is off-kilter. I'd also like to recommend a solution. So the challenge remains, just modified slightly:
CHALLENGE: Within SQL, take a 10 digit, unique NUMERIC string and represent it ALPHANUMERICALLY in as few characters as possible. The resulting string must also be unique.
Note that the first 3-4 characters in the 10-digit string are likely to be zeros, and that they could be stripped to shorten the resulting alphanumeric string. Not required, but perhaps helpful.
This problem is inherently impossible. You have a 10 digit numeric value that you want to convert to a 5 digit alphanumeric value. Since there are 10 numeric characters, this means that there are 10^10 = 10 000 000 000 unique values for your 10 digit number. Since there are 36 alphanumeric characters (26 letters + 10 numbers), there are 36^5 = 60 466 176 unique values for your 5 digit number. You cannot map a set of 10 billion elements into a set with around 60 million.
Now, let's take a closer look at what your client's code is doing:
So what they did in excel was to split the 10-digit number into pairs, then convert each of those pairs into a hexadecimal value, then stitch them back together.
This isn't 100% accurate. The Excel code never uses the first 2 digits, but performs this operation on the remaining 8. There are two main problems with this algorithm which may not be intuitively obvious:
Two 10-digit numbers can map to the same 5-digit number. Consider the numbers 1000000117 and 1000001701. The last four digits of 1000000117 get mapped to 1 11, whereas the last four digits of 1000001701 get mapped to 11 1. This causes both to map to 00111.
The 5-digit number may not even end up being 5 digits! For example, 1000001616 gets mapped to 001010.
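Both problems are easy to reproduce. Here is a small sketch of the Excel-style mapping (following the TRUE branch of the formula: drop the first two digits, then DEC2HEX each remaining pair of digits); the helper name is illustrative:

def excel_style(value):
    s = str(value)
    pairs = [s[i:i+2] for i in range(2, 10, 2)]          # digits 3-4, 5-6, 7-8, 9-10
    return ''.join(format(int(p), 'X') for p in pairs)   # DEC2HEX drops leading zeros

print(excel_style(1000000117))  # 00111
print(excel_style(1000001701))  # 00111   <- collision
print(excel_style(1000001616))  # 001010  <- six characters, not five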
So, what is a possible solution? Well, if you don't care if that 5 digit number is unique or not, in MySQL you can use something like:
hex(<NUMERIC VALUE> % 0xFFFFF)
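As a rough illustration of that fallback (ported to Python for readability; MySQL's HEX() should return the same digits, upper-cased and without the 0x prefix):

val = 1234567890
print(hex(val % 0xFFFFF))   # 0x6076b -> at most 5 hex characters, but no longer unique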
The log of 10^10 base 2 is 33.219280948874
> return math.log(10 ^ 10) / math.log(2)
33.219280948874
> = 2 ^ 33.21928
9999993422.9114
So, it takes 34 bits to represent this number. In hex this will take 34/4 = 8.5, i.e. 9 characters, much more than 5.
> return math.log(10 ^ 10) / math.log(16)
8.3048202372184
The Excel macro is ignoring the first 2 (or 4) characters of the 10-character string.
You could try encoding in base 36 instead of 16. This will get you to 7 characters or less.
> return math.log(10 ^ 10) / math.log(36)
6.4254860446923
The popular base 64 encoding will get you to 6 characters
> return math.log(10 ^ 10) / math.log(64)
5.5365468248123
Even Ascii85 encoding won't get you down to 5.
> return math.log(10 ^ 10) / math.log(85)
5.1829075929158
You need base 100 to get to 5 characters
> return math.log(10 ^ 10) / math.log(100)
5
There aren't 100 printable ASCII characters, so this is not going to work, as zkhr explained as well, unless you're willing to go beyond ASCII.
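To make the base-36 option above concrete, here is a small sketch (the digit alphabet 0-9A-Z and the helper name are illustrative assumptions); the largest 10-digit value comes out at 7 characters, as predicted:

import string

DIGITS = string.digits + string.ascii_uppercase   # '0'-'9' then 'A'-'Z': base 36

def to_base36(n):
    # Encode a non-negative integer using the 36-character alphabet above.
    if n == 0:
        return '0'
    out = []
    while n:
        n, r = divmod(n, 36)
        out.append(DIGITS[r])
    return ''.join(reversed(out))

print(to_base36(9999999999))   # '4LDQPDR' -> 7 characters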
I found your question interesting (although I don't claim to know the answer). I googled a bit out of interest and found this, which may help you: http://dpatrickcaldwell.blogspot.com/2009/05/converting-decimal-to-hexadecimal-with.html