How to use ByteArray.getOrElse - kotlin

I don't understand how to specify the default value for the `ByteArray.getOrElse()` function.
I tried:
myInt = dat.getOrElse(0, 0).toInt()
but the compiler complains with the following error:
The integer literal does not conform to the expected type (Int) -> Byte
How to specify the default value?

The expected type of the second argument (defaultValue) is (Int) -> Byte, which is a lambda that takes an Int and returns a Byte.
myInt = dat.getOrElse(index = 100, defaultValue = { i ->
    // use i to calculate the Byte that should be returned...
    // or return a fixed value
    (i * 1).toByte() // for example
}).toInt()
Signature of getOrElse:
fun ByteArray.getOrElse(
    index: Int,
    defaultValue: (Int) -> Byte
): Byte

The second argument is a function literal:
myInt = dat.getOrElse(100, { /* what if there is no element 100 */ 0 })
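For completeness, a minimal runnable sketch of both styles (a fixed default and one computed from the index); the array contents here are made up purely for illustration:
fun main() {
    val dat = byteArrayOf(10, 20, 30)

    // fixed default: ignore the index and return 0
    val a = dat.getOrElse(5) { 0 }.toInt()                      // 0, index 5 is out of bounds

    // computed default: use the requested index to build the fallback Byte
    val b = dat.getOrElse(5) { i -> (i * 2).toByte() }.toInt()  // 10

    println("$a $b") // prints: 0 10
}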

Related

Input out of range for Int datatype, not passing a testcase

I am trying to solve the following question on LeetCode: Write a function that takes an unsigned integer and returns the number of '1' bits it has. Constraints: The input must be a binary string of length 32.
I have written the following code for that, which works fine for the inputs 00000000000000000000000000001011 and 00000000000000000000000010000000 (provided internally by the website), but gives output 0 for the input 11111111111111111111111111111101; in my local compiler the last input produces an "out of range" error.
class Solution {
    // you need to treat n as an unsigned value
    fun hammingWeight(n: Int): Int {
        var num = n
        var setCountBit = 0
        while (num > 0) {
            setCountBit++
            num = num and num - 1
        }
        return setCountBit
    }
}
To correctly convert the binary string to an Int and avoid the "out of range" error, you need to do the following (I believe LeetCode does the same under the hood):
fun binaryStringToInt(s: String): Int = s.toUInt(radix = 2).toInt()
"11111111111111111111111111111101" is equivalent to 4294967293. This is greater than Int.MAX_VALUE, so it will be represented as negative number after .toInt() convertion (-3 in this case).
Actually, this problem could be solved with one-liner in Kotlin 1.4:
fun hammingWeight(n: Int): Int = n.countOneBits()
But LeetCode uses Kotlin 1.3.10, so you need to adjust your solution to handle negative Ints as well.
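For example, here is a minimal sketch of one such adjustment: it is the original loop, only with the condition changed to run while the bit pattern is non-zero, so the num and (num - 1) trick also terminates for negative (two's complement) values:
class Solution {
    fun hammingWeight(n: Int): Int {
        var num = n
        var setBitCount = 0
        // loop on the raw 32-bit pattern; "num > 0" would exit immediately for negative inputs
        while (num != 0) {
            setBitCount++
            num = num and (num - 1) // clears the lowest set bit
        }
        return setBitCount
    }
}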
Please change the type of your input variable from Int to a type like Double. At the moment, the given value is bigger than the maximum value that an Int can store.

Is it possible to create an optional literal in Swift?

I have explained my query in the code snippet below. I am looking for this type of syntax for Obj-C interoperability. Specifically, I see a difference in the behaviour of the aCoder.encode(count, forKey: "count") API when count is Int (non-optional) vs Int? (optional).
import Foundation
let num = 5
// Swift's type system will infer this as Int (non-optional)
print(type(of: num))
// Prints: Int
let optNum: Int? = 5
// This is explicitly typed as an optional Int
print(type(of: optNum))
// Prints Optional<Int>
Is it possible to use a literal to implicitly type a var/let to optional?
// let implicitOptional = 5?
// print(type(of: implicitOptional))
// The above line should print: Optional<Int>
// or
// let implicitOptional = num?
// print(type(of: implicitOptional))
// The above line should print: Optional<Int>
Optional is a plain enum, not some special magic type, so you can create a value using Optional:
let implicitOptional = Optional(5)
print(type(of: implicitOptional)) // Optional<Int>
I don't know why you need this, but you can do it like this:
let opt = 5 as Int?
// or
let opt = Optional(5)
// or
let opt = 5 as Optional // thanks to vacawama for this
Actually, you can even create an operator that returns an optional, but I think it's kinda useless.
postfix operator >?
postfix func >? <T>(value: T) -> T? {
    return Optional(value) // or return value as T?
}
let a = 5>?

What happens with toInt()

I have just started to learn Kotlin and I was writing this code:
val a = "1"
val b = a[0]
val c = b.toInt()
println(c)
When I run the code, the result is 49. What really happened? I thought the result would be 1.
a is a String, which is a CharSequence. That is why you can access a[0] in the first place. a.get(0) or a[0] then returns a Char. A Char, on the other hand, returns its character value when calling toInt(); check also the documentation of toInt().
So your code commented:
val a = "1" // a is a String
val b = a[0] // b is a Char
val c = b.toInt() // c is (an Int representing) the character value of b
If you just want the number, you rather need to parse it, or use whichever answer you like most from: How do I convert a Char to Int?
(one simple way being b.toString().toInt()).
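For illustration, a small self-contained snippet showing the two results side by side:
fun main() {
    val a = "1"
    val b = a[0]                    // Char '1'
    println(b.toInt())              // 49, the character (ASCII/Unicode) code of '1'
    println(b.toString().toInt())   // 1, the parsed numeric value
}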
a is a String; when you access a[index], the return type is Char.
In Kotlin, the Char.toInt() method returns the ASCII code of the character, and that is 49 here.
If you want to get the integer value of "1", just use the toString method first:
val a = "1"
val b = a[0].toString()
val c = b.toInt()
println(c)
prints: 1
In your example a is a String. Under the hood, a String is an array of Chars, and by accessing your String with a[0] you get the first element of this Char array. So you get the Char '1', not the String "1". Now, when you run '1'.toInt() on a Char, it returns the ASCII code of that Char. When you run "1".toInt() on a String, it converts the String into the Int 1. When you need the Int value of the first letter in your String, you need to convert it into a String first:
val a = "123"
val b = a[0].toString() // returns first Char of String "123" and converts to String
val c = b.toInt() // returns Int: 1
or in one line:
"123"[0].toString().toInt()

Why single char and "single char String" not equal when converted to long (.toLong())

I wanted to sum the digits of a Long variable and add the sum to the variable itself, and I came up with the following working code:
private fun Long.sumDigits(): Long {
    var n = this
    this.toString().forEach { n += it.toString().toLong() }
    return n
}
Usage: assert(48.toLong() == 42.toLong().sumDigits())
I had to use it.toString() in order to get it to work, so I wrote the following test, and I don't understand its results:
@Test
fun toLongEquality() {
    println("'4' as Long = " + '4'.toLong())
    println("\"4\" as Long = " + "4".toLong())
    println("\"42\" as Long = " + "42".toLong())
    assert('4'.toString().toLong() == 4.toLong())
}
Output:
'4' as Long = 52
"4" as Long = 4
"42" as Long = 42
Is it good practice to use char.toString().toLong(), or is there a better way to convert a Char to a Long?
Is "4" represented by chars? Why is it not equal to its char representation?
From the documentation:
class Char : Comparable<Char>
Represents a 16-bit Unicode character. On the JVM, non-nullable values of this type are represented as values of the primitive type char.
fun toLong(): Long
Returns the value of this character as a Long.
When you use '4' as a Long, you actually get the Unicode (ASCII) code of the char '4'.
As mTak says, Char represents a Unicode value. If you are using Kotlin on the JVM, you can define your function as follows:
private fun Long.sumDigits() = this.toString().map(Character::getNumericValue).sum().toLong()
There's no reason to return Long rather than Int, but I've kept it the same as in your question.
Non-JVM versions of Kotlin don't have the Character class; use map {it - '0'} instead.
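For reference, a minimal sketch of that platform-independent variant (the same shape as the one-liner above, just with map { it - '0' } in place of Character::getNumericValue; like that one-liner, it returns only the digit sum):
// '0'..'9' have consecutive code points, so subtracting '0' from a digit Char yields its numeric value
private fun Long.sumDigits() = this.toString().map { it - '0' }.sum().toLong()

fun main() {
    println(42L.sumDigits()) // prints: 6
}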

NSMutableData to CConstPointer conversion in Swift needed

The following Swift code (writing bytes to a stream) is rewritten from Objective-C:
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData((pMessage as NSString).dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(pData.bytes, maxLength: pData.length)
pData.bytes is of type COpaquePointer, but CConstPointer<UInt8> is needed by the write operation. Any hints for the correct conversion? Thanks in advance.
As Jack wu has outlined, but somewhat incompletely, the following code works just the same as using the UnsafePointer option:
var byteData = [UInt8]()
pData.getBytes(&byteData)
self.outputStream!.write(byteData, maxLength: pData.length)
From the Swift & Objc interop book section here : https://developer.apple.com/library/prerelease/ios/documentation/swift/conceptual/buildingcocoaapps/InteractingWithCAPIs.html
C Constant Pointers
When a function is declared as taking a CConstPointer argument, it can accept any of the following:
- nil, which is passed as a null pointer
- A CMutablePointer, CMutableVoidPointer, CConstPointer, CConstVoidPointer, or AutoreleasingUnsafePointer value, which is converted to CConstPointer if necessary
- An in-out expression whose operand is an lvalue of type Type, which is passed as the address of the lvalue
- A Type[] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
I believe then it can work like this:
var p: [UInt8] = []
pData.getBytes(&p)
outputStream.write(p, maxLength: pData.length)
I found a simple solution right now, by use of UnsafePointer<T>():
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData((pMessage as NSString).dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(UnsafePointer<UInt8>(pData.bytes), maxLength: pData.length)
@holex: Thanks for your input. I know this solution is not quite Swifty, but it's working for now.