So I am just starting to teach myself a bit of iOS development and am an absolute amateur (please don't be too harsh and forgive questions whose answers might be obvious to you). I wanted to get the following code to work:
var int bildBreite;
var float bildHoehe, bildFlaeche;
var bildBreite = 8;
var bildHoehe = 4.5;
var bildFlaeche = bildBreite * bildHoehe;
but I get the errors shown in this screenshot: http://prntscr.com/6cdkvv
Concerning the screenshot: don't be confused by the "6" at the very right of line 10; that is there because it used to say "6" where the "4.5" now is.
Also, this is an Xcode playground and the code I am trying out is from an Objective-C beginners' guide, if that helps. (This whole Swift and Objective-C mix is very confusing.)
I would really appreciate an answer, and if you need more information to figure it out properly, I am happy to reply as quickly as possible.
All the best,
Leo
Let's go over the code:
var int bildBreite;
var float bildHoehe, bildFlaeche;
You are trying to set the types of some variables in a non-Swift way. To declare a variable of a specific type, you add the type after the variable name: var bildBreite: Int. Also note that class and struct names in Swift start with a capital letter, as #blacksquare noted.
var bildBreite = 8;
var bildHoehe = 4.5;
Here, it looks like you are trying to re-declare the variables. This isn't allowed.
var bildFlaeche = bildBreite * bildHoehe;
You are trying to multiply values of two different number types. Swift does not convert between numeric types implicitly, so this is not allowed. To actually do the math, you need to explicitly convert one value to the other's type.
This version should work:
var bildBreite: Int;
var bildHoehe: Float;
var bildFlaeche: Float;
bildBreite = 8;
bildHoehe = 4.5;
bildFlaeche = Float(bildBreite) * bildHoehe;
The following code is a more concise version of your code:
// Inferred to be an Int
var bildBreite = 8
// This forces it to a Float, otherwise it would be a Double
var bildHoehe: Float = 4.5
// First, we convert bildBreite to a Float, then multiply it by the Float bildHoehe.
// The answer is inferred to be a Float
var bildFlaeche = Float(bildBreite) * bildHoehe
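As a quick check (a tiny sketch; note that print is spelled println in the Swift version current at the time of this question), the result should come out as 8 * 4.5 = 36.0:
print(bildFlaeche) // 36.0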
int should be Int and float should be Float. Type names in Swift always begin with a capital letter.
To declare a variable in Swift, you do this:
var variableName: Int = intValue
Or you can let the compiler decide the type for you:
var variableNameA = 1
var variableNameB = 1.5
If you want to declare a variable and then assign it later, you use an optional or an implicitly unwrapped optional:
var variableNameA: Int?
var variableNameB: Int!
variableNameA = 1
variableNameB = 2
var variableNameC = variableNameA! + variableNameB
As you can see from the last example, variableNameA is an optional (meaning it can hold a nil value), so it must be unwrapped with the ! operator before use; an implicitly unwrapped optional like variableNameB doesn't need to be unwrapped explicitly, but it's less safe and should be used with caution.
Note also that you only use var when you first declare a variable.
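Putting those points together in one small sketch (the variable names are just illustrative):
var score: Int?        // declared once with var; starts as nil
score = 10             // later assignments don't repeat var
let total = score! + 5 // an optional must be unwrapped with ! before use; total is 15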
Good luck!
var bildBreite: Int = 8
var bildHoehe: Float = 4.5
var bildFlaeche: Float = Float(bildBreite) * bildHoehe
You don't need semicolons. Don't try to apply Java or any other language's habits here; learn the Swift syntax and you will be good.
Related
I am trying to solve the following question on LeetCode: Write a function that takes an unsigned integer and returns the number of '1' bits it has. Constraints: The input must be a binary string of length 32.
I have written the following code, which works fine for the inputs 00000000000000000000000000001011 and 00000000000000000000000010000000 (provided internally by the website) but gives output 0 for the input 11111111111111111111111111111101; for that last input my local compiler says "out of range".
class Solution {
    // you need to treat n as an unsigned value
    fun hammingWeight(n: Int): Int {
        var num = n
        var setCountBit = 0
        while (num > 0) {
            setCountBit++
            num = num and (num - 1)
        }
        return setCountBit
    }
}
To correctly convert the binary string to an Int and avoid the "out of range" error, you need to do the following (I believe LeetCode does the same under the hood):
fun binaryStringToInt(s: String): Int = s.toUInt(radix = 2).toInt()
"11111111111111111111111111111101" is equivalent to 4294967293. This is greater than Int.MAX_VALUE, so it will be represented as negative number after .toInt() convertion (-3 in this case).
Actually, this problem can be solved with a one-liner in Kotlin 1.4:
fun hammingWeight(n: Int): Int = n.countOneBits()
But LeetCode uses Kotlin 1.3.10, so you need to adjust your solution to handle negative Ints as well, for example by looping while num != 0 instead of num > 0 (the num and (num - 1) step still clears the lowest set bit for negative two's-complement values).
Please change the type of your input variable from Int to a type like Double. At the moment the given value is bigger than the maximum value that an Int can store.
I have explained my query in the code snippet below. I am looking for this type of syntax for Obj-C interoperability. Specifically, I see a difference in the behaviour of the aCoder.encode(count, forKey: "count") API when count is Int (non-optional) vs Int? (optional).
import Foundation
let num = 5
// Swift's type system will infer this as Int (non-optional)
print(type(of: num))
// Prints: Int
let optNum: Int? = 5
// This is explicitly typed as an optional Int
print(type(of: optNum))
// Prints Optional<Int>
Is it possible to use a literal to implicitly type a var/let to optional?
// let imlicitOptional = 5?
// print(type(of: imlicitOptional))
// The above line should print: Optional<Int>
// or
// let imlicitOptional = num?
// print(type(of: imlicitOptional))
// The above line should print: Optional<Int>
Optional is a plain enum, not any specific magic type, so you can create a value using the Optional initializer directly:
let implicitOptional = Optional(5)
print(type(of: implicitOptional)) // Optional<Int>
I don't know why you need this, but you can do it like this:
let opt = 5 as Int?
// or
let opt = Optional(5)
// or
let opt = 5 as Optional // thanks to vacawama for this
Actually, you can even create an operator that returns an optional, but I think it's kind of useless:
postfix operator >?
postfix func >?<T>(value: T) -> T? {
    return Optional(value) // or return value as T?
}
let a = 5>?
I am getting the error Type 'CUnsignedChar?' has no subscript members, which produces a lot of results on Stack Overflow; however, I can't seem to apply any of the available answers to my example. It's clearly a casting issue, but I don't see how to overcome it.
I am doing an Objective-C to Swift conversion and I have a variable being set up as follows:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer : CUnsignedChar = bBuff1[0]
//..
//..
var backGreyBufferOffset : Int = localTexOffset * 512
var grey_val = 0
self.backGreyBuffer[Int(backGreyBufferOffset)]! = grey_val; //Subscript error here
This is the Objective-C code that uses in-outs.
unsigned char bBuff1[256*512];
unsigned char *backGreyBuffer = &bBuff1[0];
//..
grey_val = 0;
backGreyBuffer[backGreyBufferOffset] = grey_val;
Any suggestions about the right direction would be great.
I noticed that only a small change is needed in your code. You should make backGreyBuffer a pointer:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer = UnsafeMutablePointer(mutating: bBuff1)
// ....
var backGreyBufferOffset = localTexOffset * 512
backGreyBuffer[backGreyBufferOffset] = grey_val
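If you prefer not to hold a raw pointer at all, here is a hedged alternative sketch using withUnsafeMutableBufferPointer, which only borrows the pointer for the duration of the closure (localTexOffset and grey_val are stand-ins taken from the question's context):
var bBuff1 = [CUnsignedChar](repeating: 0, count: 256 * 512)
let localTexOffset = 3                 // assumed value, just for illustration
let grey_val: CUnsignedChar = 0
bBuff1.withUnsafeMutableBufferPointer { backGreyBuffer in
    let backGreyBufferOffset = localTexOffset * 512
    backGreyBuffer[backGreyBufferOffset] = grey_val
}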
I'm trying to capture a window list in a Mac OS X app using Swift. The CGWindowListCreateImageFromArray function requires a CFArray. I've tried several things and this is the closest I've got. Or is there a better way to convert the array?
import Cocoa
// Example swift array of CGWindowID's
var windowIDs = [CGWindowID]();
windowIDs.append(1);
windowIDs.append(2);
// Convert to CFArray using CFArrayCreate
let allocator = kCFAllocatorDefault
let numValues = windowIDs.count as CFIndex
let callbacks: UnsafePointer<CFArrayCallBacks> = nil
var values: UnsafeMutablePointer<UnsafePointer<Void>> = nil
/* how do I convert windowIDs to UnsafeMutablePointer<UnsafePointer<Void>> for the values? */
let windowIDsCFArray = CFArrayCreate(allocator, values, numValues, callbacks);
let capture = CGWindowListCreateImageFromArray(CGRectInfinite, windowIDsCFArray, CGWindowImageOption(kCGWindowListOptionOnScreenOnly));
You can initialize your UnsafeMutablePointer with your array so long as you set your CGWindowIDs to CFTypeRef:
var windows: [CFTypeRef] = [1, 2]
var windowsPointer = UnsafeMutablePointer<UnsafePointer<Void>>(windows)
var cfArray = CFArrayCreate(nil, windowsPointer, windows.count, nil)
Converted Ian's answer to Swift 4:
let windows = [CGWindowID(17), CGWindowID(50), CGWindowID(59)]
let pointer = UnsafeMutablePointer<UnsafeRawPointer?>.allocate(capacity: windows.count)
for (index, window) in windows.enumerated() {
    pointer[index] = UnsafeRawPointer(bitPattern: UInt(window))
}
let array: CFArray = CFArrayCreate(kCFAllocatorDefault, pointer, windows.count, nil)
let capture = CGImage(windowListFromArrayScreenBounds: CGRect.infinite, windowArray: array, imageOption: [])!
let image: NSImage = NSImage(cgImage: capture, size: NSSize.zero)
Swift.print(image)
Arrays in Swift are bridged to NSArray provided they contain objects, i.e., conform to the [AnyObject] type. Since CGWindowID is a UInt32, not an object, you need to convert each element to an NS type; the array's map() method is an elegant approach.
var windows: [CGWindowID] = [CGWindowID(1), CGWindowID(2)]
var array: CFArray = windows.map({NSNumber(unsignedInt: $0)}) as CFArray
This, however, doesn't address the actual CGWindowListCreateImageFromArray problem. Here's a working solution for that:
let windows: [CGWindowID] = [CGWindowID(17), CGWindowID(50), CGWindowID(59)]
let pointer: UnsafeMutablePointer<UnsafePointer<Void>> = UnsafeMutablePointer<UnsafePointer<Void>>.alloc(windows.count)
for var i: Int = 0, n = windows.count; i < n; i++ {
    pointer[i] = UnsafePointer<Void>(bitPattern: UInt(windows[i]))
}
let array: CFArray = CFArrayCreate(kCFAllocatorDefault, pointer, windows.count, nil)
let capture: CGImage = CGWindowListCreateImageFromArray(CGRectInfinite, array, CGWindowImageOption.Default)!
let image: NSImage = NSImage(CGImage: capture, size: NSZeroSize)
Swift.print(image) // <NSImage 0x7f83a3d16920 Size={1440, 900} Reps=("<NSCGImageSnapshotRep:0x7f83a3d2dea0 cgImage=<CGImage 0x7f83a3d16840>>")>
I'm not great at Objective-C (please correct me if I'm wrong), but from what I understand by playing with the SonOfGrab example and the particular piece of code below, the final pointer structure stores the window IDs (UInt32) not in the memory cell it points to (the memory property of an UnsafePointer instance) but in the memory address itself (the hashValue property).
const void *windowIDs[2];
windowIDs[0] = 10;
windowIDs[1] = 20;
It's interesting, since the values aren't stored in memory but inside the addresses themselves; with the oldest architectures being 32-bit, UInt32 values fit perfectly into address pointers. Perhaps back in the days when memory was a limiting factor this made a lot of sense and was a great approach. Spending all night discovering this in Swift in 2016 nearly drove me mad.
What's worse, it fails in an Xcode 7.2 playground with certain window IDs, probably because of the way the playground handles memory, but it works in the actual app.
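To make that observation concrete, here is a small modern-Swift sketch (macOS) of the same trick: the window ID rides in the pointer's address bits, and the pointer is never meant to be dereferenced:
import CoreGraphics

let id: CGWindowID = 42
let fakePointer = UnsafeRawPointer(bitPattern: UInt(id))   // not a real, dereferenceable pointer
let recovered = CGWindowID(UInt(bitPattern: fakePointer))  // reinterpret the address as an integer again
// recovered == 42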
I'm working on converting some Objective-C code to Swift, and I cannot for the life of me figure out how to handle unsigned char arrays and bitwise operations in this specific piece of code.
Specifically, I'm working on converting the following Objective-C code (which deals with CoreBluetooth) to Swift:
unsigned char advertisementBytes[21] = {0};
[self.proximityUUID getUUIDBytes:(unsigned char *)&advertisementBytes];
advertisementBytes[16] = (unsigned char)(self.major >> 8);
advertisementBytes[17] = (unsigned char)(self.major & 255);
I've tried the following in Swift:
var advertisementBytes: CMutablePointer<CUnsignedChar>
self.proximityUUID.getUUIDBytes(advertisementBytes)
advertisementBytes[16] = (CUnsignedChar)(self.major >> 8)
The problem I'm running into is that getUUIDBytes in Swift seems to take only a CMutablePointer<CUnsignedChar> as an argument, rather than an array of CUnsignedChars, so I have no idea how to do the later bitwise operations on advertisementBytes, since it seems it would need to be an unsigned char array for that.
Additionally, CMutablePointer<CUnsignedChar[21]> throws an error saying that fixed length arrays are not supported in CMutablePointers in Swift.
Could anyone please advise on potential work-arounds or solutions? Many thanks.
Have a look at Interacting with C APIs, mostly this section:
C Mutable Pointers
When a function is declared as taking a CMutablePointer<Type> argument, it can accept any of the following:
nil, which is passed as a null pointer
A CMutablePointer<Type> value
An in-out expression whose operand is a stored lvalue of type Type, which is passed as the address of the lvalue
An in-out Type[] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
If you have declared a function like this one:
func takesAMutablePointer(x: CMutablePointer<Float>) { /*...*/ }
You can call it in any of the following ways:
var x: Float = 0.0
var p: CMutablePointer<Float> = nil
var a: Float[] = [1.0, 2.0, 3.0]
takesAMutablePointer(nil)
takesAMutablePointer(p)
takesAMutablePointer(&x)
takesAMutablePointer(&a)
So your code becomes:
var advertisementBytes = CUnsignedChar[](count: 21, repeatedValue: 0) // 21 bytes, as in the Objective-C version
self.proximityUUID.getUUIDBytes(&advertisementBytes)
advertisementBytes[16] = CUnsignedChar(self.major >> 8)
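For reference, here is a minimal sketch of the same packing in current Swift; proximityUUID and major are stand-ins for the question's properties, and the UUID copy uses withUnsafeBytes rather than the original getUUIDBytes call:
import Foundation

let proximityUUID = UUID()        // stand-in for self.proximityUUID
let major: UInt16 = 0x1234        // stand-in for self.major
var advertisementBytes = [UInt8](repeating: 0, count: 21)

// Copy the 16 UUID bytes into the front of the buffer.
withUnsafeBytes(of: proximityUUID.uuid) { uuidBytes in
    advertisementBytes.replaceSubrange(0..<16, with: uuidBytes)
}

// The same bitwise packing as in the Objective-C snippet.
advertisementBytes[16] = UInt8(major >> 8)
advertisementBytes[17] = UInt8(major & 255)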