I am getting the error Type 'CUnsignedChar?' has no subscript members, which produces a lot of results on Stack Overflow; however, I can't seem to apply any of the available answers to my example. It's clearly a casting issue, but I don't see how to overcome it.
I am doing an Obj-C to Swift conversion and I have a variable being set up as follows:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer : CUnsignedChar = bBuff1[0]
//..
//..
var backGreyBufferOffset : Int = localTexOffset * 512
var grey_val = 0
self.backGreyBuffer[Int(backGreyBufferOffset)]! = grey_val; //Subscript error here
This is the Objective-C code that uses in-outs.
unsigned char bBuff1[256*512];
unsigned char *backGreyBuffer = &bBuff1[0];
//..
grey_val = 0;
backGreyBuffer[backGreyBufferOffset] = grey_val;
Any suggestions about the right direction would be great.
I noticed that only a small change is needed in your code. You should make backGreyBuffer a pointer:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer = UnsafeMutablePointer(mutating: bBuff1)
// ....
var backGreyBufferOffset = localTexOffset * 512
backGreyBuffer[backGreyBufferOffset] = grey_val
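One caveat worth adding (my note, not part of the original answer): UnsafeMutablePointer(mutating:) hands back a pointer that Swift only guarantees to be valid for the duration of that call, so the scoped withUnsafeMutableBufferPointer is the safer pattern. A minimal sketch, with placeholder values standing in for the names from the question:
var bBuff1 = [CUnsignedChar](repeating: 0, count: 256 * 512)
let localTexOffset = 3   // placeholder; comes from the question's context
let grey_val = 0

let backGreyBufferOffset = localTexOffset * 512
// The buffer pointer is only guaranteed valid inside the closure.
bBuff1.withUnsafeMutableBufferPointer { backGreyBuffer in
    backGreyBuffer[backGreyBufferOffset] = CUnsignedChar(grey_val)
}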
I have the following code written in Objective-C:
int port1 = SERVER_DEVICE_PORT;
int port2 = SERVER_DEVICE_PORT>>8;
Byte port1Byte[1] = {port1};
Byte port2Byte[1] = {port2};
NSData *port1Data = [[NSData alloc]initWithBytes: port1Byte length: sizeof(port1Byte)];
NSData *port2Data = [[NSData alloc]initWithBytes: port2Byte length: sizeof(port2Byte)];
I have converted it to Swift 3 like so:
let port1: Int = Int(SERVER_DEVICE_PORT)
let port2: Int = Int(SERVER_DEVICE_PORT) >> 8
let port1Bytes: [UInt8] = [UInt8(port1)]
let port2Bytes: [UInt8] = [UInt8(port2)]
let port1Data = NSData(bytes: port1Bytes, length: port1)
let port2Data = NSData(bytes: port2Bytes, length: port2)
However, with this code I am receiving an error.
How can this be fixed?
The easiest way in Swift 3 to get the two lowest bytes from a 32-bit value is:
var SERVER_DEVICE_PORT : Int32 = 55056
let data = Data(buffer: UnsafeBufferPointer(start: &SERVER_DEVICE_PORT, count: 1))
// or let data = Data(bytes: &SERVER_DEVICE_PORT, count: 2)
let port1Data = data[0]
let port2Data = data[1]
print(port1Data, port2Data)
This gives UInt8 values; to get Data values, use
let port1Data = Data([data[0]])
let port2Data = Data([data[1]])
If, for some reason, the 32-bit value is big-endian (most significant byte at the lowest address), then port1Data = data[3] and port2Data = data[2].
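As for the error in the question's own translation, this is hedged guesswork since the message isn't shown, but two problems stand out: length: port1 passes the port value rather than the byte count (the Objective-C code used sizeof(port1Byte), i.e. 1), and UInt8(port1) traps at runtime when the port doesn't fit in one byte. A corrected sketch (modern spelling; Swift 3 used UInt8(truncatingBitPattern:)):
import Foundation

let SERVER_DEVICE_PORT = 55056  // placeholder value, as in the answer above

let port1 = Int(SERVER_DEVICE_PORT)
let port2 = Int(SERVER_DEVICE_PORT) >> 8

// truncatingIfNeeded keeps the low byte instead of trapping.
let port1Bytes: [UInt8] = [UInt8(truncatingIfNeeded: port1)]
let port2Bytes: [UInt8] = [UInt8(truncatingIfNeeded: port2)]

// length must be the byte count, not the port value.
let port1Data = NSData(bytes: port1Bytes, length: port1Bytes.count)
let port2Data = NSData(bytes: port2Bytes, length: port2Bytes.count)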
Hi, I am trying to bind a depth memory buffer, but I get an error and I have no idea why it is popping up.
The depth format is VK_FORMAT_D16_UNORM and the usage is VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT. I have read online that the tiling shouldn't be linear, but then I get a different error. Thanks!!!
The code for creating and binding the image is as below.
VkImageCreateInfo imageInfo = {};
// If the depth format is undefined, fall back to a 16-bit depth format
if (Depth.format == VK_FORMAT_UNDEFINED) {
Depth.format = VK_FORMAT_D16_UNORM;
}
const VkFormat depthFormat = Depth.format;
VkFormatProperties props;
vkGetPhysicalDeviceFormatProperties(*deviceObj->gpu, depthFormat, &props);
if (props.linearTilingFeatures & VK_FORMAT_FEATURE_DEPTH_STENCIL_ATTACHMENT_BIT) {
imageInfo.tiling = VK_IMAGE_TILING_LINEAR;
}
else if (props.optimalTilingFeatures & VK_FORMAT_FEATURE_DEPTH_STENCIL_ATTACHMENT_BIT) {
imageInfo.tiling = VK_IMAGE_TILING_OPTIMAL;
}
else {
std::cout << "Unsupported Depth Format, try other Depth formats.\n";
exit(-1);
}
imageInfo.sType = VK_STRUCTURE_TYPE_IMAGE_CREATE_INFO;
imageInfo.pNext = NULL;
imageInfo.imageType = VK_IMAGE_TYPE_2D;
imageInfo.format = depthFormat;
imageInfo.extent.width = width;
imageInfo.extent.height = height;
imageInfo.extent.depth = 1;
imageInfo.mipLevels = 1;
imageInfo.arrayLayers = 1;
imageInfo.samples = NUM_SAMPLES;
imageInfo.queueFamilyIndexCount = 0;
imageInfo.pQueueFamilyIndices = NULL;
imageInfo.sharingMode = VK_SHARING_MODE_EXCLUSIVE;
imageInfo.usage = VK_IMAGE_USAGE_DEPTH_STENCIL_ATTACHMENT_BIT;
imageInfo.flags = 0;
// Use the image create info to create the image object
result = vkCreateImage(deviceObj->device, &imageInfo, NULL, &Depth.image);
assert(result == VK_SUCCESS);
// Get the image memory requirements
VkMemoryRequirements memRqrmnt;
vkGetImageMemoryRequirements(deviceObj->device, Depth.image, &memRqrmnt);
VkMemoryAllocateInfo memAlloc = {};
memAlloc.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
memAlloc.pNext = NULL;
memAlloc.allocationSize = memRqrmnt.size;
memAlloc.memoryTypeIndex = 0;
// Determine the type of memory required with the help of memory properties
pass = deviceObj->memoryTypeFromProperties(memRqrmnt.memoryTypeBits, 0, /* No requirements */ &memAlloc.memoryTypeIndex);
assert(pass);
// Allocate the memory for image objects
result = vkAllocateMemory(deviceObj->device, &memAlloc, NULL, &Depth.mem);
assert(result == VK_SUCCESS);
// Bind the allocated memory
result = vkBindImageMemory(deviceObj->device, Depth.image, Depth.mem, 0);
assert(result == VK_SUCCESS);
Yes, linear tiling may not be supported for depth-usage images.
Consult the specification and the Valid Usage section of VkImageCreateInfo. The capability is queried with the vkGetPhysicalDeviceFormatProperties and vkGetPhysicalDeviceImageFormatProperties commands. Depth formats are "opaque" anyway, so there is not much reason to use linear tiling.
You seem to be handling this in your code already.
But the error informs you that you are trying to use a memory type that is not allowed for the given image. Use the vkGetImageMemoryRequirements command to query which memory types are allowed.
Possibly you have an error there (you are using 0x1, which is obviously not part of 0x84, per the message). You may want to reuse the example code in the Device Memory chapter of the specification. Provide your memoryTypeFromProperties implementation for a more specific answer.
I accidentally set the typeIndex to 1 instead of i, and it works now. In my defense, I have been Vulkan coding the whole day and my eyes are bleeding :). Thanks for the help.
bool VulkanDevice::memoryTypeFromProperties(uint32_t typeBits, VkFlags requirementsMask, uint32_t *typeIndex)
{
// Search memtypes to find first index with those properties
for (uint32_t i = 0; i < 32; i++) {
if ((typeBits & 1) == 1) {
// Type is available, does it match user properties?
if ((memoryProperties.memoryTypes[i].propertyFlags & requirementsMask) == requirementsMask) {
*typeIndex = i;// was set to 1 :(
return true;
}
}
typeBits >>= 1;
}
// No memory types matched, return failure
return false;
}
I was pointed to this Obj-C snippet from WWDC 14, but I work on a Swift project.
CMIOObjectPropertyAddress prop = {
kCMIOHardwarePropertyAllowScreenCaptureDevices,
kCMIOObjectPropertyScopeGlobal,
kCMIOObjectPropertyElementMaster
};
UInt32 allow = 1;
CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop, 0, NULL, sizeof(allow), &allow);
I then tried rewriting to Swift:
var prop : CMIOObjectPropertyAddress {
kCMIOHardwarePropertyAllowScreenCaptureDevices
kCMIOObjectPropertyScopeGlobal
kCMIOObjectPropertyElementMaster
}
var allow:UInt32 = 1
CMIOObjectSetPropertyData(kCMIOObjectSystemObject, &prop, 0, nil, sizeof(UInt32), &allow)
But it doesn't even compile. I don't know how to translate the CMIOObjectPropertyAddress struct. Xcode says:
/Users/mortenjust/Dropbox/hack/learning/screenrec/screenrec/deleteme.swift:32:61:
Cannot assign to a get-only property 'prop'
A C struct translates as a Swift struct. Use the implicit memberwise initializer:
var prop = CMIOObjectPropertyAddress(
mSelector: UInt32(kCMIOHardwarePropertyAllowScreenCaptureDevices),
mScope: UInt32(kCMIOObjectPropertyScopeGlobal),
mElement: UInt32(kCMIOObjectPropertyElementMaster))
The cool part is that when you type CMIOObjectPropertyAddress(, code completion gives you the rest.
You're right, just got it running right this second. Turns out I also had to correct for some of the types. Here's the complete translation:
var prop = CMIOObjectPropertyAddress(
mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
var allow : UInt32 = 1
var dataSize : UInt32 = 4
var zero : UInt32 = 0
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &prop, zero, nil, dataSize, &allow)
var session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetHigh
var devices = AVCaptureDevice.devices()
for device in AVCaptureDevice.devices() {
println(device)
}
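One optional refinement of the translation above (my suggestion, not from the thread): from Swift 3 on, dataSize can be derived with MemoryLayout instead of hardcoding 4. A sketch of the same call under that assumption:
import CoreMediaIO

var prop = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
var allow: UInt32 = 1
// Let the compiler compute the size of the property value.
let dataSize = UInt32(MemoryLayout<UInt32>.size)
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &prop, 0, nil, dataSize, &allow)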
So I am just starting to teach myself a bit of iOS and am an absolute amateur (please don't be too harsh, and forgive questions whose answers might be obvious to you). I wanted to get the following code to work:
var int bildBreite;
var float bildHoehe, bildFlaeche;
var bildBreite = 8;
var bildHoehe = 4.5;
var bildFlaeche = bildBreite * bildHoehe;
but I get the mistakes seen in this screenshot (link: http://prntscr.com/6cdkvv ).
Concerning the screenshot: don't be confused by the "6" at the very right of line 10; before the "4.5" it used to say "6".
Also, this is an Xcode playground, and the code I am trying out is from an Objective-C beginner's guide, if that helps. (This whole Swift and Objective-C mix is quite confusing.)
I would really appreciate an answer, and if you have more questions, I am happy to answer as quickly as possible.
All the best,
Leo
Let's go over the code:
var int bildBreite;
var float bildHoehe, bildFlaeche;
You are trying to set the types of some variables in a non-Swift way. To declare a variable of a specific type, add the type after the variable name: var bildBreite: Int. Also note that class and struct names in Swift start with a capital letter, as #blacksquare noted.
var bildBreite = 8;
var bildHoehe = 4.5;
Here, it looks like you are trying to re-declare the variables. This isn't allowed.
var bildFlaeche = bildBreite * bildHoehe;
You are trying to multiply values of two different number types, which is not allowed in Swift. To actually do the math, you need to convert one of them to the type of the other.
This version should work:
var bildBreite: Int;
var bildHoehe: Float;
var bildFlaeche: Float;
bildBreite = 8;
bildHoehe = 4.5;
bildFlaeche = Float(bildBreite) * bildHoehe;
The following code is a more concise version of your code:
// Inferred to be an Int
var bildBreite = 8
// This forces it to a Float, otherwise it would be a Double
var bildHoehe: Float = 4.5
// First, convert bildBreite to a Float, then multiply it by the Float bildHoehe.
// The result is inferred to be a Float
var bildFlaeche = Float(bildBreite) * bildHoehe
int should be Int and float should be Float. Class and primitive names in Swift always begin with a capital letter.
To declare a variable in Swift, you do this:
var variableName: Int = intValue
Or you can let the compiler decide the type for you:
var variableNameA = 1
var variableNameB = 1.5
If you want to declare a variable and then assign it later you use an optional or an implicitly unwrapped optional:
var variableNameA: Int?
var variableNameB: Int!
variableNameA = 1
variableNameB = 2
var variableNameC = variableNameA! + variableNameB
As you can see from the last example, variableNameA is optional (meaning it can hold a nil value), so it must be unwrapped before use with the ! operator; an implicitly unwrapped optional like variableNameB doesn't need explicit unwrapping, but it's less safe and should be used with caution.
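As an aside (my addition, not part of the original answer): optional binding with if let is the usual safer alternative to the ! operator, since it skips the body instead of crashing when the value is nil:
var variableNameA: Int? = 1

// if-let runs the body only when the optional actually holds a value.
if let unwrapped = variableNameA {
    print(unwrapped + 1)
}

variableNameA = nil
// Force-unwrapping here would crash; with if-let the body is simply skipped.
if let unwrapped = variableNameA {
    print(unwrapped)
}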
Note also that you only use var when you first declare a variable.
Good luck!
var bildBreite: Int = 8
var bildHoehe: Float = 4.5
var bildFlaeche: Float = Float(bildBreite) * bildHoehe
You don't need semicolons. And don't try to apply Java or any other language to this; learn the syntax and you will be good.
The following Swift code (writing bytes to a stream) is rewritten from Objective-C:
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData((pMessage as NSString).dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(pData.bytes, maxLength: pData.length)
pData.bytes is of type COpaquePointer, but CConstPointer<UInt8> is needed by the write operation. Any hints for the correct conversion? Thanks in advance.
As Jack Wu has outlined, the following code works just the same as the UnsafePointer option (his sketch only needed the buffer pre-sized):
var byteData = [UInt8](count: pData.length, repeatedValue: 0) // pre-size the buffer; getBytes does not grow an empty array
pData.getBytes(&byteData, length: pData.length)
self.outputStream!.write(byteData, maxLength: pData.length)
From the Swift & Objc interop book section here : https://developer.apple.com/library/prerelease/ios/documentation/swift/conceptual/buildingcocoaapps/InteractingWithCAPIs.html
C Constant Pointers
When a function is declared as taking a CConstPointer<Type> argument, it can accept any of the following:
nil, which is passed as a null pointer
A CMutablePointer<Type>, CMutableVoidPointer, CConstPointer<Type>, CConstVoidPointer, or AutoreleasingUnsafePointer<Type> value, which is converted to CConstPointer<Type> if necessary
An in-out expression whose operand is an lvalue of type Type, which is passed as the address of the lvalue
A Type[] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
I believe then it can work like this:
var p = [UInt8](count: pData.length, repeatedValue: 0) // likewise pre-sized
pData.getBytes(&p, length: pData.length)
outputStream.write(p, maxLength: pData.length)
I just found a simple solution, using UnsafePointer<T>():
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData(pMessage.dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(UnsafePointer<UInt8>(pData.bytes), maxLength: pData.length)
#holex: Thanks for your input. I know this solution is not quite Swifty, but it's working for now.
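For readers on current Swift, here is a hedged sketch of the same framing logic with Data and OutputStream (Swift 4+ spellings; \u{01} replaces the old \x01 escape, and the header layout assumes a little-endian host, like the original):
import Foundation

let pMessage = "Device_Description\0\0\u{01}"
let payload = pMessage.data(using: .utf16LittleEndian)!
var pLength = Int32(8 + payload.count)   // 4-byte length + 4-byte type header
var pType = Int32(1)

// Assemble header and payload into one Data value.
var pData = Data()
withUnsafeBytes(of: &pLength) { pData.append(contentsOf: $0) }
withUnsafeBytes(of: &pType) { pData.append(contentsOf: $0) }
pData.append(payload)

let outputStream = OutputStream(toMemory: ())
outputStream.open()
// write(_:maxLength:) still wants an UnsafePointer<UInt8>.
pData.withUnsafeBytes { rawBuffer in
    if let base = rawBuffer.bindMemory(to: UInt8.self).baseAddress {
        _ = outputStream.write(base, maxLength: pData.count)
    }
}
outputStream.close()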