Converting Byte to UInt8 (Swift 3)

I have the following code written in Objective-C:
int port1 = SERVER_DEVICE_PORT;
int port2 = SERVER_DEVICE_PORT>>8;
Byte port1Byte[1] = {port1};
Byte port2Byte[1] = {port2};
NSData *port1Data = [[NSData alloc]initWithBytes: port1Byte length: sizeof(port1Byte)];
NSData *port2Data = [[NSData alloc]initWithBytes: port2Byte length: sizeof(port2Byte)];
I have converted it to Swift 3 like so:
let port1: Int = Int(SERVER_DEVICE_PORT)
let port2: Int = Int(SERVER_DEVICE_PORT) >> 8
let port1Bytes: [UInt8] = [UInt8(port1)]
let port2Bytes: [UInt8] = [UInt8(port2)]
let port1Data = NSData(bytes: port1Bytes, length: port1)
let port2Data = NSData(bytes: port2Bytes, length: port2)
However, with this code I am receiving an error.
How can this be fixed?

The easiest way in Swift 3 to get the two lowest bytes from a 32-bit value is:
var SERVER_DEVICE_PORT : Int32 = 55056
let data = Data(buffer: UnsafeBufferPointer(start: &SERVER_DEVICE_PORT, count: 1))
// or let data = Data(bytes: &SERVER_DEVICE_PORT, count: 2)
let port1Data = data[0]
let port2Data = data[1]
print(port1Data, port2Data)
This yields UInt8 values; to get Data objects, use
let port1Data = Data([data[0]])
let port2Data = Data([data[1]])
If – for some reason – the 32-bit value is big-endian (most significant byte at the smallest address), then port1Data = data[3] and port2Data = data[2].
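Alternatively, plain masking and shifting avoids unsafe pointers altogether, and also sidesteps the trap in UInt8(port1) when the value does not fit into a byte. A sketch; the variable names are illustrative:
let value = Int(SERVER_DEVICE_PORT)
let lowByte = UInt8(value & 0xFF)          // lowest byte
let highByte = UInt8((value >> 8) & 0xFF)  // second-lowest byte
let port1Data = Data([lowByte])
let port2Data = Data([highByte])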

Related

AudioToolbox API AudioConverterNewSpecific returning wrong output packet size when converting to G711u instead of AAC in Objective-C

I am trying to convert raw PCM buffers to the G711u (µ-law) codec on iOS using Audio Toolbox. I see many code examples working for the AAC codec using the same code and APIs, but when I try to do it for PCMU it does not work. The AudioConverter is set up with no errors, but when I query the output packet size with AudioConverterGetProperty it returns a value of 2, which seems wrong to me, because when I use the same code for the AAC codec it returns an output packet size of 600. I paste my code below; thanks in advance for any help.
=====Source Code ===========
OSStatus result = 0;
AudioStreamBasicDescription in = {0}, out = {0};
in.mSampleRate = frequencyInHz;
in.mChannelsPerFrame = channelCount;
in.mBitsPerChannel = 16;
in.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked;
in.mFormatID = kAudioFormatLinearPCM;
in.mFramesPerPacket = 1;
in.mBytesPerFrame = in.mBitsPerChannel * in.mChannelsPerFrame / 8;
in.mBytesPerPacket = in.mFramesPerPacket*in.mBytesPerFrame;
m_in = in;
out.mFormatID = kAudioFormatULaw;
out.mFormatFlags = 0;
out.mFramesPerPacket = kSamplesPerFrame;
out.mSampleRate = frequencyInHz;
out.mChannelsPerFrame = channelCount;
m_out = out;
UInt32 outputBitrate = bitrate;
UInt32 propSize = sizeof(outputBitrate);
UInt32 outputPacketSize = 1024;
const OSType subtype = kAudioFormatULaw;
AudioClassDescription requestedCodecs[2] = {
    {
        kAudioEncoderComponentType,
        subtype,
        kAppleSoftwareAudioCodecManufacturer
    },
    {
        kAudioEncoderComponentType,
        subtype,
        kAppleHardwareAudioCodecManufacturer
    }
};
result = AudioConverterNewSpecific(&in, &out, 2, requestedCodecs, &m_audioConverter);
if(result !=0) {
DLog("Error AudioConverterNewSpecific %x \n", (int)result);
}
result = AudioConverterSetProperty(m_audioConverter, kAudioConverterEncodeBitRate, propSize, &outputBitrate);
if(result !=0) {
DLog("Error AudioConverterSetProperty %x \n", (int)result);
}
result = AudioConverterGetProperty(m_audioConverter, kAudioConverterPropertyMaximumOutputPacketSize, &propSize, &outputPacketSize);
if(result !=0) {
DLog("Error AudioConverterGetProperty %x \n", (int)result);
}
DLog("The output packet size is %x \n", (int)outputPacketSize);
======Output============
Error AudioConverterSetProperty 70726f70
The output packet size is 2
The problem is that the output packet size is returned as 2, and when I use the encoder it gives me 2 bytes of output for each call. I think we are not receiving the correct output packet size. How can this be corrected?
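One debugging hint worth noting: Audio Toolbox OSStatus values are often four-character codes, and 0x70726f70 is ASCII for 'prop', which suggests the converter is rejecting the kAudioConverterEncodeBitRate property – plausibly because µ-law is a fixed-rate codec. A small Swift helper for decoding such statuses might look like this (the function name is illustrative):
import Foundation

// Render an OSStatus as its four-character code, e.g. 0x70726f70 -> "prop".
func fourCharCode(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let shifts: [UInt32] = [24, 16, 8, 0]
    let bytes = shifts.map { UInt8((n >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? String(status)
}

print(fourCharCode(0x70726f70)) // prints "prop"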

CUnsignedChar has no subscript members

I am getting the error Type 'CUnsignedChar?' has no subscript members, which produces a lot of results on Stack Overflow; however, I can't seem to apply any of the available answers to my example. It's clearly a casting issue, but I don't see how to overcome it.
I am doing an Objective-C to Swift conversion and I have a variable set up as follows:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer : CUnsignedChar = bBuff1[0]
//..
//..
var backGreyBufferOffset : Int = localTexOffset * 512
var grey_val = 0
self.backGreyBuffer[Int(backGreyBufferOffset)]! = grey_val; //Subscript error here
This is the Objective-C code that uses in-outs.
unsigned char bBuff1[256*512];
unsigned char *backGreyBuffer = &bBuff1[0];
//..
grey_val = 0;
backGreyBuffer[backGreyBufferOffset] = grey_val;
Any suggestions about the right direction would be great.
I noticed that only a small change is needed in your code. You should make backGreyBuffer a pointer:
var bBuff1 = [CUnsignedChar](repeating: 0, count: Int(256*512))
var backGreyBuffer = UnsafeMutablePointer(mutating: bBuff1)
// ....
var backGreyBufferOffset = localTexOffset * 512
backGreyBuffer[backGreyBufferOffset] = grey_val
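One caveat worth adding: UnsafeMutablePointer(mutating:) applied to an array yields a pointer that is only guaranteed valid for the duration of the call it is passed to, so storing it can leave it dangling. A safer sketch keeps the pointer scoped (the offset and value here are hypothetical):
var bBuff1 = [CUnsignedChar](repeating: 0, count: 256 * 512)
let backGreyBufferOffset = 42          // hypothetical offset
let grey_val: CUnsignedChar = 0
bBuff1.withUnsafeMutableBufferPointer { backGreyBuffer in
    backGreyBuffer[backGreyBufferOffset] = grey_val
}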

How to convert that UnsafeMutablePointer<UnsafeMutablePointer<Float>> variable into AudioBufferList?

I have this EZAudio method in my Swift project, to capture audio from the microphone:
func microphone(microphone: EZMicrophone!, hasAudioReceived bufferList: UnsafeMutablePointer<UnsafeMutablePointer<Float>>, withBufferSize bufferSize: UInt32, withNumberOfChannels numberOfChannels: UInt32) {
}
But what I really need is for that "bufferList" parameter to come in as an AudioBufferList type, in order to send those audio packets through a socket, just like I did in Objective-C:
//Objective C pseudocode:
for(int i = 0; i < bufferList.mNumberBuffers; ++i){
AudioBuffer buffer = bufferList.mBuffers[i];
audio = ["audio": NSData(bytes: buffer.mData, length: Int(buffer.mDataByteSize))];
socket.emit("message", audio);
}
How can I convert that UnsafeMutablePointer<UnsafeMutablePointer<Float>> variable into an AudioBufferList?
I was able to stream audio from the microphone into a socket like this:
func microphone(microphone: EZMicrophone!, hasBufferList bufferList: UnsafeMutablePointer<AudioBufferList>, withBufferSize bufferSize: UInt32, withNumberOfChannels numberOfChannels: UInt32) {
let blist: AudioBufferList = bufferList[0]
let buffer: AudioBuffer = blist.mBuffers
let audio = ["audio": NSData(bytes: buffer.mData, length: Int(buffer.mDataByteSize))]
socket.emit("message", audio) // socket from the Socket.IO client library
}
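If you need every buffer rather than just the first one, the loop from the Objective-C pseudocode can be mirrored with UnsafeMutableAudioBufferListPointer from Core Audio. A sketch; the function name is mine, and the socket is assumed to exist as above:
import AudioToolbox

// Collect each buffer in the list as NSData, mirroring the Objective-C loop.
func allAudioChunks(_ bufferList: UnsafeMutablePointer<AudioBufferList>) -> [NSData] {
    var chunks: [NSData] = []
    for buffer in UnsafeMutableAudioBufferListPointer(bufferList) {
        guard let mData = buffer.mData else { continue }
        chunks.append(NSData(bytes: mData, length: Int(buffer.mDataByteSize)))
    }
    return chunks
}
// Usage: for chunk in allAudioChunks(bufferList) { socket.emit("message", ["audio": chunk]) }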
This general AudioStreamBasicDescription setup worked for me; you might have to tweak it for your own needs or omit some parts, like the waveform animation:
func initializeStreaming() {
var streamDescription:AudioStreamBasicDescription=AudioStreamBasicDescription()
streamDescription.mSampleRate = 16000.0
streamDescription.mFormatID = kAudioFormatLinearPCM
streamDescription.mFramesPerPacket = 1
streamDescription.mChannelsPerFrame = 1
streamDescription.mBytesPerFrame = streamDescription.mChannelsPerFrame * 2
streamDescription.mBytesPerPacket = streamDescription.mFramesPerPacket * streamDescription.mBytesPerFrame
streamDescription.mBitsPerChannel = 16
streamDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger
microphone = EZMicrophone(microphoneDelegate: self, withAudioStreamBasicDescription: streamDescription, startsImmediately: false)
waveview?.plotType=EZPlotType.Buffer
waveview?.shouldFill = false
waveview?.shouldMirror = false
}
It was complicated to get this thing running, good luck!
I believe you would create an AudioBufferList pointer and use the result of the memory property.
let audioBufferList = UnsafePointer<AudioBufferList>(bufferList).memory
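For Swift 3 readers: .memory was renamed .pointee, and the pointer cast now goes through a raw pointer. A hedged equivalent (whether an AudioBufferList really lives behind that pointer depends on the callback, as the accepted answer's delegate signature shows):
import AudioToolbox

func asAudioBufferList(_ bufferList: UnsafeMutablePointer<UnsafeMutablePointer<Float>>) -> AudioBufferList {
    return UnsafeRawPointer(bufferList)
        .assumingMemoryBound(to: AudioBufferList.self)
        .pointee
}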

NSMutableData to CConstPointer conversion in Swift needed

The following Swift code (writing bytes to a stream) is rewritten from Objective-C:
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData((pMessage as NSString).dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(pData.bytes, maxLength: pData.length)
pData.bytes is of type COpaquePointer, but CConstPointer<UInt8> is needed by the write operation. Any hints for the correct conversion? Thanks in advance.
As Jack Wu has outlined, somewhat incompletely, the following code works just the same as using the UnsafePointer option:
var byteData = [UInt8](count: pData.length, repeatedValue: 0) // allocate the full length before copying
pData.getBytes(&byteData, length: pData.length)
self.outputStream!.write(byteData, maxLength: pData.length)
From the Swift & Objective-C interop book section here: https://developer.apple.com/library/prerelease/ios/documentation/swift/conceptual/buildingcocoaapps/InteractingWithCAPIs.html
C Constant Pointers
When a function is declared as taking a CConstPointer<Type> argument, it can accept any of the following:
- nil, which is passed as a null pointer
- A CMutablePointer<Type>, CMutableVoidPointer, CConstPointer<Type>, CConstVoidPointer, or AutoreleasingUnsafePointer<Type> value, which is converted to CConstPointer<Type> if necessary
- An in-out expression whose operand is an lvalue of type Type, which is passed as the address of the lvalue
- A Type[] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
I believe then it can work like this:
var p = [UInt8](count: pData.length, repeatedValue: 0) // size the array first; getBytes does not grow it
pData.getBytes(&p, length: pData.length)
outputStream.write(p, maxLength: pData.length)
I found a simple solution right now, by use of UnsafePointer<T>():
var outputStream : NSOutputStream = NSOutputStream()
var pData : NSMutableData = NSMutableData()
var pType : Int = 1
let pMessage : String = "Device_Description\0\0\x01" // 16BitChar with escapeSequence
var pLength : Int = 8+pMessage.lengthOfBytesUsingEncoding(NSUTF16LittleEndianStringEncoding)
pData.appendBytes(&pLength, length: 4)
pData.appendBytes(&pType, length: 4)
pData.appendData((pMessage as NSString).dataUsingEncoding(NSUTF16LittleEndianStringEncoding))
outputStream.write(UnsafePointer<UInt8>(pData.bytes), maxLength: pData.length)
@holex: Thanks for your input. I know this solution is not quite Swifty, but it's working for now.
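For readers on Swift 3 or later, the same write can be expressed without the CConstPointer machinery at all. A sketch, assuming the original layout of a 4-byte length, a 4-byte type, and a UTF-16LE payload (the Int32 fields and the in-memory stream are my choices, not from the original answer):
let pMessage = "Device_Description"
var pLength = Int32(8 + pMessage.lengthOfBytes(using: .utf16LittleEndian))
var pType: Int32 = 1

var pData = Data()
withUnsafeBytes(of: &pLength) { pData.append(contentsOf: $0) } // 4 bytes, native order (little-endian on iOS)
withUnsafeBytes(of: &pType) { pData.append(contentsOf: $0) }   // 4 bytes, native order
pData.append(pMessage.data(using: .utf16LittleEndian)!)

let outputStream = OutputStream(toMemory: ())
outputStream.open()
let written = pData.withUnsafeBytes { (bytes: UnsafePointer<UInt8>) in
    outputStream.write(bytes, maxLength: pData.count)
}
outputStream.close()
print("wrote \(written) bytes")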

Sample implementation of java CBCBlockCipherMac in Objective c

Can anyone share sample code on how to implement CBCBlockCipherMac in Objective-C? Here is how far I got, and it's giving a different result from the Java implementation.
#include <openssl/cmac.h>
#include <openssl/evp.h>

const unsigned char key[16] = "\x1\x2\x3\x4\x5\x6\x7\x8\x9\x0\x1\x2\x3\x4\x5\x6";
const unsigned char data[14] = "\x54\x68\x69\x73\x69\x73\x6d\x79\x73\x74\x72\x69\x6e\x67";
CMAC_CTX *ctx = CMAC_CTX_new();
int ret = CMAC_Init(ctx, key, sizeof(key), EVP_des_ede3(), 0);
printf("CMAC_Init = %d\n", ret);
ret = CMAC_Update(ctx, data, sizeof(data));
printf("CMAC_Update = %d\n", ret);
size_t size;
//unsigned int size;
unsigned char tag[4];
ret = CMAC_Final(ctx, tag, &size);
printf("CMAC_Final = %d, size = %u\n", ret, size);
CMAC_CTX_free(ctx);
printf("expected: 391d1520\n"
"got: ");
size_t index;
for (index = 0; index < sizeof(tag) - 1; ++index) {
printf("%02x", tag[index]);
if ((index + 1) % 4 == 0) {
printf(" ");
}
}
printf("%02x\n", tag[sizeof(tag) - 1]);
And my Java code looks like this:
import org.bouncycastle.crypto.engines.DESedeEngine;
import org.bouncycastle.crypto.macs.CBCBlockCipherMac;
import org.bouncycastle.crypto.params.DESedeParameters;
import org.bouncycastle.util.encoders.Hex;

String data = "Thisismystring";
String keyString = "1234567890123456";
byte[] mac = new byte[4];
DESedeEngine engine = new DESedeEngine();
CBCBlockCipherMac macCipher = new CBCBlockCipherMac(engine);
DESedeParameters keyParameter = new DESedeParameters(keyString.getBytes());
macCipher.init(keyParameter);
byte[] dataBytes = data.getBytes();
macCipher.update(dataBytes, 0, data.length());
macCipher.doFinal(mac, 0);
byte[] macBytesEncoded = Hex.encode(mac);
String macString = new String(macBytesEncoded);
This gives me "391d1520", but the Objective-C code gives me "01000000".
CMAC is not the same as CBC-MAC. CMAC has an additional step at the beginning and the end of the calculation. If possible I would suggest you upgrade your Java code to use CMAC, as plain CBC-MAC is not as secure, e.g. using org.bouncycastle.crypto.macs.CMac.
OpenSSL does not seem to implement CBC MAC directly (at least, I cannot find any reference to it). So if you need it, you need to implement it yourself.
You can use CBC-mode encryption with a zero IV and take the last cipher block (8 bytes for DES/3DES) as the MAC, truncating it as needed. Of course, this means you need to store the rest of the ciphertext in a buffer somewhere, or you need to use the update functions smartly (reusing the same buffer over and over again for the ciphertext).
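To make that concrete, here is a minimal CBC-MAC sketch in Swift using CommonCrypto. The helper name, the K1||K2||K1 expansion of the 16-byte two-key 3DES key, and the zero-padding of the message are all assumptions that must match whatever the Java side does:
import Foundation
import CommonCrypto

func cbcMac3DES(key16: Data, message: Data, macLength: Int = 4) -> Data? {
    // Assumption: expand the two-key 3DES key K1||K2 to the 24-byte form K1||K2||K1.
    var key = key16
    key.append(key16.prefix(8))

    // Assumption: zero-pad the message to a multiple of the 8-byte DES block size.
    var padded = message
    let blockSize = kCCBlockSize3DES
    let remainder = padded.count % blockSize
    if remainder != 0 {
        padded.append(Data(count: blockSize - remainder))
    }

    // CBC-encrypt with a zero IV (nil means all-zero IV for CCCrypt).
    var ciphertext = Data(count: padded.count)
    var moved = 0
    let status = ciphertext.withUnsafeMutableBytes { (outBytes: UnsafeMutableRawBufferPointer) in
        padded.withUnsafeBytes { (inBytes: UnsafeRawBufferPointer) in
            key.withUnsafeBytes { (keyBytes: UnsafeRawBufferPointer) in
                CCCrypt(CCOperation(kCCEncrypt),
                        CCAlgorithm(kCCAlgorithm3DES),
                        CCOptions(0),            // CBC mode, no built-in padding
                        keyBytes.baseAddress, key.count,
                        nil,                     // zero IV
                        inBytes.baseAddress, padded.count,
                        outBytes.baseAddress, outBytes.count,
                        &moved)
            }
        }
    }
    guard status == CCCryptorStatus(kCCSuccess), moved >= blockSize else { return nil }

    // The CBC-MAC is the last ciphertext block, truncated to the desired length.
    return ciphertext[(moved - blockSize)..<moved].prefix(macLength)
}

let key = Data("1234567890123456".utf8)
let msg = Data("Thisismystring".utf8)
if let mac = cbcMac3DES(key16: key, message: msg) {
    print(mac.map { String(format: "%02x", $0) }.joined()) // 4-byte MAC as hex
}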