What is the equivalent of kAudioFileReadPermission in Swift for AudioFileOpenURL?

Do you know the Swift 3.1 equivalent of the following Objective-C 2.0 code?
theErr = AudioFileOpenURL((__bridge CFURLRef)audioURL, kAudioFileReadPermission, 0, &audioFile);
Any clues?

If you use autocompletion, it will hold your hand through this process.
When you get to the permissions field, hit Enter, go to the end, and type a . (as it's clearly an enumeration), and you can see the options.
The result is:
var audioURL: URL = ...
var audioFileID: AudioFileID?
let status = AudioFileOpenURL(audioURL as CFURL, .readPermission, .allZeros, &audioFileID)

CFURLRef has been renamed to CFURL. URL is toll-free bridged to CFURL, so all you need to do is cast a URL to CFURL using as.
Here's the updated code for Swift 4:
let audioURL: URL = ...
var audioFile: AudioFileID?
let status = AudioFileOpenURL(audioURL as CFURL,
                              .readPermission,
                              0,
                              &audioFile)
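For completeness, here is a minimal usage sketch (not from the original answer; the "sample.m4a" resource name is hypothetical): check the returned OSStatus against noErr, and close the file with AudioFileClose when you are done.
import AudioToolbox

if let audioURL = Bundle.main.url(forResource: "sample", withExtension: "m4a") {
    var audioFile: AudioFileID?
    let status = AudioFileOpenURL(audioURL as CFURL, .readPermission, 0, &audioFile)
    if status == noErr, let audioFile = audioFile {
        // Read packets or properties here ...
        AudioFileClose(audioFile)
    } else {
        print("AudioFileOpenURL failed with status \(status)")
    }
}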

How to read file comment field

In the OS X Finder there is a 'Comment' file property. It can be checked in the Finder by adding a 'Comment' column, or edited/checked after right-clicking on a file or folder and selecting 'Get Info'.
How can I read this value in Swift or Objective-C?
I have already checked NSURL, and none of its properties seems to be the right one.
Do not use the low-level extended attributes API to read Spotlight metadata. There's a proper Spotlight API for that (it's called the File Metadata API). Not only is the extended attributes API a pain in the neck, there's also no guarantee that Apple will keep using the same extended attribute to store this information.
Use MDItemCreateWithURL() to create an MDItem for the file. Use MDItemCopyAttribute() with kMDItemFinderComment to obtain the Finder comment for the item.
Putting the pieces together (Ken Thomases's reading answer above, and this writing answer link), you can extend URL with a computed property whose getter and setter read/write the comments of your files:
update: Xcode 8.2.1 • Swift 3.0.2
extension URL {
    var finderComment: String? {
        get {
            guard isFileURL else { return nil }
            return MDItemCopyAttribute(MDItemCreateWithURL(kCFAllocatorDefault, self as CFURL), kMDItemFinderComment) as? String
        }
        set {
            guard isFileURL, let newValue = newValue else { return }
            let script = "tell application \"Finder\"\n" +
                String(format: "set filePath to \"%@\" as posix file \n", absoluteString) +
                String(format: "set comment of (filePath as alias) to \"%@\" \n", newValue) +
                "end tell"
            guard let appleScript = NSAppleScript(source: script) else { return }
            var error: NSDictionary?
            appleScript.executeAndReturnError(&error)
            if let error = error {
                print(error[NSAppleScript.errorAppName] as! String)
                print(error[NSAppleScript.errorBriefMessage] as! String)
                print(error[NSAppleScript.errorMessage] as! String)
                print(error[NSAppleScript.errorNumber] as! NSNumber)
                print(error[NSAppleScript.errorRange] as! NSRange)
            }
        }
    }
}
As explained in the various answers to Mac OS X : add a custom meta data field to any file,
Finder comments can be read and set programmatically with getxattr() and setxattr(). They are stored as the extended attribute "com.apple.metadata:kMDItemFinderComment", and the value is a property list.
This works even for files not indexed by Spotlight, such as those on a network server volume.
From the Objective-C code here and here I made this simple Swift function to read the Finder comment (now updated for Swift 4 and later):
func finderComment(url: URL) -> String? {
    let XAFinderComment = "com.apple.metadata:kMDItemFinderComment"
    let data = url.withUnsafeFileSystemRepresentation { fileSystemPath -> Data? in
        // Determine attribute size:
        let length = getxattr(fileSystemPath, XAFinderComment, nil, 0, 0, 0)
        guard length >= 0 else { return nil }
        // Create buffer with required size:
        var data = Data(count: length)
        // Retrieve attribute:
        let result = data.withUnsafeMutableBytes { [count = data.count] in
            getxattr(fileSystemPath, XAFinderComment, $0.baseAddress, count, 0, 0)
        }
        guard result >= 0 else { return nil }
        return data
    }
    // Deserialize to String:
    guard let data = data, let comment = try? PropertyListSerialization.propertyList(from: data,
                                                                                     options: [],
                                                                                     format: nil) as? String else {
        return nil
    }
    return comment
}
Example usage:
let url = URL(fileURLWithPath: "/path/to/file")
if let comment = finderComment(url: url) {
    print(comment)
}
The function returns an optional string which is nil if the file
has no Finder comment, or if anything went wrong while retrieving it.
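For the writing direction, here is a minimal sketch along the same lines (my own addition, not from the linked answers): it serializes the string as a binary property list and stores it with setxattr(). Note that the Finder may not show the change until Spotlight reindexes the file, which is one reason the AppleScript approach above can be preferable.
func setFinderComment(url: URL, comment: String) -> Bool {
    let XAFinderComment = "com.apple.metadata:kMDItemFinderComment"
    // Serialize the comment as a binary property list:
    guard let data = try? PropertyListSerialization.data(fromPropertyList: comment,
                                                         format: .binary,
                                                         options: 0) else { return false }
    // Store it as the extended attribute:
    let result = url.withUnsafeFileSystemRepresentation { fileSystemPath in
        data.withUnsafeBytes {
            setxattr(fileSystemPath, XAFinderComment, $0.baseAddress, data.count, 0, 0)
        }
    }
    return result >= 0
}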

Offline rendering with The Amazing Audio Engine

This post is also posted on The Amazing Audio Engine forum.
Hi everyone, I am new to The Amazing Audio Engine and iOS dev, and have been trying to figure out how to get the BPM of a track.
So far I have found two articles on offline rendering on the forum:
http://forum.theamazingaudioengine.com/discussion/comment/1743/#Comment_1743
http://forum.theamazingaudioengine.com/discussion/comment/649#Comment_649
As far as I know, the AEAudioControllerRenderMainOutput function is only correctly implemented in this fork.
I am trying to do offline rendering to process a track, and then use the algorithm described here (JavaScript) and implemented here.
So far I'm loading this fork, and I am using Swift (I am part of the Make School Summer Academy at the moment, which teaches Swift).
When playing a track, this code works for me (no offline rendering!):
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
    let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    // Advance the buffer by sizeof(Float) * 512
    let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")
}
audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)
audioController.start()
Trying offline rendering
This is the code I am trying to run while using this fork:
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()
var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]
t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()
println("renderDuration \(renderDuration)")
var outIsOpen = Boolean()
AUGraphClose(audioController.audioGraph)
AUGraphIsOpen(audioController.audioGraph, &outIsOpen)
println("AUGraphIsOpen: \(outIsOpen)")
for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
    AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
    t.mSampleTime += Float64(bufferLength)
    println(t.mSampleTime)
    let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
    let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}
AEFreeAudioBufferList(buffer)
AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()
Offline rendering is not working for me at the moment. The second example fails with a lot of mixed errors that I don't understand.
A very common one is inside the channelAudioProducer function on this line:
// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit,
                         arg->ioActionFlags, &arg->originalTimeStamp, 0, *frames, audio);
It gives me EXC_BAD_ACCESS (code=EXC_I386_GPFLT); among the other errors, this one is very common.
I am sorry, I am a total noob in this field, but there is some stuff I don't really understand. Should I use nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription? How does each affect the mData layout?
I would love to get some help on this, since I'm kind of lost at the moment. When you answer, please try to explain as fully as you can; I am new to this stuff.
NOTE: Posting code in Objective-C is fine if you don't know Swift.
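A minimal sketch regarding the mData question above (an editorial addition, not from the forum threads): with a non-interleaved stereo description, the AudioBufferList carries one AudioBuffer per channel, so the right channel lives in the second buffer rather than at a fixed offset of 512 samples into the first. In current Swift, CoreAudio's UnsafeMutableAudioBufferListPointer makes the per-channel access explicit:
import CoreAudio

// Sketch: copy per-channel samples out of a non-interleaved stereo
// AudioBufferList (Float samples assumed).
func channelSamples(from bufferList: UnsafeMutablePointer<AudioBufferList>,
                    frames: Int) -> (left: [Float], right: [Float])? {
    let buffers = UnsafeMutableAudioBufferListPointer(bufferList)
    guard buffers.count >= 2,
          let leftData = buffers[0].mData,
          let rightData = buffers[1].mData else { return nil }
    let left = leftData.bindMemory(to: Float.self, capacity: frames)
    let right = rightData.bindMemory(to: Float.self, capacity: frames)
    return (Array(UnsafeBufferPointer(start: left, count: frames)),
            Array(UnsafeBufferPointer(start: right, count: frames)))
}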

How to pass raw JSON to a POST request in Swift?

Hi, I am new to Swift, so please bear with me.
I need to post to a particular API, but the API is not a fan of key-value pairs; it expects raw JSON as the POST data.
I use this library here to make the POST request.
This is my code:
func postItem(itemname: String, itemnumber: Int, itemcode: String, url: String, baseURL: String, completion: (result: Dictionary<String, AnyObject>) -> ()) {
    var dict: Dictionary<String, AnyObject>!
    var params: Dictionary<String, AnyObject> = ["parentItem": ["itemname": itemname, "itemnumber": itemnumber, "itemcode": itemcode]]
    let data = NSJSONSerialization.dataWithJSONObject(params, options: NSJSONWritingOptions.PrettyPrinted, error: nil)
    let string = NSString(data: data!, encoding: NSUTF8StringEncoding)
    var request = HTTPTask()
    request.requestSerializer = JSONRequestSerializer()
    request.requestSerializer.headers[headerKey] = getToken() // example of adding a header value
    request.POST(url, parameters: params, success: {(response: HTTPResponse) in
        if response.responseObject != nil {
            let data = response.responseObject as NSData
            var error: NSError?
            dict = NSJSONSerialization.JSONObjectWithData(data, options: NSJSONReadingOptions.MutableContainers, error: &error) as Dictionary<String, AnyObject>
            completion(result: dict)
        }
    }, failure: {(error: NSError, response: HTTPResponse?) in
        dict = ["error": "error"]
        completion(result: dict)
    })
}
I need to pass this kind of raw JSON to the API,
e.g. {"parentItem": {"itemname":"Cocoa","itemnumber":123,"itemcode":"cocoa-12-A"}}
but when I println my params, because it is a dictionary it generates something like
["parentItem": ["itemname"="Cocoa"; "itemnumber"=123; "itemcode"="cocoa-12-A"]]
I just couldn't convert the params to JSON, because the library I'm using expects a dictionary, and I'm having a hard time creating my own class.
Could anyone help me? Any comments and suggestions would do. Thanks in advance.
Why not use the Alamofire framework? It's pretty good and sends standard JSON.
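If you'd rather stay with Foundation, here is a minimal sketch of posting a raw JSON body (written in current Swift with URLSession; the endpoint URL is a placeholder):
import Foundation

let params: [String: Any] = ["parentItem": ["itemname": "Cocoa",
                                            "itemnumber": 123,
                                            "itemcode": "cocoa-12-A"]]
var request = URLRequest(url: URL(string: "https://example.com/api/items")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
// Serialize the dictionary to raw JSON data for the request body:
request.httpBody = try? JSONSerialization.data(withJSONObject: params)

URLSession.shared.dataTask(with: request) { data, response, error in
    if let data = data,
       let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any] {
        print(json)
    }
}.resume()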

Send and Receive MIDI with Swift and CoreMIDI

I'm not very familiar with Swift/Objective-C or the Cocoa environment and I've been having a lot of trouble figuring out how to send or listen for data from a USB device with CoreMIDI. I'm trying to send the message (144, 36, 5) to my MIDI controller (an Ableton Push) which I have accomplished before using the Bitwig Studio Scripting API. I haven't been able to find much on this other than Apple's docs and they haven't been particularly helpful for me. So far I've figured out how to get a list of devices and check out their names, but beyond that I'm stuck.
What I've written so far trying to send MIDI:
var pushDevice = MIDIGetDevice(2)
var secondEntity = MIDIDeviceGetEntity(pushDevice, 1)
var pushDestination = MIDIEntityGetDestination(secondEntity, 0)
var midiPort = MIDIPortRef()
let myData : [Byte] = [ Byte(144), Byte(36), Byte(5) ]
var packet = UnsafeMutablePointer<MIDIPacket>.alloc(1)
var pkList = UnsafeMutablePointer<MIDIPacketList>.alloc(1)
packet = MIDIPacketListInit(pkList)
packet = MIDIPacketListAdd(pkList, 1024, packet, 0, 3, myData)
MIDISend(midiPort, pushDestination, pkList)
I feel like a bit of a goof for not being able to figure this out, I imagine it has to be a simple solution that I'm just not able to figure out for some reason. I don't think I'm properly constructing the MIDIPacketList or the MIDIPort, and I have no idea how to go about creating a callback and listening for MIDI messages.
I figured out how to send MIDI data by grabbing the device via its unique ID. I'm not sure how memory management works in Swift, so keep that in mind. I will check back in later if I figure out how to properly create a callback and listen for MIDI input.
import Foundation
import CoreMIDI
var midiClient = MIDIClientRef()
var result = MIDIClientCreate("Foo Client", nil, nil, &midiClient)
var outputPort = MIDIPortRef()
result = MIDIOutputPortCreate(midiClient, "Output", &outputPort);
var endPoint = MIDIObjectRef()
var foundObj = MIDIObjectType()
result = MIDIObjectFindByUniqueID(UNIQUE_ID, &endPoint, &foundObj)
var pkt = UnsafeMutablePointer<MIDIPacket>.alloc(1)
var pktList = UnsafeMutablePointer<MIDIPacketList>.alloc(1)
let midiData : [Byte] = [Byte(144), Byte(36), Byte(5)]
pkt = MIDIPacketListInit(pktList)
pkt = MIDIPacketListAdd(pktList, 1024, pkt, 0, 3, midiData)
MIDISend(outputPort, endPoint, pktList)
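For the receiving side, here is a minimal sketch (my own addition, not from the original answer; it assumes a Swift 3+ toolchain, where MIDIInputPortCreateWithBlock is available):
import CoreMIDI

var inClient = MIDIClientRef()
MIDIClientCreate("Listener Client" as CFString, nil, nil, &inClient)

var inputPort = MIDIPortRef()
// The read block is invoked on a high-priority CoreMIDI thread.
MIDIInputPortCreateWithBlock(inClient, "Input" as CFString, &inputPort) { packetList, _ in
    var packet = packetList.pointee.packet
    for _ in 0..<packetList.pointee.numPackets {
        // packet.data is a fixed-size tuple; Mirror lets us read its bytes.
        let bytes = Mirror(reflecting: packet.data).children.prefix(Int(packet.length))
        print("MIDI in:", bytes.map { $0.value as! UInt8 })
        packet = MIDIPacketNext(&packet).pointee
    }
}

// Connect every available MIDI source to the input port.
for i in 0..<MIDIGetNumberOfSources() {
    MIDIPortConnectSource(inputPort, MIDIGetSource(i), nil)
}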

Quickblox QBLGeoDataGetRequest: currentPosition is always "200.000000;200.000000"

I'm trying to get all the check-ins near some point on the map:
QBLGeoDataGetRequest *getRequest = [QBLGeoDataGetRequest request];
getRequest.currentPosition = CLLocationCoordinate2DMake(10.0, 10.0);
getRequest.radius = 100;
[QBLocation geoDataWithRequest:getRequest delegate:self];
But strangely this code transforms to the following request:
GET http://api.quickblox.com/geodata/find.xml
headers:{
"QB-SDK" = "iOS 1.5.2";
"Qb-Token" = xxxxxxxxxxxxxxxxxxxxxxx;
"QuickBlox-REST-API-Version" = "0.1.1";
}
parameters:{
"current_position" = "200.000000;200.000000";
radius = 100;
}
"current_position" is always "200.000000;200.000000", whatever I change getRequest.currentPosition to.
But why? And how to fix this?
P.S. There's no code that could cause any side effects, I just authorize and perform the QBLGeoDataGetRequest.
Just try the updated QuickBlox iOS SDK 1.7: http://quickblox.com/developers/IOS#Download_iOS_SDK
This feature works OK in the latest version.