How to retrieve a vCard in XMPP: which delegate method and flow is used to get a vCard in a Cocoa OS X app? - objective-c

I have implemented the XMPP framework in my Cocoa OS X application and am currently working on vCards. I have managed to set all the required fields of the logged-in user's vCard, and it is stored successfully, but I have found no proper way to retrieve the logged-in user's stored vCard. Please give me a solution; I have been stuck on this for the last 3 days.
Thanks in advance.
I have used the code below to set the vCard fields:
dispatch_queue_t queue = dispatch_queue_create("queue", DISPATCH_QUEUE_SERIAL); // the attribute must be a queue type, not a priority constant
dispatch_async(queue, ^{
    XMPPvCardCoreDataStorage *xmppvCardStorage = [XMPPvCardCoreDataStorage sharedInstance];
    XMPPvCardTempModule *xmppvCardTempModule = [[XMPPvCardTempModule alloc] initWithvCardStorage:xmppvCardStorage];
    [xmppvCardTempModule activate:[self xmppStream]];

    XMPPvCardTemp *myvCardTemp = [xmppvCardTempModule myvCardTemp];
    if (!myvCardTemp)
    {
        // No stored vCard yet: build a fresh vcard-temp element (XEP-0054)
        NSXMLElement *vCardXML = [NSXMLElement elementWithName:@"vCard" xmlns:@"vcard-temp"];
        XMPPvCardTemp *newvCardTemp = [XMPPvCardTemp vCardTempFromElement:vCardXML];
        [newvCardTemp setName:@"vCard"];
        [newvCardTemp setNickname:lbl.stringValue];
        [newvCardTemp setFormattedName:@"abc"];
        [newvCardTemp setDesc:lbl_abt.stringValue];
        [xmppvCardTempModule updateMyvCardTemp:newvCardTemp];
    }
    else
    {
        // Update the existing vCard in place
        [myvCardTemp setName:@"vCard"];
        [myvCardTemp setNickname:lbl.stringValue];
        [myvCardTemp setFormattedName:@"abc"];
        [myvCardTemp setDesc:lbl_abt.stringValue];
        [xmppvCardTempModule updateMyvCardTemp:myvCardTemp];
    }
});
And I also tried the code below to retrieve the vCard:
/*
XMPPvCardTempModule *xmppvCardTempModule;
XMPPvCardTemp *vCard = [xmppvCardTempModule vCardTempForJID:[XMPPJID jidWithString:@"xxxx"] shouldFetch:YES];
NSLog(@"V CARD: %@", vCard.nickname);
*/
XMPPvCardCoreDataStorage *xmppvCardStorage = [XMPPvCardCoreDataStorage sharedInstance];
XMPPvCardTempModule *m = [[XMPPvCardTempModule alloc] initWithvCardStorage:xmppvCardStorage];
[m fetchvCardTempForJID:[XMPPJID jidWithString:@"xxxx"] ignoreStorage:YES];
NSLog(@"%@", xmppvCardStorage.description);
Please suggest a proper way to solve this problem.

Related

Offline rendering with The Amazing Audio Engine

This post is also posted on The Amazing Audio Engine forum.
Hi everyone, I am new to The Amazing Audio Engine and iOS dev, and have been trying to figure out how to get the BPM of a track.
So far I have found two articles on offline rendering on the forum:
http://forum.theamazingaudioengine.com/discussion/comment/1743/#Comment_1743
http://forum.theamazingaudioengine.com/discussion/comment/649#Comment_649
As far as I know the AEAudioControllerRenderMainOutput function is only correctly implemented in this fork.
I am trying to do offline rendering to process a track and then use the algorithm described here (JavaScript) and implemented here.
So far I'm loading this fork, and I am using Swift (I am part of Make School Summer Academy at the moment, which teaches Swift).
When playing a track, this code works for me (no offline rendering!):
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())
let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
    let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    // Advance the buffer sizeof(float) * 512
    let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")
}
audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)
audioController.start()
Trying offline rendering
This is the code I am trying to run while using this fork:
audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())
let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)
audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()
var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]
t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()
println("renderDuration \(renderDuration)")
var outIsOpen = Boolean()
AUGraphClose(audioController.audioGraph)
AUGraphIsOpen(audioController.audioGraph, &outIsOpen)
println("AUGraphIsOpen: \(outIsOpen)")
for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
    AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)
    t.mSampleTime += Float64(bufferLength)
    println(t.mSampleTime)
    let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
    let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}
AEFreeAudioBufferList(buffer)
AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()
Offline rendering is not working for me at the moment. The second example does not work; it gives me a lot of mixed errors which I don't understand.
A very common one is inside the channelAudioProducer function on this line:
// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit, arg->ioActionFlags, &arg->originalTimeStamp, 0, *frames, audio);
It gives me EXC_BAD_ACCESS (code=EXC_I386_GPFLT). Among other errors this one is very common.
I am sorry, I am a total noob in this field, and there is some stuff I don't really understand. Should I use nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription? How do these lay out the mData?
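For background, this is how the two kinds of stream description generally lay out mData in Core Audio (a sketch in C/Objective-C, not specific to The Amazing Audio Engine; the function name and frame index are made up for illustration):
#import <AudioToolbox/AudioToolbox.h>
#include <stdio.h>

// Non-interleaved stereo: mNumberBuffers == 2, one channel per buffer.
// Interleaved stereo: mNumberBuffers == 1, frames alternate L,R,L,R,...
static void ReadStereoFrame(const AudioBufferList *abl, UInt32 frame)
{
    if (abl->mNumberBuffers == 2) {
        // e.g. nonInterleavedFloatStereoAudioDescription
        const float *left  = (const float *)abl->mBuffers[0].mData;
        const float *right = (const float *)abl->mBuffers[1].mData;
        printf("L=%f R=%f\n", left[frame], right[frame]);
    } else {
        // Interleaved: both channels share one buffer
        const float *samples = (const float *)abl->mBuffers[0].mData;
        printf("L=%f R=%f\n", samples[2 * frame], samples[2 * frame + 1]);
    }
    // With a 16-bit description the samples are SInt16, not float:
    // cast to const SInt16 * and divide by 32768.0f if you need floats.
}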
I would love to get some help on this since I'm kind of lost at the moment. Please, when you answer, try to explain as fully as you can; I am new to this stuff.
NOTE: Posting code in Objective-C is fine if you don't know Swift.

Set Custom KeyEquivalent in Services Menu

OmniFocus has a Cocoa Service that allows you to create tasks based upon selected items.
It has a preference that allows you to set the keyboard shortcut that triggers the Service. This is not just a global hotkey, it's a bona fide Service that shows up in the menu.
You can set the keyboard shortcut to pretty much any combination, including combinations with ⌥ and ^. This functionality is not documented; the docs seem to say that KeyEquivalents must be ⌘+[⇧]+someKey.
Once this is set, I observe three things:
The OmniFocus Info.plist file does not contain a KeyEquivalent listed. This is not surprising, as the file is read-only.
The pbs -dump_pboard utility lists NSKeyEquivalent = {}; for the service.
Using NSDebugServices lists this interesting line that does not show up with most debugging sessions (Obviously, for keyboard shortcut ⌃⌥⌘M): OmniFocus: Send to Inbox (com.omnigroup.OmniFocus) has a custom key equivalent: <NSKeyboardShortcut: 0x7fb18a0d18f0 (⌃⌥⌘M)>.
So my questions are twofold, and I suspect they are related:
How do you dynamically change a service's KeyEquivalent?
How do you set the KeyEquivalent to a combination including ⌃ and ⌥?
Thank you!
Figured it out. The basic process is described here: Register NSService with Command Alt NSKeyEquivalent
The code is this:
// Bundle identifier from Info.plist
NSString* bundleIdentifier = @"com.whatever.MyApp";
// Services -> Menu -> Menu item title from Info.plist
NSString* appServiceName = @"Launch My Service";
// Services -> Instance method name from Info.plist
NSString* methodNameForService = @"myServiceMethod";
// The key equivalent (@ = Command, ~ = Option)
NSString* keyEquivalent = @"@~r";
CFStringRef serviceStatusName = (CFStringRef)[NSString stringWithFormat:@"%@ - %@ - %@", bundleIdentifier, appServiceName, methodNameForService];
CFStringRef serviceStatusRoot = CFSTR("NSServicesStatus");
CFPropertyListRef pbsAllServices = (CFPropertyListRef)CFMakeCollectable( CFPreferencesCopyAppValue(serviceStatusRoot, CFSTR("pbs")) );
// NULL means the user has not configured any custom services yet
BOOL otherServicesDefined = pbsAllServices != NULL;
BOOL ourServiceDefined = NO;
if ( otherServicesDefined ) {
    ourServiceDefined = NULL != CFDictionaryGetValue((CFDictionaryRef)pbsAllServices, serviceStatusName);
}
NSUpdateDynamicServices();
NSMutableDictionary *pbsAllServicesNew = nil;
if (otherServicesDefined) {
    pbsAllServicesNew = [NSMutableDictionary dictionaryWithDictionary:(NSDictionary*)pbsAllServices];
} else {
    pbsAllServicesNew = [NSMutableDictionary dictionaryWithCapacity:1];
}
NSDictionary *serviceStatus = [NSDictionary dictionaryWithObjectsAndKeys:
    (id)kCFBooleanTrue, @"enabled_context_menu",
    (id)kCFBooleanTrue, @"enabled_services_menu",
    keyEquivalent, @"key_equivalent", nil];
[pbsAllServicesNew setObject:serviceStatus forKey:(NSString*)serviceStatusName];
CFPreferencesSetAppValue(serviceStatusRoot,
                         (CFPropertyListRef)pbsAllServicesNew,
                         CFSTR("pbs"));
Boolean result = CFPreferencesAppSynchronize(CFSTR("pbs"));
if (result) {
    NSUpdateDynamicServices();
    NSLog(@"successfully installed our alt-command-r service");
} else {
    NSLog(@"couldn't install our alt-command-r service");
}
If the code succeeds, you can view this in ~/Library/Preferences/pbs.plist
You should see something like:
NSServicesStatus = {
    "com.whatever.MyApp - Launch My Service - myServiceMethod" = {
        enabled_context_menu = :true;
        enabled_services_menu = :true;
        key_equivalent = "@~r";
    };
}
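To check the same thing programmatically, here is a small sketch that just reads the dictionary back with the same CFPreferences API used above (nothing in it is specific to this service):
// Read the "pbs" domain back to confirm the entry was written
CFPropertyListRef all = CFPreferencesCopyAppValue(CFSTR("NSServicesStatus"), CFSTR("pbs"));
if (all != NULL) {
    NSDictionary *services = (NSDictionary *)all;
    NSLog(@"Registered service entries: %@", [services allKeys]);
    CFRelease(all);
}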

Quickblox QBLGeoDataGetRequest: currentPosition is always "200.000000;200.000000"

I'm trying to get all the check-ins near some point on the map:
QBLGeoDataGetRequest *getRequest = [QBLGeoDataGetRequest request];
getRequest.currentPosition = CLLocationCoordinate2DMake(10.0, 10.0);
getRequest.radius = 100;
[QBLocation geoDataWithRequest:getRequest delegate:self];
But strangely, this code produces the following request:
GET http://api.quickblox.com/geodata/find.xml
headers:{
"QB-SDK" = "iOS 1.5.2";
"Qb-Token" = xxxxxxxxxxxxxxxxxxxxxxx;
"QuickBlox-REST-API-Version" = "0.1.1";
}
parameters:{
"current_position" = "200.000000;200.000000";
radius = 100;
}
"current_position" is always "200.000000;200.000000", whatever I change getRequest.currentPosition to.
But why? And how to fix this?
P.S. There's no code that could cause any side effects; I just authorize and perform the QBLGeoDataGetRequest.
Just try the updated QuickBlox iOS SDK 1.7: http://quickblox.com/developers/IOS#Download_iOS_SDK
This feature works correctly in the latest version.

FFmpeg: out-of-sync audio/video in iOS application

The app saves the camera output into a .mov file, then converts it to FLV format and sends it as AVPackets to an RTMP server.
It switches between two files each time: one is being written with the camera output while the other one is sent.
My problem is that the audio/video gets out of sync after a while.
The first buffer sent is always 100% in sync, but after a while it gets messed up.
I believe it's a DTS/PTS problem...
if (isVideo)
{
    packet->stream_index = VIDEO_STREAM;
    packet->dts = packet->pts = videoPosition;
    videoPosition += packet->duration = FLV_TIMEBASE * packet->duration * videoCodec->ticks_per_frame * videoCodec->time_base.num / videoCodec->time_base.den;
}
else
{
    packet->stream_index = AUDIO_STREAM;
    packet->dts = packet->pts = audioPosition;
    audioPosition += packet->duration = FLV_TIMEBASE * packet->duration / audioRate;
    //NSLog(@"audio position = %lld", audioPosition);
}
packet->pos = -1;
packet->convergence_duration = AV_NOPTS_VALUE;
// This sometimes fails without being a critical error, so no exception is raised
if ((code = av_interleaved_write_frame(file, packet)))
{
    NSLog(@"Streamer::Couldn't write frame");
}
av_free_packet(packet);
You can study this sample: http://unick-soft.ru/art/files/ffmpegEncoder-vs2008.zip
But this sample is for Windows.
In this sample I set pts only on the video stream:
if (pVideoCodec->coded_frame->pts != AV_NOPTS_VALUE)
{
    pkt.pts = av_rescale_q(pVideoCodec->coded_frame->pts,
                           pVideoCodec->time_base, pVideoStream->time_base);
}
I was experiencing a similar issue when switching out the AVAssetWriters, and noticed that it went away if I only started using the new AVAssetWriter when I got a video sample:
https://medium.com/@brandon.kobel/ios-seamless-video-chunks-4383a5a3a874

How to Listen For an Application Launch Event in Mac OS X?

I wrote an AppleScript to mount a SparseBundle image and I want it to be executed exactly when Time Machine launches.
Right now, I periodically check whether Time Machine is running from AppleScript, using an on idle handler:
on idle
....
return <interval>
end idle
which isn't a robust way to do it. In my opinion, adding an event trigger for the application-launch event would be a better approach.
Could you please help?
An Objective-C or Python sample code (I'd prefer Python) is more than welcome.
What you are looking for is NSDistributedNotificationCenter or NSWorkspace. These Cocoa classes post notifications of application events; for NSWorkspace, things like application launches, mounting of drives, etc.
To do this in Python you need PyObjC, which is basically Python bindings for Apple's Cocoa classes. The documentation on their website is sparse, and there's a reason: it would basically duplicate the Apple docs, so they only document the differences between the PyObjC API and the Cocoa API. If you understand how the Objective-C API maps to Python, you are good to go. Check here: http://pyobjc.sourceforge.net/documentation/pyobjc-core/intro.html
I have included an example below which listens for distributed notifications using Python. The code adds an observer and listens for iTunes notifications. You could follow a similar structure, but instead add an observer for NSWorkspace notifications. To figure out what you should be listening to, there is an application called Notification Watcher that displays all notifications going through your system. You could also convert the Objective-C code to Python.
What the code below does:
Defines a new class which inherits from NSObject, as required by PyObjC
Defines a method which gets passed the actual notification and prints it out
Gets Foundation.NSDistributedNotificationCenter.defaultCenter
Creates an instance of GetSongs
Registers an observer, passing it the instance, the method that gets called when a notification is received, and which application and event to monitor, i.e. "com.apple.iTunes.playerInfo"
Runs the event loop.
One thing that will trip you up: accessing Objective-C attributes does not work the same as accessing Python attributes. In Python you write class_name.attr; for Objective-C from Python you have to call it like a function, e.g. from my example below: song.userInfo()
import Foundation
from AppKit import *
from PyObjCTools import AppHelper

class GetSongs(NSObject):
    def getMySongs_(self, song):
        print "song:", song
        song_details = {}
        ui = song.userInfo()
        print 'ui:', ui
        for x in ui:
            song_details[x] = ui.objectForKey_(x)
        print song_details

nc = Foundation.NSDistributedNotificationCenter.defaultCenter()
GetSongs = GetSongs.new()
nc.addObserver_selector_name_object_(GetSongs, 'getMySongs:', 'com.apple.iTunes.playerInfo', None)
NSLog("Listening for new tunes....")
AppHelper.runConsoleEventLoop()
Here's an example of the actual output... (YES BRITNEY ROCKS!, NOT! ;)
song NSConcreteNotification 0x104c0a3b0 {name = com.apple.iTunes.playerInfo; object = com.apple.iTunes.player; userInfo = {
Album = Circus;
"Album Rating" = 0;
"Album Rating Computed" = 1;
Artist = "Britney Spears";
"Artwork Count" = 1;
Genre = Pop;
"Library PersistentID" = 8361352612761174229;
Location = "file://localhost/Users/izze/Music/iTunes/iTunes%20Music/Britney%20Spears/Circus/02%20Circus.mp3";
Name = Circus;
PersistentID = 4028778662306031905;
"Play Count" = 0;
"Play Date" = "2010-06-26 08:20:57 +0200";
"Player State" = Playing;
"Playlist PersistentID" = 7784218291109903761;
"Rating Computed" = 1;
"Skip Count" = 1;
"Skip Date" = "2010-06-26 12:20:57 +0200";
"Store URL" = "itms://itunes.com/link?n=Circus&an=Britney%20Spears&pn=Circus";
"Total Time" = 192444;
"Track Count" = 16;
"Track Number" = 2;
}}
ui {
Album = Circus;
"Album Rating" = 0;
"Album Rating Computed" = 1;
Artist = "Britney Spears";
"Artwork Count" = 1;
Genre = Pop;
"Library PersistentID" = 8361352612761174229;
Location = "file://localhost/Users/izze/Music/iTunes/iTunes%20Music/Britney%20Spears/Circus/02%20Circus.mp3";
Name = Circus;
PersistentID = 4028778662306031905;
"Play Count" = 0;
"Play Date" = "2010-06-26 08:20:57 +0200";
"Player State" = Playing;
"Playlist PersistentID" = 7784218291109903761;
"Rating Computed" = 1;
"Skip Count" = 1;
"Skip Date" = "2010-06-26 12:20:57 +0200";
"Store URL" = "itms://itunes.com/link?n=Circus&an=Britney%20Spears&pn=Circus";
"Total Time" = 192444;
"Track Count" = 16;
"Track Number" = 2;
}
{u'Album Rating Computed': 1, u'Album': u'Circus', u'Rating Computed': True, u'Name': u'Circus', u'Artist': u'Britney Spears', u'Track Number': 2, u'Skip Date': 2010-06-26 12:20:57 +0200, u'Library PersistentID': 8361352612761174229L, u'Player State': u'Playing', u'Total Time': 192444L, u'Genre': u'Pop', u'Playlist PersistentID': 7784218291109903761L, u'Album Rating': 0, u'Location': u'file://localhost/Users/izze/Music/iTunes/iTunes%20Music/Britney%20Spears/Circus/02%20Circus.mp3', u'Skip Count': 1, u'Track Count': 16L, u'Artwork Count': 1, u'Play Date': 2010-06-26 08:20:57 +0200, u'PersistentID': 4028778662306031905L, u'Play Count': 0, u'Store URL': u'itms://itunes.com/link?n=Circus&an=Britney%20Spears&pn=Circus'}
This isn't too tough to do in Objective-C. You can access notifications for all applications through NSWorkspace and NSNotificationCenter. Create an object and register one of its methods for notifications of type NSWorkspaceDidLaunchApplicationNotification. Something like:
@interface NotificationObserver : NSObject { }
- (void) applicationDidLaunch:(NSNotification*)notification;
@end

@implementation NotificationObserver
- (void) applicationDidLaunch:(NSNotification*)notification
{
    // Check the notification to see if Time Machine is being launched.
}
@end

void watch(void)
{
    // NSWorkspace has its own notification center, separate from the default one
    NSNotificationCenter* notificationCenter = [[NSWorkspace sharedWorkspace] notificationCenter];
    NotificationObserver* observer = [[NotificationObserver alloc] init];
    [notificationCenter addObserver:observer
                           selector:@selector(applicationDidLaunch:)
                               name:NSWorkspaceDidLaunchApplicationNotification
                             object:nil];
}
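Inside the callback you can identify which app launched from the notification's userInfo; the launch notification carries the application name under the @"NSApplicationName" key (and, on 10.6+, an NSRunningApplication under NSWorkspaceApplicationKey). A sketch, with the script path as a placeholder:
- (void) applicationDidLaunch:(NSNotification*)notification
{
    NSString* appName = [[notification userInfo] objectForKey:@"NSApplicationName"];
    if ([appName isEqualToString:@"Time Machine"]) {
        // Placeholder path: run the AppleScript that mounts the sparse bundle
        NSAppleScript* script = [[NSAppleScript alloc] initWithContentsOfURL:
            [NSURL fileURLWithPath:@"/path/to/mount.scpt"] error:NULL];
        [script executeAndReturnError:NULL];
        [script release];
    }
}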
This isn't an answer to your question, but it may solve your problem.
Why not just have your AppleScript launch Time Machine after it mounts the disk image? Then, instead of launching Time Machine directly, always invoke Time Machine via your script. You can even paste the Time Machine icon onto your AppleScript file and name it "Time Machine" to make the illusion complete. :-)