Convert pthread to Objective-C

I'm trying to convert the following to Objective-C code.
This is the thread I currently have in C, and it works fine:
//calling EnrollThread method on a thread in C
pthread_t thread_id;
pthread_create( &thread_id, NULL, EnrollThread, pParams );
//What the EnrollThread method structure looks like in C
void* EnrollThread( void *arg )
This is what my method signature looks like now that I've converted it to Objective-C:
-(void)enrollThreadWithParams:(LPBIOPERPARAMS)params;
Now I'm not sure how to call this Objective-C method with the pthread_create call.
I've tried something like this:
pthread_create( &thread_id, NULL, [refToSelf enrollThreadWithParams:pParams], pParams );
But I believe I have it wrong. Can anyone explain why this doesn't work and what I need to do to fix it so I can create my thread in the background? My UI is locked until the method finishes what it's doing.
I was also thinking of using dispatch_sync, but I haven't tried that.

In Objective-C you don't really use pthread_create. You still can, but the thread entry point needs to be a plain C function, so I'm not sure this would be the best approach.
There are many options, as you can read in the Threading and Concurrency documents.
performSelectorInBackground:withObject: method of NSObject (and subclasses)
dispatch_async (not dispatch_sync as you mentioned)
NSOperation and NSOperationQueue
NSThread class
I would suggest giving the first one a shot, since it is the easiest and most straightforward. The second is also very easy because you don't have to create any extra objects; you just write the code to be executed in parallel inline.
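For instance, with the dispatch_async option, a minimal sketch for the code in the question might look like this (assuming refToSelf and pParams are the variables shown above, and that pParams remains valid until the block runs):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[refToSelf enrollThreadWithParams:pParams]; // runs off the main thread, so the UI stays responsive
dispatch_async(dispatch_get_main_queue(), ^{
// update the UI here once enrollment has finished
});
});
Note that performSelectorInBackground:withObject: expects an object argument, so a raw LPBIOPERPARAMS pointer would need to be wrapped (for example in an NSValue) before using that route.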

The go-to reference for concurrent programming is the Concurrency Programming Guide, which walks you through dispatch queues (known as Grand Central Dispatch, GCD) and operation queues. Both are incredibly easy to use and offer their own respective advantages.
In their simplest forms, both of these are pretty easy to use. As others have pointed out, the process for creating a dispatch queue and then dispatching something to that queue is:
dispatch_queue_t queue = dispatch_queue_create("com.domain.app", DISPATCH_QUEUE_CONCURRENT);
dispatch_async(queue, ^{
// something to do in the background
});
The operation queue equivalent is:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
[queue addOperationWithBlock:^{
// something to do in the background
}];
Personally, I prefer operation queues where:
I need controlled/limited concurrency (i.e. I'm going to dispatch a bunch of things to that queue and I want them to run concurrently with respect to not only the main queue, but also with respect to each other, but I don't want more than a few of those running simultaneously). A good example would be doing concurrent network requests, where you want them running concurrently (because you get a huge performance benefit) but you generally don't want more than four of them running at any given time. With an operation queue, one can specify maxConcurrentOperationCount, whereas this is tougher to do with GCD (see the sketch after this list).
I need a fine level of control over dependencies. For example, I'm going to start operations A, B, C, D, and E, but B is dependent on A (i.e. B shouldn't start before A finishes), D is dependent upon C, and E is dependent upon both B and D finishing.
I need concurrency management for tasks that, themselves, run asynchronously. "Concurrent operations" (NSOperation subclasses that manage their own isFinished state) offer a fine degree of control over when the operation is declared finished. A common example is the network operation which, if you use the delegate-based implementation, runs asynchronously, but where you still want to use operations to control the flow of one to the next. The very nice networking library, AFNetworking, for example, uses operations extensively for this reason.
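As a rough sketch of those first two points (the operation variables here are just placeholders), the dependency graph and the concurrency limit could be set up like this:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 4; // e.g. at most four network requests at a time
NSOperation *a = [NSBlockOperation blockOperationWithBlock:^{ /* work for A */ }];
NSOperation *b = [NSBlockOperation blockOperationWithBlock:^{ /* work for B */ }];
NSOperation *c = [NSBlockOperation blockOperationWithBlock:^{ /* work for C */ }];
NSOperation *d = [NSBlockOperation blockOperationWithBlock:^{ /* work for D */ }];
NSOperation *e = [NSBlockOperation blockOperationWithBlock:^{ /* work for E */ }];
[b addDependency:a]; // B won't start until A finishes
[d addDependency:c]; // D won't start until C finishes
[e addDependency:b]; // E waits for both B and D
[e addDependency:d];
[queue addOperations:@[a, b, c, d, e] waitUntilFinished:NO];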
On the other hand, GCD is great for simple one-off asynchronous tasks (because you can avail yourself of built-in "global queues", freeing yourself from making your own queue), serial queues for synchronizing access to some shared resource, dispatch sources like timers, signaling between threads with semaphores, etc. GCD is generally where people get started with concurrent programming in Cocoa and Cocoa Touch.
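As a small illustration of a dispatch source (just a sketch; the queue name is made up, and in a real app you would keep a strong reference to the timer, e.g. in an ivar, so it isn't deallocated):
dispatch_queue_t timerQueue = dispatch_queue_create("com.example.timer", DISPATCH_QUEUE_SERIAL);
dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, timerQueue);
dispatch_source_set_timer(timer, dispatch_time(DISPATCH_TIME_NOW, 0), 1ull * NSEC_PER_SEC, (uint64_t)(0.1 * NSEC_PER_SEC)); // fire every second, with 0.1s leeway
dispatch_source_set_event_handler(timer, ^{
// periodic background work goes here
});
dispatch_resume(timer);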
Bottom line, I personally use operation queues for application-level asynchronous operations (network queues, image processing queues, etc.), where the degree of concurrency becomes an important issue. I tend to use GCD for lower-level stuff or quick and simple stuff. GCD (with dispatch_async) is a great place to start as you dip your toe into the ocean of concurrent programming, so go for it.
There are two things I'd encourage you to be aware of, regardless of which of these two technologies you use:
First, remember that (in iOS at least) you always want to do user interface tasks on the main queue. So the common patterns are:
dispatch_async(queue, ^{
// do something slow here
// when done, update the UI and model objects on the main queue
dispatch_async(dispatch_get_main_queue(), ^{
// UI and model updates can go here
});
});
or
[queue addOperationWithBlock:^{
// do something slow here
// when done, update the UI and model objects on the main queue
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
// do UI and model updates here
}];
}];
The other important issue to consider is synchronization and "thread-safety". (See the Synchronization section of the Threading Programming Guide.) You want to make sure that you don't, for example, have the main thread populating some table view while some background queue is simultaneously changing the data used by that table view. You want to make sure that while any given thread is using some model object or other shared resource, another thread isn't mutating it, leaving it in some inconsistent state.
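One common pattern (just a sketch, with made-up names) is to funnel all access to a shared mutable collection through a private serial queue:
// a private serial "isolation" queue guarding a mutable array
dispatch_queue_t isolationQueue = dispatch_queue_create("com.example.isolation", DISPATCH_QUEUE_SERIAL);
NSMutableArray *items = [NSMutableArray array];
// writes go through the queue asynchronously...
dispatch_async(isolationQueue, ^{
[items addObject:newItem]; // newItem is whatever object you're adding
});
// ...and reads go through it synchronously, so the caller gets the value back
__block NSArray *snapshot;
dispatch_sync(isolationQueue, ^{
snapshot = [items copy];
});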
There's too much to cover in the world of concurrent programming. The WWDC videos (including 2011 and 2012) offer some great background on GCD and asynchronous programming patterns, so make sure you avail yourself of that great resource.

If you already have working code, there is no reason to abandon pthreads. You should be able to use it just fine.
If, however, you want an alternative, but you want to keep your existing pthread entry point, you can do this easily enough...
dispatch_queue_t queue = dispatch_queue_create("EnrollThread", DISPATCH_QUEUE_SERIAL);
dispatch_async(queue, ^{
EnrollThread(pParams);
});

Related

What is NSOperation? How do I use it?

I am implementing a contacts module: basically adding, deleting, searching, and listing contacts.
Here I used a file to persist the data, storing all the contacts in the file (in JSON format) and deserializing them back into objects.
Now my goal is to perform the serialization and deserialization in a background thread using NSOperation. How does a class extend NSOperation, and what should I do in that class?
I am new to Mac OS and I can't understand what exactly NSOperation means, how to use it in my module, or how to make operations run concurrently. I have seen a lot of tutorials, but it is still very confusing to me. I really need help. Thanks in advance.
There are several answers to your question.
What is NSOperation?
First, the Apple reference says:
The NSOperation class is an abstract class you use to encapsulate the
code and data associated with a single task. Because it is abstract,
you do not use this class directly but instead subclass or use one of
the system-defined subclasses (NSInvocationOperation or
NSBlockOperation) to perform the actual task. Despite being abstract,
the base implementation of NSOperation does include significant logic
to coordinate the safe execution of your task. The presence of this
built-in logic allows you to focus on the actual implementation of
your task, rather than on the glue code needed to ensure it works
correctly with other system objects.
Then, the simple meaning of NSOperation:
NSOperation represents a single unit of work. It’s an abstract class
that offers a useful, thread-safe structure for modeling state,
priority, dependencies, and management.
Do you need to run it concurrently?
What is Concurrency?
Doing multiple things at the same time.
Taking advantage of number of cores available in multicore CPUs.
Running multiple programs in parallel.
Why NSOperationQueue?
For situations where it doesn’t make sense to build out a
custom NSOperation subclass, Foundation provides the concrete
implementations NSBlockOperation and NSInvocationOperation.
Examples of tasks that lend themselves well to NSOperation include
network requests, image resizing, text processing, or any other
repeatable, structured, long-running task that produces associated
state or data.
But simply wrapping computation into an object doesn’t do much without
a little oversight. That’s where NSOperationQueue comes in.
What is NSOperationQueue?
NSOperationQueue regulates the concurrent execution of operations. It
acts as a priority queue, such that operations are executed in a
roughly First-In-First-Out manner, with higher-priority
(NSOperation.queuePriority) ones getting to jump ahead of
lower-priority ones. NSOperationQueue can also limit the maximum
number of concurrent operations to be executed at any given moment,
using the maxConcurrentOperationCount property.
NSOperationQueue itself is backed by a Grand Central Dispatch queue,
though that’s a private implementation detail.
To kick off an NSOperation, either call start, or add it to an
NSOperationQueue, to have it start once it reaches the front of the
queue. Since so much of the benefit of NSOperation is derived from
NSOperationQueue, it’s almost always preferable to add an operation to
a queue rather than invoke start directly.
Also
Operation queues usually provide the threads used to run their
operations. In OS X v10.6 and later, operation queues use the
libdispatch library (also known as Grand Central Dispatch) to initiate
the execution of their operations. As a result, operations are always
executed on a separate thread, regardless of whether they are
designated as concurrent or non-concurrent operations.
So your code should be
NSOperationQueue *backgroundQueue = [[NSOperationQueue alloc] init];
[backgroundQueue addOperationWithBlock:^{
//Your Background Work Here
.....
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
//Your Main Thread(UI) Work Here
....
}];
}];
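Since the question also asks how a class subclasses NSOperation, here is a minimal non-concurrent subclass sketch (the class name, property, and file-writing step are hypothetical, just to match the contacts example):
@interface ContactsSerializeOperation : NSOperation
@property (nonatomic, copy) NSArray *contacts;
@end
@implementation ContactsSerializeOperation
- (void)main
{
if (self.isCancelled) return; // always honor cancellation
NSError *error = nil;
NSData *json = [NSJSONSerialization dataWithJSONObject:self.contacts options:0 error:&error];
// write `json` to your contacts file here, then check/report `error`
}
@end
You would then create an instance, set its contacts property, and add it to backgroundQueue just like the block operation above.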

Asynchronous Cocoa - Preventing "simple" (obvious) deadlocks in NSOperation?

When subclassing NSOperation to get a little chunk of work done, I've found it's pretty easy to deadlock. Below is a toy example that makes it pretty easy to understand why it never completes.
I can only seem to think through solutions that prevent the deadlock from the caller's perspective, never the callee's. For example, the caller could continue to run the run loop, not wait for finish, etc. If the main thread needs to be messaged synchronously during the operation, I'm wondering if there is a canonical solution that an operation subclasser can implement to prevent this type of deadlocking. I'm only just starting to dip my toe into async programming...
@interface ToyOperation : NSOperation
@end
@implementation ToyOperation
- (void)main
{
// Lots of work
NSString *string = @"Important Message";
[self performSelector:@selector(sendMainThreadSensitiveMessage:) onThread:[NSThread mainThread] withObject:string waitUntilDone:YES];
// Lots more work
}
- (void)sendMainThreadSensitiveMessage:(NSString *)string
{
// Update the UI or something that requires the main thread...
}
@end
- (int)main
{
ToyOperation *op = [[ToyOperation alloc] init];
NSOperationQueue *opQ = [[NSOperationQueue alloc] init];
[opQ addOperations: @[ op ] waitUntilFinished:YES]; // Deadlock
return 0;
}
If the main thread needs to be message synchronously during the
operation, I'm wondering if there is a canonical solution that an
operation subclasser can implement to prevent this type of
deadlocking.
There is. Never make a synchronous call to the main queue. And a follow-on: Never make a synchronous call from the main queue. And, really, it can be summed up as Never make a synchronous call from any queue to any other queue.
By doing that, you guarantee that the main queue is not blocked. Sure, there may be an exceptional case that tempts you to violate this rule and, even, cases where it really, truly, is unavoidable. But that very much should be the exception because even a single dispatch_sync() (or NSOpQueue waitUntilDone) has the potential to deadlock.
Of course, data updates from queue to queue can be tricky. There are several options; a concurrency safe data layer (very hard), only passing immutable objects or copies of the data (typically to the main queue for display purposes -- fairly easy, but potentially expensive), or you can go down the UUID based faulting like model that Core Data uses. Regardless of how you solve this, the problem isn't anything new compared to any other concurrency model.
The one exception is when replacing locks with queues (For example, instead of using @synchronized() internally to a class, use a serial GCD queue and use dispatch_sync() to that queue anywhere that a synchronized operation must take place. Faster and straightforward.). But this isn't so much an exception as solving a completely different problem.
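Applied to the toy example above, that rule might just mean making the hop to the main thread asynchronous (a sketch, assuming the "lots more work" doesn't depend on the UI update having completed):
- (void)main
{
// Lots of work
NSString *string = @"Important Message";
dispatch_async(dispatch_get_main_queue(), ^{
[self sendMainThreadSensitiveMessage:string]; // no longer blocks the operation
});
// Lots more work
}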

GCD dispatch_async for app-lifetime task?

I've used GCD before with dispatch_async for background-threading units of work like parsing data from a network request (in, say, JSON or XML), and it's WONDERFUL. But what if I have a background task that's going to run for the length of the application? Is dispatch_async still a good fit for this case, or is there a better way to implement it?
Sure, but create your own dispatch queue for it. (If you use a global queue, you may tie up that queue--or one of its threads--forever.)
dispatch_queue_t dispatchQueue = dispatch_queue_create("com.mycompany.myapp.ForeverOperation", DISPATCH_QUEUE_SERIAL);
dispatch_async(dispatchQueue, ^{
// worker routine
});
More traditionally, you would create an explicit thread for this, and if it will run forever, that might make more "sense". But the results are basically the same.
NSThread * myWorkerThread = [[NSThread alloc] initWithTarget:...
[myWorkerThread start];
If you need to communicate with other threads/queues, as always, standard synchronization techniques may be required.
This really has nothing to do with dispatch_async, which is merely one approach to doing something in a background thread. If you want to do something in a background thread, do it in a background thread! Just be aware that doing this constantly can be a drag on your app, since you've only got so much processing time and only so many processors; you may end up having to study that in Instruments. You might want to break up your task into little bits and do it in short chunks every so often. Both GCD and NSOperation can help with that.
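One rough way to break the work into chunks with GCD (just a sketch; the function and queue names are made up) is to do one slice at a time and schedule the next slice, instead of looping forever:
#import <Foundation/Foundation.h>
static void DoNextChunk(dispatch_queue_t queue)
{
dispatch_async(queue, ^{
// do one small slice of the long-running task here...
// ...then come back for the next slice after a short pause
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.5 * NSEC_PER_SEC)), queue, ^{
DoNextChunk(queue);
});
});
}
// kick it off once:
// DoNextChunk(dispatch_queue_create("com.example.chunkedWork", DISPATCH_QUEUE_SERIAL));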

Do non-concurrent NSOperations execute in a serial fashion?

From: Apple docs on managing concurrency:
NSOperation
Write a custom subclass and override one method: main. The main method gets called to perform the operation when the NSOperationQueue schedules it to run. NSOperation classes written in this way are known as non-concurrent operations, because the developer is not responsible for spawning threads—multi-threading is all handled by the super class. (Don’t be confused by the terminology: just because an operation is non-concurrent, does not mean it cannot be executed concurrently, it simply means that you don't have to handle the concurrency yourself.)
I think overriding main is the easiest way to use NSOperation, but the Apple site says it's non-concurrent. Does that mean that NSOperations in an NSOperationQueue (when only overriding main) would execute serially?
I don't want to execute my operations serially; I want to run my operations in parallel with as little effort as possible.
No, they will not necessarily be run serially. Just as the paragraph you quoted noted, it "does not mean it cannot be executed concurrently".
So what does determine if they can run in parallel? The NSOperationQueue and any dependencies you set up between the ops. A little later in that same document, the section entitled "Running Operations" explains:
you can specify (limit) the number of threads the queue will spawn, and
"NSOperationQueue will automatically determine how many threads it should spawn."
And in the Concurrency Programming Guide it elaborates:
In most cases, operations are executed shortly after being added to a queue, but the operation queue may delay execution of queued operations for any of several reasons. Specifically, execution may be delayed if queued operations are dependent on other operations that have not yet completed. Execution may also be delayed if the operation queue itself is suspended or is already executing its maximum number of concurrent operations.
What you do give up by using non-concurrent operations is that the operation itself isn't intended to set up new threads, etc.
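To make that concrete, a rough sketch (the loop body is just a placeholder): several operations that only override main (or, as here, block operations) added to the same queue can run in parallel, and you can cap how many run at once:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 4; // optional; omit to let the queue decide
for (NSUInteger i = 0; i < 10; i++) {
[queue addOperationWithBlock:^{
// each of these blocks may run on a different thread, concurrently with the others
NSLog(@"operation %lu on %@", (unsigned long)i, [NSThread currentThread]);
}];
}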
I am not sure about NSOperation, but I am using dispatch_queue_create to run things in parallel.
For example, if I want to do two pieces of work at the same time, I use the following code. Please see if it will help you:
dispatch_queue_t queue = dispatch_queue_create("FirstOperation", 0ul);
dispatch_async(queue, ^{
//Do your functionality here
dispatch_sync(dispatch_get_main_queue(), ^{
//Do your UI updates here
});
});
dispatch_release(queue);
dispatch_queue_t queue2 = dispatch_queue_create("SecondOperation", 0ul);
dispatch_async(queue2, ^{
//Do your functionality here
dispatch_sync(dispatch_get_main_queue(), ^{
//Do your UI updates here
});
});
dispatch_release(queue2);

use NSOperationQueue as a LIFO stack?

I need to do a series of URL calls (fetching WMS tiles). I want to use a LIFO stack so the newest URL call is the most important; I want to display the tile on the screen now, not a tile that was on the screen 5 seconds ago after a pan.
I can create my own stack from an NSMutableArray, but I'm wondering if an NSOperationQueue can be used as a LIFO stack?
You can set the priority of operations in an operation queue using -[NSOperation setQueuePriority:]. You'll have to rejigger the priorities of existing operations each time you add an operation, but you can achieve something like what you're looking for. You'd essentially demote all of the old ones and give the newest one highest priority.
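Roughly, that re-prioritisation might look like this (a sketch; queue and newOperation are placeholders):
// demote everything still waiting, then add the new operation at high priority
for (NSOperation *op in queue.operations) {
if (!op.isExecuting) {
op.queuePriority = NSOperationQueuePriorityLow;
}
}
newOperation.queuePriority = NSOperationQueuePriorityHigh;
[queue addOperation:newOperation];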
Sadly I think NSOperationQueues are, as the name suggests, only usable as queues — not as stacks. To avoid having to do a whole bunch of manual marshalling of tasks, probably the easiest thing is to treat your queues as though they were immutable and mutate by copying. E.g.
- (NSOperationQueue *)addOperation:(NSOperation *)operation toHeadOfQueue:(NSOperationQueue *)queue
{
// suspending a queue prevents it from issuing new operations; it doesn't
// pause any already ongoing operations. So we do this to prevent a race
// condition as we copy operations from the queue
queue.suspended = YES;
// create a new queue
NSOperationQueue *mutatedQueue = [[NSOperationQueue alloc] init];
// add the new operation at the head
[mutatedQueue addOperation:operation];
// copy in all the preexisting operations that haven't yet started
for(NSOperation *existingOperation in [queue operations])
{
if(!existingOperation.isExecuting)
[mutatedQueue addOperation:existingOperation];
}
// the caller should now ensure the original queue is disposed of...
return mutatedQueue;
}
/* ... elsewhere ... */
NSOperationQueue *newQueue = [self addOperation:newOperation toHeadOfQueue:operationQueue];
[operationQueue release];
operationQueue = newQueue;
It seems at present that releasing a queue that is still working (as will happen to the old operation queue) doesn't cause it to cancel all operations, but that's not documented behaviour so probably not trustworthy. If you want to be really safe, key-value observe the operationCount property on the old queue and release it when it goes to zero.
I'm not sure if you're still looking for a solution, but the same problem has been bugging me for a while, so I went ahead and implemented an operation stack here: https://github.com/cbrauchli/CBOperationStack. I've used it with a few hundred download operations and it has held up well.
Sadly, you cannot do that without running into some tricky issues, because:
Important You should always configure dependencies before running your operations or adding them to an operation queue. Dependencies added afterward may not prevent a given operation object from running. (From: Concurrency Programming Guide: Configuring Interoperation Dependencies)
Take a look at this related question: AFURLConnectionOperation 'start' method gets called before it is ready and never gets called again afterwards
Found a neat implementation of stack/LIFO features on top of NSOperationQueue. It can be used as a category that extends NSOperationQueue or an NSOperationQueue LIFO subclass.
https://github.com/nicklockwood/NSOperationStack
The easiest way would be to separate your operations from the data you will be processing, so you can add operations to the NSOperationQueue as usual and then take the data from a stack or any other data structure you need.
var tasks: [MyTask]
...
func startOperation() {
myQueue.addOperation {
guard let task = tasks.last else {
return
}
tasks.removeLast()
task.perform()
}
}
Now, obviously, you might need to ensure that the tasks collection can be used concurrently, but that's a much more common problem, with lots of pre-made solutions, than hacking your way around NSOperationQueue execution order.