Grand Central Dispatch

In this article we're going to learn about Grand Central Dispatch (GCD). We will take a look at what it is and the idea behind it, then see how to use it in code, and finally look at some examples from real-world development.

I. Introduction

Concurrent programming is easy in iOS and OS X. Instead of having us manually start threads and handle all the synchronization, Apple adopted the idea of worker threads: threads are managed by the system, so we don't worry about them at all. Instead, we submit tasks to different queues for execution, and the system takes care of all the heavy lifting behind the scenes. If you're familiar with Java's SwingWorker, this will be easy to understand. Of course, it's not called SwingWorker in iOS or OS X; it's called Grand Central Dispatch (GCD).

II. Queues

As we said before, all tasks are submitted to queues, and the operating system handles the scheduling and execution. Since we're not going to create and start threads manually, we need different types of queues to control how tasks are executed. In this section we'll look only at these queues; in the next section we'll talk about how to dispatch tasks onto them.

There are three types of queues in GCD:

  • Main: tasks execute serially on your application’s main thread.
  • Concurrent: tasks are dequeued in FIFO order, but run concurrently and can finish in any order.
  • Serial: tasks execute one at a time in FIFO order.

Notice that no matter which type of queue we use, tasks submitted to a given queue are always dequeued in FIFO order. What differs between queue types is when the tasks run and finish: on a concurrent queue, later tasks can start before earlier ones complete.

Main Queue

We can get the main queue by calling dispatch_get_main_queue. There is exactly one main queue per application, created by the system; it is the one queue we cannot create ourselves.

Concurrent Queue

Four global concurrent queues, one per priority level, are automatically created by the system; to get one, simply call dispatch_get_global_queue. You can also create your own concurrent queues by calling dispatch_queue_create; we will see an example soon. Because these queues are concurrent, there is no guarantee which task will finish first, so race conditions and synchronization problems can occur.
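For illustration, here is a sketch of fetching each of the four global queues by priority level, using the pre-Swift 3 C-style API shown throughout this article:

```swift
import Foundation

// The four global concurrent queues, one per priority level.
// The second argument is reserved for future use and should always be 0.
let highQueue       = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)
let defaultQueue    = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)
let lowQueue        = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0)
let backgroundQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0)
```

Tasks on a higher-priority queue are scheduled ahead of those on lower-priority ones, but within any single global queue the FIFO dequeue order still holds.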

Serial Queue

Apart from the main queue, there is no system-created serial queue, so we have to create serial queues ourselves. As with concurrent queues, we call dispatch_queue_create, this time passing an attribute that makes the queue serial.
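As a sketch (the labels are placeholders; they only show up in debugging tools), creating a serial versus a concurrent queue differs only in the attribute argument:

```swift
import Foundation

// A serial queue: pass DISPATCH_QUEUE_SERIAL (or nil) as the attribute.
let mySerialQueue = dispatch_queue_create("com.example.serial", DISPATCH_QUEUE_SERIAL)

// A concurrent queue: same call, different attribute.
let myConcurrentQueue = dispatch_queue_create("com.example.concurrent", DISPATCH_QUEUE_CONCURRENT)
```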

III. The API

We've talked about all three types of queues; now let's take a look at how to get or create them and how to submit work to them. Below are the functions involved.

/* Get the global main queue: */
dispatch_get_main_queue();

/* Get the global concurrent queues that are already created for you by the system. */
dispatch_queue_t dispatch_get_global_queue( long identifier, unsigned long flags);

/* Create a queue with the given label for debugging purposes; the label can be NULL.
"attr" determines the type of queue: DISPATCH_QUEUE_SERIAL (or NULL) creates a serial queue,
DISPATCH_QUEUE_CONCURRENT creates a concurrent one. */
dispatch_queue_t dispatch_queue_create( const char *label, dispatch_queue_attr_t attr);

/* Dispatch a block onto a queue asynchronously. */
void dispatch_async( dispatch_queue_t queue, dispatch_block_t block);

/* Dispatch a block onto a queue synchronously. */
void dispatch_sync( dispatch_queue_t queue, dispatch_block_t block);

As you can see, not only can we submit tasks to serial or concurrent queues, we can also choose whether to do so synchronously or asynchronously. The difference is that if you submit a task asynchronously, the calling code does not wait for the task to complete. In the code below, the line print(arr) is executed right after the dispatch_async call returns.

//-------- Using Swift to make it shorter and cleaner --------

// Create an array of integers with 10 elements.
var arr = [5, 1, 6, 3, 8, 9, 3, 2, 8, 0]
// Dispatch a task onto a global concurrent queue asynchronously.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	print("Please blink.")
	bogoSort(&arr)  // assuming bogoSort sorts its argument in place (inout)
}
print(arr)
print("The array is printed while (or even before) you blinked, " +
	"and the array is NOT likely to be sorted.")

dispatch_sync, on the other hand, behaves in the opposite way.

var arr = [5, 1, 6, 3, 8, 9, 3, 2, 8, 0]
// Dispatch a task onto a global concurrent queue synchronously.
dispatch_sync(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	print("Start the very first episode of Game of Thrones, " +
		"come back when everyone is dead.")
	bogoSort(&arr)
}
print(arr)
print("The array is sorted, " +
	"and everyone in Game of Thrones is dead.")

Now you have an idea of what dispatch_async and dispatch_sync are. You may wonder why we would ever need dispatch_sync, since it blocks the calling thread. We will come back to that later, but yes, most of the time dispatch_async is all we need. The important thing to understand is this: regardless of the type of queue we submit our tasks to, dispatch_async does not block the calling thread and returns immediately, whereas dispatch_sync blocks the calling thread and returns only after the task has finished.

Queues manage the ordering between tasks; the dispatch functions manage the relationship between a task block and the code that follows it outside the block.

You can also nest dispatch functions. Why would we want to do that? Because if we want part of a task block to execute on another queue, we dispatch it again from inside the block. In fact, this happens more often than you might expect. Remember that all UI-related APIs have to run on the main queue, so whenever there is UI-related code inside a task running on a background queue, we need to dispatch that code back to the main queue ourselves.
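A common pattern is to do heavy work on a background queue, then hop back to the main queue for the UI update. A minimal sketch, where loadImageData and imageView are hypothetical names standing in for your own code:

```swift
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	// Heavy, non-UI work runs off the main thread.
	let data = loadImageData()  // hypothetical helper

	// Nested dispatch: the UI update goes back to the main queue.
	dispatch_async(dispatch_get_main_queue()) {
		imageView.image = UIImage(data: data)  // hypothetical image view
	}
}
```

Because both dispatches are asynchronous, neither the calling thread nor the background queue ever blocks waiting for the UI work.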

It's perfectly fine and normal to nest dispatch functions, but we need to be careful about which queue and which dispatch function we combine:

// This will result in a dead lock:
dispatch_sync(mySerialQueue) { // Or "dispatch_async", doesn't matter in this case.
	// Some tasks
	dispatch_sync(mySerialQueue) {
		// Some other tasks
	}
}

In the example above, we have a dispatch_sync nested inside another dispatch call, and both target the same serial queue; that combination is what causes the deadlock. Remember that tasks on a serial queue execute one at a time in FIFO order. The inner block is submitted after the outer block, so it cannot start until the outer block finishes; but the outer block is stuck inside dispatch_sync, waiting for the inner block to finish. Each is waiting on the other, which is a deadlock.
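For contrast, a minimal sketch of the deadlock-free variant: if the inner block is submitted with dispatch_async, it is merely enqueued, and the outer block can run to completion first.

```swift
// No deadlock: the outer block does not wait for the inner one.
dispatch_sync(mySerialQueue) {
	// Some tasks
	dispatch_async(mySerialQueue) {
		// Some other tasks; these run after the outer block finishes.
	}
}
```

Dispatching the inner block synchronously to a *different* queue would also be safe, as long as no chain of sync dispatches leads back to the queue you are already blocking.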

Objective-C vs. Swift

Whether in Objective-C or Swift, the syntax for submitting tasks is very similar. The difference is under the hood.

When we talk about concurrency, we have to talk about synchronizing access to shared state. In Objective-C, if you want atomic access to a property, simply declare it as atomic. (Note that this only makes the individual getter and setter atomic; it does not synchronize compound operations.)

@property (strong, atomic) UIButton *button;

In Swift, we can't declare a property atomic or nonatomic; it takes a different approach. In short, the compiler handles some synchronization for us, but its behavior may not always be what we want. So things get easier in general, but sometimes more complicated.

/** API of function "dispatch_once":
	Executes a block object once and only once for the lifetime of an application.
	- parameters:
		- predicate: A pointer to a dispatch_once_t structure that is used to test whether the block has completed or not.
		- block: The block object to execute once.
*/
void dispatch_once( dispatch_once_t *predicate, dispatch_block_t block);

class Person {
	// Will be wrapped in a "dispatch_once"-style guard by the Swift compiler at runtime.
	static let descriptionForClass = "A class that represents a person."
}

As we can see, the compiler automatically wraps the initialization of static properties like this in dispatch_once-style logic, so it is executed only once.
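We can also call dispatch_once directly, using the pre-Swift 3 API shown above. A sketch with a file-scope token (the counter is only there to make the once-only behavior observable):

```swift
import Foundation

// The token records whether the block has already run; it must outlive the calls.
var onceToken: dispatch_once_t = 0
var initCount = 0

func setup() {
	dispatch_once(&onceToken) {
		// Runs exactly once, no matter how many times setup() is called,
		// even from multiple threads.
		initCount += 1
	}
}
```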

NSLock

What if the compiler's behavior is not what we want? We can use NSLock to create our own lock, with full control over exactly what gets synchronized.

let myLock = NSLock()
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	myLock.lock()
	// Some operation should be synchronized
	myLock.unlock()
}

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	myLock.lock()
	// Another operation should be synchronized
	myLock.unlock()
}

Dispatch Group

Often we want to wait for multiple dispatched tasks to finish, then do something based on the results. This is typically referred to as the "future" pattern. In Grand Central Dispatch, implementing a "future" is very easy using dispatch groups.

// Create a group for listening from the server
let serverGroup = dispatch_group_create()

// Associates client1 code to the group.
dispatch_group_async(serverGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	client1.getResponseFromServer()
}

// Do same thing for client2
dispatch_group_async(serverGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
	client2.getResponseFromServer()
}

// Wait for every block in the group to finish. The second parameter is a timeout;
// DISPATCH_TIME_FOREVER means wait indefinitely until they all complete.
dispatch_group_wait(serverGroup, DISPATCH_TIME_FOREVER)
print("Both clients got response from server.")

If we don't want to block the current thread waiting for the group, we can instead ask GCD to submit a block to a queue of our choice once the group completes, using dispatch_group_notify.

/** API for dispatch_group_notify:
	Schedules a block object to be submitted to a queue when a group of previously submitted block objects have completed.
	- parameters:
		- group: The dispatch group to observe. The group is retained by the system until the block has run to completion. This parameter cannot be NULL.
		- queue: The queue to which the supplied block is submitted when the group completes. The queue is retained by the system until the block has run to completion. This parameter cannot be NULL.
		- block: The block to submit when the group completes. This function performs a Block_copy and Block_release on behalf of the caller. This parameter cannot be NULL.
 */
void dispatch_group_notify( dispatch_group_t group, dispatch_queue_t queue, dispatch_block_t block);

// Modify our code to wait and print on another queue.
dispatch_group_notify(serverGroup, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0)) {
	print("Both clients got response from server.")
}
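Note that dispatch_group_async only covers work that runs entirely inside the block. For APIs that return immediately and invoke a completion handler later, we can pair dispatch_group_enter and dispatch_group_leave manually. A sketch continuing the serverGroup example, where the callback-based getResponseFromServer is a hypothetical stand-in for such an API:

```swift
// Hypothetical async API that calls its completion handler on some queue later.
func getResponseFromServer(completion: String -> Void) {
	dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
		completion("OK")
	}
}

dispatch_group_enter(serverGroup)        // enter before starting the async call
getResponseFromServer { response in
	// Handle the response...
	dispatch_group_leave(serverGroup)    // leave balances the matching enter
}
```

The group's notify or wait fires only after every enter has been balanced by a leave, so an unbalanced pair will hang the waiters.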

IV. Conclusion

So that's how we can use Grand Central Dispatch to write concurrent code for iOS and OS X. There is a lot more to GCD that we haven't touched; this is just the tip of the iceberg. I hope this introduction gets you started, so GCD no longer feels intimidating. Overall, it's an easy-to-use and flexible library.