2010-02-24
Apple Inc.
© 2010 Apple Inc. All rights reserved.
INTRODUCTION
Introduction
Introduced in Mac OS X v10.4, the QTKit framework is a powerful, feature-rich Objective-C API for manipulating
and rendering time-based media such as movies and audio files.
This tutorial explains how to build three different applications for playing, editing, and recording audio and
video media. You build these applications using the QTKit programming interfaces and Apple’s developer
tools, Xcode 3.2 and Interface Builder 3.2. This document describes QTKit for Mac OS X v10.6.
If you are a developer who wants to learn how to integrate playback, editing and recording of media into
your application, you should read the material in this document to get started. You don’t necessarily need
to be a seasoned Cocoa programmer to take advantage of the capabilities provided in the QTKit framework,
although you’ll find prior experience working with Objective-C, Xcode, and Interface Builder helpful to build
and compile the different code examples described in this tutorial.
This tutorial follows a progressive, learn-as-you-go structure. To be most successful, work through this tutorial
in the order presented.
● “Creating a Simple QTKit Media Player Application” (page 9) describes how to build and compile a
simple media player application, using Cocoa bindings and with a minimal number of lines of Objective-C
code.
● “Extending the Media Player Application” (page 19) explains how you can extend and enhance the
functionality of the media player application to support editing of video/audio files.
● “Customizing the Media Player Application” (page 25) discusses how you can extend the functionality
of the media player application by adding custom controls and selecting attributes to handle the precise
display and manipulation of high-definition, H.264-enabled movies. QuickTime X methods for more
efficient media playback in your Xcode project are also discussed.
● “Building a Simple QTKit Recorder Application” (page 35) discusses step-by-step how you can build a
simple yet powerful QTKit recorder application that lets you capture a video stream and record the media
to a QuickTime movie.
● “Adding Audio Input and DV Camera Support” (page 51) describes how you can extend the functionality
of your QTKit recorder player application by adding support for audio input and DV cameras with only
a few lines of Objective-C code.
● “Creating a QTKit Stop Motion Application” (page 55) describes how you can construct a simple stop
motion recorder application that lets you capture a live video feed, grab frames one at a time with great
accuracy, and then record the output of those frames to a QuickTime movie––with less than 100 lines
of Objective-C code.
To build your QTKit media player and recorder projects, make sure you are running Mac OS X v10.5 or later
and have these Apple developer tools installed on your system:
● Xcode 3.2 and Interface Builder 3.2. Apple provides a comprehensive suite of developer tools for
creating Mac OS X software. These tools include applications to help you design, create, debug, and
optimize your software. The suite also includes header files, sample code, and documentation. You can
download the Xcode tools from the Apple Developer Connection website. Registration is required, but
free.
Sample Code
The tutorial is based on the following three code samples, which you can download from the Apple Developer Connection website or view from within Xcode:
● MyMediaPlayer (constructed in “Creating a Simple QTKit Media Player Application” (page 9) and
“Extending the Media Player Application” (page 19)), which demonstrates how you can build, compile
and extend a simple yet powerful application for playback of video and audio media.
● MyRecorder (constructed in “Building a Simple QTKit Recorder Application” (page 35) and “Adding
Audio Input and DV Camera Support” (page 51)), which demonstrates how you can use the QTKit Capture
programming interface to create a fully functional application for recording audio/video media.
● StopMotion (constructed in “Creating a QTKit Stop Motion Application” (page 55)) demonstrates how
you can build an application for recording single frame images and outputting those images for playback
as a QuickTime movie.
See Also
For more information on the technologies and tools you use in this tutorial, consult the following Apple
documentation:
● QTKit Framework Reference contains the class and protocol reference documentation for the QTKit
framework.
● Interface Builder User Guide describes the latest version of Interface Builder 3.
● QTKit Application Programming Guide discusses in-depth the QTKit software architecture, best design
practices and coding techniques you can use in developing feature-rich media applications in Mac OS
X.
The various QuickTime and Cocoa mailing lists also provide a useful developer forum for raising issues and
answering questions that are posted. To subscribe, check out the QuickTime-API Mailing List and the Cocoa
Development list.
CHAPTER 1
Creating a Simple QTKit Media Player Application
2. When the new project window appears, select Mac OS X > Application > Cocoa Application.
In the Options panel, check Create document-based application, then click Choose.
This builds a Cocoa-based application written in Objective-C that uses the NSDocument architecture.
3. Name the project MyMediaPlayer and navigate to the location where you want the Xcode application
to create the project folder.
4. Verify that the files in your project match those in the illustration.
Note that the icons representing both the MyDocument.xib and MainMenu.xib nibs have the extension
.xib, which indicates that your project is using Interface Builder 3.x. The xib file format is preferred
during development because it is a diff-able, SCM-friendly text format. At build time, Xcode
automatically converts your project’s xib files to nib files so that they can be deployed with your
application. If you have existing nib files, however, you can continue saving in that format.
● From the Action menu in your Xcode project, choose Add > Add to Existing Frameworks.
● Click the Target “MyMediaPlayer” Info panel and verify you have linked to the QTKit.framework and
the Type Required is selected, as shown in the illustration.
● In the same Target “MyMediaPlayer” Info window, select the Properties button and open the
Properties pane.
● Select the DocumentType row. In the Extensions column, enter mov, and in the OS Types column,
enter MooV.
● In the DocumentType row, select Binary from the Store Type and Viewer from the Role pull-down
menu.
This completes the first basic sequence of steps in your project. In the next few sequences, you declare
the instance methods you need in Xcode before working with Interface Builder.
2. After the file’s import statement for Cocoa, add an import statement for QTKit.
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
3. Declare the instance variable movie that points to the QTMovie object.
{
QTMovie *movie;
}
The line returns the QTMovie object that you will bind your movie to later on.
Note: To clarify the Model-View-Controller separation in Cocoa, understand that the QTMovie object is
owned by the document and bound to the QTMovieView object; it does not “belong” to the QTMovieView
object.
At this point, the code in your MyDocument.h file should look like this (the @interface body reflects the declarations you just added; the property declaration pairs with the @synthesize directive you add in the implementation file):

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyDocument : NSDocument
{
    QTMovie *movie;
}

@property (retain) QTMovie *movie;

@end
2. Set a URL location for obtaining the contents of your movie document and use the setMovie: method
to set the document’s movie to be the QTMovie object you just created.
● Scroll down to the block of code that includes the - (BOOL)readFromData: method.
● Replace that block of code with the following code in the next step.
3. Add the @synthesize directive to generate the getter and setter methods you need and to complete
your implementation.
@synthesize movie;
4. Deallocate memory.
- (void)dealloc
{
    if (movie) {
        [movie release];
    }
    [super dealloc];
}
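Taken together, the implementation-file additions from this stage can be sketched as follows. This is a sketch only: it shows just the pieces this chapter adds to the document class, and the readFromURL:ofType:error: body shown here (creating the movie with movieWithURL:error:) is one reasonable way to satisfy the step above, not necessarily the exact listing from the original sample.

```objc
#import "MyDocument.h"

@implementation MyDocument

@synthesize movie;

// Set the document's contents by creating a QTMovie from the given URL.
- (BOOL)readFromURL:(NSURL *)absoluteURL ofType:(NSString *)typeName error:(NSError **)outError
{
    QTMovie *newMovie = [QTMovie movieWithURL:absoluteURL error:outError];
    if (newMovie)
        [self setMovie:newMovie];
    return (newMovie != nil);
}

- (void)dealloc
{
    [movie release];
    [super dealloc];
}

@end
```

The remaining NSDocument boilerplate (such as windowNibName) stays as the Xcode template generated it.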
This completes the second stage of your project. Next, you construct the user interface for your project using
Interface Builder 3.2.
Because of the integration between Xcode 3.2 and Interface Builder 3.2, the methods you declared in
your MyDocument.h file have been synchronously updated in Interface Builder 3.2.
3. In Interface Builder 3.2, select Tools > Library to open a library of plug-in controls.
● Scroll down until you find the QuickTime Movie View control in the objects library.
The QTMovieView object provides you with an instance of a view subclass to display QuickTime movies
that are supplied by QTMovie objects in your project.
4. Select the “Your document contents here” text object in the window and press Delete.
5. Drag the QTMovieView object from the library into your window and resize the object to fit the window.
Note that the object combines both a view and a control for playback of media.
● In the Inspector panel, select the Movie View Attributes icon, which appears as the first icon in the
row at the top of the panel.
7. Set the autosizing for the QTMovieView object in the Movie View Size, as shown in the illustration.
8. Connect the movie object to the QuickTime movie to be displayed. You do this by defining a Cocoa
binding:
● In the QTMovieView Inspector (select the QTMovieView and open with Command-Shift-I, if needed),
navigate to the Bindings panel. (You can also access it by selecting Tools > Bindings Panel.)
● In the Parameters section of the Bindings panel, open the movie parameter by clicking the disclosure
triangle next to it.
In this case, by specifying the binding above, you’ve instructed Cocoa essentially to create a connection
at runtime between the object that loads the user interface––that is, the File’s Owner object––and the
QTMovieView object. This connection provides the QTMovieView object with a reference to the
QuickTime movie that is to be opened and displayed.
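Interface Builder records this binding in the nib, but it may help to see what the nib is doing for you: the same connection could be made in code with Cocoa's bind:toObject:withKeyPath:options: method. In this sketch, movieView and document are hypothetical local references to the QTMovieView and the File's Owner (the document); the tutorial itself establishes the binding in Interface Builder, not in code.

```objc
// Code-level equivalent of the nib-defined binding: the view's "movie"
// binding observes the document's movie property via key-value observing.
[movieView bind:@"movie"
       toObject:document
    withKeyPath:@"movie"
        options:nil];
```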
9. Save your MyDocument.xib file in Interface Builder and return to your Xcode project.
That completes the user interface construction and coding of your media player. Now you’re ready to build
and compile the application in Xcode.
Before you build and compile your project, you need to specify that your player will open QuickTime movies,
that is, movies with the extension .mov. You accomplish this as follows:
Now you are ready to build and run the MyMediaPlayer project in Xcode. When the player launches, you can
open and play any QuickTime movie of your choice. Simply locate a .mov file and launch the movie from
the File > Open menu in your media player application.
Summary
● Create a Cocoa document-based application that lets you open and play QuickTime movies with fewer
than ten lines of Objective-C code.
● Work with Cocoa Bindings to create a connection at runtime between the object that loads the user
interface and the object that displays and plays back a QuickTime movie.
CHAPTER 2
Extending the Media Player Application
In this chapter, you extend your QTKit media player beyond the simple player you constructed in the previous
chapter. This time, when completed, your QTKit media player application allows you not only to display and
play back a video or audio file as a QuickTime movie but also to edit the contents of that movie file. To
implement this media player, you’ll be surprised, again, at how few lines of Objective-C code you’ll have to
write.
The goal of this chapter is to build on and extend your knowledge of the methods available in the QTKit
framework. The focus, as in the previous chapter, is on how to accomplish media playback and, in this case,
editing of video/audio content as efficiently as possible with a minimum of Objective-C code and in
conformance with techniques for best practices in developing Cocoa applications.
2. Select the MyMediaPlayer project you created in the previous chapter and open it.
4. Declare an mMovieView instance variable to point to the QTMovieView Interface Builder outlet, following
the line of code in which you declared the movie instance variable pointing to the QTMovie object.
At this point, the code in your MyDocument.h file should look like this (with the new outlet added to the @interface body):

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyDocument : NSDocument
{
    QTMovie *movie;
    IBOutlet QTMovieView *mMovieView;
}

@property (retain) QTMovie *movie;

@end
This completes the first stage of your project. Now you use Interface Builder 3.2 to construct the user interface
for your project.
Interface Builder and Xcode are designed to work seamlessly together, enabling you to construct and
implement the various elements in your project more efficiently and with less code overhead.
1. Launch Interface Builder 3.2 and open the MyDocument.xib file in your Xcode project window that you
created following the steps in the previous chapter in “Create the User Interface with Interface
Builder” (page 14).
2. Control-drag to wire up the File’s Owner to the movie view object, specifying the mMovieView
instance variable as an outlet.
This completes the sequence of steps for constructing your media player user interface. In the next sequence,
you return to your MyDocument.m implementation file to add the necessary blocks of code for your project.
1. Open the MyDocument.m file and scroll down to the block of code that begins with the following:
- (void)windowControllerDidLoadNib:(NSWindowController *) aController
2. To show the movie control grow box, send the setShowsResizeIndicator: method to the movie view with an argument of YES.
3. To hide the window’s resize indicator so it does not interfere with the movie control, send the setShowsResizeIndicator: method to the window with an argument of NO.
4. Add the following lines so that your code looks like this:
- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    [mMovieView setShowsResizeIndicator:YES];
    [[mMovieView window] setShowsResizeIndicator:NO];
}
5. Inside the block of code beginning with the readFromURL:ofType:error: method, which sets the
contents of the document by reading from a file or file package, of a specified type, located by a URL,
add this line:

[self setMovie:newMovie];
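For reference, the completed readFromURL:ofType:error: method, including the editable attribute discussed next, might look like this. This is a sketch: your listing may differ in details, but the setAttribute:forKey: call is the one line that matters for this chapter.

```objc
- (BOOL)readFromURL:(NSURL *)absoluteURL ofType:(NSString *)typeName error:(NSError **)outError
{
    QTMovie *newMovie = [QTMovie movieWithURL:absoluteURL error:outError];
    if (newMovie) {
        // Mark the movie as editable so cut, copy, and paste operations work.
        [newMovie setAttribute:[NSNumber numberWithBool:YES]
                        forKey:QTMovieEditableAttribute];
        [self setMovie:newMovie];
    }
    return (newMovie != nil);
}
```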
By calling the setAttribute:forKey: method and using the QTMovie editable attribute, you’ve
marked the movie as editable. This means that any editing operations you choose, such as cut, copy, and
paste, can be performed on the movie itself. The value for this key is of type NSNumber, interpreted as
a Boolean value. If the movie can be edited, the value is YES. Understanding how to use attributes to
specify and perform certain tasks is important in working with the QTKit API. Attributes you can access
in the QTKit framework are discussed in the next chapter, “Customizing the Media Player
Application” (page 25).
That completes the construction and coding of your extended media player application. Now you’re ready
to build and compile the application in Xcode.
After you’ve completed these steps, launch your media player in Xcode, open and display QuickTime movies
and perform editing on those movies.
1. In Xcode, build and run the media player application. In File > Open, select a movie and open it. The
movie is completely editable with a slider bar for editing.
2. To access the editing features of the player or to control playback, stopping or starting the sample movie,
control-click anywhere in the movie. A contextual menu appears.
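The commands in that contextual menu are ordinary target-action methods on QTMovieView, so you could also drive playback and editing from your own menu items or buttons. A brief sketch (these are standard QTMovieView action methods; self here stands in for whatever sender you use):

```objc
[mMovieView play:self];       // start playback
[mMovieView pause:self];      // pause playback
[mMovieView selectAll:self];  // select the entire movie
[mMovieView cut:self];        // cut the current selection
[mMovieView paste:self];      // paste at the current movie time
```

Because the movie was marked editable with QTMovieEditableAttribute, the editing actions operate on the movie itself.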
Summary
● Extend the functionality of the MyMediaPlayer application, enabling you to open, play and edit
QuickTime movies or audio files.
● In code, mark the movie as “editable” to perform editing operations such as cut, copy, and paste by
using the setAttribute:forKey: method.
● Show the movie controller grow box using the setShowsResizeIndicator: method on the movie view, and hide the window’s resize indicator using the same method on the window.
CHAPTER 3
Customizing the Media Player Application
In this chapter, you further extend the capabilities and performance of your media player application. You
begin by adding your own control buttons from the Interface Builder library of plug-in controls. These buttons
enable you to control movie playback without relying on the built-in controls available in the QTMovieView
object. Next, you work with movie attributes that enable you to better handle the display and playback of
movies. You do this by using code to set up view resizing that respects the size and aspect ratio of displayed
media content. In the last section of this chapter, you implement, with QuickTime X, the capabilities of a
lightweight media player for faster, more efficient playback of media content.
The chapter builds on the understanding of the QTKit API you’ve gained from the previous chapters. The
code samples discussed in this chapter rely on the media player application you built in “Extending the Media
Player Application” (page 19).
Adding custom controls for movie playback to your media player project is relatively straightforward. You
define the instance variables in your declaration file and then launch Interface Builder to add the controls
from the library of available plug-in controls and wire up the play and pause buttons to the QTMovieView
object for custom control of movie playback.
Working with the MyMediaPlayer sample project you constructed in “Extending the Media Player
Application” (page 19), in the steps that follow, you create custom controls for your player application.
3. Declare an mMovieWindow instance variable that points to an NSWindow object as an Interface Builder
outlet.
4. Declare an mMovieView instance variable that points to a QTMovieView Interface Builder outlet.
At this point, the code in your MyDocument.h file should look like this (with both outlets added to the @interface body):

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyDocument : NSDocument
{
    QTMovie *movie;
    IBOutlet QTMovieView *mMovieView;
    IBOutlet NSWindow *mMovieWindow;
}

@property (retain) QTMovie *movie;

@end
Now you’re ready to add the control buttons you need to customize media playback of content in your Xcode
project.
1. Launch Interface Builder 3.2 and open the MyDocument.xib file in your Xcode project window.
2. Resize the QTMovieView object in your window so that you allow space at the bottom for placement
of your control buttons.
3. Choose Tools > Library to open the Interface Builder library of controls and scroll down until you find
the oval push button (NSButton) controls you need for your project.
● Select and drag two buttons, which you can name Play and Pause, into the lower region of your
window.
● Control-drag the Play button to the play: action of the QTMovieView object and the Pause button
to the pause: action of the QTMovieView object to wire up and connect both objects.
● In the Inspector panel, select the Movie View Attributes icon, which appears as the first icon in the
row at the top of the panel.
5. Save the file in Interface Builder and return to your Xcode project.
After you’ve completed the sequence of steps, simply build and run the MyMediaPlayer project in Xcode.
When the player launches, open and play any QuickTime movie of your choice.
Locate a .mov file and launch the movie from the File > Open menu in your media player application. Use
the custom controls you’ve added to play and pause the playback of media content.
If you open and review the QTMovie Class Reference, you find a series of tables that describe movie attributes.
These movie attributes are some of the most powerful features of the QTKit API, in that you can access the
data in a QTMovie object, which is stored as attributes, by using an attribute key. The data encapsulated as
attributes can be included in a dictionary and used appropriately in your code to handle a number of important
tasks.
Whenever you instantiate a new QTMovie object, you can use its specified attributes to perform certain tasks.
For example, you can use movie attributes in a dictionary to specify:
● The playback characteristics of the movie or other properties of the QTMovie object
Here are some of the keys that specify attributes of movie data you can access and manipulate in your
application. The complete list of movie attributes is available in the QTKit Framework Reference.
Attribute                      Description
QTMovieIsSteppableAttribute    Indicates whether the movie can be stepped from frame to frame.
QTMovieNaturalSizeAttribute    Returns the size of the movie when displayed at full resolution.
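Reading an attribute is a one-line call to attributeForKey:, which returns an object you unwrap according to the attribute's documented type. For example, assuming movie is a loaded QTMovie:

```objc
// Boolean attributes come back as NSNumber; size attributes as NSValue.
BOOL steppable = [[movie attributeForKey:QTMovieIsSteppableAttribute] boolValue];
NSSize naturalSize = [[movie attributeForKey:QTMovieNaturalSizeAttribute] sizeValue];
NSLog(@"steppable: %d, natural size: %@", steppable, NSStringFromSize(naturalSize));
```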
In “Extending the Media Player Application” (page 19), you already accessed and set one of the important
attribute keys available in the QTMovie class when you added the following single line of code to your media
player application:

[newMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute];

By setting that attribute, you enabled your media player to support simple cut, copy, and paste editing on
the movie.
1. Modify the code in your MyDocument.h declaration file so that it looks like this:
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>
@end
3. In your MyDocument.m implementation file, add the following line of code after your #import
"MyDocument.h" statement:

static const CGFloat kDefaultWidthForNonvisualMovies = 320;

This specifies a static default constant width of 320 pixels for movies that are not visual, that is, typically
audio files such as .mp3 audio clips. You want to be able to play back these files at
the appropriate width and height, and not as fully expanded QuickTime movies.
4. After your @implementation MyDocument directive, set up a notification that the natural size of your
movie has changed, if you are deploying your application using Mac OS X v10.6.
- (void)dealloc
{
    QTMovie *movie = [self movie];
    if (movie)
    {
        [[NSNotificationCenter defaultCenter] removeObserver:self
#if defined(MAC_OS_X_VERSION_10_6) && (MAC_OS_X_VERSION_MIN_REQUIRED >= MAC_OS_X_VERSION_10_6)
            name:QTMovieNaturalSizeDidChangeNotification
#else
            name:QTMovieEditedNotification
#endif
            object:movie];
    }
    [self setMovie:nil];
    [super dealloc];
}
5. Now, in the windowControllerDidLoadNib: method, set the movie on the view and register as an observer of the natural-size notification, so the window can be sized to the movie’s natural size:

- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    QTMovie *movie = [self movie];
    if (movie)
    {
        [mMovieView setMovie:movie];
        [[NSNotificationCenter defaultCenter] addObserver:self
            selector:@selector(movieNaturalSizeDidChange:)
#if defined(MAC_OS_X_VERSION_10_6) && (MAC_OS_X_VERSION_MIN_REQUIRED >= MAC_OS_X_VERSION_10_6)
            name:QTMovieNaturalSizeDidChangeNotification
#else
            name:QTMovieEditedNotification
#endif
            object:movie];
    }
    else
        [mMovieView setMovie:[QTMovie movie]];
    [mMovieView setShowsResizeIndicator:YES];
}
- (void)sizeWindowToMovie
{
    QTMovie *movie = [self movie];
    NSSize contentSize = [[movie attributeForKey:QTMovieNaturalSizeAttribute] sizeValue];
    if ([mMovieView isControllerVisible])
    {
        contentSize.height += [mMovieView movieControllerBounds].size.height;
    }
    if (contentSize.width == 0)
    {
        contentSize.width = kDefaultWidthForNonvisualMovies;
    }
    [mMovieWindow setContentSize:contentSize];
}
- (void)movieNaturalSizeDidChange:(NSNotification *)notification
{
[self sizeWindowToMovie];
}
    if (newMovie) {
        [newMovie setAttribute:[NSNumber numberWithBool:YES]
                        forKey:QTMovieEditableAttribute];
        [self setMovie:newMovie];
    }
    return (newMovie != nil);
}
10. At the top of your file––after the static const CGFloat kDefaultWidthForNonvisualMovies = 320; statement––declare the interface for the sizeWindowToMovie method.
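One way to make that declaration is a category on MyDocument at the top of MyDocument.m. This is a sketch (the category name SizeToMovie is an arbitrary choice, not from the original sample); any forward declaration that makes the selector visible before its first use works equally well.

```objc
// Forward declaration so windowControllerDidLoadNib: can call the method
// before its definition appears later in the file.
@interface MyDocument (SizeToMovie)
- (void)sizeWindowToMovie;
@end
```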
12. Click the MyDocument.xib in your project and open the nib in Interface Builder.
13. In Interface Builder, Control-drag from the File’s Owner icon to the QTMovieView object and wire up
the outlet to mMovieView.
16. Open any video/audio media content that QuickTime supports, such as H.264 movies, and it will display
and render at the appropriate, original size of the media.
By adding these blocks of code, you sized the window to the natural size of the movie when it opens and
displays media content, and you added a notification observer to ensure that the movie resizes to the size
of the window and notifies you when the size changes. These are important concepts to understand in
developing applications that behave properly and play back QuickTime and media content at its original
size.
If you need to play back media, you can take advantage of the high-performance playback efficiency provided
by QuickTime X in Mac OS X v10.6. You gain access to the media playback services through the QTKit
framework.
QuickTime X is a new media architecture developed by Apple that provides media services for QTKit application
developers who need to play back audio/video media. Available in Mac OS X v10.6, it is specifically designed
for efficient, high-performance playback of audio/video media, with optimized support for modern codecs
such as H.264 and AAC.
Using QuickTime X, you can open a media file, gather information about the playback characteristics of the
movie (such as its duration, the codecs used, and thumbnail images), display the movie in a view, and control
the loading and playback of that movie. However, movies using the new media services cannot be edited
or exported.
In Mac OS X v10.6, two new movie attributes are defined. The first attribute,
QTMovieOpenAsyncRequiredAttribute, indicates whether a QTMovie object must be opened
asynchronously. The second attribute, QTMovieOpenForPlaybackAttribute, indicates whether a QTMovie
object will be used only for playback and not for editing or exporting.
The default behavior of QTMovie is to open movies that can be made editable and exportable. You pass the
QTMovieOpenForPlaybackAttribute key with the value NSNumber numberWithBool:YES in the
dictionary of attributes passed to initWithAttributes:error:. This indicates that you are interested
only in playing the movie. If you don’t need to do editing or exporting, QTMovie may be able to select more
efficient code paths.
To open and play back a movie file, you use code like the following:
- (void)windowControllerDidLoadNib:(NSWindowController *)aController
{
    [super windowControllerDidLoadNib:aController];
    if ([self fileName]) {
        NSDictionary *attributes = [NSDictionary dictionaryWithObjectsAndKeys:
            [self fileName], QTMovieFileNameAttribute,
            [NSNumber numberWithBool:YES], QTMovieOpenForPlaybackAttribute,
            nil];
        movie = [[QTMovie alloc] initWithAttributes:attributes error:NULL];
        [movieView setMovie:movie];
        [movie release];
        [[movieView movie] play];
    }
}
Because the attributes dictionary contains a key-value pair with the QTMovieOpenForPlaybackAttribute
key and the value YES, QTKit uses the new media services, if possible, to play back the media content in the
selected file.
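Because a movie opened this way cannot be edited or exported, your application may want to check at run time before enabling editing UI. One hedged approach (a sketch, not from the original sample) is to query the movie's editable attribute after opening:

```objc
// A movie opened with QTMovieOpenForPlaybackAttribute is playback-only,
// so its editable attribute reports NO.
BOOL editable = [[movie attributeForKey:QTMovieEditableAttribute] boolValue];
if (!editable) {
    // Disable cut/copy/paste menu items; this movie uses the
    // playback-only QuickTime X path.
}
```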
Summary
● Customize the functionality of the MyMediaPlayer application, enabling you to create custom start and
stop controls for the playback of QuickTime movie or audio files.
● Use movie attributes to access the data in a QTMovie object, enabling you to access and manipulate
movie data in your application.
● Modify the MyMediaPlayer application for the playback of movies and media at their original or natural
size.
● Access the new media services available in QuickTime X and Mac OS X v10.6.
If you’ve worked through the coding examples in the first three chapters of this tutorial, you are well on your
way toward mastering the skills you need to have in order to develop applications for media playback.
The next three chapters introduce you to the methods and classes in the QTKit API that let you do capture
and recording of media content. You build a recorder application that is simple to design and implement,
yet powerful in its functionality.
Beyond that, you add code to the recorder application to support the capture of DV video and audio media,
and then create another application that enables you to capture and record still images, using the techniques
employed by the movie industry for stop motion animation. After recording those still images, you’ll be able
to output the content to a QuickTime movie, creating some unusual, animated effects.
CHAPTER 4
Building a Simple QTKit Recorder Application
In this chapter, you build a simple QTKit recorder application. When completed, your QTKit recorder will
allow you to capture a video stream and record the output of that stream to a QuickTime movie. To implement
this recorder, you’ll be surprised at how few lines of Objective-C code you have to write.
Your recorder application will have simple start and stop buttons that allow you to output and display your
captured files in QuickTime Player. For this project, you need an iSight camera, either built-in or plugged into
your Mac.
In building your QTKit recorder application as described in this tutorial, you work essentially with the following
three QTKit capture objects:
● QTCaptureSession. The primary interface for managing a capture, coordinating the flow of data from input devices to outputs.
● QTCaptureDeviceInput. The input source for media devices, such as cameras and microphones.
● QTCaptureMovieFileOutput. The output destination that writes the captured media to a QuickTime movie file.
Rather than simply jumping into Interface Builder and constructing your prototype there, visualize the
elements first in a rough design sketch, thinking of the design elements you want to incorporate into the
application.
[Design sketch: a QTCaptureView object filling most of the window, with Start and Stop control buttons below it]
In this prototype, you use three simple Cocoa objects: a QuickTime capture view and two control buttons
to start and stop the actual recording process. These are the building blocks for your application. After you’ve
sketched them out, you can begin to think of how to hook them up in Interface Builder and what code you
need in your Xcode project to make this happen.
Follow the same steps you performed when you created and built the MyMediaPlayer application, described
in “Creating a Simple QTKit Media Player Application” (page 9).
2. When the new project window appears, select Cocoa Application and click Choose.
3. Name the project MyRecorder and navigate to the location where you want the Xcode application to
create the project folder. The Xcode project window appears, as shown here.
● From the Action menu in your Xcode project, choose Add > Add to Existing Frameworks.
● Click the Target “MyRecorder” Info panel and verify that you have linked to the QTKit.framework and
that the Type Required is selected.
This completes the first sequence of steps in your project. In the next sequence, you define actions and
outlets in Xcode before working with Interface Builder.
1. Choose File > New File. In the panel, scroll down and select Cocoa > Objective-C class, which includes
the <Cocoa/Cocoa.h> header files.
2. Name your implementation file MyRecorderController.m. Check the item in the field below to name
your declaration file MyRecorderController.h.
In your MyRecorderController.h file, after the import statement for Cocoa, add an import statement for QTKit:

#import <QTKit/QTKit.h>
1. In your MyRecorderController.h file, add the instance variable mCaptureView, which points to the
QTCaptureView object as an outlet.
2. Declare the startRecording: and stopRecording: actions:

- (IBAction)startRecording:(id)sender;
- (IBAction)stopRecording:(id)sender;
At this point, the code in your MyRecorderController.h file should look like this:

#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyRecorderController : NSObject
{
    IBOutlet QTCaptureView *mCaptureView;
}

- (IBAction)startRecording:(id)sender;
- (IBAction)stopRecording:(id)sender;

@end
3. Open the MyRecorderController.m file and add the following actions, along with the requisite braces.
- (IBAction)startRecording:(id)sender
{
}
- (IBAction)stopRecording:(id)sender
{
}
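The action bodies stay empty for now; once the capture session is wired up later in the chapter, they typically reduce to starting and stopping a movie file output. A sketch of where this is heading (mCaptureMovieFileOutput stands for the QTCaptureMovieFileOutput instance variable you add when you build the session, and the output path is just an example):

```objc
- (IBAction)startRecording:(id)sender
{
    // Begin writing captured media to a QuickTime movie file.
    [mCaptureMovieFileOutput recordToOutputFileURL:
        [NSURL fileURLWithPath:@"/Users/Shared/My Recorded Movie.mov"]];
}

- (IBAction)stopRecording:(id)sender
{
    // Passing nil stops recording to the current output file.
    [mCaptureMovieFileOutput recordToOutputFileURL:nil];
}
```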
This completes the second stage of your project. Now use Interface Builder 3.2 to construct the user interface
for your project.
Now you construct and implement the various user interface elements in your project.
● Scroll down until you find the QuickTime Capture View object.
The QTCaptureView object provides you with an instance of a view subclass to display a preview
of the video output that is captured by a capture session.
● Drag the QTCaptureView object into your window and resize the object to fit the window, allowing
room for the two Start and Stop buttons in your QTKit recorder.
● Set the autosizing for the object in the Capture View Size panel, as illustrated below.
● Verify the controls, appearance, behavior, and memory items are checked, as shown in the illustration.
● Create the Start and Stop buttons for the MyRecorder Window.
● In the Library, select the Push Button control and drag it to the Window.
● Enter the text Start, then duplicate the button and rename the copy Stop.
6. Set up the autosizing for your buttons (as shown in the illustration) by selecting the button and clicking
the Button Size Inspector.
● In the Library, scroll down and select the blue cube control, which is an object (NSObject) you can
instantiate as your controller.
● Select the object and enter its name as My Recorder Controller. Then click the information
icon in the Inspector.
● Click the Class Identity field and enter the first few letters of the class name; Interface Builder
autocompletes it for you.
The MyRecorderController object appears. If you have saved both your declaration and
implementation files as specified in “Determine the Actions and Outlets You Want” (page 37), Interface
Builder automatically updates the MyRecorderController class specified in your Xcode implementation
file. To confirm that the update has occurred, press Return. If the Identity field is not
automatically updated, you may need to specify manually that it is a MyRecorderController object.
In the next phase of your project, you continue working with Interface Builder and Xcode to construct and
implement the various elements in your project. Because Interface Builder and Xcode are tightly integrated,
the actions and outlets you declare in code stay synchronized with your nib file.
● Control-drag from the MyRecorderController object in your nib file to the QTCaptureView
object.
● A transparent panel appears, displaying the IBOutlet instance variable, mCaptureView, that you’ve
specified in your declaration file.
2. Click the Interface Builder outlet mCaptureView to wire up the two objects.
Now you’re ready to add your Start and Stop push buttons and wire them up in your MainMenu.nib window.
1. Control-drag each of the Start and Stop buttons from the window to the MyRecorderController
object.
● Click the startRecording: method in the transparent Received Actions panel to connect the Start
button.
● Likewise, click the stopRecording: method in the Received Actions panel to connect the Stop
button.
2. Control-drag a connection from the window to the MyRecorderController object and click the delegate
outlet, connecting the two objects.
3. To verify that you’ve correctly wired up your window object to your delegate object, select the Window
and click the Window Connections Inspector icon.
4. Verify that you’ve correctly wired up your outlets and received actions.
Select the My Recorder Controller object and click the My Recorder Controller Connections icon.
5. Check the My Recorder Controller Identity Inspector panel to confirm the class actions and class outlets.
6. Click the MainMenu.xib file to verify that Interface Builder and Xcode have worked together to
synchronize the actions and outlets you’ve specified.
A small green light appears at the bottom-left corner of the MainMenu.xib window, next to
MyRecorder.xcodeproj, to confirm this synchronization.
8. Verify that your QTKit MyRecorder application appears as shown in the illustration below.
You’ve now completed your work in Interface Builder 3.2. In this next sequence of steps, you return to your
Xcode project and add a few lines of code in both your declaration and implementation files to build and
compile the application.
To complete the implementation of the MyRecorderController class, you need to define the instance
variables that point to the capture session, as well as to the input and output objects. In your Xcode project,
you add the instance variables to the interface declaration.
The complete code for your declaration file should look like this:
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyRecorderController : NSObject
{
    IBOutlet QTCaptureView *mCaptureView;

    QTCaptureSession *mCaptureSession;
    QTCaptureMovieFileOutput *mCaptureMovieFileOutput;
    QTCaptureDeviceInput *mCaptureDeviceInput;
}

- (IBAction)startRecording:(id)sender;
- (IBAction)stopRecording:(id)sender;

@end
In the remaining steps, you modify the MyRecorderController.m implementation file. These steps
are arranged in logical, though not necessarily rigid, order.
- (void)awakeFromNib
{
mCaptureSession = [[QTCaptureSession alloc] init];
4. Find the device and create the device input. Then add it to the session.
6. Specify the compression options, using one identifier for the video size and another for the audio quality.
[mCaptureMovieFileOutput setCompressionOptions:compressionOptions
forConnection:connection];
7. Associate the capture view in the user interface with the session.
[mCaptureView setCaptureSession:mCaptureSession];
}
8. Start the capture session running.
[mCaptureSession startRunning];
}
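Assembled from Steps 3 through 8, the body of awakeFromNib might look like the following sketch. The compression-option identifiers and instance variables come from this chapter; the connection-enumeration loop and the exact error handling are assumptions based on the pattern the tutorial describes, not the definitive sample code.

```objc
// Sketch of the complete awakeFromNib, assembled from Steps 3-8.
// The connection loop and error handling are assumptions.
- (void)awakeFromNib
{
    NSError *error = nil;

    // Step 3: Create the capture session.
    mCaptureSession = [[QTCaptureSession alloc] init];

    // Step 4: Find the default video device, open it, create the
    // device input, and add it to the session.
    QTCaptureDevice *videoDevice =
        [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
    if ([videoDevice open:&error]) {
        mCaptureDeviceInput =
            [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice];
        [mCaptureSession addInput:mCaptureDeviceInput error:&error];
    }

    // Step 5: Create the movie file output and add it to the session.
    mCaptureMovieFileOutput = [[QTCaptureMovieFileOutput alloc] init];
    [mCaptureSession addOutput:mCaptureMovieFileOutput error:&error];
    [mCaptureMovieFileOutput setDelegate:self];

    // Step 6: Set compression options on each video and audio connection.
    for (QTCaptureConnection *connection in
            [mCaptureMovieFileOutput connections]) {
        QTCompressionOptions *compressionOptions = nil;
        NSString *mediaType = [connection mediaType];
        if ([mediaType isEqualToString:QTMediaTypeVideo]) {
            compressionOptions = [QTCompressionOptions
                compressionOptionsWithIdentifier:
                    @"QTCompressionOptions240SizeH264Video"];
        } else if ([mediaType isEqualToString:QTMediaTypeSound]) {
            compressionOptions = [QTCompressionOptions
                compressionOptionsWithIdentifier:
                    @"QTCompressionOptionsHighQualityAACAudio"];
        }
        [mCaptureMovieFileOutput setCompressionOptions:compressionOptions
                                         forConnection:connection];
    }

    // Step 7: Associate the capture view with the session.
    [mCaptureView setCaptureSession:mCaptureSession];

    // Step 8: Start the session.
    [mCaptureSession startRunning];
}
```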
9. Handle the window-closing notification by stopping the capture session and closing the input device.
- (void)windowWillClose:(NSNotification *)notification
{
[mCaptureSession stopRunning];
[[mCaptureDeviceInput device] close];
}
- (void)dealloc
{
[mCaptureSession release];
[mCaptureDeviceInput release];
[mCaptureMovieFileOutput release];
[super dealloc];
}
11. Implement start and stop actions, and specify the output destination for your recorded media, in this
case a QuickTime movie (.mov) in your /Users/Shared folder.
- (IBAction)startRecording:(id)sender
{
[mCaptureMovieFileOutput recordToOutputFileURL:[NSURL
fileURLWithPath:@"/Users/Shared/My Recorded Movie.mov"]];
}
- (IBAction)stopRecording:(id)sender
{
[mCaptureMovieFileOutput recordToOutputFileURL:nil];
}
12. Implement the delegate method that's called when recording finishes, and open the recorded QuickTime movie.
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL forConnections:(NSArray
*)connections dueToError:(NSError *)error
{
[[NSWorkspace sharedWorkspace] openURL:outputFileURL];
}
Important: In Step 6, you performed an important operation that needs to be called out and fully understood.
You specified the compression options for your captured output using one identifier that defines the video
size and another that defines the audio quality.
The first identifier, @"QTCompressionOptions240SizeH264Video", compresses the captured video using
the H.264 codec at a medium bit rate, with dimensions no larger than 320 x 240. Had you not included this
identifier and the accompanying block of code, your captured video would have been saved as raw video, a
much larger file requiring significant disk space and bandwidth.
The second identifier, @"QTCompressionOptionsHighQualityAACAudio", defines the audio output as
high-quality AAC audio.
After you’ve saved your project, click Build and Go. After compiling, click the Start button to record, and the
Stop button to stop recording. The output of your captured session is saved as a QuickTime movie in the
path you specified in this code sample.
Using a simple iSight camera, you now have the capability of capturing and recording media at a specific
size and audio quality, and then outputting your recording to a QuickTime movie.
Summary
In this chapter, as a programmer, you focused on building a recorder application using three capture objects:
QTCaptureSession, QTCaptureMovieFileOutput and QTCaptureDeviceInput. These were the essential
building blocks for your project. You learned how to:
● Prototype and design the data model and controllers for your project in a rough sketch before constructing
the actual user interface.
● Create the user interface using the QTCaptureView plug-in from the Interface Builder library of controls.
● Specify compression options with one identifier for the size of your recorded video and another for the
quality of your audio.
● Build and run the recorder application, using the iSight camera as your input device.
CHAPTER 5
If you worked through the sequence of steps outlined in the previous chapter, you’re now ready to extend
the functionality of your QTKit recorder application.
In this chapter, you add audio input capability to your recorder application, as well as support for input from
DV cameras. This is accomplished with only a dozen lines of Objective-C code, with error handling included.
QTCaptureDeviceInput *mCaptureVideoDeviceInput;
QTCaptureDeviceInput *mCaptureAudioDeviceInput;
These are the audio and video input device variables that enable you to record audio, as well as
video from external DV cameras.
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyRecorderController : NSObject
{
    IBOutlet QTCaptureView *mCaptureView;

    QTCaptureSession *mCaptureSession;
    QTCaptureMovieFileOutput *mCaptureMovieFileOutput;
    QTCaptureDeviceInput *mCaptureVideoDeviceInput;
    QTCaptureDeviceInput *mCaptureAudioDeviceInput;
}

- (IBAction)startRecording:(id)sender;
- (IBAction)stopRecording:(id)sender;

@end
1. Scroll down to the code block that begins with QTCaptureDevice *videoDevice.
● Find a video input device, such as the iSight camera, and open it.
2. If you can’t find or open a video input device, try to find and open a muxed video input device, such as
a DV camera.
Note that in a muxed video, the audio and video tracks are mixed together.
if (!success) {
videoDevice = [QTCaptureDevice
defaultInputDeviceWithMediaType:QTMediaTypeMuxed];
success = [videoDevice open:&error];
}
if (!success) {
    videoDevice = nil;
}
if (videoDevice) {
3. Add support for audio from an audio input device to the capture session.
Note that you use the QTMediaTypeSound media type so that the capture session handles the chores of
capturing your audio stream.
if (!success) {
audioDevice = nil;
}
if (audioDevice) {
mCaptureAudioDeviceInput = [[QTCaptureDeviceInput alloc]
initWithDevice:audioDevice];
[mCaptureView setCaptureSession:mCaptureSession];
[mCaptureSession startRunning];
}
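The audio-device fragments above can be assembled into one block. The following sketch assumes that success and error are already declared earlier in the method, and that the audio input is attached with addInput:error:, mirroring the video-device pattern; the exact error handling is an assumption.

```objc
// Sketch of the audio-input block (Step 3), assembled from the
// fragments above. The addInput:error: call and the error handling
// are assumptions that mirror the video-device pattern.
QTCaptureDevice *audioDevice =
    [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeSound];
success = [audioDevice open:&error];
if (!success) {
    audioDevice = nil;
}
if (audioDevice) {
    mCaptureAudioDeviceInput =
        [[QTCaptureDeviceInput alloc] initWithDevice:audioDevice];
    success = [mCaptureSession addInput:mCaptureAudioDeviceInput
                                  error:&error];
    if (!success) {
        [[NSAlert alertWithError:error] runModal];
        return;
    }
}
```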
- (void)windowWillClose:(NSNotification *)notification
{
[mCaptureSession stopRunning];
if ([[mCaptureVideoDeviceInput device] isOpen])
[[mCaptureVideoDeviceInput device] close];
if ([[mCaptureAudioDeviceInput device] isOpen])
[[mCaptureAudioDeviceInput device] close];
}
- (void)dealloc
{
[mCaptureSession release];
[mCaptureVideoDeviceInput release];
[mCaptureAudioDeviceInput release];
[mCaptureMovieFileOutput release];
[super dealloc];
}
- (IBAction)startRecording:(id)sender
.
.
.
- (IBAction)stopRecording:(id)sender
}
9. Specify the output destination for your recorded media, in this case, a QuickTime movie.
● Insert the following blocks of code inside the startRecording: and stopRecording: actions shown
in the previous step.
{
[mCaptureMovieFileOutput recordToOutputFileURL:[NSURL
fileURLWithPath:@"/Users/Shared/My Recorded Movie.mov"]];
}
{
[mCaptureMovieFileOutput recordToOutputFileURL:nil];
}
- (void)captureOutput:(QTCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL forConnections:(NSArray
*)connections dueToError:(NSError *)error
{
[[NSWorkspace sharedWorkspace] openURL:outputFileURL];
}
@end
Now build and compile your extended QTKit recorder application. After you launch the application, you can
begin to capture audio from your iSight camera or audio/video from a DV camera. The output is again
recorded as a QuickTime movie and then automatically opened in QuickTime Player on your desktop.
CHAPTER 6

Creating a QTKit Stop Motion Application

Following the steps outlined in this chapter, you construct a simple stop motion recorder application that
lets you capture a live video feed, grab frames one at a time with great accuracy, and then record the output
of those frames to a QuickTime movie. You accomplish this with less than 100 lines of Objective-C code,
constructing the sample as you’ve done in previous chapters, in Xcode 3.2 and Interface Builder 3.2.
In building your stop motion recorder application, you work with the following three QTKit classes:
● QTCaptureDeviceInput. The input source for media devices, such as cameras and microphones.
The sample code described in this chapter does not support input from DV cameras, which are of type
QTMediaTypeMuxed, rather than QTMediaTypeVideo. To add support for DV cameras, read the chapter
“Extending the Media Player Application” (page 19).
Just as you’ve done in the section “Prototype the Recorder” (page 35), start by creating a rough sketch of
your QTKit stop motion recorder application. Think, again, of what design elements you want to incorporate
into the application.
[Prototype sketch: a QTCaptureView, a QTMovieView, and an Add Frame control button]
In this design prototype, you start with three simple objects: a capture view, a QuickTime movie view, and
a single button to add frames. These will be the building blocks for your application. You can add more
complexity to the design later on. After you’ve sketched out your prototype, think how to hook up the objects
in Interface Builder and what code you need in your Xcode project to make this happen. Note that you need
to add a movie controller to the QTMovieView object in the illustration above.
2. When the new project window appears, select Cocoa Document-based Application and click Choose.
3. Name the project StopMotion and navigate to the location where you want the Xcode application to
create the project folder.
4. From the Action menu in your Xcode project, choose Add > Add to Existing Frameworks.
Add the QTKit framework to your StopMotion project, which resides in the
/System/Library/Frameworks directory.
5. Now add the Quartz Core framework to your project, which also resides in the
/System/Library/Frameworks directory.
● Select QuartzCore.framework, and click Add when the Add To Targets window appears.
This completes the first sequence of steps in your project. In the next sequence, you define actions and
outlets in Xcode before working with Interface Builder.
Because you’ve already prototyped your QTKit stop motion recorder application, at least in rough form
with a clearly defined data model, you can now determine which actions and outlets need to be
implemented. In this case, you have a QTCaptureView object (a subclass of NSView), a QTMovieView
object to display your captured frames, and one button to record your captured media content and add each
single frame to your QuickTime movie output.
In the file, delete the Cocoa import statement and replace it with the QTKit import statement.
#import <QTKit/QTKit.h>
2. Open the MyDocument.m implementation file in your project. Delete the contents of the file except for
the following lines of code:
#import "MyDocument.h"
@implementation MyDocument
- (NSString *)windowNibName
{
return @"MyDocument";
}
@end
1. In your MyDocument.h file, add the instance variables mCaptureView and mMovieView.
- (IBAction)addFrame:(id)sender;
4. Now open your MyDocument.m file and add the same action method, followed by braces for the code
you add later to implement this action.
- (IBAction)addFrame:(id)sender
{
}
This completes the second stage of your project. Be sure that you have saved both your declaration and
implementation files, so that the actions and outlets you declared are synchronously updated in your Interface
Builder nib. In the next phase, you use Interface Builder 3.2 to construct the user interface for your project.
1. Open Interface Builder 3.2 and click the MyDocument.xib file in your Xcode project window.
2. In Interface Builder 3.2, select Tools > Library to open a library of objects.
The QTCaptureView object provides you with an instance of a view subclass to display a preview of
the video output that is captured by a capture session.
● Drag the QTCaptureView object into your window and resize the object to fit the window, allowing
room at the bottom for your Add Frame button (already shown in the illustration below) and to the
right for your QTMovieView object in your QTKit stop motion recorder application.
● Choose Tools > Inspector. In the Identity panel, select the information (“i”) icon. Click in the Class
field and your QTCaptureView object appears.
● Set the autosizing for the object in the Capture View Size panel.
3. Now repeat the same sequence described in Step 2 to add your QTMovieView object to your window
(already shown above).
● Select the QTMovieView object (symbolized by the blue Q) and drag it into your Window next to
the QTCaptureView object, shown below.
● Choose Tools > Inspector. In the Identity Inspector, select the information (“i”) icon. Click in the Class
field and your QTMovieView object appears.
● Set the autosizing for your QTMovieView object, as you did for the QTCaptureView object in the
steps above.
5. Define the window size you want in your MyDocument.xib by selecting the size icon (symbolized by a
yellow ruler) in the Window Size panel.
6. Specify the delegate outlet connections of File’s Owner in the Window Connections panel.
7. In the Library, select the Push Button control and drag it to the window.
● Set the autosizing for the button at the center and right outside corner, leaving the inside struts
untouched, as shown in the illustration.
● Note that the green light at the left corner of your StopMotion.xcodeproj is turned on, indicating
that Xcode and Interface Builder have synchronized the actions and outlets in your project.
In the last phase of your project, after adding and completing the implementation code, you’re ready to
capture single-frame video, using your stop motion recorder application, as shown below.
To complete the implementation of the MyDocument class, you define the instance variables that point to
the capture session, as well as to the device input and decompressed video output objects.
1. In your Xcode project, add the instance variables to the interface declaration.
The mMovie instance variable points to the QTMovie object, and the mCaptureSession instance variable
points to the QTCaptureSession object. Likewise, the mCaptureDeviceInput instance variable
points to the QTCaptureDeviceInput object, and the next line declares that the
mCaptureDecompressedVideoOutput instance variable points to the
QTCaptureDecompressedVideoOutput object.
CVImageBufferRef mCurrentImageBuffer;
This instance variable stores the most recent frame that you’ve grabbed in a CVImageBufferRef.
That completes the code you need to add to your MyDocument.h file.
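Putting the pieces together, the MyDocument.h declaration described above might look like this sketch; the exact layout in the original sample may differ slightly.

```objc
// Sketch of MyDocument.h, assembled from the instance variables and
// action described in this chapter.
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

@interface MyDocument : NSDocument
{
    IBOutlet QTCaptureView *mCaptureView;
    IBOutlet QTMovieView *mMovieView;

    QTMovie *mMovie;
    QTCaptureSession *mCaptureSession;
    QTCaptureDeviceInput *mCaptureDeviceInput;
    QTCaptureDecompressedVideoOutput *mCaptureDecompressedVideoOutput;

    // Stores the most recent frame grabbed from the video output.
    CVImageBufferRef mCurrentImageBuffer;
}

- (IBAction)addFrame:(id)sender;

@end
```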
4. Create an empty movie that writes to mutable data in memory, using the initToWritableData:
method.
- (void)windowControllerDidLoadNib:(NSWindowController *) aController
{
NSError *error = nil;
[super windowControllerDidLoadNib:aController];
[[aController window] setDelegate:self];
if (!mMovie) {
mMovie = [[QTMovie alloc] initToWritableData:[NSMutableData data]
error:&error];
if (!mMovie) {
[[NSAlert alertWithError:error] runModal];
return;
}
}
5. Set up a capture session that outputs the raw frames you want to grab.
[mMovieView setMovie:mMovie];
if (!mCaptureSession) {
BOOL success;
mCaptureSession = [[QTCaptureSession alloc] init];
6. Find a video device and add a device input for that device to the capture session.
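Step 6 follows the same device-input pattern used in the MyRecorder chapter. The following is a sketch; it assumes that success, error, and mCaptureSession were declared earlier in windowControllerDidLoadNib, and the error handling mirrors the surrounding code.

```objc
// Sketch of Step 6: find a video device, open it, and add a device
// input for it to the capture session. Error handling is an assumption.
QTCaptureDevice *device =
    [QTCaptureDevice defaultInputDeviceWithMediaType:QTMediaTypeVideo];
success = [device open:&error];
if (!success) {
    [[NSAlert alertWithError:error] runModal];
    return;
}
mCaptureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice:device];
success = [mCaptureSession addInput:mCaptureDeviceInput error:&error];
if (!success) {
    [[NSAlert alertWithError:error] runModal];
    return;
}
```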
7. Add a decompressed video output that returns the raw frames you’ve grabbed to the session and then
previews the video from the session in the document window.
mCaptureDecompressedVideoOutput = [[QTCaptureDecompressedVideoOutput
alloc] init];
[mCaptureDecompressedVideoOutput setDelegate:self];
success = [mCaptureSession addOutput:mCaptureDecompressedVideoOutput
error:&error];
if (!success) {
    [[NSAlert alertWithError:error] runModal];
    return;
}
8. Preview the video from the session in the document window by associating the capture view with the session.
[mCaptureView setCaptureSession:mCaptureSession];
9. Start the session, using the startRunning method you’ve used previously in the MyRecorder sample
code.
[mCaptureSession startRunning];
}
}
10. Implement the delegate method that's called whenever the decompressed video output receives a new frame.
- (void)captureOutput:(QTCaptureOutput *)captureOutput
didOutputVideoFrame:(CVImageBufferRef)videoFrame withSampleBuffer:(QTSampleBuffer
*)sampleBuffer fromConnection:(QTCaptureConnection *)connection
{
11. Store the latest frame. Do this in a @synchronized block because the delegate method is not called
on the main thread.
CVImageBufferRef imageBufferToRelease;
CVBufferRetain(videoFrame);
@synchronized (self) {
imageBufferToRelease = mCurrentImageBuffer;
mCurrentImageBuffer = videoFrame;
}
CVBufferRelease(imageBufferToRelease);
}
12. Handle window closing notifications for your device input and stop the capture session.
- (void)windowWillClose:(NSNotification *)notification
{
[mCaptureSession stopRunning];
QTCaptureDevice *device = [mCaptureDeviceInput device];
if ([device isOpen])
[device close];
}
- (void)dealloc
{
[mMovie release];
[mCaptureSession release];
[mCaptureDeviceInput release];
[mCaptureDecompressedVideoOutput release];
[super dealloc];
}
14. Specify the output destination for your recorded media, in this case an editable QuickTime movie.
15. Add the addFrame: action method that you specified previously in your implementation file. This enables
you to get the most recent frame that you’ve grabbed. Do this in a @synchronized block because the
delegate method that sets the most recent frame is not called on the main thread. Note that you’re
wrapping a CVImageBufferRef object into an NSImage. After you create an NSImage, you can then
add it to the movie.
- (IBAction)addFrame:(id)sender
{
CVImageBufferRef imageBuffer;
@synchronized (self) {
imageBuffer = CVBufferRetain(mCurrentImageBuffer);
}
if (imageBuffer) {
NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage
imageWithCVImageBuffer:imageBuffer]];
NSImage *image = [[[NSImage alloc] initWithSize:[imageRep size]]
autorelease];
[image addRepresentation:imageRep];
CVBufferRelease(imageBuffer);
[mMovie addImage:image forDuration:QTMakeTime(1, 10)
withAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
@"jpeg", QTAddImageCodecType, nil]];
[mMovie setCurrentTime:[mMovie duration]];
[mMovieView setNeedsDisplay:YES];
[self updateChangeCount:NSChangeDone];
}
}
After you’ve saved your project, click Build and Go. After compiling, click the Add Frame button to record
each captured frame and output that frame to a QuickTime movie. The output of your captured session is
saved as a QuickTime movie.
Now you can begin capturing and recording with your QTKit stop motion recorder application. Typically,
you record a series of frames illustrating movement or action, using clay objects or stick figures, for
example; when played back in sequence, the frames create the illusion of motion in a movie or animated
sequence. This technique is commonly used to animate inanimate objects into a story or narrative.
Summary
In this chapter, you focused on building a stop motion recorder application using three capture objects:
QTCaptureSession, QTCaptureDecompressedVideoOutput and QTCaptureDeviceInput. These were
the essential building blocks for your project. You learned how to:
● Prototype and design the data model and controls for your project in a rough sketch before constructing
the actual user interface.
● Create the user interface using the QTCaptureView plug-in from the Interface Builder library of controls.
● Wire up the Add Frame control button for your user interface.
● Define the instance variables that point to the capture session, as well as to the device input and
decompressed video output objects.
● Build and run the stop motion recorder application, using the iSight camera as your input device.
● Output the single captured frames to a QuickTime movie for animated effects.
REVISION HISTORY

Date          Notes
2010-02-24    Added new section on supporting movie file type ownership in the MyMediaPlayer application.
2009-08-19    New tutorial document describing how you can build, enhance and customize a media player application using the QTKit framework.