Advanced Media Manipulation
    with AV Foundation
    Chris Adamson — @invalidname — http://www.subfurther.com/blog

    Voices That Matter iPhone Developer Conference — March 10, 2011


Sunday, April 10, 2011
The Deal



    ✤    Slides will be posted to the VTM conference site and to
         http://www.slideshare.com/invalidname

    ✤    Code will be posted to blog at http://www.subfurther.com/blog

    ✤    Don’t try to transcribe the code examples




No, really

    ✤    Seriously, don’t try to transcribe the code examples

    ✤    You will never keep up

    ✤    AV Foundation has the longest class and method names you have
         ever seen:

          ✤    AVMutableVideoCompositionLayerInstruction

          ✤    AVAssetWriterInputPixelBufferAdaptor

          ✤    etc.

Really, really, seriously… don’t



AVMutableVideoCompositionLayerInstruction *aInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack: trackA];
[aInstruction setOpacityRampFromStartOpacity:0.0
                                toEndOpacity:1.0
                                   timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, VIDEO_TIME_SCALE),
                                                             CMTimeMakeWithSeconds(6.0, VIDEO_TIME_SCALE))];




The Road Map

    ✤    VTM Philly recap

          ✤    Assets, playback, and capture

    ✤    Reading samples with AVAssetReader

    ✤    Writing samples with AVAssetWriter

    ✤    Editing with effects

    ✤    Capture callbacks


Last Time at #vtm_iphone…




Introduction to AV Foundation




iOS 4 Media Frameworks

                         Core Audio / OpenAL   Low-level audio streaming

                         Media Player          iPod library search/playback

                         AV Foundation         Audio / video capture, editing, playback, export…

                         Core Video            Quartz effects on moving images

                         Core Media            Objects for representing media times, formats, buffers



Size is relative

                                   AV Foundation   android.media   QT Kit   QuickTime for Java*

                      Classes           61              40           24            576

                      Methods          500+            280          360          >10,000

                     * – QTJ is used here only as an OO proxy for the procedural QuickTime API
How do media frameworks work?




“Boom Box” APIs

                    ✤    Simple API for playback, sometimes
                         recording

                    ✤    Little or no support for editing,
                         mixing, metadata, etc.

                    ✤    Example: HTML 5 <audio> and
                         <video> tags, iOS Media Player
                         framework




“Streaming” APIs


                    ✤    Use “stream of audio” metaphor

                    ✤    Strong support for mixing, effects,
                         other real-time operations

                    ✤    Example: Core Audio
                         and AV Foundation (capture)




“Document” APIs


                    ✤    Use “media document” metaphor

                    ✤    Strong support for editing

                    ✤    Mixing may be a special case of
                         editing

                    ✤    Example: QuickTime
                         and AV Foundation (playback and editing)



AV Foundation Classes


    ✤    Capture

    ✤    Assets and compositions

          ✤    Playback, editing, and export

    ✤    Legacy classes




AVAsset

    ✤    A collection of time-based media data

          ✤    Sound, video, text (closed captions, subtitles, etc.)

    ✤    Each distinct media type is contained in a track

    ✤    An asset represents the arrangement of the tracks. Tracks are pointers
         to source media, plus metadata (i.e., what parts of the source to use; a
         gain or opacity to apply, etc.)

    ✤    Asset ≠ media. Track ≠ media. Media = media.

    ✤    Asset also contains metadata (where common to all tracks)

AVAsset subclasses


    ✤    AVURLAsset — An asset created from a URL, such as a song or
         movie file or network document/stream

    ✤    AVComposition — An asset created from assets in multiple files, used
         to combine and present media together.

          ✤    Used for editing




AVPlayer

    ✤    Provides the ability to play an asset

          ✤    play, pause, seekToTime: methods; currentTime, rate properties

    ✤    Init with URL or with AVPlayerItem

          NSURL *url = [NSURL URLWithString:
                 @"http://www.subfurther.com/video/running-start-iphone.m4v"];
          AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url
                                                  options:nil];
          AVPlayerItem *playerItem = [AVPlayerItem
                              playerItemWithAsset:asset];
          player = [[AVPlayer playerWithPlayerItem:playerItem]
                              retain];

AVPlayerLayer (or not)

    ✤    CALayer used to display video from a player

          ✤    Check that the media has video

              NSArray *visualTracks = [asset tracksWithMediaCharacteristic:
                                  AVMediaCharacteristicVisual];
              if ((!visualTracks) ||
                  ([visualTracks count] == 0)) {
                  playerView.hidden = YES;
                  noVideoLabel.hidden = NO;
              }




AVPlayerLayer (no really)

    ✤    If you have video, create AVPlayerLayer from AVPlayer.

    ✤    Set bounds and video “gravity” (bounds-filling behavior)

            else {
                playerView.hidden = NO;
                noVideoLabel.hidden = YES;
                AVPlayerLayer *playerLayer = [AVPlayerLayer
                                    playerLayerWithPlayer:player];
                [playerView.layer addSublayer:playerLayer];
                playerLayer.frame = playerView.layer.bounds;
                playerLayer.videoGravity =
                                    AVLayerVideoGravityResizeAspect;
            }



Demo
    VTM_AVPlayer




Media Capture


    ✤    AV Foundation capture classes for audio / video capture, along with
         still image capture

          ✤    Programmatic control of white balance, autofocus, zoom, etc.

    ✤    Does not exist on the simulator. AV Foundation capture apps can
         only be compiled for and run on the device.

    ✤    API design is borrowed from QTKit on the Mac



Capture Classes Seem Familiar?
                         QT Kit                 AV Foundation
          QTCaptureAudioPreviewOutput        AVCaptureAudioDataOutput
          QTCaptureConnection                AVCaptureConnection
          QTCaptureDecompressedAudioOutput   AVCaptureDevice
          QTCaptureDecompressedVideoOutput   AVCaptureFileOutput
          QTCaptureDevice                    AVCaptureInput
          QTCaptureDeviceInput               AVCaptureMovieFileOutput
          QTCaptureFileOutput                AVCaptureOutput
          QTCaptureInput                     AVCaptureSession
          QTCaptureLayer                     AVCaptureStillImageOutput
          QTCaptureMovieFileOutput           AVCaptureVideoDataOutput
          QTCaptureOutput                    AVCaptureVideoPreviewLayer
          QTCaptureSession
          QTCaptureVideoPreviewOutput
          QTCaptureView


Capture basics

    ✤    Create an AVCaptureSession to coordinate the capture

    ✤    Investigate available AVCaptureDevices

    ✤    Create AVCaptureDeviceInput and connect it to the session

    ✤    Optional: set up an AVCaptureVideoPreviewLayer

    ✤    Optional: connect AVCaptureOutputs

    ✤    Tell the session to start recording


Getting capture device and input

        AVCaptureDevice *videoDevice = [AVCaptureDevice
                  defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (videoDevice) {
            NSLog (@"got videoDevice");
            AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput
                                deviceInputWithDevice:videoDevice
                                                error:&setUpError];
            if (videoInput) {
                [captureSession addInput:videoInput];
            }
        }


          Note 1: You may also want to check for AVMediaTypeMuxed
          Note 2: Do not assume devices based on model (c.f. iPad
          Camera Connection Kit)

Creating a video preview layer

         AVCaptureVideoPreviewLayer *previewLayer =
                          [AVCaptureVideoPreviewLayer
                          layerWithSession:captureSession];
         previewLayer.frame = captureView.layer.bounds;
         previewLayer.videoGravity =
                          AVLayerVideoGravityResizeAspect;
         [captureView.layer addSublayer:previewLayer];




                         Keep in mind that the iPhone cameras have a
                         portrait orientation

Setting an output

      captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init];
      if (! captureMovieURL) {
          captureMoviePath = [getCaptureMoviePath() retain];
          captureMovieURL = [[NSURL alloc]
                        initFileURLWithPath:captureMoviePath];
      }
      NSLog (@"recording to %@", captureMovieURL);
      [captureSession addOutput:captureMovieOutput];




                         We’ll use the captureMovieURL later…


Start capturing


              [captureSession startRunning];
              recordButton.selected = YES;
              if ([[NSFileManager defaultManager]
                   fileExistsAtPath:captureMoviePath]) {
                  [[NSFileManager defaultManager]
                   removeItemAtPath:captureMoviePath error:nil];
              }
              // note: must have a delegate
              [captureMovieOutput
                  startRecordingToOutputFileURL:captureMovieURL
                              recordingDelegate:self];




Capture delegate callbacks

       - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
            didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
            fromConnections:(NSArray *)connections {
           NSLog (@"started recording to %@", fileURL);
       }

       - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
            didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
            fromConnections:(NSArray *)connections
            error:(NSError *)error {
           if (error) {
               NSLog (@"failed to record: %@", error);
           } else {
               NSLog (@"finished recording to %@", outputFileURL);
           }
       }

Demo
    VTM_AVRecPlay




Let’s get advanced




Prerequisite: Core Media




Core Media

    ✤    C-based framework containing structures that represent media
         samples and media timing

          ✤    Opaque types: CMBlockBuffer, CMBufferQueue,
               CMFormatDescription, CMSampleBuffer, CMTime, CMTimeRange

          ✤    Handful of convenience functions to work with these

    ✤    Buffer types provide wrappers around possibly-fragmented memory,
         time types provide timing at arbitrary precision



CMTime

    ✤    CMTime contains a value and a timescale (similar to QuickTime)

    ✤    Time scale is how the time is measured: “nths of a second”

          ✤    Time in seconds = value / timescale

          ✤    Allows for exact timing of any kind of media

    ✤    Different tracks of an asset can and will have different timescales

          ✤    Convert with CMTimeConvertScale()


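The value/timescale arithmetic above can be sketched in plain C. (The `Time` struct and `time_convert_scale` here are illustrative stand-ins, not the real Core Media API; the real types and CMTimeConvertScale() live in CoreMedia/CMTime.h, and the rounding policy below is an assumption — CMTimeConvertScale takes an explicit rounding method.)

```c
/* Illustrative stand-in for CMTime: a count of ticks plus a timescale
   saying how many ticks make up one second. */
typedef struct {
    long long value;   /* ticks */
    int timescale;     /* ticks per second */
} Time;

/* Time in seconds = value / timescale */
double time_seconds(Time t) {
    return (double)t.value / t.timescale;
}

/* Re-express a time in another timescale, rounding to the nearest tick. */
Time time_convert_scale(Time t, int newScale) {
    Time out;
    out.value = (t.value * newScale + t.timescale / 2) / t.timescale;
    out.timescale = newScale;
    return out;
}
```

For example, 3 seconds in QuickTime's classic 600 timescale is {1800, 600}; converting it to an audio track's 44100 timescale yields {132300, 44100} — the same instant, measured in finer ticks.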
Writing Samples




AVAssetWriter


    ✤    Introduced in iOS 4.1

    ✤    Allows you to create samples programmatically and write them to an
         asset

    ✤    Used for synthesized media files: screen recording, CGI, synthesized
         audio, etc.




Using AVAssetWriter

    ✤    Create an AVAssetWriter

    ✤    Create and configure an AVAssetWriterInput and connect it to the
         writer

    ✤    -[AVAssetWriter startWriting]

    ✤    Repeatedly call -[AVAssetWriterInput appendSampleBuffer:] with
         CMSampleBufferRefs

          ✤    Set expectsMediaDataInRealTime appropriately, and honor the
               readyForMoreMediaData property.

Example: iOS Screen Recorder


    ✤    Set up an AVAssetWriter to write to a QuickTime movie file, and an
         AVAssetWriterInput with codec and other video track metadata

    ✤    Set up an AVAssetWriterInputPixelBufferAdaptor to simplify converting
         CGImageRefs into CMSampleBufferRefs

    ✤    Use an NSTimer to periodically grab the screen image and use the
         AVAssetWriterInputPixelBufferAdaptor to write to the AVAssetWriterInput




Create writer, writer input, and
    pixel buffer adaptor
   assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];
   NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
   assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
   assetWriterInput.expectsMediaDataInRealTime = YES;
   [assetWriter addInput:assetWriterInput];

   assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                 initWithAssetWriterInput:assetWriterInput
                                 sourcePixelBufferAttributes:nil];
   [assetWriter startWriting];


                          Settings keys and values are defined in AVAudioSettings.h
                         and AVVideoSettings.h, or AV Foundation Constants Reference

Getting a screenshot




Create a pixel buffer

         // get screenshot image!
         CGImageRef image = (CGImageRef) [[self screenshot] CGImage];
         NSLog (@"made screenshot");

         // prepare the pixel buffer
         CVPixelBufferRef pixelBuffer = NULL;
         CFDataRef imageData= CGDataProviderCopyData(CGImageGetDataProvider(image));
         NSLog (@"copied image data");
         cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                              FRAME_WIDTH,
                                              FRAME_HEIGHT,
                                              kCVPixelFormatType_32BGRA,
                                              (void*)CFDataGetBytePtr(imageData),
                                              CGImageGetBytesPerRow(image),
                                              NULL,
                                              NULL,
                                              NULL,
                                              &pixelBuffer);
         NSLog (@"CVPixelBufferCreateWithBytes returned %d", cvErr);



Calculate time and write sample


               // calculate the time
               CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
               CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
               NSLog (@"elapsedTime: %f", elapsedTime);
               CMTime presentationTime = CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);

               // write the sample
               BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                                           withPresentationTime:presentationTime];




Demo
    VTM_ScreenRecorderTest




Reading Samples




AVAssetReader


    ✤    Introduced in iOS 4.1

    ✤    Possible uses:

          ✤    Showing an audio wave form in a timeline

          ✤    Generating frame-accurate thumbnails




Using AVAssetReader


    ✤    Create an AVAssetReader

    ✤    Create and configure an AVAssetReaderOutput

          ✤    Three concrete subclasses: AVAssetReaderTrackOutput,
               AVAssetReaderAudioMixOutput, and
               AVAssetReaderVideoCompositionOutput.

    ✤    Get data with -[AVAssetReaderOutput copyNextSampleBuffer]



Example: Convert iPod song to
    PCM


    ✤    In iOS 4, Media Framework exposes a new metadata property,
         MPMediaItemPropertyAssetURL, that allows AV Foundation to open
         the library item as an AVAsset

    ✤    Create an AVAssetReader to read sample buffers from the song

    ✤    Create an AVAssetWriter to convert and write PCM samples




Coordinated reading/writing



    ✤    You can provide a block to -[AVAssetWriterInput
         requestMediaDataWhenReadyOnQueue:usingBlock:]

          ✤    Only perform your asset reads / writes when the writer is ready.

    ✤    In this example, AVAssetWriterInput.expectsMediaDataInRealTime is NO




Set up reader, reader output,
    writer
                    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
                    AVURLAsset *songAsset =
                        [AVURLAsset URLAssetWithURL:assetURL options:nil];

                    NSError *assetError = nil;
                    AVAssetReader *assetReader =
                        [[AVAssetReader assetReaderWithAsset:songAsset
                               error:&assetError]
                          retain];

                    AVAssetReaderOutput *assetReaderOutput =
                        [[AVAssetReaderAudioMixOutput
                          assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
                                    audioSettings: nil]
                        retain];
                    [assetReader addOutput: assetReaderOutput];
                    AVAssetWriter *assetWriter =
                        [[AVAssetWriter assetWriterWithURL:exportURL
                                                  fileType:AVFileTypeCoreAudioFormat
                                                      error:&assetError]
                          retain];


Set up writer input
                 AudioChannelLayout channelLayout;
                 memset(&channelLayout, 0, sizeof(AudioChannelLayout));
                 channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
                 NSDictionary *outputSettings =
                 [NSDictionary dictionaryWithObjectsAndKeys:
                     [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                     [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                     [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                     [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)],
                         AVChannelLayoutKey,
                     [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                     [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                     [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey,
                     [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                     nil];
                 AVAssetWriterInput *assetWriterInput =
                     [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                 outputSettings:outputSettings]
                     retain];

                    Note 1: Many of these settings are required, but you won’t know which until you get a runtime error.
                                              Note 2: AudioChannelLayout is from Core Audio
Start reading and writing



                         [assetWriter startWriting];
                         [assetReader startReading];
                         AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
                         CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
                         [assetWriter startSessionAtSourceTime: startTime];




Read only when writer is ready

       __block UInt64 convertedByteCount = 0;
       dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
       [assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue
                                               usingBlock: ^
        {
            while (assetWriterInput.readyForMoreMediaData) {
               CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
               if (nextBuffer) {
                   // append buffer
                   [assetWriterInput appendSampleBuffer: nextBuffer];
                   convertedByteCount += CMSampleBufferGetTotalSampleSize (nextBuffer);
                   CFRelease (nextBuffer); // copyNextSampleBuffer follows the Copy rule
                   // update UI on main thread only
                   NSNumber *convertedByteCountNumber =
                       [NSNumber numberWithLong:convertedByteCount];
                   [self performSelectorOnMainThread:@selector(updateSizeLabel:)
                                          withObject:convertedByteCountNumber
                                       waitUntilDone:NO];
               }
Close file when done
                 else {
                     // done!
                     [assetWriterInput markAsFinished];
                     [assetWriter finishWriting];
                     [assetReader cancelReading];
                     NSDictionary *outputFileAttributes = [[NSFileManager defaultManager]
                                                            attributesOfItemAtPath:exportPath
                                                            error:nil];
                     NSNumber *doneFileSize = [NSNumber numberWithLong:[outputFileAttributes fileSize]];
                     [self performSelectorOnMainThread:@selector(updateCompletedSizeLabel:)
                                            withObject:doneFileSize
                                         waitUntilDone:NO];
                     // release a lot of stuff
                     [assetReader release];
                     [assetReaderOutput release];
                     [assetWriter release];
                     [assetWriterInput release];
                     [exportPath release];
                     break;
                 }
          }
    }];
Demo
    VTM_AViPodReader




Media Editing




Video Editing? On iPhone?
    Really?
                1999:                                                    2010:
         Power Mac G4 500 AGP                                          iPhone 4




              CPU: 500 MHz G4                                      CPU: 800 MHz Apple A4
              RAM: 256 MB                                          RAM: 512 MB
              Storage: 20 GB HDD                                   Storage: 16 GB Flash
                              Comparison specs from everymac.com
AVComposition


    ✤    An AVAsset that gets its tracks from multiple file-based sources

    ✤    To create a movie, you typically use an AVMutableComposition


                 composition = [[AVMutableComposition alloc] init];




Copying from another asset

    ✤    -[AVMutableComposition insertTimeRange:ofAsset:atTime:error:]


         CMTime inTime = CMTimeMakeWithSeconds(inSeconds, 600);
         CMTime outTime = CMTimeMakeWithSeconds(outSeconds, 600);
         CMTime duration = CMTimeSubtract(outTime, inTime);
         CMTimeRange editRange = CMTimeRangeMake(inTime, duration);
         NSError *editError = nil;

         [targetController.composition insertTimeRange:editRange
                                               ofAsset:sourceAsset
                                                atTime:targetController.composition.duration
                                                 error:&editError];




Demo
    VTM_AVEditor




Editing With Effects




Multiple video tracks

    ✤    To combine multiple video sources into one movie, create an
         AVMutableComposition, then create AVMutableCompositionTracks

      // create composition
      self.composition = [[AVMutableComposition alloc] init];

      // create video tracks a and b
      // note: mediatypes are defined in AVMediaFormat.h
      [trackA release];
      trackA = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                             preferredTrackID:kCMPersistentTrackID_Invalid];
      [trackB release];
      trackB = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                             preferredTrackID:kCMPersistentTrackID_Invalid];

      // locate source video track
      AVAssetTrack *sourceVideoTrack = [[sourceVideoAsset tracksWithMediaType: AVMediaTypeVideo]
                                        objectAtIndex: 0];



A/B Roll Editing

    ✤    Apple recommends alternating between two tracks, rather than using
         arbitrarily many (e.g., one track per shot)




Sound tracks

    ✤    Treat your audio as separate tracks too.


         // create music track
         trackMusic = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
         CMTimeRange musicTrackTimeRange = CMTimeRangeMake(kCMTimeZero,
                                                            musicTrackAudioAsset.duration);
         NSError *trackMusicError = nil;
         [trackMusic insertTimeRange:musicTrackTimeRange
                             ofTrack:[musicTrackAudioAsset.tracks objectAtIndex:0]
                               atTime:kCMTimeZero
                                error:&trackMusicError];




Empty ranges

    ✤    Use -[AVMutableCompositionTrack insertEmptyTimeRange:] to
         account for any part of any track where you won’t be inserting media
         segments.

          CMTime videoTracksTime = CMTimeMake(0, VIDEO_TIME_SCALE);
          CMTime postEditTime = CMTimeAdd (videoTracksTime,
                                           CMTimeMakeWithSeconds(FIRST_CUT_TRACK_A_IN_TIME,
                                                                 VIDEO_TIME_SCALE));
          [trackA insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, postEditTime)];
          videoTracksTime = postEditTime;




Track-level inserts

    ✤    Insert media segments with -[AVMutableCompositionTrack
         insertTimeRange:ofTrack:atTime:error:]


          postEditTime = CMTimeAdd (videoTracksTime, CMTimeMakeWithSeconds(FIRST_CUT_DURATION,
                                                                           VIDEO_TIME_SCALE));
          CMTimeRange firstShotRange = CMTimeRangeMake(kCMTimeZero,
                                                       CMTimeMakeWithSeconds(FIRST_CUT_DURATION,
                                                                             VIDEO_TIME_SCALE));
          [trackA insertTimeRange:firstShotRange
                          ofTrack:sourceVideoTrack
                           atTime:videoTracksTime
                            error:&performError];
          videoTracksTime = postEditTime;




Sunday, April 10, 2011
AVVideoComposition


    ✤    Describes how multiple video tracks are to be composited together.
         Mutable version is AVMutableVideoComposition

          ✤    Not a subclass of AVComposition!

    ✤    Contains an array of AVVideoCompositionInstruction objects

          ✤    The time ranges of these instructions must not overlap or leave
               gaps, and together must span the full duration of the AVComposition



Sunday, April 10, 2011
AVVideoCompositionInstruction


    ✤    Represents video compositor instructions for all tracks in one time
         range

    ✤    Its per-track instructions are carried in a layerInstructions property

    ✤    Of course, you’ll be creating an
         AVMutableVideoCompositionInstruction




Sunday, April 10, 2011
AVVideoCompositionLayerInstruction (yes, really)
    ✤    Identifies the instructions for one track within an
         AVVideoCompositionInstruction.

    ✤    AVMutableVideoCompositionLayerInstruction. I warned you about
         this back on slide 3.

    ✤    Currently supports two properties: opacity and affine transform.
         Animating (“ramping”) these creates fades/cross-dissolves and
         pushes.

          ✤    e.g., -[AVMutableVideoCompositionLayerInstruction
               setOpacityRampFromStartOpacity:toEndOpacity:timeRange:]

Sunday, April 10, 2011
An AVVideoCompositionInstruction

 AVMutableVideoCompositionInstruction *transitionInstruction =
     [AVMutableVideoCompositionInstruction videoCompositionInstruction];
 transitionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
 AVMutableVideoCompositionLayerInstruction *aInstruction =
     [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:
     trackA];
 [aInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0
                                timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, VIDEO_TIME_SCALE),
                                                           CMTimeMakeWithSeconds(6.0, VIDEO_TIME_SCALE))];
 AVMutableVideoCompositionLayerInstruction *bInstruction =
      [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:
      trackB];
 [bInstruction setOpacity:0 atTime:kCMTimeZero];
 transitionInstruction.layerInstructions = [NSArray arrayWithObjects:aInstruction, bInstruction, nil];
 [videoInstructions addObject: transitionInstruction];




Sunday, April 10, 2011
Attaching the instructions



                  AVMutableVideoComposition *videoComposition =
                       [AVMutableVideoComposition videoComposition];
                  videoComposition.instructions = videoInstructions;
                  videoComposition.renderSize = videoSize;
                  videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
                  compositionPlayer.currentItem.videoComposition = videoComposition;




Sunday, April 10, 2011
Titles and Effects


    ✤    AVSynchronizedLayer gives you a CALayer that gets its timing from
         an AVPlayerItem, rather than a wall clock

          ✤    Run the movie slowly or backwards, the animation runs slowly or
               backwards

    ✤    Can add other CALayers as sublayers and they’ll all get their timing
         from the AVPlayerItem




Sunday, April 10, 2011
Creating a main title layer

        // synchronized layer to own all the title layers
        AVSynchronizedLayer *synchronizedLayer =
             [AVSynchronizedLayer synchronizedLayerWithPlayerItem:compositionPlayer.currentItem];
        synchronizedLayer.frame = [compositionView frame];
        [self.view.layer addSublayer:synchronizedLayer];

        // main titles
        CATextLayer *mainTitleLayer = [CATextLayer layer];
        mainTitleLayer.string = NSLocalizedString(@"Running Start", nil);
        mainTitleLayer.font = @"Verdana-Bold";
        mainTitleLayer.fontSize = videoSize.height / 8;
        mainTitleLayer.foregroundColor = [[UIColor yellowColor] CGColor];
        mainTitleLayer.alignmentMode = kCAAlignmentCenter;
        mainTitleLayer.frame = CGRectMake(0.0, 0.0, videoSize.width, videoSize.height);
        mainTitleLayer.opacity = 0.0; // initially invisible
        [synchronizedLayer addSublayer:mainTitleLayer];




Sunday, April 10, 2011
Adding an animation


            // main title opacity animation
            [CATransaction begin];
            [CATransaction setDisableActions:YES];
            CABasicAnimation *mainTitleInAnimation =
                 [CABasicAnimation animationWithKeyPath:@"opacity"];
            mainTitleInAnimation.fromValue = [NSNumber numberWithFloat: 0.0];
            mainTitleInAnimation.toValue = [NSNumber numberWithFloat: 1.0];
            mainTitleInAnimation.removedOnCompletion = NO;
            mainTitleInAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
            mainTitleInAnimation.duration = 5.0;
            [mainTitleLayer addAnimation:mainTitleInAnimation forKey:@"in-animation"];




             Nasty gotcha: AVCoreAnimationBeginTimeAtZero is a special value that is used for AVF
                 animations, since 0 would otherwise be interpreted as CACurrentMediaTime()

Sunday, April 10, 2011
Demo
    VTM_AVEditor




Sunday, April 10, 2011
Multi-track audio



    ✤    AVPlayerItem.audioMix property

          ✤    AVAudioMix class describes how multiple audio tracks are to be
               mixed together

          ✤    Analogous to videoComposition property (AVVideoComposition)




Sunday, April 10, 2011
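To make the analogy concrete, here is a sketch of a two-second fade-in on the music track — trackMusic, compositionPlayer, and VIDEO_TIME_SCALE are assumptions carried over from the earlier composition examples:

```objc
// Fade the music track in over the first two seconds of the composition
AVMutableAudioMixInputParameters *musicParams =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:trackMusic];
[musicParams setVolumeRampFromStartVolume:0.0
                              toEndVolume:1.0
                                timeRange:CMTimeRangeMake(kCMTimeZero,
                                                          CMTimeMakeWithSeconds(2.0, VIDEO_TIME_SCALE))];

// The mix holds one input-parameters object per track being adjusted
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObject:musicParams];
compositionPlayer.currentItem.audioMix = audioMix;
```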
Basic Export

    ✤    Create an AVAssetExportSession

    ✤    Must set outputURL and outputFileType properties

          ✤    Inspect possible types with supportedFileTypes property (list of
               AVFileType… strings in docs)

    ✤    Begin export with exportAsynchronouslyWithCompletionHandler:

          ✤    This takes a block, which will be called on completion, failure,
               cancellation, etc.


Sunday, April 10, 2011
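A minimal export sketch, assuming the composition and an outputURL already exist:

```objc
// Create the session with an asset and a quality preset
AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

// The completion block fires for success, failure, and cancellation alike,
// so check the status
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog (@"export complete");
    } else {
        NSLog (@"export failed: %@", exportSession.error);
    }
}];
```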
Advanced Export


    ✤    AVAssetExportSession takes videoComposition and audioMix
         parameters, just like AVPlayerItem

    ✤    To include AVSynchronizedLayer-based animations in an export, use
         an AVVideoCompositionCoreAnimationTool and set it as the
         animationTool property of the AVMutableVideoComposition (but
         only for export)




Sunday, April 10, 2011
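A sketch of wiring up the animation tool for export — the parentLayer/videoLayer arrangement and the reuse of mainTitleLayer from the earlier title example are illustrative assumptions:

```objc
// Parent layer holds the composited video plus the title layers
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0.0, 0.0, videoSize.width, videoSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:mainTitleLayer];

// The tool renders the video into videoLayer, then composites parentLayer
videoComposition.animationTool =
    [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];
```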
Capture Callbacks




Sunday, April 10, 2011
More fun with capture


    ✤    Can analyze video data coming off the camera with the
         AVCaptureVideoDataOutput class

    ✤    Can provide uncompressed frames to your
         AVCaptureVideoDataOutputSampleBufferDelegate

    ✤    The callback provides you with a CMSampleBufferRef

    ✤    See WWDC 2010 AVCam example



Sunday, April 10, 2011
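A sketch of attaching a data output — captureSession is assumed to exist, and self is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate:

```objc
// Deliver frames to a serial queue so callbacks arrive in order
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t frameQueue = dispatch_queue_create("frame-queue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([captureSession canAddOutput:videoDataOutput]) {
    [captureSession addOutput:videoDataOutput];
}

// Delegate callback: one uncompressed frame per call
- (void) captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // analyze or draw the pixels here; don't hold the buffer longer than needed
}
```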
Hazards and Hassles




Sunday, April 10, 2011
Only effects are dissolve and push?




              How would we do this checkerboard wipe in AV Foundation?
                           It’s pretty easy in QuickTime!
Sunday, April 10, 2011
How do you…


    ✤    Save a composition to work on later?

          ✤    Even if AVMutableComposition supports NSCopying, what if
               you’ve got titles in an AVSynchronizedLayer?

    ✤    Support undo / redo of edits?

    ✤    Add import/export support for other formats and codecs?




Sunday, April 10, 2011
AV Foundation Sucks!


    ✤    Too hard to understand!

    ✤    Too many classes and methods!

    ✤    Verbose and obtuse method naming

          ✤    AVComposition and AVVideoComposition are completely
               unrelated? WTF, Apple?




Sunday, April 10, 2011
Sunday, April 10, 2011
Complex things usually aren’t easy

                         [2×2 chart: horizontal axis Simple → Complex,
                          vertical axis Easy → Hard]
Sunday, April 10, 2011
AV Foundation Rocks!

    ✤    Addresses a huge range of media functionality

          ✤    The other guys don’t even try

    ✤    Same framework used by Apple for iMovie for iPhone/iPad

    ✤    You can create functionality equivalent to iMovie / Final Cut in a few
         hundred lines of code

    ✤    Coming to Mac OS X in 10.7 (Lion)


Sunday, April 10, 2011
Q&A
    Chris Adamson — @invalidname — http://www.subfurther.com/blog

    Voices That Matter iPhone Developer Conference — March 10, 2011


Sunday, April 10, 2011
Also from Pearson!


    ✤    “Core Audio is serious black
         arts shit.” — Mike Lee (@bmf)

    ✤    It’s tangentially related to AV
         Foundation, so you should
         totally buy it when it comes
         out.




Sunday, April 10, 2011

More Related Content

Viewers also liked

AVFoundation @ TACOW 2013 05 14
AVFoundation @ TACOW 2013 05 14AVFoundation @ TACOW 2013 05 14
AVFoundation @ TACOW 2013 05 14Ryder Mackay
 
Active ldap の事例紹介
Active ldap の事例紹介Active ldap の事例紹介
Active ldap の事例紹介Kazuaki Takase
 
Video streaming on e-lab
Video streaming on e-labVideo streaming on e-lab
Video streaming on e-labrneto11
 
Starting Core Animation
Starting Core AnimationStarting Core Animation
Starting Core AnimationJohn Wilker
 
Differentiated Instruction Made Simple
Differentiated Instruction Made SimpleDifferentiated Instruction Made Simple
Differentiated Instruction Made SimpleJennifer Dorman
 
Core Animation
Core AnimationCore Animation
Core AnimationBob McCune
 
ソーシャルゲームにおけるMongoDB適用事例 - Animal Land
ソーシャルゲームにおけるMongoDB適用事例 - Animal LandソーシャルゲームにおけるMongoDB適用事例 - Animal Land
ソーシャルゲームにおけるMongoDB適用事例 - Animal LandMasakazu Matsushita
 
1 visual comunication pptx
1 visual comunication pptx1 visual comunication pptx
1 visual comunication pptxasun camarasa
 
iOS Developer Interview Questions
iOS Developer Interview QuestionsiOS Developer Interview Questions
iOS Developer Interview QuestionsClark Davidson
 
20 iOS developer interview questions
20 iOS developer interview questions20 iOS developer interview questions
20 iOS developer interview questionsArc & Codementor
 

Viewers also liked (14)

AVFoundation @ TACOW 2013 05 14
AVFoundation @ TACOW 2013 05 14AVFoundation @ TACOW 2013 05 14
AVFoundation @ TACOW 2013 05 14
 
Active ldap の事例紹介
Active ldap の事例紹介Active ldap の事例紹介
Active ldap の事例紹介
 
Video Editing in iOS
Video Editing in iOSVideo Editing in iOS
Video Editing in iOS
 
Video streaming on e-lab
Video streaming on e-labVideo streaming on e-lab
Video streaming on e-lab
 
Es ppt
Es pptEs ppt
Es ppt
 
Starting Core Animation
Starting Core AnimationStarting Core Animation
Starting Core Animation
 
Differentiated Instruction Made Simple
Differentiated Instruction Made SimpleDifferentiated Instruction Made Simple
Differentiated Instruction Made Simple
 
Stupid Video Tricks
Stupid Video TricksStupid Video Tricks
Stupid Video Tricks
 
Core Animation
Core AnimationCore Animation
Core Animation
 
ソーシャルゲームにおけるMongoDB適用事例 - Animal Land
ソーシャルゲームにおけるMongoDB適用事例 - Animal LandソーシャルゲームにおけるMongoDB適用事例 - Animal Land
ソーシャルゲームにおけるMongoDB適用事例 - Animal Land
 
Animation in iOS
Animation in iOSAnimation in iOS
Animation in iOS
 
1 visual comunication pptx
1 visual comunication pptx1 visual comunication pptx
1 visual comunication pptx
 
iOS Developer Interview Questions
iOS Developer Interview QuestionsiOS Developer Interview Questions
iOS Developer Interview Questions
 
20 iOS developer interview questions
20 iOS developer interview questions20 iOS developer interview questions
20 iOS developer interview questions
 

More from Chris Adamson

Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)Chris Adamson
 
Whatever Happened to Visual Novel Anime? (JAFAX 2018)
Whatever Happened to Visual Novel Anime? (JAFAX 2018)Whatever Happened to Visual Novel Anime? (JAFAX 2018)
Whatever Happened to Visual Novel Anime? (JAFAX 2018)Chris Adamson
 
Media Frameworks Versus Swift (Swift by Northwest, October 2017)
Media Frameworks Versus Swift (Swift by Northwest, October 2017)Media Frameworks Versus Swift (Swift by Northwest, October 2017)
Media Frameworks Versus Swift (Swift by Northwest, October 2017)Chris Adamson
 
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...Chris Adamson
 
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is FineCocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is FineChris Adamson
 
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...Chris Adamson
 
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)Chris Adamson
 
Firebase: Totally Not Parse All Over Again (Unless It Is)
Firebase: Totally Not Parse All Over Again (Unless It Is)Firebase: Totally Not Parse All Over Again (Unless It Is)
Firebase: Totally Not Parse All Over Again (Unless It Is)Chris Adamson
 
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)Chris Adamson
 
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)Chris Adamson
 
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)Video Killed the Rolex Star (CocoaConf Columbus, July 2015)
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)Chris Adamson
 
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...Chris Adamson
 
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014Chris Adamson
 
Stupid Video Tricks, CocoaConf Seattle 2014
Stupid Video Tricks, CocoaConf Seattle 2014Stupid Video Tricks, CocoaConf Seattle 2014
Stupid Video Tricks, CocoaConf Seattle 2014Chris Adamson
 
Stupid Video Tricks, CocoaConf Las Vegas
Stupid Video Tricks, CocoaConf Las VegasStupid Video Tricks, CocoaConf Las Vegas
Stupid Video Tricks, CocoaConf Las VegasChris Adamson
 
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)Chris Adamson
 
Stupid Video Tricks (CocoaConf DC, March 2014)
Stupid Video Tricks (CocoaConf DC, March 2014)Stupid Video Tricks (CocoaConf DC, March 2014)
Stupid Video Tricks (CocoaConf DC, March 2014)Chris Adamson
 
Introduction to the Roku SDK
Introduction to the Roku SDKIntroduction to the Roku SDK
Introduction to the Roku SDKChris Adamson
 
Get On The Audiobus (CocoaConf Atlanta, November 2013)
Get On The Audiobus (CocoaConf Atlanta, November 2013)Get On The Audiobus (CocoaConf Atlanta, November 2013)
Get On The Audiobus (CocoaConf Atlanta, November 2013)Chris Adamson
 
Get On The Audiobus (CocoaConf Boston, October 2013)
Get On The Audiobus (CocoaConf Boston, October 2013)Get On The Audiobus (CocoaConf Boston, October 2013)
Get On The Audiobus (CocoaConf Boston, October 2013)Chris Adamson
 

More from Chris Adamson (20)

Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018)
 
Whatever Happened to Visual Novel Anime? (JAFAX 2018)
Whatever Happened to Visual Novel Anime? (JAFAX 2018)Whatever Happened to Visual Novel Anime? (JAFAX 2018)
Whatever Happened to Visual Novel Anime? (JAFAX 2018)
 
Media Frameworks Versus Swift (Swift by Northwest, October 2017)
Media Frameworks Versus Swift (Swift by Northwest, October 2017)Media Frameworks Versus Swift (Swift by Northwest, October 2017)
Media Frameworks Versus Swift (Swift by Northwest, October 2017)
 
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo...
 
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is FineCocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine
 
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose...
 
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016)
 
Firebase: Totally Not Parse All Over Again (Unless It Is)
Firebase: Totally Not Parse All Over Again (Unless It Is)Firebase: Totally Not Parse All Over Again (Unless It Is)
Firebase: Totally Not Parse All Over Again (Unless It Is)
 
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016)
 
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)
Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)
 
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)Video Killed the Rolex Star (CocoaConf Columbus, July 2015)
Video Killed the Rolex Star (CocoaConf Columbus, July 2015)
 
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C...
 
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014
Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014
 
Stupid Video Tricks, CocoaConf Seattle 2014
Stupid Video Tricks, CocoaConf Seattle 2014Stupid Video Tricks, CocoaConf Seattle 2014
Stupid Video Tricks, CocoaConf Seattle 2014
 
Stupid Video Tricks, CocoaConf Las Vegas
Stupid Video Tricks, CocoaConf Las VegasStupid Video Tricks, CocoaConf Las Vegas
Stupid Video Tricks, CocoaConf Las Vegas
 
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
 
Stupid Video Tricks (CocoaConf DC, March 2014)
Stupid Video Tricks (CocoaConf DC, March 2014)Stupid Video Tricks (CocoaConf DC, March 2014)
Stupid Video Tricks (CocoaConf DC, March 2014)
 
Introduction to the Roku SDK
Introduction to the Roku SDKIntroduction to the Roku SDK
Introduction to the Roku SDK
 
Get On The Audiobus (CocoaConf Atlanta, November 2013)
Get On The Audiobus (CocoaConf Atlanta, November 2013)Get On The Audiobus (CocoaConf Atlanta, November 2013)
Get On The Audiobus (CocoaConf Atlanta, November 2013)
 
Get On The Audiobus (CocoaConf Boston, October 2013)
Get On The Audiobus (CocoaConf Boston, October 2013)Get On The Audiobus (CocoaConf Boston, October 2013)
Get On The Audiobus (CocoaConf Boston, October 2013)
 

Recently uploaded

My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxhariprasad279825
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
The Future of Software Development - Devin AI Innovative Approach.pdf
The Future of Software Development - Devin AI Innovative Approach.pdfThe Future of Software Development - Devin AI Innovative Approach.pdf
The Future of Software Development - Devin AI Innovative Approach.pdfSeasiaInfotech2
 

Recently uploaded (20)

My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Artificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptxArtificial intelligence in cctv survelliance.pptx
Artificial intelligence in cctv survelliance.pptx
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
The Future of Software Development - Devin AI Innovative Approach.pdf
The Future of Software Development - Devin AI Innovative Approach.pdfThe Future of Software Development - Devin AI Innovative Approach.pdf
The Future of Software Development - Devin AI Innovative Approach.pdf
 

Advanced Media Manipulation with AV Foundation

  • 1. Advanced Media Manipulation with AV Foundation Chris Adamson — @invalidname — http://www.subfurther.com/blog Voices That Matter IPhone Developer Conference — March 10, 2011 Sunday, April 10, 2011
  • 2. The Deal ✤ Slides will be posted VTM conference site and http:// www.slideshare.com/invalidname ✤ Code will be posted to blog at http://www.subfurther.com/blog ✤ Don’t try to transcribe the code examples Sunday, April 10, 2011
  • 3. No, really ✤ Seriously, don’t try to transcribe the code examples ✤ You will never keep up ✤ AV Foundation has the longest class and method names you have ever seen: ✤ AVMutableVideoCompositionLayerInstruction ✤ AVAssetWriterInputPixelBufferAdaptor ✤ etc. Sunday, April 10, 2011
  • 4. Really, really, seriously… don’t AVMutableVideoCompositionLayerInstruction *aInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack: trackA]; [aInstruction setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(2.9, VIDEO_TIME_SCALE), CMTimeMakeWithSeconds(6.0, VIDEO_TIME_SCALE))]; Sunday, April 10, 2011
  • 5. The Road Map ✤ VTM Philly recap ✤ Assets, playback, and capture ✤ Reading samples with AVAssetReader ✤ Writing samples with AVAssetWriter ✤ Editing with effects ✤ Capture callbacks Sunday, April 10, 2011
  • 6. Last Time at #vtm_iphone… Sunday, April 10, 2011
  • 7. Introduction to AV Foundation Sunday, April 10, 2011
  • 8. iOS 4 Media Frameworks Core Audio / Low-level audio streaming OpenAL Media Player iPod library search/playback Audio / Video capture, editing, AV Foundation playback, export… Core Video Quartz effects on moving images Objects for representing media Core Media times, formats, buffers Sunday, April 10, 2011
  • 9. Size is relative

                AV Foundation   android.media   QT Kit   QuickTime for Java*
    Classes     61              40              24       576
    Methods     500+            280             360      >10,000

    * – QTJ is used here only as an OO proxy for the procedural QuickTime API

    Sunday, April 10, 2011
  • 10. How do media frameworks work? Sunday, April 10, 2011
  • 15. “Boom Box” APIs ✤ Simple API for playback, sometimes recording ✤ Little or no support for editing, mixing, metadata, etc. ✤ Example: HTML 5 <audio> and <video> tags, iOS Media Player framework Sunday, April 10, 2011
  • 17. “Streaming” APIs ✤ Use “stream of audio” metaphor ✤ Strong support for mixing, effects, other real-time operations ✤ Example: Core Audio and AV Foundation (capture) Sunday, April 10, 2011
  • 19. “Document” APIs ✤ Use “media document” metaphor ✤ Strong support for editing ✤ Mixing may be a special case of editing ✤ Example: QuickTime and AV Foundation (playback and editing) Sunday, April 10, 2011
  • 20. AV Foundation Classes ✤ Capture ✤ Assets and compositions ✤ Playback, editing, and export ✤ Legacy classes Sunday, April 10, 2011
  • 21. AVAsset ✤ A collection of time-based media data ✤ Sound, video, text (closed captions, subtitles, etc.) ✤ Each distinct media type is contained in a track ✤ An asset represents the arrangement of the tracks. Tracks are pointers to source media, plus metadata (i.e., what parts of the source to use; a gain or opacity to apply, etc.) ✤ Asset ≠ media. Track ≠ media. Media = media. ✤ Asset also contains metadata (where common to all tracks) Sunday, April 10, 2011
  • 22. AVAsset subclasses ✤ AVURLAsset — An asset created from a URL, such as a song or movie file or network document/stream ✤ AVComposition — An asset created from assets in multiple files, used to combine and present media together. ✤ Used for editing Sunday, April 10, 2011
  • 23. AVPlayer
    ✤ Provides the ability to play an asset
    ✤ play, pause, seekToTime: methods; currentTime, rate properties
    ✤ Init with URL or with AVPlayerItem

    NSURL *url = [NSURL URLWithString:
        @"http://www.subfurther.com/video/running-start-iphone.m4v"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
    player = [[AVPlayer playerWithPlayerItem:playerItem] retain];

    Sunday, April 10, 2011
  • 24. AVPlayerLayer (or not)
    ✤ CALayer used to display video from a player
    ✤ Check that the media has video

    NSArray *visualTracks = [asset tracksWithMediaCharacteristic:
        AVMediaCharacteristicVisual];
    if ((!visualTracks) || ([visualTracks count] == 0)) {
        playerView.hidden = YES;
        noVideoLabel.hidden = NO;
    }

    Sunday, April 10, 2011
  • 25. AVPlayerLayer (no really)
    ✤ If you have video, create AVPlayerLayer from AVPlayer.
    ✤ Set bounds and video “gravity” (bounds-filling behavior)

    else {
        playerView.hidden = NO;
        noVideoLabel.hidden = YES;
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
        [playerView.layer addSublayer:playerLayer];
        playerLayer.frame = playerView.layer.bounds;
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    }

    Sunday, April 10, 2011
  • 26. Demo VTM_AVPlayer Sunday, April 10, 2011
  • 27. Media Capture ✤ AV Foundation capture classes for audio / video capture, along with still image capture ✤ Programmatic control of white balance, autofocus, zoom, etc. ✤ Does not exist on the simulator. AV Foundation capture apps can only be compiled for and run on the device. ✤ API design is borrowed from QTKit on the Mac Sunday, April 10, 2011
  • 28. Capture Classes Seem Familiar?

    QT Kit: QTCaptureAudioPreviewOutput, QTCaptureConnection,
    QTCaptureDecompressedAudioOutput, QTCaptureDecompressedVideoOutput,
    QTCaptureDevice, QTCaptureDeviceInput, QTCaptureFileOutput,
    QTCaptureInput, QTCaptureLayer, QTCaptureMovieFileOutput,
    QTCaptureOutput, QTCaptureSession, QTCaptureVideoPreviewOutput,
    QTCaptureView

    AV Foundation: AVCaptureAudioDataOutput, AVCaptureConnection,
    AVCaptureDevice, AVCaptureFileOutput, AVCaptureInput,
    AVCaptureMovieFileOutput, AVCaptureOutput, AVCaptureSession,
    AVCaptureStillImageOutput, AVCaptureVideoDataOutput,
    AVCaptureVideoPreviewLayer

    Sunday, April 10, 2011
  • 30. Capture basics ✤ Create an AVCaptureSession to coordinate the capture ✤ Investigate available AVCaptureDevices ✤ Create AVCaptureDeviceInput and connect it to the session ✤ Optional: set up an AVCaptureVideoPreviewLayer ✤ Optional: connect AVCaptureOutputs ✤ Tell the session to start recording Sunday, April 10, 2011
  • 31. Getting capture device and input

    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:
        AVMediaTypeVideo];
    if (videoDevice) {
        NSLog (@"got videoDevice");
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput
            deviceInputWithDevice:videoDevice error:&setUpError];
        if (videoInput) {
            [captureSession addInput:videoInput];
        }
    }

    Note 1: You may also want to check for AVMediaTypeMuxed
    Note 2: Do not assume devices based on model (c.f. iPad Camera Connection Kit)

    Sunday, April 10, 2011
  • 32. Creating a video preview layer

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer
        layerWithSession:captureSession];
    previewLayer.frame = captureView.layer.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [captureView.layer addSublayer:previewLayer];

    Keep in mind that the iPhone cameras have a portrait orientation

    Sunday, April 10, 2011
  • 33. Setting an output

    captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if (! captureMovieURL) {
        captureMoviePath = [getCaptureMoviePath() retain];
        captureMovieURL = [[NSURL alloc] initFileURLWithPath:captureMoviePath];
    }
    NSLog (@"recording to %@", captureMovieURL);
    [captureSession addOutput:captureMovieOutput];

    We’ll use the captureMovieURL later…

    Sunday, April 10, 2011
  • 34. Start capturing

    [captureSession startRunning];
    recordButton.selected = YES;
    if ([[NSFileManager defaultManager] fileExistsAtPath:captureMoviePath]) {
        [[NSFileManager defaultManager] removeItemAtPath:captureMoviePath
                                                   error:nil];
    }
    // note: must have a delegate
    [captureMovieOutput startRecordingToOutputFileURL:captureMovieURL
                                    recordingDelegate:self];

    Sunday, April 10, 2011
  • 35. Capture delegate callbacks

    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
    didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
          fromConnections:(NSArray *)connections {
        NSLog (@"started recording to %@", fileURL);
    }

    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
    didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
          fromConnections:(NSArray *)connections
                    error:(NSError *)error {
        if (error) {
            NSLog (@"failed to record: %@", error);
        } else {
            NSLog (@"finished recording to %@", outputFileURL);
        }
    }

    Sunday, April 10, 2011
  • 36. Demo VTM_AVRecPlay Sunday, April 10, 2011
  • 39. Core Media ✤ C-based framework containing structures that represent media samples and media timing ✤ Opaque types: CMBlockBuffer, CMBufferQueue, CMFormatDescription, CMSampleBuffer, CMTime, CMTimeRange ✤ Handful of convenience functions to work with these ✤ Buffer types provide wrappers around possibly-fragmented memory, time types provide timing at arbitrary precision Sunday, April 10, 2011
  • 40. CMTime ✤ CMTime contains a value and a timescale (similar to QuickTime) ✤ Time scale is how the time is measured: “nths of a second” ✤ Time in seconds = value / timescale ✤ Allows for exact timing of any kind of media ✤ Different tracks of an asset can and will have different timescales ✤ Convert with CMTimeConvertScale() Sunday, April 10, 2011
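    The value/timescale arithmetic above can be sketched in plain C. `RationalTime`, `seconds`, and `convert_scale` below are hypothetical stand-ins for CMTime, CMTimeGetSeconds(), and CMTimeConvertScale() — they show only the math, not the real Core Media API.

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical stand-in for CMTime: a rational time, measured as
       "value / timescale" seconds. Names are illustrative only. */
    typedef struct { int64_t value; int32_t timescale; } RationalTime;

    /* Time in seconds = value / timescale */
    static double seconds(RationalTime t) {
        return (double)t.value / (double)t.timescale;
    }

    /* Re-express the same instant in a new timescale, truncating any
       fractional tick (roughly what CMTimeConvertScale does with a
       truncating rounding method). */
    static RationalTime convert_scale(RationalTime t, int32_t newScale) {
        RationalTime out = { (t.value * newScale) / t.timescale, newScale };
        return out;
    }
    ```

    For example, {900, 600} is 1.5 seconds; rescaled to a 44100 timescale it becomes {66150, 44100} — the same instant, counted in audio-sample ticks.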
  • 42. AVAssetWriter ✤ Introduced in iOS 4.1 ✤ Allows you to create samples programmatically and write them to an asset ✤ Used for synthesized media files: screen recording, CGI, synthesized audio, etc. Sunday, April 10, 2011
  • 43. Using AVAssetWriter ✤ Create an AVAssetWriter ✤ Create and configure an AVAssetWriterInput and connect it to the writer ✤ -[AVAssetWriter startWriting] ✤ Repeatedly call -[AVAssetWriterInput appendSampleBuffer:] with CMSampleBufferRef’s ✤ Set expectsMediaDataInRealTime appropriately, honor the readyForMoreMediaData property. Sunday, April 10, 2011
  • 44. Example: iOS Screen Recorder ✤ Set up an AVAssetWriter to write to a QuickTime movie file, and an AVAssetWriterInput with codec and other video track metadata ✤ Set up an AVAssetWriterPixelBufferAdaptor to simplify converting CGImageRefs into CMSampleBufferRefs ✤ Use an NSTimer to periodically grab the screen image and use the AVAssetWriterPixelBufferAdapter to write to the AVAssetWriterInput Sunday, April 10, 2011
  • 45. Create writer, writer input, and pixel buffer adaptor

    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
        [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
        nil];
    assetWriterInput = [AVAssetWriterInput
        assetWriterInputWithMediaType:AVMediaTypeVideo
                       outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:assetWriterInput
        sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    Settings keys and values are defined in AVAudioSettings.h and
    AVVideoSettings.h, or AV Foundation Constants Reference

    Sunday, April 10, 2011
  • 47. Create a pixel buffer

    // get screenshot image
    CGImageRef image = (CGImageRef) [[self screenshot] CGImage];
    NSLog (@"made screenshot");
    // prepare the pixel buffer
    CVPixelBufferRef pixelBuffer = NULL;
    CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
    NSLog (@"copied image data");
    cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                         FRAME_WIDTH,
                                         FRAME_HEIGHT,
                                         kCVPixelFormatType_32BGRA,
                                         (void*)CFDataGetBytePtr(imageData),
                                         CGImageGetBytesPerRow(image),
                                         NULL, NULL, NULL,
                                         &pixelBuffer);
    NSLog (@"CVPixelBufferCreateWithBytes returned %d", cvErr);

    Sunday, April 10, 2011
  • 48. Calculate time and write sample

    // calculate the time
    CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
    NSLog (@"elapsedTime: %f", elapsedTime);
    CMTime presentationTime = CMTimeMake (elapsedTime * TIME_SCALE, TIME_SCALE);
    // write the sample
    BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                                withPresentationTime:presentationTime];

    Sunday, April 10, 2011
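    The presentation-time calculation above is just seconds-to-ticks quantization. A sketch in plain C; the TIME_SCALE value of 600 is an assumption (the slide does not show the constant's definition):

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Assumed value; the slide's TIME_SCALE constant is not shown. */
    enum { TIME_SCALE = 600 };

    /* Quantize elapsed wall-clock seconds to integer ticks of the timescale,
       mirroring CMTimeMake(elapsedTime * TIME_SCALE, TIME_SCALE). The
       double-to-int64 conversion truncates, as in the slide's implicit cast. */
    static int64_t presentation_ticks(double elapsedSeconds) {
        return (int64_t)(elapsedSeconds * TIME_SCALE);
    }
    ```

    So a frame grabbed 1.5 seconds after the first frame gets presentation time {900, 600} — exactly 1.5 seconds into the movie, regardless of how irregular the capture timer was.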
  • 49. Demo VTM_ScreenRecorderTest Sunday, April 10, 2011
  • 51. AVAssetReader ✤ Introduced in iOS 4.1 ✤ Possible uses: ✤ Showing an audio wave form in a timeline ✤ Generating frame-accurate thumbnails Sunday, April 10, 2011
  • 52. Using AVAssetReader ✤ Create an AVAssetReader ✤ Create and configure an AVAssetReaderOutput ✤ Three concrete subclasses: AVAssetReaderTrackOutput, AVAssetReaderAudioMixOutput, and AVAssetReaderVideoCompositionOutput. ✤ Get data with -[AVAssetReaderOutput copyNextSampleBuffer] Sunday, April 10, 2011
  • 53. Example: Convert iPod song to PCM ✤ In iOS 4, Media Framework exposes a new metadata property, MPMediaItemPropertyAssetURL, that allows AV Foundation to open the library item as an AVAsset ✤ Create an AVAssetReader to read sample buffers from the song ✤ Create an AVAssetWriter to convert and write PCM samples Sunday, April 10, 2011
  • 54. Coordinated reading/writing ✤ You can provide a block to -[AVAssetWriterInput requestMediaDataWhenReadyOnQueue:usingBlock:] ✤ Only perform your asset reads / writes when the writer is ready. ✤ In this example, AVAssetWriterInput.expectsMediaDataInRealTime is NO Sunday, April 10, 2011
  • 55. Set up reader, reader output, writer

    NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL];
    AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
    NSError *assetError = nil;
    AVAssetReader *assetReader = [[AVAssetReader assetReaderWithAsset:songAsset
                                                                error:&assetError] retain];
    AVAssetReaderOutput *assetReaderOutput = [[AVAssetReaderAudioMixOutput
        assetReaderAudioMixOutputWithAudioTracks:songAsset.tracks
                                   audioSettings:nil] retain];
    [assetReader addOutput:assetReaderOutput];
    AVAssetWriter *assetWriter = [[AVAssetWriter assetWriterWithURL:exportURL
                                                           fileType:AVFileTypeCoreAudioFormat
                                                              error:&assetError] retain];

    Sunday, April 10, 2011
  • 56. Set up writer input

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
        [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
        [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
        [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)],
            AVChannelLayoutKey,
        [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
        [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
        [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
        [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
        nil];
    AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput
        assetWriterInputWithMediaType:AVMediaTypeAudio
                       outputSettings:outputSettings] retain];

    Note 1: Many of these settings are required, but you won’t know which
    until you get a runtime error.
    Note 2: AudioChannelLayout is from Core Audio

    Sunday, April 10, 2011
  • 57. Start reading and writing

    [assetWriter startWriting];
    [assetReader startReading];
    AVAssetTrack *soundTrack = [songAsset.tracks objectAtIndex:0];
    CMTime startTime = CMTimeMake (0, soundTrack.naturalTimeScale);
    [assetWriter startSessionAtSourceTime:startTime];

    Sunday, April 10, 2011
  • 58. Read only when writer is ready

    __block UInt64 convertedByteCount = 0;
    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
    [assetWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue
                                            usingBlock: ^ {
        while (assetWriterInput.readyForMoreMediaData) {
            CMSampleBufferRef nextBuffer = [assetReaderOutput copyNextSampleBuffer];
            if (nextBuffer) {
                // append buffer
                [assetWriterInput appendSampleBuffer:nextBuffer];
                convertedByteCount += CMSampleBufferGetTotalSampleSize (nextBuffer);
                // update UI on main thread only
                NSNumber *convertedByteCountNumber =
                    [NSNumber numberWithLong:convertedByteCount];
                [self performSelectorOnMainThread:@selector(updateSizeLabel:)
                                       withObject:convertedByteCountNumber
                                    waitUntilDone:NO];
            }

    Sunday, April 10, 2011
  • 59. Close file when done

            else {
                // done!
                [assetWriterInput markAsFinished];
                [assetWriter finishWriting];
                [assetReader cancelReading];
                NSDictionary *outputFileAttributes = [[NSFileManager defaultManager]
                    attributesOfItemAtPath:exportPath error:nil];
                NSNumber *doneFileSize = [NSNumber numberWithLong:
                    [outputFileAttributes fileSize]];
                [self performSelectorOnMainThread:@selector(updateCompletedSizeLabel:)
                                       withObject:doneFileSize
                                    waitUntilDone:NO];
                // release a lot of stuff
                [assetReader release];
                [assetReaderOutput release];
                [assetWriter release];
                [assetWriterInput release];
                [exportPath release];
                break;
            }
        }
    }];

    Sunday, April 10, 2011
  • 60. Demo VTM_AViPodReader Sunday, April 10, 2011
  • 62. Video Editing? On iPhone? Really? Comparison specs from everymac.com Sunday, April 10, 2011
  • 68. Video Editing? On iPhone? Really? 1999: 2010: Power Mac G4 500 AGP iPhone 4 CPU: 500 MHz G4 CPU: 800 MHz Apple A4 RAM: 256 MB RAM: 512 MB Storage: 20 GB HDD Storage: 16 GB Flash Comparison specs from everymac.com Sunday, April 10, 2011
  • 69. AVComposition
    ✤ An AVAsset that gets its tracks from multiple file-based sources
    ✤ To create a movie, you typically use an AVMutableComposition

    composition = [[AVMutableComposition alloc] init];

    Sunday, April 10, 2011
  • 70. Copying from another asset
    ✤ -[AVMutableComposition insertTimeRange:ofAsset:atTime:error:]

    CMTime inTime = CMTimeMakeWithSeconds(inSeconds, 600);
    CMTime outTime = CMTimeMakeWithSeconds(outSeconds, 600);
    CMTime duration = CMTimeSubtract(outTime, inTime);
    CMTimeRange editRange = CMTimeRangeMake(inTime, duration);
    NSError *editError = nil;
    [targetController.composition insertTimeRange:editRange
                                          ofAsset:sourceAsset
                                           atTime:targetController.composition.duration
                                            error:&editError];

    Sunday, April 10, 2011
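    The in/out-point arithmetic in that snippet reduces to duration = out − in, all on one shared timescale. A plain-C sketch under that assumption, with hypothetical RTime/RTimeRange structs standing in for CMTime/CMTimeRange:

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Hypothetical rational-time types standing in for CMTime/CMTimeRange. */
    typedef struct { int64_t value; int32_t timescale; } RTime;
    typedef struct { RTime start; RTime duration; } RTimeRange;

    /* Like CMTimeMakeWithSeconds: round seconds to the nearest tick. */
    static RTime rtime_from_seconds(double s, int32_t scale) {
        RTime t = { (int64_t)(s * scale + 0.5), scale };
        return t;
    }

    /* Like CMTimeSubtract, assuming both operands share a timescale. */
    static RTime rtime_subtract(RTime a, RTime b) {
        RTime t = { a.value - b.value, a.timescale };
        return t;
    }

    /* Build the edit range exactly as the slide does: start at the in
       point, duration = out - in. Timescale 600 matches the slide. */
    static RTimeRange edit_range(double inSeconds, double outSeconds) {
        RTime in  = rtime_from_seconds(inSeconds, 600);
        RTime out = rtime_from_seconds(outSeconds, 600);
        RTimeRange r = { in, rtime_subtract(out, in) };
        return r;
    }
    ```

    An edit from 2.0s to 5.5s becomes start {1200, 600}, duration {2100, 600}: exactly 3.5 seconds of source material.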
  • 71. Demo VTM_AVEditor Sunday, April 10, 2011
  • 74. Multiple video tracks
    ✤ To combine multiple video sources into one movie, create an
      AVMutableComposition, then create AVMutableCompositionTracks

    // create composition
    self.composition = [[AVMutableComposition alloc] init];
    // create video tracks a and b
    // note: mediatypes are defined in AVMediaFormat.h
    [trackA release];
    trackA = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                           preferredTrackID:kCMPersistentTrackID_Invalid];
    [trackB release];
    trackB = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                           preferredTrackID:kCMPersistentTrackID_Invalid];
    // locate source video track
    AVAssetTrack *sourceVideoTrack = [[sourceVideoAsset
        tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    Sunday, April 10, 2011
  • 75. A/B Roll Editing ✤ Apple recommends alternating between two tracks, rather than using arbitrarily many (e.g., one track per shot) Sunday, April 10, 2011
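    The A/B pattern means shot n goes on track A when n is even and track B when n is odd, so any transition overlaps exactly two tracks. A trivial sketch of the assignment rule (illustrative only, not AV Foundation API):

    ```c
    #include <assert.h>

    typedef enum { TRACK_A = 0, TRACK_B = 1 } Track;

    /* Alternate shots between the two tracks: even-indexed shots on A,
       odd-indexed shots on B. */
    static Track track_for_shot(int shotIndex) {
        return (shotIndex % 2 == 0) ? TRACK_A : TRACK_B;
    }
    ```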
  • 76. Sound tracks
    ✤ Treat your audio as separate tracks too.

    // create music track
    trackMusic = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                               preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTimeRange musicTrackTimeRange = CMTimeRangeMake(kCMTimeZero,
                                                      musicTrackAudioAsset.duration);
    NSError *trackMusicError = nil;
    [trackMusic insertTimeRange:musicTrackTimeRange
                        ofTrack:[musicTrackAudioAsset.tracks objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:&trackMusicError];

    Sunday, April 10, 2011
  • 77. Empty ranges
    ✤ Use -[AVMutableCompositionTrack insertEmptyTimeRange:] to account for
      any part of any track where you won’t be inserting media segments.

    CMTime videoTracksTime = CMTimeMake(0, VIDEO_TIME_SCALE);
    CMTime postEditTime = CMTimeAdd (videoTracksTime,
        CMTimeMakeWithSeconds(FIRST_CUT_TRACK_A_IN_TIME, VIDEO_TIME_SCALE));
    [trackA insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, postEditTime)];
    videoTracksTime = postEditTime;

    Sunday, April 10, 2011
  • 78. Track-level inserts
    ✤ Insert media segments with -[AVMutableCompositionTrack
      insertTimeRange:ofTrack:atTime:error:]

    postEditTime = CMTimeAdd (videoTracksTime,
        CMTimeMakeWithSeconds(FIRST_CUT_DURATION, VIDEO_TIME_SCALE));
    CMTimeRange firstShotRange = CMTimeRangeMake(kCMTimeZero,
        CMTimeMakeWithSeconds(FIRST_CUT_DURATION, VIDEO_TIME_SCALE));
    [trackA insertTimeRange:firstShotRange
                    ofTrack:sourceVideoTrack
                     atTime:videoTracksTime
                      error:&performError];
    videoTracksTime = postEditTime;

    Sunday, April 10, 2011
  • 79. AVVideoComposition ✤ Describes how multiple video tracks are to be composited together. The mutable version is AVMutableVideoComposition ✤ Not a subclass of AVComposition! ✤ Contains an array of AVVideoCompositionInstruction objects ✤ The time ranges of these instructions must not overlap or leave gaps, and must together match the duration of the AVComposition Sunday, April 10, 2011
  • 80. AVVideoCompositionInstruction ✤ Represents video compositor instructions for all tracks in one time range ✤ These instructions are a layerInstructions property ✤ Of course, you’ll be creating an AVMutableVideoCompositionInstruction Sunday, April 10, 2011
  • 81. AVVideoCompositionLayerInstruction (yes, really) ✤ Identifies the instructions for one track within an AVVideoCompositionInstruction. ✤ The mutable version is AVMutableVideoCompositionLayerInstruction. I warned you about this back on slide 3. ✤ Currently supports two properties: opacity and affine transform. Animating (“ramping”) these creates fades/cross-dissolves and pushes. ✤ e.g., -[AVMutableVideoCompositionLayerInstruction setOpacityRampFromStartOpacity:toEndOpacity:timeRange:] Sunday, April 10, 2011
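    An opacity ramp is just a linear interpolation of opacity across a time range, clamped to the endpoint values outside it. A plain-C sketch of what the compositor computes for a given presentation time (hypothetical helper, not the real API):

    ```c
    #include <assert.h>

    /* Linear opacity ramp over [rampStart, rampStart + rampDuration]
       seconds, clamped to the endpoint values outside the range. */
    static float opacity_at(double t, double rampStart, double rampDuration,
                            float startOpacity, float endOpacity) {
        if (t <= rampStart) return startOpacity;
        if (t >= rampStart + rampDuration) return endOpacity;
        float f = (float)((t - rampStart) / rampDuration);
        return startOpacity + f * (endOpacity - startOpacity);
    }
    ```

    With the fade-in from slide 4 (0.0 → 1.0 over the 6-second range starting at 2.9s), track A is fully transparent until 2.9s, half-opaque at 5.9s, and fully opaque from 8.9s on.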
  • 82. An AVVideoCompositionInstruction

    AVMutableVideoCompositionInstruction *transitionInstruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    transitionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,
                                                      composition.duration);
    AVMutableVideoCompositionLayerInstruction *aInstruction =
        [AVMutableVideoCompositionLayerInstruction
            videoCompositionLayerInstructionWithAssetTrack:trackA];
    [aInstruction setOpacityRampFromStartOpacity:0.0
                                    toEndOpacity:1.0
                                       timeRange:CMTimeRangeMake(
                                           CMTimeMakeWithSeconds(2.9, VIDEO_TIME_SCALE),
                                           CMTimeMakeWithSeconds(6.0, VIDEO_TIME_SCALE))];
    AVMutableVideoCompositionLayerInstruction *bInstruction =
        [AVMutableVideoCompositionLayerInstruction
            videoCompositionLayerInstructionWithAssetTrack:trackB];
    [bInstruction setOpacity:0 atTime:kCMTimeZero];
    transitionInstruction.layerInstructions =
        [NSArray arrayWithObjects:aInstruction, bInstruction, nil];
    [videoInstructions addObject:transitionInstruction];

    Sunday, April 10, 2011
  • 83. Attaching the instructions

    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = videoInstructions;
    videoComposition.renderSize = videoSize;
    videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
    compositionPlayer.currentItem.videoComposition = videoComposition;

    Sunday, April 10, 2011
  • 84. Titles and Effects ✤ AVSynchronizedLayer gives you a CALayer that gets its timing from an AVPlayerItem, rather than a wall clock ✤ Run the movie slowly or backwards, the animation runs slowly or backwards ✤ Can add other CALayers as sublayers and they’ll all get their timing from the AVPlayerItem Sunday, April 10, 2011
  • 85. Creating a main title layer

    // synchronized layer to own all the title layers
    AVSynchronizedLayer *synchronizedLayer = [AVSynchronizedLayer
        synchronizedLayerWithPlayerItem:compositionPlayer.currentItem];
    synchronizedLayer.frame = [compositionView frame];
    [self.view.layer addSublayer:synchronizedLayer];
    // main titles
    CATextLayer *mainTitleLayer = [CATextLayer layer];
    mainTitleLayer.string = NSLocalizedString(@"Running Start", nil);
    mainTitleLayer.font = @"Verdana-Bold";
    mainTitleLayer.fontSize = videoSize.height / 8;
    mainTitleLayer.foregroundColor = [[UIColor yellowColor] CGColor];
    mainTitleLayer.alignmentMode = kCAAlignmentCenter;
    mainTitleLayer.frame = CGRectMake(0.0, 0.0, videoSize.width, videoSize.height);
    mainTitleLayer.opacity = 0.0; // initially invisible
    [synchronizedLayer addSublayer:mainTitleLayer];

    Sunday, April 10, 2011
  • 86. Adding an animation

    // main title opacity animation
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    CABasicAnimation *mainTitleInAnimation =
        [CABasicAnimation animationWithKeyPath:@"opacity"];
    mainTitleInAnimation.fromValue = [NSNumber numberWithFloat:0.0];
    mainTitleInAnimation.toValue = [NSNumber numberWithFloat:1.0];
    mainTitleInAnimation.removedOnCompletion = NO;
    mainTitleInAnimation.beginTime = AVCoreAnimationBeginTimeAtZero;
    mainTitleInAnimation.duration = 5.0;
    [mainTitleLayer addAnimation:mainTitleInAnimation forKey:@"in-animation"];

    Nasty gotcha: AVCoreAnimationBeginTimeAtZero is a special value used for
    AV Foundation animations, since 0 would otherwise be interpreted as
    CACurrentMediaTime()

    Sunday, April 10, 2011
  • 87. Demo VTM_AVEditor Sunday, April 10, 2011
  • 88. Multi-track audio ✤ AVPlayerItem.audioMix property ✤ AVAudioMix class describes how multiple audio tracks are to be mixed together ✤ Analogous to videoComposition property (AVVideoComposition) Sunday, April 10, 2011
  • 89. Basic Export ✤ Create an AVAssetExportSession ✤ Must set outputURL and outputFileType properties ✤ Inspect possible types with supportedFileTypes property (list of AVFileType… strings in docs) ✤ Begin export with exportAsynchronouslyWithCompletionHandler: ✤ This takes a block, which will be called on completion, failure, cancellation, etc. Sunday, April 10, 2011
  • 90. Advanced Export ✤ AVAssetExportSession takes videoComposition and audioMix parameters, just like AVPlayerItem ✤ To include AVSynchronizedLayer-based animations in an export, use a AVVideoCompositionCoreAnimationTool and set it as the animationTool property of the AVMutableVideoComposition (but only for export) Sunday, April 10, 2011
  • 92. More fun with capture ✤ Can analyze video data coming off the camera with the AVCaptureVideoDataOutput class ✤ Can provide uncompressed frames to your AVCaptureVideoDataOutputSampleBufferDelegate ✤ The callback provides you with a CMSampleBufferRef ✤ See WWDC 2010 AVCam example Sunday, April 10, 2011
  • 93. Hazards and Hassles Sunday, April 10, 2011
  • 98. Only effects are dissolve and push? How would we do this checkerboard wipe in AV Foundation? It’s pretty easy in QuickTime! Sunday, April 10, 2011
  • 99. How do you… ✤ Save a composition to work on later? ✤ Even if AVMutableComposition supports NSCopying, what if you’ve got titles in an AVSynchronizedLayer? ✤ Support undo / redo of edits? ✤ Add import/export support for other formats and codecs? Sunday, April 10, 2011
  • 100. AV Foundation Sucks! ✤ Too hard to understand! ✤ Too many classes and methods! ✤ Verbose and obtuse method naming ✤ AVComposition and AVVideoComposition are completely unrelated? WTF, Apple? Sunday, April 10, 2011
  • 102. Complex things usually aren’t easy Simple Complex Hard Easy Sunday, April 10, 2011
  • 103. AV Foundation Rocks! ✤ Addresses a huge range of media functionality ✤ The other guys don’t even try ✤ Same framework used by Apple for iMovie for iPhone/iPad ✤ You can create functionality equivalent to iMovie / Final Cut in a few hundred lines of code ✤ Coming to Mac OS X in 10.7 (Lion) Sunday, April 10, 2011
  • 104. Q&A Chris Adamson — @invalidname — http://www.subfurther.com/blog Voices That Matter iPhone Developer Conference — March 10, 2011 Sunday, April 10, 2011
  • 105. Also from Pearson! ✤ “Core Audio is serious black arts shit.” — Mike Lee (@bmf) ✤ It’s tangentially related to AV Foundation, so you should totally buy it when it comes out. Sunday, April 10, 2011