How to save recorded video using AVAssetWriter?











I have tried many blogs and Stack Overflow posts, but I couldn't find a solution for this. I am able to create a custom camera with a preview. I need the video in a custom frame, which is why I am using AVAssetWriter, but I am unable to save the recorded video into the Documents directory. I tried it like this:



-(void)initilizeCameraConfigurations {

    if (!captureSession) {
        captureSession = [[AVCaptureSession alloc] init];
        [captureSession beginConfiguration];
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.view.backgroundColor = UIColor.blackColor;
        CGRect bounds = self.view.bounds;
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureVideoPreviewLayer.backgroundColor = [UIColor clearColor].CGColor;
        captureVideoPreviewLayer.bounds = self.view.frame;
        captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.view.layer addSublayer:captureVideoPreviewLayer];
        [self.view bringSubviewToFront:self.controlsBgView];
    }

    // Add input to session
    NSError *err;
    videoCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&err];

    if ([captureSession canAddInput:videoCaptureDeviceInput]) {
        [captureSession addInput:videoCaptureDeviceInput];
    }

    docPathUrl = [[NSURL alloc] initFileURLWithPath:[self getDocumentsUrl]];

    assetWriter = [AVAssetWriter assetWriterWithURL:docPathUrl fileType:AVFileTypeQuickTimeMovie error:&err];
    NSParameterAssert(assetWriter);
    //assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:300], AVVideoWidthKey,
                                   [NSNumber numberWithInt:300], AVVideoHeightKey,
                                   nil];

    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    writerInput.transform = CGAffineTransformMakeRotation(M_PI);

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:300], kCVPixelBufferHeightKey,
                                                           nil];

    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    if ([assetWriter canAddInput:writerInput]) {
        [assetWriter addInput:writerInput];
    }

    // Set video stabilization mode to preview layer
    AVCaptureVideoStabilizationMode stablilizationMode = AVCaptureVideoStabilizationModeCinematic;
    if ([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:stablilizationMode]) {
        [captureVideoPreviewLayer.connection setPreferredVideoStabilizationMode:stablilizationMode];
    }

    // image output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession commitConfiguration];
    if (![captureVideoPreviewLayer.connection isEnabled]) {
        [captureVideoPreviewLayer.connection setEnabled:YES];
    }
    [captureSession startRunning];
}

-(IBAction)startStopVideoRecording:(id)sender {

    if (captureSession) {
        if (isVideoRecording) {
            [writerInput markAsFinished];

            [assetWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Finished writing...checking completion status...");
                if (assetWriter.status != AVAssetWriterStatusFailed && assetWriter.status == AVAssetWriterStatusCompleted) {
                    // Video saved
                } else {
                    NSLog(@"#123 Video writing failed: %@", assetWriter.error);
                }
            }];

        } else {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:kCMTimeZero];
            isVideoRecording = YES;
        }
    }
}

-(NSString *)getDocumentsUrl {

    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:docPath]) {
        NSError *err;
        [[NSFileManager defaultManager] removeItemAtPath:docPath error:&err];
    }
    NSLog(@"Movie path : %@", docPath);
    return docPath;
}

@end


Please correct me if anything is wrong. Thank you in advance.

Tags: ios, avfoundation






          2 Answers






          Conceptually, you have two main functional areas: one that generates video frames – the AVCaptureSession and everything attached to it – and another that writes those frames to a file – in your case the AVAssetWriter with its attached inputs.

          The problem with your code is that there is no connection between the two: no video frames or images coming out of the capture session are ever passed to the asset writer inputs.



          Furthermore, the AVCaptureStillImageOutput method -captureStillImageAsynchronouslyFromConnection:completionHandler: is never called anywhere, so the capture session never actually delivers any frames to you.



          So, as a minimum, implement something like this:



          -(IBAction)captureStillImageAndAppend:(id)sender
          {
              [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageOutput.connections.firstObject
                                                             completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
              {
                  // check error, omitted here
                  if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
                      [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(imageDataSampleBuffer))];
                  [writerInput appendSampleBuffer:imageDataSampleBuffer];
              }];
          }


          Remove the AVAssetWriterInputPixelBufferAdaptor; it isn't used.



          But there are issues with AVCaptureStillImageOutput:




          • it's only intended to produce still images, not videos


          • it must be configured to produce uncompressed sample buffers if the asset writer input is configured to compress the appended sample buffers (stillImageOutput.outputSettings = @{ (NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)};)


          • it's deprecated on iOS (AVCapturePhotoOutput is its replacement)



          If you actually want to produce a video, as opposed to a sequence of still images, add an AVCaptureVideoDataOutput to the capture session instead of the AVCaptureStillImageOutput. It needs a delegate and a serial dispatch queue on which to output the sample buffers. The delegate has to implement something like this:



          -(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
          {
              if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
                  [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer))];
              [writerInput appendSampleBuffer:sampleBuffer];
          }
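
          For completeness, a minimal sketch of how such a data output might be attached while the session is being configured (this is not from the original answer; the ivar names videoDataOutput and sampleBufferQueue are assumptions, and self is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate):

          // Assumed ivars: AVCaptureVideoDataOutput *videoDataOutput; dispatch_queue_t sampleBufferQueue;
          videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
          // Uncompressed pixel format, in line with the uncompressed-buffer note above
          videoDataOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey:
                                                 @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
          videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

          // The delegate callbacks must be delivered on a serial queue
          sampleBufferQueue = dispatch_queue_create("samplebuffer.queue", DISPATCH_QUEUE_SERIAL);
          [videoDataOutput setSampleBufferDelegate:self queue:sampleBufferQueue];

          if ([captureSession canAddOutput:videoDataOutput]) {
              [captureSession addOutput:videoDataOutput];
          }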


          Note that




          • you will want to make sure that the AVCaptureVideoDataOutput only outputs frames while you're actually recording; add/remove it from the capture session or enable/disable its connection in the startStopVideoRecording action (see the sketch below)

          • reset the startTime to kCMTimeInvalid before starting another recording
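
          To illustrate those last two points, the recording action could look roughly like this (a sketch only, reusing the videoDataOutput ivar assumed above; note that an AVAssetWriter cannot be reused, so a new writer and input would have to be created before each new recording):

          -(IBAction)startStopVideoRecording:(id)sender
          {
              AVCaptureConnection *videoConnection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];

              if (isVideoRecording) {
                  videoConnection.enabled = NO;   // stop delivering frames to the delegate
                  [writerInput markAsFinished];
                  [assetWriter finishWritingWithCompletionHandler:^{
                      NSLog(@"Writer status: %ld, error: %@", (long)assetWriter.status, assetWriter.error);
                  }];
                  isVideoRecording = NO;
              } else {
                  startTime = kCMTimeInvalid;     // the session is started from the first buffer's timestamp in the delegate
                  [assetWriter startWriting];
                  videoConnection.enabled = YES;  // start delivering frames
                  isVideoRecording = YES;
              }
          }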







          • Thank you for the response. It's working well and the video is saving into the Documents folder. But it automatically stops calling "didOutputSampleBuffer" after 2 seconds and starts calling "didDropSampleBuffer". How can I find the cause of that issue and fix it?
            – G. Hazarath Reddy
            Nov 22 at 9:24






          • Have a look at the documentation for -captureOutput:didDropSampleBuffer:fromConnection:. It explains how to extract the reason from the passed-in sample buffer via the kCMSampleBufferAttachmentKey_DroppedFrameReason attachment.
            – NoHalfBits
            Nov 22 at 12:16
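
          (For reference, a minimal sketch of reading that attachment in the drop callback; the log message is illustrative only:)

          -(void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
          {
              // The attachment value is one of the kCMSampleBufferDroppedFrameReason_* CFString constants
              CFStringRef reason = (CFStringRef)CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_DroppedFrameReason, NULL);
              NSLog(@"Dropped frame, reason: %@", (__bridge NSString *)reason);
          }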










          • I was able to find the reason: it's always kCMSampleBufferDroppedFrameReason_OutOfBuffers, and frame delivery automatically stops after two seconds. I couldn't find a solution for this. Can you suggest one?
            – G. Hazarath Reddy
            2 days ago












          • The most obvious reason for this would be retaining/keeping a permanent reference to the sample buffer passed to didOutputSampleBuffer. Do you do anything beyond appending it to the input as shown above?
            – NoHalfBits
            2 days ago










          • A more subtle reason would be the asset writer not being able to handle (compress) the passed-in buffers fast enough, e.g. when the CPU or GPU are under load. You did set expectsMediaDataInRealTime of the writer input to YES, right? Otherwise, you may have to adjust the output settings; note that AVCaptureVideoDataOutput has two -recommended... methods which will provide optimized settings.
            – NoHalfBits
            2 days ago
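
          (The two methods referred to above appear to be -recommendedVideoSettingsForAssetWriterWithOutputFileType: and -recommendedVideoSettingsForVideoCodecType:assetWriterOutputFileType:. A sketch of using the first one when creating the writer input, assuming the videoDataOutput ivar from the earlier sketch:)

          NSDictionary *recommended = [videoDataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
          writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:recommended];
          writerInput.expectsMediaDataInRealTime = YES;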

































          You don't say what actually goes wrong, but two things look wrong with your code:



          docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];


          looks like it creates an undesired path like this @"/path/Movie/.mov", when you want this:



          docPath = [docPath stringByAppendingPathComponent:@"Movie.mov"];


          And your timeline is wrong. Your asset writer starts at time 0, but the sample buffers start at CMSampleBufferGetPresentationTimeStamp(sampleBuffer) > 0, so do this instead:



          -(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
              if (firstSampleBuffer) {
                  [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
                  firstSampleBuffer = NO;   // only start the session once, from the first buffer's timestamp
              }

              [writerInput appendSampleBuffer:sampleBuffer];
          }





          • I don't have any issue with the document path. I think "didOutputSampleBuffer" is used with "AVCaptureVideoDataOutput", which is why it isn't called, since we are only using AVAssetWriter. We provide docPath to the asset writer to save the video, and we just call start and finish on the asset writer. I don't have full clarity on this. Can you give a clearer picture?
            – G. Hazarath Reddy
            Nov 21 at 3:39










