How to save recorded video using AVAssetWriter?
I tried many other blogs and Stack Overflow posts, but didn't find a solution. I can create a custom camera with a preview. I need video with a custom frame size, which is why I am using AVAssetWriter, but I am unable to save the recorded video into the Documents directory. I tried this:
-(void)initilizeCameraConfigurations {
    if (!captureSession) {
        captureSession = [[AVCaptureSession alloc] init];
        [captureSession beginConfiguration];
        captureSession.sessionPreset = AVCaptureSessionPresetHigh;
        self.view.backgroundColor = UIColor.blackColor;
        CGRect bounds = self.view.bounds;
        captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
        captureVideoPreviewLayer.backgroundColor = [UIColor clearColor].CGColor;
        captureVideoPreviewLayer.bounds = self.view.frame;
        captureVideoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        captureVideoPreviewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.view.layer addSublayer:captureVideoPreviewLayer];
        [self.view bringSubviewToFront:self.controlsBgView];
    }

    // Add input to session
    NSError *err;
    videoCaptureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&err];
    if ([captureSession canAddInput:videoCaptureDeviceInput]) {
        [captureSession addInput:videoCaptureDeviceInput];
    }

    docPathUrl = [[NSURL alloc] initFileURLWithPath:[self getDocumentsUrl]];
    assetWriter = [AVAssetWriter assetWriterWithURL:docPathUrl fileType:AVFileTypeQuickTimeMovie error:&err];
    NSParameterAssert(assetWriter);
    //assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(1.0, 1000);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:300], AVVideoWidthKey,
                                   [NSNumber numberWithInt:300], AVVideoHeightKey,
                                   nil];
    writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    writerInput.transform = CGAffineTransformMakeRotation(M_PI);

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                                   [NSNumber numberWithInt:300], (id)kCVPixelBufferWidthKey,
                                   [NSNumber numberWithInt:300], (id)kCVPixelBufferHeightKey,
                                   nil];
    assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    if ([assetWriter canAddInput:writerInput]) {
        [assetWriter addInput:writerInput];
    }

    // Set video stabilization mode on the preview layer
    AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    if ([videoCaptureDevice.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
        [captureVideoPreviewLayer.connection setPreferredVideoStabilizationMode:stabilizationMode];
    }

    // Image output
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    [captureSession commitConfiguration];
    if (![captureVideoPreviewLayer.connection isEnabled]) {
        [captureVideoPreviewLayer.connection setEnabled:YES];
    }
    [captureSession startRunning];
}

-(IBAction)startStopVideoRecording:(id)sender {
    if (captureSession) {
        if (isVideoRecording) {
            [writerInput markAsFinished];
            [assetWriter finishWritingWithCompletionHandler:^{
                NSLog(@"Finished writing...checking completion status...");
                if (assetWriter.status != AVAssetWriterStatusFailed && assetWriter.status == AVAssetWriterStatusCompleted) {
                    // Video saved
                } else {
                    NSLog(@"#123 Video writing failed: %@", assetWriter.error);
                }
            }];
        } else {
            [assetWriter startWriting];
            [assetWriter startSessionAtSourceTime:kCMTimeZero];
            isVideoRecording = YES;
        }
    }
}

-(NSString *)getDocumentsUrl {
    NSString *docPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:docPath]) {
        NSError *err;
        [[NSFileManager defaultManager] removeItemAtPath:docPath error:&err];
    }
    NSLog(@"Movie path : %@", docPath);
    return docPath;
}
@end
Please correct me if anything is wrong. Thank you in advance.
ios avfoundation
asked Nov 20 at 16:20, edited Nov 21 at 3:51 · G. Hazarath Reddy (245)
2 Answers
Conceptually, you have two main functional areas: one that generates video frames – the AVCaptureSession and everything attached to it – and one that writes these frames to a file – in your case the AVAssetWriter with its attached inputs.
The problem with your code is that there is no connection between these two: no video frames/images coming out of the capture session are passed to the asset writer inputs.
Furthermore, the AVCaptureStillImageOutput method -captureStillImageAsynchronouslyFromConnection:completionHandler: is never called, so the capture session actually produces no frames.
So, as a minimum, implement something like this:
-(IBAction)captureStillImageAndAppend:(id)sender
{
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageOutput.connections.firstObject
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
    {
        // check error, omitted here
        if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
            [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(imageDataSampleBuffer))];
        [writerInput appendSampleBuffer:imageDataSampleBuffer];
    }];
}
Remove the AVAssetWriterInputPixelBufferAdaptor; it's not used.
But there are issues with AVCaptureStillImageOutput:
- it's only intended to produce still images, not videos
- it must be configured to produce uncompressed sample buffers if the asset writer input is configured to compress the appended sample buffers (stillImageOutput.outputSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };)
- it's deprecated under iOS
If you actually want to produce a video, as opposed to a sequence of still images, then instead of the AVCaptureStillImageOutput add an AVCaptureVideoDataOutput to the capture session. It needs a delegate and a serial dispatch queue to output the sample buffers. The delegate has to implement something like this:
-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (CMTIME_IS_INVALID(startTime)) // startTime is an ivar
        [assetWriter startSessionAtSourceTime:(startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer))];
    [writerInput appendSampleBuffer:sampleBuffer];
}
Note that:
- you will want to make sure that the AVCaptureVideoDataOutput only outputs frames when you're actually recording; add/remove it from the capture session, or enable/disable its connection, in the startStopVideoRecording action (see the setup sketch below)
- reset the startTime to kCMTimeInvalid before starting another recording
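For illustration, here is a minimal setup sketch for such an output; the videoDataOutput and sampleBufferQueue ivar names are assumptions, not from this answer:
videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
// Deliver uncompressed buffers; the writer input does the compression
videoDataOutput.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey:
                                       @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
// Sample buffers are delivered on a serial queue
sampleBufferQueue = dispatch_queue_create("sampleBufferQueue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:sampleBufferQueue];
if ([captureSession canAddOutput:videoDataOutput]) {
    [captureSession addOutput:videoDataOutput];
}
// Keep the connection disabled until recording starts; enable it (and reset
// startTime to kCMTimeInvalid) in the startStopVideoRecording: action.
videoDataOutput.connections.firstObject.enabled = NO;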
answered Nov 21 at 22:12 · NoHalfBits (new contributor)
Thank you for the response. It works well and the video is saved into the Documents folder, but it automatically stops calling didOutputSampleBuffer after 2 seconds and starts calling didDropSampleBuffer. How can I find the cause of that issue and fix it?
– G. Hazarath Reddy, Nov 22 at 9:24
Have a look at the documentation for -captureOutput:didDropSampleBuffer:fromConnection:. It explains how to extract the reason from the passed-in sample buffer via the kCMSampleBufferAttachmentKey_DroppedFrameReason attachment.
– NoHalfBits, Nov 22 at 12:16
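For reference, extracting that reason looks roughly like this (a sketch, not code from this thread):
-(void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // The reason is attached to the dropped sample buffer as a CFString
    CFTypeRef reason = CMGetAttachment(sampleBuffer, kCMSampleBufferAttachmentKey_DroppedFrameReason, NULL);
    NSLog(@"Dropped frame, reason: %@", (__bridge id)reason);
}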
I was able to find the reason: it always returns kCMSampleBufferDroppedFrameReason_OutOfBuffers after 2 seconds, and recording automatically stops. I couldn't find a way to overcome this. Can you suggest a solution?
– G. Hazarath Reddy, 2 days ago
The most obvious reason for this would be retaining/keeping a permanent reference to the sample buffer passed to didOutputSampleBuffer. Do you do anything beyond appending it to the input as shown above?
– NoHalfBits, 2 days ago
A more subtle reason would be the asset writer not being able to handle (compress) the passed-in buffers fast enough, e.g. when the CPU or GPU are under load. You did set the expectsMediaDataInRealTime of the writer input to YES, right? Otherwise, you may have to adjust the output settings; note that AVCaptureVideoDataOutput has two -recommended... methods which will provide optimized settings.
– NoHalfBits, 2 days ago
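For the record, one such method is recommendedVideoSettingsForAssetWriterWithOutputFileType:. A sketch of using it when creating the writer input (videoDataOutput is the assumed ivar from the setup sketch above):
NSDictionary *settings = [videoDataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie];
writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];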
You don't say what actually goes wrong, but two things look wrong with your code:
docPath = [[docPath stringByAppendingPathComponent:@"Movie"] stringByAppendingString:@".mov"];
looks like it creates an undesired path like @"/path/Movie/.mov", when you want this:
docPath = [docPath stringByAppendingPathComponent:@"Movie.mov"];
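As a side note (a sketch, not part of the original answer): a URL-based equivalent avoids the string plumbing entirely, assuming the docPathUrl ivar from the question:
NSURL *docsURL = [[NSFileManager defaultManager] URLForDirectory:NSDocumentDirectory
                                                        inDomain:NSUserDomainMask
                                               appropriateForURL:nil
                                                          create:YES
                                                           error:NULL];
docPathUrl = [docsURL URLByAppendingPathComponent:@"Movie.mov"];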
And your timeline is wrong: your asset writer starts at time 0, but the sample buffers start at CMSampleBufferGetPresentationTimeStamp(sampleBuffer) > 0, so do this instead:
-(void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (firstSampleBuffer) {
        // Start the writer's session at the first buffer's timestamp instead of kCMTimeZero
        [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        firstSampleBuffer = NO; // so the session is only started once
    }
    [writerInput appendSampleBuffer:sampleBuffer];
}
answered Nov 20 at 21:36 · Rhythmic Fistman
I don't have any issue with the document path. I think didOutputSampleBuffer is used with AVCaptureVideoOutput; that's why it isn't called, since we are using AVAssetWriter. We provide docPath to the asset writer to save the video, and we just call start and finish on the asset writer. I don't have clarity on this; can you give me a clearer picture?
– G. Hazarath Reddy, Nov 21 at 3:39