Memory warnings and app crashes
Hi bottotl,
Can you please help with the memory warnings and app crashes below?
The app crashes in this case with:
Got memory pressure notification (critical)
2017-11-01 18:50:48.364536+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:51:03.619378+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:03.646973+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:03.688721+0530 Edovi[21650:6351902] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 18:51:21.358407+0530 Edovi[21650:6351901] [GatekeeperXPC] Connection to assetsd was interrupted or assetsd died
2017-11-01 18:51:36.579644+0530 Edovi[21650:6347783] Got memory pressure notification (non-critical)
2017-11-01 18:51:43.767365+0530 Edovi[21650:6347783] System is no longer under (non-critical) memory pressure.
2017-11-01 18:52:15.053472+0530 Edovi[21650:6373407] [MC] Invalidating cache
2017-11-01 18:52:15.315994+0530 Edovi[21650:6351902] [MC] Invalidating cache
2017-11-01 18:53:10.168759+0530 Edovi[21650:6347783] Got memory pressure notification (non-critical)
2017-11-01 18:53:34.551883+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:53:34.748212+0530 Edovi[21650:6364194] [MC] Invalidating cache
2017-11-01 18:54:20.383428+0530 Edovi[21650:6347783] System is no longer under (non-critical) memory pressure.
2017-11-01 18:54:40.657309+0530 Edovi[21650:6388455] [MC] Invalidating cache
2017-11-01 18:54:40.865958+0530 Edovi[21650:6351903] [MC] Invalidating cache
2017-11-01 19:00:03.209629+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.236189+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.257170+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.276643+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.295550+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.316632+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.338264+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.357406+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.376734+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.395740+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.420811+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.440474+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:03.459381+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2017-11-01 19:00:08.752881+0530 Edovi[21650:6351903] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
That is too little information for this problem. Can you tell me how your code works? If you create a CIImage yourself, remember to release it as soon as possible. Are you applying a filter to a high-quality video? Memory and CPU usage problems are hard to solve when you apply a CIFilter to a huge pixel buffer.
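For illustration only (this is not code from this project): a minimal sketch of a per-frame loop where each frame's CIImage lives inside its own @autoreleasepool, so intermediate objects are released before the next frame. The reader output, filter, and destination buffer names are placeholders.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Hypothetical per-frame loop; `output`, `filter`, and `destinationBuffer`
// are placeholders. The point is only that each frame's CIImage is created
// and released inside its own @autoreleasepool.
static void ProcessFrames(AVAssetReaderTrackOutput *output,
                          CIFilter *filter,
                          CIContext *context,
                          CVPixelBufferRef destinationBuffer)
{
    CMSampleBufferRef sampleBuffer = NULL;
    while ((sampleBuffer = [output copyNextSampleBuffer]) != NULL) {
        @autoreleasepool {
            CVPixelBufferRef pixels = CMSampleBufferGetImageBuffer(sampleBuffer);
            if (pixels != NULL) {
                CIImage *frame = [CIImage imageWithCVPixelBuffer:pixels];
                [filter setValue:frame forKey:kCIInputImageKey];
                CIImage *result = filter.outputImage ?: frame; // fall back if the filter fails
                // Rendering into one reused destination buffer is just for brevity.
                [context render:result toCVPixelBuffer:destinationBuffer];
            }
        }
        CFRelease(sampleBuffer); // copyNextSampleBuffer returns a retained buffer
    }
}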
Check these things.
Follow these practices for best performance:
- Don't create a CIContext object every time you render. Contexts store a lot of state information; it's more efficient to reuse them.
- Evaluate whether your app needs color management. Don't use it unless you need it. See Does Your App Need Color Management? https://developer.apple.com/library/content/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_performance/ci_performance.html#//apple_ref/doc/uid/TP30001185-CH10-SW7
- Avoid Core Animation animations while rendering CIImage objects with a GPU context. If you need to use both simultaneously, you can set up both to use the CPU.
- Make sure images don't exceed CPU and GPU limits. Image size limits for CIContext objects differ depending on whether Core Image uses the CPU or GPU. Check the limit by using the methods inputImageMaximumSize https://developer.apple.com/documentation/coreimage/cicontext/1620425-inputimagemaximumsize and outputImageMaximumSize https://developer.apple.com/documentation/coreimage/cicontext/1620335-outputimagemaximumsize.
- Use smaller images when possible. Performance scales with the number of output pixels. You can have Core Image render into a smaller view, texture, or framebuffer and allow Core Animation to upscale to display size. Use Core Graphics or Image I/O functions to crop or downsample, such as CGImageCreateWithImageInRect https://developer.apple.com/documentation/coregraphics/1454683-cgimagecreatewithimageinrect or CGImageSourceCreateThumbnailAtIndex https://developer.apple.com/documentation/imageio/1465099-cgimagesourcecreatethumbnailatin.
- The UIImageView class works best with static images. If your app needs the best performance, use lower-level APIs.
- Avoid unnecessary texture transfers between the CPU and GPU.
- Render to a rectangle that is the same size as the source image before applying a contents scale factor.
- Consider using simpler filters that can produce results similar to algorithmic filters. For example, CIColorCube can produce output similar to CISepiaTone, and do so more efficiently.
- Take advantage of the support for YUV images in iOS 6.0 and later. Camera pixel buffers are natively YUV, but most image processing algorithms expect RGBA data, and there is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform:
options = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
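To illustrate the first and last points above (not code from this project): a sketch of a CIContext that is created once and reused, and reader output settings that request biplanar YUV frames so Core Image handles the color conversion. The `videoTrack` variable is a placeholder.

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// One context for the whole app, created lazily and reused for every render.
static CIContext *SharedCIContext(void)
{
    static CIContext *context;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        context = [CIContext contextWithOptions:nil];
    });
    return context;
}

// Ask the asset reader for biplanar YUV frames so Core Image applies the
// color transform; `videoTrack` stands in for your asset's video track.
static AVAssetReaderTrackOutput *MakeYUVTrackOutput(AVAssetTrack *videoTrack)
{
    NSDictionary *settings = @{
        (id)kCVPixelBufferPixelFormatTypeKey :
            @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    };
    return [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                      outputSettings:settings];
}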
Here is the code:
- (CVPixelBufferRef)finishPassthroughCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request error:(NSError **)errOut {
    @autoreleasepool {
        JFTAVCustomVideoCompositionInstruction *instruction = request.videoCompositionInstruction;
        JFTAVCustomVideoCompositionLayerInstruction *simpleLayerInstruction = instruction.simpleLayerInstructions.firstObject;
        CGSize renderSize = _renderContext.size;
        CVPixelBufferRef pixelBuffer = [_renderContext newPixelBuffer];

        // No source tracks: render a black frame of the render size.
        if (!request.sourceTrackIDs.count) {
            NSLog(@"request.sourceTrackIDs is empty");
            CIImage *emptyImage = [CIImage imageWithColor:[CIColor colorWithCGColor:[UIColor blackColor].CGColor]];
            emptyImage = [emptyImage imageByCroppingToRect:CGRectMake(0, 0, renderSize.width, renderSize.height)];
            [_ciContext render:emptyImage toCVPixelBuffer:pixelBuffer];
            return pixelBuffer;
        }

        CMPersistentTrackID trackID = simpleLayerInstruction ? simpleLayerInstruction.trackID : request.sourceTrackIDs[0].intValue;
        CVPixelBufferRef sourcePixels = [request sourceFrameByTrackID:trackID];
        if (!sourcePixels) {
            CVPixelBufferRelease(pixelBuffer); // avoid leaking the buffer returned by newPixelBuffer
            return nil;
        }

        CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:sourcePixels];
        if (simpleLayerInstruction) {
            sourceImage = [sourceImage imageByApplyingTransform:[self transformFix:simpleLayerInstruction.transform extent:sourceImage.extent]];
        }
        if (simpleLayerInstruction.videoItem.filter) {
            [simpleLayerInstruction.videoItem.filter setValue:sourceImage forKey:kCIInputImageKey];
            sourceImage = simpleLayerInstruction.videoItem.filter.outputImage;
        }

        [_ciContext render:sourceImage toCVPixelBuffer:pixelBuffer];
        if (!pixelBuffer && errOut) {
            *errOut = [NSError errorWithDomain:@"finishPassthroughCompositionRequest unknown error"
                                          code:1000
                                      userInfo:nil];
        }
        return pixelBuffer;
    }
}
The code you included in your reply email has some problems:
CIContext *context = [CIContext contextWithOptions:nil]; // a new CIContext is created on every call; contexts are expensive and should be reused
CIImage *inputCIImage = [CIImage imageWithCGImage:[image CGImage]];
[filter setValue:inputCIImage forKeyPath:kCIInputImageKey];
CIImage *outputImage = [filter outputImage];
// the CGImage/UIImage round trip below copies the frame out of the GPU pipeline
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *img = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
return img;
It is like playing back a video by doing the following:
- decode the video --> get a UIImage
- apply the filter to the UIImage
- show the UIImage to the user
This does not look like a good approach. Use CVPixelBufferRef instead of UIImage (see the sketch below).
Using AVFoundation for playback is very simple.
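A rough sketch of that CVPixelBufferRef-based path, using only placeholder names: the frame never becomes a UIImage, and the CIContext is created once elsewhere and passed in rather than built per frame.

#import <CoreImage/CoreImage.h>
#import <CoreVideo/CoreVideo.h>

// Placeholder function: source and destination stay CVPixelBufferRefs, and
// the CIContext is created once elsewhere and reused.
static void RenderFilteredFrame(CIContext *context,
                                CIFilter *filter,
                                CVPixelBufferRef sourceBuffer,
                                CVPixelBufferRef destinationBuffer)
{
    @autoreleasepool {
        CIImage *input = [CIImage imageWithCVPixelBuffer:sourceBuffer];
        [filter setValue:input forKey:kCIInputImageKey];
        CIImage *output = filter.outputImage ?: input; // fall back if the filter produces nothing
        // Render straight into the destination buffer; no CGImage or UIImage is created.
        [context render:output toCVPixelBuffer:destinationBuffer];
    }
}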
20 minutes… I have never tried a video like that before. I may do some tests tonight. Any details about this video's media info might help me figure out a way to solve the problem.
On Nov 2, 2017, at 1:06 PM, 5ysourcesafe [email protected] wrote:
Hi bottotl, thanks for providing the information. But now I am using your sample code to add filters to a 20-minute (individual) video.
Thanks for your response, bottotl.
I am trying to merge multiple videos with different filters and transitions applied to the individual videos the user adds to a project. But when I try to export a video of 20 to 30 minutes, the app crashes even though its own memory usage is only 60 to 70 MB, while the other process uses around 700 to 800 MB. Can you please help with this? I am new to the AVFoundation framework.
Is it possible to export a video of unlimited length?
Are you using JFTAVAssetExportSession for export?
Yes, I am using JFTAVAssetExportSession.
OK, I will look at this later.
If you are in a hurry, you can try using AVAssetExportSession instead of this class.
AVAssetExportSession doesn't offer enough output size choices, which is why I wrote this simple class. I've known for a long time that it is not completely reliable…
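If you go that route, a minimal sketch of the AVAssetExportSession fallback might look like this; the composition, video composition, and output URL are assumed to already exist in your project, and note that the system presets only give you fixed quality choices rather than an exact output size.

#import <AVFoundation/AVFoundation.h>

// `composition`, `videoComposition`, and `outputURL` are placeholders for
// objects your project already builds.
static void ExportWithSystemSession(AVAsset *composition,
                                    AVVideoComposition *videoComposition,
                                    NSURL *outputURL,
                                    void (^completion)(AVAssetExportSession *session))
{
    AVAssetExportSession *session =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetHighestQuality];
    session.outputURL = outputURL;
    session.outputFileType = AVFileTypeMPEG4;
    session.videoComposition = videoComposition; // keeps your custom compositor/filters
    [session exportAsynchronouslyWithCompletionHandler:^{
        completion(session); // check session.status / session.error here
    }];
}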
It works fine when I export a video of about 2 GB on my iPhone 7 running iOS 11.1: Instruments shows no leaks and the app does not crash. I did find a small leak, but it is not very serious; I will fix it this weekend.
There is a huge leak problem when using CIContext in the Simulator. I found that others have the same problem; it looks like an Apple bug that has remained unfixed since iOS 9. I will try to figure out whether there is any way to prevent it.
@5ysourcesafe
Considering that AVAssetReader may post too many frames to the compositor, there are two ways to solve this problem:
- slow down the posting operation.
- stop rendering until objects have been released.
I added some protection for when a memory warning happens: the custom compositor observes UIApplicationDidReceiveMemoryWarningNotification, and when it receives the warning it lets the render queue sleep for 1 second (sketched below).
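A sketch of that protection, assuming the compositor owns a serial render queue; the `renderQueue` parameter is a placeholder, not the actual AVBuilder property.

#import <UIKit/UIKit.h>

// `renderQueue` stands in for the compositor's serial rendering queue.
// Returns the observer token so the caller can remove it later.
static id<NSObject> InstallMemoryWarningBrake(dispatch_queue_t renderQueue)
{
    return [[NSNotificationCenter defaultCenter]
        addObserverForName:UIApplicationDidReceiveMemoryWarningNotification
                    object:nil
                     queue:nil
                usingBlock:^(NSNotification *note) {
        // Park the render queue for a second so frames already in flight
        // can be released before new composition requests are handled.
        dispatch_async(renderQueue, ^{
            [NSThread sleepForTimeInterval:1.0];
        });
    }];
}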
My co-worker tells me that, since CIContext does have memory issues (inside its black box), using OpenGL to apply the filter to the image can solve the memory problem. But I'm not familiar with OpenGL; I will give it a try.