I've been playing around with QTCaptureView to grab input from my iSight, and I want to run some Core Image filters over the frames it receives. I first attempted this with HotCocoa and got everything working, except that the performance was significantly slower than the same application written in Objective-C.
I rewrote the thing directly in MacRuby to check whether there was any overhead in the HotCocoa libraries that could account for the slowness. Unfortunately the problem still existed. Watching it run, I noticed the application eats memory rather quickly, then releases much of it all at once, and the cycle begins again.
The same application in Objective-C runs with steady memory use.
def awakeFromNib
  session = QTCaptureSession.alloc.init

  # Open the default video device (the built-in iSight).
  device = QTCaptureDevice.defaultInputDeviceWithMediaType(QTMediaTypeVideo)
  device.open(nil)

  # Wire the device into the capture session.
  input = QTCaptureDeviceInput.alloc.initWithDevice(device)
  success = session.addInput(input, error:nil)

  # Build the Core Image filter once; it's reused for every frame.
  @filter = CIFilter.filterWithName("CICrystallize")
  @filter.setDefaults
  @filter.setValue(5, forKey:"inputRadius")

  captureView.setCaptureSession(session)
  captureView.setDelegate(self)
  session.startRunning
end
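(In case it matters: the snippet passes nil for every error argument, so failures would be silent. My understanding is that MacRuby uses its Pointer class for NSError** out-parameters, so checking them would look roughly like this; treat it as a sketch, not something I've battle-tested:)

  # Sketch: surface the NSErrors the code above ignores.
  err = Pointer.new(:object)   # stands in for an NSError**
  unless device.open(err)
    puts "Couldn't open device: #{err[0]}"
  end
  unless session.addInput(input, error:err)
    puts "Couldn't add input: #{err[0]}"
  end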
def view(v, willDisplayImage:image)
  # Whatever this method returns is what the QTCaptureView actually draws,
  # so hand back the filtered CIImage.
  @filter.setValue(image, forKey:"inputImage")
  @filter.valueForKey("outputImage")
end
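For reference, when I say below that I removed all the code from the delegate method, I mean reducing it to a pass-through, roughly:

def view(v, willDisplayImage:image)
  image  # hand the frame back untouched; no filtering at all
end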
If you remove the setDelegate call things run just fine, but then you don't get the filtering. If you leave the delegate set but strip all the code out of the delegate method, the memory growth still occurs, yet the video runs just as smoothly as the straight Objective-C version, so I'm not sure whether the slowness is tied to the memory growth or whether they are separate issues. I've also seen the memory grow uncontrollably to around a gigabyte before my system started thrashing; about a minute later it recovered.
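One guess I can't confirm: MacRuby objects live in the Objective-C garbage collector, so the grow-then-drop pattern could just be the collector lagging behind the per-frame allocations. A possible experiment would be to nudge the collector from the delegate and watch whether the sawtooth flattens:

def view(v, willDisplayImage:image)
  @filter.setValue(image, forKey:"inputImage")
  output = @filter.valueForKey("outputImage")
  # Experiment only: ask the ObjC collector to run if it thinks it should.
  NSGarbageCollector.defaultCollector.collectIfNeeded
  output
end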
I'd love to fix this problem, but I'm out of ideas on where to look. If anyone has any suggestions on where I could dig around, I'm all ears.
Thanks,
Adam Elliot