I am converting the pixel buffer (CVPixelBuffer) to a UIImage and, in doing so, I am changing the orientation as well as the dimensions.
let picture = CIImage(cvImageBuffer: imageBuffer)
let imageSize = CGSize(width: CVPixelBufferGetWidth(imageBuffer), height: CVPixelBufferGetHeight(imageBuffer))

// #1: scale the image down into normalized (0...1) coordinates.
let normalizeTransform = CGAffineTransform(scaleX: 1.0 / imageSize.width, y: 1.0 / imageSize.height)

// #2: flip the normalized image when the interface is in portrait.
let flipTransform = orientation.isPortrait ? CGAffineTransform(scaleX: -1, y: -1).translatedBy(x: -1, y: -1) : .identity

// #3: map normalized image coordinates to normalized viewport coordinates.
let viewPortSize: CGSize = viewPort.size
let displayTransform: CGAffineTransform = arFrame.displayTransform(for: orientation, viewportSize: CGSize(width: viewPortSize.width, height: viewPortSize.height))

// #4: scale the normalized result back up to viewport points.
let scaleX: CGFloat = viewPortSize.width
let scaleY: CGFloat = viewPortSize.height
let viewPortTransform = CGAffineTransform(scaleX: scaleX, y: scaleY)

let scaledImage: CIImage = picture
    .transformed(by: normalizeTransform
        .concatenating(flipTransform)
        .concatenating(displayTransform)
        .concatenating(viewPortTransform)
    )
    .cropped(to: viewPort)

guard let uiImage: UIImage = self.convert(cmage: scaledImage) else {
    return nil
}
(arFrame is the ARFrame from ARKit, and displayTransform(for:viewportSize:) creates the CGAffineTransform for transforming a normalized image to the viewport.)
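In case it helps, this is roughly where the inputs come from (the names and flow here are mine, the details may vary):

import ARKit
import UIKit

// Hypothetical setup: gather the inputs used by the conversion code above
// from an ARSCNView and its current session frame.
func conversionInputs(from arView: ARSCNView) -> (CVPixelBuffer, ARFrame, UIInterfaceOrientation, CGRect)? {
    guard let arFrame = arView.session.currentFrame else { return nil }
    let imageBuffer = arFrame.capturedImage                          // camera pixel buffer
    let orientation = arView.window?.windowScene?.interfaceOrientation ?? .portrait
    let viewPort = arView.bounds                                     // on-screen viewport rect, in points
    return (imageBuffer, arFrame, orientation, viewPort)
}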
The breakdown of the conversion code above is something like this:
#1 normalizeTransform scales the image down from pixel dimensions into normalized (0...1) coordinates.
#2 flipTransform mirrors the normalized image when the interface is in portrait.
#3 displayTransform maps the normalized image to the normalized viewport for the current orientation.
#4 viewPortTransform scales the normalized result back up to the viewport's size in points, after which the image is cropped to the viewport.
One problem I am facing is that because I am scaling the image down in #1 and scaling it back up in #4, the image quality appears to be severely degraded. #1 has to come before #4, and the two cannot be combined, since #3 has to take in a normalized image.
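To make the scaling concrete, here is a small illustration with made-up numbers (a 1920x1440 capture buffer shown in a 390x844 viewport, not my exact values):

import CoreGraphics

// Illustrative sizes only, standing in for imageSize and viewPortSize above.
let bufferSize = CGSize(width: 1920, height: 1440)
let portSize = CGSize(width: 390, height: 844)

// #1 shrinks the image into the unit square...
let downscale = CGAffineTransform(scaleX: 1.0 / bufferSize.width, y: 1.0 / bufferSize.height)
// ...and #4 stretches the unit square back up to viewport points.
let upscale = CGAffineTransform(scaleX: portSize.width, y: portSize.height)

// The composed matrix works out to roughly 0.20 x 0.59 here; the point is that
// #1 and #4 cannot simply be merged, because #3 has to see normalized
// coordinates in between.
let net = downscale.concatenating(upscale)
print(net.a, net.d)   // ~0.203 0.586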