ios – How to resize UIImage without compromising the image quality?


I am converting the image buffer (CVPixelBuffer) to UIImage and, in doing so, I am changing the orientation as well as the size.

let image = CIImage(cvImageBuffer: imageBuffer)
let imageSize = CGSize(width: CVPixelBufferGetWidth(imageBuffer), height: CVPixelBufferGetHeight(imageBuffer))
let normalizeTransform = CGAffineTransform(scaleX: 1.0 / imageSize.width, y: 1.0 / imageSize.height)
let flipTransform = orientation.isPortrait ? CGAffineTransform(scaleX: -1, y: -1).translatedBy(x: -1, y: -1) : .identity
let viewPortSize: CGSize = viewPort.size
let displayTransform: CGAffineTransform = arFrame.displayTransform(for: orientation, viewportSize: CGSize(width: viewPortSize.width, height: viewPortSize.height))

let scaleX: CGFloat = viewPortSize.width
let scaleY: CGFloat = viewPortSize.height
let viewPortTransform = CGAffineTransform(scaleX: scaleX, y: scaleY)

let scaledImage: CIImage = image
    .transformed(by: normalizeTransform
        .concatenating(flipTransform)
        .concatenating(displayTransform)
        .concatenating(viewPortTransform)
    )
    .cropped(to: viewPort)

guard let uiImage: UIImage = self.convert(cmage: scaledImage) else {
    return nil
}
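For reference, the `convert(cmage:)` helper isn't shown in the question; presumably it is a CIImage-to-UIImage conversion along these lines (a sketch, not the asker's actual implementation):

```swift
import UIKit
import CoreImage

// Reuse a single CIContext: contexts are expensive to create, and creating
// one per frame would hurt performance in an ARKit capture loop.
private let ciContext = CIContext()

// Hypothetical body of the `convert(cmage:)` helper used above. Rendering
// through a CIContext-backed CGImage materializes the pixels immediately,
// whereas UIImage(ciImage:) defers rendering to an unspecified later point.
func convert(cmage: CIImage) -> UIImage? {
    guard let cgImage = ciContext.createCGImage(cmage, from: cmage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```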

(arFrame is an ARFrame from ARKit, and displayTransform(for:viewportSize:) creates the CGAffineTransform for mapping a normalized image onto the viewport.)

The breakdown of the above code is something like this:

  1. Scale down the image to normalize the coordinates.
  2. Flip the image according to the orientation (an ARKit quirk).
  3. Transform the image as needed for rendering the camera image onscreen.
  4. Scale the image up to fit the screen viewport.

One problem I am facing is that since I am scaling the image down in #1 and blowing it back up in #4, the image quality seems to be severely impacted. #1 has to come before #4, and they cannot be combined, since #3 has to take in a normalized image.
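Note that all four steps are plain affine transforms, so their concatenation collapses into a single matrix before Core Image ever resamples; the scale-down in #1 and scale-up in #4 are applied together in one pass, not as two lossy resamples. Inspecting the net scale of the combined matrix can tell you whether the pipeline down-samples overall (a sketch; the buffer and viewport sizes below are assumed example values):

```swift
import CoreGraphics

// Example sizes standing in for the real values in the snippet above:
// a 1920x1440 capture buffer and a 390x844-point viewport.
let imageSize = CGSize(width: 1920, height: 1440)
let viewPortSize = CGSize(width: 390, height: 844)

// Step #1: scale down to normalized [0, 1] coordinates.
let normalizeTransform = CGAffineTransform(scaleX: 1.0 / imageSize.width,
                                           y: 1.0 / imageSize.height)
// Step #4: scale back up to viewport coordinates. (Steps #2 and #3 are
// omitted here; flips and the ARKit display transform do not change the
// overall magnitude of the scale in this illustration.)
let viewPortTransform = CGAffineTransform(scaleX: viewPortSize.width,
                                          y: viewPortSize.height)

// The concatenation is a single matrix: its `a` and `d` entries are the
// net horizontal and vertical scale factors.
let combined = normalizeTransform.concatenating(viewPortTransform)
print(combined.a, combined.d)  // 390/1920 ≈ 0.203, 844/1440 ≈ 0.586
```

Since the net scale here is well below 1, one assumed cause of the perceived softness is rendering at viewport *points* rather than *pixels*; multiplying the viewport size by the screen scale (e.g. `UIScreen.main.scale`) before building `viewPortTransform` would be worth trying.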
