I have a UIImageView in which I draw handwritten text, using UIGraphicsBeginImageContext to create the bitmap image.
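For context, here is a minimal sketch of how the bitmap is produced; the helper name and exact drawing code are illustrative, not my actual code:

import UIKit

// Illustrative sketch: render the handwritten strokes shown in the
// UIImageView into a bitmap context and capture the result as a UIImage.
func renderDrawing(from imageView: UIImageView) -> UIImage? {
    UIGraphicsBeginImageContext(imageView.bounds.size)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    // Draw the view's layer (the handwriting) into the bitmap context.
    imageView.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}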
I pass this image to an OCR func:
import UIKit
import Vision

func ocrText(onImage: UIImage?) {
    // Build the text-recognition request; the completion handler receives the observations.
    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            fatalError("Received invalid observations")
        }
        print("observations", observations.count) // count is 0
        for observation in observations {
            if observation.topCandidates(1).isEmpty {
                continue
            }
        }
    } // end of request handler

    request.recognitionLanguages = ["fr"]
    let requests = [request]

    DispatchQueue.global(qos: .userInitiated).async {
        let ocrGroup = DispatchGroup()
        guard let img = onImage?.cgImage else { return }

        ocrGroup.enter()
        let handler = VNImageRequestHandler(cgImage: img, options: [:])
        try? handler.perform(requests)
        ocrGroup.leave()

        ocrGroup.wait()
    }
}
The problem is that observations is an empty array.
But if I save the UIImage to the photo album:
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
then read the image back from the album with an image picker and pass it to ocrText, recognition works.
So it seems the image format (or metadata?) changes when it is saved to the album, and that VNRecognizeTextRequest needs that data.
Is there a way to change the original bitmap image format directly, without going through the photo album?
Or am I missing something in the use of VNRecognizeTextRequest?
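By "change the format directly" I mean something like redrawing the image into a fresh, opaque bitmap context, for example (untested sketch, the helper name is just illustrative):

import UIKit

// Untested idea: redraw the original UIImage into a new opaque context so the
// resulting cgImage has a plain RGB backing, similar to an image coming back
// from the photo album.
func normalized(_ image: UIImage) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = image.scale
    format.opaque = true // drop alpha, flatten onto a white background
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { context in
        UIColor.white.setFill()
        context.fill(CGRect(origin: .zero, size: image.size))
        image.draw(in: CGRect(origin: .zero, size: image.size))
    }
}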