On-device processing key to iPadOS Scribble’s success, hints Apple SVP Craig Federighi


Apple’s handwriting recognition for the Apple Pencil relies on analyzing individual strokes, an interview with Craig Federighi reveals, while features such as iPadOS’ Scribble depend on a large amount of on-device machine learning processing.

Introduced as part of iPadOS 14, Scribble enables users to fill out text fields and forms using the Apple Pencil, without needing to type anything out. It accomplishes this by processing handwriting entirely on the device rather than in the cloud, and by using machine learning to improve its accuracy.
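From a developer’s perspective, standard text fields pick up Scribble automatically on iPadOS 14, and UIKit also exposes a UIScribbleInteraction API for observing or restricting handwriting input. A minimal sketch follows; the view controller and field names are illustrative, not part of any Apple sample:

```swift
import UIKit

// Minimal sketch: UITextField accepts Scribble input automatically on
// iPadOS 14; adding a UIScribbleInteraction lets the app observe or
// limit when Pencil handwriting begins.
class NotesViewController: UIViewController, UIScribbleInteractionDelegate {
    let titleField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        titleField.borderStyle = .roundedRect
        titleField.frame = CGRect(x: 20, y: 80, width: 320, height: 44)
        view.addSubview(titleField)

        // Attach the interaction so the delegate callbacks below fire.
        let scribble = UIScribbleInteraction(delegate: self)
        titleField.addInteraction(scribble)
    }

    // Decide whether Pencil handwriting should start at this location.
    func scribbleInteraction(_ interaction: UIScribbleInteraction,
                             shouldBeginAt location: CGPoint) -> Bool {
        return true
    }

    func scribbleInteractionWillBeginWriting(_ interaction: UIScribbleInteraction) {
        print("Scribble started")
    }

    func scribbleInteractionDidFinishWriting(_ interaction: UIScribbleInteraction) {
        print("Scribble finished; recognized text was inserted into the field")
    }
}
```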

Speaking to Popular Mechanics, Apple SVP of software engineering Craig Federighi explains how the Apple Pencil’s handwriting recognition was developed. It all started with data gathering: asking people around the world to write things down.

“We give them a Pencil, and we have them write fast, we have them write slow, write at a tilt. All of this variation,” said Federighi. “If you understand the strokes and how the strokes went down, that can be used to disambiguate what was being written.”

Combining stroke-based recognition with character and word prediction demands a great deal of processing. Because responsiveness is essential, cloud-based handwriting recognition was ruled out, pushing Apple toward an entirely on-device approach.
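Apple has not published the recognizer itself, but the idea Federighi describes can be sketched roughly: stroke analysis produces several candidate characters for each position, and word-level prediction picks the sequence most likely to be real text. The sketch below is purely illustrative; the types, scores, and word table are hypothetical and not Apple’s API:

```swift
import Foundation

// Purely illustrative: combine stroke-level candidates with word-level
// prediction. None of these types or scores come from Apple's API.
struct CharCandidate {
    let character: Character
    let strokeScore: Double   // how well the strokes match this character
}

// Hypothetical word frequencies standing in for a real language model.
let wordLikelihood: [String: Double] = ["cat": 0.9, "cot": 0.4]

/// Pick the word whose combined stroke evidence and language-model
/// likelihood is highest.
func disambiguate(_ candidatesPerPosition: [[CharCandidate]]) -> String {
    var best = (word: "", score: -Double.infinity)

    // Enumerate every combination of per-position candidates (fine for a
    // sketch; a real recognizer would use beam search instead).
    func expand(_ index: Int, _ prefix: String, _ score: Double) {
        if index == candidatesPerPosition.count {
            let total = score + log(wordLikelihood[prefix] ?? 1e-6)
            if total > best.score { best = (prefix, total) }
            return
        }
        for c in candidatesPerPosition[index] {
            expand(index + 1, prefix + String(c.character), score + log(c.strokeScore))
        }
    }
    expand(0, "", 0)
    return best.word
}

// Strokes that could be read as either "a" or "o" in the middle position.
let result = disambiguate([
    [CharCandidate(character: "c", strokeScore: 0.95)],
    [CharCandidate(character: "a", strokeScore: 0.55),
     CharCandidate(character: "o", strokeScore: 0.50)],
    [CharCandidate(character: "t", strokeScore: 0.90)]
])
print(result) // "cat" – word prediction breaks the near-tie between "a" and "o"
```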

Apple’s expertise in chip design has led to the new iPad Air 4 shipping with the A14 Bionic, Apple’s fastest self-designed SoC. It packs 11.8 billion transistors, a 6-core CPU, a new 4-core graphics architecture, and a 16-core Neural Engine capable of up to 11 trillion operations per second. Apple has also added CPU-based machine learning accelerators, which make machine learning tasks run up to 10 times faster.
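For third-party developers, whether a model runs on the Neural Engine, the GPU, or the CPU’s accelerators is largely a Core ML scheduling decision. A minimal sketch, assuming a compiled model named Recognizer.mlmodelc bundled with the app (the model name is hypothetical):

```swift
import CoreML

// Minimal sketch: ask Core ML to use every available compute unit, which
// lets it schedule work onto the Neural Engine where possible.
let config = MLModelConfiguration()
config.computeUnits = .all   // CPU, GPU, and Neural Engine

do {
    // "Recognizer" is a hypothetical bundled model, not a real Apple model.
    guard let url = Bundle.main.url(forResource: "Recognizer",
                                    withExtension: "mlmodelc") else {
        fatalError("Model not bundled")
    }
    let model = try MLModel(contentsOf: url, configuration: config)
    print("Loaded model: \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}
```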


