
Google details ML behind Pixel 4’s ‘firm press’ and why it avoided ‘force press’

The March Feature Drop introduced a slew of features like Cards & Passes, play/pause with Motion Sense, and dark theme scheduling. The Pixel 4 picked up support for a firm press gesture that serves as a quicker alternative to long pressing. Google AI today detailed the machine learning work behind it.

Google starts by arguing that the 400-500 millisecond wait associated with a long press has “negative effects for usability and discoverability as the lack of immediate feedback disconnects the user’s action from the system’s response.” However, the team recognizes that hardware-based approaches are “expensive to design and integrate,” as evidenced by the Apple Watch Series 6 likely dropping Force Touch following design changes in watchOS 7 this week.

When a user’s finger presses against a surface, its soft tissue deforms and spreads out. The nature of this spread depends on the size and shape of the user’s finger, and its angle to the screen. At a high level, we can observe a couple of key features in this spread (shown in the figures): it is asymmetric around the initial contact point, and the overall center of mass shifts along the axis of the finger.
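The center-of-mass shift described above is straightforward to compute from a capacitive touch heatmap. Here's a minimal sketch in plain Java; the grid representation and sizes are invented for illustration, and real sensor frames have a different resolution and format:

```java
// Sketch: measuring the center-of-mass shift of a touch contact across
// two capacitive heatmap frames. The 2D grid of intensities is a stand-in
// for real touch-sensor data.
public class TouchCentroid {
    // Intensity-weighted centroid (row, col) of a 2D grid.
    static double[] centroid(double[][] grid) {
        double sum = 0, r = 0, c = 0;
        for (int i = 0; i < grid.length; i++) {
            for (int j = 0; j < grid[i].length; j++) {
                sum += grid[i][j];
                r += i * grid[i][j];
                c += j * grid[i][j];
            }
        }
        return new double[] { r / sum, c / sum };
    }

    // Euclidean distance between the centroids of two frames; a firm
    // press tends to shift this along the axis of the finger.
    static double shift(double[][] before, double[][] after) {
        double[] a = centroid(before), b = centroid(after);
        return Math.hypot(a[0] - b[0], a[1] - b[1]);
    }
}
```

As the post notes, this per-user variability is exactly why a single hand-tuned threshold on such features falls short.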

Because finger size, shape, and contact angle vary from person to person, Google could not “encode these observations with heuristic rules” and turned to a machine learning approach instead.

The network was intentionally kept simple to minimize on-device inference costs when running concurrently with other applications (taking approximately 50 µs of processing per frame and less than 1 MB of memory using TensorFlow Lite).
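Google doesn't publish the architecture, but a model with a microsecond-scale, sub-megabyte footprint is necessarily tiny; something on the order of a few small dense layers. As a generic illustration (not Google's actual model), here is a hand-rolled ReLU layer feeding a sigmoid firm-press score in plain Java; the layer sizes, weights, and activations are all assumptions:

```java
// Generic sketch of a tiny classifier forward pass: one ReLU dense layer
// and a sigmoid output producing a "firm press" probability per frame.
// With a handful of units, this is only a few hundred multiply-adds,
// which is why per-frame inference cost can stay negligible.
public class TinyNet {
    // Dense layer with ReLU: out_i = max(0, sum_j w[i][j] * x[j] + b[i])
    static double[] dense(double[][] w, double[] b, double[] x) {
        double[] out = new double[b.length];
        for (int i = 0; i < b.length; i++) {
            double acc = b[i];
            for (int j = 0; j < x.length; j++) acc += w[i][j] * x[j];
            out[i] = Math.max(0, acc);
        }
        return out;
    }

    // Sigmoid output unit over the hidden activations.
    static double score(double[] hidden, double[] wOut, double bOut) {
        double acc = bOut;
        for (int j = 0; j < hidden.length; j++) acc += wOut[j] * hidden[j];
        return 1.0 / (1.0 + Math.exp(-acc));
    }
}
```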

Meanwhile, in deciding how to implement the underlying technology, UX research found that it was “hard for users to discover force-based interactions and that users frequently confused a force press with a long press because of the difficulty in coordinating the amount of force they were applying with the duration of their contact.”

This informed Google’s decision not to create an entirely “new interaction modality based on force.” On the Pixel 4, firm press detection instead serves to speed up long presses. This also means that existing apps can take advantage of the improvement without any developer updates.

Applications that use Android’s GestureDetector or View APIs will automatically get these press signals through their existing long-press handlers. Developers who implement custom long-press detection logic can receive these press signals through the MotionEvent classification API introduced in Android Q.
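For custom detectors, the relevant Android Q (API 29) call is `MotionEvent#getClassification()`, which can return `MotionEvent.CLASSIFICATION_DEEP_PRESS`. A minimal sketch of how such logic could consume the signal; a stub with the documented constant values stands in for Android's real `MotionEvent` so the example is self-contained off-device:

```java
public class FirmPressHandler {
    // Stub mirroring android.view.MotionEvent's documented classification
    // constants; on device you would use the real framework class.
    static class MotionEvent {
        static final int CLASSIFICATION_NONE = 0;
        static final int CLASSIFICATION_AMBIGUOUS_GESTURE = 1;
        static final int CLASSIFICATION_DEEP_PRESS = 2;
        private final int classification;
        MotionEvent(int c) { classification = c; }
        int getClassification() { return classification; }
    }

    // A custom long-press detector can fire its action immediately on a
    // deep press rather than waiting out the full long-press timeout.
    static boolean shouldTriggerLongPressNow(MotionEvent ev) {
        return ev.getClassification() == MotionEvent.CLASSIFICATION_DEEP_PRESS;
    }
}
```

In a real view, this check would run in `onTouchEvent` alongside the usual timeout-based fallback, since not every long press produces a deep-press classification.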

A user long pressing (left) and firmly pressing (right) on a launcher icon.





Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet.