The latest “feature drop” for Google’s Pixel line of Android phones includes the ability to “firmly press” on the screen “to get more help from your apps more quickly.” If that sounds familiar, it’s because it’s a lot like the iPhone’s 3D Touch, which Apple dropped from all of its 2019 iPhones. The Verge reports: “Firmly press” sets off alarm bells because it sounds a lot like the iPhone’s 3D Touch, which enables different actions depending on how hard you press on the touchscreen. It was a beloved feature for some people because it gave faster access to the cursor mode on the iPhone’s keyboard (I think long-pressing the space bar works fine for that, but I get that people love it). It’s also gone on the latest versions of the iPhone — Apple has seemingly abandoned it because the hardware to support it was too expensive/thick/complex/finicky/whatever. But now, it seems that Google has done the same thing for the touchscreen that it does with the camera: use its software algorithms to make commodity parts do something special. That is a very Googley thing to do, but not quite as Googley as the fact that there was virtually no information about this feature to be found anywhere on the internet beyond a speculative note over at XDA Developers.
After a few hours of back and forth, I finally got more details from Google. Here’s what this feature does, according to Google: “Long Press currently works in a select set of apps and system user interfaces such as the app Launcher, Photos, and Drive. This update accelerates the press to bring up more options faster. We also plan to expand its applications to more first party apps in the near future.” Essentially, this new feature lets you press harder to bring up long-press menus faster. In fact, Google’s documentation for Android’s Deep Press API explicitly says it should never do a new thing, it should only be a faster way to execute a long press. The answer to why it only works in certain apps is that a lot of Android developers aren’t using standard APIs for long press actions. Because Android. Okay, but how does it work? It turns out my hunch was correct: Google has figured out how to use machine learning algorithms to detect a firm press, something Apple had to use hardware for.
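For the curious, the signal does surface in a real Android API: since API level 29, MotionEvent.getClassification() can return MotionEvent.CLASSIFICATION_DEEP_PRESS when the system’s classifier judges a touch to be a firm press. The sketch below is a simplified, platform-free model of the dispatch rule described above (deep press fires the long-press action immediately; an ordinary touch waits out the timeout) — the enum, constant, and function names here are illustrative stand-ins, not the framework’s own:

```kotlin
// Simplified model of deep-press-accelerated long press. On real Android
// (API 29+) the classification arrives via MotionEvent.getClassification();
// everything below is an illustrative stand-in, not framework code.

enum class TouchClassification { NONE, AMBIGUOUS_GESTURE, DEEP_PRESS }

// Stand-in for ViewConfiguration.getLongPressTimeout(); actual value varies.
const val LONG_PRESS_TIMEOUT_MS = 400L

// A deep press triggers the long-press action immediately; otherwise the
// ordinary timeout applies. Per the Deep Press guidance quoted above, the
// action must be the same one a long press would run — never a new behavior.
fun shouldFireLongPress(classification: TouchClassification, heldMs: Long): Boolean =
    classification == TouchClassification.DEEP_PRESS || heldMs >= LONG_PRESS_TIMEOUT_MS

fun main() {
    // Firm press recognized by the classifier: fires right away.
    println(shouldFireLongPress(TouchClassification.DEEP_PRESS, 50))  // true
    // Ordinary touch held briefly: must wait out the long-press timeout.
    println(shouldFireLongPress(TouchClassification.NONE, 50))        // false
    // Ordinary touch held past the timeout: fires as a normal long press.
    println(shouldFireLongPress(TouchClassification.NONE, 450))       // true
}
```

This also shows why the feature only appears in apps using the standard long-press plumbing: if a developer rolls their own gesture detection instead of listening for the platform’s long-press callback, there is no standard hook for the classifier to accelerate.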