Can Pixel 4's radical Motion Sense navigation spell 'the end of the touchscreen'?

There is a reason why Google is going with a giant top bezel on the Pixel 4, at a time when the Note 10 will be the most compact large-screen phone in our database precisely on account of the strip shaved off there. No, that reason is not a 3D face-scanning kit, although there will be one, and Google is collecting faces to perfect it at $5 a pop. The real reason is the Project Soli radar chip behind Motion Sense, for which the FCC recently granted a waiver of its rules to allow users to operate Google Soli devices while aboard aircraft. In the FCC's words:
We find that the Soli sensors, when operating under the waiver conditions specified herein, pose minimal potential of causing harmful interference to other spectrum users and uses of the 57-64 GHz frequency band, including for the earth exploration satellite service (EESS) and the radio astronomy service (RAS). We further find that grant of the waiver will serve the public interest by providing for innovative device control features using touchless hand gesture technology.
How does Motion Sense work?
Google claims that the radar and the accompanying software can “track sub-millimeter motion at high speeds with great accuracy.” The Soli chip does it by pushing out electromagnetic waves in a broad sweep that get reflected back to the tiny antenna inside.
A combination of sensors and software algorithms then accounts for the energy these reflected beams carry, the time they took to come back, and how they changed on the way, determining “the object’s characteristics and dynamics, including size, shape, orientation, material, distance, and velocity.” Although the small chip can’t match the spatial recognition of larger installations, Google has refined its motion-sensing and prediction algorithms to tolerate slight variations in gestures, mapping them all to one and the same interface action.
The Soli radar algorithms can recognize gesture variations brought on by different users
Motion Sense brings a universal set of gestures
Hand gestures come more naturally than pushing against a piece of glass, yet so far the technology for recognizing them on a phone has been imperfect, as it relied on camera sensors. Google is aiming to revolutionize the way we interact with our mobile devices by employing the radar-based Motion Sense technology, which would include, but not be limited to, the following natural gestures that can be employed in any orientation of the phone, day or night:
Button
Slider
Dial

So, can the world be your interface?
Google is known for its “moonshots,” or crazy-sounding projects that it thinks will ultimately prove their mettle as time goes by. So far, the Google X moonshot lab has bagged incredible technology leaps like the self-driving car efforts, smart contact lenses, and other eye-poppers. The Motion Sense gear on the Pixel 4 could be one small step for Google’s hardware department, and one giant leap for the future of the smartphone interaction.
In fact, the man behind Project Soli, Google’s Ivan Poupyrev, whom you saw in the video above, recently gave a TED talk explaining how this radar-based gesture navigation can be deployed everywhere in a “the world is your interface” kind of moment.
While we can’t really comment on the practicality of Google and Levi’s Project Jacquard idea, which employs the motion-sensing gizmo in a jeans jacket, getting it into the Pixel 4 is a whole different ball game.
Levi’s Jacquard jacket has a Soli radar as a snap tag
Google’s Pixel 4 and Motion Sense
Just as we were preparing this primer on Project Soli, Google confirmed that this will indeed be the tech occupying the mysterious openings to the right of the thick top bezel on the Pixel 4. In its blog post, the company went through the same points and advantages we list above in more detail. Apparently, it all ties in with Google hardware chief Rick Osterloh’s “ambient computing” strategy, which he explains as:
Our vision is that everything around you should be able to help you. And so many things are becoming computers that we think the users should be able to seamlessly get help wherever they need it from a variety of different devices.
What about scrolling through long articles with an air flick of the finger, or going back in the interface with a simple thumb twitch, though? Google does wax poetic that this is just the start and “Motion Sense will evolve,” but we’ve heard many a marketing writeup for options and features that ultimately proved slow on the uptake.
The bit that Motion Sense “will be available in select Pixel countries” is also raising a few eyebrows: why would a Pixel 4 model in one place come with the radar-based gesture navigation while in others it won’t? Is it because different countries have different rules on commercial radars in the 57-64 GHz frequency band, and some are reluctant to accept the FCC’s waiver?
We probed Google with the question and will get back to you here when we get a nod. What do you think, could Pixel 4’s Motion Sense be the “end of the touchscreen” and the beginning of the “world is your interface” era indeed, and is it too early to tell, or too complex of an interaction paradigm to take off?