The latest Android 12 beta from Google has a new superpower that's gone almost unnoticed. Could this be the future of how we interact with our phones?
Well, I'll be: just when I thought I'd figured out everything we needed to know about Google's almost-ready-for-prime-time Android 12 update, a futuristic new feature shows up in the mix.
As I discovered over the weekend, the latest Android 12 beta includes a fascinating new capability that lets you literally control your phone with your eyes. A look to the left, for example, can stand in for the typical system-level Back gesture. A look up can open your notifications. And a sly raise of your eyebrows can send you back to your home screen (while making everyone around you think you're the most uncomfortable person in the world).
The possibilities continue from there. You can teach your Android phone to open the Quick Settings panel whenever you open your mouth, or even to pull up the system's app-switching interface every time you flash a winning smile. (Just be sure to avoid using the system while eating deli meats, as all those chew-induced smiles and salamis could drive your phone crazy.)
This is really wild stuff, and it works almost surprisingly well. Perhaps most shocking of all, though, is that Google hasn't even mentioned the feature's presence in any public forum. It simply showed up quietly, without any fanfare, announcement, or so much as a hired monkey crashing cymbals together to alert us.
If you're on the latest Android 12 beta, though, you can try the system out right now. And if not, don't fret: the feature will presumably arrive on your phone once the Android 12 software is officially finished and rolling out, which all signs suggest should happen any week now and almost certainly within the next month.
So warm up those peepers and get your face ready. As long as you've got Android 12 in front of you, here's all you need to do to find and activate the system on your device:
- Open the system settings and go to the Accessibility section.
- Scroll down until you see the line labeled "Enhanced Accessibility." Tap it. Tap it, I say!
- Now tap the toggle next to the words "Use Switch Access" and follow the prompts to grant the system the permissions it needs to operate. (It may sound like a lot, but (a) the permissions are clearly required for this type of operation, and (b) this is a system-level app made by Google we're talking about, so you aren't actually granting access to any sort of third party. The request is just a formality that comes up whenever this form of access is required.)
- Follow the steps to configure the system and its various face-forward gestures. You can also return to these same options later by tapping "Settings" on the main "Enhanced Accessibility" screen and then tapping "Camera switch settings" on the screen that comes up.
However you get there, you'll eventually end up with six possible face and eye gestures you can configure. And you can set any of them to handle any of a little more than a dozen different tasks.
The easiest thing to try is configuring some of the gestures to handle basic system navigation: you know, commands like going back, returning to your home screen, opening the system Overview interface, and maybe also opening the notifications and/or Quick Settings areas.
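To make the idea concrete, here's a minimal sketch, in Python, of the kind of gesture-to-action lookup being described. Every gesture and action name below is my own illustrative placeholder; Android's actual implementation lives inside its accessibility system, and Google hasn't documented its internals.

```python
# Hypothetical model of mapping face/eye gestures to navigation actions,
# as described above. Names are illustrative, not Android's identifiers.
GESTURE_TO_ACTION = {
    "look_left": "back",                 # stand-in for the Back gesture
    "look_up": "open_notifications",
    "raise_eyebrows": "go_home",
    "open_mouth": "open_quick_settings",
    "smile": "open_overview",            # the app-switching interface
}


def action_for(gesture: str):
    """Return the action configured for a detected gesture, or None."""
    return GESTURE_TO_ACTION.get(gesture)
```

The point of the table is simply that each of the six gestures is an independent, reassignable slot; swapping what "smile" does is just a matter of changing one entry.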
The other thing I'd suggest is tapping the "Switch Access" option on the main settings screen and activating the "Switch shortcut" toggle. That lets you set the system to wake up and watch for your commands only after you press and hold both volume keys together, and then to shut back off the next time you press those same buttons.
The face-gesture system consumes a good bit of battery power, and it can also grow pretty annoying if it's active when you aren't deliberately trying to use it (especially if you tend to raise your eyebrows a lot, you wonderfully awkward weirdo). Enabling this shortcut gives you an easy, discreet way to start and stop the system whenever the need strikes.
So, putting it all together, here's an example in which I never touch my screen and instead simply do the following:
- Press and hold the volume-up and volume-down buttons at the same time
- Look up
- Look to the left
- Raise your eyebrows and hold them up for a moment
And here's what happens as a result:

- The face-gesture system activates
- My notifications panel opens
- My notifications panel closes
- My system Overview area opens, with the most recent process I had open then selected
- My phone returns to its home screen
- My wife stares at me, sighs, and leaves the room (not pictured)
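The sequence above amounts to a simple toggle-then-dispatch flow: the volume-key combo switches gesture detection on or off, and gestures only trigger actions while detection is active. Here's a rough Python model of that flow; again, every name here is my own illustrative stand-in, not anything from Android itself.

```python
class CameraSwitchModel:
    """Toy model of the flow above: a volume-key chord toggles gesture
    detection, and face gestures fire actions only while it's active."""

    ACTIONS = {
        "look_up": "toggle_notifications",
        "look_left": "back",
        "raise_eyebrows_hold": "go_home",
    }

    def __init__(self):
        self.active = False
        self.events = []

    def volume_chord(self):
        # Pressing volume-up and volume-down together flips the system.
        self.active = not self.active
        self.events.append("detection_on" if self.active else "detection_off")

    def gesture(self, name):
        # Gestures are ignored unless detection is currently active.
        if self.active and name in self.ACTIONS:
            self.events.append(self.ACTIONS[name])


phone = CameraSwitchModel()
phone.volume_chord()                  # step 1: wake the gesture system
phone.gesture("look_up")              # step 2
phone.gesture("look_left")            # step 3
phone.gesture("raise_eyebrows_hold")  # step 4
```

The key design point the shortcut enables is that last conditional: with detection off, your face can do whatever it likes without your phone reacting at all.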
Got all that?
Again, aside from the initial activation of the feature via that volume-key press, all of that involved nothing but subtle movements of my face. Pretty crazy, isn't it?
Longtime Android fans may notice that this feature is vaguely reminiscent of the "Smart Scroll" option Samsung once included on its Galaxy devices. With that, the phone would try to use the angle at which you held the device to scroll through web pages and other long documents, depending on how much your phone or your big ol' head was tilted at any given moment. The feature was never particularly reliable, though, and it quietly disappeared somewhere around 2014.
With Android 12, it seems that same basic concept is returning with much more advanced and consistent technology behind it, and with a much more distinctive purpose in mind. Outside of its accessibility goals, of course, it's tough to say how useful it'll be in the real world in a long-term sense. Maybe with a handful of extra action options, it could have real potential for simplifying phone use during exercise (think of an elliptical cross-trainer or a stationary bike) or for reading Very Important Business Materials™ in bed.
At the least, though, it's a fun new trick and an impressive demonstration of what our current phone technology makes possible. And it's absolutely a feature worth keeping an eye on (ahem) as it makes its way out into the world and keeps evolving from there.