Welcome to Project Soli

Poupyrev:
My name is Ivan Poupyrev, and I work in the Advanced Technology and Projects group at Google. The hand is the ultimate input device. It's extremely precise, it's extremely fast, and it's very natural for us to use. Capturing the possibilities of the human hand has been one of my passions. How could we take this incredible capability, the finesse of human action and of using our hands, and apply it to the virtual world? We use the radio frequency spectrum, which is radar, to track the human hand. Radars have been used for many different things: to track cars, big objects, satellites, and planes. We're using them to track the micro-motions, the twitches, of the human hand, and then using that to interact with wearables, the Internet of Things, and other computing devices.

Lien:
Our team is focused
on taking radar hardware and turning it
into a gesture sensor. Radar is a technology which transmits a radio wave
towards a target, and then the receiver
of the radar intercepts the reflected energy
from that target. The reason why we’re able
to interpret so much from this one radar signal is because of the full
gesture recognition pipeline that we’ve built. The various stages
of this pipeline are designed to extract
specific gesture information from this one radar signal
that we receive at a high frame rate.

Amihood:
From these strange, foreign range-Doppler signals, we are actually interpreting human intent.

Karagozler:
Radar
has some unique properties when compared to cameras,
for example. It has very high
positional accuracy, which means that you can sense
the tiniest motions.

Schwesig:
We arrived at this idea of virtual tools because we recognized that there are certain archetypes of controls, like a volume knob or a physical slider, a volume slider. Imagine a button between your thumb and your index finger; the button's not there, but pressing it is a very clear action. And there's actual physical haptic feedback that occurs as you perform that action. The hand can both embody a virtual tool and, you know, act on that virtual tool at the same time. So if we can recognize that action, we have an interesting direction for interacting with technology.

Poupyrev:
So when we started
this project, you know, me and my team,
we looked at the project idea, and we thought,
"Are we gonna make it or not? Eh, we don't know." But we had to do it, because unless you do it, you don't know.

Raja:
What I think I'm most proud of about our project is that we have pushed the processing power of the electronics itself further out, to do the sensing part for us.

Poupyrev:
The radar has a property which no other technology has. It can work through materials. You can embed it into objects. It allows us to track
really precise motions. And what is most exciting
about it is that you can shrink
the entire radar and put it in a tiny chip. That’s what makes this approach
so promising. It’s extremely reliable. There’s nothing to break. There’s no moving parts. There’s no lenses. There’s nothing,
just a piece of sand on your board.

Schwesig:
Now we are at a point
where we have the hardware where we can sense
these interactions, and we can put them to work. We can explore
how well they work and how well they might work
in products.

Poupyrev:
It usually blows your mind when you see the things people do, and that's what I'm really looking forward to. I'm really looking forward to releasing this to the development community, and I really want them to be excited and motivated to do something cool with it, right?
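Lien's description of the pipeline, a transmitted radio wave whose reflection is received at a high frame rate and turned into the range-Doppler signals Amihood mentions, can be sketched in a toy form. This is not Soli's actual pipeline; the function name, frame dimensions, and the synthetic target below are illustrative assumptions. The core idea is standard FMCW processing: one FFT along each chirp resolves range, a second FFT across chirps resolves Doppler (radial velocity).

```python
import numpy as np

def range_doppler_map(chirp_matrix: np.ndarray) -> np.ndarray:
    """Turn one frame of radar chirps into a range-Doppler magnitude map.

    chirp_matrix: (num_chirps, samples_per_chirp) baseband beat-signal
    samples. FFT along each chirp gives range bins; FFT across chirps
    gives Doppler bins (fftshift centers zero velocity).
    """
    range_fft = np.fft.fft(chirp_matrix, axis=1)                  # range axis
    doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0),  # Doppler axis
                                  axes=0)
    return np.abs(doppler_fft)

# Synthetic frame: a single target whose beat frequency (range) and
# chirp-to-chirp phase rotation (Doppler) are chosen by hand.
num_chirps, n_samples = 64, 128
range_bin, doppler_bin = 20, 10          # where the target should appear
t = np.arange(n_samples) / n_samples
chirps = np.array([
    np.exp(2j * np.pi * (range_bin * t + doppler_bin * c / num_chirps))
    for c in range(num_chirps)
])

rd_map = range_doppler_map(chirps)
peak_doppler, peak_range = np.unravel_index(rd_map.argmax(), rd_map.shape)
print(peak_range, peak_doppler - num_chirps // 2)  # prints: 20 10
```

A real gesture recognizer would then extract features from a stream of such maps (as the "various stages of this pipeline" in the transcript suggest) and classify them into gestures; that stage is omitted here.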

100 thoughts on "Welcome to Project Soli"

  1. I see absolutely nothing intuitive about this method of interaction. It's cool tech but I don't see it being useful for common consumers if it doesn't feel intuitive to use

  2. Once again one company comes up with an idea and does the R&D. Then Apple will perfect it in about 2 years and call it their own.

  3. Fast forward to next year when I try to unlock my device with this and get locked out because of too many tries.

  4. So I am a little late getting my Google 3. But then I just learned that Apple made a deal with Intel and is now going to build its own chips.

  5. So.. it's project natal(Kinect) for your phone? Just glad it's run by Google and not M$.. otherwise they'd release it and then remove most of the features later on.. Sorry bro, got ya money!

  6. I see Soli becoming not only a difference between other smartphones for Pixel 4 from a technical point of view, but, if Google does this right, it could be the next generation of interaction with devices, much like the same way Apple changed touchscreens.

    This could become an advanced new UI for Pixels. I just hope that they have some convincing ways of 3rd party developers being able to show off what Soli can really do.

  7. This would be interesting on a game console hooked to a large TV screen that I use from the other end of the room. The phone is in my hands.

  8. Jesus, this is so clever. Great implementation is critical, as this could easily be just a gimmick, but if it can be used thoughtfully and intelligently, it could change the market and make everything around it feel just old.

  9. People commenting upon the wasted time and money on this technology which is not going to be used anywhere…
    .
    .
    .
    "Meanwhile Google Pixel 4 promo and leaks"😅

  10. the best things from Google don't come to fruition. Project ARA, Project Tango and now this. 4 years and still waiting.

  11. This is so great, I can't explain how exciting it looks and how it represents itself in the video… great-minded people doing wonderful things with technology… great project…

  12. It looks great and I hope it works out but I cannot think of many great uses for it. There don't seem to be many benefits of doing this other than you don't have to touch the screen so you can use functions when cooking or eating or something. I think other companies such as LG have had very similar things but maybe this will be implemented better.

  13. Just when you thought we couldn't lose any more privacy Google delivers yet again. So basically along with having cameras recording your every move, Google is also looking to capture your body position and a map of the environment you are in now as well. And most of the idiots in America are gonna back this idea thinking that this is great news, because average Americans are borderline retarded. Nothing more than wannabe popular social media zombies who are too lazy to want to actually do anything physical.

  14. People are going to say, again, that Google copied Apple. But we all know that Apple ripped Google when they launched their faceid in 2017 while Google were trying to make it perfecto.

  15. This is amazing! Why haven't I thought about it?! Make it purchasable as a discrete unit, not only as a 'built-in". Can't wait to get my hand on one of those sensors and integrate with one of my projects. I mean, kind of 'get on'.

  16. You Do NOT have permission to access this Group 🙁
    https://groups.google.com/forum/#!forum/soli-announce

    https://support.google.com/a/thread/7309984?hl=en

  17. It's going to be interesting putting this in a phone. It seems impractical due to the device always being in your hands. It'll be convenient for any sort of travel and for a quick peek, but time will tell; they'll probably develop very clever uses for it.
    I could see an integration into smart home. Gesturing to your phone to control the tv, Chromecast, lights, locks, routines, the whole shebang.
    Beyond that, it'll open the door for live sign language interpretation, which would be a huge win in itself.

  18. The idea itself is great, but this generation is getting dumber; for them, this kind of control is too complicated.

Leave a Reply

Your email address will not be published. Required fields are marked *