Android Things presents: Machine Learning Flowers

[MUSIC PLAYING]

BILL SCHILIT: More and more, we’re seeing machine learning as a part of everyday things: in your car, in your cameras, in your doorbell.

NHAT VU: We wanted to build something to showcase the technologies and the power of Android Things, so we decided to build a garden that can react to and interact with the user.

BILL SCHILIT: Android Things is a way to experiment with machine learning and devices, and to do it very rapidly. We were brainstorming about how flowers could become responsive to people, and one of the effects we see in the natural world is flowers that follow the sun. So we had this idea that the flowers could follow people.

NHAT VU: The Emotion Flower is basically a smart flower that reacts to a user’s facial expressions: it opens when you’re smiling, and when you wink at it, it turns shy, closes up, and turns sort of pink.
BILL SCHILIT: So we added a camera in the head of the flower, and it detected faces. The algorithm that did the face detection controlled the motors in the flower body, slowly turning it to face the people standing there, so it could be a little bit reactive and responsive. We wanted to engage people with something that was aesthetically pleasing, beautiful, so we had a lot of flowers with lights and leaves built around them. These were active and interactive, and people could stand in front of them. We had another section where we showed the Android Things processor, the motors, the cables, and the spine and vertebrae of this flower. And we even let people control the flower using game controllers.

NHAT VU: We also used stuff that you can get from the hardware store, like copper piping for the stem. Really, it’s a matter of how we put it all together so that it looks like a flower: not something you’d see in nature, but it still conveys the idea that it is a flower.
BILL SCHILIT: There’s a spine that runs up the center of the continuum robot. We finally found something called a flex cable, available as a part for a weed whacker, and we were able to build the spines that hold the vertebrae on; the spines were printed plastic. Then we had to figure out how to write the software to control this robot, so we went back to seventh-grade trigonometry to work out how much moving the wire would move the continuum robot.
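As an illustration of that trigonometry: under the common constant-curvature assumption for a single continuum segment, a tendon offset a distance r from the backbone shortens by roughly Δl = r·θ when the segment bends by an angle θ. The Kotlin below is only a sketch of that proportion; the class, names, and dimensions are hypothetical, not taken from the actual project.

```kotlin
import kotlin.math.PI

/**
 * Constant-curvature approximation for one continuum-robot segment:
 * pulling a tendon offset `tendonOffset` meters from the backbone by
 * deltaL meters bends the segment by roughly theta = deltaL / tendonOffset.
 */
class ContinuumSegment(private val tendonOffset: Double) {

    /** Tendon pull (meters) needed to bend the segment by `thetaRad` radians. */
    fun tendonPullFor(thetaRad: Double): Double = thetaRad * tendonOffset

    /** Bend angle (radians) produced by pulling the tendon `deltaL` meters. */
    fun bendAngleFor(deltaL: Double): Double = deltaL / tendonOffset
}

fun main() {
    val segment = ContinuumSegment(tendonOffset = 0.01) // cable 1 cm off the spine
    val pull = segment.tendonPullFor(PI / 4)             // bend 45 degrees
    println("Pull the cable %.1f mm to bend 45 degrees".format(pull * 1000))
}
```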
What we wanted to do was make it super-responsive. If you go up to the cloud and come back, you get a little bit of lag, and we wanted our flowers to feel organic and natural, not to show that lag.
NHAT VU: And really, the whole thing is powered by Android Things. The idea is that you have sensor inputs, in our case just the camera that senses the scene, and then on-device processing.
BILL SCHILIT: So we started with the face recognition software library that Google has. We were then interested in how quickly we could recognize faces in a video stream.
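The speakers don’t name the library on camera, but Google’s Mobile Vision face detector was the standard Android API for this at the time, and it reports exactly the smile and eye-open probabilities the flower reacts to. Here is a minimal sketch of classifying one frame; the detector calls are real Mobile Vision API, while the petal hooks and thresholds are hypothetical placeholders:

```kotlin
import android.content.Context
import com.google.android.gms.vision.Frame
import com.google.android.gms.vision.face.Face
import com.google.android.gms.vision.face.FaceDetector

// Build a detector that also classifies smiling / eye-open probabilities.
fun buildDetector(context: Context): FaceDetector =
    FaceDetector.Builder(context)
        .setTrackingEnabled(true)
        .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
        .setMode(FaceDetector.FAST_MODE)
        .build()

// Inspect one video frame and decide how the flower should react.
fun reactToFrame(detector: FaceDetector, frame: Frame) {
    val faces = detector.detect(frame)
    for (i in 0 until faces.size()) {
        val face: Face = faces.valueAt(i)
        // Probabilities come back as -1 when the classifier couldn't compute them.
        if (face.isSmilingProbability < 0) continue
        val winking = face.isLeftEyeOpenProbability in 0.0f..0.3f &&
                face.isRightEyeOpenProbability > 0.7f
        when {
            winking -> closePetals()                        // shy: close and blush
            face.isSmilingProbability > 0.7f -> openPetals()
        }
    }
}

// Hypothetical actuator hooks; see the PWM servo sketch below.
fun openPetals() { /* drive petal servo open */ }
fun closePetals() { /* drive petal servo closed */ }
```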
NHAT VU: On the Raspberry Pi, we detect that there’s a face there, and then there’s a neural net that infers your emotion. Depending on what the emotion is, we change the LED lights or move the motors to close and open the petals.
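On Android Things, motors like these are typically driven over PWM through the Peripheral I/O API. Below is a minimal sketch assuming a standard 50 Hz hobby servo on the petals; the pin name "PWM0" and the 1 to 2 ms pulse range are assumptions about the wiring, and older Android Things releases exposed the same calls through PeripheralManagerService rather than PeripheralManager:

```kotlin
import com.google.android.things.pio.PeripheralManager
import com.google.android.things.pio.Pwm

// Drive a hobby servo: 50 Hz period (20 ms); ~1 ms pulse = petals closed,
// ~2 ms pulse = petals fully open.
class PetalServo(pinName: String = "PWM0") {
    private val pwm: Pwm = PeripheralManager.getInstance().openPwm(pinName).apply {
        setPwmFrequencyHz(50.0)
        setPwmDutyCycle(5.0)   // 1 ms of a 20 ms period: start closed
        setEnabled(true)
    }

    /** position: 0.0 = petals closed, 1.0 = fully open. */
    fun moveTo(position: Double) {
        val pulseMs = 1.0 + position.coerceIn(0.0, 1.0)  // 1.0 to 2.0 ms pulse
        pwm.setPwmDutyCycle(pulseMs / 20.0 * 100.0)      // duty cycle as a percentage
    }

    /** Release the peripheral when the flower shuts down. */
    fun close() = pwm.close()
}
```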
BILL SCHILIT: The hardware that we have running in the Android Things system is quite capable. It has a multi-core processor, and it can be really responsive to changes in a video stream.
NHAT VU: Anyone who has any inkling to tinker or to do these DIY hobby-type projects should just go and give it a try. I think you’ll be surprised at how quickly you can build fun, wacky, interesting gadgets, and then maybe, hopefully, you’ll do something useful. We wanted to showcase both the AI and the robotics capabilities, but the idea is to have something that’s approachable, not intimidating, and that also provides a delightful experience. So we thought about scaling down the complexity, both physically and in the software, to turn it into a project that we could open-source. There is a Hackster.io page that describes the entire process in very fine detail, down to how to solder the LEDs, so you can build your own little flower that sits on your desk. We’ve carried the work on a little afterward to make sure that what we’ve done is available to the community.

[MUSIC PLAYING]
