AI Learns Human Movement From Unorganized Data 🏃‍♀️

Dear Fellow Scholars, this is Two Minute Papers with Károly Zsolnai-Fehér. Last year, an amazing neural network-based technique appeared that was able to look at a bunch of unlabeled motion data and learn to weave it together to control the motion of quadrupeds, like this wolf here. It successfully addressed the shortcomings of previous works: for instance, the weird sliding motions have been eliminated, and it was also capable of following predefined trajectories.

This new paper continues research in this direction by proposing a technique whose characters are also capable of interacting with their environment or with other characters. For instance, they can punch each other, and after the punch, they can recover from undesirable positions, and more.

The problem formulation is as follows: the technique is given the current state of the character and a goal, and you can see here in blue how it predicts the motion to continue. It understands that we have to walk towards the goal, that we are likely to fall when hit by a ball, and it knows that we then have to get up and continue our journey to eventually reach our goal. Some amazing life advice from the AI right there.
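To make that problem formulation a little more concrete, here is a minimal Python sketch of a goal-conditioned, autoregressive motion rollout. The network, the pose layout, and all of the numbers are illustrative assumptions rather than details from the paper.

```python
import numpy as np

POSE_DIM = 69  # illustrative pose size, e.g. joint rotations of a humanoid skeleton

def predict_next_pose(pose, goal, rng):
    """Stand-in for the learned network: maps (current pose, goal) to the next pose."""
    # The real method uses a trained neural network; here we simply drift toward the
    # goal with a little noise, just to show the shape of the rollout loop.
    return pose + 0.1 * (goal - pose) + 0.01 * rng.standard_normal(POSE_DIM)

def rollout(start_pose, goal, steps=120):
    """Autoregressively unroll the predicted motion toward the goal, frame by frame."""
    rng = np.random.default_rng(0)
    poses = [start_pose]
    for _ in range(steps):
        poses.append(predict_next_pose(poses[-1], goal, rng))
    return np.stack(poses)

motion = rollout(np.zeros(POSE_DIM), np.ones(POSE_DIM))
print(motion.shape)  # (121, 69): one predicted pose per frame as the character heads to its goal
```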
The goal here is also to learn something meaningful from lots of barely labeled human motion data. Barely labeled means that a bunch of videos are given almost as-is, without additional information on what movements are being performed in them. If we had labels for all the data that you see here, they would say that this sequence shows a jump, and these ones show running. However, the labeling process takes a ton of time and effort, so if we can get away without it, that’s glorious; but in return, we create an additional burden that the learning algorithm has to shoulder.

Unfortunately, the problem gets even worse: as you see here, the original dataset contains only a small number of frames. To alleviate this, the authors decided to augment the dataset, which means trying to combine parts of this data to squeeze out as much information as possible. You can see some examples here of how this motion data can be stitched together from many small segments, and in the paper, they show that the augmentation helps create up to 10 to 30 times more training data for the neural networks.
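As a rough intuition for this kind of augmentation, here is a small Python sketch that cuts motion clips into short segments and splices compatible pieces into new training sequences. The segment length, the boundary-distance compatibility test, and the lack of any blending are simplifying assumptions for illustration; the paper's actual procedure is more involved.

```python
import numpy as np

def split_into_segments(clip, seg_len=30):
    """Cut a (frames, pose_dim) motion clip into fixed-length segments."""
    return [clip[i:i + seg_len] for i in range(0, len(clip) - seg_len + 1, seg_len)]

def compatible(seg_a, seg_b, threshold=2.0):
    """Only join segments whose boundary poses are close, to avoid visible jumps."""
    return np.linalg.norm(seg_a[-1] - seg_b[0]) < threshold

def augment(clips, seg_len=30):
    """Recombine segments from different clips into new synthetic training sequences."""
    segments = [seg for clip in clips for seg in split_into_segments(clip, seg_len)]
    spliced = []
    for a in segments:
        for b in segments:
            if a is not b and compatible(a, b):
                spliced.append(np.concatenate([a, b]))
    return spliced

# Four toy "clips": smooth random walks standing in for captured motion data.
clips = [np.cumsum(0.01 * np.random.randn(120, 69), axis=0) for _ in range(4)]
print(len(augment(clips)), "spliced sequences built from", len(clips), "original clips")
```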
As a result of this augmented dataset, the AI can learn to perform zombie and gorilla movements, chicken hopping, even dribbling with a basketball, you name it. What’s more, we can give the AI high-level commands interactively, and it will try to weave the motions together appropriately. They can also punch each other. Ow. And all this was learned from a bunch of unorganized data. What a time to be alive!

Thanks for watching and for your generous support, and I’ll see you next time!

77 thoughts on “AI Learns Human Movement From Unorganized Data 🏃‍♀️”

  1. Oh yeah! Did I mention they could punch each other? Just wanted to mention that they could punch each other. PUNCH.

  2. Animators in the gaming industry might find themselves out of a job if a 3D model can learn to walk and perform other movements on its own.

  3. I thought that NVidia changed their driver license to prohibit using their consumer cards, which Lambda cloud advertises, in data centers? Has that changed? Can we now legally do that?

  4. I have trouble understanding what kind of augmentation on the training data was performed, and it seems like a crucial part, if not the crucial part.

  5. Very nice job, we are getting closer to true AI status. Call me again when AI manages to create full 3D environment scenes based on video capture only, without any human input.

  6. Is Lambda Labs safe from thermonuclear and cross-dimensional hostile attacks? Because the last Lambda lab that I know of didn't fare too well.

  7. Imagine this technique being integrated into the already existing Euphoria Engine; if you don't know what it is, YouTube "euphoria physics" right now.

  8. Does anyone else think that there should/will be a God game where you control a population/individual of people like Dwarf fortress, with a more expanded customization of Spore (Or Graffiti Kingdom etc) but they are autonomous like The Sims, and you can collect cards that you can give to your AI to summon on the ground like Lost Kingdoms 2 to fight other enemy AI in the world while you both evolve? I wonder if there can be a card game where it generates them like Borderlands guns..? That would all be amazing.

  9. "Some amazing life advice from the AI right there"
    Don't give them ideas Karoly, I don't want any motivational speaker AIs :'DDD

  10. Dreams come true. Imagine some fantasy mmorpg where all fighting, skills and actions of characters are based on such technique.

  11. I would really love to see a paper where an AI is given a task that's literally impossible, like extrapolating things that can't be extrapolated, just to see what those strategies would look like.

  12. Wasn't there a paper about tracking and transferring motion (dancing?). Use that to get tonnes of different movement data; get data from people just moving on YouTube or whatever, and train on all those random movements for different situations.

    Then train it on 3D environments approximating the real world. Figure out how to transfer this knowledge to real robots (there were papers about this too!) and you're pretty much at Boston Dynamics level, minus the amazing hardware.

    Maybe the need for amazing hardware can be compensated for by optimising for 'careful and conserving' movement when training the AI?

    Technology progress is accelerating!! I feel so behind 🙁

  13. Long story short… 20 years from now we will have flying terminators. I'm not trolling. This is serious. Look at the evolution of the automobile from 1905 to 1925. The difference is astounding. We went from dirt roads full of horses to paved roads full of automobiles. We are in a similar evolution with robots. Human soldiers on the battlefield will soon be as rare as weaponized blimps.

  14. Wow! Really looking forward to the time when augmented artificial musculo-skeletal systems will help us move really fast without moving a single biological muscle strand. Or when embedded BCIs help us simulate environments and interact with them.

  15. The best moment in these videos is always when you say "What a time to be alive!". Thank you for your optimism! 🙂

  16. In a time of AI, deep fakes, high resolution, and a lot of hate in every direction, I'm a bit scared.

    But I'm not scared for myself.
    I'm scared for humanity.
    Since it gets harder and harder to find the truth.

    Jesus Christ. Our Lord, King, and Savior. Our divine God.

  17. I'm curious and want to feel what playing an RPG with AI NPCs would be like.
    In 2020, NPCs are still passive and static, never move from their original place, or just walk without a goal and then vanish when not rendered?
    Give them AI and let the world progress!! If it's an offline game, then no one will have the same experience, because the AI will constantly be changing the world.
    Merchant NPCs? Their items should be acquired through interaction, not spawned (maybe with an exception when they're not rendered/out of sight, with a history in their data of where they got the item).
    Well, is this possible with our low/medium-end CPUs? Because not everyone can buy a monster PC. (Maybe this is the reason we must wait until 2030 for a game to implement this.)

  18. I have been a fan of your channel for a while. Today I clicked on your ad about the Lambda GPU cloud and decided to try it. However, after I filled in my credit card info and submitted, the site just showed an alert saying "Hi". Very shady if you ask me. After that I tried to launch an instance and it just showed an error message. I tried to delete my card on the site but couldn't find that feature anywhere, and updating the card info did not work either. Just a warning for others if they want to try it 🙁
