Using physics for grabbing objects with kinematic hands
In this post I will concentrate on simulating hands in a virtual environment. The most natural way to do this is to use physics: when the hand touches an object, that object moves; when the hand grabs an object, it is picked up. Sounds sensible, but does it work?
First we need to model the physics of the hand. Currently I am using a very simple rig for my hand, which basically consists of the palm, the thumb, the index finger and the other fingers grouped together. We need to create colliders for each of them, as these will be used to detect collisions between the various parts of the hand and the environment. In my case I use capsule colliders for the thumb and fingers and a box collider for the palm of the hand:
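To make the rig concrete, here is a minimal sketch (in Python, as the post itself shows no code) of how such a collider layout could be described. The shape dimensions are illustrative assumptions, not the values actually used in the project:

```python
from dataclasses import dataclass

@dataclass
class CapsuleCollider:
    radius: float  # metres
    height: float  # metres, along the finger's long axis

@dataclass
class BoxCollider:
    size: tuple  # (width, height, depth) in metres

# Illustrative hand rig: capsules for the digits, one box for the palm.
hand_rig = {
    "palm":          BoxCollider(size=(0.08, 0.03, 0.09)),
    "thumb":         CapsuleCollider(radius=0.011, height=0.05),
    "index_finger":  CapsuleCollider(radius=0.009, height=0.07),
    "other_fingers": CapsuleCollider(radius=0.012, height=0.07),
}
```

In the engine each of these shapes would be attached to the corresponding bone of the hand rig, so that collision callbacks can tell which part of the hand touched the environment.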
I am using the Razer Hydra as the input device for the hand. The position of the hand is measured by the hand-held controller. The thumb and index finger are driven by digital switches, while the other fingers use an analog bumper, which enables me to regulate the bending of those fingers and grab precisely.
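The input mapping described above can be sketched as two small functions: a binary one for the thumb and index finger, and an analog one for the grouped fingers. The function names and the 90-degree maximum bend are my own illustrative assumptions, not part of the Hydra SDK:

```python
def bend_from_bumper(bumper_value, max_bend_deg=90.0):
    """Map the analog bumper (0.0 = released, 1.0 = fully pressed)
    to a bend angle for the grouped fingers, clamping out-of-range input."""
    clamped = min(max(bumper_value, 0.0), 1.0)
    return clamped * max_bend_deg

def bend_from_switch(pressed, max_bend_deg=90.0):
    """The thumb and index finger are binary: fully open or fully bent."""
    return max_bend_deg if pressed else 0.0
```

Because the bumper value scales the bend continuously, half-pressing the bumper gives a half-closed hand, which is what allows the precise grabbing the post mentions.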
Finally, I have to make my first-person rig a kinematic rigid body, or I won’t be able to generate and detect collisions with the environment.
Now, the results:
- As the hand and especially the fingers are kinematic, they simply pass through objects: collisions are detected, but they do not prevent further movement. This is a problem. I tried limiting the movement of the fingers when a collision took place, but this did not result in the behaviour I wanted.
- Grabbing does not work: objects ‘balance’ between the fingers, but they are not actually held tightly.
- Moving objects is not possible. This came as a surprise to me. When I manage to balance an object on top of my hands and move both hands together in one direction, the object itself does not move. This looks very silly. It does not move even when the physics material’s friction is set to 1 (maximum friction).
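The "limit the movement of the fingers" idea from the first bullet can be sketched like this (a Python sketch; `colliding` stands in for the physics engine's collision callback, and the step size is a made-up value). The flaw described above is visible in the sketch: the clamp freezes the finger at the contact pose, but no forces are ever applied to the object, so nothing is actually held:

```python
def update_finger_bend(current_bend, target_bend, colliding, step_deg=2.0):
    """Advance the finger bend towards the target each frame, but stop
    closing further once the finger's collider reports contact.
    Opening the hand is always allowed."""
    if target_bend <= current_bend:
        # Opening the finger: never blocked by contact.
        return max(target_bend, current_bend - step_deg)
    if colliding:
        # Closing further would penetrate the object: hold position.
        return current_bend
    return min(target_bend, current_bend + step_deg)
```

Since the finger is kinematic, this only makes the hand *look* like it conforms to the object; the object receives no grip force, which matches the observation that objects merely balance between the fingers.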
I am afraid I will have to use a different method (again) for simulating hands. To be continued…
You can see the result in the video below: