This paper proposes a control method for human gait based on a neural pattern generator. The pattern generator produces cyclic signals with pairs of mutually inhibiting neurons and drives a proportional-derivative (PD) controller that tracks the joint angles of a virtual human. The state of the pattern generator is in turn entrained by the controller's signal, and this mutual feedback stabilizes the generation of rhythmic signals under varying conditions. Because the neural oscillators assigned to the legs and arms feed their outputs to one another, the limbs synchronize their periodic movements automatically, without a central supervisor. The system generates a variety of gaits through a common mechanism with a small number of parameters, which makes it well suited to real-time, interactive, on-the-fly control. Moreover, movements obtained from motion-capture data can be controlled by introducing adjustable nonlinear filters.
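As a rough illustration of the kind of pattern generator described above, the sketch below integrates a two-neuron mutually inhibiting oscillator of the Matsuoka type with a sensory feedback term. The parameter values, the Euler integration scheme, and the function name `matsuoka_step` are illustrative assumptions, not taken from the paper.

```python
def matsuoka_step(state, dt, tau=0.25, tau_a=0.5, beta=2.5, w=2.5,
                  s=1.0, feedback=0.0):
    """One Euler step of a two-neuron mutually inhibiting oscillator.

    state = (u1, v1, u2, v2): membrane and adaptation variables of the
    two neurons. `feedback` stands in for a proprioceptive signal (e.g.
    a measured joint angle) that entrains the rhythm; it enters the two
    neurons with opposite signs. All parameter values are assumptions
    chosen so that the free oscillator is rhythmic.
    """
    u1, v1, u2, v2 = state
    y1, y2 = max(u1, 0.0), max(u2, 0.0)          # rectified firing rates
    # Each neuron is inhibited by the other's output (mutual inhibition)
    # and by its own adaptation variable v.
    du1 = (-u1 - beta * v1 - w * y2 + s + feedback) / tau
    dv1 = (-v1 + y1) / tau_a
    du2 = (-u2 - beta * v2 - w * y1 + s - feedback) / tau
    dv2 = (-v2 + y2) / tau_a
    new = (u1 + dt * du1, v1 + dt * dv1, u2 + dt * du2, v2 + dt * dv2)
    # The oscillator output y1 - y2 alternates in sign and could serve
    # as the target joint angle handed to a PD controller.
    return new, max(new[0], 0.0) - max(new[2], 0.0)
```

In a gait controller along the lines of the abstract, the returned output would set the desired angle of one joint, and the joint's measured angle would be passed back in as `feedback` on the next step, closing the entrainment loop.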
Video: MPEG-1, 7.2 MB, with audio (Japanese captions).