Human movement results from the intricate coordination of muscles, tendons, joints, and other physiological elements. Children learn to walk, run, climb, and jump in their first years of life, and most of us can navigate complex environments, like a crowded street or a moving subway, without much conscious attention. Yet developing controllers that can efficiently and robustly synthesize realistic human motion in a variety of environments remains a grand challenge for biomechanists, neuroscientists, and computer scientists. Current controllers are confined to a small set of pre-specified movements, or are driven by joint torques rather than the muscle actuators found in humans.
To address these problems, we created a virtual, physiologically accurate musculoskeletal environment in the OpenSim simulation software and organized a competition to build controllers that model how the brain drives movement. The competition was accepted as one of the five official competitions at NIPS 2017. Below is one of the solutions.
The code is available on GitHub.
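The environment exposes the standard reinforcement-learning interaction loop: the controller observes the musculoskeletal state, outputs muscle excitations, and receives a reward. A minimal sketch of that loop follows, using a hypothetical stub class in place of the real OpenSim-backed environment; the dimensions, reward, and horizon in the stub are illustrative placeholders, not the actual specification.

```python
import random


class StubRunEnv:
    """Hypothetical stand-in for the musculoskeletal environment.
    The real environment requires OpenSim and simulates the body's
    muscle actuators; here we only mimic the interface."""
    ACTION_DIM = 18   # placeholder: one excitation per modeled muscle
    OBS_DIM = 41      # placeholder: size of the state vector
    HORIZON = 1000    # placeholder: episode length in timesteps

    def __init__(self):
        self.t = 0

    def reset(self):
        """Start a new episode and return the initial observation."""
        self.t = 0
        return [0.0] * self.OBS_DIM

    def step(self, action):
        """Apply muscle excitations; return (obs, reward, done, info)."""
        assert len(action) == self.ACTION_DIM
        self.t += 1
        reward = 0.01                 # placeholder per-step reward
        done = self.t >= self.HORIZON
        return [0.0] * self.OBS_DIM, reward, done, {}


def run_episode(env, policy):
    """Roll out one episode with the given policy; return total reward."""
    obs, total, done = env.reset(), 0.0, False
    while not done:
        obs, reward, done, _ = env.step(policy(obs))
        total += reward
    return total


# A trivial baseline: random excitations in [0, 1] for every muscle.
random_policy = lambda obs: [random.random()
                             for _ in range(StubRunEnv.ACTION_DIM)]
print(run_episode(StubRunEnv(), random_policy))
```

A learned controller would replace `random_policy` with a function mapping the observation to excitations, which is exactly the form the competition solutions take.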