Collaborative Research: Deconstructing the contributions of muscle intrinsic mechanics to the control of locomotion using a novel Muscle Avatar approach

Project Details

Description

Moving animals achieve impressive athletic feats of endurance, speed, and agility in complex environments. Animal locomotion is particularly impressive in contrast to that of human-engineered machines: the stability, agility, and energy economy of current robots, prostheses, and exoskeletons remain poor compared to animals. This pronounced gap between animal performance and technology stems, in part, from fundamental limitations in the understanding of muscle physiology and mechanical function. Muscle is the only actively controlled tissue in animal musculoskeletal systems, and it therefore plays a central role in enabling and controlling movement. Yet developments over the past 20 years have led to growing recognition that important problems in muscle physiology and movement sciences remain unsolved and that the theoretical foundation of the field remains incomplete. In particular, the ability to model and predict muscle function under dynamic and perturbed locomotor conditions remains poor. This project will combine innovative experimental techniques with modeling approaches to develop new muscle models that can explain and predict muscle function under a broad range of conditions. The findings have the potential to transform numerous fields, enabling neuroscientists, biologists, clinicians, and biomedical engineers to ask questions about human and animal behavior, the control of motion, the function of muscles and bones, and the capacity of the nervous system and muscles to change. The research team will collaborate with colleagues in clinical and engineering fields to translate the findings into applications in human rehabilitation, treatment of disease and injury, and the design and control of assistive technology such as prosthetic and exoskeleton devices.

In the field of animal neuromechanics, a pronounced gap exists between 'top-down' approaches, which focus on whole-animal behavior but lack insight into underlying mechanisms, and 'bottom-up' approaches, which characterize mechanisms but lack insight into their contributions to animal behavior. The team will develop new tools to bridge this gap: 1) predictive muscle models that include intrinsic viscoelastic properties; and 2) experimental approaches that integrate across levels. The project's novel 'muscle avatar' approach will help bridge this gap and enable rigorous analysis of the contributions of intrinsic muscle properties and neural activation to the control of locomotion. Aim 1 tests the ability of the muscle avatar approach to replicate steady and perturbed in vivo work loop patterns in mouse and guinea fowl muscles. In Aim 2, in vivo muscle strain, activation, and force will be measured in guinea fowl muscles during steady and perturbed running, and the muscle avatar will be used to quantitatively assess how intrinsic muscle properties and neural drive each contribute to stabilizing responses. In Aim 3, alternative muscle models will be developed, and the ability of each model to predict in vivo muscle function under high-strain and perturbed contraction conditions will be compared. Computationally tractable muscle models are essential for closed-loop neuromechanical simulations of locomotion, which are increasingly used to understand how muscle function and sensorimotor control change in response to aging, injury, and neuromuscular disorders. The findings could inform clinical rehabilitation strategies and the design of assistive devices.
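
For readers unfamiliar with the work loop technique referenced in Aim 1, the sketch below illustrates how net mechanical work per contraction cycle can be computed from imposed length changes and phasic activation using a generic Hill-type contraction model. The equations, parameter values, and names used here are illustrative assumptions only; they do not represent the muscle avatar method or the models developed or tested in this project.

```python
# Illustrative sketch (not from the project): net mechanical work from a
# simulated "work loop" using a generic Hill-type muscle model.
# All equations and parameter values below are assumptions chosen for clarity.

import numpy as np

def force_length(l_norm):
    """Gaussian-like active force-length curve (length normalized to optimum)."""
    return np.exp(-((l_norm - 1.0) ** 2) / 0.2)

def force_velocity(s):
    """Hill-like force-velocity relation; s > 0 shortening, s < 0 lengthening."""
    s_short = np.clip(s, 0.0, None)                 # shortening branch input
    s_length = np.clip(s, None, 0.0)                # lengthening branch input
    fv_con = (1.0 - s_short) / (1.0 + 4.0 * s_short)   # concentric
    fv_ecc = 1.3 - 0.3 * np.exp(3.0 * s_length)        # eccentric enhancement
    return np.where(s >= 0, fv_con, fv_ecc)

# Assumed work-loop protocol: sinusoidal strain with phasic activation
f_cycle = 4.0                                       # cycle frequency, Hz (assumed)
t = np.linspace(0.0, 1.0 / f_cycle, 2000)
l_norm = 1.0 + 0.05 * np.sin(2 * np.pi * f_cycle * t)   # +/- 5% length change
s = -np.gradient(l_norm, t) / 5.0                   # normalized shortening velocity
act = np.clip(np.sin(2 * np.pi * f_cycle * t - 0.5), 0.0, 1.0)  # phasic drive

# Instantaneous normalized force and net work per cycle (area enclosed by the loop)
force = act * force_length(l_norm) * force_velocity(s)
net_work = -np.trapz(force, l_norm)   # positive when the muscle does net work

print(f"Net work per cycle (F_max * L_opt units): {net_work:.4f}")
```

Plotting force against length over one cycle traces the work loop itself; the enclosed area equals the net work computed above, which is the quantity the avatar approach would attempt to replicate under steady and perturbed strain and activation patterns.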

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Status: Finished
Effective start/end date: 1/01/16 to 30/06/23

Funding

  • United States-Israel Binational Science Foundation (BSF)
