W-HTF-RL: Collaborative Research: Improving the Future of Retail and Warehouse Workers with Upper Limb Disabilities via Perceptive and Adaptive Soft Wearable Robots

Project Details


This project will investigate modeling, perception, and control of soft wearable robots to provide physical assistance and skill training for older workers and workers with physical disabilities in jobs involving picking, placing, and assembly tasks. If successful, the project will enhance their employment, inclusion, and integration in work relevant to retail, warehouse, and manufacturing settings. It is estimated that this technology can directly benefit nearly 20 million people in the U.S. with upper limb impairments due to neurological and musculoskeletal disorders. The long-term goal of the project is to improve the quality of work, productivity, and employment of people with disabilities, who are the nation's largest minority and untapped labor force. To do so, the project will deploy artificial intelligence-powered, soft assistive robots to support workers and study the resulting impact on economics and policy making. The proposed work has the potential to contribute to national economic growth and health by broadening participation of people with disabilities in the workforce. The assessment of economic impacts will provide the first econometric, data-driven understanding of the productivity and labor market effects of artificial intelligence- and robotics-driven augmentation, with a specific focus on the underrepresented population of individuals with disabilities.

This project brings together multiple disciplines, including soft robotics, computer vision, learning and control, occupational therapy, worker training, and labor economics. This convergent research team represents a collaboration with Rutgers New Jersey Medical School, New York University, and assistive device manufacturers. The team and project activities are structured to achieve multiple convergent goals and deliverables, including: 1) A model of interactions between a lightweight, compliant soft wearable robot and human workers; 2) An interactive visual perception framework that enables multimodal intention detection and action monitoring in a semantic 3D map of the dynamic workspace and provides in-context visual feedback for collaborative robot manipulation; 3) A framework for the transfer of demonstrated skills through cost function learning and model predictive control with perceptual feedback integration for online movement and impedance adaptation of exoskeletons; 4) Training programs for occupational therapists and people with upper-limb disabilities, built on a multi-pronged strategy to enhance awareness and knowledge of the scope and effectiveness of assistive robots and to improve perceptions of their use in collaborative workspaces; 5) An increased understanding of the economics of assistive technologies, using federal data on occupational ability requirements in conjunction with productivity results on assistive technologies to construct a range of cost/benefit estimates of these technologies; and 6) An evidence-based policy approach that uses quantitative and qualitative data from field experiments, interviews, and focus groups with employers and employees to determine attitudes, barriers, and best practices for adoption and acceptance of assistive wearable technologies in the workplace.
This project has been funded by the Future of Work at the Human-Technology Frontier cross-directorate program to promote deeper basic understanding of the interdependent human-technology partnership in work contexts by advancing design of intelligent work technologies that operate in harmony with human workers.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Effective start/end date: 8/15/20 to 7/31/22


  • National Science Foundation: $1,884,010.00

