Laboratory for Computational Sensing and Robotics Seminar: Panagiotis Artemiadis

April 3, 2019
12 - 1pm EDT
This event is free

Who can attend?

  • General public
  • Faculty
  • Staff
  • Students

Contact

Laboratory for Computational Sensing and Robotics
410-516-6841

Description

Dr. Panagiotis "Panos" Artemiadis, an associate professor of Mechanical and Aerospace Engineering at Arizona State University, will give a talk entitled "Modeling and Control of Human-Robot Interaction and Interfaces" for the Laboratory for Computational Sensing and Robotics.

Abstract:

This talk will focus on modeling and advanced control of robots that physically or cognitively interact with humans. This type of interaction can be found in devices that assist and augment human capabilities, as well as in those that provide motor rehabilitation therapy to impaired individuals.

The first part of the talk will present research on myoelectric control interfaces for a variety of robotic mechanisms, including results of a novel method for robust myoelectric control of robots. This work supports a shift in myoelectric control schemes toward proportional, simultaneous control learned through the development of unique muscle synergies. The ability to enhance, retain, and generalize control without recalibrating or retraining the system supports control schemes that promote synergy development, rather than user-specific decoders trained on a subset of existing synergies, for efficient myoelectric interfaces designed for long-term use.

The second part of the talk will focus on a novel approach to robotic interventions for gait therapy that takes advantage of mechanisms of inter-limb coordination, using a novel robotic system, the Variable Stiffness Treadmill (VST), developed in the HORC Lab at Arizona State University. The methods and results of the presented approach will lay the foundation for model-based rehabilitation strategies for impaired walkers.

Finally, results on a novel control interface between humans and multi-agent systems will be presented. The human user will be in control of a swarm of unmanned aerial vehicles (UAVs) and will be able to provide high-level commands to these agents. The proposed brain-swarm interface allows for advancements in swarm high-level information perception, leading to augmented decision-making capabilities in manned-unmanned systems and promoting the symbiosis between human and machine systems for comprehensive situational awareness.
