Research

My research interests center on the development of intelligent robotic systems that help humans perform skillful tasks more effectively. I apply techniques from machine learning, computer vision, and natural language processing to the field of biomedical engineering. I am interested in evaluating performance in dexterous tool manipulation, providing augmented feedback through visual cues or haptic (touch) interfaces, enabling human-machine collaborative control, and automating repetitive tasks. Application areas include autonomous cars, surgery, industrial robotics, remote exploration, creative expression, and education.

Surgical Robotics Projects

Surgical Skill Evaluation

Sensory Substitution 2004-2007
Tumor AR Overlay  2007


Industrial Robotics Projects

Force Feelin’ 2003-2004

Remote Exploration/Underwater Robotic Projects

Triton 2002-2003
Seafox 2001-2002


Creative Expression Robotics Projects

Kurvy Kirby 2004

Other
General Electric Research, People Tracking, 2007
Databases, 2005
Lockheed Martin Central Design Engineering, 2003
Lockheed Martin Space Based InfraRed System, 2004
Lockheed Martin Mobile User Objective System (Grant Writing Proposal), 2004
Automated Needle Insertion (Education), 2004
Computing Research Association-Women (CRA-W) Distributed Mentor Program, 2003

Surgical Skill Evaluation

Objective grading of surgeons through motion and video analysis using statistical modeling techniques.

Skill learning is an emerging area of human-machine collaborative control and automation. As personal robots are increasingly integrated into everyday human life, it will be important to quantify how humans skillfully perform various tasks, to develop methods that help humans learn tasks more efficiently, and to generalize those skills for autonomous robot motion. Evaluating skill is a time-consuming, subjective, and difficult process. Robotic minimally invasive surgery (RMIS) has the potential to revolutionize how we model, teach, and evaluate human manipulation skills, because the robot records motion and video data without requiring extraneous sensors.

In this talk, I will focus on developing and automating assessment methods for measuring the skill and technical competence of human motion. My work has addressed two main aspects of skill: modeling motion through statistical methods and providing augmented feedback to novices. I examine how to distinguish an expert surgeon from a novice, then explore augmented feedback methods to help trainees. Finally, I present a method for automating a surgical robot to perform part of a task using models trained on expert data.
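
To make the statistical modeling concrete, below is a minimal sketch of likelihood-based skill classification with hidden Markov models: one model is trained per skill level, and an unseen trial is graded by whichever model assigns it the higher likelihood. The sketch assumes the third-party hmmlearn library and uses synthetic stand-in data rather than the recorded surgical kinematics from the actual studies.

    # Sketch: grade a motion trial by comparing log-likelihoods under
    # per-skill-level hidden Markov models (assumes hmmlearn; the data
    # below is a synthetic stand-in for recorded tool kinematics).
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    def train_skill_model(trials, n_states=5):
        """Fit one HMM to a list of (T_i x D) kinematic trajectories."""
        X = np.concatenate(trials)
        lengths = [len(t) for t in trials]
        model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        return model

    rng = np.random.default_rng(0)
    expert_trials = [rng.normal(0.0, 0.5, size=(200, 6)) for _ in range(10)]
    novice_trials = [rng.normal(0.0, 1.5, size=(200, 6)) for _ in range(10)]

    expert_hmm = train_skill_model(expert_trials)
    novice_hmm = train_skill_model(novice_trials)

    trial = rng.normal(0.0, 0.5, size=(200, 6))  # unseen trial to grade
    label = "expert" if expert_hmm.score(trial) > novice_hmm.score(trial) else "novice"
    print("Predicted skill level:", label)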

These strategies will improve trainee performance by accelerating the rate of learning. Applying these techniques to minimally invasive surgery and medical simulation is promising, as it will facilitate new procedures, improve patient outcomes, and reduce training costs. Skill learning can also meet the needs of robotics outside the operating room and facilitate adoption in newly identified areas.

Tumor AR Overlay

Real-time tracking of kidney tumors displayed through augmented reality, eliminating the need to repeatedly consult CT scans during surgery.

Goals: I worked with Balazs Vagvolgyi to record surgeries and produce a video of a 3D augmented reality display project. The overlays are constructed by registering tumor models from preoperative CT scans to stereoscopic video during laparoscopic partial nephrectomy.
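
At the heart of such an overlay is a rigid registration between CT coordinates and camera coordinates. Below is a minimal sketch of point-based rigid registration using the standard SVD (Arun/Kabsch) method; the landmark points are illustrative, whereas the real system registered full 3D CT models to stereoscopic video.

    # Sketch: rigid registration of CT-derived landmarks to points
    # reconstructed from stereo video (standard Arun/Kabsch SVD method).
    import numpy as np

    def rigid_register(ct_pts, video_pts):
        """Find R, t minimizing ||R @ ct + t - video|| over matched points."""
        c_ct, c_vid = ct_pts.mean(axis=0), video_pts.mean(axis=0)
        H = (ct_pts - c_ct).T @ (video_pts - c_vid)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = c_vid - R @ c_ct
        return R, t

    ct = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
    true_R = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])  # 90 deg about z
    video = ct @ true_R.T + np.array([10., 5, 2])

    R, t = rigid_register(ct, video)
    print(np.allclose(ct @ R.T + t, video))  # True: the overlay aligns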

Sensory Substitution

Augmented reality display of tool-tissue interaction forces from low-cost sensors embedded in surgical instruments.

Given the high costs of the OR, surgical systems that integrate novel computer and human/machine interface technologies can revolutionize surgical procedures, extending the surgeon’s abilities to achieve better outcomes at lower costs. Haptic (force and tactile) feedback has been proposed as a way to further enhance the performance of these systems. A limitation of the current generation of minimally invasive surgical robots is the lack of haptic feedback: the operator relies solely on visual feedback to determine the amount of force being applied. Implementing direct haptic feedback to the surgeon’s hands remains impractical for clinical application because of the cost and time of applying force sensors to disposable tools and the current limitations of sensing and control technologies. The goals of this work are to develop an intuitive augmented reality system that presents force information through sensory substitution, and to evaluate its performance in a surgical task such as knot tying. This work was done in collaboration with graduate students Tope Akinbiyi and Sunipa Saha.
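
As an illustration of sensory substitution, the sketch below maps a measured tool-tissue force to a traffic-light style visual cue. The threshold values are made-up placeholders, not the calibrated force ranges used in the knot-tying experiments.

    # Sketch: substitute a visual color cue for haptic feedback.
    # Thresholds are illustrative placeholders, not calibrated values.
    def force_to_cue(force_newtons):
        """Map a tool-tissue force reading to a traffic-light overlay color."""
        if force_newtons < 1.0:
            return "green"   # below the ideal range: safe to pull harder
        elif force_newtons < 2.0:
            return "yellow"  # within the ideal range for securing a knot
        else:
            return "red"     # excessive force: risk of breaking the suture

    for f in (0.4, 1.5, 2.6):
        print(f"{f:.1f} N -> {force_to_cue(f)}")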

Funding Source: NIH Grant R01 EB002004, Whitaker R6-02-911
Special Thanks to the Minimally Invasive Surgery Training Center (MISTC) and Dr. David Yuh.

Publications

Patents:

  1. Gregory D. Hager, Carol E. Reiley, Balakrishnan Varadarajan, Sanjeev Khudanpur, Henry C. Lin, and Rajesh Kumar. Method and System for Quantifying Technical Skill. United States patent #C10692, International #PCT/US2010/028025 filed September 19, 2010.

Book Chapters/Collections:

  1. A. M. Okamura, L. N. Verner, C. E. Reiley, and M. Mahvash. “Haptics for Robot-Assisted Surgery”, International Symposium of Robotics Research, Springer Tracts in Advanced Robotics, 405-416, 2007.

Journals:

  1. B. Varadarajan, C. E. Reiley, S. Khudanpur, and G. D. Hager. “Exploring Data-Driven Statistical Models for Computer Integrated Surgery”, IEEE Transactions on Pattern Analysis and Machine Intelligence (submitted).
  2. C. E. Reiley, H. C. Lin, D. D. Yuh, G. D. Hager. “A Review of Methods for Objective Surgical Skill Evaluation”, Surgical Endoscopy, 25(2):356-66, 2011 (impact factor 3.231).
  3. L. Su, B. P. Vagvolgyi, R. Agarwal, C. E. Reiley, R. H. Taylor, and G. D. Hager. “Augmented reality during robot-assisted laparoscopic partial nephrectomy: Toward real-time 3D-CT to stereoscopic video registration,” Urology, 73(4):896-900, 2009 (impact factor 3.952).
  4. B. Vagvolgyi, C. E. Reiley, G. Hager, R. Taylor, and L.M. Su. “Augmented Reality Using Registration of 3D Computed Tomography To Stereoscopic Video of Laparoscopic Renal Surgery,” Journal of Urology, 179(4):241-241, 2008 (Impact factor 3.952).
  5. C. E. Reiley, T. Akinbiyi, D. Burschka, A. M. Okamura, C. Hasser, D. Yuh, “Effects of Visual Force Feedback on Robot-Assisted Surgical Task Performance,” The Journal of Thoracic and Cardiovascular Surgery, 135(1):196-202, January 2008 (impact factor 3.037).

Peer-Reviewed Conferences:

  1. C. E. Reiley, E. Plaku, G. D. Hager, C. C. G. Chen, “Motion Generation of Robotic Surgical Tasks: Learning From Expert Demonstrations,” IEEE Engineering in Medicine and Biology Conference, 2010 (oral).
  2. C. E. Reiley, G. D. Hager, “Decomposition of Robotic Surgical Tasks: An Analysis of Subtasks and Their Correlation to Skill,” MICCAI M2Cai workshop, 2009 (poster).
  3. B. Varadarajan, C. E. Reiley, H. C. Lin, S. Khudanpur, G. D. Hager, “Data-Derived Models for Segmentation with Application to Surgical Assessment and Training,” MICCAI, pages 426-434, 2009 (poster, acceptance rate 32%).
  4. C. E. Reiley, G. D. Hager, “Task versus Subtask Surgical Skill Evaluation of Robotic Minimally Invasive Surgery,” MICCAI, pages 435-442, 2009 (poster, acceptance rate 32%).
  5. C. E. Reiley, H. C. Lin, B. Varadarajan, B. Vagvolgyi, S. Khudanpur, D. D. Yuh, and G. D. Hager, “Automatic Recognition of Surgical Motions Using Statistical Modeling for Capturing Variability”, Medicine Meets Virtual Reality, 132:396-401, 2008 (oral).
  6. T. Akinbiyi, C. E. Reiley, S. Saha, D. Burschka, C. J. Hasser, D. D. Yuh, and A. M. Okamura. “Dynamic Augmented Reality for Sensory Substitution in Robot-Assisted Surgical Systems,” 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2006, pp. 567-570 (oral, acceptance rate 25%).

Abstracts/Short Papers/Non-Refereed:

  1. C. E. Reiley, G. D. Hager, C. C. G. Chen, “Skill Assessment for Robotic Surgery Using Statistical Models,” Female Pelvic Medicine and Reconstructive Surgery, 2010 (presentation).
  2. R. Agarwal, B. Vagvolgyi, C. E. Reiley, G. D. Hager, A.W. Levinson, L. Su, “Tumor registration of augmented reality overlay of a 3D CT to real time stereoscopic video during laparoscopic partial nephrectomy” World Robotic Urology Symposium, March 2008 (poster).
  3. B. Vagvolgyi, C. E. Reiley, G. D. Hager, A. W. Levinson, L. Su, “Toward Direct Registration of Video to Computed Tomography for Intraoperative Surgical Planning during Laparoscopic Partial Nephrectomy”, World Congress of Endourology, 2007 (poster).
  4. B. Vagvolgyi, C. E. Reiley, G. D. Hager, A. W. Levinson, L. Su, “Tumor Registration of Augmented Reality Overlay of a 3D CT to Real-time Stereoscopic Video During Laparoscopic Partial Nephrectomy”, American Urological Association, 2007 (oral, acceptance rate 27%).
  5. T. Gao, J. Ji, C. E. Reiley, B. Winters, L. Selavo, N. Whyms. “Advancing MET Through Intelligent Patient Monitoring”, 3rd Annual Medical Emergency Team/Rapid Response Team Conference, 2007 (poster).
  6. C. E. Reiley and A. M. Okamura. “Augmented Reality for Haptic Display in Robot-Assisted Surgical Systems”, SWE, Kansas City, MO, October 2006 (Best Poster Finalist).
  7. C. E. Reiley. “Dynamic Augmented Reality for Haptic Display in Robot-Assisted Surgical Systems”, CRA-W DMP Reunion at the 2004 Grace Hopper Conference, Chicago, IL: October 2004 (poster).
  8. C. E. Reiley. “Program Slicing for OpenMP Shared Memory Parallel Programs”, University of Delaware Undergraduate Summer Research Symposium, August 2003 (poster).

Peer-Reviewed Education-Based:

  1. T. Wedlick, C. E. Reiley, C. Ramey. “A Fair Game: A Low-Cost Easily Implemented Robotics Competition Leads to Diverse Entrants”, ASEE Mid-Atlantic Section Meeting, 2009 (oral).

Editorial:

  1. C. E. Reiley and G. D. Hager. “Using Robots to Train the Surgeons of Tomorrow”, IEEE Robotics & Automation blog, June 13, 2011.
  2. C. E. Reiley. “The Future of Robotics”, IEEE Robotics & Automation Magazine, pp. 19-22, March 2010.
  3. C. E. Reiley. “One on One Spotlight with the RAS President”, IEEE Robotics & Automation Magazine, p. 114, March 2009.

Theses:

  1. C. E. Reiley. “System Design and Implementation of Visual Force Feedback and Virtual Fixtures in Robot-Assisted Surgical Systems: Evaluating Alternatives to Direct Force Feedback Using Augmented Reality”, 2007.
  2. C. E. Reiley. “Haptic Integration of IBM Manipulator”, 2004.

Invited Talks and Seminars:

  1. IEEE Robotics and Automation DC-NoVa chapter meeting, “Skill Assessment for Robotic Surgery (Language of Surgery).” University of Maryland, College Park, MD. April 19, 2010.
  2. Mechanical Engineering Graduate Engineering Seminar, “Haptics and Vision in Surgical Robotics.” Santa Clara University, Santa Clara, CA. February 13, 2008.
  3. Robotics for JHU Alumni Event, “The Language of Surgery.” Liberty Science Center, Jersey City, NJ. February 10, 2008.
  4. Electrical Engineering Seminar, “Visual Force Feedback and Virtual Fixtures In Robot-Assisted Surgical Systems.” Oregon Health and Science University OGI School of Engineering and Science, Beaverton, OR. August 21, 2007.
  5. Computer Engineering Graduate Engineering Seminar, “Dynamic Augmented Reality in Robot-Assisted Surgical Systems.” Santa Clara University, Santa Clara, CA. May 26, 2005.

Grants:

  1. National Science Foundation Graduate Research Fellowship, 2007-2010 ($122,500 over 3 years).
  2. JHU Alumni Association Community Action Grant: Funded to buy robotic kits for low-income high schools to participate in the JHU Robotics Systems Challenge, 2006-2007 & 2008-2009 ($2000).
  3. JHU Digital Media Center Creative Use of Technology Grant: “Input Devices for Developing Photo and Video Applications for Social Networks” ($500).
  4. SCU Dean’s Fund: Wrote proposal to win grant for senior design project, 2004 ($500).
  5. SCU Student Leadership Fund: Grant to fund senior design project, 2004 ($500).

Force Feelin’

IBM manipulator with haptics (force feedback)

We were the first design team at Santa Clara University to experiment with force feedback and integrate haptic feedback into a robotic arm, helping make tasks feel more realistic for robot operators. The main objective of our interdisciplinary team of six was to modify an existing IBM robotic arm, model number 7545, focusing on the integration of haptics into the operation and control of the robot. This integration allows the user to feel differences in the rigidity of the objects picked up by the robot gripper: the system provides cutaneous feedback so that the user can respond to forces. For example, if the gripper picked up a rock, the user could distinguish the hardness of that object from a spring, which would feel pliable. Another objective was to restore the robot to working order, since it was non-functional and out of use. Once the robot was functional, we designed and manufactured a new user interface that not only moved the arm and gripper but was also simple and user-friendly.

The video shows the arm picking up an egg with and without haptic feedback.
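
The rigidity sensing can be sketched as a stiffness estimate from gripper force and jaw displacement, scaled to a feedback intensity. The sensor readings and stiffness range below are hypothetical, not the actual IBM 7545 interface.

    # Sketch: infer object rigidity from grip force and jaw compression,
    # then normalize it into a haptic feedback intensity. Values are
    # hypothetical; the real system's sensors and scaling differed.
    def estimate_stiffness(force_n, squeeze_mm):
        """Approximate stiffness (N/mm) as grip force over jaw compression."""
        return force_n / max(squeeze_mm, 1e-3)

    def feedback_level(stiffness, k_max=50.0):
        """Normalize stiffness to a 0-1 feedback intensity for the operator."""
        return min(stiffness / k_max, 1.0)

    for name, f, d in [("spring", 2.0, 8.0), ("rock", 40.0, 0.5)]:
        k = estimate_stiffness(f, d)
        print(f"{name}: {k:.1f} N/mm -> feedback {feedback_level(k):.2f}")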


Teaching

Courses I Have Taught:

Computer Vision
Teaching Assistant, Johns Hopkins University, Fall 2010
Primary Instructor: Dr. Gregory D. Hager
Number of students: 50 (37 graduate and 13 undergraduate)
Role: Held office hours, conducted problem-solving sessions, graded exams.

Facebook 101: Developing Photo and Video Applications for Online Social Networks
Instructor & new course designer, Johns Hopkins University, Intersession 2009
Number of students: 12 undergraduate students
Taught students how to create and launch Web 2.0 applications using computer vision tools to detect or track objects in the environment. Students worked in small teams to conceptualize, develop, distribute, and market new applications to Facebook users. Co-taught with Daniel Mirota.

Haptic Applications in Medical Robotics
Instructor & new course designer, Johns Hopkins University, Intersession 2007
Student Evaluations: Mean teaching evaluation score of 4.85 out of 5.0
Number of students: 14 undergraduate students
An overview of medical robotic technology and haptic feedback in a surgical setting. Course work included weekly lectures, hands-on laboratory exercises, paper readings, discussions, and presentations. Co-taught with Dr. Panadda Marayong.

Graduate Courses I Have Taken:

Computer Science: Computer Vision (Hager), Databases (Yarowsky), Programming Languages (Smith), Randomized Algorithms (Kosaraju), Computer Network Fundamentals (Masson), Machine Learning (Sheppard), Linear Optimization (Han)

Mechanical: Motors, Sensors, and Actuators (Okamura), Introduction to Robotics (Cowan), Haptic Systems (Mahvash, short course)

Biomedical: Computer Integrated Surgery I (Taylor), Surgery for Engineers (Brown), Responsible Research Conduct, Biology for Engineers

Other: Intellectual Property (short course), Engineering Management

Seafox

Low-cost underwater remotely operated vehicle.

Along with six other students, I built a rudimentary, low-cost tethered vehicle made of PVC tubing that uses bilge pumps as thrusters. Our interdisciplinary team installed a camera and an internet control system that allow students to fully control the vehicle and to rapidly prototype new ideas on a simple engineering platform. Over the year, we constructed a low-cost but robust remotely operated vehicle consisting of a PVC-pipe frame, six thrusters for propulsion and maneuvering, a tether supplying controlled power from a 12-volt DC source, floats, and a push-button controller for the thrusters. The robot streamed video so that a recording was displayed on deck. I configured and tested the robot’s safety system, which involved programming a BASIC Stamp microcontroller, interfacing relays, and setting up communications between the BASIC Stamp and a DSP.
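
The safety system amounts to a priority override on the thruster relays. The sketch below restates that logic in Python for readability (the real implementation ran on the BASIC Stamp); the signal names and current limit are hypothetical.

    # Sketch: push-button thruster control with safety overrides, a
    # readable restatement of the microcontroller logic. Signal names
    # and the 15 A limit are hypothetical.
    def thruster_commands(buttons, leak_detected, bus_current_amps):
        """Map button state to relay outputs; safety faults kill all thrusters."""
        if leak_detected or bus_current_amps > 15.0:
            return {t: False for t in ("fwd", "aft", "port", "stbd", "up", "down")}
        return {
            "fwd":  buttons.get("forward", False),
            "aft":  buttons.get("reverse", False),
            "port": buttons.get("left", False),
            "stbd": buttons.get("right", False),
            "up":   buttons.get("ascend", False),
            "down": buttons.get("descend", False),
        }

    print(thruster_commands({"forward": True}, leak_detected=False, bus_current_amps=6.0))
    print(thruster_commands({"forward": True}, leak_detected=True, bus_current_amps=6.0))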


Picture of Seafox (courtesy of SCU RSL)

Kurvy Kirby

An art robot that is functional and aesthetically pleasing. For my motors, sensors, and actuators course project, I designed a robot that created pin-and-thread art: creative pictures drawn from straight lines while rotating the paper. Kurvy Kirby was designed and built by Sunipa Saha and Carol Reiley in Fall 2004.
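
The curves in pin-and-thread art emerge as envelopes of straight chords. Below is a minimal sketch of one classic pattern, connecting pin i to pin 2i around a circle, which traces a cardioid; it is illustrative rather than the exact figures Kurvy Kirby drew.

    # Sketch: straight chords between pins on a circle trace a curve.
    # Connecting pin i to pin (2*i mod n) produces a cardioid envelope,
    # a classic pin-and-thread pattern (illustrative only).
    import math

    def chord(i, n, radius=1.0):
        """Endpoints of the thread from pin i to pin 2*i on an n-pin circle."""
        a = 2 * math.pi * i / n
        b = 2 * math.pi * ((2 * i) % n) / n
        return ((radius * math.cos(a), radius * math.sin(a)),
                (radius * math.cos(b), radius * math.sin(b)))

    n = 60
    for i in range(n):
        (x0, y0), (x1, y1) = chord(i, n)
        print(f"line ({x0:+.2f},{y0:+.2f}) -> ({x1:+.2f},{y1:+.2f})")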