From WPI Automation and Interventional Medicine (AIM) Robotics Laboratory
This is our old page, please see the new AIM Lab website at: http://aimlab.wpi.edu
Fields of Research
Please see the Publications page for a complete listing of published works.
Medical robotics is the link that enables “closed loop medicine” by using real-time feedback to guide a surgical procedure. We are integrating interactive MRI imaging with the interventional procedure for applications including deep brain stimulation lead placement for Parkinson’s Disease and brachytherapy seed implantation for prostate cancer therapy.
MRI Compatible Robotics
Magnetic Resonance Imaging (MRI) is an excellent imaging modality for many conditions, but to date there has been limited success in harnessing this modality for the guidance of interventional procedures. MRI is an ideal interventional guidance modality: it provides near real-time high-resolution images at arbitrary orientations and is able to monitor therapeutic agents, surgical tools, biomechanical tissue properties, and physiological function. At the same time, MRI poses formidable engineering challenges: the scanner severely limits access to the patient, and its high magnetic field prevents the use of conventional materials and electronic equipment.
Direct MR image guidance during deep brain stimulation (DBS) insertion offers many benefits; most significantly, interventional MRI can be used for planning, monitoring of tissue deformation, real-time visualization of insertion, and confirmation of placement. The accuracy of standard stereotactic insertion is limited by registration errors and brain movement during surgery. With real-time acquisition of high-resolution MR images during insertion, probe placement can be confirmed intra-operatively. Direct MR guidance has not taken hold because it is often confounded by a number of issues, including the limited MR-compatibility of existing stereotactic surgery equipment and restricted patient access in the scanner bore. The high-resolution images required for neurosurgical planning and guidance demand high-field MR (1.5-3T); thus, any system must be capable of working within the constraints of a closed, long-bore diagnostic magnet. Currently, no technological solution exists to assist MRI-guided neurosurgical interventions in an accurate, simple, and economical manner.
The objective of our research is to make conventional diagnostic closed high-field MRI scanners available for guiding deep brain stimulation electrode placement interventions for treatment of Parkinson's Disease and other neurological disorders including severe depression and Alzheimer's Disease.
Our approach is to employ an MRI-compatible robotic assistant for guiding DBS electrode insertion under direct, real-time MR image guidance. The system will allow interactive probe alignment under real-time imaging in standard diagnostic high-field MR scanners. Use of a robotic assistant will minimize the potential for human error and mis-registration associated with the current procedure and will better address the practical issues of operating in an MR scanner bore.
Prostate cancer is the most common male cancer and the second most common type of cancer overall. MRI is an ideal guidance modality, with the ability to perform high-quality, volumetric, real-time, multi-parametric imaging with high soft-tissue contrast and without ionizing radiation. The objective of this project is to develop a highly sensitive and accurate image-guided interventional robotic system, readily integrated into the clinical workflow, for targeted core biopsy as well as focal delivery of therapeutics to locally advanced tumor sites within the prostate.
MRI has the potential to be a superior medical imaging modality for guiding and monitoring prostatic interventions, providing high-quality 3D visualization of the prostate and surrounding tissue. However, these benefits cannot be readily harnessed for interventional procedures due to the difficulties that surround the use of high-field magnets (1.5T or greater): the strong magnetic field prevents the use of conventional mechatronics, and the confined physical space makes it extremely challenging to access the patient. We have designed a robotic assistant system that overcomes these difficulties and promises safe and reliable intra-prostatic needle placement inside closed high-field MRI scanners.
The unavailability of robot control interfaces that are compatible with the MRI environment has severely limited the ability to do research in the field. The high cost of entry into MRI robotics has been primarily due to the need for each researcher to develop and evaluate their control system in the scanner. We have developed an MRI-compatible robot controller that sits in the scanner room without interfering with scanner imaging. The controller is modular, supports many different inputs and outputs, and communicates with a high-level planning and navigation software workstation through fiber-optic connections.
Traditional actuators are often contraindicated by the strong magnetic and electric fields present in the MRI scanner bore. Further, it is critical that the devices not introduce noise or distortion into the acquired images. We are evaluating different actuator schemes including pneumatics and piezoelectric actuators. We are investigating ways of optimizing piezoelectric motors for MR-compatibility and developing high-accuracy pneumatic control systems.
Traditional sensors in robotics include force and position sensing. However, off-the-shelf sensors are not suitable for use in MRI due to the potential for image degradation, malfunction, or safety issues. We are evaluating and developing sensors to be used in the MR environment. The current focus is on optical techniques for force and position sensing that do not compromise image quality and will allow for haptic feedback during MRI-guided interventions.
With increasing research on system integration for image-guided therapy (IGT), there has been a strong demand for standardized communication among devices and software to share data such as target positions, images, and device status. We have worked on integration and development of components for OpenIGTLink, a standardized mechanism for connecting software and hardware over the network in IGT applications.
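To make the idea of a standardized message concrete, the sketch below packs an OpenIGTLink-style message header in Python. The 58-byte header layout (protocol version, 12-byte type name, 20-byte device name, timestamp, body size, CRC) follows the published OpenIGTLink version 1 header format; the zeroed timestamp and CRC are simplifications for illustration, not a complete implementation.

```python
import struct

# Big-endian header: version, type name, device name, timestamp, body size, CRC.
IGTL_HEADER_FMT = ">H12s20sQQQ"  # 2 + 12 + 20 + 8 + 8 + 8 = 58 bytes

def pack_igtl_header(msg_type, device_name, body, timestamp=0, crc=0):
    """Pack a 58-byte OpenIGTLink-style header for the given message body."""
    return struct.pack(
        IGTL_HEADER_FMT,
        1,                                     # protocol version
        msg_type.encode().ljust(12, b"\x00"),  # e.g. "TRANSFORM", "IMAGE", "STATUS"
        device_name.encode().ljust(20, b"\x00"),
        timestamp,                             # 0 here; real senders use a fixed-point time
        len(body),                             # body size in bytes
        crc,                                   # 64-bit CRC of the body (omitted for brevity)
    )

# A TRANSFORM body is a 4x3 matrix of 32-bit floats (48 bytes); zeros as a stand-in.
body = b"\x00" * 48
header = pack_igtl_header("TRANSFORM", "Robot", body)
print(len(header))  # 58
```

In practice, the header plus body would be written to a TCP socket connecting the robot controller and the navigation workstation; libraries such as the OpenIGTLink C++ reference implementation handle the CRC and timestamp details.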
Augmented Reality Procedural Guidance
Magnetic Resonance Imaging (MRI) provides great potential for planning, guiding, monitoring, and controlling interventions. MR arthrography (MRAr) is the imaging gold standard for assessing small ligament and fibrocartilage injury in joints. In contemporary practice, MRAr consists of two consecutive sessions: 1) an interventional session in which a needle is driven into the joint space and MR contrast is injected under fluoroscopy or CT guidance, and 2) a diagnostic MR imaging session to visualize the distribution of contrast inside the joint space and evaluate the condition of the joint. Our approach to MRAr is to eliminate the separate radiologically guided needle insertion and contrast injection procedure by performing those tasks on conventional high-field closed MRI scanners. We propose a 2D augmented reality image overlay device to guide needle insertion procedures. This approach makes diagnostic high-field magnets available for interventions without a complex and expensive engineering entourage. In preclinical trials, needle insertions have been performed in the joints of porcine and human cadavers using MR image overlay guidance; insertions successfully reached the joint space on the first attempt in all cases.
Description of the MRI Image Overlay on the NSF ERC Achievements Showcase:
MRI-Guided Needle Placement with Augmented Reality Guidance
Image-guided percutaneous needle-based surgery has become part of routine clinical practice for procedures such as biopsies and injections. Image-guided needle placement in CT/MR benefits from an accurate and effective augmented reality (AR) system, but operators must be trained to use such a system. We have therefore developed “The Perk Station,” a laboratory validation and training system for measuring operator performance under different assistance techniques for needle-based surgical guidance. Three techniques are fitted in this training suite: image overlay, bi-plane laser guide, and traditional freehand. An electromagnetic tracking system is used for validation. The Perk Station is an inexpensive, simple, and easily reproducible surgical navigation workstation for laboratory practice, incorporating all of the above functions in a self-contained unit.
Robots in Education
We are developing robotic systems applicable to teaching the fundamentals of robotics. The first generation of the robot has been successfully used to teach the junior level undergraduate Robotics Engineering courses at WPI.
No single hardware platform provides all of the tools required to teach a robotics engineering curriculum. We are developing a unified platform specifically designed for multidisciplinary undergraduate robotics education, and have built a set of instructional equipment including the RBE development board, a manipulator arm, and a mobile platform.
Undergraduate Research Projects
The goal of this work is to study perceptions of the use of robotics in surgery. We are specifically investigating differences in these perceptions among different patient and medical-professional populations. The work is primarily focused on use of the da Vinci Surgical System.
This project is focused on developing a compact, intrinsically safe humanoid robot for interaction with autistic children. The robot will be usable for both treatment and assessment.
Developing electromagnetically (EM) tracked tools can be very time consuming. Tool design traditionally takes many iterations, each of which requires constructing a physical tool and performing lengthy experiments. We propose a simulator that allows tools to be virtually designed and tested before ever being physically built. Both the tool rigid-body (RB) configuration and the reference RB configuration are specified; the reference RB can be located anywhere in the field, and the tool is virtually moved around the reference in a user-specified pattern. Sensor measurements of both RBs are artificially distorted according to a previously acquired error-field mapping, and the 6-DOF frames of the tool and reference are refit to the distorted sensors. This makes it possible to predict the tool-tip registration error for a particular tool and coordinate reference frame (CRF) in a particular scenario before the tools are ever built.
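The refitting step described above can be sketched with a standard SVD-based (Kabsch) rigid-body fit: perturb the designed sensor positions, refit the 6-DOF frame, and propagate the residual pose error to the tool tip. The sensor layout, tip offset, and Gaussian perturbation below are illustrative stand-ins for a real error-field mapping, not the lab's actual simulator.

```python
import numpy as np

def fit_rigid_transform(model_pts, measured_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto measured_pts,
    computed with the SVD-based Kabsch algorithm."""
    cm = model_pts.mean(axis=0)
    cd = measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cd)       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cm
    return R, t

# Hypothetical 4-sensor tool (mm) with a tip 100 mm from the sensor cluster.
rng = np.random.default_rng(0)
sensors = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [0, 0, 30]], float)
tip = np.array([0.0, 0.0, 100.0])

# Stand-in for the error-field distortion: small Gaussian perturbation of each sensor.
distorted = sensors + rng.normal(0.0, 0.2, sensors.shape)

R, t = fit_rigid_transform(sensors, distorted)
tip_error = np.linalg.norm((R @ tip + t) - tip)        # predicted tip registration error
print(round(tip_error, 3))
```

Because the tip sits on a long lever arm, small angular errors in the refit frame amplify into millimeter-scale tip error, which is exactly the quantity the simulator is designed to predict before a tool is built.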
Sensing Surgical Instruments
Gaining access to a surgical site by retracting neighboring tissue can result in complications: occlusion of the tissue's blood supply can cause ischemic damage. By incorporating oxygenation sensors on the working surfaces of surgical retractors and graspers, it is possible to measure local tissue oxygen saturation and watch for trends in real time. Further, by simultaneously measuring tissue interaction forces, we can further augment the information available to the surgeon. The sensors provide a means of sensory substitution to help compensate for the decreased sensation in minimally invasive laparoscopic and robotic procedures, which are gaining significant popularity. Sensing surgical instruments will allow for safer and more effective surgeries without interfering with the normal workflow of a procedure.
Robotic Ultrasound and Liver Ablation
There has been increased interest in minimally invasive ablative treatments, which typically require precise placement of the ablator tool to match the predefined plan and achieve efficient tumor destruction. Standard ablative procedures involve freehand transcutaneous ultrasonography (TCUS) in conjunction with manual tool positioning. Unfortunately, existing TCUS systems suffer from many limitations and fail to identify nearly half of all treatable liver lesions. Freehand manipulation of the ultrasound (US) probe and ablator tool critically lacks the level of control, accuracy, stability, and guaranteed performance required for these procedures. Freehand US results in undefined gap distribution, anatomic deformation due to variable pressure from the sonographer's hand, and severe difficulty in maintaining optimal scanning position. In response to these limitations, we developed a dual robotic arm system that manages both ultrasound manipulation and needle guidance. We have performed a comparative performance study of robotic vs. freehand systems for both US scanning and needle placement in mechanical and animal-tissue phantoms.
Steady Hand Guided Aneurysm Clip Applier
Steady hand guidance provides high-accuracy motion while keeping the surgeon in contact with the surgical instrument. Force sensors are mounted between the instrument and the robot, and as the surgeon applies forces to the instrument, the robot moves accordingly. Tremor reduction, force scaling, and virtual fixtures can be applied to enhance control. This application uses steady hand guidance to precisely place brain aneurysm clips. The system was demonstrated and received good feedback at the CNS Conference in Denver, CO.
CT-Guided Intracranial Hemorrhage (ICH) Evacuation
We developed a robotic system for rapid removal of blood from the brain after a bleeding event that leaves blood in the ventricles or brain parenchyma. The procedure is performed inside a CT scanner. A hematoma evacuator is aligned with the target “out-of-plane” using a couch-mounted 2-DOF remote center of motion (RCM) robot. The robot is registered to CT image space with a purely image-based, out-of-plane stereotactic registration. The system is frameless, and the patient is secured in the treatment position in a non-invasive manner. We achieved excellent out-of-plane tool placement accuracy in mechanical phantoms (1.0 mm) and demonstrated the workflow on a human cadaver.