William "Liam" Wang
Hello! I am a PhD student in Computer Science at the University of Michigan, advised by Dr. JJ Park.
I have participated in research projects involving robotic navigation systems, mixed reality interfaces for medical procedures, advanced motion planning for remote operation of a satellite repair robotic arm, and machine learning for bioinformatics.
Email /
LinkedIn /
GitHub
Research
My current research interests are at the intersection of deep learning for robotics and visual computing (computer vision and 3D graphics).
A Dynamic Digital Twin for X-ray Guided Surgery
Killeen BD and Wang LJ (co-first authors), Zhang J, She Z, Taylor R, Osgood G, Unberath M.
in preparation
manuscript
We built a system that tracks the changing 3D pose of hardware elements (K-wires) during insertion in percutaneous pelvic fracture fixation surgery, using only standard intraoperative X-ray fluoroscopy images.
Our approach dynamically builds a 3D anatomic 'digital twin' by estimating the pose of the hardware during insertion with a deep neural network (DNN) and a global optimization method for per-frame localization.
Stand in Surgeon's Shoes: Fostering Empathy and Engagement Among Surgical Teams Through Virtual Reality.
Killeen BD, Zhang H, Wang LJ, Liu Z, Taylor R, Unberath M.
International Conference on Information Processing in Computer-Assisted Interventions (IPCAI),
Barcelona, June 18-19, 2024
manuscript
Leveraging virtual reality (VR),
we evaluate a curriculum in which surgeons and non-surgeons swap roles inside
a realistic VR operating room. Our study focuses on X-ray guided pelvic trauma
surgery, a procedure where successful communication depends on the shared
mental model between the surgeon and the C-arm technologist.
Exposing non-surgeons to VR surgical training results in higher engagement with the C-arm technologist role in VR.
It also has a significant effect on non-surgeons' mental models of the overall task:
novice participants' estimates of the mental demand and effort required for the surgeon's task increase
after training.
Wearable Mechatronic Ultrasound-Integrated AR Navigation System for Lumbar Puncture Guidance.
Jiang B and Wang LJ (co-first authors), Xu K, Hossbach M, Demir A, Rajan P, Taylor R, Moghekar A, Foroughi P, Kazanzides P, Boctor E.
IEEE Transactions on Medical Robotics and Bionics, 27 Sep 2023
paper
We present a complete lumbar puncture guidance system
integrating (1) a wearable mechatronic ultrasound imaging device,
(2) volume-reconstruction and bone-surface estimation algorithms, and
(3) two alternative augmented reality user interfaces for needle guidance:
a HoloLens-based and a tablet-based solution.
A Virtual Reality Planning Environment for High-Risk, High-Latency Teleoperation
Pryor W, Wang LJ, Chatterjee A, Vagvolgyi B, Deguet A, Leonard S, Whitcomb L, Kazanzides P.
Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), pp. 11619-11625, 27 Sep 2023
paper
Teleoperation of robots in space is challenging due to high latency and limited workspace visibility.
We developed a 3D virtual reality (VR) interface for
the Interactive Planning and Supervised Execution (IPSE) system
and implemented it on a Meta Quest 2 head-mounted display (HMD).
We demonstrated reduced operator workload with the 3D VR interface, with no decrease in task performance, while also providing cost and
portability benefits compared to the conventional 2D interface.
Smartphone-Based Augmented Reality Education for Patients Undergoing Radiation Therapy.
Wang LJ, Casto B, Reyes-Molyneux N, Chance WW, Wang SJ.
Technical Innovations and Patient Support in Radiation Oncology, volume 29, 100229, March 2024.
paper
We built an augmented reality (AR) patient education iOS/Android application that allows patients to view a virtual simulation of themselves receiving radiation treatment.
We read DICOM-RT data from the clinical treatment planning system, convert the patient's CT data into translucent 3D mesh objects, and render the patient's actual radiotherapy plan in real time on any smartphone or tablet.
We conducted a patient study and found that this simplified, low-cost, personalized AR simulation can be a helpful educational tool and can reduce patient anxiety.
Smartphone Example (Safari, Chrome, Samsung):