William "Liam" Wang

I am a PhD student in Computer Science at the University of Michigan, advised by Dr. JJ Park. My research interests include artificial intelligence and machine learning, medical imaging, high-performance graphics computing, and virtual/augmented reality. I have participated in research projects involving VR/AR for medical procedures, VR for teleoperated robotic satellite repair, bioinformatics algorithms, clinical prediction tools, and robotic navigation systems.

Email  /  LinkedIn  /  GitHub

profile photo

Research

My current research focuses on generative deep learning for dynamic 3D image reconstruction. I am developing simulation tools and novel deep learning methods for reconstructing 3D structures from dynamic, unsynchronized 2D multi-view images via 3D Gaussian Splatting.
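
For readers unfamiliar with the representation, below is a minimal NumPy sketch of the core idea behind Gaussian splatting: a scene is stored as a set of 3D Gaussians, and an image is formed by projecting them and alpha-compositing front to back. The orthographic camera, isotropic covariances, and all parameter values are simplifying assumptions for illustration; this is not my research code.

```python
import numpy as np

def splat_gaussians(means, stds, colors, opacities, H=64, W=64):
    """Toy orthographic splatting of isotropic 3D Gaussians onto an HxW image.

    means:     (N, 3) Gaussian centers in a [0, 1]^3 scene box
    stds:      (N,)   isotropic standard deviations
    colors:    (N, 3) RGB values in [0, 1]
    opacities: (N,)   per-Gaussian opacity in [0, 1]
    """
    # Sort front to back by depth (smaller z = closer to the camera).
    order = np.argsort(means[:, 2])
    means, stds = means[order], stds[order]
    colors, opacities = colors[order], opacities[order]

    # Pixel-center grid in the same normalized coordinates as the scene.
    ys, xs = np.meshgrid(np.linspace(0, 1, H), np.linspace(0, 1, W), indexing="ij")

    image = np.zeros((H, W, 3))
    transmittance = np.ones((H, W))           # how much light still passes through
    for mu, s, c, o in zip(means, stds, colors, opacities):
        d2 = (xs - mu[0]) ** 2 + (ys - mu[1]) ** 2
        alpha = o * np.exp(-0.5 * d2 / s**2)  # 2D footprint of the projected Gaussian
        image += (transmittance * alpha)[..., None] * c
        transmittance *= 1.0 - alpha
    return image

# Example: render a handful of random Gaussians.
rng = np.random.default_rng(0)
img = splat_gaussians(rng.random((20, 3)), rng.uniform(0.02, 0.08, 20),
                      rng.random((20, 3)), rng.uniform(0.3, 0.9, 20))
print(img.shape, img.min(), img.max())
```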

Toward Improved Non-Invasive Coronary Disease Assessment: 3D Coronary Reconstruction from X-Ray Angiography.
Wang LJ, Figueroa CA, Park JJ.
18th U.S. National Congress on Computational Mechanics, Chicago, Illinois, July 2024

We built an automated, deep learning-based approach for reconstructing patient-specific 3D coronary trees from standard unsynchronized 2D angiogram videos, enabling downstream 3D quantitative analysis of coronary deformation and contrast washout dynamics. Our feed-forward, transformer-based model reconstructs coronary radiodensity volumes from multiple sparse angiograms. We use a novel adversarial reconstruction loss, which enables robust supervision in the presence of uncertainty and inconsistency. Our model trains on unlabeled angiograms, allowing for improved scalability and generalizability across diverse clinical scenarios. Results suggest successful 3D coronary artery reconstruction from synthetic 2D angiograms.
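
The abstract above describes the model and loss only at a high level; as a generic sketch of what an adversarial reconstruction loss can look like (the toy networks, sum-projection renderer, and random placeholder data are all assumptions for illustration, not the actual transformer model), the snippet below trains a reconstructor against a discriminator that compares real projections with projections re-rendered from the predicted volume.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: a tiny MLP reconstructs a coarse 16^3 volume from one 2D view,
# just to show the training loop, not the paper's architecture.
class ToyReconstructor(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 512), nn.ReLU(),
                                 nn.Linear(512, 16 ** 3))
    def forward(self, views):                  # views: (B, 1, 64, 64)
        return self.net(views).view(-1, 16, 16, 16).sigmoid()

class ToyDiscriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(16 * 16, 128), nn.ReLU(),
                                 nn.Linear(128, 1))
    def forward(self, proj):                   # proj: (B, 16, 16)
        return self.net(proj)

def project(volume):
    """Crude stand-in for a projection renderer: sum radiodensity along one axis."""
    return volume.sum(dim=1)

recon, disc = ToyReconstructor(), ToyDiscriminator()
opt_g = torch.optim.Adam(recon.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

views = torch.rand(4, 1, 64, 64)               # placeholder angiogram frames
real_proj = torch.rand(4, 16, 16)              # placeholder "real" projections

# Discriminator step: real projections vs. projections of the predicted volume.
fake_proj = project(recon(views)).detach()
loss_d = bce(disc(real_proj), torch.ones(4, 1)) + bce(disc(fake_proj), torch.zeros(4, 1))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# Reconstructor step: make re-rendered projections look real (adversarial supervision).
loss_g = bce(disc(project(recon(views))), torch.ones(4, 1))
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
print(float(loss_d), float(loss_g))
```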

FluoroSAM: A Language-Promptable Foundation Model for Flexible X-Ray Image Segmentation.
Killeen BD, Wang LJ, Iñigo B, Zhang H, Armand M, Taylor RH, Osgood G, Unberath M.
Medical Image Computing and Computer Assisted Intervention (MICCAI 2025), 28th International Conference, Daejeon, South Korea, September 2025.
Conference Paper

We introduce FluoroSAM, a language-promptable variant of the Segment-Anything Model, trained from scratch on 3M synthetic X-ray images spanning a wide variety of human anatomies, imaging geometries, and viewing angles. FluoroSAM is capable of segmenting myriad anatomical structures and tools based on natural language prompts, thanks to the novel incorporation of vector quantization (VQ) of text embeddings in the training process. We demonstrate FluoroSAM’s performance quantitatively on real X-ray images and showcase in several applications how FluoroSAM enables rich human-machine interaction in the X-ray image acquisition and analysis context.
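
As a generic illustration of vector quantization of embeddings (a minimal VQ-VAE-style layer, not FluoroSAM’s actual prompt encoder), the sketch below snaps a continuous text embedding to its nearest learned codebook entry with a straight-through gradient so it can serve as a discrete prompt token.

```python
import torch
import torch.nn as nn

class VectorQuantizer(nn.Module):
    """Minimal VQ layer: map each embedding to its nearest learned codebook vector."""
    def __init__(self, num_codes=512, dim=256, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)
        self.beta = beta

    def forward(self, z):                       # z: (B, dim) continuous text embeddings
        # Squared distances to every codebook entry, then nearest-neighbor lookup.
        d = (z.unsqueeze(1) - self.codebook.weight.unsqueeze(0)).pow(2).sum(-1)
        idx = d.argmin(dim=1)
        z_q = self.codebook(idx)

        # Standard VQ-VAE losses: pull the codebook toward encoder outputs and vice versa.
        vq_loss = (z_q - z.detach()).pow(2).mean() + self.beta * (z - z_q.detach()).pow(2).mean()

        # Straight-through estimator so gradients flow back to the text encoder.
        z_q = z + (z_q - z).detach()
        return z_q, idx, vq_loss

# Usage: quantize a batch of (hypothetical) text-prompt embeddings before feeding them
# to a segmentation decoder as prompt tokens.
vq = VectorQuantizer()
text_emb = torch.randn(8, 256)                  # e.g. output of a frozen text encoder
tokens, codes, loss = vq(text_emb)
print(tokens.shape, codes.shape, float(loss))
```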

HaptiKart: An engaging videogame reveals elevated proprioceptive bias in individuals with autism spectrum disorder.
Lidstone DE, Singhala M, Wang LJ, Brown JD, Mostofsky SH.
PLOS Digital Health, 18 June 2025.
Paper

We explored how people with autism spectrum disorder (ASD) may rely more on body-based (proprioceptive) senses than on visual information when interacting with their surroundings. Using a custom videogame, HaptiKart, we designed a fun and engaging driving task to measure this sensory preference. Players drove a virtual car using a steering wheel that could be set to delay either visual or proprioceptive feedback, allowing us to observe how each type of feedback affects driving accuracy. Our results showed that individuals with ASD had a stronger preference for proprioceptive feedback than those without ASD. This sensory preference was linked to greater autism symptom severity and lower IQ, suggesting that heightened reliance on body-centered feedback may contribute to learning and skill differences often seen in autism. These findings support the potential of HaptiKart as a simple, accessible tool to help clinicians better understand and address sensory biases in ASD, tailoring interventions to improve learning from visual information.

Stand in Surgeon's Shoes: Fostering empathy and engagement among surgical teams through virtual reality.
Killeen BD, Zhang H, Wang LJ, Liu Z, Taylor R, Unberath M.
International Conference on Information Processing in Computer-Assisted Intervention (IPCAI), Barcelona, June 18-19, 2024

Leveraging virtual reality (VR), we evaluate a curriculum in which surgeons and non-surgeons swap roles inside a realistic VR operating room. Our study focuses on X-ray-guided pelvic trauma surgery, a procedure where successful communication depends on the shared mental model between the surgeon and the C-arm technologist. Exposing non-surgeons to VR surgical training results in higher engagement with the C-arm technologist role in VR. It also has a significant effect on non-surgeons’ mental model of the overall task; novice participants’ estimation of the mental demand and effort required for the surgeon’s task increases after training.

Wearable Mechatronic Ultrasound-Integrated AR Navigation System for Lumbar Puncture Guidance.
Jiang B and Wang LJ (co-first authors), Xu K, Hossbach M, Demir A, Rajan P, Taylor R, Moghekar A, Foroughi P, Kazanzides P, Boctor E.
IEEE Transactions on Medical Robotics and Bionics, September 2023
Paper

We present a complete lumbar puncture guidance system integrating (1) a wearable mechatronic ultrasound imaging device, (2) volume reconstruction and bone surface estimation algorithms, and (3) two alternative augmented reality user interfaces for needle guidance, including a HoloLens-based and a tablet-based solution.
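
As a rough illustration of the coordinate-frame bookkeeping such a navigation system involves (a generic sketch with hypothetical frame names and calibration values, not the system's actual software), the snippet below chains rigid transforms so a target defined in the ultrasound-volume frame can be expressed in the AR display's frame for rendering.

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def apply(T, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# Hypothetical calibration chain: ultrasound volume -> tracker -> AR headset (HMD).
T_tracker_from_us = rigid_transform(np.eye(3), np.array([0.10, 0.00, 0.05]))    # device calibration
T_hmd_from_tracker = rigid_transform(np.eye(3), np.array([-0.30, 0.20, 1.00]))  # headset registration

# A target point (e.g. an estimated needle entry point) in the ultrasound-volume frame.
target_us = np.array([0.02, 0.01, 0.04])

# Compose the transforms to express the target in the headset frame.
T_hmd_from_us = T_hmd_from_tracker @ T_tracker_from_us
target_hmd = apply(T_hmd_from_us, target_us)
print(target_hmd)
```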

A Virtual Reality Planning Environment for High-Risk, High-Latency Teleoperation
Pryor W, Wang LJ, Chatterjee A, Vagvolgyi B, Deguet A, Leonard S, Whitcomb L, Kazanzides P.
Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA), pp. 11619-11625
Paper

Teleoperation of robots in space is challenging due to high latency and limited workspace visibility. We developed a 3D virtual reality (VR) interface for the Interactive Planning and Supervised Execution (IPSE) system and implemented it on a Meta Quest 2 head-mounted display (HMD). We demonstrated reduced operator workload with the 3D VR interface, with no decrease in task performance, while also providing cost and portability benefits compared to the conventional 2D interface.

Smartphone-Based Augmented Reality Education for Patients Undergoing Radiation Therapy.
Wang LJ, Casto B, Reyes-Molyneux N, Chance WW, Wang SJ.
Technical Innovations and Patient Support in Radiation Oncology, volume 29, 100229, March 2024.
Paper

We built an augmented reality (AR) patient education iOS/Android application that allows patients to view a virtual simulation of themselves receiving radiation treatment. The app reads DICOM-RT data from the clinical treatment planning system, converts the patient’s CT data into translucent 3D mesh objects, and renders the patient’s actual radiotherapy plan in real time on any smartphone or tablet. In a patient study, we found that this simplified, low-cost, tablet-based personalized AR simulation can be a helpful educational tool and reduces patient anxiety.
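
For readers curious what the CT-to-mesh step might look like, here is a minimal sketch using pydicom and scikit-image; the directory name, Hounsfield-unit threshold, and OBJ export are assumptions for illustration and are simpler than the app's actual DICOM-RT pipeline.

```python
import glob
import numpy as np
import pydicom
from skimage import measure

# Load a CT series (hypothetical directory) and sort slices by position along the scan axis.
slices = [pydicom.dcmread(f) for f in glob.glob("ct_series/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))

# Stack into a volume and convert stored values to Hounsfield units.
volume = np.stack([s.pixel_array.astype(np.float32) for s in slices])
volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)

# Extract the skin surface with marching cubes at an assumed threshold (~ -300 HU),
# using slice spacing and in-plane pixel spacing so the mesh has physical dimensions.
spacing = (float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2]),
           float(slices[0].PixelSpacing[0]), float(slices[0].PixelSpacing[1]))
verts, faces, _, _ = measure.marching_cubes(volume, level=-300.0, spacing=spacing)

# Write a simple OBJ file that a mobile renderer could display as a translucent mesh.
with open("body_surface.obj", "w") as f:
    for v in verts:
        f.write(f"v {v[0]:.3f} {v[1]:.3f} {v[2]:.3f}\n")
    for a, b, c in faces + 1:                    # OBJ indices are 1-based
        f.write(f"f {a} {b} {c}\n")
print(f"{len(verts)} vertices, {len(faces)} faces")
```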

Smartphone Example (Safari, Chrome, Samsung):

Website template design by Jon Barron.