Sovereign AI and digital twin technologies for neurological and surgical care; amyloid PET/MRI quantification and trustworthy generative neuroimaging.
I focus on the intersection of deep learning, neuroimaging, and trustworthy AI for clinical decision support.
I am a Postdoctoral Researcher at the Digital Medicine and Smart Healthcare Research Center, National Yang Ming Chiao Tung University (NYCU), focusing on sovereign AI and digital twin technologies for healthcare. I completed my Ph.D. in Biomedical Engineering at the University of Alabama at Birmingham (UAB), where I developed deep learning methods for amyloid PET quantification, MRI-assisted and MRI-less brain segmentation, simultaneous PET/MR motion correction, and large-scale neuroimaging biomarker analysis. I work at the interface of trustworthy clinical AI, neuroimaging, and real-time surgical computer vision.
Dissertation: “Optimizing Quantification in Alzheimer’s Disease with PET Imaging through Advanced Imaging and Deep Learning Techniques”
MR-based deep learning segmentation (LEON) for amyloid PET quantification: diagnostic performance and equivalence to FreeSurfer across large neuroimaging cohorts.
Deep learning amyloid PET quantification without MRI, using synthetic CT for training; equivalent to the MRI-based standard and robust on external PET/CT cohorts.
Tracer characteristic–based co-registration for simultaneous PET/MR: reduced misalignment, improved quantification and task detectability.
A modified VGG classifier combined with four face detectors to estimate regional mask-wearing rates (course project).
Mobile game for emotion-recognition training in children with ASD (ABA-inspired); RehabWeek 2019 Student Design Challenge, 3rd place (RESNA).