AI integrating imaging and genetics to understand human evolution, development, aging, and disease

Abstract
Imaging has been the primary means of diagnosing and tracking the progression of many diseases for decades, but this data has largely been collected in isolation. Recently, through the advent of large-scale biobanks, this rich type of data has become linked with genetic and electronic health record data at the level of tens of thousands of individuals, providing an unprecedented ability to study the relationship between genotype and phenotype directly in humans. I will discuss our group's work leveraging >1.2M medical images (DXA, MRI, and ultrasound) from ~60,000 individuals across multiple views of the heart, brain, skeleton, liver, and pancreas to provide new insights in four different domains of biological science: (a) understanding the evolution of the human skeletal form, which underlies our ability to be bipedal; (b) examining the classical question in developmental biology of the genetic basis of left-right symmetry; (c) building biological aging clocks to study mechanisms of age acceleration/deceleration and to identify gene targets to combat aging; (d) multi-modal AI combining imaging, genetics, and metabolomics to predict 10-year disease incidence for common complex diseases.
Speaker Bio
After initial training in Electrical Engineering focusing on computer vision and information theory, Vagheesh completed a master's in Biostatistics under Curtis Huttenhower, and then moved to the University of Cambridge for a PhD in Genetics with Chris Tyler-Smith and Richard Durbin. He returned to Harvard as a postdoc with David Reich and Nick Patterson, and since 2020 he has been an Assistant Professor in the Departments of Integrative Biology as well as Statistics and Data Science at the University of Texas at Austin.