Our vision is data-driven machine learning systems that advance the quality of healthcare, the understanding of cyber arms races, and the delivery of online education.
This community is interested in understanding and affecting the interaction between computing systems and society through engineering, computer science and public policy research, education, and public engagement.
We aim to develop the science of autonomy toward a future with robots and AI systems integrated into everyday life, supporting people with cognitive and physical tasks.
This CoR takes a unified approach to cover the full range of research areas required for success in artificial intelligence, including hardware, foundations, software systems, and applications.
The Weiss Lab seeks to create integrated biological systems capable of autonomously performing useful tasks, and to elucidate the design principles underlying complex phenotypes.
EQ-Radio can infer a person’s emotions using wireless signals. It transmits an RF signal and analyzes its reflections off a person’s body to recognize their emotional state (happy, sad, etc.).
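The final step of such a pipeline is mapping physiological features recovered from the RF reflections to an emotion label. A minimal sketch of that idea using a nearest-centroid rule; the feature choice, centroid values, and `classify_emotion` helper are purely illustrative assumptions, not taken from the EQ-Radio system:

```python
import math

# Hypothetical class centroids: (mean heart rate in bpm, breathing rate in
# breaths/min). These numbers are made up for illustration.
CENTROIDS = {
    "happy": (75.0, 16.0),
    "sad": (62.0, 12.0),
    "angry": (88.0, 18.0),
    "calm": (60.0, 10.0),
}

def classify_emotion(heart_rate: float, breathing_rate: float) -> str:
    """Return the emotion whose centroid is nearest to the measured features."""
    return min(
        CENTROIDS,
        key=lambda label: math.dist(CENTROIDS[label], (heart_rate, breathing_rate)),
    )
```

In practice a learned classifier over many more features would replace the hand-set centroids, but the structure, physiological features in and an emotion label out, is the same.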
We are developing a general framework that enforces privacy transparently, enabling different kinds of machine learning systems to be built that are automatically privacy-preserving.
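One standard building block for automatic privacy preservation is differential privacy: perturb each query result with noise calibrated to how much one individual can change it. A minimal sketch of a Laplace-noised counting query; the `private_count` helper and its parameters are illustrative assumptions, not the framework's actual API:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Count matching records, with noise calibrated to sensitivity 1.

    Adding or removing one record changes the count by at most 1, so Laplace
    noise with scale 1/epsilon gives epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)
```

A transparent framework would apply this kind of mechanism to every query a learning algorithm issues, so the algorithm itself needs no privacy-specific code.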
We develop algorithms, systems, and software architectures for automating the reconstruction of accurate representations of neural tissue structures, such as the nanometer-scale morphology of neurons and their synaptic connections in the mammalian cortex.
The goal of this project is to develop and test a wearable ultrasonic echolocation aid for people who are blind and visually impaired. We combine concepts from engineering, acoustic physics, and neuroscience to make echolocation accessible as a research tool and mobility aid.
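The geometry underlying any echolocation aid is time-of-flight ranging: an ultrasonic pulse travels to an obstacle and back, so the one-way distance is half the round-trip path. A minimal sketch of that conversion; the `echo_distance` helper is an illustrative assumption, not part of the project's software:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance(round_trip_seconds: float) -> float:
    """Distance to an obstacle from the echo's round-trip delay.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_SOUND * round_trip_seconds / 2.0
```

For example, an echo returning after 20 ms corresponds to an obstacle about 3.4 m away.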
Computer scientists often develop mathematical models to understand how animals move, enabling breakthroughs in designing things like microrobotic wings and artificial bone structures.
MIT’s Amar Gupta and his wife Poonam were on a trip to Los Angeles in 2016 when she fell and broke both wrists. She was whisked by ambulance to a reputable hospital. But staff informed the couple that they couldn’t treat her there, nor could they find another local hospital that would do so. In the end, the couple was forced to take the hospital’s stunning advice: return to Boston for treatment.
Genome-wide association studies, which look for links between particular genetic variants and incidence of disease, are the basis of much modern biomedical research.
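At its core, a genome-wide association study tests, variant by variant, whether allele counts differ between cases and controls, typically with a chi-square test on a 2x2 contingency table. A minimal sketch of that statistic; the `chi_square_2x2` helper and the sample table are illustrative, not from any particular study:

```python
def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]].

    Rows might be variant carriers vs. non-carriers; columns, cases vs.
    controls. Uses the closed form n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
    """
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator
```

A large statistic (compared against the chi-square distribution with one degree of freedom) flags a variant whose frequency differs between the two groups; genome-wide studies repeat this across millions of variants with a correspondingly strict significance threshold.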
This week it was announced that MIT professors and CSAIL principal investigators Shafi Goldwasser, Silvio Micali, Ronald Rivest, and former MIT professor Adi Shamir won this year’s BBVA Foundation Frontiers of Knowledge Awards in the Information and Communication Technologies category for their work in cryptography.
Doctors are often deluged with signals to keep track of, from charts, test results, and other metrics. It can be difficult to integrate and monitor all of these data for multiple patients while making real-time treatment decisions, especially when data is documented inconsistently across hospitals. In a new pair of papers, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) explore ways for computers to help doctors make better medical decisions.
In experiments involving a simulation of the human esophagus and stomach, researchers at CSAIL, the University of Sheffield, and the Tokyo Institute of Technology have demonstrated a tiny origami robot that can unfold itself from a swallowed capsule and, steered by external magnetic fields, crawl across the stomach wall to remove a swallowed button battery or patch a wound.

The new work, which the researchers are presenting this week at the International Conference on Robotics and Automation, builds on a long sequence of papers on origami robots from the research group of CSAIL Director Daniela Rus, the Andrew and Erna Viterbi Professor in MIT’s Department of Electrical Engineering and Computer Science.