
## Algorithms Group

We devise new mathematical tools to tackle the increasing difficulty and importance of problems we pose to computers.

### 20 Group Results

We design software for high performance computing, develop algorithms for numerical linear algebra, and research random matrix theory and its applications.

The MIT Center for Deployable Machine Learning (CDML) works towards creating AI systems that are robust, reliable and safe for real-world deployment.

We focus on finding novel approaches to improve the performance of modern computer systems without unduly increasing the complexity faced by application developers, compiler writers, or computer architects.

Our interests span quantum complexity theory, barriers to solving P versus NP, theoretical computer science with a focus on probabilistically checkable proofs (PCP), pseudo-randomness, coding theory, and algorithms.

Our lab focuses on designing algorithms to gain biological insights from advances in automated data collection and the subsequent large data sets drawn from them.

Our mission is fostering the creation and development of high-performance, reliable and secure computing systems that are easy to interact with.

Our group’s goal is to create new mathematical models, grounded in microscopic connectivity and functional data, that explain how neural tissue computes.

This community is interested in understanding and affecting the interaction between computing systems and society through engineering, computer science and public policy research, education, and public engagement.

We seek to develop techniques for securing tomorrow's global information infrastructure by exploring theoretical foundations, near-term practical applications, and long-range speculative research.

We are investigating decentralized technologies that effect social change.

Our group studies geometric problems in computer graphics, computer vision, machine learning, optimization, and other disciplines.

We are an interdisciplinary group of researchers blending approaches from human-computer interaction, social computing, databases, and information management.

Our research interests center around the capabilities and limits of quantum computers, and computational complexity theory more generally.

We investigate the technologies that support scalable high-performance computing, including hardware, software, and theory.

The Systems CoR is focused on building and investigating large-scale software systems that power modern computers, phones, data centers, and networks, including operating systems, the Internet, wireless networks, databases, and other software infrastructure.

The goal of the Theory of Computation CoR is to study the fundamental strengths and limits of computation as well as how these interact with mathematics, computer science, and other disciplines.

We work on a wide range of problems in distributed computing theory. We study algorithms and lower bounds for typical problems that arise in distributed systems---like resource allocation, implementing shared memory abstractions, and reliable communication.

This CoR takes a unified approach to cover the full range of research areas required for success in artificial intelligence, including hardware, foundations, software systems, and applications.

### 26 Project Results

We aim to develop a systematic framework for robots to build models of the world and to use these to make effective and safe choices of actions to take in complex scenarios.

The project concerns algorithmic solutions for writing fast code.

Our goal is to develop a socially intelligent team coacher agent that helps humans communicate, strategize, and work together efficiently.

We study the fundamentals of Bayesian optimization and develop efficient Bayesian optimization methods for global optimization of expensive black-box functions originating from a range of different applications.
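The core loop of Bayesian optimization can be illustrated in a few lines: fit a Gaussian-process surrogate to the evaluations seen so far, then choose the next query point by maximizing an acquisition function. This minimal NumPy sketch uses an RBF kernel, an upper-confidence-bound acquisition rule, and a 1-D grid domain; all of these choices and parameter values are illustrative assumptions, not the group's actual methods.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two 1-D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Gaussian-process posterior mean and std at test points Xs
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)                              # shape (n, m)
    alpha = np.linalg.solve(K, y)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def bayes_opt(f, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, 3)                     # small initial random design
    y = np.array([f(x) for x in X])
    grid = np.linspace(0, 1, 200)
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, grid)
        ucb = mu + 2.0 * sd                      # upper-confidence-bound rule
        xn = grid[np.argmax(ucb)]                # next "expensive" evaluation
        X = np.append(X, xn)
        y = np.append(y, f(xn))
    return X[np.argmax(y)], y.max()

# Maximize a stand-in "expensive" black-box with its peak at x = 0.7
f = lambda x: -(x - 0.7) ** 2
xbest, ybest = bayes_opt(f)
```

The surrogate is cheap to evaluate, so the acquisition maximization over the grid can afford many candidate points even when each call to `f` itself is costly.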

Traffic is not just a nuisance for drivers: It’s also a public health hazard and bad news for the economy.

BlueDBM is an architecture of computer clusters consisting of fast distributed flash storage and in-storage accelerators, which often outperforms larger and more expensive clusters in applications such as graph analytics.

This project aims to design parallel algorithms for shared-memory machines that are efficient both in theory and also in practice.

Our goal is to design novel data compression techniques to accelerate popular machine learning algorithms in Big Data and streaming settings.
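One well-known example of a compressed summary for streaming settings is the count-min sketch, which estimates item frequencies in fixed memory. The sketch below is a generic textbook illustration, not this project's actual technique.

```python
import hashlib

class CountMinSketch:
    """Fixed-memory summary of a stream: estimated counts are never
    below the true count, and rarely far above it."""
    def __init__(self, width=2048, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, item, row):
        # One independent-looking hash per row via a row-salted digest
        h = hashlib.blake2b(f"{row}:{item}".encode(), digest_size=8)
        return int.from_bytes(h.digest(), "big") % self.width

    def add(self, item, count=1):
        for row in range(self.depth):
            self.table[row][self._index(item, row)] += count

    def estimate(self, item):
        # Min across rows limits the damage from hash collisions
        return min(self.table[row][self._index(item, row)]
                   for row in range(self.depth))

cms = CountMinSketch()
for word in ["a", "b", "a", "c", "a"]:
    cms.add(word)
```

Memory use depends only on `width` and `depth`, never on the number of distinct items in the stream, which is what makes this family of summaries attractive for Big Data workloads.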

Wikipedia is one of the most widely accessed encyclopedia sites in the world, including by scientists. Our project aims to investigate just how far Wikipedia’s influence goes in shaping science.

Our goal is to investigate deterministic algorithms for robotic task and motion planning.

To further parallelize co-prime-sampling-based sparse sensing, we introduce Diophantine equations over different algebraic structures to build generalized lattice arrays.

Exploiting a strong relationship with the generalized Chinese Remainder Theorem, we explore the geometric properties of the remainder code space, a special lattice space.
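The remainder-code representation rests on the classic Chinese Remainder Theorem: an integer is uniquely determined, modulo the product of pairwise-coprime moduli, by its remainders under those moduli. This minimal Python sketch shows only the textbook reconstruction; the project's generalized lattice setting is more involved.

```python
from math import prod

def crt(remainders, moduli):
    """Reconstruct x mod prod(moduli) from its remainders under
    pairwise-coprime moduli (classic Chinese Remainder Theorem)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(remainders, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi mod m (Python 3.8+)
        x += r * Mi * pow(Mi, -1, m)
    return x % M

# An integer is encoded as its remainder vector, then reconstructed
moduli = [3, 5, 7]
x = 52
remainders = [x % m for m in moduli]   # [1, 2, 3]
recovered = crt(remainders, moduli)    # 52
```

Because each remainder channel can be processed independently, representations like this are a natural fit for the parallel sampling schemes described above.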

We aim to understand theory and applications of diversity-inducing probabilities (and, more generally, "negative dependence") in machine learning, and develop fast algorithms based on their mathematical properties.

Developing state-of-the-art tools that process 3D surfaces and volumes

We are designing new parallel algorithms, optimizations, and frameworks for clustering large-scale graph and geometric data.

The creation of low-power circuits capable of speech recognition and speaker verification will enable spoken interaction on a wide variety of devices in the era of Internet of Things.

Linking probability with geometry to improve the theory and practice of machine learning

Gerrymandering is a direct threat to our democracy, undermining founding principles like equal protection under the law and eroding public confidence in elections.

Differential privacy is the most common approach to preserving privacy in decentralized optimization, but it imposes an inevitable trade-off between accuracy (and even efficiency) and privacy. In this project, we explore distributed numerical optimization schemes that incorporate lightweight cryptographic information sharing, and we study the effect of network topology on the convergence rate.
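As a rough illustration of the accuracy/privacy trade-off, here is a minimal sketch of the standard Laplace mechanism applied to gradient sharing: each node clips its gradient and adds noise before communicating. The clipping bound, privacy budget, and simple averaging step are illustrative assumptions, not this project's actual protocol.

```python
import numpy as np

def dp_average(local_grads, clip=1.0, epsilon=0.5, rng=None):
    """Each node clips its gradient (bounding L1 sensitivity) and adds
    Laplace noise before sharing, so the average reveals little about
    any single node's data; smaller epsilon means more noise."""
    rng = rng or np.random.default_rng(0)
    noisy = []
    for g in local_grads:
        g = np.asarray(g, dtype=float)
        norm = np.linalg.norm(g, ord=1)
        if norm > clip:                        # clip to L1 sensitivity bound
            g = g * (clip / norm)
        scale = clip / epsilon                 # Laplace scale b = sensitivity / epsilon
        noisy.append(g + rng.laplace(0.0, scale, size=g.shape))
    return np.mean(noisy, axis=0)

grads = [np.array([0.2, -0.1]), np.array([0.3, 0.0]), np.array([0.1, -0.2])]
avg = dp_average(grads, epsilon=5.0)
```

Shrinking `epsilon` strengthens the privacy guarantee but inflates the noise scale, which is exactly the accuracy-versus-privacy tension the project aims to ease with cryptographic sharing.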

We plan to develop a programming abstraction to enable programmers to write efficient parallel programs to process dynamic graphs.

The Robot Compiler allows non-engineering users to rapidly fabricate customized robots, facilitating the proliferation of robots in everyday life. It thereby marks an important step towards the realization of personal robots that have captured imaginations for decades.

In this project, we aim to develop a framework that can ensure and certify the safety of an autonomous vehicle. By leveraging research from the area of formal verification, this framework aims to assess the safety (i.e., freedom from collisions) of a broad class of autonomous-car controllers and planners for a given traffic model.

Starling is a scalable query execution engine built on cloud function services that computes at a fine granularity, helping people more easily match workload demand.

A polyhedral compiler for expressing image processing, DNN, and linear/tensor algebra applications

### 25 News Results

Wireless system helps Boston retirement home care for COVID patients while reducing risk of contagion

Research aims to make it easier for self-driving cars, robotics, and other applications to understand the 3D world.

New capabilities allow “roboats” to change configurations to form pop-up bridges, stages, and other structures.

Professor Adam Chlipala builds tools to help programmers more quickly generate optimized, secure code.

Fleet of “roboats” could collect garbage or self-assemble into floating structures in Amsterdam’s many canals.

Speakers — all women — discuss everything from gravitational waves to robot nurses

Workshop explores technical directions for making AI safe, fair, and understandable

Algorithm could help autonomous underwater vehicles explore risky but scientifically rewarding environments.

Last week MIT’s Institute for Foundations of Data Science (MIFODS) held an interdisciplinary workshop aimed at tackling the underlying theory behind deep learning. Led by MIT professor Aleksander Madry, the event focused on a number of research discussions at the intersection of math, statistics, and theoretical computer science.

Model identifies instances when autonomous systems have learned from examples that may cause dangerous errors in the real world.

Cambridge Mobile Telematics Raises $500M from SoftBank Vision Fund

In simulations, robots move through new environments by exploring, observing, and drawing from learned experiences.

MIT professor discusses using paper-folding for applications in manufacturing, medicine, and robotics

Algorithm computes “buffer zones” around autonomous vehicles and reassesses them on the fly.

May 2, 2018 - Sir Tim Berners-Lee of MIT gave a Dertouzos Distinguished Lecture titled "From Utopia to Dystopia in 29 Short Years."

Harini Suresh, a PhD student at MIT CSAIL, studies how to make machine learning algorithms more understandable and less biased.

CSAIL's NanoMap system enables drones to avoid obstacles while flying at 20 miles per hour, by more deeply integrating sensing and control.

This week it was announced that MIT professors and CSAIL principal investigators Shafi Goldwasser, Silvio Micali, Ronald Rivest, and former MIT professor Adi Shamir won this year’s BBVA Foundation Frontiers of Knowledge Awards in the Information and Communication Technologies category for their work in cryptography.

New CSAIL work shows that traffic would flow faster if drivers kept an equal distance between cars

Today four MIT faculty were named among the Association for Computing Machinery's 2017 Fellows for making “landmark contributions to computing.”

We live in the age of big data, but most of that data is “sparse.” Imagine, for instance, a massive table that mapped all of Amazon’s customers against all of its products, with a “1” for each product a given customer bought and a “0” otherwise. The table would be mostly zeroes.
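The idea can be made concrete with a toy sparse-matrix class that stores only the nonzero entries of such a table. This is a generic illustration of sparse storage, not the system described in the article.

```python
class SparseMatrix:
    """Store only the nonzero entries of a mostly-zero table,
    e.g. a customers-by-products purchase-indicator matrix."""
    def __init__(self, shape):
        self.shape = shape
        self.data = {}                     # (row, col) -> nonzero value

    def __setitem__(self, rc, value):
        if value:
            self.data[rc] = value
        else:
            self.data.pop(rc, None)        # storing 0 frees the entry

    def __getitem__(self, rc):
        return self.data.get(rc, 0)        # absent entries read as 0

    def nnz(self):
        return len(self.data)              # number of nonzero entries

# A million-by-million purchase table with 3 purchases stores
# 3 dictionary entries rather than 10**12 cells.
m = SparseMatrix((10**6, 10**6))
m[42, 7] = 1
m[42, 99] = 1
m[1000, 7] = 1
```

Production systems use more compact layouts (e.g. compressed sparse row), but the principle is the same: memory scales with the nonzeros, not with the table's nominal size.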

Most modern websites store data in databases, and since database queries are relatively slow, most sites also maintain so-called cache servers, which list the results of common queries for faster access. A data center for a major web service such as Google or Facebook might have as many as 1,000 servers dedicated just to caching.
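A cache server's core behavior can be sketched as a small least-recently-used (LRU) cache: serve repeated queries from memory and fall back to the slow database only on a miss. The `QueryCache` class and `backend` callable below are hypothetical illustrations, not the actual caching layer of any major web service.

```python
from collections import OrderedDict

class QueryCache:
    """Tiny LRU cache: keeps results of slow database queries,
    evicting the least-recently-used entry when full."""
    def __init__(self, capacity, backend):
        self.capacity = capacity
        self.backend = backend              # slow function: query -> result
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def get(self, query):
        if query in self.store:
            self.store.move_to_end(query)   # mark as most recently used
            self.hits += 1
            return self.store[query]
        self.misses += 1
        result = self.backend(query)        # the expensive database call
        self.store[query] = result
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return result

cache = QueryCache(2, backend=lambda q: q.upper())  # stand-in for a DB call
cache.get("select a"); cache.get("select b")
cache.get("select a")           # served from cache (hit)
cache.get("select c")           # full cache: evicts "select b"
```

Real cache fleets add networking, sharding, and invalidation on top, but the hit/miss/evict cycle shown here is the behavior those 1,000 servers exist to provide.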

This week the Association for Computing Machinery presented CSAIL principal investigator Daniel Jackson with the 2017 ACM SIGSOFT Outstanding Research Award for his pioneering work in software engineering. (This fall he also received the ACM SIGSOFT Impact Paper Award for his research method for finding bugs in code.) An EECS professor and associate director of CSAIL, Jackson was given the Outstanding Research Award for his “foundational contributions to software modeling, the creation of the modeling language Alloy, and the development of a widely used tool supporting model verification.”

In recent years, a host of Hollywood blockbusters — including “The Fast and the Furious 7,” “Jurassic World,” and “The Wolf of Wall Street” — have included aerial tracking shots provided by drone helicopters outfitted with cameras. Those shots required separate operators for the drones and the cameras, and careful planning to avoid collisions. But a team of researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simple, and reliable.