Our vision is to build data-driven machine learning systems that advance the quality of healthcare, the understanding of cyber arms races, and the delivery of online education.
This community is interested in understanding and affecting the interaction between computing systems and society through engineering, computer science and public policy research, education, and public engagement.
We aim to develop the science of autonomy toward a future with robots and AI systems integrated into everyday life, supporting people with cognitive and physical tasks.
Our mission is to work with policy makers and cybersecurity technologists to increase the trustworthiness and effectiveness of interconnected digital systems.
The Systems CoR is focused on building and investigating large-scale software systems that power modern computers, phones, data centers, and networks, including operating systems, the Internet, wireless networks, databases, and other software infrastructure.
The goal of the Theory of Computation CoR is to study the fundamental strengths and limits of computation as well as how these interact with mathematics, computer science, and other disciplines.
This CoR takes a unified approach to cover the full range of research areas required for success in artificial intelligence, including hardware, foundations, software systems, and applications.
Led by Web inventor and Director Tim Berners-Lee and CEO Jeff Jaffe, the W3C focuses on leading the World Wide Web to its full potential by developing standards, protocols, and guidelines that ensure the long-term growth of the Web.
We aim to develop a systematic framework for robots to build models of the world and to use these to make effective and safe choices of actions to take in complex scenarios.
Our goal is to develop an adaptive storage manager for analytical database workloads in a distributed setting. It works by partitioning datasets across a cluster and incrementally refining data partitioning as queries are run.
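To give a feel for the incremental-refinement idea, here is a minimal Python sketch that splits a partition at a query's filter boundary so later scans over the same range touch fewer rows. The `Partition` class, `refine` function, and attribute names are illustrative assumptions, not the storage manager's actual interface.

```python
# Illustrative sketch of incremental partition refinement (hypothetical classes;
# not the project's actual interface). A partition that straddles a query's
# filter boundary is split there, so later scans over the same range skip
# irrelevant rows. A fuller version would also split at the upper bound.

class Partition:
    def __init__(self, rows):
        self.rows = rows  # list of dicts, e.g. {"ts": 17, "val": 1.7}

    def split(self, attr, pivot):
        below = Partition([r for r in self.rows if r[attr] < pivot])
        above = Partition([r for r in self.rows if r[attr] >= pivot])
        return below, above


def refine(partitions, attr, lower_bound):
    """Split every partition whose values straddle the query's lower bound."""
    refined = []
    for p in partitions:
        values = [r[attr] for r in p.rows]
        if values and min(values) < lower_bound <= max(values):
            refined.extend(p.split(attr, lower_bound))
        else:
            refined.append(p)
    return refined


# Each range query both reads from, and refines, the current partitioning.
parts = [Partition([{"ts": t, "val": t * 0.1} for t in range(100)])]
parts = refine(parts, "ts", 40)  # scans of ts >= 40 can now skip rows 0..39
```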
Alloy is a language for describing structures and a tool for exploring them. It has been used in a wide range of applications, from finding holes in security mechanisms to designing telephone switching networks. Hundreds of projects have used Alloy for design analysis, verification, simulation, and as a backend for many other kinds of analysis and synthesis tools, and Alloy is currently being taught in courses worldwide.
BlueDBM is an architecture of computer clusters consisting of fast distributed flash storage and in-storage accelerators, which often outperforms larger and more expensive clusters in applications such as graph analytics.
We plan to develop a suite of graph compression and reordering techniques as part of the Ligra parallel graph processing framework to reduce space usage and improve performance of graph algorithms.
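As one illustration of the kind of compression involved, the sketch below delta-encodes a sorted adjacency list with variable-length byte codes, a standard scheme for shrinking graph representations. The function names and details are assumptions for illustration, not Ligra's actual encoder.

```python
# Sketch of delta + variable-length byte encoding for a sorted adjacency list.
# This illustrates the general compression idea, not Ligra's actual code.

def encode_varint(x):
    """Encode a non-negative integer as little-endian base-128 bytes."""
    out = bytearray()
    while True:
        byte = x & 0x7F
        x >>= 7
        out.append(byte | (0x80 if x else 0))
        if not x:
            return bytes(out)

def encode_adjacency(neighbors):
    """Delta-encode a sorted neighbor list: store gaps instead of raw ids."""
    encoded = bytearray()
    prev = 0
    for n in sorted(neighbors):
        encoded += encode_varint(n - prev)
        prev = n
    return bytes(encoded)

def decode_adjacency(data):
    """Inverse of encode_adjacency."""
    neighbors, current, shift, acc = [], 0, 0, 0
    for byte in data:
        acc |= (byte & 0x7F) << shift
        shift += 7
        if not (byte & 0x80):
            current += acc
            neighbors.append(current)
            acc, shift = 0, 0
    return neighbors

assert decode_adjacency(encode_adjacency([300, 5, 3])) == [3, 5, 300]
```

Because neighbor ids in a sorted list are close together, most gaps fit in a single byte, which reduces space and improves cache behavior during traversals.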
Data scientists commonly report spending at least 80% of their time finding data sets of interest, accessing them, cleaning them, and assembling them into a unified whole.
We're developing a flexible, high-performance storage architecture for database-backed applications, built around a dynamic set of developer-specified queries that Soup automatically optimizes.
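The toy Python sketch below conveys the flavor of serving a developer-specified query from incrementally maintained state rather than recomputing it on every read. The class and method names are hypothetical and do not reflect Soup's actual API.

```python
# Toy incrementally maintained query result (hypothetical; not Soup's API).
# The developer declares a query once; each write updates the cached result,
# so reads never recompute the aggregate from scratch.

class VoteCountView:
    """Materialized result of: SELECT story_id, COUNT(*) FROM votes GROUP BY story_id"""
    def __init__(self):
        self.counts = {}

    def on_write(self, story_id):
        # Each base-table insert propagates as a small delta to the view.
        self.counts[story_id] = self.counts.get(story_id, 0) + 1

    def read(self, story_id):
        # Reads are a hash lookup against precomputed state.
        return self.counts.get(story_id, 0)

view = VoteCountView()
for vote in ["story-1", "story-2", "story-1"]:
    view.on_write(vote)
print(view.read("story-1"))  # 2
```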
We are developing a general framework that enforces privacy transparently, enabling different kinds of machine learning to be developed that are automatically privacy-preserving.
Starling is a scalable query execution engine built on cloud function services. It computes at a fine granularity, helping users more easily match compute resources to workload demand.
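A rough sketch of that execution style is shown below, with a thread pool standing in for stateless cloud function invocations. The partitioning scheme, predicate, and worker function are stand-ins for illustration, not Starling's actual design.

```python
# Rough sketch of fine-grained query execution over stateless workers
# (a thread pool stands in for cloud function invocations; this is not
# Starling's actual architecture).
from concurrent.futures import ThreadPoolExecutor

def scan_and_filter(partition):
    """One 'function invocation': scan a small partition and apply a predicate."""
    return [row for row in partition if row["amount"] > 100]

def run_query(partitions, max_workers=8):
    # Work is sized per partition, so the amount of compute launched tracks
    # the data a query actually scans rather than a fixed cluster size.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        partials = pool.map(scan_and_filter, partitions)
    return [row for part in partials for row in part]

partitions = [[{"amount": a} for a in range(i, i + 50)] for i in range(0, 500, 50)]
print(len(run_query(partitions)))  # number of rows with amount > 100
```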
This week it was announced that MIT professors and CSAIL principal investigators Shafi Goldwasser, Silvio Micali, Ronald Rivest, and former MIT professor Adi Shamir won this year’s BBVA Foundation Frontiers of Knowledge Awards in the Information and Communication Technologies category for their work in cryptography.
Artificial intelligence (AI) in the form of “neural networks” is increasingly used in technologies like self-driving cars to see and recognize objects. Such systems could even help with tasks like identifying explosives in airport security lines.
When a power company wants to build a new wind farm, it generally hires a consultant to make wind speed measurements at the proposed site for eight to 12 months. Those measurements are correlated with historical data and used to assess the site’s power-generation capacity. This month, CSAIL researchers will present a new statistical technique that yields better wind-speed predictions than existing techniques do, even when it uses only three months’ worth of data. That could save power companies time and money, especially in the evaluation of sites for offshore wind farms, where maintaining measurement stations is particularly costly.
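For intuition about the conventional approach being improved upon, here is a bare-bones measure-correlate-predict baseline in Python: regress a short on-site record against an overlapping long-term reference series, then extrapolate the full history to the site. The data is synthetic and the linear fit is only the standard baseline; it is not the researchers' statistical technique.

```python
# Bare-bones measure-correlate-predict (MCP) baseline with a linear fit.
# Synthetic data; illustrates the conventional approach, not the new technique.
import numpy as np

rng = np.random.default_rng(0)

# Long-term hourly wind speeds at a nearby reference station (one year shown).
reference = rng.weibull(2.0, size=8760) * 8.0

# Short on-site campaign: overlaps the last ~3 months of the reference record.
overlap = reference[-2160:]
site = 0.9 * overlap + rng.normal(0.0, 0.5, size=overlap.size)

# Fit site ~ a * reference + b on the overlapping period.
a, b = np.polyfit(overlap, site, deg=1)

# Extrapolate the full reference history to the site to estimate its mean speed.
site_longterm = a * reference + b
print(f"estimated long-term mean wind speed at site: {site_longterm.mean():.2f} m/s")
```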