We are an interdisciplinary group of researchers blending approaches from human-computer interaction, social computing, databases, and information management.
We develop techniques for designing, implementing, and reasoning about multiprocessor algorithms, in particular concurrent data structures for multicore machines and the mathematical foundations of the computation models that govern their behavior.
We work on a wide range of problems in distributed computing theory. We study algorithms and lower bounds for typical problems that arise in distributed systems, such as resource allocation, implementing shared-memory abstractions, and reliable communication.
We aim to develop a systematic framework for robots to build models of the world and to use those models to choose effective and safe actions in complex scenarios.
We study the fundamentals of Bayesian optimization and develop efficient Bayesian optimization methods for the global optimization of expensive black-box functions arising in a range of applications.
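To give a flavor of the technique, here is a minimal sketch of a Bayesian optimization loop. This is an assumed illustration, not the group's own method: it uses a Gaussian-process surrogate with an RBF kernel and an expected-improvement acquisition function to minimize a cheap stand-in for an expensive 1-D black-box function on [0, 1]. All function names and parameter values here are illustrative choices.

```python
import numpy as np
from math import erf

def rbf_kernel(a, b, length_scale=0.1):
    # Squared-exponential kernel between two 1-D point sets.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression: posterior mean and variance at x_query.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_query)
    K_ss = rbf_kernel(x_query, x_query)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    var = np.diag(K_ss - K_s.T @ K_inv @ K_s)
    return mu, np.maximum(var, 1e-12)  # clip tiny negative variances

def expected_improvement(mu, var, y_best):
    # Closed-form EI for minimization, using the normal CDF/PDF.
    sigma = np.sqrt(var)
    z = (y_best - mu) / sigma
    cdf = 0.5 * (1 + np.vectorize(erf)(z / np.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return (y_best - mu) * cdf + sigma * pdf

def bayes_opt(f, n_init=3, n_iter=10, seed=0):
    # Evaluate a few random points, then repeatedly query the point
    # that maximizes expected improvement under the GP surrogate.
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, 1, n_init)
    y = np.array([f(xi) for xi in x])
    grid = np.linspace(0, 1, 200)
    for _ in range(n_iter):
        mu, var = gp_posterior(x, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, var, y.min()))]
        x = np.append(x, x_next)
        y = np.append(y, f(x_next))
    return x[np.argmin(y)], y.min()

# Example: a stand-in "expensive" black-box with its minimum near x = 0.3.
x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2)
```

The key idea this sketch demonstrates is sample efficiency: each new evaluation is placed where the surrogate predicts either a low mean or high uncertainty, rather than on a dense grid, which is what makes the approach attractive for expensive objectives.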
Wikipedia is one of the most widely accessed encyclopedia sites in the world, including by scientists. Our project aims to investigate just how far Wikipedia’s influence goes in shaping science.
The robot garden provides an aesthetically pleasing educational platform that can visualize computer science concepts and encourage young students to pursue programming and robotics.
The Robot Compiler allows non-engineering users to rapidly fabricate customized robots, facilitating the proliferation of robots in everyday life. It thereby marks an important step towards realizing the personal robots that have captured the popular imagination for decades.