# CSAIL Event Calendar: Previous Series

## Deploying Wireless Networks with Beeps

- Speaker: Alex Cornejo, MIT
- Date: May 14, 2010
- Time: 1:00 PM to 2:30 PM
- Location: 32-G631
- Host: Nancy Lynch, MIT
- Contact: Rotem Oshman, rotem@csail.mit.edu

ABSTRACT: What does it mean to send less than a single bit of information? And, more importantly, what can we compute with it? We present the *discrete beeping* communication model, which assumes nodes have minimal knowledge about their environment and severely limited communication capabilities. Specifically, nodes have no information about the local or global structure of the network, have no access to synchronized clocks, and are woken up by an adversary. Moreover, instead of communicating through messages, they rely solely on carrier sensing to exchange information. This model is interesting from a practical point of view because it can be implemented (or emulated) even in the most primitive of radio networks. From a theoretical point of view, it shows that complex problems (such as vertex coloring) can be solved efficiently with close to no assumptions and almost no communication.

Under this model, we study the problem of *interval coloring*, a variant of vertex coloring especially suited to this model. Given a set of resources, the goal of interval coloring is to assign every node a large contiguous fraction of the resources such that neighboring nodes share no resources. To highlight the importance of the discreteness of the model, we contrast it with a continuous variant described in [infocom09]. We present an O(1)-time algorithm that terminates with probability 1 and assigns every node of the network an interval of size Ω(T/Δ) that repeats every T time units. This improves on an O(log n)-time algorithm with the same guarantees presented in [infocom09], and accentuates the unrealistic assumptions of the continuous model.
Under the more realistic discrete model, we present a Las Vegas algorithm that solves Ω(T/Δ)-interval coloring in O(log n) time with high probability, and we describe how to adapt the algorithm to dynamic networks where nodes may join or leave. For constant-degree graphs we prove a lower bound of Ω(log n) on the time required by randomized algorithms to solve interval coloring in this model; this lower bound implies that our algorithm is asymptotically optimal for constant-degree graphs.

Part of the series: Theory of Distributed Systems 2010/2011
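To make the interval-coloring goal concrete, here is a minimal, hypothetical Python sketch — not the algorithm from the talk, just an illustration under simplified assumptions. Each node repeatedly picks a random beep slot in a period of length T and re-picks whenever a neighbor beeps in the same slot; once all slots are locally distinct, a node claims the interval from its own beep to the nearest following neighbor beep, so neighboring intervals never overlap.

```python
import random

def toy_interval_coloring(adj, T, seed=None):
    """Toy centralized simulation of beep-slot interval coloring.

    adj maps each node to a list of its neighbors; T is the period
    length. Returns {node: (start_slot, interval_length)} such that
    neighbors' half-open cyclic intervals are disjoint.
    """
    rng = random.Random(seed)
    slot = {v: rng.randrange(T) for v in adj}

    def conflicted():
        return [v for v in adj if any(slot[u] == slot[v] for u in adj[v])]

    # Re-pick until no node shares its beep slot with a neighbor.
    # This is a Las Vegas-style loop: it terminates with probability 1
    # provided T comfortably exceeds the maximum degree.
    bad = conflicted()
    while bad:
        for v in bad:
            slot[v] = rng.randrange(T)
        bad = conflicted()

    # A node's interval runs from its own beep to the nearest following
    # neighbor beep (cyclically), so neighboring intervals never overlap.
    intervals = {}
    for v in adj:
        gaps = [(slot[u] - slot[v]) % T for u in adj[v]]
        intervals[v] = (slot[v], min(gaps) if gaps else T)
    return intervals
```

Note that this toy version uses global knowledge of the graph and synchronized slots, exactly the assumptions the discrete beeping model does without; it only illustrates what the output of an interval-coloring algorithm looks like.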