Continuous Time Bayesian Networks
Speaker: Christian Shelton, Intel Research and UC Riverside
Date: September 22, 2003
In this talk I will present continuous time Bayesian networks (CTBNs), which describe structured stochastic processes that evolve over continuous time. The state of the system is decomposed into local process variables whose interactions are described qualitatively by a directed graph, akin to traditional Bayesian networks. CTBNs are distinct from Bayesian networks in that they describe processes in continuous time without the need for time slicing.
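To make the idea concrete, here is a minimal sketch of the kind of structured process a CTBN describes. It is not code from the talk: a hypothetical binary parent X and binary child Y each carry a conditional intensity matrix (reduced here to a single leaving rate per state), with Y's rate depending on X's current state. A trajectory is sampled by racing exponential transition clocks, with no discretization of time; all rates are illustrative.

```python
import random

# Hypothetical two-node CTBN: binary parent X, binary child Y.
# Each variable has transition rates that may depend on the current
# state of its parents. All numeric rates are made up for illustration.
Q_X = {0: 1.0, 1: 2.0}              # rate of X leaving each of its states
Q_Y = {                             # Y's leaving rate depends on X's state
    (0, 0): 0.5, (0, 1): 0.5,       # (x, y) -> rate of Y flipping
    (1, 0): 3.0, (1, 1): 3.0,
}

def simulate(t_end, seed=0):
    """Sample a trajectory by racing exponential transition times."""
    rng = random.Random(seed)
    t, x, y = 0.0, 0, 0
    traj = [(t, x, y)]
    while True:
        # Each variable's next transition is exponential with its current
        # rate; the earliest one fires (competing exponential clocks).
        # Restarting the losing clock is valid by memorylessness.
        dt_x = rng.expovariate(Q_X[x])
        dt_y = rng.expovariate(Q_Y[(x, y)])
        dt = min(dt_x, dt_y)
        if t + dt > t_end:
            break
        t += dt
        if dt_x < dt_y:
            x = 1 - x
        else:
            y = 1 - y
        traj.append((t, x, y))
    return traj

traj = simulate(10.0)
```

The point of the sketch is the factorization: each variable's dynamics are specified locally, conditioned on its parents in the graph, rather than by one global transition model over the joint state.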
I will then present the computational difficulties associated with exact inference in CTBN models and demonstrate an approximate inference algorithm which takes advantage of the structure of the CTBN graph. Finally, I will conclude with an algorithm for learning both the structure and the parameters of a CTBN from data, and some encouraging theoretical results about the complexity of such learning and its relationship to dynamic Bayesian networks.
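As a flavor of the parameter-learning side, maximum-likelihood estimation of a transition intensity from a fully observed trajectory reduces to sufficient statistics: the number of transitions a variable makes out of a state, divided by the total time it spent in that state, per parent instantiation. The sketch below assumes a trajectory given as a list of `(time, x, y)` change points for a binary parent X and child Y; the representation is illustrative, not the talk's.

```python
# Hedged sketch: MLE of Y's leaving rates from a fully observed
# trajectory. traj is a list of (time, x, y) change points; t_end is
# the end of the observation window. Names are illustrative.

def mle_rates(traj, t_end):
    time_in = {}   # (x, y) -> total time Y spent in state y while X = x
    flips = {}     # (x, y) -> number of Y transitions out of that state
    for i, (t, x, y) in enumerate(traj):
        t_next = traj[i + 1][0] if i + 1 < len(traj) else t_end
        key = (x, y)
        time_in[key] = time_in.get(key, 0.0) + (t_next - t)
        if i + 1 < len(traj):
            _, _, y_next = traj[i + 1]
            if y_next != y:
                flips[key] = flips.get(key, 0) + 1
    # MLE intensity = transition count / dwell time, per parent state
    return {k: flips.get(k, 0) / T for k, T in time_in.items() if T > 0}
```

With partially observed data the counts and dwell times are not available directly, which is where the approximate inference machinery above comes in (e.g. inside an EM loop).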
This work was developed in collaboration with Uri Nodelman and Daphne Koller at Stanford University.
BIOGRAPHY: Christian Shelton received his Ph.D. from MIT in 2001, where he worked with Tomaso Poggio on applying reinforcement learning to financial markets. He then spent two years at Stanford as a post-doc with Daphne Koller. In July he joined the faculty of the University of California, Riverside; he is currently on a six-month leave from UCR, working at Intel Research in Santa Clara.
See other events that are part of Brains and Machines Seminar Series Fall 2003