We study the linear distributed asymptotic agreement (consensus) problem for a network of dynamic agents whose communication topology is modeled by a randomly switching graph. The switching is determined by a finite-state Markov process, with each topology corresponding to a state of the process. We address both the case where the agents' dynamics are expressed in continuous time and the case where they are expressed in discrete time. We show that, if the consensus matrices are doubly stochastic, convergence to average consensus is achieved in the mean-square and almost-sure sense if and only if the graph resulting from the union of the graphs corresponding to the states of the Markov process is strongly connected. The aim of this paper is to show how techniques from the theory of Markovian jump linear systems, in conjunction with results inspired by matrix and graph theory, can be used to prove convergence results for stochastic consensus problems.
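The discrete-time setting described in the abstract can be illustrated with a small simulation. The sketch below is a hypothetical example, not the paper's construction: it builds two doubly stochastic consensus matrices (one per state of an assumed two-state Markov chain with an invented transition matrix `T`), runs the switched update x(k+1) = W(theta_k) x(k), and checks that the agents approach the average of their initial values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of agents

# Doubly stochastic consensus matrix for an undirected ring with a given
# neighbour offset: W = 0.5*I + 0.25*(P + P^T); rows and columns sum to 1.
def ring_matrix(n, shift):
    P = np.roll(np.eye(n), shift, axis=1)  # permutation matrix for the offset
    return 0.5 * np.eye(n) + 0.25 * (P + P.T)

# One consensus matrix per state of the Markov process (hypothetical choice).
W = [ring_matrix(n, 1), ring_matrix(n, 2)]
T = np.array([[0.9, 0.1],   # transition probabilities of the switching
              [0.2, 0.8]])  # Markov chain (invented for illustration)

x = rng.normal(size=n)  # initial agent values
avg = x.mean()          # average consensus target
state = 0
for _ in range(500):
    x = W[state] @ x                   # consensus update x(k+1) = W(theta_k) x(k)
    state = rng.choice(2, p=T[state])  # Markovian topology switch

print(np.max(np.abs(x - avg)))  # prints a value near zero
```

Because each W is doubly stochastic, every update preserves the average of the agents' values, and here each topology is already connected, so the union-graph connectivity condition in the abstract is satisfied trivially.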
Convergence results for the agreement problem on Markovian random topologies
Type:
Conference Paper (invited and refereed articles in conference proceedings)
Authored by:
Matei, Ion; Baras, John S.
Conference date:
August 28 - September 2, 2011
Conference:
2011 IFAC World Congress, pp. 8860-8865