Comment on Computational Complexity: Infinite Series and Markov Chains (blog by Lance Fortnow), posted 2017-01-19:

Dear Lance,
The fact you're referring to is proved as Proposition 1.14 in Levin, Peres, and Wilmer's book "Markov Chains and Mixing Times" (2nd edition). They show the slightly more general fact that if

\pi(y) = E_z(number of visits to y before returning to z),

where E_z denotes expectation when the chain starts from z, then \pi is a stationary measure for the transition matrix P (assuming P is irreducible and the state space is finite).

To make a probability distribution out of this, they divide by the expected return time from z to z. Since the chain visits z exactly once before returning to z (the starting visit itself), \pi(z) = 1, so the normalized stationary probability of z is \pi(z)/E_z(first return to z) = 1/E_z(first return to z), which proves the result.

Amitabha (https://www.blogger.com/profile/09342604891643593133)
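As a quick sanity check of the identity the comment describes, here is a minimal numerical sketch (the transition matrix is an arbitrary example of my own, not from the book): it computes the stationary distribution of a small irreducible chain directly, then computes the expected first-return time to a state z via hitting-time equations, and confirms that \pi(z) = 1/E_z(first return to z).

```python
import numpy as np

# An arbitrary small irreducible transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
n = P.shape[0]

# Stationary distribution: solve pi P = pi with sum(pi) = 1,
# as an overdetermined least-squares system.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Expected first-return time to state z:
# for y != z, the expected hitting times h(y) of z satisfy
# (I - Q) h = 1, where Q is P restricted to the states other than z.
z = 0
others = [i for i in range(n) if i != z]
Q = P[np.ix_(others, others)]
h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
E_return = 1.0 + P[z, others] @ h

# The two quantities agree, illustrating pi(z) = 1/E_z(first return to z).
print(pi[z], 1.0 / E_return)
```

The same check works for any choice of z and any irreducible finite chain; only the hitting-time system changes.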