Comments on Computational Complexity: "Need EASY approaches to getting unif random from non-random sources" (Lance Fortnow)

Anonymous (2018-07-31 16:40):
More generally, if m = 2^a_1 + ... + 2^a_k is the binary expansion of m, then you can partition {1,...,m} into parts of sizes 2^a_1, ..., 2^a_k, and extract a_i bits from the i'th part.
The expected number of bits you extract in this way (for a uniformly random point in {1,...,m}) is log m - O(1) (in fact, the best constant is log(7) - 10/7).
If you toss a p-biased coin n times, then you see roughly np ones. The corresponding binomial coefficient m = binomial(n,np) satisfies log m = h(p)n - O(log n). More generally, if k ~ Bin(n,p), then E[log m] = h(p)n - o(n). Therefore this strategy achieves the entropy in expectation.
To get an asymptotic rate of h(p) with high probability, it suffices to partition the input into omega(1) parts of size omega(1).

Peter Shor (2018-07-29 19:00):
How do you use the other 3 possibilities? Let's say these are 1110000, 1101000, 1100100. If you get the first one, add the bit 0 to your string. If you get the second, add the bit 1 to your string. And if you get the third, do nothing. This still gives each bit an equal probability of being 0 and 1. This technique gives you .544 bits/coin flip. You can use a modification of this, similar to arithmetic coding, to get .628 bits/coin flip.
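Both comments above are instances of the same extraction step: given an outcome that is uniform over m equally likely possibilities, split the possibilities into groups whose sizes are the powers of two in the binary expansion of m, and output the offset within the group as unbiased bits. A minimal sketch of that step (the function name and the 0-based indexing are my own choices):

```python
def extract_from_uniform(x, m):
    """x is uniform on {0, ..., m-1}; return a list of unbiased bits.

    Partition {0, ..., m-1} into consecutive parts whose sizes are the
    powers of two in the binary expansion of m; a point in a part of
    size 2^a yields the a bits of its offset within that part.
    """
    offset = 0
    while m > 0:
        a = m.bit_length() - 1          # largest power of two 2^a <= m
        if x < offset + (1 << a):       # x lies in this part
            r = x - offset              # uniform on {0, ..., 2^a - 1}
            return [(r >> j) & 1 for j in reversed(range(a))]
        offset += 1 << a
        m -= 1 << a
    return []
```

For m = 35 this reproduces the split in the comment: ranks 0-31 yield 5 bits, ranks 32 and 33 yield the single extra bit, and rank 34 yields nothing.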
And making the length of the blocks go to infinity should make the efficiency go to 1 bit/coin flip.

Peter Shor (2018-07-29 18:51):
Hi Bill,
Mine should give truly random bits even if the original source is very biased (but independent). Whether it gives more bits than your easy protocol is another question. It can certainly be modified to do that using ideas from arithmetic coding, but I don't know whether there's an intermediate-level way of doing this without introducing a large amount of information theory for background.

Anyway, in my protocol, if you get 4 0's and 3 1's, you add 5 bits. You add all possible sequences of 5 bits with equal probability, so each of these added bits has an equal probability of being a 0 or a 1. Similarly for any other numbers of 0's and 1's.

GASARCH (2018-07-28 17:48):
Thanks - will look at your code and may email you later.

GASARCH (2018-07-28 17:47):
THANKS!
We want to get truly random bits even if the original source is very biased (but independent). Does yours do that, or can it be modified to do that? I'm thinking no and yes.

Peter Shor (2018-07-28 16:33):
I don't know exactly what you're looking for, but here's one approach. Break your bitstream into blocks of (say) 7. Now, suppose you get 4 0's and 3 1's. There are binomial(7,4) = 35 ways of doing this. Choose 3 of these ways to throw out; the other 32 give you 5 bits. Now, suppose you get 5 0's and 2 1's. There are binomial(7,2) = 21 ways of doing this. Choose 5 of these to throw out and use the others to get 4 bits. If the probability of a head is roughly 1/2, you get roughly .517 bits/coin flip. This is only slightly better than your first method, but the reason is that you're using 35 possibilities to get 32 random bits and throwing away the other 3 possibilities. You can use them.

Anonymous (2018-07-26 18:10):
I coded my algorithm in python.
It is available as a Gist: https://gist.github.com/YuvalFilmus/b0552be98067acec49ddaa731f2b8b74
(Unfortunately I was unable to figure out how to include it as part of a comment.)

As an example, the following call runs the generator with 20 iid bits distributed Bernoulli(0.4), generated from seed 1:

>>> test(.4, 20, 1)
[1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0]

The output is the 12 bits shown above.

Anonymous (2018-07-26 17:07):
This comment has been removed by the author.

GASARCH (2018-07-26 12:03):
Clarify please: I understand how you get the three streams
Output
Daughter1
Daughter2
but how do you put them together to get a uniformly random sequence?

Jakito (2018-07-26 06:14):
This all sounds nice in theory, but how does it apply to real physical random sources, like the Dice-O-Matic (http://gamesbyemail.com/News/DiceOMatic)? Maybe some of the papers looked complicated because they tackled the real physical situation head-on, instead of hiding behind convenient idealizations?

How would you post-process the output of the Dice-O-Matic (see https://math.stackexchange.com/questions/862993/can-a-biased-physical-random-source-be-post-processed-to-control-the-bias)?
How would you describe what you can achieve (assuming that perfectly unbiased output is not possible)? Can you prove that achieving perfectly unbiased output is not possible, no matter how you post-process the output of the Dice-O-Matic?

Anonymous (2018-07-25 12:17):
There is a simple approach that is asymptotically optimal, though convergence is somewhat slow.

We construct an object called a bitstream, which accepts iid random bits. The bitstream has two daughter bitstreams, daughter1 and daughter2. For every pair of received inputs a, b, the bitstream acts as follows:

1. If a != b, output a, and send 1 to daughter2.
2. If a = b, send a to daughter1 and 0 to daughter2.

The resulting data structure might seem infinite, but it becomes finite if you create the daughter bitstreams lazily, only when they are first accessed.

Let f(q) denote the average number of output bits per input bit when the input consists of iid Ber(q) bits. Then

    2f(q) = 2q(1-q) + (1 - 2q(1-q)) f(q^2/(q^2 + (1-q)^2)) + f(2q(1-q)).

You can check that the entropy function h(q) satisfies the same recurrence (using the chain rule), and so f(q) = h(q). In other words, the asymptotic output rate of this method is optimal. Analyzing the speed of convergence would require more work.

The algorithm is quite easy to implement in python, so here are some empirical results:

1. When p = 1/2, feeding the bitstream 10^6 random bits resulted in 982635 output bits, an average of 0.982635 output bits per input bit (compared to the entropy, which is 1).
2. When p = 0.4, the same experiment yielded an average of 0.955164 ob/ib, compared to h(p) ~ 0.971.
3. When p = 0.3, we get 0.865629, compared to 0.881.
4. When p = 0.2, we get 0.708546, compared to 0.722.
5. When p = 0.1, we get 0.461367, compared to 0.469.

In all of these cases, we get upwards of 98% of the optimum.