Comments on Computational Complexity: Counting Go (Lance Fortnow)

John Tromp (2006-01-14 11:30):
Although Gunnar co-authored the paper with me, Michal Koucký deserves credit for developing the first file-based implementations and for doing the majority of the computational work in obtaining the 15x15 and 16x16 counts. We have been running the 17x17 computation for about three months now and will need another two to three months to finish.

John Tromp (2006-01-12 07:08):
Bram wrote: "With 300 billion positions and ten terabytes of storage, the amount of space for storing each number is about 30 bytes, which can store a number up to about 10^72, which is several times as long as the modulus listed on their page for a partial result on the size 17, but at least I'm well within an order of magnitude, so the rest of the slop is probably implementation details."

It's actually over 363 billion states. What you forgot to take into account is that the disk holds both an old state set and a new state set, each with some redundancy: on average we produce three copies of each state, often too far apart in time to merge them in memory. You also need to encode the states themselves in addition to the counts. We avoid disk seeks by using only sequential file access, and we keep each file sorted by state, which allows a differential encoding that takes between 1 and 2 bytes per state on average.
A column-dependent encoding keeps the redundancy down to 1.5.

In summary, the space required is

  #datasets * #states * redundancy * size_of_state-count_pair = 2 * 363 billion * 1.5 * (1.2 + 8) bytes ≈ 10 terabytes.

-John

Anonymous (2006-01-11 00:36):
Out of curiosity, what is a "paranimf"?

Macneil Shonle (2006-01-10 19:17):
"last time you bashed people who are using computers to find large primes"

No, he didn't. He was just asking why it qualified as news. The difference is a very wide chasm. For example, if there were a story that said "man stops at red light," you'd ask "why is this news?" That would not be the same as "bashing" someone for stopping at a red light. (In particular, you might even be glad that people do stop at red lights.)

Anonymous (2006-01-10 13:07):
Actually getting off my ass and reading the paper, I see that I missed the possibility that libertyless regions can be connected to each other, which dramatically increases the number of possibilities. I also missed that you can move the border one point at a time instead of doing the whole row at once, which removes a quadratic blowup in exchange for a factor-of-19 multiplier and a small amount of extra disk space. It also dramatically reduces the number of disk seeks you have to do.
On real computers, the runtime for problems like this is mostly proportional to the number of disk seeks.

The number of border states (Table 1 in the paper) is about 300 billion, and the authors use Chinese remaindering to keep the space needed for each count under control, at the expense of having to do more passes.

With 300 billion positions and ten terabytes of storage, the space available for each number is about 30 bytes, which can store a number up to about 10^72. That is several times as long as the modulus listed on their page for a partial result on the 17x17 board, but at least I'm well within an order of magnitude, so the rest of the slop is probably implementation details.

Optimizing the algorithm to reduce disk seeks is an interesting and apparently nontrivial problem, and one the paper doesn't discuss at all. The main difficulty is that if a chain is connected to other chains further back, then giving it a liberty can affect the counts of distantly related positions, causing disk seeks. Perhaps a different grouping of positions could fix this problem.

Anonymous (2006-01-10 12:03):
My first guess as to how they're doing this is by extending upwards. First you count all possibilities for a 1x19 board with an exposed edge, then a 2x19 with an exposed edge, then 3x19, and so on. For each one, you calculate the number of times each border configuration occurs; for the full 19x19 you then add up all the configurations that don't leave a dead region on the border. Each cell of the border is in one of five states: empty, white that isn't alive, black that isn't alive, white that is alive, or black that is alive.
You then try all possibilities against all possibilities for the next row, with the limitation that a stone that was part of a not-alive region can't get sealed off.

Obviously a stone that isn't alive can't border one that is, so the number of possibilities for the border isn't all that much more than 3^19, which is about a gigabyte, though the real number is somewhat more than that. The ten-terabyte figure appears to be taken from the wildly conservative 5^19 bound, a value that would leave the vast majority of the disk unused, because about 3/4 of the time a stone will be made alive by an empty space on one or both of its sides (or on the sides of the extended run of same-colored stones it's part of).

Does all this make sense? If it does, it should be possible to calculate the whole value with vastly less disk space.

pálenica (2006-01-10 08:24):
Hey, Lance, here we go: last time you bashed people who are using computers to find large primes, and now you endorse a guy who is counting positions in Go. How is Go counting different from large-prime-finding?
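[Editor's note] John Tromp's space estimate in the thread above can be checked in a few lines of Python. The figures (2 data sets, 363 billion states, redundancy 1.5, roughly 1.2 bytes per encoded state plus 8 bytes per count) are all his; only the variable names are mine.

```python
# Transcription of John Tromp's disk-space estimate from the comment above.
datasets = 2              # an old and a new state set live on disk at once
states = 363e9            # over 363 billion border states
redundancy = 1.5          # duplicate states left unmerged, after column-dependent encoding
bytes_per_pair = 1.2 + 8  # ~1.2 bytes per differential state encoding + 8 bytes per count
space_tb = datasets * states * redundancy * bytes_per_pair / 1e12
print(f"{space_tb:.1f} TB")  # prints "10.0 TB"
```

The product comes out at about 10.02 terabytes, matching the "10Tbytes" in the comment.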
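[Editor's note] The definition both commenters rely on is that a position is legal iff every chain of same-colored stones has at least one liberty (an adjacent empty point). As a sanity check on that definition only — not the paper's border-state algorithm, which the comments above sketch — here is a hypothetical brute-force counter, feasible only for tiny boards; the function name and structure are mine, not from the paper.

```python
from itertools import product

def count_legal(n, m):
    """Brute-force count of legal Go positions on an n x m board.

    A position is legal iff every maximal chain of same-colored
    stones is adjacent to at least one empty point (a liberty).
    Enumerates all 3^(n*m) colorings, so only tiny boards are feasible;
    the paper instead sweeps a border of width m, keeping counts per
    border state.
    """
    cells = [(i, j) for i in range(n) for j in range(m)]
    total = 0
    for colors in product((0, 1, 2), repeat=n * m):  # 0 empty, 1 black, 2 white
        board = dict(zip(cells, colors))
        seen = set()
        legal = True
        for c in cells:
            if board[c] == 0 or c in seen:
                continue
            # flood-fill the chain containing c, noting any liberty found
            chain, stack, liberty = {c}, [c], False
            while stack:
                i, j = stack.pop()
                for d in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                    if d not in board:
                        continue  # off the board
                    if board[d] == 0:
                        liberty = True
                    elif board[d] == board[c] and d not in chain:
                        chain.add(d)
                        stack.append(d)
            seen |= chain
            if not liberty:
                legal = False
                break
        total += legal
    return total

print(count_legal(2, 2))  # prints 57
```

For 1x1 this gives 1 (a lone stone has no liberty, so only the empty board is legal) and for 2x2 it gives 57; the border-state method discussed above reaches the same numbers without enumerating whole boards.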