Ari Rahikkala (ari_rahikkala) wrote,

If I had an infinite amount of computation power available to run just one computation...

... I would run one to search through every program in numerical order until it found one that reproduced the entire content of the latest English-language Wikipedia pages-articles.xml.bz2 (the current text of every article). Granted, extracting any sort of usable AI out of the ensuing mess would almost certainly be harder than writing one from scratch in the first place... but at least it'd be interesting to see just how large that program would be.
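For concreteness, here's a rough Python sketch of what that search might look like. A few things in it are my own filler rather than anything the thought experiment pins down: brainfuck stands in as the "every program" language purely because its interpreter fits in a few lines, and the step-budget dovetailing is there because some candidate programs never halt, so you can't simply run each one to completion in turn. None of this terminates on real hardware; it only makes sense on the hypothetical infinite computer.

from itertools import count, product

def run_bf(program: str, max_steps: int) -> bytes | None:
    """Run a brainfuck program for at most max_steps steps; return its
    output, or None if it is malformed or runs out of steps."""
    tape, ptr, out, pc, steps = [0] * 30000, 0, bytearray(), 0, 0
    # Precompute matching brackets; reject unbalanced programs outright.
    stack, jumps = [], {}
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            if not stack:
                return None
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    if stack:
        return None
    while pc < len(program):
        steps += 1
        if steps > max_steps:
            return None
        c = program[pc]
        if c == ">":
            ptr = (ptr + 1) % len(tape)
        elif c == "<":
            ptr = (ptr - 1) % len(tape)
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".":
            out.append(tape[ptr])
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]
        pc += 1
    return bytes(out)

def search(target: bytes) -> str:
    """Dovetail over program length and step budget until some program
    prints `target` byte-for-byte. Needs the infinite computer."""
    alphabet = "><+-.[]"
    for budget in count(1):                  # ever-growing step budget...
        for length in range(1, budget + 1):  # ...over ever-longer programs
            for chars in product(alphabet, repeat=length):
                program = "".join(chars)
                if run_bf(program, max_steps=budget) == target:
                    return program

# On the infinite computer:
#   shortest = search(open("pages-articles.xml.bz2", "rb").read())

The first program the search returns is, up to the quirks of the chosen language, the shortest description of the dump, which is exactly the "how large would it be" question.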

Comedy option: See if a program reproducing pages-meta-history.xml.7z (all article and discussion text, plus the full edit history of both) would be significantly bigger. Incidentally, that file (from the last time anyone actually managed to create one without the dumping process dying in horrible pain at some point) is apparently over five terabytes uncompressed, but 7z brings it down to 31 gigabytes. That's actually rather impressive.
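For scale, those figures work out to roughly a 160-to-1 compression ratio. A quick back-of-the-envelope check, treating the sizes as round decimal units (they're only approximate anyway):

uncompressed = 5 * 1000**4   # ~5 TB
compressed = 31 * 1000**3    # ~31 GB
print(f"~{uncompressed / compressed:.0f}x")   # roughly 160x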