Wow. I feel like the fact that you can encode arbitrary information in long numbers is such a far-reaching concept in math/computer science; it's really satisfying to see it demonstrated in such an intuitive way.
The caveat is that a function like the evolution of the universe (General Relativity, differential equations) would likely be unwieldy and unaesthetic when written in lower dimensions, and locality might be lost as well.
So cool, you can find anything you search for: https://libraryofbabel.info/bookmark.cgi?concernedcoder0001 About halfway down the page you can find: "concerned coder is a programmer living in the southern usa, playing a game called eve online while reading hacker news on a sunday night in january"
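The reason the site can "contain" any phrase you search for is that it is just an invertible mapping between page numbers and strings. Here is a minimal sketch of the core idea (the real site uses a more elaborate function and a larger address scheme; the 29-symbol alphabet of lowercase letters, space, comma, and period matches what the site displays):

```python
# Toy Library of Babel: a reversible mapping between page numbers and text.
# Any phrase IS some page number, so every phrase is "in" the library.
ALPHABET = "abcdefghijklmnopqrstuvwxyz ,."  # 29 symbols, as on the site

def text_to_page_number(text: str) -> int:
    # Interpret the text as a base-29 numeral.
    n = 0
    for ch in reversed(text):
        n = n * len(ALPHABET) + ALPHABET.index(ch)
    return n

def page_number_to_text(n: int, length: int) -> str:
    # Invert the mapping: peel off base-29 digits.
    chars = []
    for _ in range(length):
        n, r = divmod(n, len(ALPHABET))
        chars.append(ALPHABET[r])
    return "".join(chars)

phrase = "concerned coder"
n = text_to_page_number(phrase)
assert page_number_to_text(n, len(phrase)) == phrase
```

Searching is then just running the encoder: your query determines its own "page number", which is why every possible text has an address.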
Hmm. Other sources suggest that Concerned Coder is a programmer living in the northern USA, playing a game called Eve Online while reading Hacker News on a Sunday night in January:
You can encode any arbitrary information into a one-dimensional value....
e.g. a large enough number can encode the whole universe....
But decoding any meaningful information has to be at least in two dimensions....
In two dimensions, you can describe everything.... the other dimensions are not needed (and are perhaps just emergent, according to the holographic principle)....
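The one-dimensional-encoding claim above has a classic concrete form: the Cantor pairing function, which packs any two non-negative integers into a single integer and back again, losslessly. A minimal sketch:

```python
import math

def pair(x: int, y: int) -> int:
    # Cantor pairing: a bijection from pairs of non-negative
    # integers to single non-negative integers.
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z: int) -> tuple[int, int]:
    # Invert the pairing: recover the diagonal index x + y first.
    w = (math.isqrt(8 * z + 1) - 1) // 2
    t = w * (w + 1) // 2
    y = z - t
    return w - y, y

assert unpair(pair(12, 34)) == (12, 34)
```

Applied repeatedly, this flattens any fixed number of dimensions into one, which is why one big number suffices for encoding, even if (per the comments above) decoding it into something meaningful is a separate question.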
It's the uncertainty principle and the principle of orthogonal observables: energy vs. time, for example. You need two dimensions for a comparison; one dimension is static.
1 GB of storage is a single number with 8 billion binary digits, i.e. a value up to 2^(8 billion).
Everything we do in computers is encoding information in long numbers. Every character is a single number, every file is, every disk image is, every state of RAM, every database, every cache, every piece of source code.
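This is directly visible in Python, where integers are unbounded: any file's bytes round-trip through a single (huge) integer.

```python
# Every file really is one big number: round-trip arbitrary bytes
# through a single Python integer.
data = "hello, world".encode("utf-8")
n = int.from_bytes(data, "big")           # the whole blob as one number
length = (n.bit_length() + 7) // 8
restored = n.to_bytes(length, "big")
assert restored == data  # caveat: leading zero bytes would be dropped
```

(The leading-zero caveat is why real encodings also record a length; the number alone can't distinguish `\x00abc` from `abc`.)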