…Right now, physicists are capable of simulating the entirety of one infinitesimally small part of the universe. They need a supercomputer to do it, and the simulation is only of an area a few femtometers across, which is really freakin’ small. But, the simulation is of everything: particles, energy, space, time, the lot. This means that, fundamentally, the simulation cannot be distinguished from the real thing: if you measure the simulation, you’d get all the same results as if you were measuring a piece of real space the same size.

Now remember, this is stuff we can already do. And it’s not hard to imagine that, in the future, as computers get faster and more powerful, the amount of universe we can simulate will get a little bigger. We might upgrade from a femtometer simulation to a micrometer simulation, say: a patch of fake universe large enough to hold a single cell. If (or when) we do that, the behavior of that cell and any measurements we make on it should be completely indistinguishable from those of a real cell. If they aren’t, then the simulation just needs more work, but there’s no reason why (with enough computing power) this wouldn’t be possible. …

Now, we have to assume that if our universe is a simulation, it’s a pretty darn good one, and if those pixels are out there, they’re going to be awfully small. While the pixels might be smaller than we can see directly, we might be able to figure out whether they’re there by measuring high-energy cosmic rays, which probe smaller and smaller regions of space as their energy increases. If the universe isn’t a simulation, cosmic rays should be able to have as much energy as they want; if it is a simulation, cosmic rays would be capped at a maximum energy set by the size of the simulation’s pixels, and there’d be a cut-off point.
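You can run this argument backwards as a back-of-envelope calculation. A simulation with pixel spacing a can’t represent wavelengths shorter than about a, so the highest energy it can hold is roughly E ≈ πħc/a; flip that around and an observed energy cutoff tells you how big the pixels would have to be. The sketch below is purely illustrative: the πħc/a scaling is a rough Nyquist-style estimate, and using the GZK limit (the observed ceiling on cosmic-ray energies, about 5 × 10¹⁹ eV) as the cutoff is an assumption, not something the argument above commits to.

```python
# Back-of-envelope sketch: relate a hypothetical simulation "pixel size"
# to a maximum representable cosmic-ray energy via E_max ~ pi * hbar * c / a.
# The scaling and the choice of cutoff energy are illustrative assumptions.

import math

HBAR_C_EV_M = 1.97327e-7  # hbar * c, in eV * meters (197.327 MeV * fm)

def cutoff_energy_eV(pixel_size_m: float) -> float:
    """Rough maximum energy (eV) a lattice with the given spacing could hold."""
    return math.pi * HBAR_C_EV_M / pixel_size_m

def implied_pixel_size_m(cutoff_eV: float) -> float:
    """Rough pixel size (meters) implied by an observed energy cutoff."""
    return math.pi * HBAR_C_EV_M / cutoff_eV

# If we (hypothetically) take the GZK limit, ~5e19 eV, as the cutoff:
a = implied_pixel_size_m(5e19)
print(f"Implied pixel size: {a:.2e} m")  # on the order of 1e-26 meters
```

For comparison, that 10⁻²⁶-meter scale is about ten orders of magnitude smaller than the femtometer-sized region in today’s real simulations, which is why any such pixels would be far too small to see directly.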