As others have said, maybe they meant 11 MB of information; information is different from data, and the raw data could well be larger.
At any rate, 11 MB per second means more than half a gigabyte per minute, roughly 40 gigabytes per hour, or close to a terabyte of data per day. It does sound like a lot.
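A quick back-of-the-envelope check in Python, assuming "11mb" means megabytes and using decimal units (1 GB = 1000 MB):

```python
mb_per_second = 11  # megabytes per second, taking "11mb" to mean MB

per_minute_mb = mb_per_second * 60            # 660 MB/min (more than half a GB)
per_hour_gb   = mb_per_second * 3600 / 1000   # ~39.6 GB/hour
per_day_tb    = mb_per_second * 86400 / 1e6   # ~0.95 TB/day

print(per_minute_mb, per_hour_gb, per_day_tb)
```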
Let's put it this way: an uncompressed 1920 x 1080 screen at 60 Hz in 24-bit color needs something on the order of 2.99 gigabits a second to run. So it certainly feels wrong that Britannica says the visual system "only" gathers 10 megabits of raw data a second.
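For reference, that 1080p figure is just pixels times bit depth times frame rate:

```python
width, height  = 1920, 1080
bits_per_pixel = 24
frames_per_sec = 60

bits_per_second = width * height * bits_per_pixel * frames_per_sec
print(bits_per_second / 1e9)  # ~2.99 Gbit/s uncompressed
```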
If that's the compressed data rate, sure, that's reasonable: artificial neural networks can achieve insane compression ratios, so one that's orders of magnitude larger than anything currently running on a supercomputer could surely manage it. YouTube recommends an upload bitrate of 66–85 Mbps for 4K HDR video, so it's not a giant leap to assume that our vision (pretending for a second that our eyes see in 4K HDR) could be compressed down to 10 Mbps by the brain, as impressive as it would be for current video codecs to pull off the same feat.
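For a rough sense of the ratios involved, here's a sketch assuming a raw 4K HDR stream at 10 bits per channel and 60 fps (my assumption, not anything Britannica or YouTube specifies):

```python
raw_bits_per_sec = 3840 * 2160 * 30 * 60   # 4K, 30 bits/pixel (10-bit HDR), 60 fps: ~14.9 Gbit/s

youtube_ratio = raw_bits_per_sec / 75e6    # ~199:1 at YouTube's recommended ~75 Mbps
brain_ratio   = raw_bits_per_sec / 10e6    # ~1493:1 if the brain squeezes it to 10 Mbps

print(youtube_ratio, brain_ratio)
```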
But if that's the uncompressed, raw data rate, that's where I'd object. At 10 Mbps, you can drive an 83 x 83 screen at 60 Hz in 24-bit color. That's not enough for much of anything. Or if you ditch color entirely, that goes up to 408 x 408 in 1-bit black-and-white. Either way, it's not a good time for anyone.
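Working the same arithmetic in the other direction, the biggest square screen whose raw bitrate fits in a 10 Mbps budget:

```python
from math import isqrt

budget_bps = 10_000_000  # 10 Mbit/s
fps = 60

# Largest N x N screen whose uncompressed bitrate stays within the budget.
color_side = isqrt(budget_bps // (fps * 24))  # 24-bit color -> 83
mono_side  = isqrt(budget_bps // (fps * 1))   # 1-bit black-and-white -> 408

print(color_side, mono_side)
```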