In a recent work Slack chat, a dev asked what a particular table contained. They were going through our data inventory and had found a randomly-named table 18TB in size. When I ran "select count()" against it, I got back 5,325,451,020,708 rows (that's a copy-and-paste).
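For a sense of scale, those two numbers imply only a few bytes per row, which is plausible for a compressed columnar store. A quick back-of-envelope check in Python (assuming the 18TB figure means binary terabytes, i.e. TiB):

```python
# Back-of-envelope: bytes per row for an 18TB table with
# 5,325,451,020,708 rows (figures from the comment above).
ROWS = 5_325_451_020_708
SIZE_BYTES = 18 * 1024**4  # assuming 18 TiB

bytes_per_row = SIZE_BYTES / ROWS
print(f"{bytes_per_row:.2f} bytes/row")  # roughly 3.7 bytes per row
```

A few bytes per row only pencils out with aggressive columnar compression; in a row-oriented store the same row count would take far more space.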
It was a temp table that we hadn't garbage collected yet. We don't make a habit of leaving that much junk data around, but it bumped our monthly storage bill by several percent; not anything like tripling it, though.
I'm assuming this is a joke. You can run databases that size without any of the fancy scalability stuff: no sharding, no anything. I'd actually recommend that; it makes admin super easy!