That is... not quite right.
Most of the code has to be downloaded whenever someone plays the hosted version, so data flows from the Colab runtime to Google Cloud and is then presented to the player. The player's actions are saved, transmitted to the database, processed, and the response is sent back. Most computers can't run the model at all, so the team is dedicated to keeping a hosted version available, but the cost of moving so much data between the different services is spiraling out of control.
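For what it's worth, the round trip he describes looks roughly like this (all the names here are made up to illustrate the flow, this is not the project's actual code):

```python
# Illustrative sketch of the hosted round trip described above.
# Every function name is a stand-in, not real AI Dungeon code.

def handle_player_action(action: str) -> str:
    save_to_database(action)      # player's action is persisted first
    response = run_model(action)  # model runs on the Colab-hosted runtime,
                                  # pulling its data from GCS as needed
    return response               # response is sent back to the player

# Stand-ins so the sketch runs on its own:
def save_to_database(action: str) -> None:
    print(f"saved: {action}")

def run_model(action: str) -> str:
    return f"(model output for: {action})"

if __name__ == "__main__":
    print(handle_player_action("look around"))
```

The expensive part is that middle step: every action means data moving between GCS and the Colab runtime, which is where the egress fees come from.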
His last tweet on the subject was,
"I'm just confused why I would be getting several thousand dollar egress fees for GCS to colab transfers. Seems strange."
You can go through all the steps he has outlined to play locally, but the resource costs are massive: he recommends no less than 12 GB of GPU (yes, graphics!) RAM and a working CUDA install. Meanwhile, they are trying to find a better way to handle the terabytes of data generated by so many people running it at once, or a cheaper way to transfer the data from the neural net.
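If you want to check whether your machine even clears that bar before attempting the local setup, something like this works (assumes you have PyTorch installed; the project itself may use a different framework, so treat this as a generic sanity check):

```python
# Generic GPU sanity check -- assumes PyTorch; the 12 GB figure is
# the recommendation from the post, not a hard technical limit.
import torch

if not torch.cuda.is_available():
    print("No CUDA-capable GPU detected; the local version won't run.")
else:
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    print(f"GPU VRAM: {vram_gb:.1f} GB")
    if vram_gb < 12:
        print("Below the recommended 12 GB -- expect out-of-memory errors.")
```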
He mentioned his lab was covering the fees, but it quickly hit $10k per day, so they are trying to figure out why the cost has ballooned so badly. It went far beyond their expectations; they assumed transfers between two services that Google manages would be cheaper.
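To put that number in perspective (the per-GB rate here is my assumption based on typical GCS internet egress pricing, not anything he quoted):

```python
# Back-of-the-envelope estimate of the transfer volume implied by $10k/day.
# The ~$0.12/GB egress rate is an assumed ballpark, not a quoted figure.
daily_cost = 10_000   # USD per day, from the post
egress_rate = 0.12    # USD per GB, assumed

implied_transfer_gb = daily_cost / egress_rate
print(f"Implied egress: ~{implied_transfer_gb / 1024:.0f} TB/day")
```

That works out to somewhere around 80 TB/day, which is why "find a cheaper way to move the data" is the whole ballgame here.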
Really, the best hope is that Google steps in. He has been trying to contact them about a better way to handle the data transfers.
-Edit-
Oh, lol, I see what you were saying. There were also so many attempts to download the game that they had to stop allowing local downloads too, on top of the hosted version being offline.