The worst thing here is that we can't really know the true scope unless CloudFlare have really good forensics, and their reaction seems to be downplaying the effects compared to the known impact from the Google report.
They have been sending emails to sites that use them, claiming that only 150 sites had information revealed. That figure is based on working with third-party caches (Google and similar): essentially searching the caches for inadvertently leaked details and removing them. IMO this is not enough. Those sites were certainly compromised, but based on the approach described, they can't have the confidence to say, as they do in the email, that other sites weren't.
It assumes that leaked details don't appear in private caches, or in public caches that weren't included in their original searches. There are any number of webcrawlers and cache services that may still be retaining details from sites not included in their 150. At worst, the bug may have been discovered and exploited by hammering one of the vulnerable sites to retrieve random memory in a Heartbleed-style attack. Unless CloudFlare have a way to search logs to prove that no such attacks happened, and to exhaustively review all the leaks that did happen, we can't know the actual scope of the issue. They may have that, but I haven't seen the question answered yet.
Just now I was able to recover a GET request for what appears to be an Indonesian adult video streaming site (I fortunately waited until I was home to give this a go rather than exploring on my work PC). It came from a public cache that was earlier used for a public proof of concept and that I have seen reported elsewhere as cleared (one of the caches working with the team). In this case it appears to be a record of a visit by the same webcrawler that created the cache (based on the user agent and other fields), but it does show that potentially sensitive (or at least amusingly compromising) information may still be public. From other chatter it looks like the major caches are slowly being purged, but people who got in early were still able to pull information from Google, as it took the longest to clean. And my example suggests the cleaning might not be 100% everywhere even now.
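For anyone wanting to check whether a public cache still holds snapshots of a particular site, here is a minimal sketch of the kind of lookup involved. It's my own illustration, not how CloudFlare or the cache operators describe their clean-up: it queries the Internet Archive's CDX API, and the domain is a placeholder.

```python
# Minimal sketch: ask the Internet Archive's CDX API whether it still holds
# captures of pages under a given domain. The domain is a placeholder; in
# practice you'd point this at a site known to sit behind CloudFlare's proxy.
import json
import urllib.request

DOMAIN = "example.com"  # hypothetical target

url = (
    "https://web.archive.org/cdx/search/cdx"
    f"?url={DOMAIN}/*&output=json&limit=25"
)

with urllib.request.urlopen(url) as resp:
    rows = json.load(resp)

if not rows:
    print("no captures found")
else:
    # First row is the header; the rest are individual captures.
    header, captures = rows[0], rows[1:]
    for capture in captures:
        record = dict(zip(header, capture))
        # The timestamp plus original URL are enough to fetch the archived
        # copy and inspect it by hand for leaked headers, cookies or bodies.
        print(record.get("timestamp"), record.get("original"))
```

Each returned capture can then be fetched and eyeballed for leaked request data, which is roughly what the public proofs of concept did against Google's cache.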
EDIT: Also worth mentioning: it would be trivial for an organisation that held a substantial cache to keep a private copy while purging the public one. Unless CloudFlare are more forthcoming, you have to assume that any traffic with sites that use their reverse proxy service has been compromised, including HTTPS. Encryption on top of HTTPS (as with 1Password) would still be safe, but that needs confirmation on a site-by-site basis.
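To make the "encryption on top of HTTPS" point concrete, here is a small sketch of the principle (not 1Password's actual protocol): the payload is encrypted with a key that never leaves the client, so even if the proxy leaks the decrypted TLS stream, only ciphertext is exposed.

```python
# Illustration only: client-side encryption layered over HTTPS.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

client_key = Fernet.generate_key()  # stays on the client, never transmitted
f = Fernet(client_key)

secret = b"vault item: hunter2"
ciphertext = f.encrypt(secret)

# `ciphertext` is what actually travels over HTTPS and through the proxy;
# a memory leak at the edge would expose only this opaque token.
print(ciphertext)

# Only the holder of client_key can recover the plaintext.
assert f.decrypt(ciphertext) == secret
```

Whether a given site actually layers its own encryption like this is exactly the thing that needs confirming per site.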