Nothing specifically states that the quantum pairs are literally pairs of atoms or molecular fragments. Assume the quantum "black box" is a large collection of entangled atoms synchronized within their own data-storage framework (a supremely dense punch card, for illustration's sake).
Measuring spin doesn't disentangle the pair; it merely causes the corresponding pair member to take on the measurable properties (spin) of the originally measured twin.
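To make those assumed mechanics concrete, here's a toy Python model of the block. Every name here is invented for illustration, and the shared spin list only stands in for the story's assumed synchronization; real entangled pairs do not mirror imparted measurement values like this.

```python
# Toy model of the fictional "punch card" entanglement block.
# The shared _spins list represents the story's assumed synchronization
# between the two distant halves of each pair; this is NOT real physics.
class EntangledBlock:
    def __init__(self, size):
        self.size = size
        self._spins = [0] * size  # every pair starts in the agreed "0" state

    def measure(self, index, impart=None):
        """Measure one pair member. Per the fiction, an imparted spin is
        instantly taken on by the distant twin (hence the shared list)."""
        if impart is not None:
            self._spins[index] = impart
        return self._spins[index]
```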
Transmission source: measure each member of the local entanglement block, imparting a spin that encodes one piece of the message on the entangled "punch card".
Reception site: measure the entanglement block locally to read the incoming data. Measure the entire block again to reset each quantum bit to 0 (returning every spin to the agreed "0" state), then measure it once more to impart the reply data to the local "punch card".
Transmission source (next cycle): measure the entanglement block again, after a preset delay to avoid interference from the "blank" state imparted at the reception site. Perform the block-reset measurement and repeat the procedure (a toy sketch of one full cycle follows below).
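Continuing the toy above, here is a minimal sketch of one full half-duplex cycle. `write_block`, `read_and_reset`, and the delay value are all my own invented names and numbers, not anything established by the scenario.

```python
import time

RESET_DELAY_S = 0.001  # the "preset delay"; this value is invented for illustration

def write_block(block, bits):
    """Transmission step: measure each member, imparting one bit per pair."""
    for i, bit in enumerate(bits):
        block.measure(i, impart=bit)

def read_and_reset(block):
    """Reception step: read the whole block, then perform the
    block-reset measurement that returns every spin to "0"."""
    bits = [block.measure(i) for i in range(block.size)]
    for i in range(block.size):
        block.measure(i, impart=0)
    return bits

# One full cycle of the scheme described above, using the toy EntangledBlock.
block = EntangledBlock(size=8)
write_block(block, [1, 0, 1, 1, 0, 0, 1, 0])   # source imparts data
print(read_and_reset(block))                   # reception reads, then resets
write_block(block, [0, 1, 1, 0, 1, 0, 0, 1])   # reception imparts its reply
time.sleep(RESET_DELAY_S)                      # source waits out the preset delay
print(read_and_reset(block))                   # source reads the reply, resets
```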
It sounds clunky, but each measurement is a local operation happening at worst at c, and current testing indicates the entanglement synchronization occurs at *at least* 10,000c.
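For a sense of scale, a quick back-of-envelope using that 10,000c lower bound; the Earth-Mars distance is my own illustrative assumption, not part of the scenario.

```python
# Back-of-envelope latency using the ">= 10,000c" figure quoted above.
C = 3.0e8                # m/s, speed of light
SYNC_SPEED = 1.0e4 * C   # "at least 10,000c"
DISTANCE = 2.25e11       # m, rough average Earth-Mars separation (assumed)

print(f"light-speed lag: {DISTANCE / C:.0f} s")               # ~750 s
print(f"sync lag at 10,000c: {DISTANCE / SYNC_SPEED:.3f} s")  # ~0.075 s
```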
There is a problem with this scheme: deriving some form of orthogonal limits on the vast amount of physical data present in a given system (so that it can carry standard "classical" information) is hard, even in the unified "punch card" entanglement block above. Still, I don't think it's much of a stretch to assume that sometime between now and the brain cancer future that doesn't exist, someone has developed a system to orthogonally quantify the physical information in such entanglement blocks and create a working classical information-delivery system.