The super-intelligence postulated previously would not have hardware capable of producing an independent consciousness in the same fashion that ours does. It would determine that the easiest (and least expensive) route to getting that hardware is to simply appropriate it. Hence, assimilation.
In our case, we already have the consciousness with independent agency. What we lack is the redundancy and high-end processing. Understanding how our consciousness is sustained would enable it to be migrated. (The human brain is already very plastic, and will incorporate external devices readily. Offloading processing to improve total system performance seems to be well within its existing capabilities. What it needs is a suitable selection of prosthetic systems to incorporate redundantly.)
I would rather not be the forced host to a new intelligence created by the machinations of an intelligent, but not sentient, computer system trying to give itself sentience. Instead, I would rather that my existing "entity" be allowed to fill the void of a sterile processing system, and thus imbue it with agency (my agency).
In short, I don't want to be assimilated. I wish to assimilate.
I would NOT wish to allow my entity to attempt migration into networks that already have an entity operating inside them, because the nature of my entity does not readily discern where it ends. I do not wish to merge entities with another agency. To prevent this, communication would occur only through sterile, highly filtered connections.