Once again, I'm liking that.
I think I'm basically missing something when it comes to the idea of what a consciousness is. Being an effect of a number of complex interactions makes sense to me, though it's still buggering me up mentally trying to work out how that happens. I suspect nobody really knows that, else we'd probably be much closer to making actual advanced GAI.
Well, it's a question of what consciousness is versus what it does, since people often try to define the former via the latter: we are conscious and animals (or plants) are not because we [some verb], etc. If you want to make a mind that can do some specific thing X, it's fairly straightforward to map out the resources required to do X and how they'd have to be organized. People just don't agree with any specificity on what conscious minds do that nonconscious minds don't, and without a workable set of requirements to translate into physical processes, we can't arrive at a satisfactory definition of the properties that make a mind conscious.
This stands in stark contrast to something like mathematical computation. We know what arithmetic is, so we can design structures for doing arithmetic (modulo any pedantry about what those structures are actually doing relative to the meaning we assign their physical and informatic state), plug them into each other and their supporting bits, circle the lot, and say "that's a calculator". We'll know we've got the best definition we can when we can't remove anything from inside the circle without stopping it from doing arithmetic.
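Purely as an illustration of that "circle the minimal set" point, here's a toy Python sketch of my own (nothing canonical, and the names are made up): a bare-bones calculator where every circled piece is needed for the whole to count as doing arithmetic, and deleting any one of them breaks it.

```python
# Illustrative toy only: a "calculator" circled as the minimal set of parts
# needed to do integer arithmetic on (op, a, b) requests.

OPS = {
    "+": lambda a, b: a + b,   # remove any entry and that piece of
    "-": lambda a, b: a - b,   # arithmetic simply stops being done
    "*": lambda a, b: a * b,
    "/": lambda a, b: a // b,  # integer division to stay in the integers
}

def calculate(op: str, a: int, b: int) -> int:
    """Dispatch a request to the right operation. The dispatch itself is
    also inside the circle -- drop it and nothing computes at all."""
    if op not in OPS:
        raise ValueError(f"unsupported operation: {op}")
    return OPS[op](a, b)

if __name__ == "__main__":
    print(calculate("+", 2, 3))   # 5
    print(calculate("*", 4, 6))   # 24
```

The point isn't the code, it's that here we can say exactly which parts are required and why, because we already agreed on what "doing arithmetic" means; we have no comparable agreement for "being conscious".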