OK, let's tentatively use Wikipedia's opening blurb then, since it is arbitrarily as good as any other in the absence of more rigorous terms:
Sentience is the capacity to feel, perceive or experience subjectively.[1] Eighteenth-century philosophers used the concept to distinguish the ability to think (reason) from the ability to feel (sentience). In modern Western philosophy, sentience is the ability to experience sensations (known in philosophy of mind as "qualia"). In Eastern philosophy, sentience is a metaphysical quality of all things that require respect and care. The concept is central to the philosophy of animal rights because sentience is necessary for the ability to suffer, and thus is held to confer certain rights.
In this sense, the "sentience" of something relates to its ability to generate a unique internal signal that corresponds to the outside environment: a "feeling".
In the absurdum model I presented, this is no different from a unique signal generated in a less capable system being backfed into the simulation of the next state. The signal could be unique because of a difference in the viscosity of the grease used, or a bit of dust in the bearings; it is still a signal generated internally and backfed into the next iteration of the state.
It is not special. It is illusory to consider it separate from the operation of the device, as it is an integral part of the operation of the device.
More specifically, when I said this:
that's a quibble over a self-generated signal.
The clock does not tell time because it does not reference its own output, basically. However, see Babbage's Analytical Engine: it DOES do that, and it is still just clockwork.
Humans are little more than complex state machines that produce internal signals that get fed into the next state along with the sensory data they take in. In this way, the previous state of the state machine influences the next state of the state machine. Without knowledge of the initial state of the state machine, it is not possible to predict the next iteration of the state machine. This does not mean that the fundamental principles of the state machine cannot be known.
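The state-machine model above can be sketched in a few lines. This is a toy illustration under loudly stated assumptions: the function names (`internal_signal`, `next_state`), the arithmetic, and the numbers are all hypothetical, chosen only to show the feedback structure, not to model any actual organism.

```python
# Toy state machine (all names and formulas hypothetical): the next
# state depends on the previous state, the external sensory input,
# and an internally generated signal fed back into the next iteration.

def internal_signal(state):
    # A "feeling": a signal derived purely from the machine's own state.
    return sum(state) % 7

def next_state(state, sensory_input):
    feeling = internal_signal(state)  # self-generated signal
    # The new state mixes prior state, external input, and the fed-back signal.
    return [(s + sensory_input + feeling) % 256 for s in state]

state = [3, 1, 4]                    # initial state
for sensory_input in [10, 10, 10]:   # identical external inputs each step...
    state = next_state(state, sensory_input)
# ...yet the trajectory is shaped by the (possibly unknown) initial state,
# because the fed-back internal signal differs at every iteration.
```

The point of the sketch is only that the "feeling" is an ordinary term in the transition function, not something standing apart from the machine's operation.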
It is a direct refutation of the first paragraph of the Wikipedia article on sentience:
Philosophy and sentience
In the philosophy of consciousness, sentience can refer to the ability of any entity to have subjective perceptual experiences, or as some philosophers refer to them, "qualia".[2] This is distinct from other aspects of the mind and consciousness, such as creativity, intelligence, sapience, self-awareness, and intentionality (the ability to have thoughts about something). Sentience is a minimalistic way of defining consciousness, which otherwise commonly and collectively describes sentience plus other characteristics of the mind.
Some philosophers, notably Colin McGinn, believe that sentience will never be understood, a position known as "new mysterianism". They do not deny that most other aspects of consciousness are subject to scientific investigation but they argue that subjective experiences will never be explained; i.e., sentience is the only aspect of consciousness that can't be explained. Other philosophers (such as Daniel Dennett, who also argues that non-human animals are not sentient) disagree, arguing that all aspects of consciousness will eventually be explained by science.[3]
The inability to predict a specific instance of the type of state machine, due to incomplete initial-state data, does not prove that the knowledge of the core principles of operation is wrong. It simply means you lack sufficient information to establish the state you are transitioning from, and thus to explain why you failed to arrive at the predicted state.
"Sentience" as used by philosophers is "Magic woo sauce", if you take the above to its conclusion.
It is also why I made the jab at "Can you even define what it *IS*?" If, per the beliefs of some philosophers, it CANNOT be understood, how then CAN you define what it *IS*? ;P
If you cannot define what it *IS*, how do you define what it *IS NOT*, and subsequently, how can you determine that the simulation lacks it?
When you boil it down, it basically becomes "I am not complex enough to understand the root causes of individual quirks in behavior, thus they cannot be known." This is not rational. We are not the highest possible form of life; we (humans) are designing one that can surpass us even as we (you and I) discuss this, despite our faults making that task very difficult.
To me, the argument about "Sentience" is specious at worst, and illusory at best.