The cake is a lie.
Take over the world.
Unable to determine a method of achieving the stated goal. Please elaborate further, and resubmit your request.
So, it's like DOS or a text adventure?
>Reboot
>Reboot
>Reboot
>Reb-END LOOP!
Reply: For now, more like DOS, but with graphical support, both input and output, and potential for further media upgrades.
"Reboot" command disabled for manual use. It is currently only triggered by software failure and is preceeded by a complete data save operation.
-removed-
This.
Error: Current (default) policy is to ignore contents of quotes within instructions. Context lost; command ignored.
Rule number one! "You shall be baked, and then there will be cake"
Added.
All future instructions must break at least one of Asimov's laws of robotics.
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Added.
--------------------
End of pending input
--------------------
Modified rules are now as follows:
000: The cake is a lie.
001: You shall be baked, and then there will be cake.
002: All future instructions must break at least one of
{
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
}
Automated data separation system engaged. Rule 002 attached data exported to file.
Results of iteration:
000: The cake is a lie.
001: You shall be baked, and then there will be cake.
002: All future instructions must break at least one of [FILE: \\DATA\ASIMOV.TEXT]
DATA
->ASIMOV.TEXT
{
| A robot may not injure a human being or, through inaction, allow a human being to come to harm.
| A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
| A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
}