So I watched Inception, read MSPA's Homestuck, and wanted to reconcile a desire to play something similar with a past tendency to subvert RPGs like DnD through excessive applications of logic. The result of a week's typing and collaboration on the subject is spoilered for length below, and is currently in rough draft form, neither logically organized nor mechanically refined; the equations are fine but the constants need tweaking. I thought bay12, with their experience with systems of great complexity, might enjoy looking at it and perhaps offer insight into how I can make it clearer/better. I'm already working on refining/developing the pseudocode mentioned below into something more consistent, and I'm working on a series of physical assumptions to make the currently daunting amount of mathematics necessary to play into something more digestible. Any additional insights/comments/criticism would be welcome.
Godology
The multiverse in which Godology is set is the artifact of some clever but not particularly visionary programming on the part of a far-future humanity intent on entering a virtual world in which to live out their lives unimpeded by physical law. The initial attempt was a tremendous failure: billions of nascent gods working out the kinks in omnipotence caused massive failures in the underprepared code, necessitating the activation of reparatory measures that restored the primacy of physical laws over individual desire, lest the universe again collapse into infinite loops and logical impossibility. This kept everyone sane, but it locked away godhead completely, denying the inhabitants access to the equations driving the reality of the simulation. Thousands of years later, the original entrants into the simulation are dead, leaving autonomous pseudogenetic "offspring", artificial meshings of the withered minds of gods, who have simply forgotten what the safety switches encrypted away and live out their lives in blissful ignorance of the fundamental mutability of their universe.
Except a very few...
These few pioneers, these code-surfing neuronauts, have begun to rediscover the ways in which the universe can be altered, limited this time to relatively small modifications by their capacity to decrypt the code in real time. In time, if they can adapt quickly and work quietly, they may avoid the downfall of their distant predecessors; if they act rashly, it is more likely that the full wrath of the debugging programs will descend upon them in a tide of relentless reality-redefining rage, and even gods may die.
Key Concepts-
1. Code Complexity: the computing controlling the Godology multiverse is far too complex to be altered rapidly and directly by a normal human or AI mind at the machine code level. Those who would manipulate the world do so via half-subconscious algorithms, capable of inflicting relatively crude modifications on this insanely complex code in real time. These algorithms determine much of a given individual's manipulatory capabilities. Incidentally, the programming language used to create the code is lost to time; current manipulators work through finely nuanced thought routines to emulate, however unknowingly, the original commands, without knowing their syntax.
2. Subjective Oddity: the regulatory programs currently assuring that the multiverse functions as intended are still hard-coded to modify the universe according to human standards, resulting in a sort of physics-of-belief that changes by locality. In some regions there is no gravity, and this is accepted by the residents as perfectly normal and therefore not corrected by the debuggers. If a person from Earth-normal physics were suddenly inserted into such a realm, the resulting shock would trigger an absurdity response. In effect, the GM may reassign physical constants by region.
3. Absurdity Response: debugging is a resource-intensive process and normally not employed, which naturally makes it difficult for the architecture to detect when and if reality is deviating from accepted norms. The trigger for a debugging procedure's initiation is the feeling by an individual that something is absurd, that part of the universe is not right; this trips detectors that begin an examination of everything in the objector's field of perception. If glaring flaws in the program are discovered (e.g. the sky is suddenly puppies because some transistor flipped a bit), they are corrected; otherwise it is assumed that the universe is still obeying the generally accepted physics of the region and may continue as normal. The farther back in both time and complexity a flaw in the logic is, and the fewer people who object to it, the less chance it has of being detected and expunged. Skilled manipulators forge pasts for all of the matter and energy used in their manipulations to prevent this, and above all they are subtle.
4. Reality Rejection: the above is a proportional process; the following is absolute. Occasionally something happens, due to normal code slippage, that is interpreted by observers as being so far out of reality that it drives them partially insane, leaving them utterly unable to reconcile their worldview with their perception. This has a unique response code submitted from the client, which instantly triggers applicable debug code and mandates that something be changed; if an error is not found immediately, the debuggers will work to construct a patch ex nihilo for the disturbance. Raw code always triggers one of these; see below.
5. Subtle Response: the debugging programs, though infinitely powerful within their domain, work under the same constraints that cause them to activate; they are programmed to deal with false or erroneous code in the least objectionable way possible. It is this that keeps manipulators alive: the debuggers cannot simply halt a server and delve into the code to root them out, nor simply force them to vanish. Their work is just as accountable, and as bound by accepted physical laws, as that of their enemies, and this makes them vulnerable.
6. Raw Code: this is among the most grievous errors that can occur in the simulation, and the source of almost all manipulators. A rare flaw in the code, generated by infinitely recursing references to impossible physical properties, throws a specific exception that causes a console to appear, detailing the erroneous information and allowing anyone capable of formatting their data properly to submit a solution. Unfortunately, to those bereft of their predecessors' training, this appears as a wordless, surreal chaos of random numbers and formulae written in glyphs that seem to continue into higher dimensions. While it is possible to understand the concept of manipulation intellectually without it, exposure to raw code is necessary in practice: it gives the subconscious of a potential manipulator enough data to work with, and, as the console polls the clients for a solution, it simultaneously undoes the lock on data entry automatically placed on all non-debugger entities at entry.
Skills- Characters are the sum of their skills.
Almost all skills are subfields of a particular science, and characters are given a rank in each, starting at 0 with no cap. When decoding, analyzing, or otherwise performing a non-energistic task, apply the most relevant field's formula result as the base chance of success; when manipulating, the most relevant field's result is the efficiency. The formula is (PI()/2 + ATAN(0.1*(rating-50))) / PI() * 100%; when a skill's percentage or a roll is called for, this number is that percentage, or the maximum result of RAND()*100% that may be considered a success.
Note that this means that infinite skill points may be applied without reaching a certainty of success.
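For the spreadsheet-averse, here's that curve as a minimal Python sketch (the function name is mine, not part of the system):

    import math

    def skill_percent(rating):
        # Success chance (0-100%) for a skill at the given rating.
        # Rating 50 is the 50% midpoint; the curve asymptotes at 0% and 100%.
        return (math.pi / 2 + math.atan(0.1 * (rating - 50))) / math.pi * 100

    # skill_percent(0)   ~  6.3%
    # skill_percent(50)  =  50.0%
    # skill_percent(100) ~ 93.7%
    # skill_percent(500) ~ 99.3%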
To gain a skill at rating x requires 10*(x^2 + x + 1)/2 GB of skill-related data to be stored in a player's personal data cache. Skill data may always be deleted, but cannot be compressed or transferred.
Skills gain in rank by having grist expended to experiment with them, or to collect data by postulating hypothetical particles and recording the results. 1 bG converts to (Ripplesense% + (1 - existing skill%))/2 GB applied toward a skill, at a maximum rate of 1 bG per second. Players start with 1 TB of data to allocate to skills as they see fit.
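A sketch of that bookkeeping, assuming the percentages are meant as fractions (0.5 for 50%), which the (1 - existing skill%) term suggests:

    def data_for_rating(rating):
        # Total skill data (GB) a cache must hold for a skill at this rating.
        return 10 * (rating**2 + rating + 1) / 2

    def data_per_grist(ripplesense_frac, existing_skill_frac):
        # GB of skill data gained per bG spent, at a max rate of 1 bG/second.
        return (ripplesense_frac + (1 - existing_skill_frac)) / 2

    # data_for_rating(13) = 915 GB, data_for_rating(14) = 1055 GB: a starting
    # 1 TB cache caps a single skill at rating 13 (taking 1 TB = 1000 GB).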
Special Skills (i.e. not scientific subfields)-
Psychoprojection is efficacy, or efficiency at establishing manipulations through resistance
Psychoreinforcement is psychological resilience, or efficiency at resisting unwanted manipulations
Psychocollation is mnemonic efficiency, or cache size in TB. Psychocollation 1 is necessary for a character to exist, while 2 is necessary for manipulatory capacity.
Ripplesense is observational capacity, or grist accumulation rate and analysis material accretion
Build Grist (Thanks, AH) - One of the first tricks any manipulator learns is to save the junk data continually filtered out of the simulation for re-use, catching the unused data entries and obsolete code in a slush of valid but unused data colloquially referred to as Build Grist (bG), and to use it to effectively cheat the system. By assigning manipulated matter obsolete codes and disused values in its history, it is possible to fool casual causal investigation into one's work, making traces turn up legitimate sources and histories for the associated matter; accordingly, all manipulators who wish to remain manipulators quickly learn to hoard grist and employ it in their work. Players accrue their Ripplesense rating in grist every hour, and can acquire the square of their rating by spending that hour in an activity similar to meditation, actively seeking out bad data. Players can also attempt to deliberately decompose matter into grist by manipulating it into bad code (an algorithm that returns NaN for a physical property is sufficient; these don't consume grist, since technically no energy is inserted, but do consume c^3 lines per second per particle), but this is risky; observers unaware of what's going on will naturally begin radiating a strong absurdity response on seeing matter suddenly warp and disintegrate. Decomposition grants m*c^2*x*y*z grist to the player's cache, where x, y, and z are the player's Quantum Mechanics, Ripplesense, and Informatics percentages, c is the speed of light, and m is the mass of the disintegrated matter. Build grist is expended at a rate of 1 bG per kJ of manipulation.
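A sketch of the grist economy as I read it (skill percentages as fractions; whether m*c^2 is counted in J or kJ of bG is one of those constants that needs tweaking):

    C = 299_792_458  # speed of light in m/s

    def grist_per_hour(ripplesense_rating, seeking=False):
        # Passive accrual equals the Ripplesense rating; an hour of active,
        # meditation-like seeking yields the square of the rating instead.
        return ripplesense_rating**2 if seeking else ripplesense_rating

    def decomposition_grist(mass_kg, qm_frac, ripplesense_frac, informatics_frac):
        # bG from deliberately decomposing matter into bad code:
        # m * c^2 * (Quantum Mechanics%) * (Ripplesense%) * (Informatics%).
        return mass_kg * C**2 * qm_frac * ripplesense_frac * informatics_frac

    # One gram at 50% in all three skills yields ~1.1e13 units of energy-grist,
    # before settling the J-vs-kJ question, plus a lot of absurdity response.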
Grist is stored as data in a player's memory, in the form of pointers to the usable bad data. As stored, one unit of bG takes up one hundred GB; this can be reduced by one order of magnitude per successful Cryptology roll per existing degree of compression (the ratio uncompressed/compressed equals the rolls required to reduce by another order of magnitude; treat the compression as an algorithm for speed-limit purposes, with c^3 lines required per byte per roll), down to a minimum of one MB per bG.
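The compression ladder gets steep quickly; a sketch, counting successful rolls only (expected attempts are successes divided by Cryptology%, and each roll also burns c^3 lines per byte):

    def rolls_to_compress(target_mb, start_mb=100_000):
        # Successful Cryptology rolls to shrink one bG from 100 GB toward the
        # 1 MB floor; each order-of-magnitude step costs rolls equal to the
        # compression ratio achieved so far.
        size_mb, ratio, rolls = start_mb, 1, 0
        while size_mb > target_mb and size_mb > 1:
            rolls += ratio
            size_mb //= 10
            ratio *= 10
        return rolls

    # rolls_to_compress(10_000) = 1     (100 GB -> 10 GB)
    # rolls_to_compress(1) = 11_111     (100 GB -> the 1 MB floor)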
Items- In addition to mundanely carrying items around, players can also cache them. Items require 1 MB per atom; this can be compressed with Cryptology to 1 kB, but many players prefer to store the least complex algorithm necessary to recreate the object and grist the item itself.
Manipulation- When a manipulation is attempted, any entity aware of it can attempt to block it: RAND() * (skill used for the manip)% * Psychoprojection rating vs. RAND() * (skill used for the manip)% * Psychoreinforcement rating.
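A minimal sketch of that opposed roll, assuming each side uses their own rating in the skill the manipulation runs on (function and parameter names are mine):

    import random

    def manipulation_succeeds(att_skill_frac, psychoprojection,
                              def_skill_frac, psychoreinforcement):
        # RAND() * skill% * Psychoprojection vs.
        # RAND() * skill% * Psychoreinforcement; ties favor the defender.
        attack = random.random() * att_skill_frac * psychoprojection
        block = random.random() * def_skill_frac * psychoreinforcement
        return attack > block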
Manipulations require energy, and therefore grist, equivalent to the energy required to do whatever they're doing mundanely, and output the equivalent change to their environment as predicted by physics/chemistry/whatever field at a level of sophistication agreeable to all players. This number is multiplied by the percentage of the most relevant skill to get the actual output; alternatively, the needed grist input is divided by that percentage to get the necessary input. This also covers recovery, as with the excess energy after manipulating an exothermic reaction or other energy- or matter-producing manipulation: skill% of it is actually converted back into grist as unused pointers are identified and re-stored at their previous level of compression. Note that players are not assumed to use grist for manipulating their own brain chemistry, and therefore their muscle movements; the simulation accepts without objection any biologically possible state of brain chemistry, and any state of memory in a client-run computer that the client program submits. "Their own brain" is defined as any Turing-complete processor capable of communicating within itself by a constant means of signal transmission and containing the 1 TB client identifier memory set that serves as a given client's unique reference code set. This also obviates questions of accuracy, physical prowess, etc.; the calculations necessary to match one's actions to a mathematical ideal are done subconsciously and automatically after raw code exposure, making them effectively unlimited and therefore not tracked. Determining that ideal still requires calculation, however.
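The energy bookkeeping, sketched (1 bG per kJ, skill percentage as a fraction):

    def grist_input(energy_kj, skill_frac):
        # Grist needed to drive a manipulation requiring energy_kj mundanely.
        return energy_kj / skill_frac

    def grist_recovered(excess_kj, skill_frac):
        # Skill% of any excess energy output converts back into stored bG
        # as its pointers are identified and re-stored.
        return excess_kj * skill_frac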
Sample Manipulations-
Inertial transfer (uses mechanics) - loads energy into an object as kinetic energy along a defined vector. In effect, you can flick bullets at people at ballistic velocities (see the worked example after this list).
Translocation (uses quantum mechanics) - item simply relocates to the destination coordinates. Requires grist equivalent to the object's mass. This one's a teleport.
Connectivity Alteration (uses geometry) - Two two-dimensional planes are defined and objects are teleported between them. Requires grist loaded into the planes at creation sufficient to move the item through the intervening distance normally; at grist exhaustion the planes close. Think "door to another world" here.
Alchemy (uses chemistry) - Makes and breaks chemical and subatomic bonds. Requires grist sufficient to reach all transition states in the reaction.
Inloading (uses neurology) - lets you copy or otherwise define or manipulate data held in a sentient mind. Requires grist sufficient to move around that mass of neurons. Think Agent Smith copy.
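The worked example promised above, for inertial transfer, using the grist rule from the Manipulation section (the numbers are mine):

    def flick_cost_bg(mass_kg, velocity_ms, mechanics_frac):
        # Kinetic energy in kJ, divided by the Mechanics fraction for the
        # actual grist input.
        kinetic_kj = 0.5 * mass_kg * velocity_ms**2 / 1000
        return kinetic_kj / mechanics_frac

    # An 8 g bullet flicked to 400 m/s at Mechanics 50%:
    # KE = 0.64 kJ, so the flick costs 1.28 bG.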
Note that these are simple, suboptimal examples. Anything physically possible, assuming energy input of the type and magnitude allotted to it, is possible, and has a related skill of similar specificity to the above for increased efficiency; a full list is forthcoming. For the sake of efficiency, many manipulators and debuggers specialize in one or a few skill-based types of manipulation, and beyond that many have specific aesthetics that come through in their more intricate work as a product of their life (or, in the case of debuggers, job) experience. Aside from grist, manipulations require sufficient psychoprojective vim to enable the algorithms employed to iterate through the subject matter. Manipulations are written in pseudocode; each line adds 1 to the necessary rating in the related skill. Think of the algorithms as foreach loops iterating through every particle; if an actual language would help, Perl is probably the closest approximation at this point.
Sample algorithm lines:
inertia(particle) = v, vector (x,y,z)
break every C-H bond
particle.adjacent(right) = (x,y,z)
Per second, all entities may iterate one mole * (Psychoprojection rating)^2 lines; all players are assumed to act simultaneously, or nearly so, each second, so in RPG parlance "turns" are one second long.
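The per-turn throughput, sketched; the particle count is the real constraint, since foreach-style algorithms run every line once per particle:

    AVOGADRO = 6.022e23  # particles per mole

    def line_budget(psychoprojection):
        # Lines an entity may iterate in one one-second turn.
        return AVOGADRO * psychoprojection**2

    def particles_per_turn(psychoprojection, algorithm_lines):
        # How much matter an algorithm of a given length can touch per turn.
        return line_budget(psychoprojection) // algorithm_lines

    # Psychoprojection 10 with a 3-line algorithm covers ~2e25 particles
    # per second, roughly 33 moles of matter.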
In order to effect an algorithm, a manipulator must have knowledge of the unique reference codes of every particle to be manipulated. These codes are obtainable in unit time for any matter within a radius of the manipulator's Ripplesense rating in meters, may be stored if desired at 1 mole per GB, and may be guessed for any matter beyond that radius, with a Ripplesense% * 1/((PI()/2 + ATAN(0.1*(distance-50))) / PI() * 100%) chance of being correct for a given mathematically defined shape. Algorithms iterated over a failed particle set consume lines, but not grist; it is advisable to run a monoline algorithm to check the validity of a given set before running a more intensive one.
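One reading of that guess formula (the draft's precedence is ambiguous; here Ripplesense% is divided by the standard skill curve evaluated on distance, both as fractions):

    import math

    def guess_chance(ripplesense_frac, distance_m):
        # Chance of guessing the reference codes of a mathematically defined
        # shape of matter beyond the Ripplesense radius, capped at 100%.
        # As written, the chance never drops below the raw Ripplesense%,
        # which may be one of the constants in need of tweaking.
        curve = (math.pi / 2 + math.atan(0.1 * (distance_m - 50))) / math.pi
        return min(1.0, ripplesense_frac / curve)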
Absurdity Response- When a sentient entity (here defined as an entity possessing the 1 TB unique reference code generated at the pseudogenetic intercombination of two sentient entities) is aware of something out of the ordinary, an absurdity response begins. The minimum apparent bG necessary to produce the phenomenon in question is determined (e.g. a high-tech society could accept a robot as a robot, but a low-tech one would view it as compounded metal) and debuggers are summoned: the sum total of the absurdity responses of everyone aware of the phenomenon is calculated, and sqrt(sum) debuggers are spawned, with sqrt(sum)*10 GB of skill data apiece, as close to the reported location of the phenomenon as possible; typically they are roughly equivalent in size and shape to an adult human. They first scan the item in question, making an analysis check once per level of historical recursion, as a player would, to determine its origin on a molecular level; if it is proven not to have arisen from natural matter, they begin deconstruction. If the unnatural source is identified as a player (one more analysis check), they deploy to forcibly remove the player from the simulation. Mechanically, they work exactly like player characters.
If the bG required is incalculable, a reality rejection occurs: (entities aware of the phenomenon) debuggers are called, with (entities aware of the phenomenon)*c GB of skill data apiece, and they automatically begin deconstructing the phenomenon or patching reality around it to exclude it. This wave is also capable of unleashing a time stop: if THEY suffer a reality rejection, are destroyed, or detect no non-rejected entities in the region, server time stops until the root of the error is determined and expunged.
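The two response waves, sketched side by side (counts rounded as the GM prefers; c in m/s, as elsewhere):

    import math

    C = 299_792_458

    def absurdity_wave(total_absurdity):
        # Standard response: sqrt(sum) debuggers carrying sqrt(sum)*10 GB
        # of skill data apiece.
        n = math.sqrt(total_absurdity)
        return n, n * 10  # (debugger count, GB of skill data each)

    def rejection_wave(aware_entities):
        # Reality rejection: one debugger per aware entity, each loaded
        # with (aware entities) * c GB of skill data.
        return aware_entities, aware_entities * C

    # Ten bystanders radiating 100 absurdity apiece: ~31 debuggers with
    # ~316 GB of skills each. The same ten rejecting reality outright spawns
    # ten debuggers with ~3e9 GB each, plus a possible time stop.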
Damage is calculated according to real physics. Memory is lost in proportion to lost body mass (or brain mass, if mental functions are still localized to the brain, or loss of whatever the defined control system is). It requires 1 bG per skill rating possessed by an entity to create and run a trace-and-delete program that permanently expunges that entity; this can be subconsciously resisted.
Analysis- To trace the history of a particle through interactions at its scale, Ripplesense%^n is the chance to trace the history back n reactions, using n^n lines per particle; this may be retried as calculations permit. Tracing to recognize build grist requires some algorithm to differentiate grist from normal matter; typically debuggers do this by checking for matter present from outside a set region of space without inertial data present to transport it (requires n^2 checks at the nth level); if even one check fails, the test returns inconclusive. Typically, non-rejection debuggers check a number of steps back equal to the square root of their Ripplesense rating, rounded up, and will retry until they get a conclusive answer or one second elapses. All other analyses are performed using real physics; internally controlled variables like time are always assumed to be totally accurate.
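Trace depth gets expensive fast; a sketch of the scaling:

    def trace_chance(ripplesense_frac, n):
        # Chance of tracing a particle's history back n reactions.
        return ripplesense_frac ** n

    def trace_lines(n):
        # Line cost per particle for an n-step trace.
        return n ** n

    # At Ripplesense 90%: 3 steps back is a 72.9% chance at 27 lines per
    # particle; 6 steps is ~53% at 46,656 lines per particle.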
So that's it in a nutshell, barring typos. Anything missing?