And that is the beef of the physics community. The currently understood rules say "Hey bro-- that thing will produce no net thrust. At most, you will get some parts of the chamber to heat up, and that's about it!"
However, experimental data is saying "Whoa, WTF? There's net thrust? OMG!"
Physics community counters with "Did you calibrate your testing equipment, bro? That shit can't happen. Check your shit."
The experimental researchers counter with "Hey, we calibrated, and we altered our experiment to better address some of your raised issues, but hey-- the thrust is still there!"
Physics community again counters with "Dude, you are clearly doing something wrong here."
Experimental researchers say "If we are doing something wrong, then all of us (many teams, different places) are doing the same thing wrong. At some point Occam's razor points the other way, bro."
Physics community counters with "Prove it-- Give us good quality data to chew on."
Experimental researchers say "Give us the funding bro, and we will!"
And that's where we are currently at.
The issue with producing the good quality data that the physics community wants is that there are all kinds of costs involved.
1) Cost to career. After the cold fusion hysteria of the 90s, investing any kind of academic weight in a project of this type (one that goes against conventional wisdom) can turn a bright career path into a lonely desert of blacklisting by academia as a charlatan, even if that label isn't quite accurate.
2) Cost of equipment. High accuracy test instrumentation isn't exactly COTS hardware, you know. It costs some major moola. Usually this kind of equipment has to be shared between researchers in a lab setting, which means that while one group is using it, another is waiting in line. Research deemed "more valuable" by lab managers is going to get priority on equipment use-- such testing is likely to be scheduled for at most a few hours on a weekend, and must be dismantled and put back into the equipment locker in tested, working order, ready for "real science" before the next team's scheduled use.
3) Cost of power to operate the test, and/or utility costs for the facility. While the lab may have a utility power connection more than capable of supplying such an experiment, you need permission from the powers that be that run the lab to conduct it, because SOMEBODY has to pay for the electricity you use, amongst other things.
4) Difficulty of getting this kind of research into peer reviewed journals. The big publishers run the more trusted journals (which is BS, but hey-- just saying the truth here), and they have some pretty unscientific means of weeding out what they consider "bad science". This means that findings from research like this are likely to be stuck in places like arXiv and PLOS ONE instead of in prestigious journals owned by Elsevier and pals. That means the "peer reviewed"-ness of the research is going to count for less, unless a more tenured research group sees the initial research in the open journal, replicates it, publishes their own findings, pays the Elsevier tax, and clears the "you must be this big to publish" hurdle.
One can sort-of get around some of these problems with lots of little, low quality tests, which force the Occam's razor issue to turn around-- massive replication of the experiment using COTS equipment, like the protocol I cited above. If you get enough small school and university physics departments to test the device, and get epic shittons of data, it becomes harder and harder to attribute the thrust measurements to experimental error, and forces the phenomenon to be more rigorously tested by more prestigious institutes with better equipment.
Statistics rules when datasets are huge-- even when the data is noisy, like you would expect with low quality experiments. Get enough experimental data, and you can tease even a weak signal out of it, if there is a signal there to tease out.
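To put rough numbers on that, here's a quick toy simulation-- every value in it is made up purely for illustration. A weak "thrust" signal is buried in noise 50x its size, so any single cheap test resolves nothing, but averaging a hundred thousand of them shrinks the uncertainty by a factor of sqrt(N):

```python
import random
import statistics

random.seed(42)

TRUE_THRUST = 1.0      # hypothetical weak signal (arbitrary units)
NOISE_SD    = 50.0     # cheap-equipment noise, 50x the signal
N_RUNS      = 100_000  # lots of small, low-quality tests

# each "test" is the true thrust buried in big gaussian noise
measurements = [random.gauss(TRUE_THRUST, NOISE_SD) for _ in range(N_RUNS)]

mean = statistics.fmean(measurements)
# standard error of the mean shrinks as 1/sqrt(N)
sem = statistics.stdev(measurements) / N_RUNS ** 0.5

print(f"estimated thrust: {mean:.3f} +/- {sem:.3f}")
```

With these numbers a single test has a +/- of 50, hopeless, but the pooled estimate's +/- drops to about 0.16, so a real signal of 1.0 stands well clear of the noise floor-- and if there is no signal, the pooled mean collapses toward zero instead.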
As for the currently proposed mechanism of action:
"Empty space" is not "Empty". It is a boiling, bubbling cauldron of energy fluctuations, which have measurable properties. These fluctuations are, amusingly enough, called "Vacuum fluctuations", or
"Quantum fluctuations."Essentially, tiny bits of spacetime congeal for a teensy weensie bit of time, and behave kinda-sorta like actual particles during that time, before disappearing again. They can exhibit charge terms, mass terms, spin terms, and basically any kind of particle term you can throw at them, and they come in all kinds of flavors.
It is the ones exhibiting charge and mass terms that are of interest here. To confer momentum, you need a mass term. Because we are dealing with electromagnetic energy, charge-term-bearing fluctuations can also become involved. With the mass-term-bearing kind, we literally confer a tiny amount of kinetic energy to such a fluctuation, and get a net change in the momentum of the donating object, per Newton's second law of motion. The fluctuation is very short lived, and vanishes before it reaches the other side of the chamber. With the charge-bearing type, we have particles that are either attracted or repelled by electromagnetic charges, exerting pressure on the walls of the resonator via virtual photon exchange.
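The momentum bookkeeping in the mass-term case is just F = dp/dt. Here's a toy sketch of it-- every number below is invented purely for illustration, since nobody has measured effective fluctuation masses or interaction rates:

```python
# toy momentum-budget sketch -- all values are hypothetical placeholders
M_FLUCT = 1e-36   # pretend effective mass per fluctuation (kg)
DELTA_V = 1e3     # pretend velocity kick imparted to it (m/s)
RATE    = 1e23    # pretend fluctuations kicked per second

# Newton's second law in momentum form: F = dp/dt.
# each kick transfers p = m * dv, so the net force is rate * m * dv,
# and the resonator feels the equal-and-opposite reaction.
force_newtons = RATE * M_FLUCT * DELTA_V

print(f"net thrust: {force_newtons:.2e} N")
```

The point isn't the specific output (these placeholders give 1e-10 N); it's that an enormous rate of vanishingly small momentum kicks can still sum to a measurable force, the same way photon pressure does.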
In both cases, a small force can be imparted by having some sort of asymmetry in the design of the resonator. The question is-- is that really possible or not? (and if so, where does the energy imparted to the fluctuations go when they return to being spacetime?)
One idea I had was that the energy imparted to them does not "go away", but instead contributes to the probability that adjacent regions of spacetime will produce another fluctuation, thus increasing the fluctuation density of that region of space in proportion to the added energy.
That would seem to comply with the interferometry data, as the increased fluctuation density would have a measurable effect on virtual photon interactions necessary for EM wave propagation, which is what a laser interferometer would measure. Needs more concrete data, but seems plausible. Causing such localized variations in vacuum energy density would have some pretty profound implications. It would make quite a few physicists shit themselves.