Incorrect premise.
Wattage is a measure of how much power a device uses; combine it with time and you get energy, measured in watt-hours.
Power is indeed defined as V * A. This means that, for a given wattage, if you increase the voltage, the amperage drops.
E.g., 1000W can be satisfied with 1000A at 1V, or 1A at 1000V, or even 0.1A at 10,000V.
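A quick sketch of that relationship in Python (the numbers just mirror the example above):

```python
# Power (W) = Voltage (V) * Current (A), so current = power / voltage.
def amps_needed(watts: float, volts: float) -> float:
    return watts / volts

# The same 1000W load at different supply voltages:
for volts in (1, 1000, 10_000):
    print(f"{volts:>6} V -> {amps_needed(1000, volts):g} A")
# Output:
#      1 V -> 1000 A
#   1000 V -> 1 A
#  10000 V -> 0.1 A
```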
A device plugged into a higher-voltage power system needs to be able to handle that increased voltage (while still delivering its expected wattage); otherwise components will degrade, sometimes explosively, or in a fiery mess.
Devices designed to be dual voltage just need a simple outlet-shape adapter, which is basically some straight-through conductors.
Devices that are NOT designed to be dual voltage require a big, bulky transformer to step the voltage up or down (trading volts for amps, or vice versa) so the device gets the voltage it expects. That transformer also has to be beefy enough to supply the combined amp draw, i.e. enough wattage to cover both its own losses and the connected device. Big transformers tend to be noisy and hot.
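As a rough sketch of what "beefy enough" means, here's the arithmetic with an assumed (not measured) transformer efficiency:

```python
# Rough transformer-sizing sketch. The 85% efficiency figure is an
# assumption for illustration; real transformers vary.
def transformer_input_watts(device_watts: float, efficiency: float = 0.85) -> float:
    # The transformer must pull enough power from the wall to cover
    # its own losses plus what the device actually needs.
    return device_watts / efficiency

device_watts = 200   # hypothetical appliance rating
mains_volts = 110    # North American mains
wall_watts = transformer_input_watts(device_watts)
print(f"Draw from the wall: {wall_watts:.0f} W "
      f"({wall_watts / mains_volts:.1f} A at {mains_volts} V)")
# Draw from the wall: 235 W (2.1 A at 110 V)
```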
Take, for instance, my fancy electronic knitting machine. It was designed for use in Britain, so it is a 230V device by design. However, it is electrically capable of operating on 110V with the flip of a little switch on the back. If I flip that switch, it will draw more amps (about twice as many), and I can plug it in with a simple travel adapter.
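To put rough numbers on "about twice as many amps" (the 200W rating here is an assumption for illustration, not the machine's actual spec):

```python
# Same wattage, two different supply voltages: the current roughly
# doubles when the voltage is roughly halved.
machine_watts = 200  # assumed rating for illustration

for volts in (230, 110):
    print(f"At {volts} V: {machine_watts / volts:.2f} A")
# At 230 V: 0.87 A
# At 110 V: 1.82 A
```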
What the little Indian woman essentially wanted to know is:
"If I plug this in, will it explode horribly?"
Eg, "Is this device capable of operating on 110v AC instead of [alternative mains voltage]?" OR possibly "Is this device designed to run on 110v? If so, is it dual voltage with [Much higher voltage local mains power] without exploding horribly?"