To size a generator, you must figure out how much power you'll need. You do this by adding up watts.
A watt is a unit of electrical power: the rate at which an appliance uses energy while starting up and running. For example, a 60-watt light bulb requires 60 running watts of electricity to light up your room.
Appliances with a motor, however, have a minimum wattage to run and a higher minimum wattage to start. Let's look more closely at the difference between starting watts and running watts.
1. Running Watts
Running (or rated) watts are the amount of watts your appliance needs to keep it running. For example, a refrigerator typically needs 500 watts to run.
2. Surge Watts
Surge (or start-up) watts are the amount of watts your appliance needs to start its motor. For example, it can take up to 2,000 watts (or 2 kilowatts) just to get the same refrigerator's motor and compressor started.
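Using the refrigerator figures above purely as an illustration, a quick sketch shows the gap between the two numbers, which is the extra power a generator must supply only for the brief moment the motor starts:

```python
# Refrigerator figures from the text: 500 W to run, up to 2,000 W to start.
running_watts = 500
surge_watts = 2000

# Extra power needed only during the start-up surge, not continuously.
extra_surge_watts = surge_watts - running_watts
print(extra_surge_watts)  # 1500
```

Real appliances vary, so always take both numbers from your own appliance's data plate or manual.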
How to Calculate Wattage
There are three simple ways to determine the wattage of your appliances:
1. Data Plate
The easiest way is to just look at the data plate on the back of your appliance. It will tell you how many watts, amps and volts are required to power the appliance.
2. Wattage Meter
You can also use a wattage meter to measure the EXACT amount of power. Simply plug the appliance into the wattage meter. Then plug the wattage meter into the wall to get an accurate measurement.
3. Multiply Amps by Volts
If your appliance doesn't list watts but does list volts and amps, you can multiply the two:
Watts = Amps x Volts
It's a more roundabout method, but it works when the data plate doesn't give you watts directly.
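The formula above is a single multiplication. As a sketch, with an assumed 5-amp appliance on a standard 120-volt household circuit:

```python
def watts(amps, volts):
    """Power in watts from current (amps) and voltage (volts)."""
    return amps * volts

# Hypothetical example: a 5 A appliance on a 120 V circuit.
print(watts(5, 120))  # 600
```

Plug in the amp and volt values printed on your own appliance's data plate.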
Once you've calculated the wattage of your appliances, you can pick a generator that can power them all. Make sure the generator can handle not just the combined running watts, but also the starting watts of your motorized appliances.
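One common sizing convention is to total the running watts of everything you'll run at once, then add the single largest start-up surge on top, since motors rarely all start in the same instant. A minimal sketch with hypothetical appliance figures (the refrigerator numbers come from the text; the others are illustrative):

```python
# Hypothetical appliance list: (name, running watts, surge watts).
# A light bulb has no motor, so its surge equals its running watts.
appliances = [
    ("refrigerator", 500, 2000),
    ("light bulb", 60, 60),
    ("sump pump", 800, 1300),
]

total_running = sum(run for _, run, _ in appliances)

# Assume only one motor surges at a time, so add just the largest
# extra surge (surge minus running) on top of the running total.
largest_extra_surge = max(surge - run for _, run, surge in appliances)
minimum_generator_watts = total_running + largest_extra_surge

print(total_running)            # 1360
print(minimum_generator_watts)  # 2860
```

In this sketch you'd look for a generator rated for at least 1,360 running watts and 2,860 starting watts; your own totals will differ with your appliances.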