How does an electricity meter work?
How does a domestic electricity meter work?
An electricity meter measures the total electrical energy consumed in a house, usually in kilowatt-hours (kWh). It computes the time integral of the electrical power (in watts) drawn from the supply network, where power is the product of the line current (in amps) and the line voltage (in volts).
Electricity meters typically consist of two parts:
- a transducer to convert the power into a mechanical or electrical signal, and
- a counter to integrate and display the value of the total energy that has passed through the meter.
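The two parts above map directly onto the energy formula (power = voltage × current, energy = the integral of power over time). Here is a toy numerical sketch of that structure; the class names and the one-sample-per-second scheme are illustrative, not taken from any real meter:

```python
# Toy model of the two-part meter: a "transducer" turns each
# (voltage, current) pair into instantaneous power, and a "counter"
# integrates that power over time into a running kWh total.

class Transducer:
    """Converts a voltage/current sample pair into power in watts."""
    def convert(self, volts, amps):
        return volts * amps

class Counter:
    """Accumulates energy: adds power * dt, converted to kWh."""
    def __init__(self):
        self.kwh = 0.0
    def accumulate(self, watts, dt_seconds):
        self.kwh += watts * dt_seconds / 3.6e6  # 1 kWh = 3.6 million joules

transducer, counter = Transducer(), Counter()
for _ in range(3600):  # a constant 230 V, 10 A load for one hour
    counter.accumulate(transducer.convert(230.0, 10.0), 1.0)
print(round(counter.kwh, 3))  # 2.3
```

A steady 2300 W load for one hour yields 2.3 kWh, which is what the counter reports.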
In the simplest single-phase electronic meters, line current and line voltage are measured as follows:
- line current: amplify the voltage drop produced by the current flowing through a low-value (below 1 ohm) shunt resistor placed in series with the load (all the appliances in the home).
- line voltage: amplify the voltage at the mid-point of a resistive divider connected between the phase conductor and neutral.

Older meters used analog multipliers and integrators to compute kWh. In recent meters, both values are digitized and the multiplication and integration are performed in a micro-controller.
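The arithmetic the micro-controller performs on those two front-end signals can be sketched as follows. The component values (a 1 mΩ shunt, a 990 kΩ / 10 kΩ divider) are hypothetical examples chosen for readability:

```python
# Hypothetical single-phase front end: recover line current from the
# shunt voltage drop (Ohm's law) and line voltage from the divider
# ratio, then form the power product digitally.

SHUNT_OHMS = 0.001        # low-value shunt in series with the load
DIV_TOP    = 990_000.0    # divider resistor from phase to mid-point
DIV_BOTTOM = 10_000.0     # divider resistor from mid-point to neutral

def line_current(shunt_drop_volts):
    # I = V / R across the shunt
    return shunt_drop_volts / SHUNT_OHMS

def line_voltage(midpoint_volts):
    # Undo the divider: V_line = V_mid * (R_top + R_bottom) / R_bottom
    return midpoint_volts * (DIV_TOP + DIV_BOTTOM) / DIV_BOTTOM

# A 10 mV drop across the shunt and 2.3 V at the divider mid-point:
i = line_current(0.010)   # about 10 A
v = line_voltage(2.3)     # about 230 V
print(round(v * i))       # about 2300 W instantaneous power
```

In a real meter these computations run on every ADC sample pair, and the products are summed over time exactly as in the energy integral described above.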
In multi-phase meters (many homes have a 2- or 3-phase electricity supply, providing more power), the voltage and current measurements must be electrically isolated for each phase, and current transformers are used instead of shunts for current sensing.
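The billable total in a multi-phase meter is simply the sum of the per-phase powers, each measured by its own isolated channel. A minimal sketch, with illustrative 230 V / 10 A figures per phase:

```python
# Illustrative multi-phase total: each phase is measured independently
# (isolated voltage sensing, current transformers), and the per-phase
# powers are summed before integration into kWh.

def total_power(phases):
    """phases -- list of (line_voltage, line_current) pairs, one per phase."""
    return sum(v * i for v, i in phases)

# Three phases, each delivering 230 V at 10 A:
print(total_power([(230.0, 10.0)] * 3))  # 6900.0 W
```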
Recent single-phase meters use Hall-effect sensors. The Hall effect is a voltage that appears across opposite edges of a thin sheet of material when a magnetic field is applied perpendicular to the sheet while an electric current flows through it. The output voltage, which is proportional to line power, is amplified, digitized and integrated over time to compute chargeable kWh.
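One way the Hall output can end up proportional to power is to derive the sheet current from the line voltage and the perpendicular field from the line current, so their product appears directly in the Hall voltage. A toy sketch of that idea; the gain and coupling constants below are made-up illustrative values, not real sensor parameters:

```python
# Toy Hall-effect power sensor: V_hall = k * I_sheet * B, with the sheet
# current derived from line voltage and the field from line current, so
# the output tracks V * I (instantaneous power). All constants are
# illustrative assumptions.

K_HALL      = 1e-3   # sensor gain, volts per (amp * tesla)
AMPS_PER_V  = 1e-4   # sheet current drawn per volt of line voltage
TESLA_PER_A = 1e-5   # field produced per amp of line current

def hall_voltage(line_volts, line_amps):
    i_sheet = AMPS_PER_V * line_volts
    b_field = TESLA_PER_A * line_amps
    return K_HALL * i_sheet * b_field   # proportional to V * I = power

# Doubling either voltage or current roughly doubles the output:
base = hall_voltage(230.0, 10.0)
print(hall_voltage(230.0, 20.0) / base)  # approximately 2.0
```

Because the output already represents power, the rest of the meter only has to amplify, digitize and integrate it, as the answer describes.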