Why is the Rating of Transformers Given in kVA and Not in kW
7/31/2019 Why is the Rating of Transformers Given in kVA and Not in kW
aligns with the voltage or not. Therefore the heat is always proportional to the
square of the current amplitude, irrespective of the phase angle (the shift
between voltage and current). So a transformer has to be rated (and selected) by
apparent power. It is often helpful to think of an extreme example: imagine a use
case where the one and only load is a static var compensator (and such cases
do exist). Would the load then be zero because the active power is zero? Most
certainly not. Caution: in this situation the voltage across the output terminals will increase with load rather than drop!
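The point that winding heat depends only on current magnitude, while useful output depends on the power factor, can be illustrated with a small sketch. All numbers here (voltage, current, winding resistance) are illustrative assumptions, not values from the text:

```python
import math

V = 400.0   # line-to-line voltage in volts (illustrative)
I = 100.0   # load current in amperes (illustrative)
R = 0.05    # winding resistance per phase in ohms (illustrative)

for power_factor in (1.0, 0.8, 0.0):
    s = math.sqrt(3) * V * I / 1000.0   # apparent power in kVA
    p = s * power_factor                # active power in kW
    copper_loss = 3 * I**2 * R          # heat in the windings, watts
    print(f"pf={power_factor:>4}: S={s:6.1f} kVA, P={p:6.1f} kW, "
          f"copper loss={copper_loss:.0f} W")
```

The apparent power and the copper loss come out identical in all three cases, while the active power ranges from full output down to zero, which is exactly why the nameplate rating must be in kVA: at power factor zero (the var-compensator case) the transformer delivers no kW at all, yet heats up just as much.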
Supplement:
Special care has to be taken if the load current of a transformer includes any
higher frequencies such as harmonics. Then the transformer may even overheat
although the TRMS load current, measured correctly with a TRMS meter, does not
exceed the current rating!
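The TRMS value of a distorted current is the square root of the sum of the squares of its harmonic components, so a current can stay within its TRMS rating while carrying substantial harmonic content. A small sketch with an assumed, purely illustrative harmonic spectrum:

```python
import math

# RMS amperes per harmonic order (illustrative values, not from the text):
harmonics = {1: 95.0, 5: 25.0, 7: 15.0, 11: 8.0}

# True RMS is the root of the sum of squares of all components.
i_trms = math.sqrt(sum(i**2 for i in harmonics.values()))

rating = 100.0  # rated current in amperes (illustrative)
print(f"TRMS current: {i_trms:.1f} A (rating {rating:.0f} A)")
print("within rating" if i_trms <= rating else "over rating")
```

Here the meter reads under 100 A, so the current is formally within rating, yet roughly a third of it consists of 5th and higher harmonics, which is exactly the situation the supplement warns about.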
Why is this? It is because the copper loss includes a share of about 5% to 10% of
so-called supplementary losses. These arise from eddy currents in mechanical,
electrically conductive parts made of ferromagnetic materials, and especially in the
low-voltage windings with their large cross sections. The magnetic stray fields
originating from the imperfect magnetic coupling between the HV and LV windings
(the main stray channel) induce something that could be called an eddy voltage inside
the conductors, which drives an eddy current flowing in a loop across the
conductor, perpendicular to the main load current. The amplitude of this eddy
voltage is proportional to the rate of change of the magnetic field strength, which
in turn is proportional to both the amplitude
and the frequency of the current. So the eddy current increases in proportion to
the load current and in proportion to the operating frequency, since the only limit on
the eddy current is Ohm's law. The supplementary power loss caused by the eddy
current is eddy current times eddy voltage.
Hence, the supplementary losses increase with the square of the load current, which
excites the magnetic stray field, and with the square of the frequency, while the
main copper loss increases only with the square of the load current amplitude.
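This double-quadratic scaling can be sketched in per-unit terms. The model below is a simplification of the argument above; the 10% supplementary-loss share at rated current and frequency is taken from the upper end of the text's 5% to 10% range:

```python
# Per-unit sketch of the loss scaling described above (simplified model):
#   main copper loss     ~ I^2
#   supplementary loss   ~ I^2 * f^2  (eddy voltage ~ I*f, eddy current ~ I*f,
#                                      loss = eddy current * eddy voltage)
def winding_loss(i_pu: float, f_pu: float, k_supp: float = 0.10) -> float:
    """Total winding loss in per-unit of rated main copper loss."""
    main = i_pu ** 2
    supplementary = k_supp * (i_pu * f_pu) ** 2
    return main + supplementary

# Rated sinusoidal current at rated frequency:
print(winding_loss(1.0, 1.0))   # 1.1 p.u.
# Same current amplitude, but at 5x rated frequency (e.g. a pure 5th harmonic):
print(winding_loss(1.0, 5.0))   # 3.5 p.u. -- much hotter at identical amperes
```

The second case is deliberately extreme (the whole current at one harmonic), but it shows why equal TRMS amperes do not mean equal heat once the frequency content changes.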
Therefore the transformer runs hotter when the load current has the same
amplitude but is superimposed with higher-frequency constituents above the rated
frequency. This additional heat loss is difficult to quantify, especially as the
transformer's stray reactance limits the passage of higher-frequency currents to
some extent, but in an extreme case it may drive the supplementary loss up
from 10% to 80% of the copper loss. This means that the transformer may run
some 70% hotter (in temperature rise above ambient) than specified for rated
(sinusoidal) current. Since the ohmic heat loss, however, depends on the square of
the current, it is enough to limit the load current to some 65% of its rating to
avoid overheating.
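The closing arithmetic can be checked with a per-unit sketch. It assumes, as the text's extreme case does, supplementary losses of 10% of the copper loss at rated sinusoidal current, rising to 80% under the distorted load, and takes temperature rise as roughly proportional to total winding loss:

```python
# Per-unit check of the derating argument (simplified model).
# Total winding loss relative to the main copper loss at rated current:
rated_loss = 1.0 + 0.10      # rated sinusoidal current, 10% supplementary loss
distorted_loss = 1.0 + 0.80  # same TRMS current with heavy harmonic content

# Temperature rise above ambient scales roughly with total loss:
rise_ratio = distorted_loss / rated_loss
print(f"temperature rise: {rise_ratio:.2f}x rated")  # 1.64x rated

# Derate the current to a fraction k of rating; both loss terms scale with k^2:
k = 0.65
derated_loss = distorted_loss * k ** 2
print(f"loss at {k:.0%} current: {derated_loss:.2f} p.u. "
      f"(design loss at rating: {rated_loss:.2f} p.u.)")  # 0.76 vs. 1.10
```

The simplified model lands near the text's "some 70% hotter" figure, and it confirms that 65% loading is sufficient: at that current the total loss (about 0.76 p.u.) falls safely below the 1.10 p.u. the transformer was designed to dissipate.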