Amperage lost between a current transformer and the utility's electric meter?
I work for an Electric utility.
I've noticed that the actual amperage running through a phase does not match the amperage sent to the meter from the current transformer (CT), even after applying its proper multiplier.
Example: phase A is carrying 83 amps. The X1 tap from the current transformer (400:5 ratio = multiplier of 80) to the meter shows 0.8 on my amp check.
If you take the 0.8 and multiply by the 80 multiplier, you have a metered current of only 64 amps for that phase. In addition, I get a 1.0 amp reading when I check the X1 line close to the CT (1.0 x 80 = 80, real close to the actual 83), while the 0.8 mentioned above shows up at the switchgear just under the meter. An "amp loss," if you will.
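For anyone who wants to follow the arithmetic, here is a quick sketch in Python. All the numbers are from my example above; nothing here is measured beyond what I already described:

```python
# Sketch of the CT arithmetic from the example above (all values from the post).
CT_RATIO = 400 / 5          # 400:5 CT -> multiplier of 80
actual_primary = 83.0       # amps actually flowing on phase A

# What the CT secondary *should* read for 83 A on the primary:
expected_secondary = actual_primary / CT_RATIO          # about 1.04 A

# What was actually measured on the X1 line:
reading_near_ct = 1.0       # amps, clamped close to the CT
reading_at_gear = 0.8       # amps, clamped at the switchgear under the meter

# Primary current each reading implies once the multiplier is applied:
indicated_near_ct = reading_near_ct * CT_RATIO          # 80 A
indicated_at_gear = reading_at_gear * CT_RATIO          # 64 A

# Under-registration the meter sees if it gets the switchgear-level current:
loss_pct = (actual_primary - indicated_at_gear) / actual_primary * 100

print(f"expected secondary: {expected_secondary:.2f} A")
print(f"indicated near CT: {indicated_near_ct:.0f} A")
print(f"indicated at switchgear: {indicated_at_gear:.0f} A "
      f"({loss_pct:.1f}% under-registered)")
```

So the meter is only seeing about 77% of the current that's really on the phase, which is the "amp loss" I'm asking about.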
The result is lost revenue to the utility company. I don't believe the problem lies in the windings of the CTs; I've noticed this on all of my checks. If that were the case, and the problem is as prevalent as I think it might be, lawsuits against the manufacturers might ensue.
Well...
Does anyone out there know why this happens? What could be the cause? And the remedy?
Thx. Chafor
PS - I know no one gives a rat's tail if a utility gets "cheated" a bit, perhaps through its own ineptitude, but I would like to get to the bottom of this. Could be a "feather in my cap"!