dchan48
Jun 20, 2007, 01:30 AM
A friend of mine (an electrician) conducted IR and hi-pot tests on a short length (approx. 1 km) of 3.3 kV XLPE/SWA/LC/PVC cable. The hi-pot test voltage used was 7.2 kV DC.
The IR of one of the phases dropped from 200 Gohm (before the hi-pot) to 18 Gohm
(after the hi-pot). The IR of the other phases also dropped by around
30-100 Gohm.
The next day he retested the same phase (after cleaning the
terminations with solvent) and the value had recovered to 170+ Gohm.
Are these results reasonable? Why the marked change in IR (measured with a 1 kV Megger tester)?
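
For rough context (this is just my own back-of-envelope arithmetic from the figures above, not part of his test record), the readings translate to leakage currents at the 1 kV megger voltage as follows:

I = V / R
Before hi-pot: 1000 V / 200 Gohm = 5 nA
After hi-pot: 1000 V / 18 Gohm ≈ 56 nA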