Hello,
My online research has revealed international standards to calculate steady state ampacity of power cables (e.g. IEC 60287), as well as some software programs (Lineamps) and FEA calculations to predict thermal resistance and ampacity of power cables.
My question is how to experimentally determine the ampacity of a power cable, so that I can be confident I am using a cable gauge that is not going to fail, overheat, or have a reduced lifetime.
I will be using high-voltage power lines (up to 400 V DC / 400 V AC) to conduct between 200 and 400 amperes (DC and AC). The power cables are shielded for EMC.
The cables will provide power to DC/AC inverters, which will in turn be connected to an electric motor.
The reason I want to determine this experimentally is to account for realistic environmental conditions, including heat loss due to conduction, without having to create a complex mathematical model.
My idea is to attach thermal probes to the cables (or use infrared thermometers) while running the inverter/motor assembly under different ambient temperatures and mechanical loads.
Temperature measurements will be taken until a stable temperature is reached, and then a different load or ambient temperature will be set.
This way I will be able to create a table of the cable's insulation temperature at different ambient temperatures and loads.
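To make the plan concrete, here is a rough sketch (in Python) of how I imagine logging each operating point and deciding when the temperature has settled. The Measurement record, the is_steady() criterion, and all the numbers in it are placeholders of my own, not values from any standard:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    ambient_c: float     # ambient temperature [degC]
    current_a: float     # load current (DC or AC rms) [A]
    cable_temp_c: float  # steady-state insulation surface temperature [degC]

def is_steady(readings, window=10, tol_c=0.5):
    """Treat the cable as thermally settled once the last `window`
    probe readings span less than tol_c degC (placeholder criterion)."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    return max(recent) - min(recent) < tol_c

# Made-up probe readings: still heating up vs. settled.
print(is_steady([40.0, 48.0, 53.0, 56.0, 58.0, 59.0, 60.0, 60.5, 60.8, 61.0]))  # False
print(is_steady([61.0, 61.1, 61.0, 61.2, 61.1, 61.0, 61.1, 61.2, 61.1, 61.0]))  # True

# One row of the table I would record per operating point:
table = [Measurement(ambient_c=25.0, current_a=300.0, cable_temp_c=61.1)]
```

The stability window and tolerance above are just guesses on my part; I am open to suggestions on a more rigorous settling criterion.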
Once I have that information, how can I calculate the thermal resistance and the ampacity of the cable?
What information do I need from the cable supplier to determine this ampacity (insulation temperature rating, etc.)?
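For reference, my tentative understanding (please correct me if this is wrong) is that with a simple lumped model the steady-state temperature rise is dT = I^2 * R_ac * R_th, so each row of the table gives an estimate of the thermal resistance R_th, and the ampacity then follows from the insulation temperature rating. A minimal sketch of that calculation, where the conductor resistance and the temperature rating are assumed placeholder values I would need from the supplier's datasheet:

```python
import math

# Placeholder values I would need from the supplier / datasheet:
R_COND_OHM_PER_M = 0.08e-3   # AC conductor resistance at operating temperature [ohm/m] (assumed)
T_RATED_C = 90.0             # insulation temperature rating [degC] (assumed)

def thermal_resistance(cable_temp_c, ambient_c, current_a, r_ohm_per_m=R_COND_OHM_PER_M):
    """Estimate thermal resistance per metre [K*m/W] from one steady-state
    measurement, assuming all heat comes from I^2*R losses in the conductor."""
    losses_w_per_m = current_a ** 2 * r_ohm_per_m
    return (cable_temp_c - ambient_c) / losses_w_per_m

def ampacity(ambient_c, r_th_km_per_w, t_rated_c=T_RATED_C, r_ohm_per_m=R_COND_OHM_PER_M):
    """Largest current that keeps the insulation at or below its rating,
    using the same lumped model: T_rated = T_ambient + I^2 * R * R_th."""
    return math.sqrt((t_rated_c - ambient_c) / (r_ohm_per_m * r_th_km_per_w))

# Example with one made-up measurement row: 61.1 degC at 300 A in a 25 degC ambient.
r_th = thermal_resistance(cable_temp_c=61.1, ambient_c=25.0, current_a=300.0)
print(f"R_th ~ {r_th:.2f} K*m/W")
print(f"Ampacity at 40 degC ambient ~ {ampacity(40.0, r_th):.0f} A")
```

I realize this neglects things like shield/sheath losses and the temperature dependence of the conductor resistance, which is partly why I am asking what data I should request from the supplier.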
Thank you very much in advance for your suggestions.