When transmitting electricity over a long distance, a step-up transformer raises the voltage to 6 kV, and the line carries a rated power of 1000 kW. The readings of the electricity meters installed at the transformer substation and at the receiving point diverge by 216 kWh per day. By what factor must the voltage be increased so that power losses during transmission do not exceed 0.1%?
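The posted solution itself is an image and is not reproduced here, so the following is a sketch of the standard calculation (variable names are mine): the daily meter difference gives the loss power, and since line losses scale as P_loss = (P/U)²·R, at fixed transmitted power the losses fall as 1/U².

```python
import math

P = 1000e3           # transmitted power, W (1000 kW)
E_loss_daily = 216   # difference in meter readings per day, kWh
hours = 24

# Loss power dissipated in the line, and the relative loss
P_loss = E_loss_daily / hours * 1e3   # 216 kWh / 24 h = 9 kW = 9000 W
rel_loss = P_loss / P                 # 9 kW / 1000 kW = 0.009, i.e. 0.9%

target = 0.001  # required relative loss, 0.1%

# Losses are proportional to 1/U^2, so the voltage must grow by
# the square root of the loss-reduction factor
n = math.sqrt(rel_loss / target)
print(n)  # 3.0 — the voltage must be tripled (6 kV -> 18 kV)
```

So the losses must be cut by a factor of 9, which requires raising the voltage 3 times, from 6 kV to 18 kV.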
Detailed solution: provided as a GIF image (not reproduced here).