Huge wires.


  #1  
Old 03-15-17, 01:38 PM
Member
Thread Starter
Join Date: Oct 2014
Posts: 516
Huge wires.

So as some of you know, I live overseas in the land of 220 V electricity, but I grew up in the US, where my parents still live.

Since my parents know I'm somewhat handy, they asked me for some help with their panel, which thankfully I was able to provide, in part thanks to what I've learned lurking here.

What struck me about their situation, and what often jumps out at me in posts here, is the enormity of the feeder wires going into a panel.

Intellectually I get that P = V × A means they're going to be bigger. But it's still amazing to see the pics of the #3 wires in their panel and to think that at my house, where we probably use about the same amount of electricity as they do, these are the "feeder" wires going to my main disconnect.
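To put rough numbers on that, here's a quick Python sketch (the 12 kW load is made up, purely for illustration):

[code]
# Back-of-envelope: the same power drawn at two service voltages.
# Illustrative only; real wire sizing follows code tables, not this math.
power_w = 12_000  # hypothetical 12 kW whole-house load

for volts in (120, 240):
    amps = power_w / volts  # P = V * I, so I = P / V
    print(f"{power_w} W at {volts} V -> {amps:.0f} A")

# 12000 W at 120 V -> 100 A
# 12000 W at 240 V -> 50 A
# Double the voltage, half the current, so thinner conductors can carry it.
[/code]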



[Photo: the main fused disconnect]

Those are 6 mm², or in other words between 9 and 10 AWG.

And it made me wonder just how much $$$ that extra 100-120 V saves a country. Not that any country could change at this point, considering the enormous costs a change would imply. But I wonder: in an alternate universe with, say, a 350 V residential service in the USA, how many fewer tons of copper and aluminum? How many now-unnecessary last-mile transformers on utility poles? How much saved in steel and PVC conduit?
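On the copper question, here's a rough sketch: if you hold the delivered power and the allowed percentage voltage drop constant, the required conductor cross-section scales as 1/V². The load, run length, and drop allowance below are all made up, and real sizing also has to respect ampacity and insulation ratings:

[code]
# Rough sketch: conductor cross-section needed to hold a fixed % voltage
# drop at a fixed power scales as 1/V^2. Ignores ampacity, insulation,
# and every other real-world constraint. All inputs are made up.
RHO_CU = 1.68e-8   # resistivity of copper, ohm*m
power_w = 12_000   # hypothetical 12 kW load
length_m = 30      # one-way run; doubled below for the round trip
drop_frac = 0.03   # allow 3% voltage drop

for volts in (120, 240, 350):
    amps = power_w / volts
    r_max = (drop_frac * volts) / amps          # max loop resistance
    area_mm2 = RHO_CU * 2 * length_m / r_max * 1e6
    print(f"{volts:3d} V -> {area_mm2:6.1f} mm^2 of copper")
[/code]

By that crude math, the imaginary 350 V service needs roughly an eighth of the copper of a 120 V one for the same run.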

Higher voltage is obviously more dangerous, but I've been shocked by 220 and by 120 and couldn't tell the difference. On the other hand, it's clear that people working on high-voltage transmission systems do so with zero margin for error. Where is the ideal safety-efficiency equilibrium? And now, with the surging importance of LEDs and solar panels, would a DC grid make sense in this imaginary world?

Just some rainy day musings....
 
  #2  
Old 03-15-17, 05:41 PM
Tolyn Ironhand's Avatar
Group Moderator
Join Date: Nov 2007
Location: USA
Posts: 11,890
One thing you are missing is that we do not pay for electricity by volts or amps. It is by watts, or kilowatt-hours to be exact. The wire sizes we use have been determined by the National Electrical Code (NEC), which puts forth a set of rules for safe installations in the field. The power companies are not required to follow the NEC rules, and their engineers are the ones who determine their wire sizes. A good example: we have to install 2/0 copper wires for a 200 amp residential service, while the power company will typically run about #6 aluminum triplex for their overhead wires.
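A quick sketch of the kilowatt-hour point (the rate is made up):

[code]
# We bill energy (kWh), not volts or amps: the same heater costs the
# same to run on either voltage. The rate below is made up.
rate_per_kwh = 0.13     # hypothetical $/kWh
heater_w = 1500
hours = 4

for volts in (120, 240):
    amps = heater_w / volts
    kwh = heater_w / 1000 * hours
    print(f"{volts} V: draws {amps:.2f} A, uses {kwh} kWh, "
          f"costs ${kwh * rate_per_kwh:.2f}")
# The current differs; the bill does not.
[/code]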

The utility voltage before the final transformer is typically about 7200 volts. Running at 347 V (a 347/600 volt system) would still require a transformer.
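For scale, the step-down is just the turns ratio of an ideal transformer (nominal voltages only; real units have taps and losses):

[code]
# Ideal transformer: V_secondary = V_primary * (N_secondary / N_primary).
# Nominal voltages, losses and taps ignored.
primary_v = 7200
for secondary_v in (600, 240):
    ratio = primary_v / secondary_v
    print(f"{primary_v} V -> {secondary_v} V takes about a {ratio:.0f}:1 turns ratio")
[/code]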

Wire size is based on amps. If you had a 30 amp circuit at 120 volts, it would require #10 wire. If it was 30 amps at 240 volts, it would require #10 wire. 24 volts @ 30 amps? #10. 600 volts @ 30 amps? Yup, #10. Voltage has no effect on wire size, other than that for the same power you can run smaller wire, because as voltage goes up, current goes down.
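A toy lookup makes the point, since the table is indexed by amps alone. The ampacities below are simplified illustrations, not the actual NEC tables, which add temperature columns and derating:

[code]
# Simplified copper ampacity table, smallest wire first. Illustrative
# only; the real NEC tables have temperature ratings and correction factors.
AMPACITY_AWG = [(15, "#14"), (20, "#12"), (30, "#10"), (40, "#8"), (55, "#6")]

def wire_for(amps):
    """Pick the smallest listed wire whose ampacity covers the load."""
    for ampacity, awg in AMPACITY_AWG:
        if amps <= ampacity:
            return awg
    raise ValueError("load exceeds this toy table")

# Voltage never enters the lookup: 30 A is #10 at 24, 120, 240, or 600 V.
for volts in (24, 120, 240, 600):
    print(f"30 A at {volts} V -> {wire_for(30)}")
[/code]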

I think the big reason why we use 120/240 volts is that as the voltage gets higher, tolerances get tighter. What I mean is, when I am fixing things in the field and there is a fault, at 120 volts there is a small flash and a pop. With 480 volts it is a much bigger flash, along with smoke and a BOOM! Also, the equipment we install needs to be more robust to handle the higher voltages.

Looking at your picture (though I am not 100% sure what I am looking at), it appears to be a fuse panel. It looks like there are some short pigtails coming off a bus in the middle (what we would call a tap) to a fuse holder, then off the fuse holder to the branch circuit. I suspect that fuse holder can only take a 30 amp fuse, and what size wire did you say that was? Yeah... #10.
 
  #3  
Old 03-29-17, 08:36 AM
Member
Thread Starter
Join Date: Oct 2014
Posts: 516
Forgot about this thread. It was really just idle musing on voltage. It seems like half the world is on 110-120 and half on 220-230. I remember hearing somewhere that the 110 came from the ideal voltage for the original Edison lamps, but maybe that's just an old wives' tale.

I was thinking: if you were designing a grid from scratch, with no legacy infrastructure or appliances, would you do 110, 220, or something higher like 400? At what point does it become significantly more dangerous? Would you stick with AC, or does the advent of solar, LEDs, and battery storage lend itself to a DC network?

As to the picture: that's the main (fused) disconnect for my entire service, which covers 2800 sq ft of lights, a couple of fridges and freezers, a 5-ton AC, assorted pumps and motors, washing machines, microwaves, etc. More or less the standard stuff you'd see at any suburban home. Which is why it was kind of jarring to see just how fat the cables feeding my parents' panel were, although I do get the math of it. In fact I'm pretty sure...
 
  #4  
Old 03-29-17, 09:51 PM
Member
Join Date: Mar 2006
Location: Wet side of Washington state.
Posts: 18,349
To the best of my knowledge, the figure of 110 volts was Edison's original generating voltage, and the lamps were designed to use 100 volts. Due to the resistance of the distribution wiring, the voltage near the power plant was closer to 110 and the voltage at the end of the distribution was about 100. This, of course, meant that the farther from the power plant you were located, the longer your light bulbs would last, but also the dimmer they would shine. Eventually Edison came up with the dual-winding generator that used three conductors to output two distinct voltages: 220 between the two windings and 110 between either winding and their combined mid-point. This is where the 220-240/110-120 volt system originated. Using the higher voltage allowed less of a voltage drop across the entire system for any particular length of the distribution conductors. The nominal voltages have risen over the decades simply to reduce voltage drop while maintaining a fairly safe system.

My feeling is that the AC system is better than DC for the primary reason that voltages can be so easily raised or lowered with simple transformers. It is true that the highest-voltage transmission lines (in excess of 500,000 volts) ARE DC, but they require conversion from/to AC at either end. It wasn't until the advent of relatively inexpensive solid-state converters that this became practical, and even then only for HUGE transfers of power over long distances. Not until the cost of batteries and solar generators becomes low enough to be on a par with AC systems will small-scale DC be practical. Plus, AC motors are far superior to DC motors in most applications. Even with variable-speed requirements, the last holdout of the DC motor, using variable-frequency controls with AC motors is generally less costly than using DC motors.
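A rough sketch of why easy voltage conversion matters so much: line loss goes as I²R, so moving the same power at a higher voltage cuts the loss with the square of the current. The line resistance below is made up:

[code]
# Line loss is I^2 * R. Moving the same power at a higher voltage cuts
# the current, and the loss falls with its square. Resistance is made up.
power_w = 100_000      # 100 kW to deliver
line_r = 0.5           # hypothetical line resistance, ohms

for volts in (240, 7200, 115_000):
    amps = power_w / volts
    loss_w = amps**2 * line_r
    print(f"{volts:>7} V: {amps:8.1f} A, {loss_w:10.1f} W lost in the line")
[/code]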
 