MyFriendScott
01-10-2012, 05:23 PM
Read some tech at http://www.madelectrical.com/electricaltech/onewire-threewire.shtml today regarding the differences between 1 and 3-wire alternators. It was definitely a good read, but also clued me in to how hooking up my amp's power source directly to the battery might have been a mistake.
A quick back-story: Last weekend I was driving the car at night with the headlights and stereo on, actually the first time doing both at once since installing the new sub and amp. I noticed I could only play the music at about half volume before the sound quality of all the speakers and the sub became very poor. When the amp was installed and tuned, the head unit could be turned up to max without clipping and the gain on all 5 channels was set at appropriate levels...basically I could max the volume on the H/U without distortion or clipping. Now, with the lights on, the amp's performance is very bad, so I started investigating.
I tested resistance from the battery ground to the engine block, to the body (where my amp is grounded), to the subframe, and to the radiator support. Only the radiator support test showed a reading that could use improvement, so I resolved that with a 4 ga wire from the radiator support to the subframe. That bought a small improvement in the volume setting before the sound quality deteriorated, but it didn't fix the problem. I ran out of time today, but next I need to test voltage at the battery and at the amp while a helper turns on the lights and gradually turns up the H/U's volume. As an aside, all of the amp's specifications are rated at 13.8V, so that's my target with the lights on and the volume up.
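Before the voltage tests, it helps to know what drop is even plausible in the wiring itself. Here's a rough sanity check (my own numbers, not from the article): Ohm's law applied to a run of 4 AWG copper, using the standard AWG table value of about 0.2485 ohms per 1000 ft. The wire lengths and the amp's current draw are guesses for illustration.

```python
# Rough sanity check: estimate the voltage drop across a power/ground run
# so I know what a "normal" reading looks like during the voltage tests.
# Assumes 4 AWG copper; lengths and current are hypothetical.

OHMS_PER_1000FT_4AWG = 0.2485  # copper, standard AWG table value

def drop_volts(length_ft: float, current_a: float,
               ohms_per_1000ft: float = OHMS_PER_1000FT_4AWG) -> float:
    """Ohm's law: V = I * R for a single run of wire."""
    resistance = ohms_per_1000ft * length_ft / 1000.0
    return current_a * resistance

# Example: 15 ft power run plus 3 ft ground run, amp pulling 40 A
total = drop_volts(15 + 3, 40)
print(f"Expected drop: {total:.2f} V")  # about 0.18 V
```

The point being: healthy cabling should only cost a couple tenths of a volt at full tilt. If I measure a full volt or more of difference between the battery and the amp terminals under load, the cabling or a connection is suspect; if the voltage sags everywhere at once, it's the charging system.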
During restoration, the '67 Camaro received a new wiring harness and was upgraded to a 3-wire, 94 amp alternator, which adds a voltage regulator sensing circuit. In very basic terms, this lets the alternator adjust its output voltage based on the voltage drop at the main distribution block caused by the attached accessories. After finishing the article linked above, I see that by design the accessories should be attached to that distribution block, where the voltage sensing wire is also attached, and not directly to the battery. Here's the basic idea, borrowed from MadElectrical.
[Attachment 53822]
It's common practice to run a huge power lead directly to the battery when installing an amp, but that now seems to conflict with how the electrical system is designed to support accessories when using a 3-wire alternator. Here's where my understanding is now: if the amp's power lead is connected directly to the battery, then the battery, not the alternator, is responsible for providing the necessary voltage. But it's the alternator's job to provide the necessary voltage for the accessories (while also charging the battery). It makes sense to me that the battery, while able to supply a lot of amperage, acts as a buffer that masks the amp's demand from the alternator's sense circuit (the voltage drop the sense wire would see if the amp were attached at the junction). If the battery begins to drop below 13.8V (and thus the amp's lead wire drops below 13.8V), the alternator only sees a very slightly discharged battery and won't raise its output enough to bring the amp's supply back up to 13.8V...right? Hooking the amp up to the junction instead seems like the logical way to maintain a constant voltage at the amp (as long as the alternator is working correctly and can handle the demand).
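To convince myself of the reasoning above, here's a toy model (my own simplification, not anything from the MadElectrical article): a regulator holds its sense point at a fixed setpoint, so only drops that lie *between* the alternator output and the sense point get compensated. The 14.2V setpoint and the 0.5V drop are assumed numbers for illustration.

```python
# Toy model of remote voltage sensing: the regulator raises alternator
# output until the SENSE point reads the setpoint. Numbers are assumed.

SETPOINT = 14.2  # hypothetical regulator target at the sense point

def alternator_output(setpoint: float, drop_output_to_sense: float) -> float:
    # Output must exceed the setpoint by exactly the drop between the
    # output stud and wherever the sense wire is connected.
    return setpoint + drop_output_to_sense

# Case 1: sense wire AND amp both at the distribution block.
# A 0.5 V drop from alternator to block is inside the sense loop,
# so the regulator compensates and the amp still sees the setpoint.
out = alternator_output(SETPOINT, 0.5)
amp_voltage_at_block = out - 0.5  # back to 14.2 V at the block

# Case 2: amp fed straight off the battery. Any sag along the battery
# cable (and in the battery itself under load) is OUTSIDE the sense
# loop; the regulator sees only a slightly low battery and never
# boosts output to cover the amp's demand.
```

This matches my read of the article: the junction connection puts the amp inside the compensated loop, while the direct battery connection leaves it outside.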