How does a transistor amplify an AC signal?
Normally, for DC sources, a transistor amplifies the signal without any extra biasing, i.e. without introducing additional resistances. But if we have to amplify an AC signal, can it be done without any extra biasing, or is biasing compulsory? I mean, can a transistor amplify an AC signal directly, just like a DC signal? Please explain in detail, and give references if possible.
A transistor can't amplify anything, AC or DC, unless it is biased correctly.
So you need a bias circuit: typically a voltage divider to drive the base, an emitter resistor, and a collector resistor. The emitter resistor can be bypassed with a capacitor to increase gain.
See the reference, in particular the section labeled "Voltage divider bias", for details.
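As an aside, the operating point that such a voltage divider establishes can be estimated with back-of-the-envelope arithmetic. Here is a minimal sketch; all component values are hypothetical examples, not taken from any answer in this thread:

```python
# Rough DC bias-point estimate for a voltage-divider biased
# common-emitter stage. Component values are hypothetical.

VCC = 12.0             # supply voltage (V)
R1, R2 = 47e3, 10e3    # divider resistors (ohms)
RC, RE = 2.2e3, 1.0e3  # collector and emitter resistors (ohms)
VBE = 0.7              # assumed base-emitter drop for silicon (V)

# The divider sets the base voltage (ignoring base-current loading):
VB = VCC * R2 / (R1 + R2)
# The emitter sits one diode drop below the base:
VE = VB - VBE
# Emitter current is approximately the collector current:
IC = VE / RE
# Collector voltage after the drop across RC:
VC = VCC - IC * RC

print(f"VB={VB:.2f} V, IC~{IC*1e3:.2f} mA, VC={VC:.2f} V")
```

With these example values the collector idles near mid-supply, which is exactly the point of biasing: the output can swing both up and down without clipping.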
A transistor is biased to allow it to faithfully amplify a signal without distortion. Biasing also sets a "minimum" DC level for an input AC signal, chosen so that when the voltage rises to its peak, it remains within the "linear" range.
"AC" signals are amplified all the time, if you want to look at it that way.
Let's look at a simple audio amplifier stage.
If it is fed a sine wave, the sine wave is biased to sit within the linear range (the straightest part of the transistor's response curve, actually). Thus an amplified sine wave appears at the output. It is still "biased": a sine wave impressed on a DC voltage. This DC voltage never goes to zero, but remains either positive or negative (depending on transistor type), rising and falling in sympathy with the input signal.
The transistor just amplifies the variations in a DC voltage, so to speak.
The amplified AC signal is easily separated from the DC component, leaving just an AC sinewave behind, an amplified version of the input signal.
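The "AC riding on a DC level" picture can be illustrated numerically. This is a toy sketch with made-up gain and bias values; subtracting the DC level at the end plays the role the coupling capacitor plays in a real circuit:

```python
import math

# Toy illustration: an amplified sine riding on a DC level, then the
# DC component removed (what a coupling capacitor does). Gain, bias,
# and amplitude values are made up for illustration.

gain = 10.0        # stage voltage gain
v_dc_out = 6.0     # quiescent collector voltage (V)
v_in_peak = 0.1    # input sine amplitude (V)

samples = [v_in_peak * math.sin(2 * math.pi * n / 100) for n in range(100)]
# Collector voltage: DC level plus the amplified (inverted) swing
v_out = [v_dc_out - gain * s for s in samples]

# The collector voltage never crosses zero...
assert min(v_out) > 0
# ...and subtracting the DC level leaves just the amplified sine:
v_ac = [v - v_dc_out for v in v_out]
print(max(v_ac))  # ~1.0 V peak: the 0.1 V input amplified by 10
```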
A transistor does not amplify.
What it does is control the amount of power delivered to a load by means of the input signal.
The whole kit is called an amplifier, but it is really the energy from the power supply that is conveyed to the load.
If your question is "can an AC voltage be augmented?", then a transformer would do that.
There would be a reduction in current; therefore the output power would be the same as the input power.
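That voltage-up, current-down trade can be checked with simple arithmetic for an ideal transformer; the turns ratio and input values below are hypothetical:

```python
# Quick arithmetic check of the transformer point above: stepping
# voltage up by the turns ratio steps current down by the same factor,
# so ideal input power equals output power. Example numbers only.

turns_ratio = 5.0       # secondary turns / primary turns
v_in, i_in = 10.0, 2.0  # primary voltage (V) and current (A)

v_out = v_in * turns_ratio   # voltage stepped up: 50 V
i_out = i_in / turns_ratio   # current stepped down: 0.4 A

p_in = v_in * i_in
p_out = v_out * i_out
# No power gain from an ideal transformer:
assert abs(p_in - p_out) < 1e-9
print(p_in, p_out)
```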
Hope this properly answers your question.
Guru
A water faucet is analogous to a transistor: the base-emitter VOLTAGE controls the CURRENT through the collector-emitter leads.
To simulate DC operation, imagine that you start to open a faucet (voltage). The water (current) starts to flow from the tap. The more you open the faucet, the more water flows out the tap.
To simulate AC operation, imagine that the valve is already partly open and a certain water flow is occurring. If you rotate the faucet knob back and forth a little in order to vary the "voltage", the water flow output (current) starts to vary proportionally from the initial rate. This varying current affects the output power to the controlled device.
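The "wiggle the knob" picture corresponds to small-signal operation around a bias point. A rough sketch of it, assuming the simplified exponential law Ic = Is·exp(Vbe/Vt), with illustrative values for Is and the bias voltage:

```python
import math

# Sketch of the "wiggle the knob a little" idea using the simplified
# exponential collector-current law Ic = Is * exp(Vbe / Vt).
# Is and the bias point are illustrative values only.

Is = 1e-14   # saturation current (A), a typical order for a small BJT
Vt = 0.025   # thermal voltage at room temperature (V)

def ic(vbe):
    return Is * math.exp(vbe / Vt)

vbe_bias = 0.65        # "valve partly open": the quiescent point
i_bias = ic(vbe_bias)

# A small +/- 1 mV wiggle around the bias point...
delta = 0.001
di = ic(vbe_bias + delta) - ic(vbe_bias - delta)

# ...produces a current variation close to gm * (2 * delta), where
# gm = Ic / Vt is the small-signal transconductance: the variation
# is proportional to the wiggle, as in the faucet analogy.
gm = i_bias / Vt
print(di, gm * 2 * delta)  # the two agree within a fraction of a percent
```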
I don't think it can. A transistor is a DC device. Current only goes one way through it.
The only way you could kind of fake it would be to use resonance effects, as is done with class-C RF amplifiers. The transistor conducts for half a cycle, and then the resonance of the output circuit takes over for the other half cycle. But you're basically limited to frequencies close to the tuned frequency of the output circuit. It wouldn't really work for a broadband amplifier, like an audio amp.
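That half-cycle-conduction idea can be demonstrated with a toy simulation: a one-sided drive current feeding a parallel LC tank still produces a roughly sinusoidal, two-sided output once the tank rings up. Component values are invented for illustration:

```python
import math

# Toy class-C sketch: the "transistor" injects current only during
# positive half cycles of the drive, yet the resonant tank rings
# through the other half cycle, so the tank voltage swings both ways.
# Component values are invented for illustration.

L, C, R = 1e-6, 1e-9, 1e3                # parallel RLC tank, ~5 MHz
f0 = 1 / (2 * math.pi * math.sqrt(L * C))
dt = 1 / (f0 * 200)                      # 200 time steps per cycle

v, i_l = 0.0, 0.0                        # tank voltage, inductor current
v_trace = []
for n in range(200 * 50):                # simulate 50 drive cycles
    drive = math.sin(2 * math.pi * f0 * n * dt)
    i_in = 1e-3 * max(drive, 0.0)        # conducts only on positive halves
    # Semi-implicit Euler step for the tank (stable for this step size):
    i_l += v / L * dt
    v += (i_in - v / R - i_l) / C * dt
    v_trace.append(v)

# After ring-up, the output swings negative as well as positive,
# even though the drive current was strictly one-sided:
tail = v_trace[-200:]
print(min(tail), max(tail))
```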
More or less yes, as long as you do not use them with high-frequency signals (roughly, stay below 1 MHz). So for your reference: the 2219 amplifies DC by about 300 at a given temperature and a certain voltage/current pair (check the datasheet), while the 1946 will only do ×180. Voilà!
Thanks to each and every one of you guys for the replies!