8 April, 08:51

If you put two 60 W bulbs in series across a 120 V outlet, how much power would each consume if its resistance were constant?

+4
Answers (2)
  1. 8 April, 09:00
    0
    Answer: 15 watts

    Explanation:

    To get the current at rated conditions:

    Power P = iV

    (where P is the bulb's rated power, 60 watts; i is the current; and V is the voltage, 120 V)

    i = P/V

    i = 60/120

    i = 1/2 A

    But

    V = iR, where R is the resistance.

    Therefore R = V/i

    = 120 ÷ (1/2)

    = 120 * 2

    = 240 ohms

    In a series connection, each bulb carries the same current.

    Since the two bulbs are both 60-watt bulbs, they have the same resistance, so the voltage across each bulb is the same and equals half of the applied voltage: 120/2 = 60 volts.

    Taking the 240-ohm resistance of each bulb to be constant, we can find the current and the power dissipated by each bulb in the series connection.

    We have 120 V / (240 + 240) ohms = 1/4 A.

    The power dissipated in each bulb is P = iV = (1/4 A) * (60 V) = 15 watts.
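    As a quick numerical check, here is a minimal Python sketch of the same steps (the 60 W and 120 V values come from the question; the variable names are only illustrative):

    P_rated = 60.0    # rated power of each bulb, W
    V_supply = 120.0  # outlet voltage, V

    i_rated = P_rated / V_supply             # 0.5 A at rated conditions
    R_bulb = V_supply / i_rated              # 240 ohms per bulb (assumed constant)

    i_series = V_supply / (R_bulb + R_bulb)  # 0.25 A through the series pair
    V_bulb = i_series * R_bulb               # 60 V across each bulb
    P_bulb = i_series * V_bulb               # 15 W dissipated in each bulb

    print(R_bulb, i_series, P_bulb)          # 240.0 0.25 15.0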
  2. 8 April, 09:14
    0
    Each would use 15 W of power.

    Together they would use 30 W.

    Explanation:

    For a 60 W bulb, R = ΔV²/P = (120)²/60 = 14400/60 = 240 ohms

    We can use Ohm's law to find the current at rated conditions:

    I = ΔV/R = 120/240 = 0.5 Amperes

    When they are placed in series, the total resistance is 240 + 240 = 480 ohms, so each bulb carries a current of 120/480 = 0.25 A.

    Using the power equation to find the power dissipated by each bulb in series:

    P = I²R = 0.25² * 240 = 0.0625 * 240 = 15 W

    Each would use 15 W of power.

    Together they would use 30 W.
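
    The same check written with this answer's formulas, as a short Python sketch (variable names are only illustrative):

    P_rated, V_supply = 60.0, 120.0
    R = V_supply ** 2 / P_rated   # 240 ohms from R = V^2/P
    I = V_supply / (2 * R)        # 0.25 A with two bulbs in series
    P_each = I ** 2 * R           # 15 W per bulb
    P_total = 2 * P_each          # 30 W together
    print(R, I, P_each, P_total)  # 240.0 0.25 15.0 30.0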