How do computers interpret light signals as data?

Just a question on fiber optics. I was wondering how the concept is actually carried out. Is there some sort of international standard ?

Comments

  • Put simply, the information transmitted is a series of pulses.

    Going up a copper wire it's just electrical pulses. The simplest way to describe it is like the dots and dashes of Morse code, except it's just 1s and 0s [on or off].

    With fibre optics no electricity is used, just pulses of light, so there is no electrical resistance like you have in a copper wire (the kind of loss Ohm's law, volts = amps × resistance, describes building up over distance). The fibre still attenuates the light a little, but far less per kilometre than copper.

    In fibre the pulses travel at the speed of light in glass: roughly two thirds of the 186,000 miles per second light manages in a vacuum, because the glass's refractive index slows it down.

    The 1s and 0s are fed into a coder (a laser or LED driver) that converts them into light pulses. The pulses travel down a fibre optic strand to a decoder (a photodiode receiver) at the other end, which turns them back into the 1s and 0s the computer can understand.

    The pulses are far too fast to see with the naked eye; you need an oscilloscope to view them.

    As for an international standard, there has to be one or nothing would work between different countries. The ITU-T G.65x series standardises the fibre itself, and schemes such as Ethernet (IEEE 802.3) and SONET/SDH standardise how the bits are put onto the light.
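The coder/decoder idea above can be sketched in a few lines. This is only a toy model of on-off keying (light on for a 1, off for a 0); real transceivers do this in dedicated hardware at gigabits per second, and the function names here are just illustrative, not any real API.

```python
# Toy coder/decoder: on-off keying, 1 = light on, 0 = light off.

def encode(data: bytes) -> list[int]:
    """Convert bytes into a list of light levels (1 = on, 0 = off)."""
    pulses = []
    for byte in data:
        for i in range(7, -1, -1):      # most significant bit first
            pulses.append((byte >> i) & 1)
    return pulses

def decode(pulses: list[int]) -> bytes:
    """Rebuild the original bytes from the received light levels."""
    out = bytearray()
    for i in range(0, len(pulses), 8):  # regroup into 8-bit bytes
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

message = b"Hi"
pulses = encode(message)                # 16 on/off light pulses
assert decode(pulses) == message        # round trip recovers the data
```

Real links add extra steps this sketch skips (line coding, scrambling, error correction) so the receiver can stay in sync even through long runs of identical bits.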

  • Each byte of computer information represents a symbol, e.g. an 'a' or a 'b' or another character. Eight bits are required to make each byte, and each bit is either a 0 or a 1. The light doesn't quite blink like Morse code with short and long flashes, though: in the simplest scheme the light is simply on for a 1 and off for a 0, with a fixed clock marking where each bit starts and ends.
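To make the byte-per-character idea concrete, here is the letter 'a' broken into its eight bits (using its ASCII code):

```python
# The letter 'a' as one byte of eight bits.
char = "a"
code = ord(char)             # ASCII value of 'a': 97
bits = format(code, "08b")   # the same value as eight binary digits
print(char, code, bits)      # a 97 01100001
```

Those eight on/off values are exactly what would be flashed down the fibre for that character.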

  • It's much like an electrical connection, except the signal is light. (Light in fibre isn't actually faster than an electrical signal in copper, which also travels at a large fraction of the speed of light; fibre's real advantages are bandwidth and low loss over distance.) The light blinks far too rapidly to see. The blinking carries information to the computer, and in return the computer sends light back over the fibre.
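A quick back-of-the-envelope calculation shows just how fast the blinking is. Assuming a 1 Gb/s link (a common but here arbitrary rate) and a typical fibre refractive index of about 1.47:

```python
# How fast is the blinking? Rough numbers for an assumed 1 Gb/s link.
c_vacuum = 299_792_458            # speed of light in a vacuum, m/s
n_glass = 1.47                    # typical refractive index of a fibre core
v_fibre = c_vacuum / n_glass      # signal speed inside the glass, m/s

bit_rate = 1e9                    # 1 gigabit per second (assumed)
bit_time = 1 / bit_rate           # each pulse lasts 1 nanosecond
bit_length = v_fibre * bit_time   # how much fibre one bit occupies, metres

print(f"signal speed in fibre: {v_fibre / 1e6:.0f} million m/s")
print(f"bit duration: {bit_time * 1e9:.1f} ns")
print(f"one bit spans about {bit_length * 100:.0f} cm of fibre")
```

At that rate each pulse lasts a nanosecond and occupies only about 20 cm of glass, which is why an eye or even an ordinary camera sees a steady glow rather than blinking.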
