What is digital?

greenspun.com : LUSENET : Imaging Resource Discussion : One Thread

I see the word "digital" in many places. Just what determines if an item is digital?

-- Elwin L. Chapman Sr (chapman2@home.com), May 08, 1999

Answers

Elwin, your question requires a hefty answer; it is very difficult to define digital technology with a short response. I am going to try to give you a very simplified and short answer anyway. A digital signal is based on an on/off signal, also known as high/low or 1/0 bits. A combination of highs and lows maps to a specific image or sound, etc. Conventional analog circuits work on the basis of a minimum, a maximum, and the bandwidth in between (e.g. the 4 to 20 mA signals still used widely in many applications today, slowly being replaced with more secure and stable digital signals). Many systems use an analog-to-digital (A/D) conversion method in order to achieve higher data resolution and control. True digital integrated circuits (ICs) are superior to conventional analog technology. Again, the basis of digital electronics is raw data in high/low bits. I hope this helped a little. I do recommend checking out some books on digital technology; there are many good ones available.
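The high/low bit idea above can be sketched in a few lines of code: a pattern of on/off states maps to a number, and numbers in turn stand for samples of an image or sound. The function names here are purely illustrative, not from any real library.

```python
def bits_to_value(bits):
    """Interpret a list of high/low states (1/0) as an unsigned integer."""
    value = 0
    for bit in bits:
        value = (value << 1) | bit  # shift left, append the next bit
    return value

def value_to_bits(value, width):
    """Split an unsigned integer back into its high/low pattern."""
    return [(value >> i) & 1 for i in range(width - 1, -1, -1)]

# Eight on/off states are enough for 256 distinct levels (0..255),
# e.g. one channel of a pixel in an image.
pattern = [1, 0, 1, 1, 0, 0, 1, 0]
level = bits_to_value(pattern)
print(level)                    # 178
print(value_to_bits(level, 8))  # [1, 0, 1, 1, 0, 0, 1, 0]
```

The point is simply that any value a digital device handles is, underneath, one of these high/low patterns.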

-- Fred (tabarrok@ariver.com), May 10, 1999.

Fred gave a very good answer. Defining a thing is a tough job.

I'd say: Digital, strictly speaking, means that something can be assigned a numeric value to a precision dependent only on how many DIVISIONS are present in the system or device being used to MEASURE the value. Think simply of on and off for binary values, and patterns of ons and offs to represent larger values. The limitation here is precision. If you don't have enough divisions over the range you wish to measure, you have a problem. For instance, if you had a ruler, measured the length of a board, and wrote its dimensions down to the nearest mark on the ruler, that would be a digital measurement. If you cut another object based on the number you'd written down, it could be a bit larger or smaller than the first, depending on which way you erred when selecting the nearest mark on the ruler. If instead you cut something to the exact length of the first board and used it to cut another board, that would be an analog representation of the length. Analog means exact equivalent. The same as...
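The ruler example above can be sketched directly: "measuring digitally" means snapping a true (analog) length to the nearest mark, and the worst error is half the distance between marks. The mark spacing here is an assumption for illustration (1/16 inch).

```python
MARK = 1 / 16  # assumed distance between ruler divisions, in inches

def measure_digitally(true_length):
    """Record a length to the nearest ruler mark (a digital measurement)."""
    return round(true_length / MARK) * MARK

board = 23.4137            # the board's actual (analog) length
recorded = measure_digitally(board)
error = recorded - board   # a piece cut from the written number is off by this much
print(recorded)                # 23.4375
print(abs(error) <= MARK / 2)  # True: error is at most half a division
```

Adding more divisions (a finer ruler) shrinks the worst-case error, which is exactly the "enough divisions over the range" point above.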

Now someone is going to yell "FOUL!" and say, "Well, if I'm given enough digits I can represent a value either way with the same precision!" That's true, but it's not exactly how the game is played in the real world.

For instance, if we had a digital circuit using a 12-bit converter (4096 divisions, or steps) measuring a 0 to 10 volt signal, we could only see changes equal to or larger than 10/4095, or about .002442 volts each. If we used an analog circuit that merely compared the incoming voltage with another precisely regulated voltage or reference source, we could see a much, much smaller change in voltage. On the other hand, it would be very easy to save the digital values and very difficult to store the analog voltages. The other difference is that most digital circuits are simpler to build, but operate much slower than analog circuits. That's partly why we have digital computers, but build analog circuits or analog computers to do certain things that need to be done very quickly. The thing to remember is this: if you have enough divisions or bits to represent a value, the precision becomes good enough that you really don't need an exact value. For instance, there is typically no need to differentiate between 11978 and 11978.000000000000142, unless that .000000000000142 happens to be very important to you.

There is also a great blurring of the line between digital and analog in modern devices. For instance, the voltage on a CCD image sensor in a digicam is an analog value that gets converted to a digital value in order to make it compatible with the camera's memory and other digital devices, like the computer to which you will likely download the image. This is also the case with sound. A computer's sound card converts the analog voltages produced by a microphone into digital values to store the sound on a disk. When it plays them back, the digital values are converted back into analog voltages by the sound card and reproduced through speakers. The sound card is essentially an Analog-to-Digital and a Digital-to-Analog converter.
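The sound-card round trip described above can be sketched as a pair of functions: an analog voltage is quantized to a digital code (A/D), stored, then turned back into a voltage (D/A). The parameters are illustrative assumptions, not real hardware specs: an 8-bit converter over a -1 V to +1 V input range.

```python
import math

BITS = 8
LEVELS = 2 ** BITS               # 256 codes
V_MIN, V_MAX = -1.0, 1.0         # assumed microphone input range
STEP = (V_MAX - V_MIN) / (LEVELS - 1)

def adc(voltage):
    """Analog to digital: nearest code for a voltage in range."""
    return round((voltage - V_MIN) / STEP)

def dac(code):
    """Digital to analog: the voltage a code stands for."""
    return V_MIN + code * STEP

# Sample one cycle of a sine tone, store it as codes, "play" it back.
samples = [math.sin(2 * math.pi * n / 16) for n in range(16)]
stored = [adc(v) for v in samples]    # what goes on the disk
replayed = [dac(c) for c in stored]   # what reaches the speakers
worst = max(abs(a - b) for a, b in zip(samples, replayed))
print(worst < STEP)  # True: playback error stays below one converter step
```

With more bits the steps get finer and the replayed waveform tracks the original more closely, which is why higher bit depths sound cleaner.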

To make matters more confusing, we must also consider the existence of Analog or Hybrid Analog/Digital computers. If we ever get to the point where we can easily, accurately, and cheaply store and manipulate analog values, we will have much faster computers, since an analog computer has little need to convert analog values to digital ones, or vice versa, in order to manipulate them. Some research is being done in these areas, with storage based on holographic techniques. Maybe one day our computers will run at nearly the full speed of light with no propagation delays. My guess is they'll get pretty bored waiting for us to utter the next syllable or hit the next key... :-)

-- Gerald M. Payne (gmp@francorp.francomm.com), May 11, 1999.


