We use many different methods to send data over wired and wireless networks. In this video, you’ll learn about analog and digital communication, data transfer speeds, connection quality, frequencies, and much more.
When we’re sending a signal over the network, whether it’s on a copper cable, a fiber connection, or over wireless, we’re sending either an analog signal or a digital signal. An analog signal varies continuously over time; you can see the wave of the analog signal being sent. We send this over what we call an analog channel, such as radio frequencies. A good example of this is AM and FM radio. As you get farther and farther away from the AM or FM radio antennas, the signal slowly degrades until, at some point, all you get is static and you’re no longer able to pick up the transmitted signal.
With digital communication, we’re doing this a little bit differently. We’re still using an analog channel. We could still be using radio waves, for instance, to send the signal, but the type of signal is very different. You can see it’s really just a series of ones and zeroes, where the zero might be the bottom of the wave and the one might be the top. This is very common when we’re sending things like satellite radio. If we’re listening to satellite radio, the audio never gradually degrades. It’s either working or it’s not working. We either hear the communication, or we hear nothing. That’s because we’re sending a digital signal, which doesn’t slowly degrade, although at some point we could lose the entire signal.
Here’s a visual example of how you might get a digital signal out of an analog channel. I’ve got this analog signal at the bottom, and any time it’s at the higher frequency, the digital value is a one, and any time it’s at the lower frequency, the digital value is a zero. This is just an example, but it’s certainly one way to get a digital signal out of an analog communication.
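If you wanted to see that idea in code, here’s a minimal Python sketch of that kind of frequency-based decoding. The threshold value and the decode_bits function are just my own illustration, not part of any particular standard:

```python
# Minimal sketch of frequency-shift style decoding (illustrative only).
# Frequencies above the threshold decode as 1, below as 0.

THRESHOLD_HZ = 1500  # hypothetical dividing line between "low" and "high"

def decode_bits(frequencies_hz):
    """Turn a list of measured tone frequencies into a bit string."""
    return "".join("1" if f > THRESHOLD_HZ else "0" for f in frequencies_hz)

# Example: alternating high and low tones decode to alternating bits.
print(decode_bits([2000, 1000, 2000, 2000, 1000]))  # -> "10110"
```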
One challenge we always have with our networks is that there’s a certain distance we can go until we simply can’t go any further. That’s because the signal degrades as it travels through the medium, whether it’s a copper, fiber, or wireless connection. There’s usually a set of standards, though, that gives us a guideline. It will tell us that we can run this particular kind of Ethernet, over this particular kind of cable, to this exact distance. And all of the devices that we’re using must adhere to that set of standards.
But of course, you can go a little bit further than the standards if you really have to. Going those additional lengths is, of course, not supported by the standard, but as long as we’re able to get the signal from one end to the other, we should technically still be able to operate. To really determine just how much signal is getting through a connection, you’re going to need some type of advanced testing equipment. This will be able to look at everything that’s being sent and evaluate how much of it is ending up on the other side.
Another concern we have with our networks is how fast we can get data through a connection. We want to know the total amount of traffic we can put through in a particular period of time. This isn’t always straightforward to calculate, and you’ll notice that we use different terms depending on where the information is. If we’re storing something on disk, we usually refer to it as a number of bytes. If we’re transferring data through a network, we almost always describe it as a number of bits.
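As a quick worked example, here’s a small Python snippet for that bits-to-bytes conversion. The function name mbps_to_megabytes_per_second is just an illustrative label:

```python
# Convert a network transfer rate in megabits per second to megabytes per second.
# There are 8 bits in a byte, so divide by 8.

def mbps_to_megabytes_per_second(megabits_per_second):
    return megabits_per_second / 8

print(mbps_to_megabytes_per_second(1000))  # 1,000 Mbit/s is 125 MB/s
```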
These are obviously two very different ways to talk about the same information. So you’ll need to look at your network connection or your storage device and make sure you’re able to discern the difference between bits and bytes. We also have the challenge of looking at the total throughput over a connection and evaluating that against how much real data is getting through. Here’s what I mean by this.
Let’s look at something like SATA connectivity: our hard drives, SSDs, and other storage devices that we plug in inside of our computers. These use a very specific signaling type to send the data, called 8b/10b. That means that to send eight bits of data across that SATA connection, it really has to send 10 bits of information. If we do the calculation, SATA version 3 can communicate at six gigabits per second, so it can effectively send 750 megabytes of information per second. We’ve taken six gigabits and divided it by eight to get that number.
But of course SATA has this 8b/10b signaling, so it’s sending 10 bits to ultimately deliver eight. That means we’re really getting only 80% total throughput through these devices. So if we look at the storage, we’re really communicating 600 megabytes per second, because that’s 80% of the 750 megabytes per second sent across the link. You need to be very careful when you’re looking at the specifications of SATA version 3. It may say that it communicates at six gigabits per second but transfers data at a total of 600 megabytes per second, and the difference is the 8b/10b signaling.
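Here’s that same SATA version 3 math worked out as a short Python sketch, using the 6 gigabit per second line rate and the 8b/10b efficiency described above; the variable names are just for illustration:

```python
# Worked SATA III example: 6 Gbit/s line rate with 8b/10b encoding.

line_rate_gbps = 6                         # gigabits per second on the wire
raw_megabytes = line_rate_gbps * 1000 / 8  # 750 MB/s if every bit were data
encoding_efficiency = 8 / 10               # 8b/10b: 8 data bits per 10 line bits
usable_megabytes = raw_megabytes * encoding_efficiency

print(raw_megabytes)     # 750.0 MB/s of raw signaling
print(usable_megabytes)  # 600.0 MB/s of actual data throughput
```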
Another concern we have with sending data over our network is the quality of that signal. One measure of quality is how loud the signal is, the overall strength of the signal. Whether we’re sending information through copper, through fiber, or over a wireless connection, we need to be able to hear the signal once it finally gets to us. The signal itself, though, needs to be of high quality. We need to be able not only to hear it, but to hear it over the other noise that’s occurring, especially over something like a wireless network. We often call this the signal-to-noise ratio. We need to get our signal just far enough above the noise level that it can be heard by the other side.
There are many different ways to affect the quality of the signal going over these networks. We can change out connectors, we can use different types of fiber, or even change out the type of antennas we’re using on our wireless connections to improve the quality of our signals.
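To make the signal-to-noise ratio idea a bit more concrete, here’s a small Python sketch that computes an SNR in decibels. The power values here are entirely made up for the example:

```python
import math

# Signal-to-noise ratio in decibels from hypothetical power measurements.
# The point is simply that the signal needs to sit comfortably above the
# noise floor to be heard by the other side.

signal_power_mw = 50.0   # hypothetical received signal power, milliwatts
noise_power_mw = 0.5     # hypothetical noise power, milliwatts

snr_db = 10 * math.log10(signal_power_mw / noise_power_mw)
print(round(snr_db, 1))  # 20.0 dB above the noise in this example
```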
Digital Rights Management is a way that the owner of a particular piece of media can control how that media is used. You can see this associated with every type of media you’ll run into. Whether it’s games, or movies, or even documents, the owner of the content controls how you use any of these. The enforcement could be built into software, or you may need some piece of hardware to use in conjunction with that media. You may be plugging in a USB key, for instance, to be able to run a piece of software.
Every application works a little bit differently, and every DRM scheme is implemented in a different way. So you have to make sure you’re working with the manufacturer or the owner of the content, especially if you’re changing out a computer or moving components around. You want to be sure that the DRM is still going to work once you update that device.
One term used very often in computing is the frequency of a particular bus or a particular network connection. We usually refer to these frequencies in hertz, so it might be hertz, megahertz, or gigahertz, and it’s really referring to the number of cycles that occur in a single second. If you have higher frequencies, you generally get faster speeds. Let’s see how frequencies are used and calculated on something like gigabit Ethernet.
For gigabit Ethernet, we might run this over Category 5 cable, and Category 5 is certified to carry signals on that link at frequencies up to 125 megahertz. Gigabit Ethernet encodes two bits per signal, and we use all four pairs of wires to send that signal. That means we’ve got 125 megahertz frequencies sending two bits per signal over four pairs, which gets us to the maximum of 1,000 megabits per second, or one gigabit per second.
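Here’s the same gigabit Ethernet math as a short Python sketch, multiplying the 125 megahertz signaling rate by two bits per signal and four wire pairs:

```python
# Gigabit Ethernet over four pairs: 125 MHz signaling, 2 bits per signal.

frequency_mhz = 125        # signaling rate per pair (megahertz)
bits_per_signal = 2        # each signal carries two bits
wire_pairs = 4             # all four pairs of the cable carry data

total_megabits_per_second = frequency_mhz * bits_per_signal * wire_pairs
print(total_megabits_per_second)  # 1000 Mbit/s, or one gigabit per second
```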
On wireless networks like 802.11, we might be running at 2.4 gigahertz or 5 gigahertz frequencies. That’s another example of how we use these frequencies to determine how much information we’re sending over a medium. If we’re talking about optical fiber connections, then we usually talk about the wavelength, which is, of course, just another way to describe the frequency. So it’s not uncommon, for example, to see optical networks running at a wavelength of 850 nanometers.
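And if you ever want to translate a wavelength like 850 nanometers back into a frequency, here’s a quick Python sketch using the standard relationship of frequency equals the speed of light divided by the wavelength; the approximate speed of light is the only value assumed here:

```python
# Convert an optical wavelength to its equivalent frequency: f = c / wavelength.

SPEED_OF_LIGHT = 3.0e8   # meters per second (approximate)
wavelength_m = 850e-9    # 850 nanometers

frequency_hz = SPEED_OF_LIGHT / wavelength_m
print(f"{frequency_hz / 1e12:.0f} THz")  # roughly 353 THz
```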