A gigabit is a unit of measurement used in computing, equal to one billion bits of data. A bit is the smallest unit of data; it takes eight bits to form or store a single character of text, and these 8-bit units are known as bytes. Hence, the difference between a gigabit and a gigabyte is that the latter is eight times greater, or eight billion bits.
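To make the 8-to-1 relationship concrete, here is a minimal conversion sketch in Python. The function names are illustrative and not part of any standard library.

def gigabits_to_gigabytes(gigabits: float) -> float:
    """One byte is 8 bits, so divide the gigabit figure by 8."""
    return gigabits / 8

def gigabytes_to_gigabits(gigabytes: float) -> float:
    """Going the other way, multiply by 8."""
    return gigabytes * 8

print(gigabytes_to_gigabits(1))   # 1 gigabyte -> 8 gigabits (eight billion bits)
print(gigabits_to_gigabytes(8))   # 8 gigabits -> 1 gigabyte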
Storage capacity is normally indicated in bytes rather than bits. You are unlikely to hear someone describe a 200-gigabyte drive as a 1,600-gigabit drive. Instead, bits are typically used to describe data transfer rates (DTRs), or how fast bits of information can move between devices, such as modems or Universal Serial Bus (USB) ports.
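Because capacities are quoted in bytes while transfer rates are quoted in bits, mixing the two units is a common source of confusion. The short sketch below works through the 200-gigabyte example above; the 1 Gb/s link speed is an assumed example value, not from the source.

# Storage is quoted in gigabytes; transfer rates in gigabits per second.
drive_capacity_gigabytes = 200                          # the 200 GB drive from the text
drive_capacity_gigabits = drive_capacity_gigabytes * 8  # same drive expressed in bits: 1,600 Gb

link_rate_gigabits_per_second = 1.0                     # assumed example link speed

seconds_to_copy = drive_capacity_gigabits / link_rate_gigabits_per_second
print(f"{drive_capacity_gigabits} gigabits at {link_rate_gigabits_per_second} Gb/s "
      f"takes about {seconds_to_copy:.0f} seconds")     # ~1,600 seconds, roughly 27 minutes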
Source: www.wisegeek.com