A megabyte and a gigabyte are units used to measure amounts of data storage and memory, both real and virtual.
A megabyte is the smaller of the two units. There has been some controversy over the actual number of bytes that make up one megabyte. The IBM Dictionary of Computing describes one megabyte as equal to 1,000,000 bytes, a figure based on decimal notation. However, another figure commonly used for a megabyte is 1,048,576 bytes. This is based on the convention that byte quantities should be expressed as powers of 2, since 1,048,576 is 2 raised to the 20th power.
A gigabyte is larger than a megabyte. One gigabyte is made up of 1,000 megabytes, or 1,024 megabytes under the definition that expresses byte quantities as powers of 2. This means that a gigabyte is roughly equal to a billion bytes.
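The two definitions can be compared with a short calculation; this is a minimal sketch of the decimal and binary values described above:

```python
# Decimal (power-of-10) definitions
MB_DECIMAL = 10 ** 6    # 1,000,000 bytes
GB_DECIMAL = 10 ** 9    # 1,000,000,000 bytes

# Binary (power-of-2) definitions
MB_BINARY = 2 ** 20     # 1,048,576 bytes
GB_BINARY = 2 ** 30     # 1,073,741,824 bytes

print(MB_BINARY)                 # 1048576
print(GB_BINARY // MB_BINARY)    # 1024 megabytes per gigabyte (binary)
print(GB_DECIMAL // MB_DECIMAL)  # 1000 megabytes per gigabyte (decimal)
```

The gap between the two conventions grows with the unit: a binary megabyte is about 4.9% larger than a decimal one, while a binary gigabyte is about 7.4% larger.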