Question 827642
So first we need to know that the prefix "giga" means 1.0*10^9.

We also need to know that the prefix "micro" means 1.0*10^-6.

The first thing we should do is write 500 in scientific notation:

5.0*10^2

Now we count the difference in powers of ten from "giga" (10^9) to "micro" (10^-6): 9 - (-6) = 15, so we must multiply our original amount by 1.0*10^15:

5.0*10^2 * 1.0*10^15 = 5.0*10^17

So....

There are [5.0*10^17] microbytes in 500 gigabytes.
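
If you want to double-check the arithmetic, here is a minimal sketch in Python (the language is my choice, not part of the original question) that performs the same conversion:

GIGA = 1.0e9    # "giga" means 10^9
MICRO = 1.0e-6  # "micro" means 10^-6

gigabytes = 500  # our original amount, 5.0*10^2 in scientific notation

# Convert gigabytes to bytes, then bytes to microbytes.
microbytes = gigabytes * GIGA / MICRO

print(f"{microbytes:.1e}")  # prints 5.0e+17, i.e. 5.0*10^17

Dividing by 10^-6 is the same as multiplying by 10^6, so the overall factor is 10^15, which matches the 15-power difference counted above.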