Agent Hunt transferred classified files from the CIA mainframe onto his flash drive. The drive had some files on it before the transfer, and the transfer happened at a rate of 4.4 megabytes per second. After 32 seconds, there were 384 megabytes on the drive. The drive had a maximum capacity of 1000 megabytes. How full was the drive when the transfer began? How long from the time that Agent Hunt started the transfer did it take the drive to be completely full?

Answer:
Step One
========
Start by finding how much was transferred.
r = 4.4 megabytes / second
t = 32 seconds
amount transferred = ????
Formula: amount = r*t
amount = 4.4 * 32 = 140.8 megabytes were transferred.

Step Two
========
Find the number of megabytes on the drive before the transfer took place.
The amount transferred = 140.8 megabytes.
The drive recorded 384 megabytes when the transfer was done.
The amount on the drive to begin with was 384 - 140.8 = 243.2 megabytes.
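Steps One and Two can be checked with a quick script (variable names are my own):

```python
# Step One: amount transferred during the 32-second window
rate = 4.4       # megabytes per second
elapsed = 32     # seconds
transferred = rate * elapsed          # 4.4 * 32 = 140.8 megabytes

# Step Two: subtract the transfer from the recorded total
total_after = 384                     # megabytes on the drive after 32 seconds
initial = total_after - transferred   # 384 - 140.8 = 243.2 megabytes

print(transferred, initial)
```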

Step Three
==========
Find the free space left on the drive before the transfer began.
1000 - 243.2 = 756.8 megabytes of free space before he began his 32-second transfer.

Step Four
=========
Find the amount of time it would take to fill the drive.
He needs to transfer 756.8 megabytes in total, measured from the moment the transfer began.
The rate of transfer = 4.4 megabytes / second
t = ????
Formula: Amount transferred = r * t; t= amount / r
t = 756.8 / 4.4 
t = 172 seconds.
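The last two steps in the same style (again just a sketch of the arithmetic):

```python
capacity = 1000   # megabytes, maximum drive capacity
initial = 243.2   # megabytes on the drive before the transfer (from Step Two)
rate = 4.4        # megabytes per second

# Step Three: free space at the moment the transfer began
free_space = capacity - initial   # 1000 - 243.2 = 756.8 megabytes

# Step Four: time to fill that space at 4.4 MB/s
fill_time = free_space / rate     # 756.8 / 4.4 = 172 seconds

print(free_space, fill_time)
```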

Interesting question. Thanks for posting. It is a unique application of the more familiar formula

d = v * t

