Question 1099897
<br>
The traditional algebraic method for solving this kind of problem is to work with the fraction of the job each computer does in a minute.<br>
Let x = the number of minutes the faster computer takes to do the job alone,
then 1/x = the fraction of the job the faster computer does in 1 minute.<br>
The slower computer does the job alone in 30 minutes; so in 1 minute it does 1/30 of the job.
Working together, the two do the job in 10 minutes; so in 1 minute the two together do 1/10 of the job. Then
{{{1/30 + 1/x = 1/10}}}<br>
... and you can solve the problem from there.<br>
But here is an alternative method for solving this kind of problem, which usually (as in this case) gets you to the answer much faster.<br>
Consider the 30 minutes the slower computer takes to do the job alone.
Since the two computers together can do the job in 10 minutes, in 30 minutes they could complete 3 such jobs.
But in those 30 minutes the slower computer does only one of those jobs; that means the faster computer must be doing the other two.
So the faster computer can do the job twice in 30 minutes; that means it can do the job once in 15 minutes.<br>
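If it helps to see that shortcut as bare arithmetic (this is just a restatement of the reasoning above, nothing new):
{{{30/10 = 3}}} jobs done by the two computers together in 30 minutes
{{{3 - 1 = 2}}} of those jobs done by the faster computer
{{{30/2 = 15}}} minutes for the faster computer to do the job once<br>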
You should of course get that same answer if you finish solving the problem by the traditional algebraic method above.
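For example, one way of finishing that algebra (your own steps may look a little different) is
{{{1/x = 1/10 - 1/30 = 3/30 - 1/30 = 2/30 = 1/15}}}
{{{x = 15}}}
so the faster computer again needs 15 minutes to do the job alone.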