Question 335738
Since "Ruth's apples sell at 2 for $1", this means that 2 apples = $1. Divide both sides by 2 to get 1 apple = 1/2 dollar.


So one of Ruth's apples is 50 cents.


Because "Betty's slightly smaller apples sell at 3 for $1", we know that 3 smaller apples = $1, so 1 smaller apple = 1/3 dollar, which is about 33 cents.



Now if all of the apples are sold in groups of 5 for $2, then 5 apples = $2, which means that 1 apple = 2/5 dollar, or 40 cents.



Recall that the bigger apples sold at 50 cents apiece, while the smaller ones sold at about 33 cents apiece. So the 40 cent per apple price is too low for the larger apples and too high for the smaller apples, and there's no guarantee those two effects cancel out. That means the total collected could be more or less than expected when everything is sold.
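
If you want to double-check those per-apple prices, here is a tiny Python sketch (the variable names are just for illustration):

ruth_price = 1 / 2       # Ruth: 2 apples for $1 -> $0.50 each
betty_price = 1 / 3      # Betty: 3 apples for $1 -> about $0.333 each
combined_price = 2 / 5   # combined: 5 apples for $2 -> $0.40 each

print(ruth_price, betty_price, combined_price)   # 0.5 0.3333333333333333 0.4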



What they're expecting: 


There are 30 large apples. If each is sold at 50 cents, then 30(0.5) = 15 dollars is made.


There are 30 small apples. Selling them all at 3 for $1 means 30/3 = 10 groups, so 10(1) = 10 dollars is made.


Add the two figures up to get 15+10 = 25



So they're expecting to make $25
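
As a quick check on that expected total, here is a short Python sketch using the same numbers as above:

expected_large = 30 * 0.50   # 30 large apples at 50 cents each = $15
expected_small = 30 / 3      # 30 small apples at 3 for $1 = 10 groups = $10
print(expected_large + expected_small)   # 25.0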



On the other hand...


What happens when the apples are sold at 5 for $2:


Every apple is now 40 cents apiece (see above). So selling all 60 apples brings in 60(0.40) = 24 dollars.


So only $24 is made when selling apples at 5 for $2. 



This means that they lose $1 compared to what they expected. This may not seem like a big deal (it really isn't), but imagine scaling up the number of apples sold: the shortfall grows in proportion. For example, if they sold 120 apples this way (60 large and 60 small), they'd expect $50 but collect only $48, losing $2, which is a bigger difference.
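
Finally, here is a small Python sketch that puts the whole comparison together. It assumes the stock is always split evenly between large and small apples, and the function name is just for illustration:

def shortfall(large, small):
    # Expected revenue at the separate prices vs. actual revenue at 5 for $2
    expected = large * 0.50 + small / 3
    actual = (large + small) * 0.40
    return expected - actual

print(shortfall(30, 30))   # 1.0 -> the $1 shortfall worked out above
print(shortfall(60, 60))   # 2.0 -> doubling the apples doubles the loss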