Question 692339

Please help me with this inequality problem.
David has $15,000 to invest. He invests $10,000 in a mutual fund that pays 12% annual simple interest. If he wants to make at least $2,200 in yearly interest, at what minimum rate does the remainder of the money need to be invested?


Let r be the rate at which the remaining $5,000 ($15,000 – $10,000) needs to be invested.
Then: 0.12(10,000) + r(5,000) ≥ 2,200
1,200 + 5,000r ≥ 2,200
5,000r ≥ 2,200 – 1,200
5,000r ≥ 1,000


Dividing both sides by 5,000: r ≥ {{{1000/5000}}} = 0.20, so the remaining $5,000 must be invested at a minimum rate of {{{highlight_green(20)}}}%.
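If it helps to see the arithmetic spelled out, here is a small Python sketch of the same steps (plain Python, no libraries; the variable names are just illustrative):

```python
# Sketch of the solution: 0.12(10,000) + r(5,000) >= 2,200, solved for r.
total = 15_000       # David's total money
fund = 10_000        # amount put in the mutual fund
fund_rate = 0.12     # 12% annual simple interest
target = 2_200       # desired yearly interest, at minimum

remainder = total - fund            # $5,000 left to invest
needed = target - fund_rate * fund  # interest still needed: $1,000
min_rate = needed / remainder       # boundary rate of the inequality

print(f"remainder: ${remainder:,}")     # remainder: $5,000
print(f"minimum rate: {min_rate:.0%}")  # minimum rate: 20%
```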

OR


The remaining $5,000 needs to earn $1,000 ($2,200 – $1,200) in interest. This is written directly as r(5,000) ≥ 1,000; solving for r gives the same minimum rate of 20%.


You can do the check!!
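For instance, here is a quick check that r = 20% meets the requirement (the same computation in whole-percent integer arithmetic, which avoids any floating-point rounding):

```python
# Check: 12% of 10,000 plus 20% of 5,000, compared with 2,200,
# with everything scaled by 100 so the arithmetic stays in integers.
assert 12 * 10_000 + 20 * 5_000 >= 2_200 * 100  # 120,000 + 100,000 >= 220,000
print("Check passed: $1,200 + $1,000 = $2,200")
```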


Send comments and “thank-yous” to “D” at MathMadEzy@aol.com