Question 173296: A financial manager wants to invest $23,000 for a client by putting some of the money in a low-risk investment that earns 7% per year and some of the money in a high-risk investment that earns 10% per year. Create an appropriate matrix to solve the given situation. How much money should be invested at each interest rate to earn $5,000 in interest per year?
Found 2 solutions by Mathtut, checkley77:
Answer by Mathtut(3670):
Let the amount invested at 7% be x and the amount invested at 10% be y, so y = 23,000 - x. The interest target in this problem must be some amount other than $5,000: even with all the money at 10%, the interest would only be $2,300, and with all of it at 7% it would be $1,610, so the achievable interest has to fall somewhere between those two values. Either the interest rates, the principal amount, or the interest earned is stated incorrectly. I will show the equations here, and you can plug in the proper numbers from the problem; for this worked example I am letting the interest earned be $2,000.
:
.07x + .1(23,000 - x) = 2000
:
.07x + 2300 - .1x = 2000
:
-.03x = -300, so x = 10,000
:
x = 10,000 dollars invested at 7%
:
y = 23,000 - 10,000 = 13,000 dollars invested at 10%
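Since the question asks for a matrix, here is a minimal sketch (my addition, not part of either answer) of the same system written in matrix form and solved with NumPy, using Mathtut's assumed interest target of $2,000:

import numpy as np

# System of equations, assuming the $2,000 interest target from above:
#   x + y = 23000          (total invested)
#   0.07x + 0.10y = 2000   (total interest earned)
A = np.array([[1.0, 1.0],
              [0.07, 0.10]])
b = np.array([23000.0, 2000.0])

x, y = np.linalg.solve(A, b)
print(f"Invested at 7%:  ${x:,.2f}")   # 10,000.00
print(f"Invested at 10%: ${y:,.2f}")   # 13,000.00

The same matrix setup works for any feasible interest target; only the second entry of b changes.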
Answer by checkley77(12844):
.07x + .10(23,000 - x) = 5,000
.07x + 2,300 - .10x = 5,000
-.03x = 5,000 - 2,300
-.03x = 2,700
x = 2,700/-.03
x = -90,000, which is not a valid answer.
This is because even with all $23,000 invested at 10%, the maximum interest is .10*23,000 = $2,300, nowhere near the required $5,000.
To obtain the required $5,000 in interest, you would have to invest the entire $23,000 at a rate of:
23,000x = 5,000
x = 5,000/23,000
x = .2174, or a 21.74% annual interest rate.
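As a quick illustrative check of the infeasibility argument (again an addition, not part of the original answer), the achievable interest range and the single rate needed to reach $5,000 can be computed directly:

# Bounds on the interest achievable by splitting $23,000 between 7% and 10%
low  = 0.07 * 23000            # all money at 7%  -> 1610.0
high = 0.10 * 23000            # all money at 10% -> 2300.0
required_rate = 5000 / 23000   # single rate needed to earn $5,000
print(low, high)               # 1610.0 2300.0
print(round(required_rate, 4)) # 0.2174, i.e. about 21.74%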