Question 1100032: An Apple Pie Company knows that the number of pies sold each day varies from day to day. The owner believes that on 50% of the days she sells 100 pies. On another 25% of the days she sells 150 pies, and she sells 200 pies on the remaining 25% of the days. To make sure she has enough product, the owner bakes 200 pies each day at a cost of $2 each. Assume any pies that go unsold are thrown out at the end of the day. If she sells the pies for $4 each, find the probability distribution for her daily profit and calculate mean daily profit for this business.
Answer by Theo(13342):
she sells 100 pies on 50% of the days.
she sells 150 pies on 25% of the days.
she sells 200 pies on 25% of the days.
she bakes 200 pies each day at 2 dollars each.
she sells the pies at 4 dollars each.
her cost on each day is fixed at 200 * 2 = 400 dollars.
on 50% of the days, she just breaks even: she sells 100 pies at 4 dollars apiece for a total of 400 dollars, minus the fixed cost of 400 dollars = 0 dollars profit.
on 25% of the days, she sells 150 pies at 4 dollars apiece for 600 dollars minus a fixed cost of 400 dollars = 200 dollars profit.
on 25% of the days, she sells 200 pies at 4 dollars apiece for 800 dollars minus a fixed cost of 400 dollars = 400 dollars profit.
the probability distribution of her daily profit is therefore: profit of 0 dollars with probability .5, profit of 200 dollars with probability .25, and profit of 400 dollars with probability .25.
her mean daily profit is therefore .5 * 0 + .25 * 200 + .25 * 400 = 150 dollars per day.
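the calculation above can be sketched in a few lines of python (the variable names here are just for illustration):

```python
# Sketch: build the profit distribution and its mean from the sales outcomes.
# Profit per day = 4 * pies_sold - 400 (she bakes 200 pies at $2 each no matter what).
FIXED_COST = 200 * 2          # $400 spent baking every day
PRICE = 4                     # selling price per pie

# sales outcomes and their probabilities, straight from the problem
sales_dist = {100: 0.50, 150: 0.25, 200: 0.25}

# probability distribution of daily profit
profit_dist = {PRICE * sold - FIXED_COST: p for sold, p in sales_dist.items()}
print(profit_dist)            # {0: 0.5, 200: 0.25, 400: 0.25}

# mean (expected) daily profit
mean_profit = sum(profit * p for profit, p in profit_dist.items())
print(mean_profit)            # 150.0
```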
her total profit depends on the number of days she sells the pies.
for example:
if she sold pies for 300 days, then:
she made 0 profit on 150 days.
she made 200 profit on 75 days.
she made 400 profit on 75 days.
her total profit would be 150 * 0 + 75 * 200 + 75 * 400 = 45000.
the total number of days is 300.
the average profit per day is 45000 / 300 = 150 per day.