SOLUTION: A dealer buys 50 apples for $40 and sells them for $1.20 each. Calculate his percentage profit.

Algebra.Com
Question 1140270: A dealer buys 50 apples for $40 and sells them for $1.20 each. Calculate his percentage profit.

Answer by ikleyn(52787):

His percentage profit is equal to  ((50 × $1.20 − $40) / $40) × 100% = ($20/$40) × 100% = 50%.


The steps:  a) calculate the profit;  b) relate it to the cost;  c) multiply this ratio by 100 to get a percentage.
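The three steps above can be sketched in a few lines of Python, using the numbers from the question:

```python
# Percentage profit for the apple dealer (a minimal sketch).
cost = 40.00             # total cost of the 50 apples
revenue = 50 * 1.20      # selling all 50 apples at $1.20 each

profit = revenue - cost          # a) calculate the profit
ratio = profit / cost            # b) relate it to the cost
percent_profit = ratio * 100     # c) multiply by 100 for a percentage

print(percent_profit)
```

Running this prints 50.0, matching the answer above.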


RELATED QUESTIONS

A car dealer buys a car for N$ 50 000, and then sells it for N$ 72 000. Calculate the... (answered by vleith,macston)
A dealer purchases 22 pencils for Rs.20 and sells them at the rate of 10 pencils for Rs... (answered by rfer)
A shopkeeper buys 35 radios for £435.75. If she sells them at £15 each, what is her... (answered by LinnW)
Mrs. Jones bought 25 apples for $7.50. She sold 12 apples for 60 cents each and the... (answered by josgarithmetic,ikleyn)
Sadiq buys a guitar for £150 and sells it for £210. Work out his percentage profit.... (answered by Alan3354)
A trader buys an item for 50naira and sells it for 60naira what is his percentage... (answered by fractalier)
A shopkeeper buys 300 identical articles at a total cost of $ 1500. He fixes the selling... (answered by rwm)
Winston buys eggs from a village 120 kilometres from the town where he sells them. He... (answered by richwmiller)