Question 285517
First, let's find how much profit the owner was making at the beginning. "A storeowner's average day sales is $500, realizing a profit of 20% of the selling price."
{{{500 * 0.20 = 100}}}
He was making $100 a day in profits.

"he advertised his goods by spending an average of $20 a day for advertising." 
So we know that whatever his daily sales become, he must add $20 to his daily costs.

"If his average daily sales rose to $700 with the rate of the profit still remaining the same, ..."
{{{700 * 0.20 = 140}}}
So the storeowner's sales went up by 40%. He now makes $140 a day in gross profit, but his costs went up by $20 for advertising, and he must deduct that $20 from his profit.
{{{140 - 20 = 120}}}
His net profit is now $120 a day.

"how much additional profit did the advertising bring?"
{{{120 - 100 = 20}}}
The advertising brought in an additional $20 a day in profit.
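In general, once he is advertising, his net daily profit P at daily sales S (these letters are just labels introduced here, not part of the problem) is
{{{P = 0.20*S - 20}}}
Plugging in S = 700 gives {{{0.20*700 - 20 = 120}}}, matching the result above.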


Let's look at that another way.
The original sales of $500 resulted in profits of $100. Let's say, for fun, that he sold 5 items worth $100 each. In that case, the owner makes $20 on each sale.

Now he advertises and sells 7 items. Each item still yields $20 in profit, so 2 more sales mean $40 more.

But he spends an additional $20 on advertising to get those two extra sales. So he nets 40 - 20 = $20 in 'real profit'.

If the owner were only able to sell 6 units instead of 7 ($600 in sales), his revenue would increase but his profit would stay the same as before.
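As a check, using the formula above with S = 600:
{{{0.20*600 - 20 = 100}}}
That is exactly the $100 a day he made before advertising: the extra $100 in sales yields $20 in gross profit, which only covers the $20 advertising cost.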

If the owner sold more than $500 but less than $600 worth of goods, his sales would go up, but his profit would decline. It is not easy being a storeowner.
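
If you want to check all three cases at once, here is a minimal Python sketch (daily_profit is a name made up for this illustration, not part of the problem):

# Net daily profit: 20% of sales, minus any advertising cost.
def daily_profit(sales, ad_cost):
    return 0.20 * sales - ad_cost

baseline = daily_profit(500, 0)   # before advertising: $100 a day
for sales in (550, 600, 700):     # with $20 a day of advertising
    net = daily_profit(sales, 20)
    print(sales, net, net - baseline)

Sales of $550 lose $10 a day relative to the old profit, $600 breaks even, and $700 gains the $20 found above.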