Question 460796
Let {{{p}}} = number of pennies
Let {{{d}}} = number of dimes
Let {{{h}}} = number of half dollars
given:
(1) {{{ p + d + h = 100 }}}
(2) {{{ p + 10d + 50h = 500 }}} (value in cents)
-----------------------------
There are 2 equations and 3 unknowns, so the system
is not solvable by algebra alone, but since the counts
must be nonnegative integers, I can make some
logical conclusions.
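(As a quick side check, not part of the reasoning below:
a brute-force search, sketched here in Python, confirms
that the two equations have exactly one solution in
nonnegative integers.

    # Count all (p, d, h) with p + d + h = 100 coins
    # and p + 10*d + 50*h = 500 cents ($5.00).
    solutions = []
    for h in range(11):        # at most 10 half dollars: 10*50 = 500
        for d in range(51):    # at most 50 dimes: 50*10 = 500
            p = 100 - d - h    # the remaining coins must be pennies
            if p >= 0 and p + 10*d + 50*h == 500:
                solutions.append((p, d, h))
    print(len(solutions))      # 1 -- exactly one solution exists

So the conclusions below are pinning down that single
solution.)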
(a)
The number of pennies must be a multiple of {{{ 10 }}}:
dimes and half dollars each contribute a multiple of
{{{ 10 }}} cents, so {{{ p = 500 - 10d - 50h }}} is
itself a multiple of {{{ 10 }}}; otherwise you can't
end up with a total of {{{ 500 }}}.
(b)
You can't have {{{ 50 }}} dimes, since those alone are
worth {{{ 500 }}} cents, leaving the other {{{ 50 }}}
coins with no value to contribute.
(c)
Try a large number of dimes, say {{{ 40 }}}, together
with {{{ 1 }}} half dollar; the pennies must then supply
the remaining {{{ 50 }}} cents:
{{{ 50 }}} pennies = $0.50
{{{ 40 }}} dimes = $4.00
{{{ 1 }}} half dollar = $0.50
The value is right, but that is only {{{ 91 }}} coins.
------------------------------------------
Now trade {{{ 1 }}} dime for {{{ 10 }}} pennies: the
value is unchanged (one dime equals ten pennies), but
the coin count goes up by {{{ 9 }}}. You then have:
{{{ 60 }}} pennies = $0.60
{{{ 39 }}} dimes = $3.90
{{{ 1 }}} half dollar = $0.50
That adds up to {{{ 91 + 9 = 100 }}} coins and
{{{ 500 }}} cents, so the answer is {{{ 60 }}} pennies,
{{{ 39 }}} dimes, and {{{ 1 }}} half dollar.
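As a final check (again a Python sketch of my own, not
part of the solution above), the answer satisfies both
given equations:

    p, d, h = 60, 39, 1
    assert p + d + h == 100          # equation (1): 100 coins
    assert p + 10*d + 50*h == 500    # equation (2): 500 cents = $5.00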