Question relating to percentages
Ok, I have a problem in my textbook which is pretty weird.
The question: The table below shows the number of Excellent Service Awards given to individuals from 1996 to 1999.
Year---------------Individual Award
1996---------------------935
1997---------------------1764
Calculate the percentage increase in the number of individual awards given from 1996 to 1997.
Ok ok, not too hard.
Here's what I did:
Step 1) I subtracted: 1764 - 935 = 829 (the difference)
Step 2) I found the average: (1764 + 935)/2 = 1349.5
Step 3) 829/1349.5 multiplied by 100 = 61.4%
Am I right? Were 61.4% more awards given in 1997 than in 1996?
In my textbook it says 88.7% more awards were given, meaning they didn't find the average; they just divided 829 by 935 and multiplied by 100. What the hell! Are they even doing this right?
ebaines told me in one of my earlier posts that, "Percentage difference is expressed as the difference between second and first values divided by the first value, times 100%. Thus a $5 stick of butter is (5-2)/2 x 100% = 150% higher price than the $2 stick of butter." But on http://www.mathsisfun.com/percentage-difference.html it says to find an average to avoid confusion. Can you please explain in what situations you should and shouldn't find the average?
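To make the two calculations concrete, here's a quick Python sketch of both formulas as I understand them (the function names are just mine):

```python
def percent_increase(old, new):
    # Change relative to the STARTING value -- the textbook's method.
    # Makes sense when one value clearly comes first (e.g. 1996 before 1997).
    return (new - old) / old * 100

def percent_difference(a, b):
    # Change relative to the AVERAGE of the two values -- the mathsisfun method.
    # Symmetric: neither value is treated as the "start".
    return abs(a - b) / ((a + b) / 2) * 100

print(round(percent_increase(935, 1764), 1))    # 88.7 -- the textbook's answer
print(round(percent_difference(935, 1764), 1))  # 61.4 -- my answer
```

So both numbers come out of legitimate formulas; the question is just which one the problem is asking for.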
Thanks!