The first rock is thrown horizontally with initial velocity v1, while the second rock is thrown at an angle of θ2 = 60° with initial velocity v2 = 2v1. Rock 1 lands at a distance D1 from the base of the cliff, while rock 2 lands at a distance D2 from the base of the cliff.
D1 < D2
D1 = D2
D1 > D2
After graphing the problem, it looks like rock 2 will travel a greater distance than rock 1, so D2 > D1. However, I don't know how to prove it in a scientific way.
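A minimal kinematics sketch of how the comparison can be set up, assuming both rocks are launched from the top of the cliff at the same height h above the ground, that θ2 = 60° is measured above the horizontal, and that air resistance is neglected (the height h and the direction of θ2 are assumptions, not stated explicitly above):

\[
v_{1x} = v_1, \qquad v_{2x} = 2v_1 \cos 60^\circ = v_1
\]
\[
v_{1y} = 0, \qquad v_{2y} = 2v_1 \sin 60^\circ = \sqrt{3}\, v_1
\]
\[
-h = v_{iy}\, t_i - \tfrac{1}{2} g t_i^{2} \quad (i = 1, 2)
\]
\[
D_1 = v_{1x} t_1 = v_1 t_1, \qquad D_2 = v_{2x} t_2 = v_1 t_2
\]

Under these assumptions both rocks have the same horizontal speed v1, but rock 2 is launched with an upward vertical component, so it spends longer in the air before dropping the same height h. The comparison of D1 and D2 then reduces to comparing the flight times t1 and t2 from the vertical equation.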