Solving equations using functions and argument transformations
Hello, I have a mathematics problem, though I wasn't sure of an appropriate title. Here goes:
Consider a function g(t) and the argument transformation t' = \beta t, where \beta is an arbitrary constant. Assuming g(t) = const/t, under this transformation g(t') = const/t' = const/(\beta t) = g(t)/\beta, and thus g(t) = \beta g(t').
Now I'm sure it should be simple, but I seem to be missing a trick: how can I reverse the logic and use the relation g(t) = \beta g(t') to deduce that g(t) = const/t?
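In case it helps a future reader, here is one possible route, sketched in LaTeX. It assumes the relation is meant to hold for every value of \beta (which the "arbitrary constant" wording suggests), not just a single fixed one; the freedom to choose \beta is the whole trick:

```latex
% Sketch: suppose g(t) = \beta\, g(\beta t) holds for all \beta > 0.
\begin{align*}
  g(t) &= \beta\, g(\beta t) \quad \text{for all } \beta > 0,\\
  \text{set } t = 1:&\quad g(1) = \beta\, g(\beta)
      \;\Longrightarrow\; g(\beta) = \frac{g(1)}{\beta},\\
  \text{relabel } \beta \to t:&\quad g(t) = \frac{g(1)}{t} = \frac{\mathrm{const}}{t}.
\end{align*}
```

Since \beta was arbitrary, the functional form is pinned down completely, with the constant identified as g(1).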
Many thanks
Comment on ebaines's post
Hi, thanks for your reply! It's not quite what I'm looking for, though. I think of the first part as finding g(t) = \beta g(t') given the transformation t' = \beta t and the functional form g(t) = A/t. What I would like is indeed the inverse: how can I infer the functional form given just the argument transformation and the relation g(t) = \beta g(t')? The problem with what you say above is that in the second equality of the inverse section you assume the form g(t') = a/t', which is exactly what I want to prove as the end product. I'm not explaining it very well; does this make sense?
Comment on ebaines's post
Basically my problem is this: I have the relation g(t) = \beta g(t') and the transformation t' = \beta t. It can be shown that solutions can have the form g(t) = const/t, simply by assuming g(t) = const/t, transforming t -> t', and rearranging so that the relation g(t) = \beta g(t') falls out, thus validating the assumption. But I would like a stronger proof that actually shows this is the solution, rather than just saying it can be a solution. I think it must be possible: if you assume a different form, for example g(t) = A t, you get g(t) = (1/\beta) g(t'), the wrong relation. So the transformation together with g(t) = \beta g(t') must tell you something about the functional form, mustn't it?
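A quick numeric sanity check of the two claims above (a sketch in Python; the function names and the constant A = 3 are my own choices, not from the thread): g(t) = A/t satisfies g(t) = \beta g(\beta t) at every point, while g(t) = A t does not.

```python
# Check the scaling relation g(t) == beta * g(beta * t) numerically.

def g_inverse(t, A=3.0):
    """Candidate form from the thread: g(t) = A / t."""
    return A / t

def g_linear(t, A=3.0):
    """Counterexample form from the thread: g(t) = A * t."""
    return A * t

def satisfies_relation(g, beta, t, tol=1e-9):
    """True if g(t) == beta * g(beta * t) at this point (up to rounding)."""
    return abs(g(t) - beta * g(beta * t)) < tol

for beta in (0.5, 2.0, 7.3):
    for t in (0.1, 1.0, 4.0):
        assert satisfies_relation(g_inverse, beta, t)      # A/t obeys the relation
        assert not satisfies_relation(g_linear, beta, t)   # A*t gives g(t) = (1/beta) g(t') instead
```

Of course this only spot-checks particular points; the actual proof comes from exploiting the arbitrariness of \beta, not from sampling.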