
View Full Version : Solving equations using functions and argument transformations


phufbv
Jul 12, 2011, 09:24 AM
Hello, I have a mathematics problem, though I wasn't sure about an appropriate title. Here goes:

Consider a function g(t), and the argument transformation t_prime=\beta t, where \beta is an arbitrary constant. Assuming g(t)=const/t, under the above transformation, g(t_prime) = const/t_prime = const/(\beta t) = g(t)/(\beta). Thus g(t) = \beta g(t_prime).

Now I'm sure it should be simple, but I seem to be missing a trick. How can I reverse the logic and use the assumption g(t) = \beta g(t_prime), together with the transformation, to infer that g(t)=const/t?

Many thanks

ebaines
Jul 12, 2011, 10:45 AM
Given the function g(t) = \frac A t where A = constant, and given t' = \beta t where \beta is a constant, then


g(t) = \frac A t = \frac A {(\frac {t'} {\beta})}= \frac {A \beta} {t'} = \beta g(t').

So to do the inverse of this:


g(t) = \beta g(t') = \beta \frac A {t'} = \beta \frac A {\beta t} = \frac A t = g(t).


Is that what you're looking for?

phufbv
Jul 13, 2011, 02:27 AM
Hi, thanks for your reply! It's not quite what I'm looking for though. I think of the first part as finding g(t) = \beta g(t_prime) given the transformation t_prime = \beta t and the functional form g(t) = A/t. What I would like to know is indeed the inverse: how can I infer the functional form given just the argument transformation and the relation g(t) = \beta g(t_prime)? The problem with what you say above is that in the second equality of the inverse section you assume the form g(t_prime) = A/t_prime, which is exactly what I want to prove as the end product. I'm not explaining it very well, does this make sense?

phufbv
Jul 13, 2011, 02:37 AM
Basically my problem is that I have the relation g(t) = \beta g(t_prime) and the transformation t_prime = \beta t. It can be shown that solutions to this can have the form g(t) = const/t, simply by assuming g(t) = const/t, transforming t -> t_prime, and rearranging so that the relation g(t) = \beta g(t_prime) falls out, thus validating the assumption g(t) = const/t. But I would like a stronger proof, to show that this actually is the solution, rather than just saying it can be a solution. I think it must be possible, as if you assume a different form, for example g(t) = A t, you get g(t) = \frac{1}{\beta} g(t_prime), the wrong relation - thus the transformation together with g(t) = \beta g(t_prime) must tell you something about the functional form, mustn't it?
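The distinction above is easy to check numerically. Here's a small Python sketch (my own illustration, with arbitrary sample values, not from the thread): g(t) = A/t satisfies g(t) = \beta g(\beta t) for every \beta, while g(t) = A t fails whenever \beta is not 1.

```python
# Numeric sanity check of the functional relation g(t) = beta * g(beta * t).
# The sample points and constants are arbitrary choices for illustration.
import math

def satisfies_relation(g, beta, samples):
    """True if g(t) == beta * g(beta * t) at every sample point t."""
    return all(math.isclose(g(t), beta * g(beta * t)) for t in samples)

A = 3.7
samples = [0.5, 1.0, 2.0, 10.0]

for beta in (0.25, 2.0, 5.0):
    assert satisfies_relation(lambda t: A / t, beta, samples)      # g(t) = A/t works
    assert not satisfies_relation(lambda t: A * t, beta, samples)  # g(t) = A*t does not
```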

jcaron2
Jul 14, 2011, 09:07 PM
A couple of things: First, I believe the proper English word for what you're trying to find is the "converse" of your original statement, not the inverse.

Second, it's not the end of the world, but we usually prefer that you use the "Answer" box, rather than the "Comment" box. It makes it much easier to follow the thread from all the different skins.

Anyway, that being said, now onto the somewhat hand-waving solution.

If you have g(t)=\beta g(t') and t'=\beta t, then you can write your equation as

g(t)=\beta g(\beta t)

or

g(\beta t)=\frac{1}{\beta}g(t)

Now, by the commutative property of multiplication, g(\beta t) = g(t \beta). And since \beta is an arbitrary constant, the relation above holds for any scale factor, so you can apply it again with t playing the role of the constant:

g(\beta t)=g(t \beta)=\frac{1}{t}g(\beta)

Combining those last two equations, you get

\frac{1}{\beta}g(t)=\frac{1}{t}g(\beta)

g(t)=\frac{\beta g(\beta)}{t}

But the left-hand side doesn't depend on \beta at all, so \beta g(\beta) must be the same constant no matter which \beta we pick. Absorbing it into an arbitrary constant \alpha, we get

g(t)=\frac{\alpha}{t}
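If you want to convince yourself of the two key steps numerically, here's a quick Python sketch (my own consistency check, with arbitrary sample values): for g(t) = A/t, the combined equation \frac{1}{\beta}g(t)=\frac{1}{t}g(\beta) holds, and \beta g(\beta) collapses to the same constant \alpha = A for every \beta.

```python
# Numeric check of the two key facts in the derivation above:
# 1) the combined relation: (1/beta)*g(t) == (1/t)*g(beta)
# 2) beta*g(beta) equals the same constant alpha for every beta,
#    which is why g(t) = alpha/t.
import math

A = 2.5
g = lambda t: A / t

for beta in (0.1, 1.0, 3.0, 42.0):
    for t in (0.5, 2.0, 7.0):
        assert math.isclose(g(t) / beta, g(beta) / t)  # (1/beta)g(t) == (1/t)g(beta)
    assert math.isclose(beta * g(beta), A)             # alpha = beta*g(beta), independent of beta
```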



For what it's worth, I believe this is called a "Trivial Functional Decomposition Problem". The "Trivial" part comes about because one of the functions is linear (t'=\beta t). If you want to see some really, really complicated math, Google "Non-trivial Functional Decomposition Problem". NTFD theory is used extensively in cryptography among other things.