    caksters (New Member) · #1 · Feb 19, 2017, 02:15 AM
    Why does f(x+dx) = f(x) + d/dx f(x) dx?
    Can someone please explain (or provide a reference that explains) the following:

    Assume we have an arbitrary function f(x). If we increase x by an infinitesimal amount dx, the value of the function at x + dx is:

    f(x+dx)= f(x) + d/dx f(x) dx

    So far I have noticed that it works when I use simple formulas (e.g. f(x) = ax, where f(x+dx) = a(x+dx) = ax + a dx, which matches f(x) + d/dx f(x) dx exactly, since d/dx(ax) = a), but I don't fully understand why it holds in general.
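
    For concreteness, that check as a minimal Python sketch (the values a = 3.0, x = 2.0, and dx = 0.1 are arbitrary choices):

        # f(x) = a*x, so d/dx f(x) = a and the identity should hold exactly.
        a, x, dx = 3.0, 2.0, 0.1

        lhs = a * (x + dx)    # f(x + dx)
        rhs = a * x + a * dx  # f(x) + (d/dx f(x)) * dx
        print(lhs - rhs)      # ~0 (up to floating-point rounding)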

    Any help will be much appreciated
    jcaron2 (Senior Member) · #2 · Feb 19, 2017, 02:47 PM
    Any function, no matter how crazy or nonlinear, looks like a straight line if you zoom in far enough. When you're looking at the value of a function at two points separated by an infinitesimal distance dx, that's the ultimate manifestation of that same effect. As dx approaches zero, the function (ANY function!) looks like a straight line over that interval, and its slope is df(x)/dx. Hence, you can approximate the function over that interval with the equation of a line. If we use the point-slope form (y - y0) = slope*(x - x0), we can substitute your values:

    f(x+dx)-f(x)=d/dx f(x)*((x+dx)-x),

    which simplifies to

    f(x+dx) = f(x) + d/dx f(x) * dx.
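
    If it helps to see this numerically, here is a minimal Python sketch (the choice f(x) = sin(x) and the point x = 1.0 are arbitrary); the error of the straight-line approximation shrinks like dx**2, so it vanishes faster than dx itself as dx approaches zero:

        import math

        # Compare f(x+dx) against the linear approximation f(x) + f'(x)*dx
        # for f(x) = sin(x). The gap is the next Taylor term, about
        # -sin(x)/2 * dx**2, so each tenfold decrease in dx shrinks the
        # error roughly a hundredfold.
        x = 1.0
        for dx in (0.1, 0.01, 0.001):
            exact = math.sin(x + dx)
            linear = math.sin(x) + math.cos(x) * dx
            print(dx, exact - linear)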
