
 $$\mathrm{The}~\varepsilon-\delta~\mathrm{blog}$$


Calculus : Lecture 1 - General discussion, history and infinitesimals


This is the first "lecture" in a series of lectures on calculus.

This will be a general discussion about calculus: why we need it, why it's not just a bunch of formulae to be memorized but rather something to be deeply understood and explored, like everything else in mathematics, how it all began, what the two general approaches to it are, how those two approaches compare against each other, and which one I prefer and why.


A concise history

Note : As the heading suggests, this isn't the full history of calculus, just a summary of sorts to give a basic idea of its development. For a complete history, please visit this Wikipedia page about the history of calculus.

Mathematicians, even those from ancient times, have been interested in changes in the values of functions and in finding areas and volumes. This is how the field of calculus was born.

Calculus originated with the need to find areas of figures and slopes of tangents to non-circular curves. The calculus that deals with slopes of tangents, which are also interpreted as "instantaneous" rates of change, is known as "differential calculus", and the one that deals with finding areas under curves and volumes of certain shapes is known as "integral calculus".

The ancient Greeks used something called "the method of exhaustion" to prove statements related to areas of figures. It was used to deduce that $A_{\mathrm{circle}} \propto r^2$ too. That was most probably where it all began. It was, to some extent, a basic idea of limits. The ancient Greeks are also known to have made use of infinitesimals, i.e. infinitely small numbers.

Archimedes is said to be the first to find the slope of a tangent to a non-circular curve, which was a rough idea of differential calculus.

In the medieval period, calculus was further worked upon in India and the Islamic Middle East.

A lot of other mathematicians worked on improving and further exploring calculus, inventing new ways of finding the slope of a tangent at a point on a function, the maximum and minimum values attained by a function (maxima and minima), and areas under curves, but a major breakthrough came in the late $17^{\mathrm{th}}$ century when Sir Isaac Newton and Gottfried Wilhelm Leibniz researched calculus.


Newton believed that calculus could play a crucial part in describing the motion of bodies, and Leibniz took on the tangent problem, which involved finding the slopes of tangents to curves.

Newton supposedly began working on calculus around 1665, when he was just 22 years old. He was developing mechanics (now known as Newtonian mechanics), but the mathematics available back then wasn't sufficient, so he decided to develop the mathematics on his own. He didn't publish his results until much later, though.

Leibniz supposedly began working around 1674 and published his first paper on calculus in 1684, titled "Nova Methodus pro Maximis et Minimis" (no, it's not the name of a smartphone, it's Latin), which translates to "A new method for maxima and minima".

Newton's supporters accused Leibniz of plagiarizing Newton's unpublished works and the fierce Newton-Leibniz controversy began. Newton was appointed PRS (President of the Royal Society) in 1703 and this gave him an advantage over Leibniz in the controversy. The "battle" lasted for over a decade till Leibniz's death in 1716.

It is now believed that both Newton and Leibniz independently developed calculus.


Newton proposed his "method of fluxions" for finding the "instantaneous" rate of change of a function, which is now known as the "derivative". He did it in the context of mechanics, so the rates of change he cared about were usually taken with respect to time, what we would now call the "time derivative", i.e. the "instantaneous rate of change of a function with respect to time". He called what we now call the "derivative" a "fluxion". He is also said to have used the term "fluent" for what we now call a function. He made use of an infinitely small quantity in his method of fluxions, i.e. an infinitesimal.

For some fluent $f$, Newton denoted its fluxion by $\dot f$. For instance, if the fluent $x$ is the position of a body as a function of time, its fluxion $\dot x$ is what we would now call the velocity.


Leibniz also made use of infinitesimals and called the "instantaneous" rate of change "the derivative". His derivative and integral notations are still widely used today.

Much like Newton, he thought of the derivative as the ratio of the change in the value of a function to the corresponding change in the variable, for an infinitesimal change in the variable. He used the notation $\dfrac{\mathrm df(x)}{\mathrm dx}$ to denote the derivative of a function $f$ at $x$.

The notation $\Delta x$ is often used to denote the change in $x$, i.e. the difference of two values of it $(x_2-x_1)$. The $\mathrm d$ in the notation probably stands for the first letter of the name of the $\Delta$ symbol, delta. It is possible that it stands for "difference" too.

Anyway, $\mathrm dx$ denotes a very small $\Delta x$. Similarly, $\mathrm df(x)$ denotes a very small $\Delta f(x)$.


So, basically, the derivative of a function $f$ at $x$ was then defined as follows

$$\dfrac{\mathrm df(x)}{\mathrm dx} = \dfrac{f(x+\Delta x)-f(x)}{\Delta x}$$

for an "infinitesimal" $\Delta x$. Newton also used the alphabet $o$ to denote the infinitesimal quantity.


Leibniz's notation for an integral was the summa symbol $\bigg(\displaystyle\int\bigg)$. He used $\displaystyle\int_a^bf(x)\,\mathrm dx$ to denote the area under the curve $y = f(x)$ over the interval $[a,b]$, i.e. from $x=a$ to $x=b$.
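
To get a feel for what the notation means, take $f(x)=x$ on the interval $[0,b]$ (again, a toy choice of mine). The region under $y=x$ from $x=0$ to $x=b$ is a right triangle with base $b$ and height $b$, so

$$\int_0^b x\,\mathrm dx = \dfrac{b^2}{2},$$

which is just the area of that triangle, with no calculus machinery needed yet.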


Now that the derivative is commonly thought of as an operator rather than a ratio (more details later in the post), the notation $\dfrac{\mathrm df(x)}{\mathrm dx}$ is often written as $\dfrac{\mathrm d}{\mathrm dx}f(x)$ and $\dfrac{\mathrm d}{\mathrm dx}$ is thought of as an operator that takes in functions and returns their derivatives.
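
In this operator style, the toy computation from before would be written as

$$\dfrac{\mathrm d}{\mathrm dx}\big(x^2\big) = 2x,$$

read as "the operator $\dfrac{\mathrm d}{\mathrm dx}$ applied to $x^2$", rather than as a literal ratio of two infinitesimal quantities.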


Another common notation to denote the derivative of $f$ at $x$ is $f'(x)$, which is "pronounced" as "$f$ prime".


So far, we've observed the rather magnificent development of calculus over the centuries from a rough idea to something with vast applications nowadays.


Roughly quoting Grant Sanderson from the YouTube channel $\mathrm{3Blue1Brown}$ (I initially learned calculus from his videos; they're great if you pay attention and are truly devoted to learning): "A lot of real-life situations are modeled with the use of calculus and derivatives, and that's why we need to know how to evaluate them".

Calculus also has vast applications in other fields of mathematics and various sciences, as I stated earlier.

That is why we need to understand it.


By the way, most mathematicians are more interested in something called "pure mathematics" rather than applied mathematics. They do it for the fun of it rather than for its applications, which is doubtlessly exhilarating too. I'm saying this to tell you that there might not always be a purpose to exploring some field of mathematics. Sometimes it's all for the sake of fun. I think it's worth mentioning that things from pure mathematics may also turn out to have applications. Number theory was considered a part of pure mathematics for a long time, but it turned out to have vast applications in cryptography.

Something interesting : At higher levels, mathematicians even study higher dimensions like 4D, 5D and so on!


As for why calculus isn't just a bunch of formulae to be memorized but rather something to be understood and explored: I don't have a clue why people think calculus (or anything in mathematics) is supposed to be memorized rather than understood. The very subject of mathematics is based on logical and critical thinking, not pointless memorization. Even if you aren't planning to become a mathematician, understanding mathematics can still vastly help you. It can teach you how to approach problems, even those in real life, and train your brain to think logically.

Another thing to keep in mind is that there is no point in "solving" questions in mathematics if you don't even know what you're doing. Don't think that you "understand" math just because you know the formulae and are able to "solve" the questions from a particular textbook. Anyone who knows algebra can apply the formulae, but understanding is something different from knowing. It really infuriates me when someone claims to understand it after just having memorized the formulae and "solved" questions from a book.


Now, coming back to calculus, the idea of infinitesimals was discarded because infinitesimal numbers can't exist in the real number system, whereas the limit approach (I will discuss the limit approach more soon) does work in the standard real number system.

The reason I don't like infinitesimals is that they give an idea of approximation, which makes me uncomfortable, no matter how small the error is.

Defining derivatives using infinitesimals gives me a sense of approximation whereas defining them using limits gives me a sense of accuracy.
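
Just as a preview (the details are for the next post), the limit approach defines the very same derivative as

$$\dfrac{\mathrm df(x)}{\mathrm dx} = \lim_{\Delta x \to 0}\dfrac{f(x+\Delta x)-f(x)}{\Delta x}$$

where $\Delta x$ always stays an ordinary non-zero real number and we ask what value the ratio approaches, so nothing "infinitely small" and no dropped error term ever appears.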

By the way, there is a whole part of mathematics devoted to rigorously defining infinitesimals and then approaching calculus using them. It's called "non-standard analysis".


I'd like to end this post here.


In my next post, I will begin by explaining the idea of limits, (hopefully) their rigorous definition (the $\varepsilon-\delta$ definition), how they give a sense of accuracy unlike infinitesimals, etc.


$\text{~}\rho\alpha\xi\delta\epsilon\epsilon\pi~\sigma\iota\nu\delta\eta\theta$

Comments

  1. I didn't know you had a blog! --khaxan

    Replies
    1. I mean it's completely inactive, so it's as good as not having one.
      Plus, I did this when I was a lil kid. I'm old now 👴
      I would re-read this but I'm afraid I might die of cringe.


