Why does the series $\sum_{n=1}^\infty\frac1n$ not converge?
Can someone give a simple explanation as to why the harmonic series
$$\sum_{n=1}^\infty\frac1n=\frac 1 1 + \frac 12 + \frac 13 + \cdots $$
doesn't converge, even though it grows very slowly?
I'd prefer an easily comprehensible explanation rather than a rigorous proof regularly found in undergraduate textbooks.
calculus sequences-and-series harmonic-numbers
asked Jul 21 '10 at 5:00 by bryn, edited Jul 27 '16 at 15:16 by haqnatural
This is not meant to be an answer but an interesting note. Suppose we denote $H(n) = 1/1 + 1/2 + \dots + 1/n$; then $H(n!) - H((n-1)!) \approx \log(n)$ for large $n$. Does this give a hint? ;)
– Roupam Ghosh
Jul 11 '11 at 4:14
Here is a weakly related question: What is a textbook, or even a popularization for the general public, that (1) discusses infinite series, but (2) does not have an explanation for the divergence of this exact series?
– GEdgar
Nov 3 '13 at 19:50
To avoid defining the logarithm, use the Cauchy condensation test to show that $\sum 1/n$ converges iff $\sum 1$ converges
– reuns
Jan 30 '16 at 23:31
These are two of my favourite papers: "The Harmonic Series Diverges Again and Again" and "More Proofs of Divergence of the Harmonic Series". See these.
– user249332
Mar 9 '16 at 22:56
If it converges, then it contradicts the dominated convergence theorem. This proof is easily comprehensible if you know the dominated convergence theorem, but that theorem is not the most comprehensible.
– Oiler
Oct 6 '16 at 23:08
21 Answers
Accepted answer (134 votes), answered Jul 21 '10 at 5:13 by AgCl:
Let's group the terms as follows:
Group $1$ : $\displaystyle\frac11\qquad$ ($1$ term)
Group $2$ : $\displaystyle\frac12+\frac13\qquad$ ($2$ terms)
Group $3$ : $\displaystyle\frac14+\frac15+\frac16+\frac17\qquad$ ($4$ terms)
Group $4$ : $\displaystyle\frac18+\frac19+\cdots+\frac1{15}\qquad$ ($8$ terms)
$\quad\vdots$
In general, group $n$ contains $2^{n-1}$ terms. But also, notice that the smallest element in group $n$ is larger than $\dfrac1{2^n}$. For example, all elements in group $2$ are larger than $\dfrac1{2^2}$. So the sum of the terms in each group is larger than $2^{n-1} \cdot \dfrac1{2^n} = \dfrac1{2}$. Since there are infinitely many groups, and the sum in each group is larger than $\dfrac1{2}$, it follows that the total sum is infinite.
This proof is often attributed to Nicole Oresme.
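As a quick numerical illustration of the grouping (my addition, not part of the original answer), the following Python sketch sums each dyadic group exactly and confirms that every group contributes more than $\frac12$:

    from fractions import Fraction

    # Group n consists of the 2^(n-1) terms 1/2^(n-1), ..., 1/(2^n - 1).
    # The argument above says each group sums to more than 1/2, so the
    # partial sum through group n exceeds n/2 and cannot stay bounded.
    total = Fraction(0)
    for n in range(1, 11):
        group_sum = sum(Fraction(1, j) for j in range(2 ** (n - 1), 2 ** n))
        total += group_sum
        print(f"group {n:2d}: sum = {float(group_sum):.4f} (> 1/2: {group_sum > Fraction(1, 2)}), running total = {float(total):.4f}")

After ten groups the running total is already above $7$, and every further group adds more than another $\frac12$, so the partial sums grow without bound.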
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
I assume you mean that group 4 has 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
Answer (37 votes), answered Jul 11 '11 at 4:08 by davidlowryduda:
There is a fantastic collection of $20$ different proofs that this series diverges. I recommend you read it (it can be found here). I especially like proof $14$, which appeals to triangular numbers for a sort of cameo role.
EDIT
It seems the original link is broken, due to the author moving to his own site. So I followed up and found the new link. In addition, the author has an extended addendum, bringing the total number of proofs to 42+.
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
Answer (23 votes), answered Nov 1 '13 at 17:07 by Khosrotash:
Let's group the terms as follows: $$A=\frac11+\frac12+\frac13+\frac14+\cdots$$
$$A=\underbrace{\left(\frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\cdots+\frac{1}{9}\right)}_{\color{red}{9\text{ terms}}}
+\underbrace{\left(\frac{1}{10}+\frac{1}{11}+\frac{1}{12}+\cdots+\frac{1}{99}\right)}_{\color{red}{90\text{ terms}}}
+\underbrace{\left(\frac{1}{100}+\frac{1}{101}+\frac{1}{102}+\cdots+\frac{1}{999}\right)}_{\color{red}{900\text{ terms}}}+\cdots$$
$$A>9 \times\frac{1}{10}+(99-10+1)\times \frac{1}{100}+(999-100+1)\times \frac{1}{1000}+\cdots$$
$$A>\frac{9}{10}+\frac{90}{100}+\frac{900}{1000}+\cdots$$
$$A>\underbrace{\frac{9}{10}+\frac{9}{10}+\frac{9}{10}+\cdots}_{\color{red}{m\text{ groups},\ m\to\infty}}\to\infty$$
Showing that $A$ diverges by grouping numbers.
Answer (22 votes), answered Jul 10 '11 at 21:16 by idmercer:
The answer given by AgCl is a classic one. And possibly pedagogically best; I don't know.
I also like the following argument. I'm not sure what students who are new to the topic will think about it.
Suppose 1 + 1/2 + 1/3 + 1/4 + ... adds up to some finite total S. Now group terms in the following way:
$$1 + \frac{1}{2} > \frac{1}{2} + \frac{1}{2} = \frac{2}{2} = 1$$
$$\frac{1}{3} + \frac{1}{4} > \frac{1}{4} + \frac{1}{4} = \frac{2}{4} = \frac{1}{2}$$
$$\frac{1}{5} + \frac{1}{6} > \frac{1}{6} + \frac{1}{6} = \frac{2}{6} = \frac{1}{3}$$
Continuing in this way, we get $S > S$, a contradiction.
Not really. From $S_n > T_n$ you can only conclude that $\lim S_n \ge \lim T_n$.
– lhf
Jul 10 '11 at 21:24
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + \dots + 1/2n$ and $T_n = 1 + 1/2 + \dots + 1/n$): we can use a better inequality, like say $S_n \ge T_n + 1/2$ (using just the first step) to conclude that $\lim S_n \ge \lim T_n + 1/2$, contradicting $S = \lim S_n = \lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
Answer (19 votes), answered Jul 10 '11 at 21:50 by Américo Tavares:
An alternative proof (translated and adapted from this comment by Filipe Oliveira, in Portuguese, posted also here). Let $f(x)=\ln(1+x)$. Then $f'(x)=\dfrac{1}{1+x}$ and $f'(0)=1$. Hence
$$\lim_{x\to 0}\dfrac{\ln(1+x)}{x}=\lim_{x\to 0}\dfrac{\ln(1+x)-\ln(1)}{x-0}=1,$$
and
$$\lim_{n\to\infty} \dfrac{\ln\left(1+\dfrac{1}{n}\right)}{\dfrac{1}{n}}=1>0.$$
So, the series $\displaystyle\sum\dfrac{1}{n}$ and $\displaystyle\sum\ln\left(1+\dfrac{1}{n}\right)$ are both convergent or divergent. Since
$$\ln\left(1+\dfrac{1}{n}\right)=\ln\left(\dfrac{n+1}{n}\right)=\ln(n+1)-\ln(n),$$
we have
$$\sum_{n=1}^N\ln\left(1+\dfrac{1}{n}\right)=\ln(N+1)-\ln(1)=\ln(N+1).$$
Thus $\displaystyle\sum_{n=1}^{\infty}\ln\left(1+\dfrac{1}{n}\right)$ is divergent and so is $\displaystyle\sum_{n=1}^{\infty}\dfrac{1}{n}$.
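A short numeric check of the telescoping (my own illustration, not part of the answer): the partial sums of $\ln\left(1+\frac1n\right)$ agree with $\ln(N+1)$, and $H_N$ sits above them because $\frac1n > \ln\left(1+\frac1n\right)$ for every $n$:

    import math

    for N in (10, 100, 1000, 10 ** 6):
        telescoped = sum(math.log(1 + 1 / n) for n in range(1, N + 1))
        H_N = sum(1 / n for n in range(1, N + 1))
        # "telescoped" matches log(N + 1) up to rounding error, and H_N exceeds both.
        print(N, round(telescoped, 6), round(math.log(N + 1), 6), round(H_N, 6))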
Answer (19 votes), answered Jul 21 '10 at 5:28 by Noah Snyder:
This is not as good an answer as AgCl's, nonetheless people may find it interesting.
If you're used to calculus then you might notice that the sum $$1+\frac{1}{2}+\frac{1}{3}+\dots+\frac{1}{n}$$ is very close to the integral from $1$ to $n$ of $\frac{1}{x}$. This definite integral is $\ln(n)$, so you should expect $1+\frac{1}{2}+\frac{1}{3}+\dots+\frac{1}{n}$ to grow like $\ln(n)$.
Although this argument can be made rigorous, it's still unsatisfying because it depends on the fact that the derivative of $\ln(x)$ is $\frac{1}{x}$, which is probably harder than the original question. Nonetheless it does illustrate a good general heuristic for quickly determining how sums behave if you already know calculus.
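A tiny numerical experiment (mine, not the answerer's) makes the heuristic concrete: the gap $H_n - \ln(n)$ settles down to the Euler-Mascheroni constant, roughly $0.5772$, so $H_n$ tracks $\ln(n)$ and never levels off:

    import math

    H = 0.0
    for n in range(1, 10 ** 6 + 1):
        H += 1 / n
        if n in (10, 1000, 10 ** 6):
            # H_n keeps pace with ln(n); the gap approaches about 0.5772.
            print(f"n = {n:>8}  H_n = {H:.6f}  ln(n) = {math.log(n):.6f}  gap = {H - math.log(n):.6f}")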
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $\frac{1}{2}$ to $n+\frac{1}{2}$ of $\frac{1}{x}$, which is $\log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
Answer (17 votes), answered Dec 27 '13 at 14:49 by Ian Mateus:
This was bumped, so I'll add a sweet proof I saw on this site. Exponentiate $H_n$ and get $$e^{H_n}=\prod_{k=1}^n e^{1/k}\gt\prod_{k=1}^n\left(1+\frac{1}{k}\right)=n+1.$$ Therefore, $H_n\gt\log(n+1)$, so we are done. We used $e^x\gt1+x$ and telescoped the resulting product.
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
Answer (13 votes), answered May 29 '12 at 13:52 by user 1357113:
Another interesting proof is based upon one of the consequences of Lagrange's mean value theorem applied to the function $\ln(x)$, namely:
$$\frac{1}{k+1} < \ln(k+1)-\ln(k)<\frac{1}{k}, \qquad k\in\mathbb{N},\ k>0.$$
Taking $k=1,2,\dots,n$ in the inequality and then summing all the relations, we get the required result.
The proof is complete.
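To spell out the summation step (my addition; it follows directly from the displayed inequality by telescoping the middle terms):
$$\sum_{k=1}^{n}\frac{1}{k+1} < \sum_{k=1}^{n}\bigl(\ln(k+1)-\ln k\bigr) < \sum_{k=1}^{n}\frac{1}{k}
\quad\Longrightarrow\quad
H_{n+1}-1 < \ln(n+1) < H_n,$$
so $H_n > \ln(n+1)\to\infty$ as $n\to\infty$.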
Answer (11 votes):
There also exists a proof for the divergence of the harmonic series that involves the Integral Test. It goes as follows.
It is possible to prove that the harmonic series diverges by comparing its sum with an improper integral. Specifically, consider the arrangement of rectangles shown in the figure of $y = \dfrac{1}{x}$:
Each rectangle is $1$ unit wide and $\frac{1}{n}$ units high, so the total area of the rectangles is the sum of the harmonic series: $$\sum\left(\text{enclosed rectangle area}\right) = \sum_{k=1}^{\infty} \dfrac{1}{k}.$$ Now, the total area under the curve is given by $$\int_{1}^{\infty} \dfrac{\mathrm{d}x}{x} = \infty.$$ Since this area is entirely contained within the rectangles, the total area of the rectangles must be infinite as well. More precisely, this proves that $$\sum_{n=1}^{k} \dfrac{1}{n} > \int_{1}^{k+1} \dfrac{\mathrm{d}x}{x} = \ln(k+1).$$ This is the backbone of what we know today as the integral test.
Interestingly, the alternating harmonic series does converge: $$\sum_{n=1}^{\infty} \dfrac{(-1)^{n+1}}{n} = \ln 2.$$ And so does the $p$-harmonic series with $p>1$.
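A quick sanity check of the displayed inequality (an illustration I added, not part of the original answer): the partial sums stay above $\ln(k+1)$ at every scale tried:

    import math

    for k in (1, 10, 100, 10 ** 5):
        H_k = sum(1 / n for n in range(1, k + 1))
        # The rectangle areas (the partial sum) dominate the area under 1/x on [1, k+1].
        print(k, round(H_k, 6), round(math.log(k + 1), 6), H_k > math.log(k + 1))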
Answer (8 votes):
Let's assume that $\sum_{n=1}^{\infty}\frac1n=:H\in \mathbb{R}$, then
$$H=\frac11+\frac12+\frac13+\frac14+\frac15+\frac16 +\ldots$$
$$H\geqslant \frac11+\frac12 +\frac14+\frac14+\frac16+\frac16+\ldots$$
$$H\geqslant \frac11+\frac12+\frac12+\frac13+\frac14+\frac15+\ldots$$
$$H\geqslant \frac12 +H \Rightarrow 0\geqslant \frac12$$
Since the last inequality doesn't hold, we can conclude that the sum doesn't converge.
Answer (8 votes):
$$\int_{0}^{\infty}e^{-nx}\,dx=\frac1n$$
$$\sum_{n=1}^{\infty}\int_{0}^{\infty}e^{-nx}\,dx=\lim_{m \to \infty}\sum_{n=1}^{m}\frac1n$$
Using the geometric series,
$$\int_{0}^{\infty}\left(\frac{1}{1-e^{-x}}-1\right)dx=\lim_{m \to \infty}H_m$$
$$\lim_{m \to \infty}H_m=\left[\ln(e^x-1)-x\right]_0^{\infty}\to\infty$$
Hm, the lower bound goes to $-\infty$ it appears.
– Simply Beautiful Art
Oct 6 '16 at 16:40
@SimpleArt The upper bound goes to $0$ and the lower goes to $+\infty$: $-\ln 0^+$
– mhd.math
Oct 7 '16 at 11:20
Oh right, duh, didn't quite use that FTOC correctly.
– Simply Beautiful Art
Oct 7 '16 at 13:19
Answer (7 votes):
Another (different) answer, by the Cauchy condensation test:
$$\sum_{n=1}^\infty \frac{1}{n} < \infty \iff \sum_{n=1}^\infty 2^n \frac{1}{2^n} = \sum_{n=1}^\infty 1< \infty$$
The latter is obviously divergent, therefore the former diverges. This is THE shortest proof there is.
Answer (7 votes):
Suppose to the contrary that the series converges.
Let $s_n$ denote the $n$-th partial sum. Since the series converges, $(s_n)$ is a Cauchy sequence. Let $\varepsilon = 1/3$; then there is some $n_0$ such that $|s_q-s_p|< 1/3$ for all $q>p\ge n_0$. Let $q=2n_0$ and $p=n_0$. Then
$$\frac{1}{3}>\bigg|\sum_{n=n_0+1}^{2n_0} \frac{1}{n}\bigg|\ge\bigg|\sum_{n=n_0+1}^{2n_0} \frac{1}{2n_0}\bigg|=\frac{1}{2},$$
a contradiction. This shows that the series diverges.
Answer (7 votes):
First suppose $\displaystyle A=\frac11+\frac12+\frac13+\frac14+\cdots$ converges;
then show that $A>A$. That's a paradox.
Ideally use Latex.
– Meow
Nov 1 '13 at 17:13
I am afraid that this approach is incorrect, since similar versions of it can be applied to convergent infinite series. The reason that it does not work is because the number of terms is infinite. All you've proven is that the second series approaches the final value faster than the first one. But whether this value is ultimately finite or not, you have not shown.
– Lucian
Jan 14 '15 at 16:32
Answer (6 votes):
A non-rigorous explanation I thought of once: consider a savings scheme where you put a dollar in your piggy bank every day. So after $n$ days, you have $n$ dollars; clearly, your savings approach infinity. On the other hand, each day you add an additional $1/n$ proportion of your existing savings, "so" (the non-rigorous step) the accumulated percentage after $n$ days is $1 + 1/2 + \cdots + 1/n$.
This can be made rigorous through the infinite product argument
$$\prod_{n = 1}^\infty \left(1 + \tfrac{1}{n}\right) < \infty \iff \sum_{n = 1}^\infty \frac{1}{n} < \infty$$
which is obtained, essentially, by taking the logarithm of the left-hand side and using the power series for $\log(1 + x)$.
Answer (6 votes):
Another answer that's very similar to others. But it's prettier, and perhaps easier to understand for the 9-th grade student who asked the same question here.
The student's question was ... does the sum equal some number $S$. But, look:
So, whatever it is, $S$ is larger than the sum of the infinite string of $\tfrac12$'s shown in the last line. No number can be this large, so $S$ can't be equal to any number. Mathematicians say that the series "diverges to infinity".
Answer (5 votes):
I think the integral test gives the most intuitive explanation. Observe that $$\int^n_1 \frac1x\, dx= \log n.$$ The sum $\displaystyle\sum^n_{k=1}\frac1k$ can be viewed as the area of $n$ rectangles of height $\frac1k$, width $1$ (with the first one having its left-hand side on the $y$-axis, and all having their bottom on the $x$-axis). The graph of $x\mapsto \frac1x$ can be drawn under these, so the sum will grow with $n$ (at least) as fast as the integral, hence will grow (at least) logarithmically.
use \log to get nice formatting for $\log$
– Tyler
Nov 1 '13 at 17:24
Answer (4 votes):
Let $H_n = \frac11 + \frac12 + \frac13 + \cdots + \frac1n$ be the $n$-th partial sum. Using Cesàro-Stolz:
$$
\lim_{n\to\infty}\frac{H_n}{\log n} = \lim_{n\to\infty}\frac{H_{n+1}-H_n}{\log(n+1)-\log n} = \lim_{n\to\infty}\frac{\frac1{n+1}}{\log(1+1/n)}
= \lim_{n\to\infty}\frac{\frac1{n+1}}{\frac1n} = 1
$$
and
$$\sum_{n=1}^\infty\frac1n = \lim_{n\to\infty}H_n = \infty.$$
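Numerically (my illustration, not part of the answer), the ratio $H_n/\log n$ does creep toward $1$, although slowly, which is what the Cesàro-Stolz computation predicts:

    import math

    H = 0.0
    for n in range(1, 10 ** 6 + 1):
        H += 1 / n
        if n in (100, 10 ** 4, 10 ** 6):
            print(f"n = {n:>8}  H_n / log(n) = {H / math.log(n):.6f}")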
Answer (2 votes):
We all know that $$\sum_{n=1}^\infty\frac1n=\frac 1 1 + \frac 12 + \frac 13 + \cdots$$ diverges and grows very slowly!! I have seen many proofs of the result but recently found the following: $$S =\frac 1 1 + \frac 12 + \frac 13 +\frac 14+ \frac 15+ \frac 16+ \cdots$$ $$> \frac 12+\frac 12+ \frac 14+ \frac 14+ \frac 16+ \frac 16+ \cdots =\frac 1 1 + \frac 12 + \frac 13 +\cdots = S.$$
In this way we see that $S > S$.
O.o This. Is. Amazing!! =)
– user378947
Dec 12 '16 at 1:18
You can also see it here math.stackexchange.com/questions/1160527/…
– user8795
Dec 12 '16 at 1:20
I have saved this to my personal The Book :) That being said... Come on! The last inequality itself is proof enough!! :P
– user378947
Dec 12 '16 at 1:31
Answer (2 votes):
Using Euler's form of the Harmonic numbers,
$$\sum_{k=1}^n\frac1k=\int_0^1\frac{1-x^n}{1-x}\,dx$$
$$\begin{align}
\lim_{n\to\infty}\sum_{k=1}^n\frac1k & =\lim_{n\to\infty}\int_0^1\frac{1-x^n}{1-x}\,dx \\
& =\int_0^1\frac1{1-x}\,dx \\
& =\lim_{p\to1^-}\Big[-\ln(1-x)\Big]_0^p \\
& \to+\infty
\end{align}$$
Using the Taylor expansion of $\ln(1-x)$,
$$-\ln(1-x)=x+\frac{x^2}2+\frac{x^3}3+\frac{x^4}4+\dots$$
$$-\ln(1-1)=1+\frac12+\frac13+\frac14+\dots$$
Using Euler's relationship between the Riemann zeta function and the Dirichlet eta function,
$$\begin{align}
\sum_{k=1}^\infty\frac1{k^s} & =\frac1{1-2^{1-s}}\sum_{k=1}^\infty\frac{(-1)^{k+1}}{k^s} \\
\sum_{k=1}^\infty\frac1k & =\frac10\sum_{k=1}^\infty\frac{(-1)^{k+1}}k\tag{$s=1$} \\
& \to+\infty
\end{align}$$
But isn't the series for $\ln(1-x)$ only valid for $-1\le x<1$?
– TheSimpliFire
Mar 17 at 9:31
Yes, but since the limit as $x\to1$ in $x^n$ is monotone, it equals the asked series, if they exist.
– Simply Beautiful Art
Mar 19 at 2:33
Answer (-3 votes):
A series converges if and only if the tail of the series tends to zero, i.e. the summation from $N$ to infinity tends to zero as $N\to\infty$. But in the case of the harmonic series, the summation from $N$ to $2N$ is larger than the smallest term (which is $1/(2N)$) times the number of terms ($N$), which yields $1/2$. So the tail clearly does not tend to zero as $N\to\infty$.
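A small numeric illustration of the same point (mine, not the answerer's): the block from $N$ to $2N$ never shrinks; it stays above $\frac12$ and in fact hovers near $\ln 2 \approx 0.693$, so the tails cannot tend to zero:

    for N in (10, 1000, 10 ** 6):
        block = sum(1 / n for n in range(N, 2 * N + 1))
        print(N, round(block, 6))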
What do you think you were adding that hasn't been addressed thoroughly?
– user223391
Jan 31 '16 at 1:05
@avid19 I added an argument that's tl;dr-proof.
– Count Iblis
Jan 31 '16 at 4:29
@ZacharySelk My proof is the best, as it's the most concise, self contained proof given. Unlike the other proofs my proof can be modified into an argument that a 6 year old could understand (with some effort).
– Count Iblis
Oct 6 '16 at 22:35
@CountIblis Your proof is mathematically identical to the six-year-earlier (and top-voted, and accepted) answer by AgCl, which also explains the situation much more clearly. (Note that bringing tails into the picture is unnecessary, and just adds a layer of complexity: we can reason about the blocks directly as AgCl does, and as you do essentially, to show that the harmonic series goes to infinity.)
– Noah Schweber
Oct 11 '16 at 1:52
add a comment |
21 Answers
21
active
oldest
votes
21 Answers
21
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
134
down vote
accepted
Let's group the terms as follows:
Group $1$ : $displaystylefrac11qquad$ ($1$ term)
Group $2$ : $displaystylefrac12+frac13qquad$($2$ terms)
Group $3$ : $displaystylefrac14+frac15+frac16+frac17qquad$($4$ terms)
Group $4$ : $displaystylefrac18+frac19+cdots+frac1{15}qquad$ ($8$ terms)
$quadvdots$
In general, group $n$ contains $2^{n-1}$ terms. But also, notice that the smallest element in group $n$ is larger than $dfrac1{2^n}$. For example all elements in group $2$ are larger than $dfrac1{2^2}$. So the sum of the terms in each group is larger than $2^{n-1} cdot dfrac1{2^n} = dfrac1{2}$. Since there are infinitely many groups, and the sum in each group is larger than $dfrac1{2}$, it follows that the total sum is infinite.
This proof is often attributed to Nicole Oresme.
8
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
2
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
1
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
2
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
1
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
|
show 2 more comments
up vote
134
down vote
accepted
Let's group the terms as follows:
Group $1$ : $displaystylefrac11qquad$ ($1$ term)
Group $2$ : $displaystylefrac12+frac13qquad$($2$ terms)
Group $3$ : $displaystylefrac14+frac15+frac16+frac17qquad$($4$ terms)
Group $4$ : $displaystylefrac18+frac19+cdots+frac1{15}qquad$ ($8$ terms)
$quadvdots$
In general, group $n$ contains $2^{n-1}$ terms. But also, notice that the smallest element in group $n$ is larger than $dfrac1{2^n}$. For example all elements in group $2$ are larger than $dfrac1{2^2}$. So the sum of the terms in each group is larger than $2^{n-1} cdot dfrac1{2^n} = dfrac1{2}$. Since there are infinitely many groups, and the sum in each group is larger than $dfrac1{2}$, it follows that the total sum is infinite.
This proof is often attributed to Nicole Oresme.
8
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
2
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
1
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
2
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
1
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
|
show 2 more comments
up vote
134
down vote
accepted
up vote
134
down vote
accepted
Let's group the terms as follows:
Group $1$ : $displaystylefrac11qquad$ ($1$ term)
Group $2$ : $displaystylefrac12+frac13qquad$($2$ terms)
Group $3$ : $displaystylefrac14+frac15+frac16+frac17qquad$($4$ terms)
Group $4$ : $displaystylefrac18+frac19+cdots+frac1{15}qquad$ ($8$ terms)
$quadvdots$
In general, group $n$ contains $2^{n-1}$ terms. But also, notice that the smallest element in group $n$ is larger than $dfrac1{2^n}$. For example all elements in group $2$ are larger than $dfrac1{2^2}$. So the sum of the terms in each group is larger than $2^{n-1} cdot dfrac1{2^n} = dfrac1{2}$. Since there are infinitely many groups, and the sum in each group is larger than $dfrac1{2}$, it follows that the total sum is infinite.
This proof is often attributed to Nicole Oresme.
Let's group the terms as follows:
Group $1$ : $displaystylefrac11qquad$ ($1$ term)
Group $2$ : $displaystylefrac12+frac13qquad$($2$ terms)
Group $3$ : $displaystylefrac14+frac15+frac16+frac17qquad$($4$ terms)
Group $4$ : $displaystylefrac18+frac19+cdots+frac1{15}qquad$ ($8$ terms)
$quadvdots$
In general, group $n$ contains $2^{n-1}$ terms. But also, notice that the smallest element in group $n$ is larger than $dfrac1{2^n}$. For example all elements in group $2$ are larger than $dfrac1{2^2}$. So the sum of the terms in each group is larger than $2^{n-1} cdot dfrac1{2^n} = dfrac1{2}$. Since there are infinitely many groups, and the sum in each group is larger than $dfrac1{2}$, it follows that the total sum is infinite.
This proof is often attributed to Nicole Oresme.
edited Jul 29 '14 at 5:35
Tunk-Fey
22.9k868100
22.9k868100
answered Jul 21 '10 at 5:13
AgCl
3,90753134
3,90753134
8
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
2
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
1
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
2
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
1
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
|
show 2 more comments
8
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
2
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
1
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
2
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
1
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
8
8
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
+1: This is nice: it's easy to turn this into a rigorous proof, and it even gives you a lower bound for the order of growth!
– Simon Nickerson
Jul 21 '10 at 5:19
2
2
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
I assume you mean that group 4 as 8 terms? Or do you mean to go all the way to 1/23?
– Tomas Aschan
Jul 21 '10 at 7:37
1
1
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
Is there a closed-form function for this value?
– John Gietzen
Jul 21 '10 at 18:29
2
2
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
Interestingly, this proof goes as far back as Nicole Oresme in the 14th century. Wikipedia has a nice display of this proof [en.wikipedia.org/wiki/Harmonic_series_%28mathematics%29]
– Neil Mayhew
Jul 22 '10 at 13:20
1
1
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
@John: There's no explicit closed-form, but they're generally known as the Harmonic Numbers; there are a number of identities involving them (how to sum them or sum multiples of them, etc.)
– Steven Stadnicki
Jul 10 '11 at 21:23
|
show 2 more comments
up vote
37
down vote
There is a fantastic collection of $20$ different proofs that this series diverges. I recommend you read it (it can be found here). I especially like proof $14$, which appeals to triangular numbers for a sort of cameo role.
EDIT
It seems the original link is broken, due to the author moving to his own site. So I followed up and found the new link. In addition, the author has an extended addendum, bringing the total number of proofs to 42+.
2
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
add a comment |
up vote
37
down vote
There is a fantastic collection of $20$ different proofs that this series diverges. I recommend you read it (it can be found here). I especially like proof $14$, which appeals to triangular numbers for a sort of cameo role.
EDIT
It seems the original link is broken, due to the author moving to his own site. So I followed up and found the new link. In addition, the author has an extended addendum, bringing the total number of proofs to 42+.
2
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
add a comment |
up vote
37
down vote
up vote
37
down vote
There is a fantastic collection of $20$ different proofs that this series diverges. I recommend you read it (it can be found here). I especially like proof $14$, which appeals to triangular numbers for a sort of cameo role.
EDIT
It seems the original link is broken, due to the author moving to his own site. So I followed up and found the new link. In addition, the author has an extended addendum, bringing the total number of proofs to 42+.
There is a fantastic collection of $20$ different proofs that this series diverges. I recommend you read it (it can be found here). I especially like proof $14$, which appeals to triangular numbers for a sort of cameo role.
EDIT
It seems the original link is broken, due to the author moving to his own site. So I followed up and found the new link. In addition, the author has an extended addendum, bringing the total number of proofs to 42+.
edited Nov 3 '13 at 18:26
answered Jul 11 '11 at 4:08
davidlowryduda♦
73.8k7116248
73.8k7116248
2
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
add a comment |
2
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
2
2
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Proof 6 is also nice.
– leonbloy
Mar 15 '13 at 20:51
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
Apparently, the list has been updated.
– David Mitra
Jun 19 '13 at 16:15
add a comment |
up vote
23
down vote
Let's group the terms as follows:$$A=frac11+frac12+frac13+frac14+cdots\ $$
$$
A=underbrace{(frac{1}{1}+frac{1}{2}+frac{1}{3}+cdots+frac{1}{9})}_{color{red} {9- terms}}
+underbrace{(frac{1}{10}+frac{1}{11}+frac{1}{12}+cdots+frac{1}{99})}_{color{red} {90- terms}}\+underbrace{(frac{1}{101}+frac{1}{102}+frac{1}{103}+cdots+frac{1}{999})}_{color{red} {900- terms}}+cdots \ to $$
$$\A>9 times(frac{1}{10})+(99-10+1)times frac{1}{100}+(999-100+1)times frac{1}{1000}+... \A>frac{9}{10}+frac{90}{100}+frac{90}{100}+frac{900}{1000}+...\ to A>underbrace{frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+...}_{color{red} {text{ m group} ,text{ and} space mto infty}} to infty
$$
Showing that $A$ diverges by grouping numbers.
add a comment |
up vote
23
down vote
Let's group the terms as follows:$$A=frac11+frac12+frac13+frac14+cdots\ $$
$$
A=underbrace{(frac{1}{1}+frac{1}{2}+frac{1}{3}+cdots+frac{1}{9})}_{color{red} {9- terms}}
+underbrace{(frac{1}{10}+frac{1}{11}+frac{1}{12}+cdots+frac{1}{99})}_{color{red} {90- terms}}\+underbrace{(frac{1}{101}+frac{1}{102}+frac{1}{103}+cdots+frac{1}{999})}_{color{red} {900- terms}}+cdots \ to $$
$$\A>9 times(frac{1}{10})+(99-10+1)times frac{1}{100}+(999-100+1)times frac{1}{1000}+... \A>frac{9}{10}+frac{90}{100}+frac{90}{100}+frac{900}{1000}+...\ to A>underbrace{frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+...}_{color{red} {text{ m group} ,text{ and} space mto infty}} to infty
$$
Showing that $A$ diverges by grouping numbers.
add a comment |
up vote
23
down vote
up vote
23
down vote
Let's group the terms as follows:$$A=frac11+frac12+frac13+frac14+cdots\ $$
$$
A=underbrace{(frac{1}{1}+frac{1}{2}+frac{1}{3}+cdots+frac{1}{9})}_{color{red} {9- terms}}
+underbrace{(frac{1}{10}+frac{1}{11}+frac{1}{12}+cdots+frac{1}{99})}_{color{red} {90- terms}}\+underbrace{(frac{1}{101}+frac{1}{102}+frac{1}{103}+cdots+frac{1}{999})}_{color{red} {900- terms}}+cdots \ to $$
$$\A>9 times(frac{1}{10})+(99-10+1)times frac{1}{100}+(999-100+1)times frac{1}{1000}+... \A>frac{9}{10}+frac{90}{100}+frac{90}{100}+frac{900}{1000}+...\ to A>underbrace{frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+...}_{color{red} {text{ m group} ,text{ and} space mto infty}} to infty
$$
Showing that $A$ diverges by grouping numbers.
Let's group the terms as follows:$$A=frac11+frac12+frac13+frac14+cdots\ $$
$$
A=underbrace{(frac{1}{1}+frac{1}{2}+frac{1}{3}+cdots+frac{1}{9})}_{color{red} {9- terms}}
+underbrace{(frac{1}{10}+frac{1}{11}+frac{1}{12}+cdots+frac{1}{99})}_{color{red} {90- terms}}\+underbrace{(frac{1}{101}+frac{1}{102}+frac{1}{103}+cdots+frac{1}{999})}_{color{red} {900- terms}}+cdots \ to $$
$$\A>9 times(frac{1}{10})+(99-10+1)times frac{1}{100}+(999-100+1)times frac{1}{1000}+... \A>frac{9}{10}+frac{90}{100}+frac{90}{100}+frac{900}{1000}+...\ to A>underbrace{frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+frac{9}{10}+...}_{color{red} {text{ m group} ,text{ and} space mto infty}} to infty
$$
Showing that $A$ diverges by grouping numbers.
edited Jul 28 '17 at 8:31
answered Nov 1 '13 at 17:07
Khosrotash
16.8k12361
16.8k12361
add a comment |
add a comment |
up vote
22
down vote
The answer given by AgCl is a classic one. And possibly pedagogically best; I don't know.
I also like the following argument. I'm not sure what students who are new to the topic will think about it.
Suppose 1 + 1/2 + 1/3 + 1/4 + ... adds up to some finite total S. Now group terms in the following way:
$$1 + frac{1}{2} > frac{1}{2} + frac{1}{2} = frac{2}{2} = 1$$
$$frac{1}{3} + frac{1}{4} > frac{1}{4} + frac{1}{4} = frac{2}{4} = frac{1}{2}$$
$$frac{1}{5} + frac{1}{6} > frac{1}{6} + frac{1}{6} = frac{2}{6} = frac{1}{3}$$
Continuing in this way, we get $S > S$, a contradiction.
1
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
6
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
add a comment |
up vote
22
down vote
The answer given by AgCl is a classic one. And possibly pedagogically best; I don't know.
I also like the following argument. I'm not sure what students who are new to the topic will think about it.
Suppose 1 + 1/2 + 1/3 + 1/4 + ... adds up to some finite total S. Now group terms in the following way:
$$1 + frac{1}{2} > frac{1}{2} + frac{1}{2} = frac{2}{2} = 1$$
$$frac{1}{3} + frac{1}{4} > frac{1}{4} + frac{1}{4} = frac{2}{4} = frac{1}{2}$$
$$frac{1}{5} + frac{1}{6} > frac{1}{6} + frac{1}{6} = frac{2}{6} = frac{1}{3}$$
Continuing in this way, we get $S > S$, a contradiction.
1
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
6
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
add a comment |
up vote
22
down vote
up vote
22
down vote
The answer given by AgCl is a classic one. And possibly pedagogically best; I don't know.
I also like the following argument. I'm not sure what students who are new to the topic will think about it.
Suppose 1 + 1/2 + 1/3 + 1/4 + ... adds up to some finite total S. Now group terms in the following way:
$$1 + frac{1}{2} > frac{1}{2} + frac{1}{2} = frac{2}{2} = 1$$
$$frac{1}{3} + frac{1}{4} > frac{1}{4} + frac{1}{4} = frac{2}{4} = frac{1}{2}$$
$$frac{1}{5} + frac{1}{6} > frac{1}{6} + frac{1}{6} = frac{2}{6} = frac{1}{3}$$
Continuing in this way, we get $S > S$, a contradiction.
The answer given by AgCl is a classic one. And possibly pedagogically best; I don't know.
I also like the following argument. I'm not sure what students who are new to the topic will think about it.
Suppose 1 + 1/2 + 1/3 + 1/4 + ... adds up to some finite total S. Now group terms in the following way:
$$1 + frac{1}{2} > frac{1}{2} + frac{1}{2} = frac{2}{2} = 1$$
$$frac{1}{3} + frac{1}{4} > frac{1}{4} + frac{1}{4} = frac{2}{4} = frac{1}{2}$$
$$frac{1}{5} + frac{1}{6} > frac{1}{6} + frac{1}{6} = frac{2}{6} = frac{1}{3}$$
Continuing in this way, we get $S > S$, a contradiction.
edited Mar 15 '13 at 20:15
Dominic Michaelis
17.6k43570
17.6k43570
answered Jul 10 '11 at 21:16
idmercer
1,3231225
1,3231225
1
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
6
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
add a comment |
1
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
6
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
1
1
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
Not really. From $S_n > T_n$ you can only conclude that $lim S_n ge lim T_n$.
– lhf
Jul 10 '11 at 21:24
6
6
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
@lhf: That's right, but that can be easily fixed here (with $S_n = 1 + 1/2 + dots + 1/2n$ and $T_n = 1 + 1/2 + dots + 1/n$): we can use a better inequality, like say $S_n ge T_n + 1/2$ (using just the first step) to conclude that $lim S_n ge lim T_n + 1/2$, contradicting $S = lim S_n = lim T_n$.
– ShreevatsaR
Jul 11 '11 at 4:18
add a comment |
up vote
19
down vote
An alternative proof (translated and adapted from this comment by Filipe Oliveira, in Portuguese, posted also here). Let $ f(x)=ln(1+x)$. Then $f'(x)=dfrac {1}{1+x}$ and $ f'(0)=1$. Hence
$$displaystylelim_{xto 0}dfrac{ln(1+x)}{x}=lim_{xto 0}dfrac{ln(1+x)-ln(1)}{x-0}=1,$$
and
$$ displaystylelim_{ntoinfty} dfrac{lnleft(1+dfrac{1}{n}right)}{dfrac {1}{n}}=1>0.$$
So, the series $displaystylesumdfrac{1}{n}$ and $displaystylesumlnleft(1+dfrac {1}{n}right)$ are both convergent or divergent. Since
$$lnleft(1+dfrac {1}{n}right)=lnleft(dfrac{n+1}{n}right)=ln (n+1)-ln(n),$$
we have
$$displaystylesum_{n=1}^Nlnleft(1+dfrac {1}{n}right)=ln(N+1)-ln(1)=ln(N+1).$$
Thus $displaystylesum_{n=1}^{infty}lnleft(1+dfrac {1}{n}right)$ is divergent and so is $displaystylesum_{n=1}^{infty}dfrac{1}{n}$.
add a comment |
up vote
19
down vote
An alternative proof (translated and adapted from this comment by Filipe Oliveira, in Portuguese, posted also here). Let $ f(x)=ln(1+x)$. Then $f'(x)=dfrac {1}{1+x}$ and $ f'(0)=1$. Hence
$$displaystylelim_{xto 0}dfrac{ln(1+x)}{x}=lim_{xto 0}dfrac{ln(1+x)-ln(1)}{x-0}=1,$$
and
$$ displaystylelim_{ntoinfty} dfrac{lnleft(1+dfrac{1}{n}right)}{dfrac {1}{n}}=1>0.$$
So, the series $displaystylesumdfrac{1}{n}$ and $displaystylesumlnleft(1+dfrac {1}{n}right)$ are both convergent or divergent. Since
$$lnleft(1+dfrac {1}{n}right)=lnleft(dfrac{n+1}{n}right)=ln (n+1)-ln(n),$$
we have
$$displaystylesum_{n=1}^Nlnleft(1+dfrac {1}{n}right)=ln(N+1)-ln(1)=ln(N+1).$$
Thus $displaystylesum_{n=1}^{infty}lnleft(1+dfrac {1}{n}right)$ is divergent and so is $displaystylesum_{n=1}^{infty}dfrac{1}{n}$.
add a comment |
up vote
19
down vote
up vote
19
down vote
An alternative proof (translated and adapted from this comment by Filipe Oliveira, in Portuguese, posted also here). Let $ f(x)=ln(1+x)$. Then $f'(x)=dfrac {1}{1+x}$ and $ f'(0)=1$. Hence
$$displaystylelim_{xto 0}dfrac{ln(1+x)}{x}=lim_{xto 0}dfrac{ln(1+x)-ln(1)}{x-0}=1,$$
and
$$ displaystylelim_{ntoinfty} dfrac{lnleft(1+dfrac{1}{n}right)}{dfrac {1}{n}}=1>0.$$
So, the series $displaystylesumdfrac{1}{n}$ and $displaystylesumlnleft(1+dfrac {1}{n}right)$ are both convergent or divergent. Since
$$lnleft(1+dfrac {1}{n}right)=lnleft(dfrac{n+1}{n}right)=ln (n+1)-ln(n),$$
we have
$$displaystylesum_{n=1}^Nlnleft(1+dfrac {1}{n}right)=ln(N+1)-ln(1)=ln(N+1).$$
Thus $displaystylesum_{n=1}^{infty}lnleft(1+dfrac {1}{n}right)$ is divergent and so is $displaystylesum_{n=1}^{infty}dfrac{1}{n}$.
An alternative proof (translated and adapted from this comment by Filipe Oliveira, in Portuguese, posted also here). Let $ f(x)=ln(1+x)$. Then $f'(x)=dfrac {1}{1+x}$ and $ f'(0)=1$. Hence
$$displaystylelim_{xto 0}dfrac{ln(1+x)}{x}=lim_{xto 0}dfrac{ln(1+x)-ln(1)}{x-0}=1,$$
and
$$ displaystylelim_{ntoinfty} dfrac{lnleft(1+dfrac{1}{n}right)}{dfrac {1}{n}}=1>0.$$
So, the series $displaystylesumdfrac{1}{n}$ and $displaystylesumlnleft(1+dfrac {1}{n}right)$ are both convergent or divergent. Since
$$lnleft(1+dfrac {1}{n}right)=lnleft(dfrac{n+1}{n}right)=ln (n+1)-ln(n),$$
we have
$$displaystylesum_{n=1}^Nlnleft(1+dfrac {1}{n}right)=ln(N+1)-ln(1)=ln(N+1).$$
Thus $displaystylesum_{n=1}^{infty}lnleft(1+dfrac {1}{n}right)$ is divergent and so is $displaystylesum_{n=1}^{infty}dfrac{1}{n}$.
answered Jul 10 '11 at 21:50
Américo Tavares
32.2k1079202
32.2k1079202
add a comment |
add a comment |
up vote
19
down vote
This is not as good an answer as AgCl's, nonetheless people may find it interesting.
If you're used to calculus then you might notice that the sum $$
1+frac{1}{2}+frac{1}{3}+dots+frac{1}{n}$$ is very close to the integral from $1$ to $n$ of $frac{1}{x}$. This definite integral is ln(n), so you should expect $1+frac{1}{2}+frac{1}{3}+dots+
frac{1}{n}$ to grow like $ln(n)$.
Although this argument can be made rigorous, it's still unsatisfying because it depends on the fact that the derivative of $ln(x)$ is $frac{1}{x}$, which is probably harder than the original question. Nonetheless it does illustrate a good general heuristic for quickly determining how sums behave if you already know calculus.
2
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
add a comment |
up vote
19
down vote
This is not as good an answer as AgCl's, nonetheless people may find it interesting.
If you're used to calculus then you might notice that the sum $$
1+frac{1}{2}+frac{1}{3}+dots+frac{1}{n}$$ is very close to the integral from $1$ to $n$ of $frac{1}{x}$. This definite integral is ln(n), so you should expect $1+frac{1}{2}+frac{1}{3}+dots+
frac{1}{n}$ to grow like $ln(n)$.
Although this argument can be made rigorous, it's still unsatisfying because it depends on the fact that the derivative of $ln(x)$ is $frac{1}{x}$, which is probably harder than the original question. Nonetheless it does illustrate a good general heuristic for quickly determining how sums behave if you already know calculus.
2
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
add a comment |
up vote
19
down vote
up vote
19
down vote
This is not as good an answer as AgCl's, nonetheless people may find it interesting.
If you're used to calculus then you might notice that the sum $$
1+frac{1}{2}+frac{1}{3}+dots+frac{1}{n}$$ is very close to the integral from $1$ to $n$ of $frac{1}{x}$. This definite integral is ln(n), so you should expect $1+frac{1}{2}+frac{1}{3}+dots+
frac{1}{n}$ to grow like $ln(n)$.
Although this argument can be made rigorous, it's still unsatisfying because it depends on the fact that the derivative of $ln(x)$ is $frac{1}{x}$, which is probably harder than the original question. Nonetheless it does illustrate a good general heuristic for quickly determining how sums behave if you already know calculus.
This is not as good an answer as AgCl's, nonetheless people may find it interesting.
If you're used to calculus then you might notice that the sum $$
1+frac{1}{2}+frac{1}{3}+dots+frac{1}{n}$$ is very close to the integral from $1$ to $n$ of $frac{1}{x}$. This definite integral is ln(n), so you should expect $1+frac{1}{2}+frac{1}{3}+dots+
frac{1}{n}$ to grow like $ln(n)$.
Although this argument can be made rigorous, it's still unsatisfying because it depends on the fact that the derivative of $ln(x)$ is $frac{1}{x}$, which is probably harder than the original question. Nonetheless it does illustrate a good general heuristic for quickly determining how sums behave if you already know calculus.
edited Mar 15 '13 at 19:33
Dominic Michaelis
17.6k43570
17.6k43570
answered Jul 21 '10 at 5:28
Noah Snyder
7,47722854
7,47722854
2
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
add a comment |
2
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
2
2
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
If you look at a Riemann sum for intervals with width 1, you can pretty quickly see that the integral of 1/x from 1 to infinity must be less than the sum of the harmonic series.
– Isaac
Jul 21 '10 at 5:51
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
Thank you for adding this answer. I was hoping to avoid an answer that involved integration, so I also prefer AgCl's answer. But I am happy to see more than one demonstration/proof.
– bryn
Jul 22 '10 at 11:33
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
The sum is closer to the integral from $frac{1}{2}$ to $n+frac{1}{2}$ of $frac{1}{x}$, which is $log(2n+1)$ math.stackexchange.com/a/1602945/134791
– Jaume Oliver Lafont
Jan 25 '16 at 23:02
add a comment |
up vote
17
down vote
This was bumped, so I'll add a proof sweet proof I saw in this site. Exponentiate $H_n$ and get $$e^{H_n}=prod_{k=1}^n e^{1/k}gtprod_{k=1}^nleft(1+frac{1}{k}right)=n+1.$$ Therefore, $H_ngtlog(n+1)$, so we are done. We used $e^xgt1+x$ and telescoped the resulting product.
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
add a comment |
up vote
17
down vote
This was bumped, so I'll add a proof sweet proof I saw in this site. Exponentiate $H_n$ and get $$e^{H_n}=prod_{k=1}^n e^{1/k}gtprod_{k=1}^nleft(1+frac{1}{k}right)=n+1.$$ Therefore, $H_ngtlog(n+1)$, so we are done. We used $e^xgt1+x$ and telescoped the resulting product.
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
add a comment |
up vote
17
down vote
up vote
17
down vote
This was bumped, so I'll add a proof sweet proof I saw in this site. Exponentiate $H_n$ and get $$e^{H_n}=prod_{k=1}^n e^{1/k}gtprod_{k=1}^nleft(1+frac{1}{k}right)=n+1.$$ Therefore, $H_ngtlog(n+1)$, so we are done. We used $e^xgt1+x$ and telescoped the resulting product.
This was bumped, so I'll add a proof sweet proof I saw in this site. Exponentiate $H_n$ and get $$e^{H_n}=prod_{k=1}^n e^{1/k}gtprod_{k=1}^nleft(1+frac{1}{k}right)=n+1.$$ Therefore, $H_ngtlog(n+1)$, so we are done. We used $e^xgt1+x$ and telescoped the resulting product.
answered Dec 27 '13 at 14:49
Ian Mateus
4,66032452
4,66032452
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
add a comment |
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
Oh, that's unique.
– Simply Beautiful Art
Oct 7 '16 at 1:12
add a comment |
up vote
13
down vote
Another interesting proof is based upon one of the consequences of the Lagrange's theorem applied on $ln(x)$ function, namely:
$$frac{1}{k+1} < ln(k+1)-ln(k)<frac{1}{k} space , space kinmathbb{N} ,space k>0$$
Taking $k=1,2,...,n$ values to the inequality and then summing all relations, we get the required result.
The proof is complete.
add a comment |
up vote
13
down vote
Another interesting proof is based upon one of the consequences of the Lagrange's theorem applied on $ln(x)$ function, namely:
$$frac{1}{k+1} < ln(k+1)-ln(k)<frac{1}{k} space , space kinmathbb{N} ,space k>0$$
Taking $k=1,2,...,n$ values to the inequality and then summing all relations, we get the required result.
The proof is complete.
add a comment |
up vote
13
down vote
up vote
13
down vote
Another interesting proof is based upon one of the consequences of the Lagrange's theorem applied on $ln(x)$ function, namely:
$$frac{1}{k+1} < ln(k+1)-ln(k)<frac{1}{k} space , space kinmathbb{N} ,space k>0$$
Taking $k=1,2,...,n$ values to the inequality and then summing all relations, we get the required result.
The proof is complete.
Another interesting proof is based upon one of the consequences of the Lagrange's theorem applied on $ln(x)$ function, namely:
$$frac{1}{k+1} < ln(k+1)-ln(k)<frac{1}{k} space , space kinmathbb{N} ,space k>0$$
Taking $k=1,2,...,n$ values to the inequality and then summing all relations, we get the required result.
The proof is complete.
edited Oct 26 '12 at 13:03
answered May 29 '12 at 13:52
user 1357113
22.2k875224
22.2k875224
add a comment |
add a comment |
up vote
11
down vote
There also exists a proof for the divergence of the harmonic series that involves the Integral Test. It goes as follows.
It is possible to prove that the harmonic series diverges by comparing its sum with an improper integral. Specifically, consider the arrangement of rectangles of width $1$ drawn above the graph of $y = \dfrac{1}{x}$:
Each rectangle is $1$ unit wide and $\frac{1}{n}$ units high, so the total area of the rectangles is the sum of the harmonic series: $$\sum\left(\text{enclosed rectangle area}\right) = \sum_{k=1}^{\infty}\frac{1}{k}.$$ Now, the total area under the curve is given by $$\int_{1}^{\infty}\frac{\mathrm{d}x}{x} = \infty.$$ Since this area is entirely contained within the rectangles, the total area of the rectangles must be infinite as well. More precisely, this proves that $$\sum_{n=1}^{k}\frac{1}{n} > \int_{1}^{k+1}\frac{\mathrm{d}x}{x} = \ln(k+1).$$ This is the backbone of what we know today as the integral test.
Interestingly, the alternating harmonic series does converge: $$\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n} = \ln 2,$$ and so does the $p$-series $\sum 1/n^p$ with $p>1$.
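A quick numerical illustration of that last remark (a sketch, not part of the original answer): the alternating partial sums settle near $\ln 2$ while the plain harmonic partial sums keep growing.

    import math

    N = 100000
    harmonic = 0.0
    alternating = 0.0
    for n in range(1, N + 1):
        harmonic += 1.0 / n
        alternating += (-1.0) ** (n + 1) / n

    print("harmonic partial sum    =", harmonic)     # ~ ln N + 0.577, still growing
    print("alternating partial sum =", alternating)  # close to ln 2
    print("ln 2                    =", math.log(2))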
answered Nov 1 '13 at 17:38
Ahaan S. Rungta
up vote
8
down vote
Let's assume that $\sum_{n=1}^{\infty}\frac1n=:H\in\mathbb{R}$, then
$$H=\frac11+\frac12+\frac13+\frac14+\frac15+\frac16+\ldots$$
$$H\geqslant\frac11+\frac12+\frac14+\frac14+\frac16+\frac16+\ldots$$
$$H\geqslant\frac11+\frac12+\frac12+\frac13+\frac14+\frac15+\ldots$$
$$H\geqslant\frac12+H\;\Rightarrow\;0\geqslant\frac12$$
Since the last inequality doesn't hold, we can conclude that the sum doesn't converge.
answered Feb 8 '15 at 22:01
Rasmus Erlemann
up vote
8
down vote
$$\int_{0}^{\infty}e^{-nx}\,dx=\frac1n$$
$$\sum_{n=1}^{\infty}\int_{0}^{\infty}e^{-nx}\,dx=\lim_{m\to\infty}\sum_{n=1}^{m}\frac1n$$
Using the geometric series,
$$\int_{0}^{\infty}\left(\frac{1}{1-e^{-x}}-1\right)dx=\lim_{m\to\infty}H_m$$
$$\lim_{m\to\infty}H_m=\left[\ln(e^x-1)-x\right]_0^{\infty}\to\infty$$
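To see where the divergence enters (an illustrative sketch, not part of the original answer): the antiderivative $\ln(e^x-1)-x$ tends to $0$ at the upper limit and to $-\infty$ as $x\to0^+$, so cutting the integral off at a small $\varepsilon>0$ makes it blow up roughly like $-\ln\varepsilon$.

    import math

    def F(x):
        # antiderivative of 1/(1 - e^{-x}) - 1, namely ln(e^x - 1) - x
        return math.log(math.expm1(x)) - x

    upper = 50.0  # F(50) is numerically indistinguishable from 0
    for eps in (1e-1, 1e-3, 1e-6, 1e-9):
        print(eps, F(upper) - F(eps), -math.log(eps))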
Hm, the lower bound goes to $-infty$ it appears.
– Simply Beautiful Art
Oct 6 '16 at 16:40
@SimpleArt The upper bound goes to 0 and The Lower goes to $+infty$ ,,,$-ln 0^+$
– mhd.math
Oct 7 '16 at 11:20
Oh right, duh, didn't quite use that FTOC correctly.
– Simply Beautiful Art
Oct 7 '16 at 13:19
answered Dec 27 '13 at 14:10
mhd.math
up vote
7
down vote
Another (different) answer, by the Cauchy condensation test:
$$\sum_{n=1}^\infty \frac{1}{n} < \infty \iff \sum_{n=1}^\infty 2^n\,\frac{1}{2^n} = \sum_{n=1}^\infty 1<\infty$$
The latter is obviously divergent, therefore the former diverges. This is THE shortest proof there is.
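The idea behind the condensation can also be seen directly in the terms (a small illustrative sketch): each dyadic block $\frac1{2^k}+\dots+\frac1{2^{k+1}-1}$ contributes more than $\frac12$, so the partial sums grow without bound.

    # sum of 1/n over each dyadic block [2^k, 2^(k+1)): always > 1/2
    for k in range(10):
        block = sum(1.0 / n for n in range(2 ** k, 2 ** (k + 1)))
        print(k, block)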
answered Mar 11 '14 at 4:46
Squirtle
up vote
7
down vote
Suppose to the contrary that the series converges.
Let $s_n$ denote the $n$-th partial sum. Since the series converges, $(s_n)$ is a Cauchy sequence. Let $\varepsilon = 1/3$; then there is some $n_0$ such that $|s_q-s_p|< 1/3$ for all $q>p\ge n_0$. Let $q=2n_0$ and $p=n_0$. Then
$$\frac{1}{3}>\bigg|\sum_{n=n_0+1}^{2n_0} \frac{1}{n}\bigg|\ge\bigg|\sum_{n=n_0+1}^{2n_0} \frac{1}{2n_0}\bigg|=\frac{1}{2},$$
a contradiction. Hence the series diverges.
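A small numerical illustration of the violated Cauchy condition (a sketch, not part of the original answer): the block $s_{2n_0}-s_{n_0}$ never drops below $\tfrac12$, no matter how large $n_0$ is, so it certainly never gets under $\varepsilon=\tfrac13$.

    def block(n0):
        # s_{2 n0} - s_{n0} = sum of 1/n for n = n0 + 1 .. 2 n0
        return sum(1.0 / n for n in range(n0 + 1, 2 * n0 + 1))

    for n0 in (10, 1000, 100000):
        print(n0, block(n0))  # increases towards ln 2 ~ 0.693, always >= 1/2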
answered Mar 11 '14 at 5:34
Jose Antonio
up vote
7
down vote
First suppose $\displaystyle A=\frac11+\frac12+\frac13+\frac14+\cdots$ converges;
then one can show that $A>A$, which is a contradiction.
11
Ideally use Latex.
– Meow
Nov 1 '13 at 17:13
2
I am afraid that this approach is incorrect, since similar versions of it can be applied to convergent infinite series. The reason that it does not work is because the number of terms is infinite. All you've proven is that the second series approaches the final value faster than the first one. But whether this value is ultimately finite or not, you have not shown.
– Lucian
Jan 14 '15 at 16:32
answered Nov 1 '13 at 17:02
Khosrotash
up vote
6
down vote
A non-rigorous explanation I thought of once: consider a savings scheme where you put a dollar in your piggy bank every day. So after $n$ days, you have $n$ dollars; clearly, your savings approach infinity. On the other hand, each day you add an additional $1/n$ proportion of your existing savings, "so" (the non-rigorous step) the accumulated percentage after $n$ days is $1 + 1/2 + \cdots + 1/n$.
This can be made rigorous through the infinite product argument
$$\prod_{n = 1}^\infty \left(1 + \tfrac{1}{n}\right) < \infty \iff \sum_{n = 1}^\infty \frac{1}{n} < \infty$$
which is obtained, essentially, by taking the logarithm of the left-hand side and using the power series for $\log(1 + x)$.
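The product itself telescopes, $\prod_{k=1}^{n}\bigl(1+\tfrac1k\bigr)=\frac21\cdot\frac32\cdots\frac{n+1}{n}=n+1$, so both sides of the equivalence visibly diverge. A tiny Python sketch of this (illustrative only):

    def savings_factor(n):
        # prod_{k=1..n} (1 + 1/k); telescopes to exactly n + 1
        p = 1.0
        for k in range(1, n + 1):
            p *= 1.0 + 1.0 / k
        return p

    for n in (5, 50, 5000):
        print(n, savings_factor(n))  # equals n + 1 up to rounding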
answered Nov 3 '13 at 18:37
Ryan Reich
up vote
6
down vote
Another answer that's very similar to others. But it's prettier, and perhaps easier to understand for the ninth-grade student who asked the same question here.
The student's question was ... does the sum equal some number $S$? But, look:
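(The original picture is not reproduced here; presumably it showed the standard grouping, roughly as follows.)
$$S=1+\frac12+\underbrace{\frac13+\frac14}_{>\,1/2}+\underbrace{\frac15+\frac16+\frac17+\frac18}_{>\,1/2}+\underbrace{\frac19+\cdots+\frac1{16}}_{>\,1/2}+\cdots>\frac12+\frac12+\frac12+\frac12+\cdots$$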
So, whatever it is, $S$ is larger than the sum of the infinite string of $\tfrac12$'s shown in the last line. No number can be this large, so $S$ can't be equal to any number. Mathematicians say that the series "diverges to infinity".
answered Sep 25 '16 at 2:17
bubba
up vote
5
down vote
I think the integral test gives the most intuitive explanation. Observe that $$\int^n_1 \frac1x\, dx= \log n.$$ The sum $\sum^n_{k=1}\frac1k$ can be viewed as the area of $n$ rectangles of height $\frac1k$ and width $1$ (with the first one having its left-hand side on the $y$-axis, and all having their bottoms on the $x$-axis). The graph of $x\mapsto \frac1x$ can be drawn under these, so the sum grows with $n$ (at least) as fast as the integral, hence (at least) logarithmically.
use \log to get nice formatting for $\log$
– Tyler
Nov 1 '13 at 17:24
answered Nov 1 '13 at 17:09
Matt Rigby
up vote
4
down vote
Let $H_n = \frac11 + \frac12 + \frac13 + \cdots + \frac1n$ be the $n$-th partial sum. Using Cesàro–Stolz:
$$
\lim_{n\to\infty}\frac{H_n}{\log n} = \lim_{n\to\infty}\frac{H_{n+1}-H_n}{\log(n+1)-\log n} = \lim_{n\to\infty}\frac{\frac1{n+1}}{\log(1+1/n)}
= \lim_{n\to\infty}\frac{\frac1{n+1}}{\frac1n} = 1
$$
and
$$\sum_{n=1}^\infty\frac1n = \lim_{n\to\infty}H_n = \infty.$$
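A quick numerical look at this ratio (a sketch, not part of the original answer); the convergence is slow, which is exactly the "grows very slowly" phenomenon from the question.

    import math

    H = 0.0
    n = 0
    for target in (10 ** 2, 10 ** 3, 10 ** 4, 10 ** 5, 10 ** 6):
        while n < target:
            n += 1
            H += 1.0 / n
        print(n, H / math.log(n))  # ratio creeps towards 1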
answered Jan 30 '15 at 12:37
Martín-Blas Pérez Pinilla
up vote
2
down vote
We all know that $$\sum_{n=1}^\infty\frac1n=\frac 1 1 + \frac 12 + \frac 13 + \cdots $$ diverges and grows very slowly!! I have seen many proofs of the result but recently found the following: $$S =\frac 1 1 + \frac 12 + \frac 13 +\frac 14+ \frac 15+ \frac 16+ \cdots$$ $$> \frac 12+\frac 12+ \frac 14+ \frac 14+ \frac 16+ \frac 16+ \cdots =\frac 1 1 + \frac 12 + \frac 13 +\cdots = S.$$
In this way we see that $S > S$.
O.o This. Is. Amazing!! =)
– user378947
Dec 12 '16 at 1:18
You can also see it here math.stackexchange.com/questions/1160527/…
– user8795
Dec 12 '16 at 1:20
I have saved this to my personal The Book :) That being said... Come on! The last inequality itself is proof enough!! :P
– user378947
Dec 12 '16 at 1:31
answered Feb 22 '15 at 19:25
user8795
up vote
2
down vote
Using Euler's form of the harmonic numbers,
$$\sum_{k=1}^n\frac1k=\int_0^1\frac{1-x^n}{1-x}\,dx$$
$$\begin{align}
\lim_{n\to\infty}\sum_{k=1}^n\frac1k & =\lim_{n\to\infty}\int_0^1\frac{1-x^n}{1-x}\,dx \\
& =\int_0^1\frac1{1-x}\,dx \\
& =\lim_{p\to1^-}\Big[-\ln(1-x)\Big]_0^p \\
& \to+\infty
\end{align}$$
Using the Taylor expansion of $\ln(1-x)$,
$$-\ln(1-x)=x+\frac{x^2}2+\frac{x^3}3+\frac{x^4}4+\dots$$
$$-\ln(1-1)=1+\frac12+\frac13+\frac14+\dots$$
Using Euler's relationship between the Riemann zeta function and the Dirichlet eta function,
$$\begin{align}
\sum_{k=1}^\infty\frac1{k^s} & =\frac1{1-2^{1-s}}\sum_{k=1}^\infty\frac{(-1)^{k+1}}{k^s} \\
\sum_{k=1}^\infty\frac1k & =\frac10\sum_{k=1}^\infty\frac{(-1)^{k+1}}k\tag{$s=1$} \\
& \to+\infty
\end{align}$$
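As a sanity check of Euler's integral form (an illustrative sketch only; the midpoint rule is just one convenient way to approximate the integral):

    def harmonic(n):
        return sum(1.0 / k for k in range(1, n + 1))

    def euler_integral(n, steps=200000):
        # midpoint-rule approximation of  integral_0^1 (1 - x^n)/(1 - x) dx
        h = 1.0 / steps
        total = 0.0
        for i in range(steps):
            x = (i + 0.5) * h
            total += (1.0 - x ** n) / (1.0 - x)
        return total * h

    for n in (5, 20, 100):
        print(n, harmonic(n), euler_integral(n))  # the two columns agree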
But isn't the series for $\ln(1-x)$ only valid for $-1\le x<1$?
– TheSimpliFire
Mar 17 at 9:31
Yes, but since the limit as $x\to1$ in $x^n$ is monotone, it equals the asked series, if they exist.
– Simply Beautiful Art
Mar 19 at 2:33
answered Oct 6 '16 at 16:57
Simply Beautiful Art
up vote
-3
down vote
A series converges if and only if its tails tend to zero, i.e. the sum from $N$ to infinity tends to zero as $N\to\infty$. But in the case of the harmonic series, the sum from $N$ to $2N$ is at least the smallest term, $\frac{1}{2N}$, times the number of terms (more than $N$), which yields at least $\frac12$. So the tails clearly do not tend to zero as $N\to\infty$.
2
What do you think you were adding that hasn't been addressed thoroughly?
– user223391
Jan 31 '16 at 1:05
@avid19 I added an argument that's tl;dr-proof.
– Count Iblis
Jan 31 '16 at 4:29
@ZacharySelk My proof is the best, as it's the most concise, self contained proof given. Unlike the other proofs my proof can be modified into an argument that a 6 year old could understand (with some effort).
– Count Iblis
Oct 6 '16 at 22:35
2
@CountIblis Your proof is mathematically identical to the six-year-earlier (and top-voted, and accepted) answer by AgCl, which also explains the situation much more clearly. (Note that bringing tails into the picture is unnecessary, and just adds a layer of complexity: we can reason about the blocks directly as AgCl does, and as you do essentially, to show that the harmonic series goes to infinity.)
– Noah Schweber
Oct 11 '16 at 1:52
answered Jan 31 '16 at 0:07
Count Iblis