For an integer \(a > 1\), can \(\sum_{n=1}^{k} \sqrt[a]{n}\) ever be an integer for any integer \(k > 1\)?

For \(a=2\), the answer is most likely no. However, I don't have a single clue how to prove this even for \(a=2\), let alone the generalization!

Is there a definitive answer for all integer \(a\)? If so, how?

(It might have something to do with calculus; it looks like some kind of series...)
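Not a proof, of course, but a quick numerical experiment can at least show that the partial sums never come suspiciously close to an integer for small \(k\). Here is a short Python sketch (the function names are my own, just for illustration; floating-point arithmetic can only suggest, never prove, irrationality):

```python
import math

def partial_sums(a, kmax):
    """Yield (k, S_k) where S_k = sum of n**(1/a) for n = 1..k."""
    s = 0.0
    for n in range(1, kmax + 1):
        s += n ** (1.0 / a)
        yield n, s

def min_int_distance(a, kmax):
    """Smallest distance from any partial sum S_k (k >= 2) to the
    nearest integer.  Floating-point only: a heuristic, not a proof."""
    best = float("inf")
    for k, s in partial_sums(a, kmax):
        if k >= 2:
            best = min(best, abs(s - round(s)))
    return best

print(min_int_distance(2, 1000))
```

In practice the minimum distance stays well away from zero, which is consistent with the conjecture that the sum is never an integer.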



## Comments


It doesn't make sense to me; do you mean the \(n\) to be an \(x\)? Otherwise, you are just multiplying the root by the value of \(k\).


Thanks for pointing that out. I didn't have much time when I was typing it up.

Any ideas?


I agree that the answer is probably no, but I'm not sure how to prove it.


While doing some research about this, I came across this blog post from almost a decade ago, which applies to your problem for \(a = 2.\) As for higher values of \(a,\) I haven't found anything yet, but perhaps the methods in the blog post can help.


Yes, I believe this result implies the sum for \(a=2\) can never be an integer. Suppose \(\sum_{k=1}^n \sqrt{k} = N\) for an integer \(N\). Set \(s_1 = 1-N,\ r_1 = 1,\) and write \(k = r_k s_k^2\) with \(r_k\) squarefree for \(2 \leq k \leq n\). Then \(\sum_{k=1}^n s_k \sqrt{r_k} = 0\), contradicting the result in the blog.
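For what it's worth, the decomposition \(k = r_k s_k^2\) with \(r_k\) squarefree used above is easy to compute by stripping out square factors. A small Python sketch (the function name is mine):

```python
def squarefree_decompose(n):
    """Write n = r * s**2 with r squarefree; return (r, s)."""
    s = 1
    d = 2
    m = n
    while d * d <= m:
        # divide out every square factor d*d, collecting d into s
        while m % (d * d) == 0:
            m //= d * d
            s *= d
        d += 1
    return m, s  # m is the squarefree part r

print(squarefree_decompose(12))  # 12 = 3 * 2**2, so (3, 2)
```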


Thanks! I’ll check it out.


It might be easier to think about whether the sum can ever be rational for \(k > 1\). Consider a pair of irrational numbers \(I, J \in \mathbb{R} \setminus \mathbb{Q}\). Then \(I+J\) is rational if and only if \(I = R-J\) for some \(R \in \mathbb{Q}\).

Now the square root of an integer is irrational if at least one prime factor of that integer has odd multiplicity: writing \(n = p_1^{e_1} \cdot p_2^{e_2} \cdots p_j^{e_j}\) with \(j\) primes of multiplicities \(e_i\) \((1 \le i \le j)\), \(\sqrt{n}\) is irrational whenever some \(e_i\) is odd. In fact, the \(a^{\text{th}}\) root of an integer is irrational provided the multiplicity of at least one of its prime factors is not a multiple of \(a\): \[\sqrt[a]{n} = \sqrt[a]{p_1^{e_1} \cdot p_2^{e_2} \cdots p_j^{e_j}},\] where at least one \(e_i\) \((1 \le i \le j)\) is not a multiple of \(a\).

In particular, the \(a^{\text{th}}\) root of any prime is irrational for integers \(a > 1\), so \(\sqrt[a]{2} \notin \mathbb{Q}\). Without even considering whether the other terms of this series are rational or not, if \(\sqrt[a]{n} \neq R - \sqrt[a]{2}\) for \(a, n > 1\) and every rational \(R\), then this sum can never be rational (let alone an integer!)

This 'proof' definitely rests on a few assumptions, such as the irrationality of the \(a^{\text{th}}\) roots of primes for integers \(a > 1\), and the claim that for \(a, n > 1\) we have \(\sqrt[a]{n} \neq R - \sqrt[a]{p}\) for every rational \(R\) and prime \(p\); but I suppose you could try to prove those individually if you're looking for a rigorous proof.
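The first of those assumptions is easy to check computationally: by the rational root theorem, \(\sqrt[a]{n}\) is rational exactly when it is an integer, i.e. when every prime exponent in \(n\)'s factorization is a multiple of \(a\). A Python sketch (the function name is mine, and it assumes \(a > 1\)):

```python
def nth_root_is_rational(n, a):
    """True iff the a-th root of the positive integer n is rational,
    i.e. every prime exponent in n's factorization is a multiple of a
    (in which case the root is actually an integer).  Assumes a > 1."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            e = 0
            while n % d == 0:
                n //= d
                e += 1
            if e % a != 0:
                return False
        d += 1
    # any leftover n > 1 is a prime with exponent 1, never a multiple of a > 1
    return n == 1

print(nth_root_is_rational(8, 3))   # 8 = 2**3, cube root is 2: True
print(nth_root_is_rational(12, 2))  # sqrt(12) is irrational: False
```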


I think this is about series.
