PS: This note was written to explain the problems with a question posted by a new user, whose context bewildered many of the people who attempted it, and to explain the peculiar output of the program posted in its solution. This note is in no way intended to insult the user, but rather to show him the mistakes he made in framing the question and the program.
Recently, a new user, Aman Banka, posted a Computer Science question that admitted infinitely many answers (though it was posted with the scope of a single answer only). Since the problem has now been deleted, we show a screenshot of the question he posted. It is pretty evident that the difference between the sums depends on the diagonal sums, which in turn depend on the values entered on the diagonals of the square matrix; clearly, the difference won't be a constant. As a result, user Rishabh Cool posted a report on the question pointing out exactly this. But Aman Banka didn't provide a proper explanation and didn't understand that the answer wouldn't be unique. The answer he set to the question was 1, on the claim that his program showed the difference between the left and right diagonal sums of any matrix as 1 (which obviously hints at a huge error in his program). Aman Banka then deleted this problem and posted another one, which again sounds meaningless:
> difference between the two sums must be 1 which is always a constant
After reading this, the first question that arises is: how can the difference between the sums be a constant when the matrix can be filled up in infinitely many ways, and it is those values that determine the difference between the diagonal sums? One (admittedly silly) solution is simply to assign $s2 = s1 - 1$. The only other reading is that the question is (supposedly) asking for the right diagonal sum and left diagonal sum to be printed if and only if the difference between the sums is $1$, which can easily be arrived at through this:
where $s1$ and $s2$ denote the left diagonal sum and the right diagonal sum respectively.
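The original snippets were not preserved in this copy, but a minimal C sketch of that interpretation (the function names and pointer out-parameters here are my own, not the original poster's) could look like this:

```c
#include <stdio.h>

/* Compute both diagonal sums of an n-by-n matrix.
   s1 is the left diagonal sum, s2 the right diagonal sum,
   following the note's notation. Returns s1 - s2. */
int diag_sums(int n, int a[n][n], int *s1, int *s2)
{
    *s1 = 0;
    *s2 = 0;
    for (int i = 0; i < n; i++) {
        *s1 += a[i][i];          /* left diagonal  */
        *s2 += a[i][n - 1 - i];  /* right diagonal */
    }
    return *s1 - *s2;
}

/* Print the two sums if and only if their difference is exactly 1. */
void print_if_diff_one(int n, int a[n][n])
{
    int s1, s2;
    if (diag_sums(n, a, &s1, &s2) == 1)
        printf("s1 = %d, s2 = %d\n", s1, s2);
}
```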
But this is too far-fetched. Finally, Calvin Lin posted a report marking the question for deletion, because it had not been phrased in a proper manner.
But why was the program created by Aman Banka showing a constant difference of 1 every time?
Well, why don't we take a look at the program that he posted?
(The full 106-line program is not reproduced here.)
Check lines 87–97; that is where the mistake was made. The program aims to find the left diagonal sum and right diagonal sum of a square matrix, but the method of computing the right diagonal sum is incorrect. The loop considered here is:
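The original lines 87–97 are not preserved in this copy, but from the description that follows (a decreasing loop whose condition requires both indices to stay above 0), the offending loop was presumably equivalent to this hedged reconstruction:

```c
/* Hedged reconstruction of the buggy "right diagonal" loop:
   it actually walks the LEFT diagonal upward from a[n-1][n-1],
   and the i > 0 condition stops it before a[0][0] is ever added. */
int buggy_right_diag_sum(int n, int a[n][n])
{
    int s2 = 0;
    for (int i = n - 1, j = n - 1; i > 0 && j > 0; i--, j--)
        s2 += a[i][j];   /* sums a[n-1][n-1] down to a[1][1] only */
    return s2;
}
```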
So this loop actually sums up the elements of the left diagonal only, and even then skips the $[0][0]$ element, because the loop condition requires the row index and column index to be greater than 0. The last element summed is $[1][1]$, since $1 > 0$; in the next step the index decreases to 0, which marks the end of the loop, as 0 is not greater than 0 and no longer satisfies the loop condition.
Since the "right diagonal sum" here is really the left diagonal sum without the $[0][0]$ element added to it, the difference between the left diagonal sum and the right diagonal sum will always equal the value entered at $[0][0]$. In Aman Banka's output, he had entered 1 at $[0][0]$, and as expected his difference came out to be 1. In a nutshell, the difference of the sums in this program will always equal the value entered at $[0][0]$, irrespective of the other values entered in the square matrix. For example, if you enter $x$ at $[0][0]$ and fill the other elements of the array with any numbers you wish, you will still find that the difference between the sums equals $x$.
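To see the claim concretely, here is a hedged C sketch (function name is mine) that pairs a correct left-diagonal sum with the faulty loop; the returned difference is always $a[0][0]$:

```c
/* Demonstrates the bug: s1 is the correct left diagonal sum,
   s2 is computed with the faulty loop, so s1 - s2 == a[0][0]
   no matter what the other entries are. */
int diagonal_difference_bug(int n, int a[n][n])
{
    int s1 = 0, s2 = 0;
    for (int i = 0; i < n; i++)
        s1 += a[i][i];        /* left diagonal, including a[0][0] */
    for (int i = n - 1; i > 0; i--)
        s2 += a[i][i];        /* faulty loop: a[0][0] never added  */
    return s1 - s2;
}
```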
Then, what should be the rectification?
The right diagonal of a square matrix is actually this:
As we see, for an $n \times n$ matrix we have to start adding from $[0][n-1]$; each succeeding right-diagonal element has a row index $1$ more and a column index $1$ less than the previous one, and this must run until the row index equals $n-1$, i.e., while it is less than $n$. So we can grab the intuition from here that the loop must be like this:
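In C-like code, a sketch consistent with the description above (variable names follow the note) would be:

```c
/* Correct right diagonal sum: start at a[0][n-1], then move
   down one row and left one column on every step, n steps total. */
int right_diag_sum(int n, int a[n][n])
{
    int s2 = 0;
    for (int i = 0, j = n - 1; i < n; i++, j--)
        s2 += a[i][j];
    return s2;
}
```

A single pass of $n$ iterations suffices, since the right diagonal has exactly $n$ elements.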
where $i$ and $j$ are the row and column index counters, and $s2$ is the sum of the right diagonal elements, initialized to 0.
Comments
Nice... This might clarify his misunderstanding.
I hope it does :3
What is meant by duology here?
A set or compilation of 2 things. (Duology)
for second diagonal we could also use i+j=n condition
You happen to mean $i + j = n - 1$, right? Array indices run from $0$ to $n-1$.
Your approach is certainly aiming at:
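Presumably something like this sketch (the commenter's code is not preserved; this is my reading of the $i + j = n - 1$ condition, with a function name of my own):

```c
/* Right diagonal sum via the i + j == n - 1 test: scan every
   cell of the matrix and keep only the right-diagonal ones. */
int right_diag_sum_scan(int n, int a[n][n])
{
    int s2 = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            if (i + j == n - 1)
                s2 += a[i][j];
    return s2;
}
```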
This would work, undoubtedly. But note that this block of code runs through $n^2$ iterations, making it slower than mine, which involves just $n$ iterations.
Log in to reply