If a computer can print a line containing all 26 letters of the alphabet in 0.01 seconds, estimate how long it would take to print all possible permutations of the alphabet.

There are \(26!\) possible permutations of the alphabet. Each permutation is 26 letters long, so it fills exactly one line, and the printer must print \(26!\) lines in total. At 0.01 seconds per line, that takes \(26! \cdot \frac {1}{100}=\frac {26!}{100}\) seconds, which is roughly \(4.03 \times 10^{24}\) seconds.
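A quick sketch of the arithmetic (the 365.25-day year used for the conversion is my own assumption, not part of the problem):

```python
import math

# Total permutations of the 26-letter alphabet; each fills one printed line.
permutations = math.factorial(26)

# Rate from the problem: one line every 0.01 seconds.
seconds = permutations * 0.01

# Convert to millennia, assuming a 365.25-day year.
seconds_per_millennium = 60 * 60 * 24 * 365.25 * 1000
millennia = seconds / seconds_per_millennium

print(f"{seconds:.3e} seconds")      # about 4.03e24 seconds
print(f"{millennia:.3e} millennia")  # about 1.28e14 millennia
```

The two printed figures match the estimates discussed in the comments below.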

## Comments



\( 26! \times 0.1\) seconds?


Which is approximately \(4 \times 10^{24}\) seconds, or about \(1.28 \times 10^{14}\) millennia.

\( 26! \times 0.01 \) seconds
