Not understanding entropy is okay. It is one of the hardest concepts in information theory and thermodynamics.

Entropy is the amount of information needed to describe the state of a system.

Can it be equated to randomness? Yes! To store a more random arrangement, you need more bits. Conversely, the more ordered a system is, the lower its entropy: an ordered arrangement can be described by a short rule instead of a full listing of every element.

Which of the following strings has the most entropy?
a) aaaaaaaaaaa
b) xyzhwkofgbf
c) aaaabbbbccc
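The intuition above can be checked directly. The note doesn't specify a formula, so this sketch assumes plain per-symbol Shannon entropy of each string's character distribution:

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy of a string's character distribution, in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The all-'a' string needs 0 bits per symbol; the near-random string (b)
# scores highest; the partially ordered string (c) falls in between.
for s in ["aaaaaaaaaaa", "xyzhwkofgbf", "aaaabbbbccc"]:
    print(s, round(shannon_entropy(s), 3))
```
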


## Comments

The entropy of this note is too damn high!


Haha, I got that xD xD xD Thanks to @Agnishom Chattopadhyay xD


Try answering the question on coins.


I think entropy is a measure of the number of specific ways in which a system (a thermodynamic system, in this context) may be arranged.


In which class are you? @Mehul Arora


@Parth Lohomi , Class 9 Now. :D


@Parth Lohomi

LOL, why are you following me again? xD -_-


$\Delta S=\displaystyle\int \dfrac{dQ_{\text{rev}}}{T}$

$\Delta S$ is the change in entropy, $dQ_{\text{rev}}$ is the heat absorbed in a reversible process, and $T$ is the absolute temperature.
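As a sketch of how the Clausius integral is actually evaluated: if the heat capacity is constant, $dQ_{\text{rev}} = mc\,dT$ and the integral reduces to $mc\ln(T_2/T_1)$. The water example below is illustrative, not from the thread:

```python
import math

def entropy_change(m: float, c: float, T1: float, T2: float) -> float:
    """Delta S = integral of dQ_rev / T with dQ_rev = m*c*dT,
    which evaluates to m*c*ln(T2/T1), in J/K (assumes constant c)."""
    return m * c * math.log(T2 / T1)

# Illustrative: heating 1 kg of water (c ~ 4186 J/(kg*K)) from 293 K to 373 K
# raises its entropy; cooling would give a negative Delta S.
dS = entropy_change(1.0, 4186.0, 293.0, 373.0)
```
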



I guess the second one, (b).


Correct.

How many bits of entropy are contained in a system containing two coins?
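For reference, the coin question can be checked with the same idea used on the strings: two fair coins have $2 \times 2 = 4$ equally likely states, and a uniform distribution over $N$ outcomes carries $\log_2 N$ bits. This sketch assumes the coins are fair and independent:

```python
import math

def uniform_entropy_bits(n_outcomes: int) -> float:
    """Entropy of a uniform distribution over n equally likely outcomes, in bits."""
    return math.log2(n_outcomes)

# Two fair, independent coins: 4 equally likely states -> log2(4) = 2 bits.
bits = uniform_entropy_bits(4)
```
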


Have you heard of the equation $S = k \ln W$?
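That is Boltzmann's entropy formula, where $W$ is the number of microstates and $k$ is Boltzmann's constant. A minimal sketch, using the exact SI value of $k$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """S = k * ln(W) for W equally probable microstates, in J/K."""
    return K_B * math.log(W)

# A single microstate (perfect order) gives zero entropy;
# more microstates means more entropy.
s0 = boltzmann_entropy(1)
```
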


@Parth Lohomi @Agnishom Chattopadhyay @Raghav Vaidyanathan and @Nihar Mahajan

Nope, not ever xD xD I wanted to learn the basics of entropy, like the definition and all. Once I have a strong base, I shall move on to advanced study of this topic. Thanks!

Try this problem: Maxwell's Demon. It explains some interesting things about entropy from an information-theory and thermodynamic point of view.


Mehul, this might help you.


One gets downvotes for trying to help someone. How selfish the downvoters must be! :/ -_-


xD I know! Try my problem, the one inspired by you :)


Sorry, I downvoted the comment because I thought an encyclopedia link would not be too useful.
