This discussion board is a place to discuss our Daily Challenges and the math and science
related to those challenges. Explanations are more than just a solution — they should
explain the steps and thinking strategies that you used to obtain the solution. Comments
should further the discussion of math and science.

When posting on Brilliant:

Use the emojis to react to an explanation, whether you're congratulating a job well done or just really confused.

Ask specific questions about the challenge or the steps in somebody's explanation. Well-posed questions can add a lot to the discussion, but posting "I don't understand!" doesn't help anyone.

Try to contribute something new to the discussion, whether it is an extension, generalization or other idea related to the challenge.

Stay on topic — we're all here to learn more about math and science, not to hear about your favorite get-rich-quick scheme or current world events.

| Markdown | Appears as |
| --- | --- |
| `*italics*` or `_italics_` | *italics* |
| `**bold**` or `__bold__` | **bold** |
| `- bulleted` and `- list`, one item per line | a bulleted list |
| `1. numbered` and `2. list`, one item per line | a numbered list |

Note: you must add a full line of space before and after lists for them to show up correctly.

Not understanding entropy is okay. It is one of the hardest concepts in information theory and thermodynamics.

Entropy is the amount of information in a system.

Can it be equated to randomness? Yes! To store a more random arrangement, you need more bits. Conversely, the more ordered a system is, the lower its entropy: to store an ordered arrangement, you only need to describe the ordering rule.

Which of the following strings has the most entropy?
a) aaaaaaaaaaa
b) xyzhwkofgbf
c) aaaabbbbccc
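The intuition above can be checked numerically. Here is a minimal sketch (not from the original note) that computes the per-character Shannon entropy of each candidate string:

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Per-character Shannon entropy of a string, in bits."""
    counts = Counter(s)
    n = len(s)
    # H = -sum over symbols of p * log2(p), with p the empirical frequency.
    return -sum((c / n) * log2(c / n) for c in counts.values())

for s in ["aaaaaaaaaaa", "xyzhwkofgbf", "aaaabbbbccc"]:
    print(s, shannon_entropy(s))
```

The all-`a` string needs 0 bits per character, the scrambled string (b) comes out highest, and the partially ordered string (c) sits in between.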

## Comments

The entropy of this note is too damn high!

Haha, I got that xD xD xD Thanks To @Agnishom Chattopadhyay xD

Try answering the question on coins.

I think entropy is a measure of the number of specific ways in which a (thermodynamic) system may be arranged.

In which class are you? @Mehul Arora

@Parth Lohomi, Class 9 now. :D

@Parth Lohomi

LOL why are u following me again? xD -_-

$\Delta S=\displaystyle\int \dfrac{dQ_{\text{rev}}}{T}$

$\Delta S$ is the change in entropy, $dQ_{\text{rev}}$ is the heat absorbed reversibly, and $T$ is the absolute temperature.
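As a quick numerical illustration (the numbers below are made up, not from the thread): for a reversible isothermal process $T$ is constant, so the integral reduces to a simple quotient.

```python
# Reversible isothermal process: T is constant, so the Clausius
# integral collapses to delta_S = Q_rev / T.
Q_rev = 600.0  # heat absorbed reversibly, in joules (illustrative value)
T = 300.0      # absolute temperature, in kelvin (illustrative value)

delta_S = Q_rev / T
print(delta_S, "J/K")  # 2.0 J/K
```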

I guess the 2nd one, (b).

Correct.

How many bits of entropy are contained in a system containing two coins?
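One way to count it, a sketch assuming two fair, independent coins (four equally likely microstates):

```python
from math import log2

# Two fair, independent coins have four equally likely microstates.
states = ["HH", "HT", "TH", "TT"]
p = 1 / len(states)

# Shannon entropy of a uniform distribution over the microstates, in bits.
entropy_bits = -sum(p * log2(p) for _ in states)
print(entropy_bits)  # 2.0
```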

Have you heard of the equation $S = k \ln W$?
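Boltzmann's $S = k \ln W$ and the bit-counting view measure the same multiplicity in different units. A minimal sketch (not from the thread, reusing the two-coin microstate count as an illustrative $W$):

```python
from math import log, log2

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

W = 4  # equally likely microstates, e.g. two fair coins: HH, HT, TH, TT

S_thermo = k_B * log(W)  # Boltzmann entropy, in J/K  (~1.9e-23)
S_bits = log2(W)         # the same multiplicity counted in bits

print(S_thermo, S_bits)
```

The two quantities differ only by the constant factor $k \ln 2$ joules per kelvin per bit.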

@Parth Lohomi @Agnishom Chattopadhyay @Raghav Vaidyanathan and @Nihar Mahajan

Nope, not ever xD xD I wanted to learn the basics of entropy, like the definition and all. Once I have the base strong, I shall move to the advanced study of this topic. Thanks

Maxwell's Demon

Try this problem: it explains some interesting things about entropy from an information-theory and thermodynamics point of view.

Mehul, this might help you.

For trying to help someone, one gets downvotes. How selfish the downvoters might be! :/ -_-

xD I know! Try My Problem! The one Inspired by you :)

Sorry, I downvoted the comment because I thought an encyclopedia link would not be too useful.
