# Introduction to Entropy

Hi everyone. I learnt about entropy today, but I didn't quite seem to understand it. Can anyone explain it to me? Thanks in advance!

Note by Mehul Arora
3 years, 2 months ago


The entropy of this note is too damn high!

- 3 years, 2 months ago

Haha, I got that xD xD xD Thanks to @Agnishom Chattopadhyay xD

- 3 years, 2 months ago

Try answering the question on coins.

Staff - 3 years, 2 months ago

I think entropy is a measure of the number of specific ways in which a system (keep in mind that the system is thermodynamic) may be arranged.

- 3 years, 2 months ago

In which class are you? @Mehul Arora

- 3 years, 2 months ago

@Parth Lohomi, Class 9 now. :D

- 3 years, 2 months ago

Preparing for RMO? Best of luck.... [I will not be there to congratulate you at that time :(.....]

- 3 years, 2 months ago

Best of Luck to you! :D :D

- 3 years, 2 months ago

Are you done with the explanation?

- 3 years, 2 months ago

Not yet. :/

- 3 years, 2 months ago

Let's tell you more

$$\Delta S=\displaystyle\int \dfrac{dQ_{\text{rev}}}{T}$$

$$\Delta S$$ is the change in entropy, and $$dQ_{\text{rev}}$$ is the heat absorbed reversibly...

$$T$$ is the absolute temperature

- 3 years, 2 months ago
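As a worked instance of the formula above (the numbers are illustrative, not from this thread): in a reversible isothermal process the temperature is constant, so the integral collapses to a simple ratio. Melting one mole of ice at $$273\ \text{K}$$ absorbs roughly $$6010\ \text{J}$$, which gives

$$\Delta S = \int \dfrac{dQ_{\text{rev}}}{T} = \dfrac{Q_{\text{rev}}}{T} \approx \dfrac{6010\ \text{J}}{273\ \text{K}} \approx 22\ \text{J/K}$$

so the entropy of the water increases as the ordered crystal melts into liquid.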

Went tangentially over my head xD

- 3 years, 2 months ago

Typo should be "tangentially"

- 3 years, 2 months ago


Haha...

- 3 years, 2 months ago

I see!! XD...

- 3 years, 2 months ago

Hey Parth! Accept my request on FB. Thanks!

- 3 years, 2 months ago

Hey, sorry. I am not too much of an online guy... so I'm inactive, but I will surely accept it when I open it next time!!

- 3 years, 2 months ago

You happy now? C'mon, find the reason..

- 3 years, 2 months ago

Find the reason? For what?

- 3 years, 2 months ago

In which class are you

- 3 years, 2 months ago

I entered 10th std. BTW, how did your age change from 14 to 13? xD

- 3 years, 2 months ago

Because I am actually 13 XD

- 3 years, 2 months ago

Oh! How are you so intelligent even though you're so young? You will crack IIT for sure with a good rank. :)

- 3 years, 2 months ago

I am not "so intelligent"... just an average guy. And for the IIT stuff, I will try my best... (see my status)

- 3 years, 2 months ago

Being level 5 in calculus at the age of 13! That's so extraordinary! :P

- 3 years, 2 months ago

Calculus is my favourite topic in mathematics..

- 3 years, 2 months ago

Will you create your own messageboard? It would be easier for all of us to communicate there.

- 3 years, 2 months ago

Ah, yes, sure! (First I want 1000 followers XD)

- 3 years, 2 months ago


LOL, why are you following me again? xD -_- @Parth Lohomi

- 3 years, 2 months ago

Not understanding entropy is okay. It is the hardest concept in information theory and thermodynamics.

Entropy is the amount of information in a system.

Can it be equated to randomness? Yes! Because to store more random arrangements, you need more bits. However, the more order there is, the less the entropy. Why? Because to store the arrangement, you just need to know the order.

Which of the following strings has more entropy? a) aaaaaaaaaaa b) xyzhwkofgbf c) aaaabbbbccc

Staff - 3 years, 2 months ago
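The string comparison above can be checked numerically. Here is a minimal Python sketch (the function name is mine, not from the thread) that estimates the Shannon entropy of each string in bits per character from its character frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Shannon entropy in bits per character, from character frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

for s in ["aaaaaaaaaaa", "xyzhwkofgbf", "aaaabbbbccc"]:
    print(s, round(shannon_entropy(s), 3))
```

The all-`a` string scores 0 bits (it is perfectly predictable), the mostly-distinct string (b) scores highest, and the partly-ordered string (c) falls in between, matching the answer below.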

I guess the 2nd one, (b).

- 3 years, 2 months ago

Correct.

How many bits of entropy are contained in a system containing two coins?

Staff - 3 years, 2 months ago

- 3 years, 2 months ago

Kayrrect!

Have you heard of the equation S= K ln(W)?

Staff - 3 years, 2 months ago
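The two-coin question above has a short numerical answer: two fair coins have $$2 \times 2 = 4$$ equally likely outcomes, and for equally likely outcomes the entropy is $$\log_2 W$$ bits. A minimal Python sketch:

```python
from itertools import product
from math import log2

# Two fair coins: enumerate all equally likely outcomes (HH, HT, TH, TT).
outcomes = list(product("HT", repeat=2))
print(len(outcomes))   # 4

# For W equally likely outcomes, the entropy is log2(W) bits.
entropy_bits = log2(len(outcomes))
print(entropy_bits)    # 2.0
```

This connects directly to Boltzmann's formula $$S = k \ln W$$: with $$W = 4$$ microstates and $$k \approx 1.38 \times 10^{-23}\ \text{J/K}$$, the thermodynamic entropy would be $$S = k \ln 4 \approx 1.91 \times 10^{-23}\ \text{J/K}$$; the information-theoretic version just measures the same $$\ln W$$ in base-2 units.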

Nope, not ever xD xD I wanted to learn the basics of entropy, like the definition and all. Once I have a strong base, I shall move on to advanced study of this topic. Thanks @Parth Lohomi @Agnishom Chattopadhyay @Raghav Vaidyanathan and @Nihar Mahajan

- 3 years, 2 months ago

Try this problem: Maxwell's Demon

It explains some interesting things about entropy from an information-theoretic and thermodynamic point of view.

Staff - 3 years, 2 months ago

- 3 years, 2 months ago

For trying to help someone, one gets downvotes. How selfish the downvoters might be! :/ -_-

- 3 years, 2 months ago

Sorry, I downvoted the comment because I thought an encyclopedia link would not be too useful.

Staff - 3 years, 2 months ago

It might be useful for someone else! -_-

- 3 years, 2 months ago

I agree. I just expressed my opinion. Thank you for sharing the article anyway :)

Staff - 3 years, 2 months ago

xD I know! Try my problem! The one inspired by you :)

- 3 years, 2 months ago

I almost got it .... am working on it.. :)

- 3 years, 2 months ago