If \(\tan(xy) = xy\), show that \(y' = -y/x\).

These are my steps: \(xy' + y = \sec^2(xy) \times (xy' + y)\), which implies \(\sec^2(xy) = 1\) [after cancelling \((xy' + y)\) on both sides].

And then I don't know how to proceed. Help!



## Comments


You did it correctly up to that point. The mistake was cancelling the \( xy' + y \) term. Remember: you can't cancel a factor from both sides unless you're sure it's not zero.

So you've got \( xy' + y = \sec^2(xy) \times (xy' + y) \)

\( (xy' + y)(\sec^2(xy) - 1) = 0 \)

So this implies \( \sec^2(xy) - 1 = 0 \) or \( xy' + y = 0 \).

Case 1 - \(\sec^2(xy) = 1 \) implies \( \tan^2(xy) = \sec^2(xy) - 1 = 0 \), so \( \tan(xy) = 0 \). Since \( \tan(xy) = xy \), this gives \( xy = 0 \), i.e. \( x = 0 \) or \( y = 0 \).

If \( x = 0 \), then \( \dfrac{-y}{x} \) is undefined, so there is nothing to prove in that case.

If \( y = 0 \), then \( y' = 0 = \dfrac{-y}{x} \).

Case 2 - \( (xy' + y) = 0 \implies xy' = -y \implies y' = \dfrac{-y}{x} \), which is what you wanted to prove.
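As a quick sanity check (an illustrative sketch using SymPy, not part of the argument itself), you can differentiate \( \tan(xy) = xy \) implicitly, confirm that the derivative factors as \( (\sec^2(xy) - 1)(xy' + y) \), and solve the Case 2 factor for \( y' \):

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')(x)
yp = sp.Derivative(y, x)

# Implicit relation tan(x*y) = x*y, written as F = 0
F = sp.tan(x * y) - x * y

# Differentiate both sides with respect to x
dF = sp.diff(F, x)

# The derivative should factor as (sec^2(xy) - 1) * (x*y' + y);
# the difference simplifies to zero if so
factored = (sp.sec(x * y)**2 - 1) * (x * yp + y)
print(sp.simplify(dF - factored))  # 0

# Case 2: solving x*y' + y = 0 for y' gives y' = -y/x
sol = sp.solve(x * yp + y, yp)[0]
print(sol)  # -y(x)/x
```

This mirrors the factoring step above: SymPy treats \( y' \) as an unknown and recovers \( y' = -y/x \) from the second factor.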
