

[Deep Learning] Activation Function Concepts and Types: sign, tanh, sigmoid, softmax, ReLU, Leaky ReLU

๐Ÿ“š Table of Contents
1. Concept of Activation Functions
2. Types of Activation Functions
2.1. Sign Function
2.2. Sigmoid Function
2.3. Tanh Function
2.4. Softmax Function
2.5. ReLU Function
2.6. Leaky ReLU Function

1. Concept of Activation Functions

An activation function is a non-linear function that determines the output value of a perceptron. In other words, the activation function decides whether the weighted sum of a perceptron's inputs should be output at all and, if so, into what value it should be transformed before being output. For more details on the perceptron, please refer to this post. The part highlighted in yellow in Figure 1 below is the activation-function part of the perceptron.

2. Types of Activation Functions

2.1. Sign Function

The perceptron above..
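To make the concept concrete, here is a minimal sketch in NumPy (the function names and input values are illustrative, not from the original post): a perceptron computes the weighted sum of its inputs, and the activation function then decides what value that sum is transformed into before being output.

```python
import numpy as np

# Weighted sum of a perceptron's inputs: z = w . x + b
def weighted_sum(x, w, b):
    return np.dot(w, x) + b

# The activation functions covered in this post
def sign(z):
    return np.where(z >= 0, 1.0, -1.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

def relu(z):
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

# Illustrative inputs, weights, and bias
x = np.array([0.5, -1.0, 2.0])
w = np.array([0.4, 0.6, -0.1])
b = 0.1

z = weighted_sum(x, w, b)
print(sign(z), sigmoid(z), tanh(z), relu(z), leaky_relu(z))
```

Note that the weighted sum z is computed the same way in every case; only the activation function applied to z changes, which is why each function in Section 2 can be discussed independently.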