Q. Which of the following is a common activation function?

  • (A) ReLU
  • (B) Hyperlink
  • (C) Cascade
  • (D) SIFT

Correct Answer: (A) ReLU

Explanation:
ReLU (Rectified Linear Unit) is a widely used activation function in deep learning. It outputs the input directly when it is positive and zero otherwise, i.e. f(x) = max(0, x), which makes it cheap to compute and helps mitigate vanishing gradients in deep networks.
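A minimal sketch of ReLU using NumPy (the function name `relu` here is just for illustration):

```python
import numpy as np

def relu(x):
    # ReLU passes positive values through unchanged and clips negatives to zero: max(0, x)
    return np.maximum(0, x)

# Example: negative inputs become 0, positive inputs are unchanged
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```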
