
[Feature Request]: Adding Types of activation function in Artificial neural network (ANN) in Deep Learning docs #3829

Closed
@Shantnu-singh

Description


Is there an existing issue for this?

  • I have searched the existing issues

Feature Description

This feature will cover the mathematical formulations, properties, and practical applications of activation functions such as sigmoid, tanh, ReLU, Leaky ReLU, and softmax, along with visualizations and code snippets for implementation.
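The functions listed above could be sketched roughly as follows. This is a minimal NumPy sketch to illustrate the proposed docs; the function names and the Leaky ReLU `alpha` default are illustrative choices, not taken from the repository.

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1): 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered squashing into (-1, 1)
    return np.tanh(x)

def relu(x):
    # Passes positive values through, zeros out negatives
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient (alpha * x) through for x < 0
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Converts a vector of scores into a probability distribution;
    # subtracting the max avoids overflow in np.exp
    e = np.exp(x - np.max(x))
    return e / e.sum()
```

Each of these admits a short plot and a note on its derivative, which is what the proposed documentation would pair with the formulas.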

Use Case

This feature is beneficial for students, educators, and practitioners in the field of deep learning. Students can use the comprehensive documentation to understand the role and impact of different activation functions on neural network performance.

Benefits

No response

Add ScreenShots

No response

Priority

High

Record

  • I have read the Contributing Guidelines
  • I'm a GSSOC'24 contributor
  • I have starred the repository

Metadata

Labels

  • CodeHarborHub - Thanks for creating an issue!
  • GSSOC'24 (GirlScript Summer of Code | Contributor)
  • documentation (Improvements or additions to documentation)
  • gssoc (GirlScript Summer of Code | Contributor)
  • level1 (GirlScript Summer of Code | Contributor's Levels)
