ReLU

The Rectified Linear Unit (ReLU) is one of the most popular activation functions in deep learning. It is defined as

$$ \operatorname{ReLU}(x) = \begin{cases} x, & \text{if } x \geq 0 \\ 0, & \text{otherwise.} \end{cases} $$

Visualizations

Figure: ReLU.

Figure: derivative of ReLU.
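
For reference, the derivative shown in the second figure follows directly from the definition:

$$ \operatorname{ReLU}'(x) = \begin{cases} 1, & \text{if } x > 0 \\ 0, & \text{if } x < 0, \end{cases} $$

with the value at x = 0 being a matter of convention; the code below, like PyTorch's autograd, effectively uses 0 there.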

Characteristics

In trained models, ReLU does not preserve the qualitative distribution of the values it receives: every negative input is mapped to exactly zero, so the distribution after the activation is clipped at zero and carries a large point mass there.

Lippe P. Tutorial 3: Activation Functions — UvA DL Notebooks v1.1 documentation. In: UvA Deep Learning Tutorials [Internet].

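A quick way to see this (a minimal sketch, not part of the original article): push standard normal samples through ReLU and compare summary statistics before and after.

import torch

torch.manual_seed(0)
x = torch.randn(100_000)   # roughly zero-mean, symmetric input
y = torch.relu(x)          # built-in ReLU, equivalent to x * (x > 0)

# The output is non-negative, its mean shifts to about 0.4,
# and roughly half of all values collapse to exactly zero.
print(f"input:  mean={x.mean().item():+.3f}, zeros={(x == 0).float().mean().item():.2f}")
print(f"output: mean={y.mean().item():+.3f}, zeros={(y == 0).float().mean().item():.2f}")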

Because ReLU outputs exactly zero for negative inputs, some neurons can stop participating in learning altogether: if a neuron's pre-activation is negative for every input, its output is always zero and no gradient flows back through it, so its weights never update. Such neurons are called dead neurons.
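
The effect is easy to reproduce (a hypothetical sketch, not from the original article): a unit whose pre-activation is negative for the whole batch receives zero gradient, so its weights cannot recover.

import torch
from torch import nn

torch.manual_seed(0)
layer = nn.Linear(4, 1)
with torch.no_grad():
    layer.bias.fill_(-10.0)   # large negative bias keeps the pre-activation negative

x = torch.randn(32, 4)        # a batch of typical inputs
out = torch.relu(layer(x))    # every output is clamped to zero
out.sum().backward()

print(out.abs().max().item())                # 0.0 -> the unit never fires
print(layer.weight.grad.abs().max().item())  # 0.0 -> zero gradient, the weights never update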

Code

def relu(x):
    # Elementwise ReLU: keep positive values, zero out the rest.
    return x * (x > 0)
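
A minimal usage sketch (assuming NumPy; the same expression also works elementwise on PyTorch tensors):

import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))  # negative entries are zeroed out (possibly printed as -0.), 1.5 passes through
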
Full code to generate the data used in this article


from torch import nn
import matplotlib.pyplot as plt
import torch
from typing import Union, Optional
from pathlib import Path
import json


def visualize_activation(
    x: torch.Tensor, acti: torch.nn.Module,
    save_path: Optional[Union[str, Path]] = None
) -> dict:
    """Visualize activation function on the domain of x"""

    y = acti(x)

    # Calculate the grad of the activation function
    x = x.clone().requires_grad_()
    acti(x).sum().backward()
    yp = x.grad

    activation_dict = {
        "x": x.detach().numpy().tolist(),
        "y": y.detach().numpy().tolist(),
        "yp": yp.detach().numpy().tolist()
    }

    if save_path is not None:
        if isinstance(save_path, str):
            save_path = Path(save_path)
        save_path.parent.mkdir(parents=True, exist_ok=True)
        with open(save_path, "w") as f:
            json.dump(activation_dict, f, indent=4)

    return activation_dict

class ReLU(nn.Module):
    """Rectified Linear Unit"""

    def __init__(self):
        super().__init__()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * (x > 0).float()

    def __str__(self) -> str:
        return f"Activation Function: {super().__str__()}"


if __name__ == "__main__":

    relu = ReLU()

    save_path = "data/activations/relu.json"
    x = torch.linspace(-2, 2, 1000)
    data = visualize_activation(x, relu, save_path=save_path)

    fig, ax = plt.subplots()
    ax.plot(data["x"], data["y"])
    ax.plot(data["x"], data["yp"])
    ax.set_title("ReLU")
    plt.show()

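As a quick sanity check (not part of the original script), the custom module should agree with PyTorch's built-in ReLU, assuming the script above has been run so that torch and ReLU are in scope:

x = torch.linspace(-2, 2, 1000)
assert torch.allclose(ReLU()(x), torch.relu(x))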
