PyTorch Leaky ReLU - Useful Tutorial - Python Guides

In this Python tutorial, we will learn about the PyTorch Leaky ReLU function, an activation function used to solve the problem of dying neurons. Additionally, we will cover different examples related to the PyTorch Leaky ReLU, including these topics:

  • PyTorch leaky relu
  • PyTorch leaky relu example
  • PyTorch leaky relu inplace
  • PyTorch leaky relu slope
  • PyTorch leaky relu functional
  • PyTorch leaky relu vs relu


PyTorch Leaky Relu

In this section, we will learn about how PyTorch Leaky ReLU works in Python.

The PyTorch leaky relu is an activation function. It is a beneficial function because, if the input is negative, the derivative of the function is not zero, so the neuron does not stop learning. This function is used to solve the problem of dying neurons.

Syntax:

The syntax of leaky relu is:

torch.nn.LeakyReLU(negative_slope = 0.01, inplace = False)

Parameters

The following are the parameters used within the LeakyReLU() function.

  • negative_slope: It is used to control the angle of the negative slope. The default value of the negative_slope is 1e-2.
  • inplace: It can optionally do the operation in-place. The default value of inplace is False. If the value of inplace is True, it will alter the input directly without assigning any additional output.
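As a minimal sketch of what LeakyReLU() computes (the hand-computed comparison line is added here only for illustration), the function keeps positive inputs unchanged and multiplies negative inputs by negative_slope:

import torch
import torch.nn as nn

# LeakyReLU(x) = x if x >= 0, else negative_slope * x
m = nn.LeakyReLU(negative_slope=0.01)

x = torch.tensor([3.0, -3.0])
print(m(x))                               # tensor([ 3.0000, -0.0300])
print(torch.where(x >= 0, x, 0.01 * x))   # same result, computed by hand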

So, with this, we understood how the PyTorch leaky relu works in Python.

Read: PyTorch Activation Function

PyTorch leaky relu example

In this section, we will learn about the PyTorch leaky relu with the help of an example in Python.

The PyTorch leaky relu is defined as an activation function. If the input is negative, the derivative of the function is a very small fraction and never zero.

This makes sure that the neuron does not stop learning during backpropagation, thus avoiding the dying neuron issue.

Code:

In the following code, firstly we will import the torch module and after that, we will import torch.nn as nn.

  • n = nn.LeakyReLU(0.2) Here we are using the LeakyReLU() function with a negative slope of 0.2.
  • input = torch.randn(4) Here we are describing the input variable by using the torch.randn() function.
  • output = n(input) Here we are declaring the output variable.
  • print(output) is used to print the output values by using the print() function.
# Import library
import torch
import torch.nn as nn

# Using the LeakyReLU()
n = nn.LeakyReLU(0.2)

# Describing the input variable
input = torch.randn(4)

# Declaring the output variable
output = n(input)
print(output)

Output:

After running the above code, the PyTorch leaky relu values are printed on the screen: a tensor of four random values in which the negative entries have been scaled by 0.2 while the positive entries pass through unchanged.

This is how the implementation of the PyTorch leaky relu is done.

Read: PyTorch fully connected layer

PyTorch leaky relu inplace

In this section, we will learn about the PyTorch leaky relu inplace parameter in Python.

The PyTorch leaky relu inplace is defined as an activation function within which we use the inplace parameter.

Syntax:

The syntax of PyTorch leaky relu inplace:

torch.nn.LeakyReLU(inplace=True)

Parameter:

The following is the parameter:

  • inplace = True This means that the function will alter the input directly without allocating any additional output (the default value of the inplace parameter is False), as the sketch below shows.
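Below is a minimal sketch of the inplace behavior; the tensor values are chosen only for illustration:

import torch
import torch.nn as nn

m = nn.LeakyReLU(0.2, inplace=True)

x = torch.tensor([1.0, -2.0, 3.0, -4.0])
y = m(x)

# With inplace=True the result is written back into x itself
print(x)       # tensor([ 1.0000, -0.4000,  3.0000, -0.8000])
print(y is x)  # True - no additional output tensor was allocated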

This is how the inplace parameter works in the PyTorch leaky relu function.

Read: PyTorch Model Summary

PyTorch leaky relu slope

In this section, we will learn about how the PyTorch leaky relu slope works in Python.

Before moving forward, we should have a piece of knowledge about the slope. The slope measures how steeply a line rises or falls, so one side is higher than the other.

The PyTorch leaky relu slope is the factor applied to negative inputs, so that when the input is negative the derivative of the function is not zero.

Syntax:

The syntax of leaky relu slope:

torch.nn.LeakyReLU(negative_slope = 0.01)

Parameter:

The following is the parameter of leaky relu:

  • negative_slope: It is used to control the angle of the negative slope. The default value of the negative_slope is 1e-2.

Code:

In the following code, firstly we will import the torch module and after that, we will import torch.nn as nn.

  • s = nn.LeakyReLU(0.4) is used to define the LeakyReLU() function and within this function, we are using the parameter 0.4 that controls the negative slope.
  • input = torch.Tensor([2,-4,5,-6]) is used to create a tensor with an array.
  • output = s(input) Here we are declaring the output variable.
  • print(output) is used to print the output values with the help of the print() function.
# Importing libraries
import torch
import torch.nn as nn

# Defining leaky relu; the parameter 0.4 is passed to control the negative slope
s = nn.LeakyReLU(0.4)

# Creating a tensor with an array
input = torch.Tensor([2,-4,5,-6])

# Declaring the output variable
output = s(input)

# Print the output
print(output)

Output:

After running the above code, we get the following output in which we can see that the PyTorch leaky relu slope values are printed on the screen. The positive entries pass through unchanged, while the negative entries are scaled by 0.4:

tensor([ 2.0000, -1.6000,  5.0000, -2.4000])

So, with this, we understood how the PyTorch leaky relu slope works.

Read: PyTorch Logistic Regression

PyTorch leaky relu functional

In this section, we will learn about the PyTorch leaky relu functional in Python.

The PyTorch leaky relu functional is defined as a function, rather than a module, that is used to solve the problem of dying neurons.

This function is very helpful and useful: the derivative of the function is not zero if the input value is negative.

Syntax:

The syntax of the PyTorch leaky relu functional:

torch.nn.functional.leaky_relu(input, negative_slope=0.01, inplace=False)

Parameter:

The following are the parameters that are used within the leaky relu functional:

  • negative_slope: It is used to control the angle of the negative slope. The default value of the negative_slope is 1e-2.
  • inplace: It can optionally do the operation in-place. The default value of inplace is False. If the value of inplace is True, it will alter the input directly without assigning any additional output.
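As a minimal sketch of the functional form (reusing the same kind of example tensor as the earlier sections), it produces the same result as the module version without constructing an nn.LeakyReLU object first:

import torch
import torch.nn.functional as F

input = torch.tensor([2.0, -4.0, 5.0, -6.0])

# Apply leaky relu directly as a function call
output = F.leaky_relu(input, negative_slope=0.4)
print(output)  # tensor([ 2.0000, -1.6000,  5.0000, -2.4000])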

This is how the PyTorch leaky relu functional works.

Read: PyTorch Model Eval + Examples

PyTorch leaky relu vs relu

In this section, we will learn the difference between the PyTorch leaky relu and relu in python.

PyTorch leaky relu:

The leaky relu function is very useful. In leaky relu, the derivative does not become zero if the input value is negative.

The leaky relu also solves the problem of dying neurons, since the neuron does not stop learning.

Example:

In the following code, firstly we will import all the necessary libraries such as import torch and import torch.nn as nn.

re = nn.LeakyReLU(0.6): Here we are defining the LeakyReLU() function.

input = torch.Tensor([2,-3,4,-6]) is used to create a tensor with an array.

output = re(input) is used to pass the array to leaky relu function.

print(output) is used to print the output using the print() function.

# Importing libraries
import torch
import torch.nn as nn

# Defining leaky relu
re = nn.LeakyReLU(0.6)

# Creating a tensor with an array
input = torch.Tensor([2,-3,4,-6])

# Passing the array to the leaky relu function
output = re(input)

# Print the output
print(output)

Output:

After running the above example, we get the following output in which we can see that the PyTorch leaky relu values are printed on the screen. The negative entries are scaled by 0.6:

tensor([ 2.0000, -1.8000,  4.0000, -3.6000])

PyTorch relu:

The relu function is a non-linear function that is differentiable everywhere except at zero. In relu, the derivative becomes zero if the input is negative, which causes neurons to die and their learning to stop.

Example:

In the following code, firstly we will import all the necessary libraries such as import torch and import torch.nn as nn.

  • lr = nn.ReLU(): Here we are defining the ReLU() function.
  • input = torch.Tensor([2,-3,4,-6]) is used to create a tensor with an array.
  • output = lr(input) is used to pass the array to the relu function.
  • print(output) is used to print the function with the help of print() function.
# Importing libraries
import torch
import torch.nn as nn

# Defining relu
lr = nn.ReLU()

# Creating a tensor with an array
input = torch.Tensor([2,-3,4,-6])

# Passing the array to the relu function
output = lr(input)

# Print output
print(output)

Output:

In the below output, you can see that the PyTorch relu values are printed on the screen; the negative entries have been clamped to zero:

tensor([2., 0., 4., 0.])
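To make the difference concrete, here is a small sketch (the input value -2.0 and the slope 0.1 are chosen only for illustration) comparing the gradients that each activation passes back through a negative input:

import torch
import torch.nn as nn

# Two identical negative inputs that require gradients
x_relu = torch.tensor([-2.0], requires_grad=True)
x_leaky = torch.tensor([-2.0], requires_grad=True)

# Backpropagate through each activation
nn.ReLU()(x_relu).sum().backward()
nn.LeakyReLU(0.1)(x_leaky).sum().backward()

print(x_relu.grad)   # tensor([0.])     -> no gradient, the neuron stops learning
print(x_leaky.grad)  # tensor([0.1000]) -> a small gradient still flows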

So, with this, we understood the difference between the PyTorch leaky relu and relu functions.

Also, take a look at some more PyTorch tutorials.

  • PyTorch nn Conv2d
  • PyTorch Early Stopping + Examples
  • PyTorch MSELoss – Detailed Guide
  • PyTorch Batch Normalization
  • PyTorch Load Model + Examples

So, in this tutorial, we discussed the PyTorch Leaky ReLU and covered different examples related to its implementation. Here is the list of examples that we have covered.

  • PyTorch leaky relu
  • PyTorch leaky relu example
  • PyTorch leaky relu inplace
  • PyTorch leaky relu slope
  • PyTorch leaky relu functional
  • PyTorch leaky relu vs relu


Bijay Kumar

I am Bijay Kumar, a Microsoft MVP in SharePoint. Apart from SharePoint, I have been working on Python, machine learning, and artificial intelligence for the last 5 years. During this time I have gained expertise in various Python libraries like Tkinter, Pandas, NumPy, Turtle, Django, Matplotlib, TensorFlow, SciPy, Scikit-Learn, etc., for various clients in the United States, Canada, the United Kingdom, Australia, New Zealand, etc. Check out my profile.
