What Does Rectified Linear Unit (ReLU) Mean?

The rectified linear unit (ReLU) is one of the most common activation functions in machine learning models. As a component of an artificial neuron in artificial neural networks (ANNs), the activation function is responsible for transforming the neuron's weighted sum of inputs into the neuron's output.
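
To make the role of the activation function concrete, here is a minimal sketch of a single artificial neuron in plain Python. The function and variable names (relu, neuron_output, weights, bias) are illustrative choices for this example, not part of any particular library:

```python
def relu(x: float) -> float:
    """Rectified linear unit: returns x if positive, otherwise 0."""
    return max(0.0, x)

def neuron_output(inputs, weights, bias):
    """Compute the weighted sum of inputs plus a bias term,
    then pass it through the activation function (here, ReLU)."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return relu(weighted_sum)

# Example: two inputs, two weights, one bias.
# Weighted sum = 0.5*0.8 + (-1.0)*0.3 + 0.1 = 0.2, and ReLU(0.2) = 0.2.
print(neuron_output([0.5, -1.0], [0.8, 0.3], 0.1))  # 0.2
```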

Techopedia Explains Rectified Linear Unit (ReLU)

With ReLU as the activation function, the neuron passes positive inputs through unchanged and returns zero for any negative input. In other words, ReLU computes f(x) = max(0, x). For example, ReLU(3.2) = 3.2, while ReLU(-1.7) = 0.
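
The sketch below shows this behavior applied element-wise to an array of values, as it would typically be done in practice. It uses NumPy's maximum function; the relu wrapper name itself is our own for illustration:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: max(0, x) for each element of x."""
    return np.maximum(0, x)

# Negative values map to 0; zero and positive values pass through.
values = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(values))  # [0.  0.  0.  1.5 3. ]
```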