What Does Layer-Wise Relevance Propagation Mean?
Layer-wise relevance propagation (LRP) is a method for explaining the predictions of deep neural networks. Rather than treating the network as a black box, it propagates a prediction backward through the network's layers, redistributing the output score among the neurons of each preceding layer until every input feature receives a relevance score indicating how much it contributed to the result.
Techopedia Explains Layer-Wise Relevance Propagation
Specifically, experts often contrast layer-wise relevance propagation with DeepLIFT, a method that uses a form of backpropagation to compare the activation of each artificial neuron against a reference activation across the layers of a deep network. Some describe layer-wise relevance propagation as a special case of DeepLIFT in which the reference activations of all artificial neurons are set to the same baseline (zero) for analysis.
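The backward redistribution described above can be sketched in a few lines of NumPy. The example below is a minimal, illustrative implementation of the common epsilon-rule for LRP on a tiny hand-built two-layer ReLU network; the network, weights, and function name are hypothetical and chosen only to demonstrate the idea that relevance is conserved as it flows from the output back to the inputs.

```python
import numpy as np

def lrp_epsilon(weights, activations, relevance, eps=1e-6):
    """Redistribute relevance from a layer's outputs to its inputs
    using the epsilon-rule (illustrative sketch, not a library API).

    weights:     (n_in, n_out) weight matrix of the layer
    activations: (n_in,) activations entering the layer
    relevance:   (n_out,) relevance assigned to the layer's outputs
    """
    z = activations @ weights          # pre-activations, shape (n_out,)
    z = z + eps * np.sign(z)           # small stabilizer avoids division by zero
    s = relevance / z                  # element-wise relevance-to-activation ratio
    return activations * (weights @ s) # relevance attributed to each input

# Tiny hypothetical network: 3 inputs -> 2 hidden (ReLU) -> 1 output
W1 = np.array([[ 1.0, -0.5],
               [ 0.5,  1.0],
               [-1.0,  0.5]])
W2 = np.array([[1.0],
               [0.5]])

x = np.array([1.0, 2.0, 0.5])
h = np.maximum(x @ W1, 0.0)   # forward pass: hidden activations
y = h @ W2                    # network output: the score being explained

# Propagate the output score backward, layer by layer.
R_hidden = lrp_epsilon(W2, h, y)
R_input = lrp_epsilon(W1, x, R_hidden)

print(R_input)                # per-feature relevance scores
print(R_input.sum(), y[0])    # conservation: the sums roughly match
```

A key property visible here is conservation: at each step the relevance scores sum (up to the epsilon stabilizer) to the original output score, so the explanation accounts for the whole prediction.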