Addressed the degradation problem in deep neural networks through residual learning with shortcut connections: each block learns a residual function F(x), and its output is F(x) + x. Because the identity shortcut lets gradients bypass the stacked layers, gradient flow improves and training remains stable even at great depth. Their work demonstrated successful training of networks with over 1,000 layers on CIFAR-10, while a 152-layer variant significantly improved performance on the ImageNet dataset and won the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2015. This innovation paved the way for deeper and more effective neural networks in various applications.
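The shortcut idea can be sketched minimally as follows. This is an illustrative NumPy toy, not the authors' implementation (their blocks use convolutions and batch normalization); the function name `residual_block` and the two-layer form of F are assumptions made for the example.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Toy residual block: output = ReLU(F(x) + x), where F is two
    linear layers with a ReLU in between (biases omitted for brevity).
    The identity shortcut `+ x` lets gradients flow past F unchanged."""
    f = relu(x @ w1) @ w2   # residual function F(x)
    return relu(f + x)      # shortcut connection adds the input back

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
# With zero-initialized weights, F(x) = 0 and the block reduces to
# the identity (up to ReLU) -- the property that eases optimization:
w_zero = np.zeros((8, 8))
assert np.allclose(residual_block(x, w_zero, w_zero), relu(x))
```

Learning F(x) toward zero is easier than forcing a stack of nonlinear layers to approximate the identity mapping, which is why adding blocks does not degrade accuracy the way plain deep stacks do.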