Abstract—Accelerators used for machine learning (ML) inference provide great performance benefits over CPUs. Securing confidential models during inference against off-chip side-channel attacks is critical ...
A technical paper titled "Hardware-Software Co-design for Side-Channel Protected Neural Network Inference" was published as a preprint by researchers at North Carolina State University and Intel.