PreFlush: Lightweight Hardware Prediction Mechanism for Cache Line Flush and Writeback
2023 32nd International Conference on Parallel Architectures and Compilation Techniques (PACT), pp. 74–85.
Non-Volatile Main Memory (NVMM) technologies make it possible for applications to store data persistently in memory. To do so, applications must ensure that updates to persistent data comply with a crash consistency model, which often involves explicitly flushing a dirty cache line after a store and then waiting for the flush to complete with a store fence. While cache line flush and write-back instructions can complete in the background, fence instructions expose the flush latency on the critical path of the program's execution, incurring significant overhead. If flush operations are started earlier, the penalty of fences can be greatly reduced. We propose PreFlush, a lightweight and transparent hardware mechanism that predicts when a cache line flush or write-back is needed and speculatively performs the operation early. Because flushes are performed speculatively, we add hardware to handle flush misspeculation, ensuring correct execution without the need for any complex recovery mechanism. Our PreFlush design is transparent to the programmer, i.e., it requires no modifications to existing NVMM-enabled code. Our results show that PreFlush improves performance by up to 25% (15.7% on average) for the WHISPER NVM benchmark suite and loop-based matrix microbenchmarks.
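For reference, the store-flush-fence pattern the abstract describes commonly looks like the following in x86 persistent-memory code. This is a generic sketch, not code from the paper; the function name persist_store is illustrative, and it assumes a compiler and CPU with CLWB support (e.g. compiled with -mclwb).

    /* Minimal sketch of the store -> flush -> fence pattern (not from the paper). */
    #include <immintrin.h>
    #include <stdint.h>

    /* persist_store: write a 64-bit value to NVMM-backed memory and make it
     * durable before returning. The sfence exposes the write-back latency on
     * the critical path -- the overhead PreFlush aims to hide by starting the
     * flush earlier and speculatively. */
    static inline void persist_store(uint64_t *dst, uint64_t value)
    {
        *dst = value;      /* store to a cache line backed by NVMM          */
        _mm_clwb(dst);     /* write the dirty line back; may proceed in the
                              background                                     */
        _mm_sfence();      /* stall until the write-back is ordered/durable  */
    }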