Abstract
With the growing popularity of neural-network applications on edge devices, robustness has become a focus of research. When such applications are deployed onto hardware, environmental noise is unavoidable, and the resulting errors may cause an application to crash, which is especially dangerous for safety-critical applications. In this paper, we propose FTR-NAS, which optimizes recurrent neural architectures to enhance their fault tolerance. First, following real deployment scenarios, we formalize computational faults and weight faults, simulated with the Multiply-Accumulate (MAC)-independent and identically distributed (i.i.d.) Bit-Bias (MiBB) model and the Stuck-at-Fault (SAF) model, respectively. Next, we establish a multi-objective NAS framework driven by these fault models to discover recurrent architectures that are both high-performing and fault-tolerant. Moreover, we incorporate fault-tolerant training (FTT) into the search process to further enhance fault tolerance. Experimentally, the C-FTT-RNN and W-FTT-RNN architectures discovered on the PTB dataset show promising tolerance to computational and weight faults, respectively. We further demonstrate the generality of the learned architectures by transferring them to the WT2 dataset.
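To make the two fault models concrete, below is a minimal NumPy sketch of how such faults can be simulated in a forward pass. The fault probabilities, the 8-bit grid, the value range, and the function names are illustrative assumptions for exposition, not the paper's exact parameterization.

    # Hypothetical sketch of the two fault models; all rates and ranges are
    # illustrative assumptions, not values taken from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def inject_saf(weights, p_sa0=0.01, p_sa1=0.01, w_min=-1.0, w_max=1.0):
        # Stuck-at-Fault (SAF) weight faults: each weight cell is independently
        # stuck at the lowest (stuck-at-0) or highest (stuck-at-1)
        # representable value with small probabilities.
        r = rng.random(weights.shape)
        faulty = weights.copy()
        faulty[r < p_sa0] = w_min
        faulty[(r >= p_sa0) & (r < p_sa0 + p_sa1)] = w_max
        return faulty

    def mibb_matmul(x, w, p_bit=1e-4, n_bits=8, scale=2.0 ** -6):
        # MAC-i.i.d. Bit-Bias (MiBB) computational faults: each
        # multiply-accumulate output independently suffers i.i.d. bit flips,
        # modeled here as an additive bias of randomly signed powers of two.
        y = x @ w
        flips = rng.random(y.shape + (n_bits,)) < p_bit   # one trial per bit
        signs = rng.choice([-1.0, 1.0], size=flips.shape)
        bit_vals = scale * 2.0 ** np.arange(n_bits)       # value of each bit
        return y + (flips * signs * bit_vals).sum(axis=-1)

    # Toy usage: one faulty recurrent step h' = tanh(W_h h + W_x x).
    W_h = inject_saf(0.1 * rng.standard_normal((64, 64)))
    W_x = inject_saf(0.1 * rng.standard_normal((64, 64)))
    x, h = rng.standard_normal((1, 64)), np.zeros((1, 64))
    h = np.tanh(mibb_matmul(h, W_h) + mibb_matmul(x, W_x))

In fault-tolerant training, a faulty forward pass of this kind replaces the clean one, so the loss and gradients reflect behavior under faults. Applying both models in one step above is purely for illustration; the paper studies computational and weight faults separately (C-FTT-RNN and W-FTT-RNN, respectively).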
Notes
1. \(\alpha _l\) remains consistent throughout both the architecture search and the training process.
Acknowledgments
This work is funded by the National Key R&D Program of China [grant number 2018YFB2202603].
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Hu, K., Ding, D., Tian, S., Gong, R., Luo, L., Wang, L. (2020). FTR-NAS: Fault-Tolerant Recurrent Neural Architecture Search. In: Yang, H., Pasupa, K., Leung, A.C.S., Kwok, J.T., Chan, J.H., King, I. (eds) Neural Information Processing. ICONIP 2020. Communications in Computer and Information Science, vol 1333. Springer, Cham. https://doi.org/10.1007/978-3-030-63823-8_67
DOI: https://doi.org/10.1007/978-3-030-63823-8_67
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-63822-1
Online ISBN: 978-3-030-63823-8
eBook Packages: Computer Science, Computer Science (R0)