Auto-Differentiated Fixed Point Notation on Low-Powered Hardware Acceleration
- Nsinga Robert (Graduate School of Advanced Technology and Science)
- Karungaru Stephen (Graduate School of Advanced Technology and Science)
- Terada Kenji (Graduate School of Advanced Technology and Science)
Description
Reducing electric power consumption and speeding up processing are attracting the interest of researchers in deep learning. Quantization has offered distillation mechanisms that substitute integers for floating-point numbers, but little has been suggested about the floating-point numbers themselves. The use of Q-format notation reduces computational overhead, freeing resources for the introduction of more operations. Our experiments, conditioned on varying regimes, introduce automatic differentiation to algorithms such as the fast Fourier transform and Winograd minimal filtering to reduce computational complexity (expressed as the total number of MACs) and suggest a path toward the assistive intelligence concept. Empirical results show that, under specific heuristics, Q-format number notation can overcome the shortfalls of floating-point numbers, especially on embedded systems. Further benchmarks such as the FPBench standard provide more detail by comparing our proposal with common deep learning operations.
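The abstract's central idea is that Q-format fixed-point arithmetic can stand in for floating point on low-powered hardware. As a minimal sketch of what that notation looks like in practice (not taken from the paper; the Q15 choice and the helper names `q15_from_float`, `q15_mul` are illustrative assumptions), the following C snippet represents values in Q15, a signed 16-bit integer with 15 fractional bits covering [-1, 1):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical Q15 helpers: 16-bit fixed point, 15 fractional bits.
   Range [-1, 1), resolution 2^-15. Not the paper's actual kernels. */
typedef int16_t q15_t;

#define Q15_SCALE 32768.0f  /* 2^15 */

static q15_t q15_from_float(float x) {
    /* Round to nearest and saturate to the representable range. */
    int32_t v = (int32_t)(x * Q15_SCALE + (x >= 0 ? 0.5f : -0.5f));
    if (v >  32767) v =  32767;
    if (v < -32768) v = -32768;
    return (q15_t)v;
}

static float q15_to_float(q15_t x) {
    return (float)x / Q15_SCALE;
}

static q15_t q15_mul(q15_t a, q15_t b) {
    /* Widen to 32 bits, multiply, then shift the 30 fractional bits
       of the product back down to 15. */
    int32_t prod = (int32_t)a * (int32_t)b;
    return (q15_t)(prod >> 15);
}

int main(void) {
    q15_t a = q15_from_float(0.5f);
    q15_t b = q15_from_float(-0.25f);
    printf("0.5 * -0.25 ~= %f\n", q15_to_float(q15_mul(a, b)));
    return 0;
}
```

Because the multiply-and-shift runs entirely on the integer ALU, it needs no FPU and maps onto a single MAC on many embedded DSPs, which is the kind of computational-overhead reduction the abstract describes.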
Journal
- Journal of Signal Processing, 26 (5), 131-140, 2022-09-01
- Research Institute of Signal Processing, Japan
Details
- CRID: 1390011793672543360
- ISSN: 1880-1013, 1342-6230
- Text Lang: en
- Article Type: journal article
- Data Source: JaLC, IRDB, Crossref, OpenAIRE
- Abstract License Flag: Disallowed