Improving Accuracy of Evolving GMM Under GPGPU-Friendly Block-Evolutionary Pattern

  • Chunlei Chen
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China
  • Chengduan Wang
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China
  • Jinkui Hou
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China
  • Ming Qi
    College of Information Engineering, Weifang Vocational College, Weifang 261041, P. R. China
  • Jiangyan Dai
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China
  • Yonghui Zhang
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China
  • Peng Zhang
    School of Computer Engineering, Weifang University, Weifang 261061, P. R. China

Abstract

As a classical clustering model, the Gaussian Mixture Model (GMM) is a cornerstone of dominant machine learning methods such as transfer learning. Evolving GMM approximates the classical GMM under time-critical or memory-critical application scenarios. Such applications often have constraints on time-to-answer or must handle high data volumes, and thus impose heavy computational demands. A prominent approach to meeting these demands is GPGPU-powered computing. However, existing evolving GMM algorithms face a dilemma between clustering accuracy and parallelism. Point-wise algorithms achieve high accuracy but exhibit limited parallelism due to their point-evolutionary pattern. Block-wise algorithms tend to exhibit higher parallelism; however, achieving high accuracy under a block-evolutionary pattern is challenging because it is difficult to track the evolving process of the mixture model at fine granularity. Consequently, the existing block-wise algorithm suffers significant accuracy degradation compared to its batch-mode counterpart, the standard EM algorithm. To cope with this dilemma, we focus on the accuracy issue and develop an improved block-evolutionary GMM algorithm for GPGPU-powered computing systems. Our algorithm leverages the evolving history of the model to estimate the latest model order in each incremental clustering step. With this model order as a constraint, we perform the similarity test in an elastic manner. Finally, we analyze the evolving history of both the mixture components and the data points, and propose a method to merge similar components. Experiments on real images show that our algorithm significantly improves the accuracy of the original general-purpose block-wise algorithm. The accuracy of our algorithm is at least comparable to that of the standard EM algorithm and even surpasses it in certain scenarios.
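
The abstract describes the block-evolutionary update and the history-guided merging of similar components only at a high level. The Python sketch below illustrates the general idea under stated assumptions; it is not the authors' implementation. All names (update_block, merge_similar_components, target_order) are hypothetical, and the use of symmetric KL divergence as the similarity test and moment matching for merging are illustrative choices, not details taken from the paper.

import numpy as np


def gaussian_pdf(x, mean, cov):
    """Evaluate a multivariate Gaussian density at points x of shape (n, d)."""
    d = mean.shape[0]
    diff = x - mean
    inv = np.linalg.inv(cov)
    norm = np.sqrt(((2 * np.pi) ** d) * np.linalg.det(cov))
    expo = -0.5 * np.einsum("ij,jk,ik->i", diff, inv, diff)
    return np.exp(expo) / norm


def update_block(weights, means, covs, block, n_seen):
    """One incremental (block-wise) EM-style update of the running mixture.

    The whole block is processed at once, which is the step that exposes
    block-level parallelism suitable for GPGPU offloading.
    """
    k = len(weights)
    n = block.shape[0]
    # E-step on the new block only.
    resp = np.stack([w * gaussian_pdf(block, m, c)
                     for w, m, c in zip(weights, means, covs)], axis=1)
    resp /= resp.sum(axis=1, keepdims=True) + 1e-12
    # M-step: blend block statistics with the running model, weighting the
    # old parameters by the number of points seen so far (moment matching).
    nk_old = weights * n_seen
    nk_new = resp.sum(axis=0)
    nk = nk_old + nk_new
    new_means, new_covs = [], []
    for j in range(k):
        mu = (nk_old[j] * means[j] + resp[:, j] @ block) / nk[j]
        diff = block - mu
        cov_blk = (resp[:, j, None] * diff).T @ diff
        shift = np.outer(means[j] - mu, means[j] - mu)
        cov = (nk_old[j] * (covs[j] + shift) + cov_blk) / nk[j]
        new_means.append(mu)
        new_covs.append(cov + 1e-6 * np.eye(block.shape[1]))
    return nk / nk.sum(), np.array(new_means), np.array(new_covs), n_seen + n


def symmetric_kl(m1, c1, m2, c2):
    """Symmetric KL divergence between two Gaussians (one possible similarity test)."""
    def kl(ma, ca, mb, cb):
        d = ma.shape[0]
        inv_b = np.linalg.inv(cb)
        diff = mb - ma
        return 0.5 * (np.trace(inv_b @ ca) + diff @ inv_b @ diff - d
                      + np.log(np.linalg.det(cb) / np.linalg.det(ca)))
    return kl(m1, c1, m2, c2) + kl(m2, c2, m1, c1)


def merge_similar_components(weights, means, covs, target_order):
    """Greedily merge the most similar pair of components until the mixture
    reaches the (externally estimated) target model order."""
    weights, means, covs = list(weights), list(means), list(covs)
    while len(weights) > target_order:
        pairs = [(symmetric_kl(means[i], covs[i], means[j], covs[j]), i, j)
                 for i in range(len(weights)) for j in range(i + 1, len(weights))]
        _, i, j = min(pairs)
        w = weights[i] + weights[j]
        mu = (weights[i] * means[i] + weights[j] * means[j]) / w
        cov = (weights[i] * (covs[i] + np.outer(means[i] - mu, means[i] - mu))
               + weights[j] * (covs[j] + np.outer(means[j] - mu, means[j] - mu))) / w
        for idx in sorted((i, j), reverse=True):
            del weights[idx], means[idx], covs[idx]
        weights.append(w)
        means.append(mu)
        covs.append(cov)
    return np.array(weights), np.array(means), np.array(covs)

In a block-evolutionary setting, update_block would be applied once per incoming data block, followed by merge_similar_components with a model order estimated from the model's evolving history; how that order is estimated is the accuracy-critical part addressed by the paper and is not reproduced here.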
