
Browsing by Author "Jin, Qianyue"


  • Jin, Qianyue (2021)
    Mixture of Experts (MoE) is a machine learning approach that uses multiple expert models to solve a task. By combining the perspectives of the individual experts through a product of their outputs, the overall system can produce comprehensive decisions, and the hope is that each expert can then focus on modeling a different aspect of the data. In this thesis, we study MoEs in the context of deep learning and image classification through empirical comparisons. The experiments vary the dataset, the gating and expert networks, the number of experts, and the objective function for the gate, and we measure both performance (classification accuracy) and behavior (how the gate distributes the data over the experts). Since the mixtures of networks we tested perform mostly on par with a single-network baseline, we conclude that either the mixture of experts is not well suited to being learned for image classification tasks, or that it requires different engineering in the architecture and/or the choice of optimization algorithm.
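
    The abstract describes gating and expert networks whose outputs are combined into one classifier. The sketch below, which is not taken from the thesis, shows one way such a mixture could look in PyTorch: a gating network produces per-input weights over the experts, and the experts' class distributions are combined as a weighted product in log space. The small CNN, the number of experts, the CIFAR-sized inputs, and the loss are illustrative assumptions, not the thesis's actual architectures or gate objectives.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class SmallConvNet(nn.Module):
        """A small CNN used as a stand-in expert/gate backbone (illustrative only)."""

        def __init__(self, num_outputs: int = 10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, num_outputs)

        def forward(self, x):
            return self.head(self.features(x).flatten(1))  # unnormalized scores


    class MixtureOfExperts(nn.Module):
        """Gate weights the experts; expert class distributions are combined
        as a weighted product (geometric mixture) in log space."""

        def __init__(self, num_experts: int = 4, num_classes: int = 10):
            super().__init__()
            self.experts = nn.ModuleList(
                SmallConvNet(num_classes) for _ in range(num_experts)
            )
            self.gate = SmallConvNet(num_experts)  # one score per expert

        def forward(self, x):
            # Gate: a distribution over experts for each input image.
            gate_weights = F.softmax(self.gate(x), dim=-1)            # (B, E)
            # Each expert: log class probabilities for each input image.
            expert_logp = torch.stack(
                [F.log_softmax(e(x), dim=-1) for e in self.experts], dim=1
            )                                                          # (B, E, C)
            # Weighted product of distributions = weighted sum of log-probs,
            # renormalized with log-sum-exp so the result is a distribution.
            combined = (gate_weights.unsqueeze(-1) * expert_logp).sum(dim=1)
            log_probs = combined - torch.logsumexp(combined, dim=-1, keepdim=True)
            return log_probs, gate_weights


    if __name__ == "__main__":
        model = MixtureOfExperts(num_experts=4, num_classes=10)
        images = torch.randn(8, 3, 32, 32)          # dummy CIFAR-sized batch
        labels = torch.randint(0, 10, (8,))
        log_probs, gate_weights = model(images)
        loss = F.nll_loss(log_probs, labels)         # model returns log-probs
        print(loss.item(), gate_weights.mean(dim=0)) # how the gate spreads the batch

    Replacing the log-space combination with a plain weighted sum of the expert probabilities would give the more familiar soft-mixture formulation; inspecting gate_weights per batch is one way to observe how the gate distributes the data over the experts, as studied in the thesis.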