
Deep Bayesian Self-Training

journal contribution
posted on 2024-03-12, 17:34 authored by Fabio De Sousa Ribeiro, Francesco Caliva, Mark Swainson, Kjartan Gudmundsson, George Leontidis, Stefanos Kollias

Supervised Deep Learning has been highly successful in recent years, achieving state-of-the-art results in most tasks. However, with the ongoing uptake of such methods in industrial applications, the requirement for large amounts of annotated data is often a challenge. In most real-world problems, manual annotation is practically intractable due to time/labour constraints, so the development of automated and adaptive data annotation systems is highly sought after. In this paper, we propose both (i) a Deep Bayesian Self-Training methodology for automatic data annotation, which leverages predictive uncertainty estimates obtained via variational inference and modern Neural Network architectures, and (ii) a practical adaptation procedure for handling high label variability between different dataset distributions through clustering of Neural Network latent variable representations. An experimental study on both public and private datasets is presented, illustrating the superior performance of the proposed approach over standard Self-Training baselines and highlighting the importance of predictive uncertainty estimates in safety-critical domains.
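The core idea summarised above is to let predictive uncertainty decide which automatically generated labels are trustworthy enough to be added to the training set. The snippet below is a minimal illustrative sketch of that selection step, assuming Monte Carlo dropout as the variational-inference approximation and an entropy threshold as the acceptance criterion; the model, threshold value, and data are placeholder assumptions, not the paper's exact implementation.

```python
# Sketch of uncertainty-aware pseudo-labelling with Monte Carlo dropout.
# Model architecture, threshold, and data below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallClassifier(nn.Module):
    """Toy classifier with dropout, so repeated stochastic forward passes
    approximate sampling from a posterior over predictions."""
    def __init__(self, in_dim=32, n_classes=10, p_drop=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=20):
    """Run several forward passes with dropout active; return the mean
    predictive distribution and its predictive entropy (uncertainty)."""
    model.train()  # keep dropout active at inference time
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    mean_probs = probs.mean(dim=0)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy

def select_pseudo_labels(model, unlabelled_x, entropy_threshold=0.5):
    """Self-training step: keep only examples whose predictive entropy is
    below the threshold, and take the argmax class as the pseudo-label."""
    mean_probs, entropy = mc_dropout_predict(model, unlabelled_x)
    confident = entropy < entropy_threshold
    return unlabelled_x[confident], mean_probs[confident].argmax(dim=-1)

if __name__ == "__main__":
    model = SmallClassifier()
    unlabelled = torch.randn(256, 32)  # stand-in for unlabelled data
    x_new, y_new = select_pseudo_labels(model, unlabelled)
    print(f"Pseudo-labelled {x_new.shape[0]} of {unlabelled.shape[0]} examples")
```

In a full self-training loop, the accepted pseudo-labelled examples would be merged into the labelled set and the model retrained; rejected high-uncertainty examples are the natural candidates for manual annotation, which is what makes the uncertainty estimate valuable in safety-critical settings.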

History

School affiliated with

  • School of Computer Science (Research Outputs)

Publication Title

Neural Computing and Applications

Volume

32

Issue

9

Pages/Article Number

4275-4291

Publisher

Springer

ISSN

0941-0643

eISSN

1433-3058

Date Submitted

2019-07-05

Date Accepted

2019-06-28

Date of First Publication

2019-07-10

Date of Final Publication

2020-05-01

Date Document First Uploaded

2019-07-03

ePrints ID

36321