University of Lincoln

A Dataset for Action Recognition in the Wild

conference contribution
posted on 2024-02-07, 19:50 authored by Paul Baxter, Nicola Bellotto, Alexander Gabriel, Serhan Cosar
<p>The development of autonomous robots for agriculture depends on a successful approach to recognizing user needs, as well as on datasets that reflect the characteristics of the domain. Available datasets for 3D Action Recognition generally feature controlled lighting and framing while recording subjects from the front. They mostly reflect good recording conditions and therefore fail to account for the highly variable conditions a robot would have to cope with in the field, e.g. when providing in-field logistic support for human fruit pickers, as in our scenario. Existing work on Intention Recognition mostly labels plans or actions as intentions, but neither fully captures the extent of human intent. In this work, we argue for a holistic view of human Intention Recognition and propose a set of recording conditions, gestures and behaviors that better reflect the environment and conditions an agricultural robot might find itself in. We demonstrate the utility of the dataset by evaluating two human detection methods: bounding boxes and skeleton extraction.</p>
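The abstract mentions evaluating bounding-box human detection on the dataset. A standard metric for scoring such detections against ground-truth annotations is Intersection-over-Union (IoU); the sketch below is a generic illustration of that metric, not code from the paper, and the box coordinates are hypothetical.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the overlapping region, if any.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Hypothetical predicted detection vs. ground-truth annotation.
print(iou((10, 10, 50, 50), (30, 30, 70, 70)))  # → 0.142857...
```

A detection is then typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5.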

History

School affiliated with

  • School of Computer Science (Research Outputs)

Volume

11649

Publisher

Springer

ISSN

0302-9743

Date Submitted

2019-07-05

Date Accepted

2019-04-04

Date of First Publication

2019-06-28

Date of Final Publication

2019-06-28

Event Name

TAROS 2019

Event Dates

3rd - 5th July 2019

Date Document First Uploaded

2019-07-05

ePrints ID

36395
