<?xml version='1.0' encoding='utf-8'?>
<eprints xmlns='http://eprints.org/ep2/data/2.0'>
  <eprint id='https://researchdata.bath.ac.uk/id/eprint/1574'>
    <eprintid>1574</eprintid>
    <rev_number>38</rev_number>
    <documents>
      <document id='https://researchdata.bath.ac.uk/id/document/19381'>
        <docid>19381</docid>
        <rev_number>4</rev_number>
        <files>
          <file id='https://researchdata.bath.ac.uk/id/file/78106'>
            <fileid>78106</fileid>
            <datasetid>document</datasetid>
            <objectid>19381</objectid>
            <filename>RoundaboutHD.zip</filename>
            <mime_type>application/zip</mime_type>
            <hash>26ab511a5d89b7ea860bdd9f473cc7e1</hash>
            <hash_type>MD5</hash_type>
            <filesize>8075314494</filesize>
            <mtime>2025-07-24 16:12:23</mtime>
            <url>https://researchdata.bath.ac.uk/1574/1/RoundaboutHD.zip</url>
          </file>
        </files>
        <eprintid>1574</eprintid>
        <pos>1</pos>
        <placement>1</placement>
        <mime_type>application/zip</mime_type>
        <format>other</format>
        <formatdesc>The RoundaboutHD dataset.</formatdesc>
        <language>en</language>
        <security>public</security>
        <license>cc_mit</license>
        <main>RoundaboutHD.zip</main>
        <content>data</content>
      </document>
    </documents>
    <eprint_status>archive</eprint_status>
    <userid>14060</userid>
    <dir>disk0/00/00/15/74</dir>
    <datestamp>2025-08-04 07:32:15</datestamp>
    <lastmod>2025-08-12 07:52:58</lastmod>
    <status_changed>2025-08-04 07:32:15</status_changed>
    <type>data_collection</type>
    <metadata_visibility>show</metadata_visibility>
    <creators>
      <item>
        <name>
          <family>Lin</family>
          <given>Yuqiang</given>
        </name>
        <id>yl4300@bath.ac.uk</id>
        <affiliation>University of Bath</affiliation>
        <contact>TRUE</contact>
      </item>
      <item>
        <name>
          <family>Lockyer</family>
          <given>Sam</given>
        </name>
        <id>sl2726@bath.ac.uk</id>
        <affiliation>University of Bath</affiliation>
        <contact>FALSE</contact>
      </item>
    </creators>
    <contributors>
      <item>
        <type>DataCollector</type>
        <name>
          <family>Sui</family>
          <given>Michael</given>
        </name>
        <id>ms2832@bath.ac.uk</id>
        <affiliation>University of Bath</affiliation>
      </item>
      <item>
        <type>DataCollector</type>
        <name>
          <family>Gan</family>
          <given>Lee</given>
        </name>
        <id>lg561@bath.ac.uk</id>
        <affiliation>University of Bath</affiliation>
      </item>
      <item>
        <type>Other</type>
        <name>
          <family>Stanek</family>
          <given>Florian</given>
        </name>
        <id>flstanek@googlemail.com</id>
        <affiliation>Starwit Technologies GmbH</affiliation>
      </item>
      <item>
        <type>Other</type>
        <name>
          <family>Zarbock</family>
          <given>Markus</given>
        </name>
        <id>markus.zarbock@starwit.de</id>
        <affiliation>Starwit Technologies GmbH</affiliation>
      </item>
      <item>
        <type>Supervisor</type>
        <name>
          <family>Li</family>
          <given>Wenbin</given>
        </name>
        <id>W.Li@bath.ac.uk</id>
        <orcid>0000-0002-5593-2599</orcid>
        <affiliation>University of Bath</affiliation>
      </item>
      <item>
        <type>Supervisor</type>
        <name>
          <family>Evans</family>
          <given>Adrian</given>
        </name>
        <id>A.N.Evans@bath.ac.uk</id>
        <orcid>0000-0001-8586-8295</orcid>
        <affiliation>University of Bath</affiliation>
      </item>
      <item>
        <type>Supervisor</type>
        <name>
          <family>Zhang</family>
          <given>Nic</given>
        </name>
        <id>qz254@bath.ac.uk</id>
        <orcid>0000-0002-3752-0689</orcid>
        <affiliation>University of Bath</affiliation>
      </item>
    </contributors>
    <corp_creators>
      <item>Starwit Technologies GmbH</item>
    </corp_creators>
    <title>Dataset for &quot;RoundaboutHD: High-Resolution Real-World Urban Environment Benchmark for Multi-Camera Vehicle Tracking&quot;</title>
    <subjects>
      <item>CP0130</item>
      <item>FB0010</item>
      <item>FB0110</item>
      <item>FB0160</item>
    </subjects>
    <divisions>
      <item>dept_mech_eng</item>
    </divisions>
    <keywords>Multi-Camera Vehicle Tracking, Object Detection, Single-Camera Tracking, Vehicle Re-Identification</keywords>
    <abstract>The multi-camera vehicle tracking (MCVT) framework holds significant potential for smart city applications, including anomaly detection, traffic density estimation, and suspect vehicle tracking. However, current publicly available datasets exhibit limitations, such as overly simplistic scenarios, low-resolution footage, and insufficiently diverse conditions, creating a considerable gap between academic research and real-world scenarios. To fill this gap, we introduce RoundaboutHD, a comprehensive, high-resolution multi-camera vehicle tracking benchmark dataset specifically designed to represent real-world roundabout scenarios. RoundaboutHD provides a total of 40 minutes of labelled video footage captured by four non-overlapping, high-resolution (4K resolution, 15 fps) cameras. In total, 512 unique vehicle identities are annotated across different camera views, offering rich cross-camera association data. RoundaboutHD offers temporally consistent video footage and enhanced challenges, including increased occlusions and nonlinear movement inside the roundabout. In addition to the full MCVT dataset, several subsets are also available for object detection, single-camera tracking, and image-based vehicle re-identification (ReID) tasks. Vehicle model information and camera modelling/geometry information are also included to support further analysis. We provide baseline results for vehicle detection, single-camera tracking, image-based vehicle re-identification, and multi-camera tracking. The dataset is publicly available.</abstract>
    <date>2025-08-04</date>
    <publisher>University of Bath</publisher>
    <full_text_status>public</full_text_status>
    <funding>
      <item>
        <funder_name>Engineering and Physical Sciences Research Council</funder_name>
        <funder_id>https://doi.org/10.13039/501100000266</funder_id>
        <grant_id>EP/S023364/1</grant_id>
        <project_name>EPSRC Centre for Doctoral Training in Advanced Automotive Propulsion Systems</project_name>
      </item>
    </funding>
    <research_centres>
      <item>inst_aaps</item>
    </research_centres>
    <collection_method>The dataset was collected using fixed-position, real-world traffic cameras located in Indiana, USA, provided by an industrial partner under a collaborative agreement. The video footage was captured under natural driving conditions, without experimental interference, to reflect realistic urban traffic patterns. All annotations were manually curated using a custom-built semi-automated labeling toolkit developed specifically for this project. This tool significantly enhanced annotation efficiency while ensuring high labeling accuracy. The labeling process included object detection, tracking, and identity association across multiple cameras.</collection_method>
    <provenance>No third-party datasets are used.</provenance>
    <techinfo>The data was collected using fixed-position traffic surveillance cameras provided by an industrial partner. Each camera recorded 4K-resolution video at 15 frames per second under real-world traffic conditions. The annotation process was conducted using a custom-built semi-automated labeling toolkit developed in Python, running on Ubuntu 20.04. Key libraries and frameworks used include OpenCV, NumPy, and Matplotlib for visualization and annotation support.

To view and utilize the dataset, users will require basic tools for handling image and text data (e.g., Python with OpenCV) and a machine with sufficient storage and memory to process high-resolution video frames and annotation files. The dataset follows YOLO-style text annotations for detection tasks and CSV-format files for tracking metadata.
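A minimal sketch of reading one detection annotation, assuming the common YOLO convention (class id followed by centre x, centre y, width, and height, all normalised to [0, 1]); the exact field order used by RoundaboutHD should be confirmed against the documentation in the associated GitHub repository:

```python
# Parse one YOLO-style annotation line into a class id and a
# pixel-space bounding box. Field order (class, x_center, y_center,
# width, height, normalised to [0, 1]) is the common YOLO convention
# and is an assumption here, not confirmed by this record.
def parse_yolo_line(line, img_w, img_h):
    cls, xc, yc, w, h = line.split()
    xc, yc, w, h = (float(v) for v in (xc, yc, w, h))
    # Convert normalised centre/size to corner coordinates in pixels.
    x1 = (xc - w / 2) * img_w
    y1 = (yc - h / 2) * img_h
    x2 = (xc + w / 2) * img_w
    y2 = (yc + h / 2) * img_h
    return int(cls), (x1, y1, x2, y2)

# Example for a 4K (3840x2160) frame:
cls, box = parse_yolo_line("2 0.5 0.5 0.25 0.25", 3840, 2160)
# cls == 2, box == (1440.0, 810.0, 2400.0, 1350.0)
```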

For reproducibility, we provide the labeling toolkit and evaluation scripts in the associated GitHub repository, along with documentation detailing the annotation format and dataset structure.</techinfo>
    <methodurl>
      <item>https://github.com/siri-rouser/multi_camera_tracking_labelling_tool</item>
    </methodurl>
    <language>en</language>
    <version>1</version>
    <doi>10.15125/BATH-01574</doi>
    <related_resources>
      <item>
        <link>https://arxiv.org/abs/2507.08729</link>
        <type>pub</type>
      </item>
    </related_resources>
    <access_types>
      <item>open</item>
    </access_types>
    <resourcetype>
      <general>Dataset</general>
    </resourcetype>
  </eprint>
</eprints>
