<?xml version='1.0' encoding='utf-8'?>
<eprints xmlns='http://eprints.org/ep2/data/2.0'>
  <eprint id='https://researchdata.bath.ac.uk/id/eprint/1303'>
    <eprintid>1303</eprintid>
    <rev_number>24</rev_number>
    <documents>
      <document id='https://researchdata.bath.ac.uk/id/document/17300'>
        <docid>17300</docid>
        <rev_number>3</rev_number>
        <files>
          <file id='https://researchdata.bath.ac.uk/id/file/62069'>
            <fileid>62069</fileid>
            <datasetid>document</datasetid>
            <objectid>17300</objectid>
            <filename>multimodal_dataset.zip</filename>
            <mime_type>application/zip</mime_type>
            <hash>d54b3bc2716b0c054c83163cdbcd1bc0</hash>
            <hash_type>MD5</hash_type>
            <filesize>169743</filesize>
            <mtime>2023-07-26 17:40:04</mtime>
            <url>https://researchdata.bath.ac.uk/1303/1/multimodal_dataset.zip</url>
          </file>
        </files>
        <eprintid>1303</eprintid>
        <pos>1</pos>
        <placement>1</placement>
        <mime_type>application/zip</mime_type>
        <format>other</format>
        <formatdesc>Data collected from five different textures with the multimodal soft tactile sensor presented in the work &quot;Soft Tactile Sensor with Multimodal Data Processing for Texture Recognition&quot;. The zip file contains 5 folders, one per texture, each holding 5 data samples from piezovibration sensing, 5 data samples from velostat sensing, and 5 data samples from IMU sensing.</formatdesc>
        <language>en</language>
        <security>public</security>
        <license>cc_by</license>
        <main>multimodal_dataset.zip</main>
        <content>data</content>
      </document>
    </documents>
    <eprint_status>archive</eprint_status>
    <userid>9738</userid>
    <dir>disk0/00/00/13/03</dir>
    <datestamp>2024-03-14 09:08:55</datestamp>
    <lastmod>2024-07-15 11:00:11</lastmod>
    <status_changed>2024-03-14 09:08:55</status_changed>
    <type>data_collection</type>
    <metadata_visibility>show</metadata_visibility>
    <creators>
      <item>
        <name>
          <family>Martinez-Hernandez</family>
          <given>Uriel</given>
        </name>
        <id>U.Martinez@bath.ac.uk</id>
        <orcid>0000-0002-9922-7912</orcid>
        <affiliation>University of Bath</affiliation>
        <contact>TRUE</contact>
      </item>
    </creators>
    <title>Dataset for &quot;Soft Tactile Sensor with Multimodal Data Processing for Texture Recognition&quot;</title>
    <subjects>
      <item>EA0090</item>
      <item>KS0100</item>
    </subjects>
    <divisions>
      <item>dept_elec_eng</item>
    </divisions>
    <keywords>multimodal sensing, machine learning, texture recognition, soft tactile sensors, robotics</keywords>
    <abstract>The dataset is composed of 5 folders containing tactile data from 5 different textures. Each folder holds 15 files: 5 text files with data from the piezovibration sensing modality, 5 text files with data from the velostat (piezoresistive) sensing modality, and 5 text files with data from the IMU (accelerometer, gyroscope) and pressure sensing. This dataset was collected to investigate the potential of the proposed multimodal soft tactile sensor to acquire data from multiple sensing elements, individually and in combination, for texture recognition.</abstract>
    <date>2023-08-01</date>
    <publisher>University of Bath</publisher>
    <full_text_status>public</full_text_status>
    <corp_contributors>
      <item>
        <type>RightsHolder</type>
        <corpname>University of Bath</corpname>
      </item>
    </corp_contributors>
    <funding>
      <item>
        <funder_name>Engineering and Physical Sciences Research Council</funder_name>
        <funder_id>https://doi.org/10.13039/501100000266</funder_id>
        <grant_id>EP/V051083/1</grant_id>
        <project_name>Manufacturing in Hospital: BioMed 4.0</project_name>
      </item>
    </funding>
    <research_centres>
      <item>cent_aur</item>
      <item>cent_dmade</item>
    </research_centres>
    <collection_method>The data collection methodology is described in the associated paper.</collection_method>
    <language>en</language>
    <version>1</version>
    <doi>10.15125/BATH-01303</doi>
    <related_resources>
      <item>
        <link>https://doi.org/10.1109/LSENS.2023.3300796</link>
        <type>pub</type>
      </item>
    </related_resources>
    <access_types>
      <item>open</item>
    </access_types>
    <resourcetype>
      <general>Dataset</general>
    </resourcetype>
  </eprint>
</eprints>
