KinematicGraspingMovements.xml

XML-based description - Florian Lier, 2011-09-02 14:05

<?xml version="1.0" encoding="UTF-8" ?>
<opendatametainfo>
  <title>Kinematic data of grasping movements directed towards virtual and real objects</title>
  <organisation>
        <name>CITEC -- Center of Excellence Cognitive Interaction Technology, Bielefeld University</name>
        <url>https://cit-ec.de</url>
  </organisation>
  <description>
Eleven right-handed subjects (age: 24-39 years, 4 women) participated in a series of three experiments. All subjects had normal or corrected-to-normal vision and no known impairments related to arm or hand movement. All subjects gave written informed consent to take part in the study. The experiment was carried out according to the principles laid out in the 1964 Declaration of Helsinki. Subjects performed all three experiments in the same order, starting with Experiment 1, directly followed by Experiment 2, and then Experiment 3. The experiments were carried out at the Manual Intelligence Lab, making use of its sophisticated multimodal set-up for investigating manual interaction (Maycock et al., 2010). During the data collection, the subjects stood in front of a table (dimensions 210 x 130 x 100 cm). Subjects wore an Immersion CyberGlove II wireless data glove (Immersion Corp., San Jose, CA; data acquisition rate: 100 Hz; sensor resolution: 1 degree) on the right hand, allowing whole-hand kinematics (22 DOF) to be recorded. In front of the subject (at a distance of 40 cm), a holding device for spherical objects (a golf tee) was positioned on the table. A laptop computer screen was positioned behind the holding device. A small round bowl (10 cm in diameter) located 40 cm to the right of the holding device served as the target for placing the objects. A 14-camera Vicon digital optical motion capture system (Vicon, Los Angeles, CA) mounted around the table was used to monitor the trajectories of the hand movements via three retro-reflective markers placed on the back of the data glove.
  </description>
  <projects>
        <project>
        <name>MINDA</name>
        <url>http://www.cit-ec.de/research/MINDA</url>
        <description>
MINDA is creating an incrementally growing database of manual interactions to help put manual intelligence research on a firmer empirical basis. This involves the study of manual interactions in humans using a multi-sensing approach. The database contains geometry, tactile sensor, vision and sound information. Using these multimodal information sources allows us to build models that can help robots carry out complex tasks of the type that humans perform with ease.
        </description>
        </project>
        <project>
        <name>CORTESMA</name>
        <url>http://www.cit-ec.de/research/CORTESMA</url>
        <description>
CORTESMA investigated hand kinematics and mental representations of grasping movements directed towards real and virtual spherical objects systematically varying in size. Results suggest that grasping movements are influenced by object size at an early stage of the movement for both real and virtual objects. The analyses of mental representations (via SDA) and of motor synergies (via PCA) reveal a separation of the three smallest objects from the larger ones, pointing towards a conceptual influence on the grasping movement.
        </description>
        </project>
  </projects>
  <version>v1.0.0</version>
  <date>2011-07-20</date>
  <creators>
        <creator>
        <name>Jonathan Maycock</name>
        <url>http://www.cit-ec.de/users/jmaycock</url>
        </creator>

        <creator>
        <name>Bettina Blaesing</name>
        <url>http://www.cit-ec.de/users/bblaesin</url>
        </creator>

        <creator>
        <name>Till Bockemuehl</name>
        <url></url>
        </creator>

        <creator>
        <name>Helge Ritter</name>
        <url>https://www.cit-ec.de/users/helge</url>
        </creator>

        <creator>
        <name>Thomas Schack</name>
        <url>https://www.cit-ec.de/users/tschack</url>
        </creator>
  </creators>
  <contributors>
  <contributor>
          <name>-</name>
          <url>-</url>
  </contributor>
  </contributors>
  <downloadurls>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/23/Subject10.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/19/Subject1.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/20/Subject2.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/21/Subject3.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/22/Subject4.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/24/Subject5.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/15/Subject6.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/16/Subject7.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/17/Subject8.tar.gz</downloadurl>
        <downloadurl>http://opensource.cit-ec.de/attachments/download/19/Subject9.tar.gz</downloadurl>
  </downloadurls>
  <license>
        MINDA, CORTESMA: 'Kinematic data of grasping movements directed towards virtual and real objects', Copyright (c) Jonathan Maycock, Bettina Blaesing, Till Bockemuehl, Helge Ritter and Thomas Schack, http://opendata.cit-ec.de/projects/virtual-and-real-grasping. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. The development of this database was supported by the Excellence Cluster EXC 277 Cognitive Interaction Technology. The Excellence Cluster EXC 277 is a grant of the Deutsche Forschungsgemeinschaft (DFG) in the context of the German Excellence Initiative. 'Kinematic data of grasping movements directed towards virtual and real objects' is made available under the Open Database License: http://opendatacommons.org/licenses/odbl/1.0/. Any rights in individual contents of the database are licensed under the Database Contents License: http://opendatacommons.org/licenses/dbcl/1.0/.
  </license>
  <keywords>
        <keyword>manual action</keyword>
        <keyword>grasping</keyword>
        <keyword>hand kinematics</keyword>
        <keyword>virtual objects</keyword>
        <keyword>motor synergies</keyword>
        <keyword>mental representation of movement</keyword>
  </keywords>
  <structure>
        Time series of joint angles and hand positions. Note that the CyberGlove captures at approximately 90 Hz and the Vicon system at exactly 200 Hz.
  </structure>
  <formats>
        <format>XML</format>
        <format>CSV</format>
        <format>C3D</format>
  </formats>
  <relation>http://pub.uni-bielefeld.de/pub?func=drec&#38;id=2034003</relation>
  <acknowledgements>
        The development of this database was supported by the Excellence Cluster EXC 277 Cognitive Interaction Technology. The Excellence Cluster EXC 277 is a grant of the Deutsche Forschungsgemeinschaft (DFG) in the context of the German Excellence Initiative.
  </acknowledgements>
</opendatametainfo>
<!-- Future: Add RDF description/attributes -->