

{"id":160,"date":"2019-06-18T11:11:47","date_gmt":"2019-06-18T09:11:47","guid":{"rendered":"https:\/\/project.inria.fr\/bowi\/?page_id=160"},"modified":"2019-06-18T11:11:53","modified_gmt":"2019-06-18T09:11:53","slug":"usage-and-case-studies","status":"publish","type":"page","link":"https:\/\/project.inria.fr\/bowi\/usage-and-case-studies\/","title":{"rendered":"Usage and Case Studies"},"content":{"rendered":"<p><\/p>\n<p id=\"aui_3_4_0_1_997\"><span id=\"aui_3_4_0_1_996\">BoWI proposes the first &#8220;Hierarchical Posture Grammar&#8221; based on Arm, Back and Leg postures.\u00a0<\/span><\/p>\n<div>\n<p class=\"p1\">BoWI opens new, unexplored opportunities for interactions between users (inter-BoWI), and between\u00a0users and the environment. Original use cases have been proposed to test the BoWI system.\u00a0Considering the possibility of identifying postures with low-cost, low-power resources, the BoWI team came up with three proposals related to posture recognition.<\/p>\n<h4 class=\"p1\">Combinatorial alphabet<\/h4>\n<p class=\"p1\">The first one (Fig. 1) is the proposal of a combinatorial alphabet of gestures that can be used for\u00a0any kind of control. This idea has three advantages. First, the combination of simple local libraries (ARM, LEG, &#8230;)\u00a0gives a huge number of possible postures. Second, each local gesture can be identified with precomputed\u00a0signatures that make the discrimination of gestures easier in noisy contexts. 
Finally, the gestures themselves carry no intrinsic meaning, so the library is general and can be exploited for different types of control.<\/p>\n<div id=\"attachment_110\" style=\"width: 667px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage1.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-110\" class=\"size-full wp-image-110\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage1.png\" alt=\"\" width=\"657\" height=\"443\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage1.png 657w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage1-300x202.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage1-150x101.png 150w\" sizes=\"auto, (max-width: 657px) 100vw, 657px\" \/><\/a><p id=\"caption-attachment-110\" class=\"wp-caption-text\">Fig. 1: Alphabet of postures.<\/p><\/div>\n<h4 class=\"p1\">Imitation game<\/h4>\n<p id=\"aui_3_4_0_1_1037\" class=\"p1\">The second case study (Fig. 2) is a game inspired by the famous eighties toy called &#8220;Simon&#8221;.\u00a0<span id=\"aui_3_4_0_1_1036\">Here the idea is that each player must replay the sequence of previously registered gestures and then add a new one.\u00a0<\/span>Registration and verification are performed by the BoWI system. 
This game can also be used to popularize\u00a0the BoWI Gesture Alphabet.<\/p>\n<div id=\"attachment_109\" style=\"width: 518px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage2.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-109\" class=\"size-full wp-image-109\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage2.png\" alt=\"\" width=\"508\" height=\"381\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage2.png 508w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage2-300x225.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage2-150x113.png 150w\" sizes=\"auto, (max-width: 508px) 100vw, 508px\" \/><\/a><p id=\"caption-attachment-109\" class=\"wp-caption-text\">Fig. 2: Gestural &#8220;Simon&#8221; game.<\/p><\/div>\n<h4 id=\"aui_3_4_0_1_1086\" class=\"p2\">Functional rehabilitation<\/h4>\n<p class=\"p1\">The third case study is the recording of daily gestures to be analysed by physicians or occupational therapists.\u00a0Many practical applications can benefit from this case study, such as monitoring posture at work,\u00a0monitoring elderly people at home, and functional rehabilitation at home.<\/p>\n<div id=\"attachment_108\" style=\"width: 480px\" class=\"wp-caption aligncenter\"><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage3.png\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-108\" class=\"size-full wp-image-108\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage3.png\" alt=\"\" width=\"470\" height=\"352\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage3.png 470w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage3-300x225.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage3-150x112.png 150w\" sizes=\"auto, (max-width: 470px) 100vw, 470px\" \/><\/a><p id=\"caption-attachment-108\" 
class=\"wp-caption-text\">Fig. 3: Remote, everyday monitoring.<\/p><\/div>\n<div id=\"aui_3_4_0_1_1122\">\n<h3><strong>Zyggie Demonstrator scenario (see videos)<\/strong><\/h3>\n<p><strong>SIMON (user vs. Computer)<\/strong><\/p>\n<ul id=\"aui_3_4_0_1_1121\">\n<li>Computer adds a new posture to a sequence the user must replay<\/li>\n<li id=\"aui_3_4_0_1_1120\">To be used to compare classification approaches:\n<ul id=\"aui_3_4_0_1_1119\">\n<li>Data (IMU, with\/without Gyro, RSSI patterns, distances, etc.)<\/li>\n<li id=\"aui_3_4_0_1_1118\">Methods: PCA, K-means, SVM+NN (+ learning phase)<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage4.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-107 aligncenter\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage4.png\" alt=\"\" width=\"682\" height=\"202\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage4.png 682w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage4-300x89.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage4-150x44.png 150w\" sizes=\"auto, (max-width: 682px) 100vw, 682px\" \/><\/a><\/p>\n<div id=\"aui_3_4_0_1_1144\"><strong id=\"aui_3_4_0_1_1143\">Outdoor Functional Rehabilitation = Remote Physiotherapist<\/strong><\/div>\n<div>a)\u00a0\u00a0\u00a0\u00a0 Calibration<\/div>\n<div>b)\u00a0\u00a0\u00a0\u00a0 Learning a sequence of postures (= gestures)<\/div>\n<div>c)\u00a0\u00a0\u00a0\u00a0 Replay<\/div>\n<ul>\n<li>Constraints:<\/li>\n<\/ul>\n<ol>\n<li>T<sub>end<\/sub> \u2013 T<sub>start<\/sub> &lt; T<sub>d<\/sub><\/li>\n<li>T<sub>j+1<\/sub> \u2013 T<sub>j<\/sub> &lt; T<sub>dj<\/sub><\/li>\n<li>Posture Matching<\/li>\n<\/ol>\n<ul>\n<li>Classification + Timing observation<\/li>\n<\/ul>\n<p><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-161 aligncenter\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1.png\" alt=\"\" width=\"826\" height=\"191\" 
srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1.png 826w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1-300x69.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1-768x178.png 768w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage5-1-150x35.png 150w\" sizes=\"auto, (max-width: 826px) 100vw, 826px\" \/><\/a><\/p>\n<h2 id=\"aui_3_4_0_1_315\"><strong id=\"aui_3_4_0_1_314\">Prospective usage<\/strong><\/h2>\n<p>Low-cost outdoor motion capture: a BoWI BAN (player gestures) + 2 or 3\u00a0BoWI nodes as fixed beacons around the field\u00a0(geolocation)<\/p>\n<\/div>\n<p><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage6.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-105 aligncenter\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage6.png\" alt=\"\" width=\"440\" height=\"278\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage6.png 440w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage6-300x190.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage6-150x95.png 150w\" sizes=\"auto, (max-width: 440px) 100vw, 440px\" \/><\/a><\/p>\n<p id=\"aui_3_4_0_1_338\"><strong>Social network<\/strong>: share emotions with gestures<\/p>\n<p><a href=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage7.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-104 aligncenter\" src=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage7.png\" alt=\"\" width=\"514\" height=\"288\" srcset=\"https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage7.png 514w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage7-300x168.png 300w, https:\/\/project.inria.fr\/bowi\/files\/2019\/06\/usage7-150x84.png 150w\" sizes=\"auto, (max-width: 514px) 100vw, 514px\" \/><\/a><\/p>\n<\/div>\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>BoWI proposes the first &#8220;Hierarchical Posture Grammar&#8221; based on Arm, Back and 
Legs postures.\u00a0 BoWI opens new, unexplored opportunities for interactions between users (inter-BoWI), and\u2026<\/p>\n<p> <a class=\"continue-reading-link\" href=\"https:\/\/project.inria.fr\/bowi\/usage-and-case-studies\/\"><span>Continue reading<\/span><i class=\"crycon-right-dir\"><\/i><\/a> <\/p>\n","protected":false},"author":1611,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-160","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/pages\/160","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/users\/1611"}],"replies":[{"embeddable":true,"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/comments?post=160"}],"version-history":[{"count":1,"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/pages\/160\/revisions"}],"predecessor-version":[{"id":162,"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/pages\/160\/revisions\/162"}],"wp:attachment":[{"href":"https:\/\/project.inria.fr\/bowi\/wp-json\/wp\/v2\/media?parent=160"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}