Slideshows by User: SriramSubramanian2 (www.interact-lab.com)
SlideShare feed of slideshows by user SriramSubramanian2

The design space of shape-changing interfaces: a repertory grid study
Publication: Matthijs Kwak, Kasper Hornbæk, Panos Markopoulos, and Miguel Bruns Alonso. 2014. The design space of shape-changing interfaces: a repertory grid study. In Proceedings of the 2014 Conference on Designing Interactive Systems (DIS '14). ACM, New York, NY, USA, 181-190. DOI: http://dx.doi.org/10.1145/2598510.2598573

Abstract: Technologies for shape-changing user interfaces are rapidly evolving, but our understanding of the design space of such interfaces is still limited. We report a repertory grid study that aims to describe the design space from the users' point of view by eliciting personal constructs about shape-change. The study is based on six similar-sized, shape-changing artifacts that combine simple sensing of users with actuation that changes volume, texture, and orientation. Our results show that the 18 respondents distinguish artifacts on dimensions that differ from those of most models of shape change. For instance, they characterize shape-change in terms of personality, territoriality, and state of mind, in addition to more common categories such as appearance and product properties. We discuss how the dimensions derived from users might be used to design shape-changing interfaces.
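
For readers unfamiliar with the repertory grid technique, a grid can be thought of as an elements-by-constructs rating matrix. The sketch below is a hypothetical illustration only: the artifact labels, constructs, and ratings are invented, not taken from the paper, and it simply shows how such a grid might be stored and how one could rank constructs by how strongly they differentiate the artifacts.

```python
# Hypothetical repertory grid: keys are elicited bipolar constructs,
# values are 1..5 ratings of six shape-changing artifacts (names invented).
from statistics import pvariance

artifacts = ["A", "B", "C", "D", "E", "F"]
grid = {
    ("calm", "restless"):        [1, 4, 2, 5, 3, 2],
    ("inviting", "territorial"): [2, 5, 1, 4, 4, 3],
    ("machine-like", "animate"): [3, 3, 2, 5, 1, 4],
}

# Constructs whose ratings vary most across artifacts discriminate best.
ranked = sorted(grid.items(), key=lambda kv: pvariance(kv[1]), reverse=True)
for (left, right), ratings in ranked:
    print(f"{left} - {right}: variance {pvariance(ratings):.2f}, ratings {ratings}")
```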

An Empirical Characterization of Touch-Gesture Input-Force on Mobile Devices
Publication: Faisal Taher, Jason Alexander, John Hardy, and Eduardo Velloso. 2014. An Empirical Characterization of Touch-Gesture Input-Force on Mobile Devices. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces (ITS '14). ACM, New York, NY, USA, 195-204. DOI: http://dx.doi.org/10.1145/2669485.2669515

Abstract: Designers of force-sensitive user interfaces lack a ground-truth characterization of input force while performing common touch gestures (zooming, panning, tapping, and rotating). This paper provides such a characterization, first by deriving baseline force profiles in a tightly controlled user study, then by examining how these profiles vary across conditions such as form factor (mobile phone and tablet), interaction position (walking and sitting), and urgency (timed and untimed tasks). We conducted two user studies with 14 and 24 participants respectively and report: (1) force profile graphs that depict the force variations of common touch gestures, (2) the effect of the different conditions on force exerted and gesture completion time, and (3) the most common forces that users apply and the time taken to complete the gestures. This characterization is intended to aid the design of interactive devices that integrate force input with common touch gestures in different conditions.
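
To make the idea of a "baseline force profile" concrete, here is a minimal sketch, not the authors' analysis code, of turning per-gesture force samples into an average profile over normalised gesture time. It assumes each trial is a list of (timestamp, force) samples; the example values are invented.

```python
import numpy as np

def force_profile(trials, n_bins=50):
    """Average force over normalised gesture time (0..1) across trials.

    trials: list of trials, each a list of (t_seconds, force) samples for one
    gesture. Returns an array of n_bins mean force values.
    """
    resampled = []
    for samples in trials:
        t, f = zip(*samples)
        t = np.asarray(t, dtype=float)
        f = np.asarray(f, dtype=float)
        # Normalise each gesture to unit duration, then resample on a common grid.
        u = (t - t[0]) / (t[-1] - t[0])
        grid = np.linspace(0.0, 1.0, n_bins)
        resampled.append(np.interp(grid, u, f))
    return np.mean(resampled, axis=0)

# Two synthetic zoom-gesture trials (values purely illustrative).
trial1 = [(0.00, 0.1), (0.05, 0.4), (0.10, 0.6), (0.15, 0.3)]
trial2 = [(0.00, 0.2), (0.04, 0.5), (0.08, 0.7), (0.16, 0.2)]
print(force_profile([trial1, trial2], n_bins=5))
```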

Morphees: toward high "shape resolution" in self-actuated flexible mobile devices
Publication: Anne Roudaut, Abhijit Karnik, Markus Löchtefeld, and Sriram Subramanian. 2013. Morphees: toward high "shape resolution" in self-actuated flexible mobile devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 593-602. DOI: http://dx.doi.org/10.1145/2470654.2470738

Abstract: We introduce the term shape resolution, which adds to the existing definitions of screen and touch resolution. We propose a framework, based on a geometric model (Non-Uniform Rational B-splines), that defines a metric for shape resolution in ten features. We illustrate it by comparing current related work on shape-changing devices. We then propose the concept of Morphees: self-actuated flexible mobile devices that adapt their shapes on their own to the context of use in order to offer better affordances. For instance, when a game is launched, the mobile device morphs into a console-like shape by curling two opposite edges so it can be better grasped with two hands. We then create preliminary prototypes of Morphees to explore six different building strategies using advanced shape-changing materials (dielectric electroactive polymers and shape memory alloys). By comparing the shape resolution of our prototypes, we generate insights to help designers create high-shape-resolution Morphees.
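
Because the proposed shape-resolution metric builds on Non-Uniform Rational B-splines, a brief sketch of how a NURBS curve point is evaluated may help. This is textbook Cox-de Boor recursion, not code from the paper, and the control points, weights, and knot vector below are illustrative only.

```python
def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the i-th B-spline basis function of degree p at u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, p=2):
    """Evaluate a NURBS curve at parameter u; ctrl is a list of (x, y) control points."""
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(ctrl, weights)):
        b = bspline_basis(i, p, u, knots) * w
        num_x, num_y, den = num_x + b * x, num_y + b * y, den + b
    return (num_x / den, num_y / den)

# A small quadratic example (numbers chosen only for illustration).
ctrl = [(0, 0), (1, 2), (2, 0), (3, 1)]
weights = [1.0, 1.0, 2.0, 1.0]
knots = [0, 0, 0, 0.5, 1, 1, 1]
print(nurbs_point(0.25, ctrl, weights, knots))
```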

Deformable and Shape Changing Interfaces
Talk delivered by Prof. Kasper Hornbæk (University of Copenhagen) on 29 August 2014 in Eindhoven (Netherlands).

ReForm: Integrating Physical and Digital Design through Bidirectional Fabrication
Publication: Christian Weichel, John Hardy, Jason Alexander, and Hans Gellersen. 2015. ReForm: Integrating Physical and Digital Design through Bidirectional Fabrication. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 93-102. DOI: http://dx.doi.org/10.1145/2807442.2807451

Abstract: Digital fabrication machines such as 3D printers and laser cutters allow users to produce physical objects based on virtual models. The creation process is currently unidirectional: once an object is fabricated, it is separated from its originating virtual model. Consequently, users are tied into digital modeling tools, the virtual design must be completed before fabrication, and once fabricated, re-shaping the physical object no longer influences the digital model. To provide a more flexible design process that allows objects to evolve iteratively through both digital and physical input, we introduce bidirectional fabrication. To demonstrate the concept, we built ReForm, a system that integrates digital modeling with shape input, shape output, annotation for machine commands, and visual output. By continually synchronizing the physical object and the digital model, it supports object versioning, allowing physical changes to be undone. Through application examples, we demonstrate the benefits of ReForm to the digital fabrication process.
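
The "object versioning" idea, keeping the digital model and the physical object in sync so physical edits can be undone, can be illustrated with a minimal, hypothetical version-history sketch. The class and method names below are invented for illustration and are not ReForm's API.

```python
from dataclasses import dataclass, field

@dataclass
class ModelHistory:
    """Linear version history for a digital model kept in sync with a physical object."""
    versions: list = field(default_factory=list)   # snapshots of the model (e.g. meshes)
    cursor: int = -1                                # index of the current version

    def commit(self, snapshot, source):
        """Record a new version after a digital edit or a scanned physical change."""
        self.versions = self.versions[: self.cursor + 1]   # discard any redo branch
        self.versions.append((source, snapshot))
        self.cursor += 1

    def undo(self):
        """Step back one version; the caller would then re-fabricate this shape."""
        if self.cursor > 0:
            self.cursor -= 1
        return self.versions[self.cursor][1]

history = ModelHistory()
history.commit("mesh_v1", source="digital")    # initial CAD model
history.commit("mesh_v2", source="physical")   # change scanned from the physical object
print(history.undo())                          # -> "mesh_v1", to be fabricated again
```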

Characterising the Physicality of Everyday Buttons
Accompanying research paper: http://dl.acm.org/citation.cfm?doid=2669485.2669519

Abstract: A significant milestone in the development of physically-dynamic surfaces is the ability for buttons to protrude outwards from any location on a touch-screen. As a first step toward developing interaction requirements for this technology, we conducted a survey of 1515 electronic push buttons in everyday home environments. We report a characterisation that describes the features of the data set and discusses important button properties that we expect will inform the design of future physically-dynamic devices and surfaces.

Evaluating the Effectiveness of Physical Shape-Change for In-pocket Mobile Device Notifications
Accompanying research article: http://dl.acm.org/citation.cfm?id=2557164

Abstract: Audio and vibrotactile output are the standard mechanisms mobile devices use to attract their owner's attention. Yet in busy and noisy environments, or when the user is physically active, these channels sometimes fail. Recent work has explored the use of physical shape-change as an additional method for conveying notifications when the device is in-hand or viewable. However, we do not yet understand the effectiveness of physical shape-change as a method for communicating in-pocket notifications. This paper presents three robustly implemented, mobile-device-sized shape-changing devices, and two user studies to evaluate their effectiveness at conveying notifications. The studies reveal that (1) different types and configurations of shape-change convey different levels of urgency, and (2) fast pulsing shape-changing notifications are missed less often and recognised more quickly than the standard slower vibration pulse rates of a mobile device.
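
For intuition about the pulsing notifications compared in the studies, here is a hypothetical sketch of generating an on/off actuation schedule from a pulse rate. The function and the rates shown are invented for illustration; they are not the patterns or values tested in the paper.

```python
def pulse_schedule(pulse_hz, duration_s):
    """Return (time_s, actuator_on) steps for a square-wave actuation pattern."""
    half_period = 0.5 / pulse_hz
    steps, t, on = [], 0.0, True
    while t < duration_s:
        steps.append((round(t, 3), on))
        t += half_period
        on = not on
    return steps

# A faster pulse produces more transitions in the same window (values illustrative).
print(pulse_schedule(pulse_hz=4.0, duration_s=1.0))   # fast shape-change pulsing
print(pulse_schedule(pulse_hz=1.0, duration_s=1.0))   # slower pulsing
```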

LeviPath: Modular Acoustic Levitation for 3D Path Visualisations
Abstract: LeviPath is a modular system to levitate objects along 3D paths. It consists of two opposed arrays of transducers that create a standing wave capable of suspending objects in mid-air. To control the standing wave, the system employs a novel algorithm based on combining basic patterns of movement. Our approach allows multiple beads to be controlled simultaneously along different 3D paths. Because of these patterns and the use of only two opposed arrays, the system is modular and can scale its interaction space by joining several LeviPaths. In this paper, we describe the hardware architecture, the basic patterns of movement, and how to combine them to produce 3D path visualisations.
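
As a rough illustration of how opposed phased arrays can position a trap, the sketch below computes per-transducer phase delays so that waves from both arrays arrive in phase at a target point, producing a standing wave around it. This is a generic focusing approach, not necessarily LeviPath's pattern-combination algorithm, and the array geometry and target position are invented.

```python
import math

SPEED_OF_SOUND = 343.0          # m/s in air
FREQ = 40_000.0                 # 40 kHz ultrasonic transducers
WAVELENGTH = SPEED_OF_SOUND / FREQ

def focus_phases(transducer_positions, target):
    """Phase delay (radians) per transducer so all waves arrive in phase at target."""
    k = 2 * math.pi / WAVELENGTH
    phases = []
    for (x, y, z) in transducer_positions:
        d = math.dist((x, y, z), target)
        phases.append((-k * d) % (2 * math.pi))
    return phases

# Two small opposed 4x4 arrays facing each other across a 10 cm gap (geometry invented).
spacing = 0.01
bottom = [(i * spacing, j * spacing, 0.00) for i in range(4) for j in range(4)]
top    = [(i * spacing, j * spacing, 0.10) for i in range(4) for j in range(4)]

target = (0.015, 0.015, 0.04)   # desired trap location between the arrays
phases = focus_phases(bottom + top, target)
print([round(p, 2) for p in phases[:8]])
```

Moving the target point along a path and recomputing the phases step by step would shift the standing-wave trap, which is the general idea behind levitating beads along 3D trajectories.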
