SlideShare feed: slideshows by user MatthiasWlfel (last updated Wed, 16 Oct 2024 22:29:16 GMT)

Social Interaction in Immersive Environments.pdf | Wed, 16 Oct 2024 22:29:16 GMT | /slideshow/social-interaction-in-immersive-environments-pdf/272477325
We are witnessing a remarkable transformation in the way technology shapes our social lives. What originated with the Internet, social media, and smartphones is now evolving into immersive technologies. While immersive technologies began as single-user experiences, they are now transforming into rich, multi-user environments, creating new opportunities for social interaction, embodiment, and shared experiences. In this keynote, we'll explore how immersive environments influence human connection, the challenges of embodiment, ways of interacting between digital and physical realms in location-based VR, and new possibilities these spaces open up for learning and social science research.

Color Preference Differences Between HMD and PC | Thu, 11 Oct 2018 15:10:33 GMT | /slideshow/color-preference-differences-between-hmd-and-pc/119125363
Virtual reality (VR) applications are shifting from professional use cases to more entertainment-centered approaches, so aesthetic aspects of virtual environments gain in relevance. This paper examines the influence of different color-determining parameters on user perception habits between head-mounted displays (HMD) and computer screens. We conducted an empirical study with 50 participants who were asked to adjust the color temperature, saturation, and contrast according to their personal preferences, using both an HMD and a computer screen. For cross-validation we tested a second group of 36 participants who were asked to adjust only the color temperature. Using a set of five panorama images, each representing an exemplary scenario, we found that color perception differs significantly depending on the output device as well as on gender: females preferred a significantly colder color scheme in VR compared to their preferences on the computer screen, and they also chose a significantly colder color scheme on the HMD than their male counterparts. Our findings demonstrate that content created for conventional screens cannot simply be transferred to immersive virtual environments; for optimal results, its visual aesthetics need to be reevaluated.
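The three parameters the participants adjusted can be pictured with a small sketch. This is not the study's implementation; the per-channel temperature shift, the scaling factors, and the function name are assumptions chosen only to make the adjustments concrete.

```python
# Illustrative sketch of color temperature, saturation, and contrast adjustment.
import numpy as np

def adjust(image, temperature=0.0, saturation=1.0, contrast=1.0):
    """image: float array in [0, 1] with shape (H, W, 3).
    temperature: -1 (colder/bluer) .. +1 (warmer/redder);
    saturation and contrast: 1.0 means unchanged."""
    img = image.astype(np.float64)

    # Crude warm/cold shift: boost red and cut blue for warm, the opposite for cold.
    img[..., 0] *= 1.0 + 0.2 * temperature   # red channel
    img[..., 2] *= 1.0 - 0.2 * temperature   # blue channel

    # Saturation: interpolate between the per-pixel gray value and the color.
    gray = img.mean(axis=-1, keepdims=True)
    img = gray + saturation * (img - gray)

    # Contrast: spread values around mid-gray.
    img = 0.5 + contrast * (img - 0.5)

    return np.clip(img, 0.0, 1.0)

# Example: a colder, slightly desaturated look, the direction the study reports
# female participants preferring in VR.
frame = np.random.rand(64, 64, 3)
colder = adjust(frame, temperature=-0.4, saturation=0.9, contrast=1.05)
```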

Effects of Electrical Pain Stimuli on Immersion in Virtual Reality | Thu, 11 Oct 2018 15:05:55 GMT | /slideshow/effects-of-electrical-pain-stimuli-on-immersionin-virtual-reality/119125015
The ultimate goal of virtual reality is to create a simulated world around us which is indistinguishable from the physical world as we know it. In such an environment, our actions could have severe effects on our bodies. What would happen if one got hit by a bullet, a car, or lightning? How would the felt pain change our perception of the virtual environment? It turns out that, apart from pain control and management, the influence of nociception (pain) on human perception in virtual environments is not well covered in the scientific literature. The goal of this publication is to investigate the influence of pain stimuli on immersion as well as on decision making, and to foster research and discussion in this direction.

What User Interface to Use for VR: 2D, 3D or Speech? A User Study | Thu, 11 Oct 2018 14:59:05 GMT | /MatthiasWlfel/what-user-interface-to-use-for-vr-2d-3d-or-speech-a-user-study
Virtual reality places different demands on the user interface than classic screen applications do. Established strategies from other digital media therefore cannot be transferred without reflection; at the very least, adaptation is required. One of the leading questions is thus: which form of interface is preferable for virtual reality? Are 2D interfaces, mostly used in combination with mouse or touch interaction, the means of choice, even though they do not use the medium's full capabilities? What about 3D interfaces that can be naturally integrated into the virtual space? And, last but not least, are speech interfaces, the fastest and most natural form of human interaction and communication, which have recently established themselves in other areas (e.g. digital assistants), ready to conquer the world of virtual reality? To answer these questions, this work compares the three approaches in a quantitative user study and highlights advantages and disadvantages of the respective interfaces for virtual reality applications. Index terms: virtual reality, comparison of user interfaces, input modality, 2D interface, 3D interface, speech interface.

Digitale Erlebniswelten | Tue, 23 May 2017 13:25:29 GMT | https://de.slideshare.net/MatthiasWlfel/digitale-erlebniswelten
Talk at Porsche Customer Innovations Day 2017.

Atmosphere in Virtual Reality | Tue, 23 May 2017 13:07:06 GMT | /slideshow/atmosphere-in-virtual-reality/76253295
Talk at FMX 2017 about different aspects of VR.

Acceptance of dynamic feedback to poor sitting habits by anthropomorphic objects | Tue, 23 May 2017 12:44:09 GMT | /slideshow/acceptance-of-dynamic-feedback-to-poor-sitting-habits-by-anthropomorphic-objects/76252558
The human body is designed for regular movement. Many humans, however, spend the bulk of their day sitting still instead; on average, an adult in Asia, Europe, or the US spends approximately 10 hours each day sitting. While a brief period of sitting here and there is natural, long periods of sitting day in and day out can seriously impact health and are associated with a significantly higher risk of heart disease, diabetes, obesity, cancer, and depression, as well as muscle and joint problems. Even working out vigorously may not compensate for long sitting sessions. The key is to build frequent movement variety into the day and to change the sitting position from time to time: about every 20-30 minutes the body needs a posture break, either by moving for a couple of minutes or, at least, by changing the sitting position. Most people, even when they know about the bad habit and are willing to change it, are not able to do so for many different reasons. To support behavior change we have developed a system that tracks sitting behavior and reflects it through anthropomorphic objects. This provides constant feedback on the sitting position and reminds the user to sit correctly, to change the sitting position from time to time, or to stand up. Our user study demonstrates that such a system is accepted by users and is believed to lead to better posture awareness and sitting behavior.
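The 20-30 minute guideline mentioned in the abstract is easy to picture as a reminder loop. The following is a minimal sketch under that guideline, not the authors' system; the sensor input and the anthropomorphic feedback are stubbed-out assumptions.

```python
# Minimal posture-break reminder sketch.
import time

BREAK_INTERVAL_S = 25 * 60   # within the suggested 20-30 minute window

def user_is_sitting() -> bool:
    # Assumption: in a real system this would come from a pressure or depth sensor.
    return True

def nudge_user(message: str) -> None:
    # Assumption: a real system would animate an anthropomorphic object instead of printing.
    print(message)

def reminder_loop() -> None:
    seated_since = time.monotonic()
    while True:
        if not user_is_sitting():
            seated_since = time.monotonic()      # timer resets once the user stands up
        elif time.monotonic() - seated_since >= BREAK_INTERVAL_S:
            nudge_user("Time to stand up or change your sitting position.")
            seated_since = time.monotonic()      # start the next interval
        time.sleep(10)                           # poll the (stubbed) sensor every 10 s
```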

Voice Driven Type Design | Wed, 14 Oct 2015 19:32:39 GMT | /slideshow/voice-driven-type-design/53942360
The richness of verbal communication is lost in text-based communication. To overcome this, instead of treating typographic characters as unchangeable, we propose to let the shape of each single character adjust according to particular acoustic features of the spoken reference. Publication: https://www.researchgate.net/publication/282945898_Voice_Driven_Type_Design
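To make the idea concrete, here is a minimal sketch assuming loudness and pitch as the acoustic features and font weight and glyph height as the shape parameters; the feature choice, the mappings, and all names are illustrative assumptions, not the method described in the publication.

```python
# Sketch: map acoustic features of a spoken frame to per-character shape parameters.
import numpy as np

def acoustic_features(frame: np.ndarray, sample_rate: int = 16000):
    """Return (rms_loudness, pitch_hz) for one mono audio frame in [-1, 1]."""
    rms = float(np.sqrt(np.mean(frame ** 2)))
    # Very rough pitch estimate via the autocorrelation peak (adequate for a sketch).
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]   # ac[lag], lag >= 0
    lag = int(np.argmax(ac[20:]) + 20)                              # skip the zero-lag peak
    return rms, sample_rate / lag

def glyph_parameters(rms: float, pitch_hz: float):
    """Map features to typographic parameters; the ranges are arbitrary assumptions."""
    weight = 300 + min(rms, 0.3) / 0.3 * 600                        # louder -> bolder (300-900)
    height = 0.8 + np.clip((pitch_hz - 100) / 300, 0, 1) * 0.6      # higher pitch -> taller glyph
    return {"weight": round(weight), "height_scale": round(float(height), 2)}

# Example with a synthetic 200 Hz tone standing in for a recorded syllable.
t = np.linspace(0, 0.05, 800, endpoint=False)
frame = 0.2 * np.sin(2 * np.pi * 200 * t)
print(glyph_parameters(*acoustic_features(frame)))
```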

To be there or not to be there: that is the question | Fri, 09 Oct 2015 08:15:00 GMT | /slideshow/to-be-there-or-not-to-be-there-that-is-the-question/53727902
Virtual environments let us experience a person's sense of being there, a form of spatial immersion dubbed presence, and of being there together, known as connected presence. This visceral feeling of being there depends on several dimensions, including the realism of the environment and the embodiment of oneself and of the others. We argue that presence and belonging are not only a question of technology but also a question of the provided format, the symbolic spaces, and the social interaction. A virtual world is always a managed space and a managed me, with all the complications that entails. Publication: https://www.researchgate.net/publication/282730208_To_Be_There_or_Not_to_Be_There_That_is_the_Question%21

Kinetic Space: An Intuitive Approach to 3D Gesture Recognition (original title: Kinetic Space - Ein intuitiver Zugang zur 3D-Gestenerkennung) | Wed, 07 Jan 2015 12:02:42 GMT | https://de.slideshare.net/slideshow/kinetic-space-ein-intuitiver-zugang-zur-3d-gestenerkennung/43290948
Touchless human-machine interfaces, made possible by observing and recognizing movement patterns in three-dimensional space with depth sensors, enjoy ever-growing popularity. To exploit the full potential of a depth-sensor-based human-machine interface in versatile ways, traditional interaction paradigms have to be abandoned. Controlling the Xbox 360 with the Kinect, for example, still relies on interaction paradigms strongly oriented toward mouse-and-cursor interaction. A more elaborate analysis of body movements, as implemented for instance in Nike+ Kinect Training or Dance Central to analyze the user's motions, is employed rather rarely, yet it already hints at the potential. We present numerous possible applications of three-dimensional gesture recognition, realized among others with the software Kinetic Space. In doing so, we report on our experience in developing and supporting gesture-based applications that are already deployed today or still under development, and we show what is technically possible today and where development work is still needed.

Gesture-Based Learning (original title: Gestenbasiertes Lernen) | Wed, 07 Jan 2015 11:20:53 GMT | https://de.slideshare.net/slideshow/gestenbasiertes-lernen/43289492
Interaction with computers via keyboard and mouse, established over decades, is in upheaval, and operation by speech and gestures is the new trend. While these more natural forms of interaction (therefore also referred to as natural user interfaces) have already found wide adoption in everyday use on smartphones and tablets, in in-car infotainment systems, and in gaming (Xbox Kinect), their potential for e-learning is far from being exhausted. There are numerous studies examining how operation by touch (that is, two-dimensional gesture input with direct interaction with the virtual content) affects the learning process compared to operation with keyboard and mouse. In these studies, however, only the interaction with the learning environment was changed, not the learning process itself. Sensor-based natural user interfaces enable exactly that: be it learning dance moves or physiotherapy exercises through observation and analysis of the performed movement, learning to play a musical instrument through intelligent evaluation of the audio signal, or learning correct spelling by having the pen vibrate when a word has been written incorrectly.

Advertisement in Hybrid Urban Space | Wed, 07 Jan 2015 05:25:25 GMT | /slideshow/advertisement-in-hybrid-urban-space/43277786
In stark contrast to online advertising campaigns, advertisement in urban space has lost attention and seems to be stuck in the Gutenberg era. The emergence of hybrid urban spaces, however, allows for novel possibilities to bring back customers' attention and interest. In this publication we review current interactive advertisement campaigns, investigate the use of implicit (age, gender, location) and explicit (2D and 3D gestures) interactions of the user to adjust the ad, and discuss novel questions and responsibilities raised by these new advertisement formats. To evaluate different aspects of the behavior toward and the acceptability of such a novel kind of advertisement, we built a prototypical system and placed it in a shopping mall. The user study includes 98 random visitors of the mall who first tested the system and then filled out a questionnaire.

Dr. Matthias Wölfel is a Professor of Intuitive and Perceptive Interfaces at Karlsruhe University of Applied Sciences, Germany. His research interests include interaction design, human-computer and human-computer-human interaction, artificial intelligence, augmented and virtual reality, digital culture, as well as art and media theory. In 2017, Dr. Wölfel was recognized as one of the best professors in Germany by UNICUM, a popular magazine for university graduates that features an annual Professor des Jahres (professor of the year) competition: out of 2200 professors nominated by students and graduates, he came in second place in the category Engineering/Computer Science. www.colorfulbit.com