SlideShare feed for slideshows by user jmiseikis
Last updated: Tue, 10 Jul 2018 05:41:52 GMT

Robot Localisation and 3D Position Estimation Using a Free-Moving Camera and Cascaded Convolutional Neural Networks
Link: /slideshow/robot-localisation-and-3d-position-estimation-using-a-freemoving-camera-and-cascaded-convolutional-neural-networks/105092451
Posted: Tue, 10 Jul 2018 05:41:52 GMT
Conference paper presentation at the IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM) 2018.

Multi-Objective Convolutional Neural Networks for Robot Localisation and 3D Position Estimation in 2D Camera Images
Link: /slideshow/multiobjective-convolutional-neural-networks-for-robot-localisation-and-3d-position-estimation-in-2d-camera-images/105091597
Posted: Tue, 10 Jul 2018 05:36:19 GMT
Conference paper presentation at the IEEE 15th International Conference on Ubiquitous Robots (UR 2018) in Honolulu, Hawaii.

Automatic Calibration of a Robot Manipulator and Multi 3D Camera System
Link: /slideshow/automatic-calibration-of-a-robot-manipulator-and-multi-3d-camera-system/83646105
Posted: Fri, 08 Dec 2017 12:49:08 GMT
With 3D sensing becoming cheaper, environment-aware and visually-guided robot arms capable of working safely in collaboration with humans will become common. However, reliable calibration is needed, both for internal camera calibration and for Eye-to-Hand calibration, to make sure the whole system functions correctly. We present a framework, using a novel combination of well-proven methods, that allows quick automatic calibration when integrating systems consisting of a robot and a varying number of 3D cameras, using a standard checkerboard calibration grid. Our approach allows a quick camera-to-robot recalibration after any changes to the setup, for example when the cameras or the robot have been repositioned. The modular design of the system ensures flexibility regarding the number of sensors used as well as different hardware choices. The framework has been validated in practical experiments analysing the quality of the calibration against the number of checkerboard positions used in each of the calibration procedures.

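The procedure described above chains two standard steps: intrinsic (internal) camera calibration from checkerboard detections, followed by a hand-eye solve for the camera-to-robot transform. The sketch below, in Python with OpenCV, illustrates that flow; it is a minimal stand-in rather than the paper's implementation, and the board geometry and pose conventions are assumptions made for illustration.

```python
# Minimal sketch of the described calibration flow, using OpenCV's checkerboard
# and hand-eye routines. Not the paper's code: board geometry and the pose
# conventions below are assumptions.
import cv2
import numpy as np

PATTERN = (9, 6)     # inner checkerboard corners (assumed)
SQUARE_M = 0.025     # square edge length in metres (assumed)

# 3D positions of the board corners in the board's own frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_M

def detect_corners(image):
    """Find checkerboard corners in one image, refined to sub-pixel accuracy."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    return found, corners

def calibrate(images, R_base2gripper, t_base2gripper):
    """Intrinsic calibration followed by the camera-to-robot solve.
    cv2.calibrateHandEye natively addresses the eye-in-hand case, so for a
    fixed (eye-to-hand) camera the caller supplies the robot poses already
    inverted to base-to-gripper transforms, one per image."""
    obj_pts, img_pts, R_robot, t_robot = [], [], [], []
    for img, R, t in zip(images, R_base2gripper, t_base2gripper):
        found, corners = detect_corners(img)
        if found:  # keep only frames where the board was detected
            obj_pts.append(objp)
            img_pts.append(corners)
            R_robot.append(R)
            t_robot.append(t)
    h, w = images[0].shape[:2]
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, (w, h), None, None)
    # Hand-eye solve: board-to-camera poses come from the intrinsic step.
    R_cam, t_cam = cv2.calibrateHandEye(
        R_robot, t_robot,
        [cv2.Rodrigues(r)[0] for r in rvecs], tvecs)
    return K, dist, R_cam, t_cam
```

Running this once per camera, with the same robot pose log, is what makes the setup scale to a varying number of 3D sensors: each camera gets its own intrinsics and its own camera-to-robot transform from the same checkerboard sweep.
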
3D Vision Guided Robotic Charging Station for Electric and Plug-In Hybrid Vehicles
Link: /slideshow/3d-vision-guided-robotic-charging-station-for-electric-and-plugin-hybrid-vehicles/83645966
Posted: Fri, 08 Dec 2017 12:46:41 GMT
Electric vehicles (EVs) and plug-in hybrid vehicles (PHEVs) are rapidly gaining popularity on our roads. Besides a comparatively high purchase price, the two main problems limiting their use are the short driving range and the inconvenient charging process. In this paper we address the latter by presenting an automatic robot-based charging station with 3D vision guidance for plugging and unplugging the charger. First, the whole system concept, consisting of a 3D vision system, a UR10 robot and a charging station, is presented. Then we show the shape-based matching methods used to identify the charging port and recover its exact pose. The same approach is used to calibrate the camera-robot system using just the known structure of the connector plug, with no additional markers. Finally, a three-step robot motion planning procedure for plug-in is presented, and its functionality is demonstrated in a series of successful experiments.

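The vision step naturally splits in two: a coarse 2D match to find the port in the image, then a pose solve against the known 3D geometry of the connector. The sketch below uses OpenCV's edge-based template matching and a PnP solve as stand-ins for the shape-based matcher used in the paper; the template image, the 3D pin model and the camera intrinsics are all assumed inputs.

```python
# Illustrative two-stage port localisation: edge template match for a coarse
# 2D fix, then PnP against the known connector geometry for the 6-DoF pose.
# A stand-in for the paper's shape-based matcher, not its implementation.
import cv2
import numpy as np

def locate_port(image, template):
    """Coarse 2D localisation of the charging port by matching edge maps."""
    edges = cv2.Canny(cv2.cvtColor(image, cv2.COLOR_BGR2GRAY), 50, 150)
    tmpl = cv2.Canny(cv2.cvtColor(template, cv2.COLOR_BGR2GRAY), 50, 150)
    scores = cv2.matchTemplate(edges, tmpl, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, top_left = cv2.minMaxLoc(scores)
    return top_left, best_score

def port_pose(image_pts, model_pts, K, dist):
    """Exact 6-DoF pose of the port from matched 2D image points and the
    known 3D geometry of the connector pins (a standard PnP solve)."""
    _, rvec, tvec = cv2.solvePnP(model_pts, image_pts, K, dist)
    return cv2.Rodrigues(rvec)[0], tvec  # rotation matrix, translation vector
```

Because the pose solve only needs the connector's known structure, the same routine can double as the marker-free camera-robot calibration the abstract mentions: the plug on the end-effector plays the role of the calibration target.
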
Mažasis Universitetas, IT karjera: Robotika (IT Careers: Robotics)
Link: /slideshow/maasis-universitetas-it-karjera-robotika/17554678
Posted: Sat, 23 Mar 2013 10:41:12 GMT

CUDA based Iris Detection based on Hough Transform
Link: /slideshow/cuda-based-iris-detection-based-on-hough-transform/14864867
Posted: Wed, 24 Oct 2012 06:26:56 GMT
GPGPU class final project implementing GPU-based eye iris segmentation to achieve a speed-up over an optimised, equivalent CPU implementation.

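The algorithm at the core of the project is the circular Hough transform: edge pixels vote for candidate circle centres and radii, and the iris boundary is the highest-voted circle. The project's contribution was a CUDA implementation of this voting step; for reference, the equivalent CPU-side detection can be exercised with OpenCV's built-in implementation, as in the sketch below (parameter values are assumptions).

```python
# CPU-side reference for the circular Hough transform the project ports to
# the GPU. Illustrative only; the radius and threshold values are assumed.
import cv2
import numpy as np

def detect_iris(gray):
    """Detect the iris boundary as the strongest circle in a grayscale eye
    image. Returns (centre, radius) or None if no circle is found."""
    blurred = cv2.medianBlur(gray, 5)  # suppress eyelash/speckle noise
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
        param1=100, param2=40, minRadius=20, maxRadius=120)
    if circles is None:
        return None
    x, y, r = np.round(circles[0, 0]).astype(int)  # best-scoring circle
    return (x, y), r
```

The voting step parallelises well because each edge pixel's votes are independent, which is what makes a GPU port worthwhile against even an optimised CPU baseline.
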
Joint Human Detection from On-Board and Off-Board Cameras
Link: /slideshow/joint-human-detection-from-onboard-and-offboard-cameras/14410845
Posted: Sun, 23 Sep 2012 05:47:25 GMT

TESP 2012 Drums Haptic Interface
Link: /slideshow/tesp-2012-drums-haptic-interface/14331024
Posted: Tue, 18 Sep 2012 08:07:09 GMT
A few-day Arduino-based project to develop a sound-frequency-dependent haptic interface for simulated instruments on touch-screen devices such as tablets or phones. The input is simply the audio signal. The approach could be extended to real instruments, using a microphone, to enhance the playing feel.

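The signal chain is short: estimate the dominant frequency of the incoming audio and map it to a vibration-motor intensity. A minimal host-side sketch follows; the original ran on an Arduino, and the sample rate, frequency range, and log mapping here are all assumptions.

```python
# Host-side sketch of the audio-to-haptics mapping described above.
# Illustrative only; the original project did this on an Arduino.
import numpy as np

SAMPLE_RATE = 44100  # assumed audio sample rate in Hz

def dominant_frequency(samples):
    """Return the strongest frequency component of one audio buffer."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    return freqs[np.argmax(spectrum)]

def haptic_intensity(freq, lo=60.0, hi=2000.0):
    """Map frequency to a 0-255 PWM-style motor intensity: low drum hits
    vibrate hardest, higher-pitched content progressively less."""
    f = np.clip(freq, lo, hi)
    return int(255 * (1.0 - (np.log(f) - np.log(lo)) / (np.log(hi) - np.log(lo))))
```
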
Finger Rehabilitation Robot - Justinas Miseikis
Link: /slideshow/finger-rehabilitation-robot-justinas-miseikis/13910232
Posted: Wed, 08 Aug 2012 06:13:25 GMT
A finger rehabilitation robot designed, constructed, and given a control system by Justinas Miseikis as a semester project at ETH Zurich. The project was awarded the maximum possible grade, 6.