Category Archives: Robot Sensors

Survey on visual attention in 3D for robotics

Ekaterina Potapova, Michael Zillich, and Markus Vincze, Survey of recent advances in 3D visual attention for robotics, The International Journal of Robotics Research, Vol. 36, Issue 11, pp. 1159-1176, DOI: 10.1177/0278364917726587.

3D visual attention plays an important role in both human and robot perception, yet it has not been explored in full detail. Moreover, the majority of computer vision and robotics methods are concerned only with 2D visual attention. This survey presents findings and approaches that cover 3D visual attention in both human and robot vision, summarizing the last 30 years of research and also looking beyond computational methods. First, we present work from fields such as biological vision and neurophysiology that studies 3D attention in human observers; this provides a view of the role attention plays at the system level in biological vision. Then, we cover computer and robot vision approaches that take 3D visual attention into account. We compare approaches with respect to different categories, such as feature-based, data-based, or depth-based visual attention, and draw conclusions on what advances will help robotics cope better with complex real-world settings and tasks.
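As a minimal illustration of the depth-based category mentioned above (not a method taken from the survey), the sketch below weights an ordinary 2D saliency map by a registered depth image so that nearer structures attract attention first; the function name and the linear depth prior are assumptions made only for this example.

    import numpy as np

    def depth_weighted_saliency(saliency_2d, depth, z_near=0.5, z_far=4.0):
        """Combine a 2D saliency map with a registered depth image.

        saliency_2d : HxW array in [0, 1] from any 2D attention model.
        depth       : HxW depth image in metres (0 where invalid).
        Nearer surfaces receive higher weight; the linear ramp is only
        one of many possible depth priors.
        """
        valid = depth > 0
        # Linear weight: 1 at z_near (or closer), 0 at z_far (or farther).
        w = np.clip((z_far - depth) / (z_far - z_near), 0.0, 1.0)
        w[~valid] = 0.0
        return saliency_2d * w

    # Attention would then be directed to the maximum of the combined map,
    # e.g. y, x = np.unravel_index(np.argmax(s3d), s3d.shape)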

Improving orientation estimation in a mobile robot for better odometry

M.T. Sabet, H.R. Mohammadi Daniali, A.R. Fathi, E. Alizadeh, Experimental analysis of a low-cost dead reckoning navigation system for a land vehicle using a robust AHRS, Robotics and Autonomous Systems, Volume 95, 2017, Pages 37-51, DOI: 10.1016/j.robot.2017.05.010.

In navigation and motion control of an autonomous vehicle, estimation of attitude and heading is an important issue, especially when localization sensors such as GPS are not available and the vehicle is navigated by dead reckoning (DR) strategies. In this paper, based on a new modeling framework, an Extended Kalman Filter (EKF) is used to estimate attitude, heading, and gyroscope sensor bias from a low-cost MEMS inertial sensor. The algorithm is developed for accurate estimation of attitude and heading in the presence of external disturbances, including external body accelerations and magnetic disturbances. Using the proposed attitude and heading reference system (AHRS) and an odometer sensor, a low-cost aided DR navigation system has been designed. The algorithm is evaluated with experimental tests on a land vehicle under different acceleration bounds and in the presence of external magnetic disturbances. The results indicate that roll, pitch, and heading are estimated with mean errors of about 0.83%, 0.68%, and 1.13%, respectively, and that a relative navigation error of about 3% of the traveled distance can be achieved with the developed approach during GPS outages.
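A very reduced sketch of the two ingredients the abstract combines, heading estimation with gyro-bias tracking and odometer dead reckoning, is given below. It is not the paper's filter: the real system is a full EKF over attitude, heading, and bias with disturbance rejection, whereas this example tracks only yaw and bias, and all noise values are placeholders.

    import numpy as np

    def kf_yaw_step(x, P, gyro_z, yaw_meas, dt,
                    q_yaw=1e-4, q_bias=1e-6, r_yaw=1e-2):
        """Tiny 2-state Kalman filter: x = [yaw, gyro_bias].
        gyro_z is the measured yaw rate, yaw_meas a heading observation
        (e.g. from a magnetometer)."""
        F = np.array([[1.0, -dt], [0.0, 1.0]])   # yaw <- yaw + (gyro - bias) * dt
        x = F @ x + np.array([gyro_z * dt, 0.0])
        P = F @ P @ F.T + np.diag([q_yaw, q_bias])
        H = np.array([[1.0, 0.0]])               # we observe yaw only
        S = H @ P @ H.T + r_yaw
        K = P @ H.T / S
        x = x + (K * (yaw_meas - x[0])).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x, P

    def dr_step(xy, d_odo, yaw):
        """Planar dead reckoning: advance the position by the odometer
        increment along the estimated heading."""
        return xy + d_odo * np.array([np.cos(yaw), np.sin(yaw)])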

Testbed for comparing different UWB sensors applied to localization

A. R. Jiménez Ruiz and F. Seco Granja, Comparing Ubisense, BeSpoon, and DecaWave UWB Location Systems: Indoor Performance Analysis, IEEE Transactions on Instrumentation and Measurement, vol. 66, no. 8, pp. 2106-2117, Aug. 2017, DOI: 10.1109/TIM.2017.2681398.

Most ultrawideband (UWB) location systems proposed for position estimation have only been evaluated individually, each for a particular scenario. For a fair performance comparison among different solutions, a common evaluation scenario is desirable. In this paper, we compare three commercially available UWB systems (Ubisense, BeSpoon, and DecaWave) under the same experimental conditions in order to perform a critical performance analysis. We include a characterization of the quality of the estimated tag-to-sensor distances in an indoor industrial environment. This testing space includes areas under line-of-sight (LOS) conditions and diverse non-LOS conditions caused by reflection, propagation, and diffraction of the UWB radio signals across different obstacles. The study also analyzes the estimated azimuth and elevation angles for the Ubisense system, the only one that incorporates this feature, using an array antenna at each sensor. Finally, we analyze the 3D positioning performance of the three UWB systems using a Bayesian filter, implemented as a particle filter with a measurement model that takes into account bad range measurements and outliers. A final conclusion is drawn about which system performs better under these industrial conditions.
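The closing step of the study, fusing noisy UWB ranges with a particle filter and an outlier-aware measurement model, can be sketched roughly as below. This is not the authors' implementation; the Gaussian-plus-uniform likelihood and all parameters are assumptions chosen for illustration.

    import numpy as np

    def pf_range_update(particles, weights, anchors, ranges,
                        sigma=0.3, p_outlier=0.1, r_max=60.0):
        """Robust particle-filter measurement update for UWB ranging.

        particles : (N, 3) candidate tag positions.
        anchors   : (M, 3) known anchor positions.
        ranges    : (M,) measured tag-to-anchor distances (np.nan if missing).
        The likelihood mixes a Gaussian around the predicted range with a
        small uniform component so outliers and NLOS-biased ranges do not
        collapse the filter.
        """
        for a, r in zip(anchors, ranges):
            if np.isnan(r):
                continue
            predicted = np.linalg.norm(particles - a, axis=1)
            gauss = np.exp(-0.5 * ((r - predicted) / sigma) ** 2) \
                    / (sigma * np.sqrt(2 * np.pi))
            weights *= (1 - p_outlier) * gauss + p_outlier / r_max
        weights /= weights.sum()
        return weights

    # Position estimate: weighted mean of the particles,
    # estimate = (weights[:, None] * particles).sum(axis=0)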

Efficient detection of glass obstacles when using a laser rangefinder

Xun Wang, JianGuo Wang, Detecting glass in Simultaneous Localisation and Mapping, Robotics and Autonomous Systems, Volume 88, February 2017, Pages 97-103, ISSN 0921-8890, DOI: 10.1016/j.robot.2016.11.003.

Simultaneous Localisation and Mapping (SLAM) has become one of the key technologies used in advanced robot platforms. Current state-of-the-art indoor SLAM with laser scanning rangefinders can provide accurate real-time localisation and mapping to mobile robotic platforms such as the PR2 robot. In recent years, many modern building designs feature large glass panels as a key interior fitting element, e.g. large glass walls. Due to the transparent nature of glass panels, laser rangefinders are unable to produce accurate readings, which causes SLAM to function incorrectly in these environments. In this paper, we propose a simple and effective solution to identify glass panels based on the specular reflection of laser beams from the glass. Specifically, we use a simple technique to detect the reflected light intensity profile around the normal incident angle to the glass panel. Integrating this glass detection method with an existing SLAM algorithm, our SLAM system is able to detect and localise glass obstacles in real time. Furthermore, tests conducted in two office buildings with a PR2 robot show that the proposed method detects ∼95% of all glass panels with no false positives. The source code of the modified SLAM with glass detection is released as an open-source ROS package along with this paper.
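A rough sketch of the core observation, that glass gives a strong specular echo only when the beam is close to the pane's normal, follows. The thresholds, window size, and surface-normal estimate here are illustrative guesses, not the tuned values or the exact procedure from the paper.

    import numpy as np

    def find_glass_candidates(angles, ranges, intensities,
                              intensity_thresh=6000.0,
                              max_incidence_deg=10.0, window=2):
        """angles/ranges/intensities: 1D arrays for one laser scan,
        ordered by beam angle. Returns indices of beams that look like
        specular glass returns (high intensity near normal incidence)."""
        candidates = []
        for i in range(window, len(ranges) - window):
            if intensities[i] < intensity_thresh:
                continue
            # Estimate the local surface direction from neighbouring points.
            p_prev = ranges[i - window] * np.array(
                [np.cos(angles[i - window]), np.sin(angles[i - window])])
            p_next = ranges[i + window] * np.array(
                [np.cos(angles[i + window]), np.sin(angles[i + window])])
            tangent = p_next - p_prev
            normal = np.array([-tangent[1], tangent[0]])
            normal /= np.linalg.norm(normal) + 1e-9
            beam = np.array([np.cos(angles[i]), np.sin(angles[i])])
            incidence = np.degrees(np.arccos(np.clip(abs(beam @ normal), 0.0, 1.0)))
            if incidence < max_incidence_deg:
                candidates.append(i)
        return candidates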

Improving sensory information, diagnosis, and fault tolerance by using multiple sensors and sensor fusion, with a good related-work section (2.3) on fault tolerance in data fusion

Kaci Bader, Benjamin Lussier, Walter Schön, A fault tolerant architecture for data fusion: A real application of Kalman filters for mobile robot localization, Robotics and Autonomous Systems, Volume 88, February 2017, Pages 11-23, ISSN 0921-8890, DOI: 10.1016/j.robot.2016.11.015.

Multisensor perception has an important role in robotics and autonomous systems, providing inputs for critical functions including obstacle detection and localization. It is starting to appear in critical applications such as drones and ADASs (Advanced Driver Assistance Systems). However, this kind of complex system is difficult to validate comprehensively. In this paper we look at multisensor perception systems in relation to an alternative dependability method, namely fault tolerance. We propose an approach for tolerating faults in multisensor data fusion that is based on the more traditional method of duplication–comparison, and that offers detection and recovery services. We detail an example implementation using Kalman filter data fusion for mobile robot localization. We demonstrate its effectiveness in this case study using real data and fault injection.
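To make the duplication-comparison idea concrete, the sketch below compares the outputs of two redundant Kalman-filter fusion branches with a chi-square gate and flags a fault when they disagree. The branch design, the gate value, and the recovery action are assumptions for the example, not the architecture detailed in the paper.

    import numpy as np

    def compare_branches(x_a, P_a, x_b, P_b, gate=9.21):
        """x_a, x_b: state estimates from two redundant fusion branches
        (e.g. two Kalman filters fed by different sensor subsets);
        P_a, P_b: their covariances. Returns True when the branches
        disagree beyond the chi-square gate (9.21 ~ 99% for 2 DOF)."""
        d = x_a - x_b
        S = P_a + P_b          # covariance of the difference, assuming independence
        m2 = float(d @ np.linalg.solve(S, d))
        return m2 > gate

    # On detection, a supervisor might discard the suspect branch's sensors,
    # reinitialise that branch, or report the error to the mission layer.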

The problem of monitoring events that can only be predicted stochastically, addressed with mobile sensors

Jingjin Yu, S. Karaman, and D. Rus, Persistent Monitoring of Events With Stochastic Arrivals at Multiple Stations, IEEE Transactions on Robotics, vol. 31, no. 3, pp. 521-535, June 2015, DOI: 10.1109/TRO.2015.2409453.

This paper introduces a new mobile sensor scheduling problem involving a single robot tasked to monitor several events of interest occurring at different locations (stations). Of particular interest is the monitoring of transient events of a stochastic nature, with applications ranging from natural phenomena (e.g., monitoring abnormal seismic activity around a volcano using a ground robot) to urban activities (e.g., monitoring early formations of traffic congestion using an aerial robot). Motivated by examples like these, this paper focuses on problems in which the precise occurrence times of the events are unknown a priori, but statistics for their interarrival times are available. In monitoring such events, the robot seeks to: (1) maximize the number of events observed and (2) minimize the delay between two consecutive observations of events occurring at the same location. This paper considers the case when a robot is tasked with optimizing the event observations in a balanced manner, following a cyclic patrolling route. To tackle this problem, first, assuming that the cyclic ordering of stations is known, we prove the existence and uniqueness of the optimal solution and show that the solution has a desirable convergence rate and robustness. Our constructive proof also yields an efficient algorithm for computing the unique optimal solution with O(n) time complexity, where n is the number of stations, and O(log n) time complexity for incrementally adding or removing stations. Except for the algorithm, our analysis remains valid when the cyclic order is unknown. We then provide a polynomial-time approximation scheme that computes, for any ε > 0, a (1 + ε)-optimal solution for this more general, NP-hard problem.
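As a toy illustration of the setting (not of the paper's optimal policy), the snippet below simulates a robot repeating a fixed cyclic route and counts how many Poisson-arriving events it catches while dwelling at each station; all rates, travel times, and dwell times are arbitrary placeholders.

    import numpy as np

    # Toy simulation: the robot visits n stations cyclically, dwelling
    # d_i seconds at station i per cycle, and observes exactly the events
    # that arrive while it is present.
    rng = np.random.default_rng(0)
    rates = np.array([0.2, 0.5, 0.1])    # Poisson event rates (events/s) per station
    travel = np.array([5.0, 5.0, 5.0])   # travel time before reaching each station
    dwell = np.array([4.0, 8.0, 3.0])    # dwell time at each station per cycle

    cycle = float((travel + dwell).sum())
    n_cycles = 100
    observed = total = 0
    for lam, d in zip(rates, dwell):
        seen = rng.poisson(lam * d * n_cycles)               # events during the dwells
        missed = rng.poisson(lam * (cycle - d) * n_cycles)   # events while the robot is away
        observed += seen
        total += seen + missed

    print(f"observed {observed}/{total} events ({100 * observed / total:.1f}%)")
    # The paper's contribution is choosing the dwell times (and handling the
    # unknown cyclic order) so that observation counts and inter-observation
    # delays are optimally balanced.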