Dennis G. Romero, A. F. Neto, T. F. Bastos, & Boris X. Vintimilla. (2012). RWE patterns extraction for on-line human action recognition through window-based analysis of invariant moments. In 5th Workshop in Applied Robotics and Automation (RoboControl).
Abstract: This paper presents a method for on-line human action recognition in video sequences. An analysis based on Mahalanobis distance is performed to identify the “idle” state, which defines the beginning and end of the person's movement, for subsequent pattern extraction based on Relative Wavelet Energy from sequences of invariant moments.
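A minimal sketch (not the authors' code) of the Relative Wavelet Energy descriptor mentioned in the abstract, assuming NumPy and PyWavelets are available; the wavelet family, decomposition level and the synthetic moment sequence are illustrative choices only.

```python
import numpy as np
import pywt

def relative_wavelet_energy(signal, wavelet="db4", level=3):
    """Return the RWE vector (approximation plus one value per detail level)
    of a 1-D signal; the values sum to 1."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Hypothetical example: one invariant moment sampled over a movement window.
moment_sequence = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.1 * np.random.randn(64)
print(relative_wavelet_energy(moment_sequence))
```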
Dennis G. Romero, A. F. Neto, T. F. Bastos, & Boris X. Vintimilla. (2012). An approach to automatic assistance in physiotherapy based on on-line movement identification. In VI Andean Region International Conference (ANDESCON 2012). IEEE.
Abstract: This paper describes a method for on-line movement identification, oriented to the evaluation of a patient's movements during physiotherapy. An analysis based on Mahalanobis distance between temporal windows is performed to identify the “idle/motion” state, which defines the beginning and end of the patient's movement, for subsequent pattern extraction based on Relative Wavelet Energy from sequences of invariant moments.
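Below is a hedged sketch of the idle/motion decision described above: the Mahalanobis distance between a temporal window of frame features and an "idle" reference distribution is thresholded. The window size, feature dimension and threshold are assumptions, not values from the paper.

```python
import numpy as np

def fit_idle_model(idle_frames):
    """Estimate the mean and inverse covariance of features (e.g., invariant
    moments) observed while the patient is at rest."""
    mu = idle_frames.mean(axis=0)
    inv_cov = np.linalg.pinv(np.cov(idle_frames, rowvar=False))
    return mu, inv_cov

def is_motion(window, mu, inv_cov, threshold=3.0):
    """Flag a temporal window as 'motion' when its mean feature vector is far
    (in Mahalanobis distance) from the idle distribution."""
    x = window.mean(axis=0)
    d = np.sqrt((x - mu) @ inv_cov @ (x - mu))
    return d > threshold

# Hypothetical usage with 7 features per frame (e.g., Hu moments).
idle = 0.01 * np.random.randn(200, 7)    # calibration frames, person at rest
mu, inv_cov = fit_idle_model(idle)
window = 0.5 * np.random.randn(15, 7)    # sliding window of recent frames
print("motion" if is_motion(window, mu, inv_cov) else "idle")
```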
Angely Oyola, Dennis G. Romero, & Boris X. Vintimilla. (2017). A Dijkstra-based algorithm for selecting the Shortest-Safe Evacuation Routes in dynamic environments (SSER). In The 30th International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems (IEA/AIE 2017) (pp. 131–135).
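No abstract is listed for this entry, so the following is only a generic illustration of the idea named in the title, not the paper's SSER algorithm: Dijkstra's algorithm run over edge costs that blend physical length with a hazard level, so the selected route is both short and safe. The graph format and the blending weight `alpha` are hypothetical.

```python
import heapq

def shortest_safe_route(graph, source, target, alpha=0.5):
    """graph: {node: [(neighbor, length, hazard), ...]} with hazard in [0, 1].
    Returns (cost, path) minimising alpha*length + (1-alpha)*hazard per edge."""
    dist, prev, visited = {source: 0.0}, {}, set()
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == target:
            break
        for v, length, hazard in graph.get(u, []):
            cost = d + alpha * length + (1 - alpha) * hazard
            if cost < dist.get(v, float("inf")):
                dist[v], prev[v] = cost, u
                heapq.heappush(heap, (cost, v))
    path, node = [], target
    while node in prev or node == source:
        path.append(node)
        if node == source:
            break
        node = prev[node]
    return dist.get(target, float("inf")), path[::-1]

# Toy building graph: edges carry (length in metres, hazard level).
corridors = {"A": [("B", 10.0, 0.1), ("C", 4.0, 0.8)],
             "B": [("Exit", 5.0, 0.0)],
             "C": [("Exit", 3.0, 0.9)]}
print(shortest_safe_route(corridors, "A", "Exit"))
```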
Angel D. Sappa, Cristhian A. Aguilera, Juan A. Carvajal Ayala, Miguel Oliveira, Dennis Romero, Boris X. Vintimilla, et al. (2016). Monocular visual odometry: a cross-spectral image fusion based approach. Robotics and Autonomous Systems Journal, Vol. 86, pp. 26–36.
Abstract: This manuscript evaluates the usage of fused cross-spectral images in a monocular visual odometry approach. Fused images are obtained through a Discrete Wavelet Transform (DWT) scheme, where the best setup is empirically obtained by means of a mutual information based evaluation metric. The objective is to have a flexible scheme where fusion parameters are adapted according to the characteristics of the given images. Visual odometry is computed from the fused monocular images using an off-the-shelf approach. Experimental results using data sets obtained with two different platforms are presented. Additionally, comparisons with a previous approach, as well as with the monocular visible and infrared spectra, are also provided, showing the advantages of the proposed scheme.
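As a rough illustration of the fusion scheme described in the abstract (not the paper's implementation), the sketch below performs a single-level DWT fusion of a visible/infrared pair and scores the result with a mutual-information measure; the wavelet, the merge rules and the histogram binning are assumptions.

```python
import numpy as np
import pywt

def dwt_fuse(img_a, img_b, wavelet="db2"):
    """Average the approximation bands, keep the stronger detail coefficients."""
    cA1, (cH1, cV1, cD1) = pywt.dwt2(img_a.astype(float), wavelet)
    cA2, (cH2, cV2, cD2) = pywt.dwt2(img_b.astype(float), wavelet)
    pick = lambda d1, d2: np.where(np.abs(d1) >= np.abs(d2), d1, d2)
    fused = ((cA1 + cA2) / 2.0, (pick(cH1, cH2), pick(cV1, cV2), pick(cD1, cD2)))
    return pywt.idwt2(fused, wavelet)

def mutual_information(img_a, img_b, bins=64):
    """Mutual information between two images via a joint grey-level histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Hypothetical registered visible / infrared pair of the same scene.
visible = np.random.rand(128, 128)
infrared = np.random.rand(128, 128)
fused = dwt_fuse(visible, infrared)
print(mutual_information(fused, visible))
```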
Angel D. Sappa, Juan A. Carvajal, Cristhian A. Aguilera, Miguel Oliveira, Dennis G. Romero, & Boris X. Vintimilla. (2016). Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study. Sensors Journal, Vol. 16, pp. 1–15.
Abstract: This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated result from combining different setups in the wavelet image decomposition stage with different fusion strategies for the final merging stage that generates the resulting representation. Most approaches evaluate results according to the application for which they are intended; sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define criteria for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and LongWave InfraRed (LWIR).
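The comparative idea can be illustrated with a small sweep, sketched below under assumed parameters: several wavelet families and detail-merge rules are combined, each fused result is scored with a simple no-reference metric (image entropy, standing in for the metrics used in the paper), and the setups are ranked.

```python
import numpy as np
import pywt

def fuse(img_a, img_b, wavelet, merge):
    """Single-level DWT fusion: average approximations, merge details with `merge`."""
    a, b = pywt.dwt2(img_a, wavelet), pywt.dwt2(img_b, wavelet)
    details = tuple(merge(d1, d2) for d1, d2 in zip(a[1], b[1]))
    return pywt.idwt2(((a[0] + b[0]) / 2.0, details), wavelet)

def entropy(img, bins=64):
    """Shannon entropy of the image histogram (a simple no-reference score)."""
    p, _ = np.histogram(img, bins=bins, density=True)
    p = p[p > 0]
    p = p / p.sum()
    return float(-np.sum(p * np.log2(p)))

merge_rules = {"max-abs": lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y),
               "average": lambda x, y: (x + y) / 2.0}
vis, ir = np.random.rand(128, 128), np.random.rand(128, 128)
scores = {(w, name): entropy(fuse(vis, ir, w, rule))
          for w in ("haar", "db2", "sym4") for name, rule in merge_rules.items()}
for setup, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(setup, round(score, 3))
```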