Miguel Realpe, Boris X. Vintimilla, & Ljubo Vlacic. (2016). Multi-sensor Fusion Module in a Fault Tolerant Perception System for Autonomous Vehicles. Journal of Automation and Control Engineering (JOACE), Vol. 4, pp. 430–436.
Abstract: Driverless vehicles are currently being tested on public roads in order to examine their ability to perform in a safe and reliable way in real-world situations. However, the long-term reliable operation of a vehicle's diverse sensors and the effects of potential sensor faults on the vehicle system have not yet been tested. This paper proposes a sensor fusion architecture that minimizes the influence of a sensor fault. Experimental results are presented in which faults are simulated by introducing displacements into the sensor information from the KITTI dataset.
|
Spencer Low, Nathan Inkawhich, Oliver Nina, Angel D. Sappa, & Erik Blasch. (2022). Multi-modal Aerial View Object Classification Challenge Results – PBVS 2022. In IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2022), June 19–24 (Vol. 2022-June, pp. 417–425).
Abstract: This paper details the results and main findings of the second iteration of the Multi-modal Aerial View Object Classification (MAVOC) challenge. The primary goal of both MAVOC challenges is to inspire research into methods for building recognition models that utilize both synthetic aperture radar (SAR) and electro-optical (EO) input modalities. Teams are challenged to develop multi-modal approaches that incorporate complementary information from both domains. While the 2021 challenge showed a proof of concept that both modalities could be used together, the 2022 challenge focuses on detailed multi-modal models, using the same UNIfied COincident Optical and Radar for recognitioN (UNICORN) dataset and competition format that was used in 2021. Specifically, the challenge focuses on two techniques: (1) SAR classification and (2) SAR + EO classification. The bulk of this document is dedicated to discussing the top performing methods and describing their performance on our blind test set. Notably, all of the top ten teams outperform our baseline. For SAR classification, the top team showed a 129% improvement over our baseline and an 8% average improvement over the 2021 winner. The top team for SAR + EO classification shows a 165% improvement over the baseline, with a 32% average improvement over 2021.
|
Spencer Low, Oliver Nina, Angel D. Sappa, Erik Blasch, & Nathan Inkawhich. (2023). Multi-modal Aerial View Object Classification Challenge Results – PBVS 2023. In 19th IEEE Workshop on Perception Beyond the Visible Spectrum, Conference on Computer Vision and Pattern Recognition (CVPR 2023), June 18–28 (Vol. 2023-June, pp. 412–421).
|
Spencer Low, Oliver Nina, Angel D. Sappa, Erik Blasch, & Nathan Inkawhich. (2023). Multi-modal Aerial View Image Challenge: Translation from Synthetic Aperture Radar to Electro-Optical Domain Results – PBVS 2023. In 19th IEEE Workshop on Perception Beyond the Visible Spectrum, Conference on Computer Vision and Pattern Recognition (CVPR 2023), June 18–28 (Vol. 2023-June, pp. 515–523).
|
Angel D. Sappa, Spencer Low, Oliver Nina, Erik Blasch, Dylan Bowald, & Nathan Inkawhich. (2024). Multi-modal Aerial View Image Challenge: Sensor Domain Translation. In 20th IEEE Workshop on Perception Beyond the Visible Spectrum, 2024 Conference on Computer Vision and Pattern Recognition (accepted).
|
Angel D. Sappa, Spencer Low, Oliver Nina, Erik Blasch, Dylan Bowald, & Nathan Inkawhich. (2024). Multi-modal Aerial View Image Challenge: SAR Classification. In 20th IEEE Workshop on Perception Beyond the Visible Spectrum, 2024 Conference on Computer Vision and Pattern Recognition (accepted).
|
Rafael E. Rivadeneira, Angel D. Sappa, & Boris X. Vintimilla. (2022). Multi-Image Super-Resolution for Thermal Images. In Proceedings of the International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) (Vol. 4, pp. 635–642).
|
Armin Mehri, Parichehr B. Ardakani, & Angel D. Sappa. (2021). MPRNet: Multi-Path Residual Network for Lightweight Image Super Resolution. In IEEE Winter Conference on Applications of Computer Vision (WACV 2021), January 5–9, 2021 (pp. 2703–2712).
|
Angel D. Sappa, Cristhian A. Aguilera, Juan A. Carvajal Ayala, Miguel Oliveira, Dennis Romero, Boris X. Vintimilla, et al. (2016). Monocular visual odometry: a cross-spectral image fusion based approach. Robotics and Autonomous Systems Journal, Vol. 86, pp. 26–36.
Abstract: This manuscript evaluates the usage of fused cross-spectral images in a monocular visual odometry approach. Fused images are obtained through a Discrete Wavelet Transform (DWT) scheme, where the best setup is empirically obtained by means of a mutual information based evaluation metric. The objective is to have a flexible scheme where fusion parameters are adapted according to the characteristics of the given images. Visual odometry is computed from the fused monocular images using an off-the-shelf approach. Experimental results using datasets obtained with two different platforms are presented. Additionally, comparisons with a previous approach as well as with the monocular-visible/infrared spectra are also provided, showing the advantages of the proposed scheme.
|
Byron Lima, Ricardo Cajo, Victor Huilcapi, & Wilton Agila. (2017). Modeling and comparative study of linear and nonlinear controllers for rotary inverted pendulum. In Journal of Physics: Conference Series (Vol. 783).
Abstract: The rotary inverted pendulum (RIP) is a difficult control problem, and several studies have been conducted in which different control techniques have been applied. The literature reports that, although the problem is nonlinear, classical PID controllers present appropriate performance when applied to the system. In this paper, a comparative study of the performance of linear and nonlinear PID structures is carried out. The control algorithms are evaluated on the RIP system using indices of performance and power consumption, which allow the categorization of control strategies according to their performance. This article also presents the modeling of the system, in which some of the parameters involved in the RIP system have been estimated using computer-aided design (CAD) tools and experimental methods proposed by several authors. The results indicate a better performance of the nonlinear controller, with increased robustness and a faster response than the linear controller.
|