This study presents a novel Visual-Inertial Odometry (VIO) approach for Unmanned Aerial Vehicles (UAVs). The proposed system integrates the deep learning-based SuperGlue feature-matching algorithm with an information-based adaptive Extended Kalman Filter (EKF). A dynamic confidence estimation mechanism built on image entropy, intensity variation, and motion-blur metrics yields robust pose estimation even under challenging environmental conditions. Owing to the performance of transformer-based feature matching methods such as SuperGlue, the proposed loosely-coupled sensor fusion scheme approaches the accuracy of traditional tightly-coupled methods. Comprehensive experiments on the EuRoC MAV dataset demonstrate significant improvements over conventional approaches; in particularly challenging scenarios, quaternion and Euler-angle estimates improve by approximately 50%. These results indicate that loosely-coupled sensor fusion, when combined with advanced feature matching and adaptive filtering strategies, can offer a robust alternative to tightly-coupled approaches, with potential applications in robotic navigation, autonomous vehicles, and augmented reality. The Python code associated with this study is released as open source for use in other academic work at \href{https://github.com/ufukasia/Adaptive-VIO-Odometry}{https://github.com/ufukasia/Adaptive-VIO-Odometry}.