Safe real-time navigation of Unmanned Aerial Vehicles (UAVs) requires knowing their position at all times. Position determination in outdoor spaces is a largely solved problem. In indoor spaces, on the other hand, existing solutions are either imprecise or excessively costly. In this paper, the 3D localization problem is addressed in the context of UAV navigation. The main purpose of this work is to develop and evaluate a robust real-time localization scheme using exclusively the information from an onboard Event Camera and an Inertial Measurement Unit (IMU). Deep learning techniques and robust computer vision algorithms are combined to accurately compute the UAV pose, leveraging the strengths of well-established visual-inertial odometry algorithms and the intrinsic advantages of Event Cameras, such as high dynamic range and the absence of motion blur. Throughout this study, state-of-the-art techniques are selected, refined, implemented, and evaluated. The proposed system demonstrates good performance and acceptable precision, especially in situations with abrupt lighting changes.