In this work, we study the low-rank matrix completion (LRMC) problem, which arises in many data science and machine learning applications. We develop and analyze a gradient descent (GD) based solution, termed Alternating GD and Minimization (AltGDmin), for efficiently recovering an $n \times q$ rank-$r$ matrix $\Xstar$ from a subset of its entries when $r \ll \min(n, q)$. Our theoretical guarantees, which include iteration and sample complexity bounds, show that AltGDmin is the most communication-efficient solution in a federated setting, is one of the fastest methods available, and achieves the second-best sample complexity among all iterative solutions to LRMC. Furthermore, we establish convergence results for noisy LRMC, and we show how our lemmas can be used to improve the sample complexity guarantee for AltMin, which is the fastest centralized solution. Overall, our findings underscore the value of exploiting the low-dimensional structure of data to recover missing entries and improve downstream prediction in matrix completion.
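To make the alternating structure concrete, the following is a minimal NumPy sketch of one way an AltGDmin-style iteration for LRMC could look: a spectral initialization, an exact minimization over the factor $B$ (column-wise least squares), and a GD step on $U$ followed by re-orthonormalization. The function name, default step size, and fixed iteration count are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def altgdmin_lrmc(Y, mask, r, T=100, eta=0.5):
    """Illustrative sketch of an AltGDmin-style iteration for LRMC.

    Y    : n x q array holding the observed entries (zeros elsewhere)
    mask : n x q boolean array, True where an entry is observed
    r    : target rank (r << min(n, q))
    T    : number of iterations (assumed fixed here for simplicity)
    eta  : GD step size; a tunable constant in this sketch (a guarantee
           would prescribe a specific scaling)
    """
    n, q = Y.shape
    p = mask.mean()  # empirical fraction of observed entries

    # Spectral initialization: top-r left singular vectors of (1/p) * Y.
    U, _, _ = np.linalg.svd(Y / p, full_matrices=False)
    U = U[:, :r]

    for _ in range(T):
        # Minimization step: given U, solve a small least-squares
        # problem for each column of B over its observed rows.
        B = np.zeros((r, q))
        for k in range(q):
            rows = mask[:, k]
            B[:, k], *_ = np.linalg.lstsq(U[rows], Y[rows, k], rcond=None)

        # GD step on U using the residual on observed entries only,
        # then re-orthonormalize U via QR.
        resid = mask * (U @ B - Y)
        U = U - (eta / p) * (resid @ B.T)
        U, _ = np.linalg.qr(U)

    return U, B  # rank-r estimate of the full matrix is U @ B
```

Because $B$ decouples across columns, the minimization step is cheap and, in a federated setting, each node can update the columns corresponding to its own data while only the $n \times r$ factor $U$ is communicated; this is the source of the communication efficiency claimed above.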