Convergence analysis of a random search algorithm helps to verify whether, and how quickly, the algorithm converges to the point of interest. The butterfly optimization algorithm (BOA) is a popular population-based stochastic optimizer inspired by the foraging behavior of butterflies in nature. In this paper, we develop a Markov chain model of the BOA and analyze the convergence behavior of the algorithm. The model is constructed by showing that the population sequence generated by the algorithm is a finite homogeneous Markov chain and that the defined population state set is reducible. Convergence of the algorithm is then analyzed mathematically through this Markov chain model using the global convergence theorem, which requires a random search algorithm to satisfy two conditions. The BOA is shown to satisfy both conditions, which guarantees its global convergence. We also show experimentally that a guarantee of convergence does not always translate into a fast rate of convergence, since the rate is influenced by various other factors.
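For readers unfamiliar with the algorithm under analysis, the following is a minimal sketch of the standard BOA update rules in Python. It assumes the usual formulation: each butterfly emits a fragrance f = c * I^a (with sensory modality c, power exponent a, and stimulus intensity I derived from fitness), and at each step performs either a global move toward the best-known solution (with switch probability p) or a local move using two random peers. The parameter values, the intensity mapping I = 1/(1 + fitness), and the sphere test function are illustrative choices, not taken from this paper.

```python
import numpy as np

def boa_minimize(obj, dim=5, n_butterflies=20, iters=200,
                 c=0.01, a=0.1, p=0.8, bounds=(-5.0, 5.0), seed=0):
    """Minimal sketch of the butterfly optimization algorithm (BOA).

    Each butterfly i emits a fragrance f_i = c * I_i**a; with switch
    probability p it moves toward the global best (global search),
    otherwise toward a random pair of peers (local search).
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_butterflies, dim))
    fit = np.apply_along_axis(obj, 1, X)
    best_idx = fit.argmin()
    g_best, g_fit = X[best_idx].copy(), fit[best_idx]

    for _ in range(iters):
        # Stimulus intensity: 1/(1 + fitness) so that better (lower-cost)
        # solutions emit a stronger fragrance (an illustrative choice)
        I = 1.0 / (1.0 + fit)
        frag = c * I**a
        for i in range(n_butterflies):
            r = rng.random()
            if rng.random() < p:
                # Global search phase: move toward the best-known butterfly
                X[i] = X[i] + (r**2 * g_best - X[i]) * frag[i]
            else:
                # Local search phase: move using two randomly chosen peers
                j, k = rng.integers(0, n_butterflies, size=2)
                X[i] = X[i] + (r**2 * X[j] - X[k]) * frag[i]
            X[i] = np.clip(X[i], lo, hi)
        fit = np.apply_along_axis(obj, 1, X)
        best_idx = fit.argmin()
        if fit[best_idx] < g_fit:
            g_best, g_fit = X[best_idx].copy(), fit[best_idx]
    return g_best, g_fit

# Illustrative run on the sphere function (global minimum 0 at the origin)
sphere = lambda x: float(np.sum(x**2))
best, val = boa_minimize(sphere)
```

Note that the population sequence produced by this loop depends only on the current population and fixed parameters, which is what makes it a finite homogeneous Markov chain in the analysis above.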