360-degree video provides users with an interactive experience to explore scenes freely. However, because only a small portion of the entire video, called the viewport, is watched at any point in time, transmitting the entire video is bandwidth-consuming. Since the perceptual quality of such video mainly depends on the quality of the viewport, more bandwidth should be assigned to these important parts of the scene. Hence, understanding how people observe and explore 360-degree content is essential. In this paper, we propose a new two-level rate control algorithm that allocates more bits to encoding the viewport parts of a 360-degree video. The head and eye movements of observers are used to investigate visual attention and detect the viewports. Then, a Coding Tree Unit (CTU) level rate assignment approach is proposed to assign a proper number of bits to each CTU in the viewport and non-viewport parts. Assuming that higher motion complexity results in a higher bitrate of the encoded video, we propose to assign bits to each CTU according to its motion complexity. A further novel contribution of our approach is a new metric that parameterizes the motion complexity of each CTU using the high-order motion models of the Versatile Video Coding (VVC) standard. Experimental results show that our proposed rate control achieves, on average, a 58.27% bitrate reduction in terms of the Bjøntegaard delta bitrate, compared to the standard VVC rate control. Furthermore, the proposed scheme provides significantly better subjective viewing quality than state-of-the-art methods.