Fluorescence microscopy image restoration (FMIR) has received wide attention in the life sciences and, benefiting from deep learning (DL), has made significant progress. However, most current DL-based FMIR methods must train a task-specific deep model from scratch on a dedicated dataset for each FMIR problem, such as super-resolution (SR), denoising, isotropic reconstruction, projection, and volume reconstruction. The performance and practicality of these FMIR models are limited by the burdensome training, the difficulty of obtaining high-quality training images, and their limited generalization ability. Meanwhile, pre-trained foundation models have achieved significant breakthroughs in computer vision (CV) and natural language processing (NLP), demonstrating the power of the pre-training and fine-tuning paradigm. Here, inspired by the success of pre-trained foundation models in artificial intelligence (AI), we provide a universal solution to different FMIR problems by presenting a unified FMIR foundation model (UniFMIR), which achieves higher image precision, better generalization performance, and efficient, low-cost training of task-specific models. Experimental results on five FMIR tasks and nine datasets, covering a wide range of fluorescence microscopy imaging modalities and biological samples, demonstrate the strong capability of UniFMIR to handle diverse FMIR situations with a single model. UniFMIR, pre-trained on the large-scale dataset we collected, effectively transfers the knowledge learned during pre-training to a specific FMIR situation through fine-tuning and obtains a significant performance improvement, uncovering clear nanoscale cell structures and facilitating high-quality imaging of live samples. This work is the first to explore the potential of foundation models for FMIR. We hope it inspires more researchers to further explore DL-based FMIR and triggers new research on FMIR model pre-training and development.
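To make the pre-training and fine-tuning paradigm referred to above concrete, the following is a minimal PyTorch-style sketch of fine-tuning a pretrained restoration backbone on a task-specific dataset (e.g., denoising). It is an illustration only: the `Backbone` module, the `finetune` helper, the weight file name, and the synthetic data are hypothetical placeholders and do not reproduce the actual UniFMIR architecture, loss, or data pipeline.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder backbone: a small convolutional restoration network.
# The real UniFMIR architecture is not reproduced here.
class Backbone(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(channels, 1, 3, padding=1)  # task-specific output head

    def forward(self, x):
        return self.head(self.encoder(x))

def finetune(model, loader, epochs=1, lr=1e-4):
    """Fine-tune a (pretrained) model on pairs of degraded and ground-truth images."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.L1Loss()
    model.train()
    for _ in range(epochs):
        for degraded, clean in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(degraded), clean)
            loss.backward()
            optimizer.step()
    return model

if __name__ == "__main__":
    model = Backbone()
    # In practice, pretrained weights would be loaded before fine-tuning, e.g.:
    # model.load_state_dict(torch.load("pretrained_weights.pth"))  # hypothetical file
    # Synthetic stand-in for a small task-specific fine-tuning set.
    degraded = torch.rand(16, 1, 64, 64)
    clean = torch.rand(16, 1, 64, 64)
    loader = DataLoader(TensorDataset(degraded, clean), batch_size=4, shuffle=True)
    finetune(model, loader, epochs=1)
```

In this paradigm, only the relatively cheap fine-tuning stage is repeated per FMIR task, while the expensive large-scale pre-training is performed once and shared across tasks.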