Currently, most question retrieval solutions rely on supervised methods, which require expensive manual annotation, while existing unsupervised question retrieval methods fail to reinforce keyword features in question representations. To address this issue, this paper proposes an unsupervised question retrieval method called QRTM (Question Retrieval based on Topic filtering and Multi-task learning). First, a topic filtering algorithm called GF is designed: it computes the distributions of topics and keywords using topic models and sequentially extracts topic keywords from questions to construct an unsupervised training corpus. Second, a multi-task learning approach is adopted to build the question retrieval model. Three tasks are designed: a short-question contrastive learning task; a question generation task conditioned on the corresponding sequential topic keywords; and an SDN (Similarity Distribution Network) task, which combines autoencoders and attention mechanisms to measure the similarity between questions and their sequential topic keywords. The three tasks are trained in parallel to strengthen the embedding of topic keywords in question representations and capture accurate question semantics. Comparative and ablation experiments on publicly available question datasets validate that QRTM outperforms baseline methods. With the BERT pre-trained model, the P@1, MAP, and MRR metrics improve by 8.5%, 5.4%, and 5.0%, respectively, over the baseline model; with RoBERTa, the corresponding metrics improve by 4.8%, 4.3%, and 4.4%. QRTM can be applied to transfer learning with pre-trained models such as BERT and RoBERTa, effectively enhancing the accuracy of unsupervised similar-question retrieval and ranking the retrieved similar questions higher.