The proliferation of highly heterogeneous smart devices and the emergence of a wide range of diverse applications in 5G mobile network ecosystems raise a new set of challenges related to agile and automated service orchestration and management. Fully leveraging key enabling technologies such as Software Defined Networking (SDN), Network Function Virtualization (NFV) and Machine Learning (ML) in such environments is of paramount importance to address Service Function Chaining (SFC) orchestration issues according to user requirements and network constraints. To meet these challenges, we propose in this paper a Deep Reinforcement Learning (DRL) approach to investigate the online Quality of Experience (QoE)/Quality of Service (QoS) aware SFC orchestration problem. The objective of this work is to achieve intelligent, elastic and automated deployment of Virtual Network Functions (VNFs)/Container Network Functions (CNFs), optimizing the end-to-end user experience while respecting QoS constraints. We implement the DRL approach using a variant of the Deep Q-Network (DQN) algorithm referred to as Double DQN. We show how the DRL agent behaves along the learning process for different PSN scales. We also highlight the impact of a set of hyperparameters, such as batch size and learning rate, on solving the sequential decision problem related to SFC orchestration. The learning process is evaluated based on the quality of learning with respect to the number of runs. In this regard, we use the QoE metric to define a score quantifying the quality of learning.
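The Double DQN variant mentioned above decouples action selection from action evaluation: the online network picks the greedy next action while a separate target network evaluates it, which mitigates the Q-value overestimation of vanilla DQN. The following minimal sketch of the batched target computation is illustrative only; function and variable names are assumptions, not the paper's implementation.

```python
import numpy as np

def double_dqn_targets(rewards, next_q_online, next_q_target, dones, gamma=0.99):
    """Compute Double DQN bootstrap targets for a batch of transitions.

    next_q_online / next_q_target: arrays of shape (batch, n_actions)
    holding Q-values of the next states under the online and target
    networks, respectively. (Illustrative sketch, not the paper's code.)
    """
    # Online network selects the greedy action for each next state.
    best_actions = np.argmax(next_q_online, axis=1)
    # Target network evaluates the selected actions.
    evaluated = next_q_target[np.arange(len(rewards)), best_actions]
    # Terminal transitions (dones == 1) do not bootstrap.
    return rewards + gamma * (1.0 - dones) * evaluated

# Toy batch of 2 transitions with 3 candidate actions each.
r = np.array([1.0, 0.5])
q_on = np.array([[0.2, 0.9, 0.1], [0.4, 0.3, 0.8]])
q_tg = np.array([[0.1, 0.7, 0.3], [0.2, 0.6, 0.5]])
d = np.array([0.0, 1.0])
print(double_dqn_targets(r, q_on, q_tg, d))  # [1.693, 0.5]
```

In an SFC orchestration setting, an action would correspond to a placement decision for the next VNF/CNF in the chain, and the reward would encode the QoE/QoS objective.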