Abstract: The high-penetration integration of renewable energy sources such as wind and photovoltaic (PV) power into the power grid has emerged as an essential initiative to mitigate the global energy crisis. However, the intermittency and volatility of renewable energy sources pose challenges to system reliability. A real-time scheduling model for operation optimization based on a deep reinforcement learning (DRL) algorithm is introduced to enhance renewable energy utilization while guaranteeing system security. First, a load forecasting model is constructed, and a Gaussian mixture model is applied to fit the forecasting errors. Second, an optimization model is formulated that takes system operating cost and safe operation as the optimization objectives, subject to the constraints at each node of the system. The optimization problem is then converted into a Markov decision process (MDP) and solved with the twin delayed deep deterministic policy gradient (TD3) algorithm. Finally, the optimal joint scheduling strategy is obtained through the environment-interaction mechanism and exploratory policy of the DRL algorithm. Experimental results demonstrate that the proposed method has excellent adaptability and supports online real-time scheduling.
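
To make the first step and the MDP reward design concrete, the following Python sketch fits a Gaussian mixture model to synthetic load-forecast errors with scikit-learn and evaluates a toy one-step dispatch reward combining operating cost and a security penalty. The data, component count, price, and penalty coefficient are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): fit a GMM to load-forecast errors
# and evaluate a toy one-step dispatch reward with assumed cost coefficients.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic forecast errors (MW): a mixture of small and large deviations.
errors = np.concatenate([rng.normal(0.0, 5.0, 800),
                         rng.normal(12.0, 3.0, 200)]).reshape(-1, 1)

# Fit a Gaussian mixture model to the error distribution (2 components assumed).
gmm = GaussianMixture(n_components=2, random_state=0).fit(errors)

def step_reward(load_forecast_mw, gen_setpoint_mw,
                price_per_mwh=40.0, penalty_per_mw=500.0):
    """Toy one-step reward for a dispatch MDP: negative operating cost minus
    a penalty on the power imbalance after sampling a forecast error."""
    sampled_error = gmm.sample(1)[0].item()           # draw one error from the fitted GMM
    realized_load = load_forecast_mw + sampled_error  # realized net load
    imbalance = abs(realized_load - gen_setpoint_mw)  # unmet or surplus power
    cost = gen_setpoint_mw * price_per_mwh            # simple linear generation cost
    return -(cost + penalty_per_mw * imbalance)

print("GMM component means (MW):", gmm.means_.ravel())
print("example reward:", step_reward(load_forecast_mw=100.0, gen_setpoint_mw=100.0))
```

In a DRL setting such as TD3, a reward of this shape would be returned by the environment at each scheduling interval, with the agent's action supplying the generator setpoints.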