Evolutionary algorithms (EAs) are stochastic optimization methods inspired by genetics and natural selection, related to simulated annealing. We develop a method for finding a meaningful tradeoff between the difficulty of the analysis and the efficiency of the algorithms. Since the case of a discrete search space has been studied extensively, we develop a new stochastic model for the continuous n-dimensional case. Our model uses renewal processes to derive conditions for global convergence. A second goal of the paper is an analytical estimate of the computation time of an EA with uniform mutation inside a (hyper)sphere of volume 1, minimizing a quadratic function.
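The mutation operator analyzed in the second part can be illustrated concretely. The sketch below is a minimal Python illustration, assuming a (1+1) selection scheme and the sphere function f(x) = Σ x_i² as the quadratic objective (neither is specified by the abstract): each step adds a point drawn uniformly from the n-dimensional ball of volume 1, and the offspring replaces the parent only if it does not worsen the objective.

```python
import math
import random

def ball_radius(n):
    # Radius r of the n-ball with unit volume, from
    # V_n(r) = pi^(n/2) / Gamma(n/2 + 1) * r^n = 1.
    return (math.gamma(n / 2 + 1) / math.pi ** (n / 2)) ** (1.0 / n)

def uniform_ball(n, r):
    # Uniform point in the n-ball of radius r:
    # Gaussian direction, radius scaled by u^(1/n).
    g = [random.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in g))
    u = random.random() ** (1.0 / n)
    return [r * u * x / norm for x in g]

def quadratic(x):
    # Hypothetical quadratic objective f(x) = sum of x_i^2.
    return sum(xi * xi for xi in x)

def evolve(n, steps, seed=0):
    # (1+1)-EA with uniform mutation in the unit-volume ball;
    # accept the offspring if it does not increase f.
    random.seed(seed)
    r = ball_radius(n)
    x = [1.0] * n
    fx = quadratic(x)
    for _ in range(steps):
        step = uniform_ball(n, r)
        y = [xi + si for xi, si in zip(x, step)]
        fy = quadratic(y)
        if fy <= fx:
            x, fx = y, fy
    return fx
```

The paper's analysis concerns the expected number of such iterations until the best-so-far value falls below a given threshold; the sketch only shows the process being analyzed, not the renewal-process argument itself.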