An efficient refinement algorithm is proposed for symmetric eigenvalue problems. The algorithm is simple, consisting primarily of matrix multiplications. It constructs an arbitrarily accurate eigenvalue decomposition by iteration, up to the limit of computational precision. Because the proposed algorithm is based on Newton's method, we show that it converges quadratically for simple eigenvalues that are sufficiently well separated. Numerical results demonstrate the excellent performance of the proposed algorithm in terms of convergence rate and overall computational cost, and show that it is considerably faster than a straightforward approach using multiple-precision arithmetic. Unfortunately, the algorithm is not applicable to a matrix having nearly multiple eigenvalues unless a sufficiently accurate initial guess is provided. However, it is worth noting that the proposed algorithm does work for exactly multiple eigenvalues.
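To make the flavor of such a Newton-type refinement concrete, the sketch below implements one plausible correction step built only from matrix multiplications: from an approximate eigenvector matrix `X` it forms the orthogonality residual `R = I - XᵀX` and the near-diagonal matrix `S = XᵀAX`, then solves the linearized equations for a correction `E` applied as `X ← X + XE`. The specific update formulas are a first-order reconstruction for illustration, not quoted from the paper, and (consistent with the abstract) the division by eigenvalue gaps shows why nearly multiple eigenvalues are problematic.

```python
import numpy as np

def refine_eig(A, X):
    """One Newton-type refinement step for an approximate eigenvector
    matrix X of a symmetric matrix A (illustrative sketch; the update
    formulas are our own first-order reconstruction, not the paper's).

    Returns the refined X and the current eigenvalue estimates."""
    n = A.shape[0]
    R = np.eye(n) - X.T @ X      # residual of orthogonality X^T X = I
    S = X.T @ A @ X              # nearly diagonal if X is nearly exact
    lam = np.diag(S)             # approximate eigenvalues
    E = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                # diagonal correction restores normalization
                E[i, i] = R[i, i] / 2.0
            else:
                # off-diagonal correction: division by the eigenvalue
                # gap is why clustered eigenvalues cause trouble
                E[i, j] = (S[i, j] + lam[j] * R[i, j]) / (lam[j] - lam[i])
    return X + X @ E, lam
```

Starting from a double-precision decomposition, a few such iterations (carried out in higher precision) drive the residuals `AX - XΛ` and `XᵀX - I` down to the working precision, with the quadratic error reduction the abstract describes.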