In this talk, we focus on solving a sequence of linear systems that share an identical (or similar) coefficient matrix. The systems must be processed sequentially because each right-hand side vector depends on the solution of the preceding system. For this setting, we investigate subspace correction and deflation methods to accelerate the convergence of Krylov subspace methods. In practice, these acceleration methods work well when the range of the auxiliary matrix contains approximations of the eigenvectors corresponding to small eigenvalues of the coefficient matrix. We have developed a new method for constructing the auxiliary matrix that identifies approximate eigenvectors associated with small eigenvalues by sampling error vectors during the preceding solution step. Numerical tests confirm that both subspace correction and deflation with the generated auxiliary matrix accelerate the convergence of the iterative solver. We also briefly mention that our method can be used to estimate the condition number of the coefficient matrix.
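To illustrate the general idea (not the authors' specific construction), the following is a minimal sketch of a subspace-correction step combined with plain conjugate gradients, assuming a hypothetical auxiliary matrix W whose columns approximate eigenvectors of A with small eigenvalues; the names `subspace_correction` and `cg_with_correction` are illustrative only.

```python
# Minimal sketch, assuming A is symmetric positive definite and W (n x k)
# is an auxiliary matrix spanning approximate small-eigenvalue eigenvectors.
import numpy as np

def subspace_correction(A, W, r):
    """Coarse correction: solve the Galerkin system (W^T A W) y = W^T r."""
    coarse = W.T @ (A @ W)               # k x k projected matrix
    y = np.linalg.solve(coarse, W.T @ r)
    return W @ y                         # correction lifted to the full space

def cg_with_correction(A, b, W, tol=1e-10, maxit=500):
    """CG on A x = b, with a subspace-correction step as the initial guess."""
    x = subspace_correction(A, W, b)     # removes error components in range(W)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

If W captures the eigenvectors associated with the smallest eigenvalues, the corrected iteration behaves as if the effective condition number were reduced, which is the mechanism the talk exploits; the Galerkin matrix W^T A W can likewise be used to estimate extremal eigenvalues and hence the condition number.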