We propose two approximate versions of the first-order primal-dual algorithm (PDA) for solving a class of convex-concave saddle point problems. The introduced approximation criteria are easy to implement in the sense that they involve only the subgradient of a certain function at the current iterate. The first approximate PDA solves both subproblems inexactly and adopts absolute error criteria, which are based on nonnegative summable sequences. The second approximate PDA, assuming that one of the PDA subproblems can be solved exactly, solves the other subproblem approximately and adopts a relative error criterion. The relative error criterion involves only a single parameter in [0, 1), which makes the method easier to apply. For both versions, we establish global convergence and an O(1/N) rate of convergence measured by the iteration complexity, where N counts the number of iterations. Under the further assumptions that some of the underlying functions, or all of them, are strongly convex, we show accelerated O(1/N^2) and linear rates of convergence, respectively, for the inexact PDA with absolute error criteria. We then show that these inexact criteria extend to a class of more general problems. Finally, we perform numerical experiments on sparse recovery and image processing problems, and the results demonstrate the feasibility and superiority of the proposed methods.
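For context, the following sketch records the standard saddle point model and one exact iteration of the classical first-order PDA (in the style of Chambolle and Pock), which the proposed methods approximate. The notation here (linear operator K, closed convex functions g and f^*, step sizes \tau, \sigma, extrapolation parameter \theta) is assumed for illustration and is not taken from the abstract itself.

```latex
% Saddle point model: K a linear operator, g and f^* proper closed convex
%   \min_x \max_y \; \langle Kx, y \rangle + g(x) - f^*(y)
% One exact PDA iteration, step sizes \tau, \sigma > 0, \theta \in [0,1]:
\begin{aligned}
  x^{k+1}       &= \operatorname{prox}_{\tau g}\!\left(x^{k} - \tau K^{\top} y^{k}\right),\\
  \bar{x}^{k+1} &= x^{k+1} + \theta\,\bigl(x^{k+1} - x^{k}\bigr),\\
  y^{k+1}       &= \operatorname{prox}_{\sigma f^*}\!\left(y^{k} + \sigma K \bar{x}^{k+1}\right).
\end{aligned}
```

The approximate versions described in the abstract replace the exact proximal (subproblem) solutions above with inexact ones, controlled either by summable absolute error sequences or by a single relative error parameter in [0, 1).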