Current methods for fitting cognitive diagnosis models (CDMs) to educational data typically rely on expectation maximization (EM) or Markov chain Monte Carlo (MCMC) for estimating the item parameters and examinees' proficiency-class memberships. However, for advanced, more complex CDMs such as the reduced reparameterized unified model (Reduced RUM) and the (saturated) loglinear cognitive diagnosis model (LCDM), EM and MCMC are often reported to consume excessive CPU time. Joint maximum likelihood estimation (JMLE) is proposed as an alternative to EM and MCMC. The maximization of the joint likelihood is typically accomplished in a few iterations, thereby drastically reducing the CPU time usually needed for fitting advanced CDMs like the Reduced RUM or the (saturated) LCDM. As another attractive feature, the JMLE algorithm presented here resolves the traditional issue of JMLE estimators, namely their lack of statistical consistency, by using an external, statistically consistent estimator to obtain initial estimates of examinees' class memberships as starting values. It can be proven that under this condition the JMLE item parameter estimators are also statistically consistent. The computational performance of the proposed JMLE algorithm is evaluated in two comprehensive simulation studies.
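The alternating scheme underlying JMLE can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the paper's algorithm: it uses the simple DINA model (not the Reduced RUM or LCDM named above) with a hand-built Q-matrix, and it perturbs the true class memberships as a stand-in for the external, statistically consistent estimator that supplies the starting values. The two steps inside the loop are the generic JMLE moves: maximize the joint likelihood over item parameters with memberships fixed, then over memberships with item parameters fixed.

```python
import numpy as np

# Minimal JMLE sketch for a CDM, using the DINA model for concreteness
# (an assumption; the models in the abstract are more complex, but the
# alternating-maximization idea is the same).
rng = np.random.default_rng(0)
N, K = 500, 3                              # examinees, attributes
Q = np.tile(np.eye(K, dtype=int), (4, 1))  # Q-matrix: 4 single-attribute items per attribute
J = Q.shape[0]                             # number of items

# Simulate responses under DINA with common slip (0.1) and guess (0.2) rates
alpha_true = rng.integers(0, 2, size=(N, K))
eta_true = (alpha_true @ Q.T == Q.sum(axis=1)).astype(int)
X = (rng.random((N, J)) < eta_true * 0.9 + (1 - eta_true) * 0.2).astype(int)

# Enumerate all 2^K latent classes and their item-mastery indicators
classes = np.array([[(c >> k) & 1 for k in range(K)] for c in range(2 ** K)])
eta_c = (classes @ Q.T == Q.sum(axis=1)).astype(int)          # (2^K, J)

# Starting memberships: the paper uses an external, consistent estimator;
# here we simply perturb the truth as a stand-in.
true_idx = (alpha_true * 2 ** np.arange(K)).sum(axis=1)
idx = true_idx.copy()
flip = rng.random(N) < 0.2
idx[flip] = rng.integers(0, 2 ** K, size=flip.sum())

eps = 1e-6
for _ in range(50):                        # JMLE: alternate two maximizations
    # (1) Item parameters given memberships: closed-form slip/guess MLEs
    eta_i = eta_c[idx]                     # (N, J) mastery under current classes
    n1 = np.maximum(eta_i.sum(axis=0), 1)
    n0 = np.maximum(N - eta_i.sum(axis=0), 1)
    guess = np.clip((X * (1 - eta_i)).sum(axis=0) / n0, eps, 1 - eps)
    slip = np.clip(1 - (X * eta_i).sum(axis=0) / n1, eps, 1 - eps)
    # (2) Memberships given item parameters: per-examinee argmax log-likelihood
    p_c = eta_c * (1 - slip) + (1 - eta_c) * guess            # (2^K, J)
    ll = X @ np.log(p_c).T + (1 - X) @ np.log1p(-p_c).T       # (N, 2^K)
    new_idx = ll.argmax(axis=1)
    if np.array_equal(new_idx, idx):       # typically converges in a few passes
        break
    idx = new_idx
```

Because both steps have closed-form or simple argmax solutions, each pass over the data is cheap, which is the source of the CPU-time advantage the abstract describes; the quality of the starting memberships governs the consistency argument.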