Insights from 5 1/2 years of California MFT license exam data: Part II

In part one, we learned that there are huge differences between programs in how their graduates perform on California’s MFT licensing exams. Here, we’ll see how for-profit programs measure up. You may be surprised.

For-profit universities have come under scrutiny in the past few years for aggressive recruiting practices and high costs. While the overwhelming majority of marriage and family therapy graduate programs are non-profit (either public or private), here in California a few programs are in the business of education to make money.

Some of the scrutiny faced by for-profit universities revolves around whether they are so eager to bring in new students that they accept unqualified students who cannot succeed in their fields. Since MFT licensure requires an examination that every applicant takes, we have a handy, easily-measured research question:

How do graduates of for-profit MFT programs perform on state licensing exams, compared to graduates of non-profit programs?

From this list of for-profit colleges and universities, we can identify at least four for-profit MFT programs in California:

  1. University of Phoenix – San Diego
  2. University of Phoenix – Sacramento
  3. Argosy University
  4. California Southern University

These programs, as it turns out, are widely varied when it comes to their graduates’ exam performance.

Table 1: Pass rate, CA Standard Written Exam, Graduates of for-profit MFT programs

Considering the better-than-average performance of Argosy graduates and the worse-than-average (but by no means abysmal) performance of Phoenix graduates, it seems that little can be safely concluded about an MFT program simply on the basis of its for-profit/non-profit status. So here are three money-related questions I would ask any program, for-profit or not, if I were a prospective student:

  • A true accounting of costs. For-profit programs may be expensive, but non-profit programs can be too. Unfortunately, it is not as simple as asking “How much is the tuition?” Availability of financial aid should be a factor, particularly the question of how much aid comes in the form of loans (which need to be paid back) versus scholarships or grants (which do not). It also may be wise to ask about additional costs separate from tuition (books, fees), and whether the program will make you eligible for various stipend and loan reimbursement programs offered at the county, state, and federal levels. Students at for-profit universities appear to have particular difficulty repaying their loans.
  • Graduation rates. If programs (for-profit or not) are, in fact, admitting students who cannot succeed, that may not show up on licensing exam data; the students simply would never get that far. A key criticism of for-profit programs has been that they suffer high dropout rates, leaving students with additional debt but no additional job qualifications to show for it. Ask how many students actually complete the program relative to those who start.
  • Where your money goes. You want the bulk of your tuition money to support your learning. How much does the program spend on faculty salaries, learning technology, and other support for student learning, as opposed to administration, investments, or other costs? Naturally, some other costs are needed for any program to function. But as a general rule, the bulk of your tuition money should be going toward those things that most directly impact your educational experience.

Elsewhere on this blog I’ve described some other factors that may help you choose the best MFT graduate program for you. The questions here are more financial in nature. They’re all worth asking about.

Ultimately, I would not dismiss any of these programs simply because of their for-profit status. Any of them may be the right fit for you. Ask questions, and make sure any decision you make on a graduate education is a well-informed one.

Insights from 5 1/2 years of California MFT license exam data: Part I

California’s 80-ish degree-granting MFT programs are hard to compare. While there are minimum curriculum standards every program must meet, each has its own personality, its own goals, and its own structure. For prospective students, it can be difficult to figure out which programs offer them the best chances of success in the field. There’s really only one common yardstick that every student, from every program, ultimately gets measured on: Licensing exams.

California’s Board of Behavioral Sciences routinely publishes licensing exam success rates for each program in the state, in six-month increments. The most recent data can be viewed on their web site for the Standard Written Exam and the Written Clinical Vignette Exam. However, the small sample sizes that result from using six-month intervals make meaningful comparison difficult; smaller programs are particularly prone to large swings in their graduates’ exam pass rates from one six-month period to the next.
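To see why six-month windows are so noisy, it helps to look at how wide the uncertainty around a pass rate is at small sample sizes. Here is a minimal sketch, using a simple normal approximation to the binomial; the function name and the examinee counts are illustrative assumptions, not figures from the BBS data:

```python
import math

def pass_rate_interval(passes, n, z=1.96):
    """Approximate 95% confidence interval for a program's true pass
    rate, given `passes` successes out of `n` first-time examinees.
    Uses the normal approximation to the binomial proportion."""
    p = passes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# A small program: 7 of 10 graduates pass in one six-month window.
# The interval is enormous -- roughly 42% to 98%.
print(pass_rate_interval(7, 10))

# The same 70% observed rate pooled over years of data: 140 of 200.
# Now the interval is roughly 64% to 76% -- far more informative.
print(pass_rate_interval(140, 200))
```

The same observed rate is nearly meaningless at n = 10 and fairly precise at n = 200, which is exactly the motivation for pooling five and a half years of results.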

I gathered the BBS data going all the way back to 2004 to see whether bigger sample sizes might allow for some more solid conclusions — or at least better-informed guesses — about how MFT programs around the state compare to one another.

Before we dive in, some pretty major caveats need to be put forward.

  1. My student assistants and I did the best we could to cross-check and validate the data, but we cannot guarantee that we did a perfect job. You are cordially invited to check our work (details at the end of this post).
  2. Lots of factors influence the exam pass rates of a particular school’s graduates, separate from the quality of education: the program’s choices about whom to admit, graduates’ experiences in supervision, decisions by students about whether to pursue MFT licensure, and on and on. So if a program’s graduates performed especially well or poorly, that does not necessarily mean that the program itself performs that way.
  3. To whatever degree passing or failing the exam does reflect on a program, it reflects on that program’s performance several years prior to the exam. When you’re looking at data going back to 2004, as we are here, we’re considering the impact of an education received as far back as around 2000, possibly even earlier. Programs change.
  4. If you’re a prospective MFT student, exam pass rates are certainly not the only things to consider when choosing a program. They can be useful to include in your decision-making, but please do not let them be the deciding factor.

Got all that? Great! With those cautionary notes in mind, let’s dive in. We’ll focus here on the first licensing exam, the Standard Written Exam.

I’ve had student assistants input and cross-check the data, and we’ve done some analysis using Excel and SPSS (now PASW, for statistical-software purists). The data shows some clear trends.

  • There are big differences between programs. Don’t let anyone tell you that graduate programs are basically interchangeable. They may all be subject to the same MFT curriculum requirements, but some appear to be far more effective than others in preparing their students for the licensing exams. (Education is not the only influence on exam preparedness, of course, but this data does suggest that it is a meaningful one.)

    Note: Try as I might, I could not get the tables to display well inline here. So they’ve been shifted over to the server, where they display properly. -bc

    Table 1: Best and worst performing programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009 (Minimum 50 first-time examinees)

  • Accreditation matters. Graduates of COAMFTE-accredited programs were more successful on licensing exams than graduates of non-COAMFTE programs. While my own program at Alliant International University did better than the state average, much of the COAMFTE benefit seems to come from the strength of the University of San Diego. (I have a more detailed exploration of the link between program accreditation and licensing exam success in press at the Journal of Marital and Family Therapy.)

    Table 2: COAMFTE-accredited programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009
    * – Accredited as of 1/1/2004.

  • Size doesn’t matter. Graduates of smaller programs did no better or worse overall than graduates of bigger programs. And the biggest programs were not necessarily the best. Far from it, in fact. Graduates of National University, the state’s largest MFT program by number of examinees, performed well below state averages on the Standard Written Exam:

    Table 3: Most first-time examinees, California MFT Standard Written Exam, 1/1/2004-6/30/2009
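Claims like these, that one group of programs out- or under-performs another, can be checked formally with a standard two-proportion z-test. Here is a hedged sketch; the function name and all the examinee counts below are made-up illustrations, not numbers from the BBS tables:

```python
import math

def two_proportion_z(pass1, n1, pass2, n2):
    """Pooled two-proportion z statistic for comparing two observed
    pass rates. |z| > 1.96 suggests a difference at the 5% level."""
    p1, p2 = pass1 / n1, pass2 / n2
    pooled = (pass1 + pass2) / (n1 + n2)          # pooled pass rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical comparison: 85% of 120 examinees pass from one group of
# programs vs. 70% of 300 from another.
z = two_proportion_z(102, 120, 210, 300)
print(round(z, 2))  # prints 3.18, well above the 1.96 cutoff
```

A test like this is one simple way to check that an apparent gap between, say, accredited and non-accredited programs is larger than sampling noise alone would produce.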

These findings are only the beginning, of course. There is a lot to be gleaned from the available data, for programs and prospective students alike. I’ll be doing some additional posts with more comparisons here in the coming weeks to illustrate some more of the interesting (I hope!) things we found.

One big plus about working with BBS data is that it’s all public information. So I feel an obligation to make sure others can review it, call out any errors they find, and do additional research with it as they see fit. All of the information on which these tables were based is available now at There you will find the BBS source documents that we put together, as well as a searchable database so you can compare your program with others around the state. Your comments and suggestions are always welcome; I hope this is a useful resource!

Coming in Part II: Comparing for-profit programs with not-for-profit.