California’s 80-ish degree-granting MFT programs are hard to compare. While there are minimum curriculum standards every program must meet, each has its own personality, its own goals, and its own structure. For prospective students, it can be difficult to figure out which programs offer them the best chances of success in the field. There’s really only one common yardstick that every student, from every program, ultimately gets measured on: licensing exams.
California’s Board of Behavioral Sciences routinely publishes licensing exam success rates for each program in the state, in six-month increments. The most recent data can be viewed on their web site for the Standard Written Exam and the Written Clinical Vignette Exam. However, the small sample sizes that result from using six-month intervals make meaningful comparison difficult; smaller programs are particularly prone to large swings in their graduates’ exam pass rates from one six-month period to the next.
I gathered the BBS data going all the way back to 2004 to see whether bigger sample sizes might allow for some more solid conclusions — or at least better-informed guesses — about how MFT programs around the state compare to one another.
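(For the technically inclined: if you’d like to replicate the pooling step yourself, here’s a minimal sketch in Python. The file name and column names are my own placeholders for a transcription of the BBS reports, not any official format.)

```python
import csv
from collections import defaultdict

# Pool the six-month BBS reports into one running total per program.
taken = defaultdict(int)   # first-time examinees per program
passed = defaultdict(int)  # first-time passes per program

with open("bbs_exam_results.csv", newline="") as f:  # placeholder file name
    for row in csv.DictReader(f):
        taken[row["program"]] += int(row["first_time_taken"])
        passed[row["program"]] += int(row["first_time_passed"])

# Pooled first-time pass rate per program across all periods since 2004.
pooled = {p: passed[p] / taken[p] for p in taken if taken[p] > 0}
```

Note that pooling the raw counts (rather than averaging the six-month rates) weights every examinee equally, which is what you want when the period-to-period sample sizes vary as much as they do here.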
Before we dive in, some pretty major caveats need to be put forward:
1. My student assistants and I did the best we could to cross-check and validate the data, but we cannot guarantee that we did a perfect job. You are cordially invited to check our work (details at the end of this post).
2. Lots of factors influence the exam pass rates of a particular school’s graduates, separate from the quality of education: the program’s choices about whom to admit, graduates’ experiences in supervision, decisions by students about whether to pursue MFT licensure, and so on. So if a program’s graduates performed especially well or poorly, that does not necessarily mean the program itself performs that way.
3. To whatever degree passing or failing the exam reflects on a program, it reflects on that program’s performance several years prior to the exam. With data going back to 2004, as we have here, we’re considering the impact of an education received as far back as 2000, possibly even earlier. Programs change.
4. If you’re a prospective MFT student, exam pass rates are certainly not the only thing to consider when choosing a program. They can be useful to include in your decision-making, but please don’t let them be the deciding factor.
Got all that? Great! With those cautionary notes in mind, let’s dive in. We’ll focus here on the first licensing exam, the Standard Written Exam.
I’ve had student assistants input and cross-check the data, and we’ve done some analysis in Excel and SPSS (now PASW, for statistical-software purists). The data shows some clear trends.
- There are big differences between programs. Don’t let anyone tell you that graduate programs are basically interchangeable. They may all be subject to the same MFT curriculum requirements, but some appear to be far more effective than others in preparing their students for the licensing exams. (Education is not the only influence on exam preparedness, of course, but this data does suggest that it is a meaningful one.)
Note: Try as I might, I could not get the tables to display well inline here. So they’ve been shifted over to the MFTEducation.com server, where they display properly. -bc
Table 1: Best and worst performing programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009 (Minimum 50 first-time examinees)
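If you’re curious how a ranking like Table 1’s falls out of the data, it’s just the pooled rates above with a minimum-sample cutoff applied. This sketch continues from the earlier one, using the same placeholder data:

```python
MIN_N = 50  # same cutoff as Table 1: at least 50 first-time examinees

# Rank programs by pooled pass rate, highest first.
ranked = sorted(
    ((rate, prog) for prog, rate in pooled.items() if taken[prog] >= MIN_N),
    reverse=True,
)
best_five, worst_five = ranked[:5], ranked[-5:]
```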
- Accreditation matters. Graduates of COAMFTE-accredited programs were more successful on licensing exams than graduates of non-COAMFTE programs. While my own program at Alliant International University did better than the state average, much of the COAMFTE benefit seems to come from the strength of the University of San Diego. (I have a more detailed exploration of the link between program accreditation and licensing exam success in press at the Journal of Marital and Family Therapy.)
Table 2: COAMFTE-accredited programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009
* – Accredited as of 1/1/2004.
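If you want to test the accreditation difference yourself, one reasonable approach is a two-proportion z-test on the pooled pass counts. This sketch continues from the earlier ones; the program names in the set are placeholders, not the actual accreditation list (that list, as of 1/1/2004, is in the source data):

```python
from math import sqrt

# Placeholder set: the real COAMFTE list (as of 1/1/2004) is in the source data.
coamfte = {"University of San Diego", "Alliant International University"}

def group_totals(programs):
    """Sum first-time passes and examinees over a set of programs."""
    return (sum(passed[p] for p in programs), sum(taken[p] for p in programs))

pass_a, n_a = group_totals(coamfte & set(taken))
pass_b, n_b = group_totals(set(taken) - coamfte)

# Two-proportion z-test for the difference in pooled pass rates.
p_hat = (pass_a + pass_b) / (n_a + n_b)
se = sqrt(p_hat * (1 - p_hat) * (1 / n_a + 1 / n_b))
z = (pass_a / n_a - pass_b / n_b) / se
```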
- Size doesn’t matter. Graduates of smaller programs did no better or worse overall than graduates of bigger programs. And the biggest programs were not necessarily the best. Far from it, in fact. Graduates of National University, the state’s largest MFT program by number of examinees, performed well below state averages on the Standard Written Exam:
Table 3: Most first-time examinees, California MFT Standard Written Exam, 1/1/2004-6/30/2009
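As a quick check on the size question, you could correlate each program’s examinee count with its pooled pass rate. Continuing the earlier sketch (Python 3.10+ for statistics.correlation):

```python
from statistics import correlation  # requires Python 3.10+

programs = [p for p in pooled if taken[p] >= MIN_N]
sizes = [taken[p] for p in programs]
rates = [pooled[p] for p in programs]

# A value near zero would be consistent with "size doesn't matter."
r = correlation(sizes, rates)
```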
These findings are only the beginning, of course. There is a lot to be gleaned from the available data, for programs and prospective students alike. I’ll be doing additional posts in the coming weeks with more comparisons, illustrating more of the interesting (I hope!) things we found.
One big plus about working with BBS data is that it’s all public information. So I feel an obligation to make sure others can review it, call out any errors they find, and do additional research with it as they see fit. All of the information on which these tables were based is available now at www.MFTeducation.com. There you will find the BBS source documents we compiled, as well as a searchable database so you can compare your program with others around the state. Your comments and suggestions are always welcome; I hope this is a useful resource!
Coming in Part II: Comparing for-profit programs with not-for-profit.