Insights from 5 1/2 years of California MFT license exam data: Part I

California’s 80-ish degree-granting MFT programs are hard to compare. While there are minimum curriculum standards every program must meet, each has its own personality, its own goals, and its own structure. For prospective students, it can be difficult to figure out which programs offer them the best chances of success in the field. There’s really only one common yardstick that every student, from every program, ultimately gets measured on: Licensing exams.

California’s Board of Behavioral Sciences routinely publishes licensing exam success rates for each program in the state, in six-month increments. The most recent data can be viewed on their web site for the Standard Written Exam and the Written Clinical Vignette Exam. However, the small sample sizes that result from using six-month intervals make meaningful comparison difficult; smaller programs are particularly prone to large swings in their graduates’ exam pass rates from one six-month period to the next.

I gathered the BBS data going all the way back to 2004 to see whether bigger sample sizes might allow for some more solid conclusions — or at least better-informed guesses — about how MFT programs around the state compare to one another.
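
To make the sample-size point concrete, here is a minimal sketch (in Python, with entirely made-up numbers; the real analysis was done in Excel and SPSS) of why pooling six-month reporting periods gives a steadier picture of a program’s pass rate:

    # Illustration only: hypothetical pass/fail counts for one program,
    # reported in six-month periods the way the BBS publishes them.
    periods = [(4, 6), (9, 12), (5, 11), (7, 8), (6, 13), (8, 10)]  # (passed, took)

    # Per-period pass rates swing wildly with so few examinees...
    for passed, took in periods:
        print(f"{passed}/{took} = {passed / took:.0%}")

    # ...but pooling all periods yields one rate based on a much larger sample.
    total_passed = sum(p for p, _ in periods)
    total_took = sum(t for _, t in periods)
    pooled = total_passed / total_took
    print(f"Pooled: {total_passed}/{total_took} = {pooled:.0%}")

    # The rough margin of error (binomial standard error) shrinks as the
    # sample grows, which is the point of aggregating 5 1/2 years of data.
    se = (pooled * (1 - pooled) / total_took) ** 0.5
    print(f"Approximate standard error of the pooled rate: {se:.1%}")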

Before we dive in, some pretty major caveats need to be put forward:

  • My student assistants and I did the best we could to cross-check and validate the data, but we cannot guarantee that we did a perfect job. You are cordially invited to check our work (details at the end of this post).

  • Lots of factors influence the exam pass rates of a particular school’s graduates, separate from the quality of education: the program’s choices about whom to admit, graduates’ experiences in supervision, students’ decisions about whether to pursue MFT licensure, and on and on. So if a program’s graduates performed especially well or poorly, that does not necessarily mean the program itself performs that way.

  • To whatever degree passing or failing the exam does reflect on a program, it reflects on that program’s performance several years prior to the exam. Data going back to 2004, as we have here, captures the impact of an education received as far back as around 2000, possibly even earlier. Programs change.

  • If you’re a prospective MFT student, exam pass rates are certainly not the only thing to consider when choosing a program. They can be useful in your decision-making, but please don’t give them too much weight.

Got all that? Great! With those cautionary notes in mind, let’s dive in. We’ll focus here on the first licensing exam, the Standard Written Exam.

I’ve had student assistants input and cross-check the data, and we’ve done some analysis in Excel and SPSS (now PASW, for statistical-software purists). The data shows some clear trends.

  • There are big differences between programs. Don’t let anyone tell you that graduate programs are basically interchangeable. They may all be subject to the same MFT curriculum requirements, but some appear to be far more effective than others in preparing their students for the licensing exams. (Education is not the only influence on exam preparedness, of course, but this data does suggest that it is a meaningful one.)

    Note: Try as I might, I could not get the tables to display well inline here. So they’ve been shifted over to the MFTEducation.com server, where they display properly. -bc

    Table 1: Best and worst performing programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009 (Minimum 50 first-time examinees)

  • Accreditation matters. Graduates of COAMFTE-accredited programs were more successful on licensing exams than graduates of non-COAMFTE programs. While my own program at Alliant International University did better than the state average, much of the COAMFTE benefit seems to come from the strength of the University of San Diego. (I have a more detailed exploration of the link between program accreditation and licensing exam success in press at the Journal of Marital and Family Therapy.)

    Table 2: COAMFTE-accredited programs*, California MFT Standard Written Exam, 1/1/2004-6/30/2009
    * – Accredited as of 1/1/2004.

  • Size doesn’t matter. Graduates of smaller programs did no better or worse overall than graduates of bigger programs. And the biggest programs were not necessarily the best. Far from it, in fact. Graduates of National University, the state’s largest MFT program by number of examinees, performed well below state averages on the Standard Written Exam:

    Table 3: Most first-time examinees, California MFT Standard Written Exam, 1/1/2004-6/30/2009

These findings are only the beginning, of course. There is a lot to be gleaned from the available data, for programs and prospective students alike. I’ll be posting additional comparisons here in the coming weeks to illustrate more of the interesting (I hope!) things we found.

One big plus about working with BBS data is that it’s all public information. So I feel an obligation to make sure you can review it, call out any errors you find, and do additional research with it as you see fit. All of the information on which these tables were based is available now at www.MFTeducation.com. There you will find the BBS source documents we compiled, as well as a searchable database so you can compare your program with others around the state. Your comments and suggestions are always welcome; I hope this is a useful resource!

Coming in Part II: Comparing for-profit programs with not-for-profit.

Can MFT interns pay for supervision?

Aaron Feldman is frustrated. He’s spent several months telling the BBS, AAMFT-CA, CAMFT, and anyone else who will listen that he can’t run a therapy business legally and be successful in the state of California. No one has told him he’s wrong.

The problem, which Aaron is trying to conquer while others mostly ignore, is California’s labor law. Depending on whom you ask, it might prohibit marriage and family therapist interns from paying their employers for supervision; if it actually does (CAMFT argues otherwise), then a significant number of clinics around the state would be in violation.

But the supervision issue could be only the beginning. Mental health clinics are no different from any other business in the eyes of state labor law. Yet when was the last time a clinic required its therapists to take a paid 10-minute rest break for roughly every four hours worked, or mandated a lunch break for shifts lasting beyond six hours? Each of those is a labor law requirement.

At the October 2009 meeting of the Board of Behavioral Sciences (BBS), the licensing board’s counsel was clear: Those who employ MFT interns are subject to labor laws just like any other employer, and “there is no scenario under which it would be appropriate for a supervisee to pay an employer for supervision.” The BBS has since elaborated on that position here. CAMFT, for its part, has put forth a contrasting interpretation of the law, arguing that it is perfectly legal for an intern to pay their employer for supervision, as long as the terms of the arrangement are agreed to by both parties, in writing, in advance.

It is the Labor Board’s opinion — or perhaps ultimately a judge’s — that matters. And so far, the Labor Board has chosen to stay away from the issue, politely rebuffing requests to attend BBS meetings or issue any written statement on the question. For the Labor Board to address this issue directly, it appears, it will need to receive a complaint from a disgruntled supervisee.

And that has been Aaron’s frustration. He doesn’t want to run his clinic in constant fear of becoming the test case. CAMFT has hinted that it might try to clear up the issue through legislation, but the odds of passing a bill that looks in any way like a weakening of, or exemption from, state labor laws are slim.

Without a clear set of guidelines, what is an employer or agency to do? The lowest-risk approach appears to be to abide by the BBS’s stated opinion: Assume MFT interns, when they are employed (as opposed to volunteering), are subject to all labor laws, and act accordingly. In practical terms, that means going the extra mile to make sure interns take adequate breaks, not having interns pay for supervision, and ensuring that, however interns’ pay is computed, it works out to at least minimum wage once all of the intern’s time (including time spent in supervision, marketing, administrative tasks, etc.) is taken into account. (Of course, I’m not a lawyer, so please do not construe this as legal advice; any employer with questions about labor law or its application should consult an attorney.)
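
To make that last point about minimum wage concrete, here is a minimal sketch (in Python, with entirely made-up numbers, and again not legal advice) of the kind of back-of-the-envelope check an employer might run:

    # Illustration only, with hypothetical figures. A rough "all hours counted"
    # check: does an intern's weekly pay work out to at least minimum wage once
    # supervision, marketing, and administrative time are included?
    client_hours = 20        # hours spent seeing clients in a week (hypothetical)
    supervision_hours = 2    # time spent in required supervision
    other_hours = 5          # marketing, paperwork, administrative tasks
    weekly_pay = 400.00      # what the intern is actually paid for the week
    minimum_wage = 8.00      # California's hourly minimum at the time of writing

    total_hours = client_hours + supervision_hours + other_hours
    effective_rate = weekly_pay / total_hours
    print(f"Effective hourly rate: ${effective_rate:.2f} for {total_hours} hours")

    if effective_rate < minimum_wage:
        print("Below minimum wage once all hours are counted -- a potential labor law problem.")
    else:
        print("At or above minimum wage for all hours worked.")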

The problem for Aaron, and anyone else trying to run a therapy business honestly, is that even if he plays by all of the labor law rules, plenty of other employers will not. If Aaron abides by the labor law, he will need to either pay his employees less, or charge his clients more, than similar clinics. The extra burden could be enough to make his clinic, as a business, unsustainable. Until at least one disgruntled intern makes a test case out of it, those who disregard the labor law appear to be at a competitive advantage.