Call for Change group offers non-rebuttal rebuttal

The MFT Call for Change group responds to my previous post, where I highlighted several of their erroneous statements about California.

The group calling themselves “MFTs Call for Change” (CFC) has posted a lengthy rebuttal to my earlier post criticizing CFC misstatements about California, specifically in areas related to the LPCC license and its development here.

Notably, they rarely, if ever, challenge my statements of fact. They claim that my post included “misinformation,” but their arguments are more often of the moving-the-target (“yes, but”) variety than they are factual disagreements. And they add at least one factual error of their own to the list.

  • CFC criticizes my statement that “When CAMFT was negotiating changes to various versions of the LPCC bill, they sought to make MFTs and LPCCs as indistinguishable as possible.” They would prefer I label this as my own belief, or an opinion of AAMFT-CA. But there’s no need. I saw, firsthand, CAMFT’s opposition to language supporting distinctiveness of professions during the negotiation process. Remember, CAMFT wanted grandparenting to be automatic for licensed MFTs based just on coursework (this version of the bill allowed exactly that), and indeed, CAMFT has continued to argue there are no meaningful differences in practice between the MFT and LPCC professions (as CAMFT themselves said, they believe “LMFTs and LCSWs may do in practice everything LPCCs may do”) — which would make the licenses effectively indistinguishable. That’s not my belief; that’s an argument CAMFT itself is continuing to make and act upon.
  • Along similar lines, CFC calls my discussion of CAMFT’s lawsuit against the BBS “patently irresponsible” because… well, I can’t tell why, exactly. I’m not even sure which part they’re taking issue with. CAMFT sued the BBS to try to make the “gap exam” for MFT grandparenting go away, based on their belief that the practices of the professions are indistinguishable. They have very clearly said so. That the lawsuit attempted to use technical means (like the BBS’s failure to consult with a state agency on exams, the one point of three in the lawsuit on which CAMFT won) to reach their desired ends (no gap exam) does not change those desired ends or the publicly-stated rationale behind them.
  • AAMFT-CA and AAMFT have not been “against the LPCC bill since its inception,” as CFC newly and falsely claims. Primary sources here tell the tale. California counselor legislation was first introduced in February 2005. In November 2005, nine months later, I first spoke to a legislative committee about AAMFT-CA’s concerns with bill language. Even then, AAMFT-CA took no formal position, as we understood the bill would be further amended. AAMFT-CA only formally opposed LPCC legislation in 2007 (this legislative committee analysis is the first mention of AAMFT-CA opposition), after it became clear that the legislation was moving in a direction that would hurt the MFT profession. Furthermore, in 2009, once we worked out the compromise language that became the LPCC law, AAMFT-CA’s opposition was removed [page 2], helping the bill pass. The larger AAMFT never took any formal position at all on the bill.
  • In discussing Kim Madsen, the BBS Executive Director, the CFC rebuttal suggests that in my post, “The reader has been lead [sic] to believe Ms. Madsen would be less than forthcoming” when discussing licensure issues. Nonsense. Ms. Madsen has been, in my experience, extremely professional, highly ethical, and very forthcoming, even when we have disagreed on policy. In my earlier post, what I suggested was that CFC, not Ms. Madsen, was being less than forthcoming by leaving out important details. This should have been evident in my preface “I suspect what Ms. Madsen said was…” Given my experiences with each of them, I trust her to be complete and forthcoming much more than I presently trust CFC to do so.

As I said previously, the CFC group seems to be well-meaning. I just don’t understand their dogged pursuit of this line of criticism. It is not supported by facts, and makes CFC look more interested in finding fault with AAMFT than actually supporting or developing the profession.

Call for Change group erroneously calls California licensure threat “unsubstantiated rumor”

A group of well-meaning MFTs has been trying to make AAMFT more transparent and accountable. But their latest report about the LPCC license in California, which claims to have found deception by AAMFT, gets the important facts wrong.

See updates below.

If you’re a member of AAMFT or have been connected in any way with the “Call for Change,” you may have received an email with an unsettling headline:

Unsubstantiated Rumor Influences AAMFT’s Strategic Plan

The email goes on to say that this “unsubstantiated rumor” was the notion that there was a threat to MFT licensure in California connected to the development of the state’s Professional Clinical Counselor (LPCC) license. You can read the CFC’s complete report online: CA LPC Deception Revealed.

Because the threat posed by LPCC licensure was all “unsubstantiated rumor,” the CFC logic goes, perhaps someone at AAMFT knowingly ginned it up for political reasons:

What could be the motives behind generating such rumors, fear, and speculation within our membership in relation to a document and process as significant as a long range Strategic Plan? Since the truth of this matter was already known by AAMFT leadership, could the reason be political?

For anyone unfamiliar, the CFC group is made up largely of well-meaning MFTs who want a more open, accountable professional association. That’s a laudable goal, and they have pushed AAMFT into making some much-needed changes, especially around organizational transparency. There is surely more work to be done on that front, and for that reason, a subgroup like CFC can sometimes ultimately be a positive thing for the larger association they criticize.

In this case, though, CFC gets the important facts wrong, and loses a great deal of its own credibility in the process. It starts by reporting that Kim Madsen, the California Board of Behavioral Sciences Executive Director, told them “there had been no current or past discussions regarding replacing the LMFT license in California with the LPC license” (I’m quoting the CFC’s paraphrase of Ms. Madsen). I of course was not in on this phone call, but I suspect what Ms. Madsen said was that no such discussion took place at the BBS. And, with that kind-of-important detail, that’s true — the BBS never had a public meeting or vote where replacing the MFT license was discussed or voted on.

For the CFC to extrapolate that to the notion that no one in California ever discussed eliminating the MFT license, however, is a rather significant error. CAMFT (which is not affiliated with AAMFT or its California Division) specifically discussed replacing the MFT license. You can see it in page 14 of this California legislative committee report, and the same text was in an email to members that CAMFT sent in May 2007 (emphasis mine):

CAMFT states, “At some time in the future, we project that there will only be one masters level profession in California, with individuals specializing within that license. Thus, those who wish to specialize in systems work will do so; those who wish to specialize in art therapy will do so, etc. The current system with a variety of acronyms is confusing for consumers who just want to be helped and do not perceive greater value from one professional compared to the next.”

When CAMFT was negotiating changes to various versions of the LPCC bill, they sought to make MFTs and LPCCs as indistinguishable as possible. And without distinctiveness between professions, it becomes much harder to argue that distinct licensure should be maintained. Sure enough, the following language was inserted into a version of the 2008 LPCC bill (which ultimately failed; again, emphasis mine):

The Department of Consumer Affairs and the Board of Behavioral Sciences shall collaboratively evaluate the licensing requirements and scope of practice for licensed professional counselors, licensed clinical social workers, licensed educational psychologists, and marriage and family therapists. The evaluation shall include a recommendation on whether or not these licensed professions should become a general license category, and if such a recommendation is made, how it is proposed to occur.

CAMFT projected a one-license future, and that proposal found its way into proposed legislation. These are facts on the public record. They are indisputable. That legislation might have become law had AAMFT-CA and the California Psychological Association not worked diligently in opposition. For CFC to dismiss the threat to California MFT licensure as “unsubstantiated rumor” is clearly and demonstrably incorrect.

CFC goes on to question how CAMFT could possibly have any influence over license transitions, since it is the BBS and not any professional association that enforces licensure standards. This is shockingly naive. CAMFT does a tremendous amount of work (often very good and beneficial) on state legislation, and the licensure standards the BBS exists to enforce are written in state law; the BBS has no right or authority to refuse to enforce a law once it takes effect. Almost every major piece of state legislation that has impacted California MFTs in a generation has had CAMFT’s fingerprints on it. If CAMFT wanted MFTs to become LPCs, they wouldn’t politely ask the BBS to make it happen. They would seek to change the law, and they have the resources to do so. As you can see above, a change in the law is precisely what could have happened.

Even after the final version of the LPCC law passed in California, CAMFT has continued to work hard at making MFTs and LPCCs seem indistinguishable. When even the relatively minor barrier of a “gap exam” for MFTs seeking grandparenting into LPCC licensure arose, CAMFT not only fought it, they sued the BBS — a suit that could have, among other things, led to a court ruling that the professions are not distinct.

Thankfully, CAMFT largely lost that suit. There will be a gap exam, and the BBS is supporting distinctiveness between MFTs and LPCCs in other ways. But for CFC to suggest that there never was a threat to the MFT license in California — or indeed, to even suggest that threat has fully abated now — is simply out of line with reality.

===

Full disclosure: I have served as a consultant to AAMFT in the past (not currently), and still volunteer regularly for the California Division, though I hold no official role there. I was part of the AAMFT-CA negotiating team on California LPCC legislation. I’m a former CAMFT member; their lawsuit against the BBS led me out the door.

===

Updated 3-16-2011: The Call for Change folks have posted a rebuttal to my post here. See: CFC rebuttal // My response

# # #

The image above is the semaphore sign for “error.”

Insights from 5 1/2 years of California MFT license exam data: A defense of underperforming programs

Some MFT programs’ graduates perform poorly on California’s MFT licensing exams. Don’t assume that means the program is of poor quality; there may be good reasons.

We’ve seen that there are huge differences in performance on the California marriage and family therapy licensing exams based on what graduate program the test-takers attended. We’ve also seen that for-profit MFT programs should not be dismissed simply because they aim to make money; Argosy graduates do particularly well on the exams, while University of Phoenix graduates do not. I’ve said before, though, that there are lots of things to consider when choosing a graduate program in MFT, and that graduates’ exam performance should only be one of many such considerations. Indeed, there are some major problems with putting too much stock in exam data.

If you are looking at graduate programs, and are concerned about your prospective MFT program’s exam pass rate, here are three reasons why you may want to ignore the exam data:

  • Programs can and do improve. Exam data reflects students who graduated years earlier. Remember, it takes the average MFT intern in California more than four years to move from graduation to the licensing exams. That number is a bit lower for graduates of COAMFTE-accredited programs, primarily because they do more practicum hours while still in school. Nonetheless, if you are looking at MFT licensing exam data from 2009 and earlier, you will find very little information on anyone who graduated much past 2005, and nothing to tell you which programs have gotten better or worse since then. Consider the recently-COAMFTE-accredited programs at Chapman University and Hope International University. Their national accreditation should arguably make them more appealing (and thus competitive) programs for prospective students and faculty alike. That’s important, and simply is not reflected in currently-available exam data.
  • Programs seek to give students opportunities. Consider for a moment the state’s worst-performing program, according to a table that appeared in Part I of this series: Pacific Oaks College. Based on the pass-rate statistic alone, one might presume that the Pacific Oaks program is not very good. But that conclusion can’t safely be made from that data. Pacific Oaks, over the past few years, has specifically sought to provide opportunities to historically underserved populations, creating cohorts specifically for African-American Family Therapy and Latino Family Therapy. (This outreach is vital: lots of evidence suggests that the mental health workforce is not meeting the needs of minority populations, either in California or around the US.) Students in these cohorts may lack the family, economic, and social support, as well as the earlier educational opportunities, that other students often have. Pacific Oaks goes to great lengths to remediate these earlier deficits, and may be doing more, with less, than programs that start with more economically- and educationally-advantaged students. When financial and accreditation concerns threatened to close the Pacific Oaks program in 2009, I was one of many who stood up in defense of keeping it alive, and I have no reservations about having done so.
  • Programs have no control over what students do after graduation. A program can really only control what happens from the time students enter the program to the time they graduate — and even then, programs have limited control over how well their students prepare themselves. A great supervisor can help an MFT Intern/Associate make up for deficiencies in their education, and help get them ready for licensing exams. Poor supervision may leave the Intern/Associate on their own to prepare, or even offer incorrect information that ultimately harms their chances of passing the exams. And of course, programs have no control over whether their graduates use MFT exam prep programs, although there is little evidence that these prep classes actually impact MFT exam pass rates.

Of course, before you go dismissing a program’s exam pass rate, take some steps to reassure yourself that you are making that decision wisely. If you are considering attending a program whose graduates have not performed well on recent licensing exams, ask the program (1) why, (2) what they’re doing about it, and (3) what evidence they have that they’re getting better. If the program can’t pass that test, then it’s time to wonder whether you would be able to pass yours.

That infidelity-and-income study? Don’t believe it.

A recent study, presented at an American Sociological Association conference to a fawning media reception (NPR / Salon), tells us that men who make less than their wives or live-in girlfriends are five times more likely to cheat. It’s bogus. Here’s why.

While commentators have been stumbling over themselves to determine what the study’s findings mean about gender, marriage, and society, no one seems to be bothering to notice that the study itself appears pretty useless. The major conclusion, linking income and infidelity, has a number of problems, not the least of which is that everyone — myself included — who wasn’t at the conference is relying on a press release and subsequent media reports about it. Such reports are notoriously unreliable, often drawing ideas from generous and/or speculative interpretations of the results rather than the study itself. That said, here are three of the reasons I’m particularly skeptical:

  1. Do the math. The National Longitudinal Survey of Youth, upon which the study is based, followed about 9,000 individuals — surely a healthy sample size. But the infidelity study examined only those who were married or with a live-in partner for more than a year, which is a much smaller subset. And of those, only seven percent of men and three percent of women actually fessed up to cheating during the study’s six-year period. So, let’s be generous and say that two-thirds of the NLSY group met the relationship-status criteria (n=6,000). And we’ll presume that roughly half are of each gender (3,000 men and 3,000 women). That leaves us with about 210 men who have fessed up to infidelity in this survey. Of those, it is not clear from the media reports how many were in situations where the male earned less than his partner; other recent research suggests about a third, or fewer than 80 of those reporting infidelity, were in such a relationship. And remember, we’re being generous because we do not have the actual numbers. To be sure, 210 male cheaters is still a decent sample, and it could be enough to draw meaningful conclusions about links between infidelity and income (among other factors). But it still is not a lot. In fact, it probably is a lot less than the number of participants in the survey who actually cheated. Remember…
     
    Updated 2010-08-20: LiveScience.com (which has more details on the methodology, and as an added bonus, commentary from Stephanie Coontz) is reporting that only 3.8 percent of men, and 1.4 percent of women, admitted to cheating in the study. That’s not exactly true; on average, 3.8 percent of men and 1.4 percent of women admitted to cheating in any given year of the six-year study, at least according to the press release.

  2. …People lie. A major income discrepancy in the relationship may be a good reason for men to simply be more honest about their cheating. Sure, you could argue, if the wife/girlfriend finds out then the gravy train ends. But if the man is in a relationship for the money, and not emotionally committed, why on earth would he lie to an anonymous survey about his cheating? There is little incentive to do so, and no cognitive dissonance to resolve by telling the truth. On the other hand, if he is emotionally engaged, and is in the relationship for reasons other than money, he may find it safer (and more palatable) to hide any previous infidelity. If all that sounds awfully speculative, well, that’s the point. People lie on studies like this, and we do not always know who will be most likely to lie or why. Yet commenters (and, too often, the researchers themselves, as seems to be the case here) treat the findings as truth in spite of their huge flaws, and then seek to divine an explanation.

  3. Account for other factors, like age, education, and religion, and the income-infidelity link vanishes. That inconvenient fact is actually in the press release, but of course, no one is paying attention to it. Does earning more than your man make him more likely to cheat? the chatterers are asking. In a word, no — the income issue appears to (at best, and even this has big holes) correlate with, but not be a cause for, cheating.
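The back-of-envelope arithmetic in point 1 can be sketched as a quick calculation. To be clear, every figure here is one of the post’s generous assumptions (two-thirds partnered, an even gender split, a one-third share of lower-earning men), not the study’s published numbers:

```python
# Rough estimate of the usable male subsample, per the assumptions above.
nlsy_total = 9000                     # participants in the NLSY sample
partnered = nlsy_total * 2 // 3       # assume two-thirds met the relationship criteria
men = partnered // 2                  # assume an even gender split
male_cheaters = round(men * 0.07)     # 7 percent of men admitted infidelity
lower_earning = male_cheaters // 3    # assume roughly a third earned less than their partner

print(partnered, men, male_cheaters, lower_earning)
# prints: 6000 3000 210 70
```

The point of the sketch is just how quickly 9,000 participants shrinks to a few dozen men in the category the headlines were built on.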

The trouble with any study of undesirable behavior that relies on self-reports is that it is impossible to know what we’re really studying — the behavior itself, or the act of reporting it. Only a more carefully (and expensively) constructed study could parse that out. In the meantime, move on. Nothing new to see here.

Blogging the AAMFT Conference: Sunday

The AAMFT Annual Conference wrapped up Sunday morning, as the last die-hards snuck in one last 3-hour workshop before drifting up to the Sacramento airport and back home. I was a presenter on Sunday morning, but only if you use the word “presenter” loosely; I led a panel discussion on licensing exams.

Before you go into the automatic response (which is, I believe, “zzzzzzz”), you should know there was one significant piece of news to come out of it: California’s Board of Behavioral Sciences and the Association of Marital and Family Therapy Regulatory Boards, who develop the California and National MFT Exams respectively, have agreed to collaborate on their next occupational analyses. It’s a small step but a significant one, as it will at least allow for an apples-to-apples comparison of the practice of MFT in California versus around the rest of the country. That kind of comparison would be necessary in any discussion of California eventually moving toward use of the National MFT Exam — something I believe in very strongly.

===

As is the case every year, I leave the conference energized, with many new research ideas, new potential collaborators, and new respect for the amazing work taking place in our field. I also leave Sacramento with renewed appreciation for just how clinically strong our Alliant students are and will be; it wasn’t at all unusual to see the best questions in any workshop — and often the best answers as well — coming from people with orange “Alliant” stickers on their badges. Great work, everyone. See you in Atlanta next September.