WC.com

Thursday, March 29, 2018

Annual Survey 2018

It is that time of year again. For the eleventh year, the Florida Office of Judges of Compensation Claims (OJCC) joins with the Workers' Compensation Section of The Florida Bar to collect your perceptions regarding the OJCC judges and mediators. We will deploy the 2018 survey on Monday, April 2, 2018. If you are an attorney and either a registered OJCC e-filing user or a Section member, you should receive an email invitation. The deployment was announced with an email to all of our registered users this week. 

The OJCC Survey is intended to fulfill two main functions: to provide subjective perceptions and to gather constructive suggestions for change. It is acknowledged that the spectrum of statistical measures published each year in the OJCC Annual Report provides objective indicators of efficiency, effectiveness, and performance. There is an objective demonstration of petition volumes, litigation trends, and timing. But the survey provides an opportunity to move beyond numbers and express perceptions. 

It started in 2007, when perceptions and feelings were being expressed and bench and bar relations were at a nadir. There was a discussion with the Section and a general consensus that practitioners have perceptions about judges, mediators, and how offices perform. Those discussions included thoughts about the objective and statistical measures. The consensus was that a measure of subjective perceptions of performance would augment and supplement those objective measures. And in 2008, the first joint survey was deployed. 

The survey is subjective. Opinions and conclusions: everyone has them. We all perceive the world around us, compare our perceptions to our personal worldview, and draw conclusions. And throughout our lives, we are all likely to be influenced by our personal biases, preconceptions, and perspectives. Bias is something we must each acknowledge, recognize, and strive to overcome. Overcoming it is critical to the survey and would benefit us all in our lives generally. 

Bias may affect the process when we review performance, which is done periodically in most employment settings. When we are asked to evaluate performance, we should remain vigilant about the potential for personal bias to interfere with the process. If we acknowledge and recognize that potential, then we can strive to overcome bias. This is true in the evaluative process and can be similarly applied in our lives generally.  

Learn That (a management website) cautions about five types of bias that can interfere with effective evaluation. "Recency Bias" can cause an evaluator to "over or undervalue short-term events to the detriment of the employee’s long-term performance." In other words, are you allowing your most recent experience with a judge or mediator to overpower or unduly influence perceptions? When you evaluate, are you remaining conscious and respectful of a variety of experiences and interactions?

The "Halo Effect" is similar, but instead of focusing the evaluator on a particular interaction, the halo effect focuses the evaluator on a particular skill. In light of perceptions of a particular strength or weakness, the evaluator may allow her/himself to be influenced regarding the evaluation of other skills, abilities, or performance ("she is so nice"). If someone is "stellar in one area," then we should say so, but that should not keep us from acknowledging room to grow in others. 

The "Central Tendency Bias" is one that most of us will acknowledge we have employed. This describes our tendency to group most people "into middle-level grades while having some extremes in the top and bottom."  This is an expediency, which may make completion of a survey easier and quicker (mark all the "average" boxes), but it may fail to address "all of the contributions and problems of" those evaluated. 

The "Spillover Effect Bias" describes our tendency to measure current performance through the filter of our previous evaluations. That is, we may use someone's results on a prior evaluation as a reference point. This can cause an evaluator to "unfairly weight" factors and perceptions of performance. Previous evaluations considered previous performance. In evaluating current perceptions, the evaluator should consider current performance and ignore previous evaluation outcomes. In the OJCC Survey, that means since the last survey, roughly the last twelve months.

Finally, there are "Personal Biases," and they may be the most challenging. It is not appropriate to evaluate performance based on whether someone is "like me," or even if they are "liked by me." A review of performance should be about effort exerted, focus, dedication, courtesy, professionalism, and outcome. That we dislike someone, are distracted or put off by some personality trait ("I hate the way he . . ."), or know we could personally perform a task better are not the standard. The standard should be whether the person professionally exerts appropriate effort and accomplishes successful outcomes. 

These are but a few of the potential bias effects. Others include "confirmation" bias (unconsciously favoring data that fits pre-existing views), "bandwagon effect" (forming opinions based on perceptions and statements of others), "anchoring" bias (basing all conclusions on some initial data point or conclusion), "selection" bias (results are predetermined by the selection of who will participate), and "reporting" bias (deciding whether to report based upon the outcome). These are more fully discussed here, and the article is a worthy read.

In the end, we will email close to 5,000 invitations to participate in this study. They will go only to attorneys. Some will argue that is "selection" bias, in excluding injured workers, adjusters, and more. There have been a great many discussions of that over the last ten years. While this is certainly "selection bias," the consensus has been that attorneys are the population most likely to be sufficiently familiar with process and performance. It is, after all, the attorneys who interact most frequently in mediations, motions, and hearings. 

And, it is imperative to stress the real strength of the joint survey. This is an opportunity to suggest how someone might do something better. It is easy to disagree with how someone performs a task or process. Criticism is easy. Constructive suggestions are harder. If you do not like a process or procedure, say so. But use the "comment" section to offer constructive suggestions of what you perceive would be better, and explain why you think that result would follow from your idea. 

Finally, remember that everyone can benefit from a compliment. If someone has impressed you with their intellect, patience, analysis, perseverance, commitment, energy, enthusiasm, optimism, courtesy, or otherwise, use the comment section and say so. If someone has disappointed, frustrated, or hurt you, say so. But, if your comment is critical, there is no reason to make it insulting, rude, or demeaning. The survey is anonymous, but that is no reason to make it mean or degrading. 

The results will be published (no "reporting bias") after the survey closes in late April. Time will be required to collate the collected data, but by summer, the 2018 results will join the rest, published in the "publication" section of the OJCC website, under the "survey" tab. It will not be "the" answer to anything, but it will provide insight as to how you perceive this Office. I thank you for taking your valuable time to participate.