2011 Client Satisfaction Survey Executive Summary

Prepared for:
Office of the Commissioner for Federal Judicial Affairs Canada

Prepared by:
The Strategic Counsel

The Strategic Counsel is pleased to present the highlights from a recent client satisfaction survey undertaken on behalf of the Office of the Commissioner for Federal Judicial Affairs (FJA). This study is a follow-up to the original benchmark client satisfaction survey conducted in the fall of 2008. The surveys are a mechanism by which FJA can formally obtain feedback from its clients, federally appointed judges, regarding their satisfaction with the service provided by the organization, both overall and across specific service areas. While some new areas were explored in the 2011 survey, many of the results track those from the 2008 study, highlighting any changes in satisfaction levels and service from FJA, and continuing to identify possible areas for improvement.

Research Objectives and Approach

The survey is just one of several tools FJA uses to actively listen to its clients and monitor the judicial environment. Feedback through these surveys permits FJA to better understand its clients' administrative needs, identifying if, and where, judges perceive service gaps to exist. Conducting such surveys on a regular basis assists FJA in planning and delivering service improvements to its clients, as mandated by the Treasury Board Secretariat's Management Accountability Framework (MAF).

The questionnaire that was sent to judges was based on the Common Measurement Tool (CMT), a standardized client satisfaction measurement tool developed specifically for public sector organizations, to permit evaluation of results against normative data. Some modifications were made to the CMT in the 2008 benchmark survey to reflect the particular requirements of FJA and the array of services offered. These modifications have been retained, for tracking purposes, in the 2011 questionnaire.

All judges received preliminary communications about the survey, and were given the option of completing the survey either online (via an email-based web link) or on paper. A total of 936 invitations were sent out on March 29, 2011. The online survey was active for a period of four weeks following this date, with the cut-off being April 22, 2011. Any mail responses received prior to that date were also included in the final count of completed surveys.

A total of 564 completed surveys (175 online, 389 in hard copy) were received prior to the deadline, yielding an overall response rate of 60%, a slight improvement over the 2008 response rate of 57%. This is a reasonable and expected response rate for such surveys given the time pressures, schedules and availability of the client population. At an organization-wide level, the results are accurate within +/- 2.6 percentage points, 19 times out of 20.
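The quoted precision is consistent with the standard margin-of-error formula for a proportion, with a finite population correction applied because 564 of the 936 invited judges responded. A minimal sketch follows (the function name and the p = 0.5 maximum-variance assumption are ours, not from the report):

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval, in percentage points.

    Assumes maximum variance (p = 0.5) and applies a finite population
    correction, since the sample is a large share of the population.
    """
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
    return 100 * z * se * fpc

moe = margin_of_error(n=564, population=936)
print(f"+/- {moe:.1f} percentage points")  # +/- 2.6
```

Without the finite population correction the figure would be roughly +/- 4.1 points; the correction reflects that 60% of the entire client population responded.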

Please note that the questions included on the survey provided an option for respondents to answer “don't know,” “not applicable” or simply to decline to respond (e.g. DK/NA/Ref). Where respondents were asked to rate their levels of agreement/disagreement or satisfaction/dissatisfaction, the reported results have been re-calculated to exclude those who responded in this manner. This is common practice when reporting on such measures in client satisfaction surveys, as the general tendency among those who expressed an opinion offers the best overall picture of satisfaction levels, and of the gap between stated importance and performance on a particular dimension of service.
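The re-basing step described above can be sketched as follows; the counts here are illustrative only, not actual survey data:

```python
# Re-base a satisfaction question so percentages are calculated only over
# respondents who expressed an opinion, excluding DK/NA/Ref responses.
responses = {  # hypothetical raw counts for one survey item
    "very satisfied": 300,
    "somewhat satisfied": 180,
    "somewhat dissatisfied": 30,
    "very dissatisfied": 14,
    "DK/NA/Ref": 40,
}

valid = {k: v for k, v in responses.items() if k != "DK/NA/Ref"}
base = sum(valid.values())  # only those who expressed an opinion

rebased = {k: round(100 * v / base) for k, v in valid.items()}
satisfied = rebased["very satisfied"] + rebased["somewhat satisfied"]
print(rebased)
print(f"{satisfied}% satisfied")  # 91% satisfied
```

The effect is simply a change of denominator: the 40 DK/NA/Ref responses are dropped before percentages are computed, so the reported shares sum to 100% across expressed opinions.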

Key Findings

Judges continue to report high levels of satisfaction with FJA and with all of its key functions or service areas.

It is rare that an organization receives such an overwhelmingly positive assessment of the services provided to staff and/or clients. However, judges continue to rate FJA highly, with nine-in-ten (91%) saying they are either “very” or “somewhat satisfied” with the services provided by FJA. This assessment is essentially unchanged from 2008 (93%). While four-in-five judges (83%) also rate FJA as doing a “good” or “excellent” job in fulfilling its mission to support and promote judicial independence by providing services to the Canadian judiciary, there has been a slight drop on this measure since 2008 (88%). Nevertheless, both these ratings suggest that judges feel they are generally well served by FJA.

Satisfaction scores were also consistently high across specific service areas and, with the exception of two areas, relatively unchanged from 2008 scores. The vast majority of judges who took language training services offered by FJA say they are satisfied with this service (88%, versus 87% in 2008). Similar numbers express satisfaction with the services provided by the Finance and Administration section (86%), representing a slight improvement from 2008 (82%). FJA also receives reasonably good satisfaction scores for the quality of service provided by Compensation, Benefits and Pension Services (78%, versus 79% in 2008).

Finally, while about three-quarters of judges (76%) are satisfied with the JUDICOM system, this does represent a dip in satisfaction ratings from 2008 (84%). It should be noted that this decline appears to reflect a shift toward a more neutral assessment rather than from satisfaction to dissatisfaction, as very few (4%) declared themselves dissatisfied on this item. Consistent with this finding, there has been a drop in the number of judges agreeing that JUDICOM is an important tool (from 86% in 2008 to 75% in 2011) or that JUDICOM helps them to better communicate and collaborate with other federally appointed judges (from 81% in 2008 to 72% in 2011). At the same time, considerable numbers (45%) use JUDICOM daily, while 61% use it at least once a week.

Overall Satisfaction Chart

On most dimensions of service, ratings of FJA are equally high. However, identifying points of contact continues to be an issue and represents an area of opportunity for FJA to address a service gap.

Judges rated FJA as being responsive to their needs (90% agreed with a statement to this effect). About the same number agreed that “the staff who provide services at FJA do an excellent job” (89%). FJA staff were also commended for their courteousness (94%), helpfulness (93%) and professionalism (93%). All of these scores represent strongly positive ratings and suggest that FJA's commitment to client service continues to be a core organizational value. All of these ratings are virtually unchanged over the three-year period from 2008 to 2011.

However, one area remains somewhat problematic and we note a marked drop in the level of agreement with the statement “when I need service from FJA, I know where to get it.” In 2008, over four-in-five judges (85%) agreed with this statement. This percentage has now dropped 13 points to just under three-quarters (72%).

This trend is also apparent in responses on a related item, underscoring a possible service gap. While 88% of judges say it is important that it be clear to them who they should contact if they have a problem, the percentage expressing satisfaction on this item is just 62%, a 26-point gap (the satisfaction score subtracted from the importance score). The current ratings for this item, and the size of the importance-satisfaction gap (i.e. the service gap), are similar to those found in 2008, suggesting that identifying appropriate contact points continues to be a challenge for some judges.

Service Satisfaction Chart

Although many judges do not encounter any issues with service from FJA, it is notable that, from a list of possible issues judges may have experienced when attempting to obtain services from FJA, one-quarter (24%) indicated that no one had taken the time to explain things to them (a considerable increase from the 4% who experienced this problem in 2008). Another 18% reported they had encountered some problem with finding out who to contact.

When asked to choose the top three from a list of 11 specific areas in which FJA could improve its service, judges tended to focus on the following, as they did in 2008: “the time it takes to obtain service from FJA” (44%), “the steps I have to take if I have a problem” (42%), and “the number of people I need to deal with to get service at FJA” (30%).

Conclusion

The results of the 2011 survey show that judges remain generally satisfied with the service FJA provides. Indeed, FJA receives high satisfaction ratings overall and across many aspects and areas of service.

The findings suggest the organization is doing an excellent job providing administrative services and support to its clients, with many satisfaction scores at 80% or above. For the most part, ratings in 2011 are consistent with those found in 2008, suggesting that FJA has been able to maintain a high level and quality of service to clients.

Some areas remain challenging, and continued efforts should focus on clarifying points of contact, as well as on improving turnaround times and simplifying the steps involved in resolving issues.