Thursday, February 4, 2010

Employee Satisfaction and Other Surveys

Most employers, from time to time, conduct employee surveys to determine employees' attitudes, opinions, motivation, training needs, satisfaction, and engagement levels. But how valid and reliable are the survey results? The answer to that question can be very technical; I do not intend to explore the technical aspects of survey design, statistical validation, and reliability analysis here. Rather, I plan to take an intuitive approach to survey design and administration.

Pre-prepared and standardized surveys that allow for external comparisons can be obtained from any number of HR consulting firms. Survey templates are available online from numerous websites; a Google search will turn up dozens. If an off-the-shelf survey does not meet your requirements, HR consulting firms will be more than happy to build a customized survey for your specific needs, at a steep price. You can certainly build your own survey with a small amount of effort. Survey Monkey, Zoomerang, and Survey Gizmo are but three of the websites that allow you to build and publish online surveys and analyze the results. So you have the tools, you know what you want to survey and whom; it seems like a simple process to just go do it, right?

The issue is survey participation: how do you get the target participants to complete the survey? Except for paid participation, surveys have notoriously low completion rates. One survey provider recently bragged that it had increased its completion rate from 31.4% to 38.9% in just three years. Oh, by the way, the company had been in business for over 27 years when it achieved this level. If I get a 25% participation rate, I consider myself lucky.

Validity has everything to do with the statistical distribution of those completing the survey. The completed surveys should match the statistical distribution of the workforce being sampled. Consider a survey of a workforce that is equally female and male. If the returned surveys are more heavily weighted towards males, how can the organization formulate policy based on such biased results? If the completed surveys are disproportionately weighted towards one or two age groups, any actions taken by the organization may lack credibility with the targeted workforce. Any incentive designed to encourage participation would need a high degree of anonymity, or participants would be concerned about a lack of privacy. Incentives could also bias the results if participants answer questions in a manner other than their own in order to please or displease management.
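For readers who want to put a number on "does the respondent mix match the workforce," a simple chi-square goodness-of-fit check will do. This is a minimal sketch, not a full validity analysis; the workforce mix and response counts below are hypothetical.

```python
# Minimal sketch: chi-square goodness-of-fit check of the respondent
# mix against the known workforce mix. All counts are hypothetical.

def chi_square_stat(observed, expected):
    """Sum of (O - E)^2 / E over all categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

workforce_share = {"female": 0.5, "male": 0.5}   # known workforce mix
responses = {"female": 30, "male": 70}           # hypothetical returns

total = sum(responses.values())
observed = [responses[g] for g in workforce_share]
expected = [workforce_share[g] * total for g in workforce_share]

stat = chi_square_stat(observed, expected)
# Critical value for 1 degree of freedom at the 5% level is 3.841.
if stat > 3.841:
    print(f"chi-square = {stat:.1f}: respondent mix differs from workforce")
else:
    print(f"chi-square = {stat:.1f}: no evidence of bias")
```

With 70 of 100 returns coming from men in a 50/50 workforce, the statistic (16.0) far exceeds the 5% critical value, so you would flag the results as biased before basing policy on them.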

Not only must the returned surveys statistically match the workforce being sampled, but enough surveys must also be returned to yield a statistically significant response rate. How can an organization get both a statistically balanced sample and a significant response rate from voluntary participation? There is no guarantee, but a well-crafted communications plan will go a long way towards achieving that goal. This means crafting the survey, and all communications associated with it, in a manner directed at the intended target audience. The level of the language, the format of the documents (printed vs. electronic), and the method and timing of distribution must all be considered. An online survey directed at a population of non-computer users is not going to get the desired response rate. A printed survey written at a twelfth-grade language level for employees who barely speak English may be met with skepticism, distrust, or outright anger.
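If you want a rough target for "enough surveys returned," the standard sample size formula for a proportion, with a finite population correction, gives one. This is a sketch under textbook assumptions (95% confidence, the most conservative 50/50 response split); the 500-person workforce is hypothetical.

```python
# Minimal sketch: minimum completed surveys for a given margin of error.
# Assumes 95% confidence (z = 1.96) and the worst-case split p = 0.5.
import math

def required_responses(population, margin=0.05, z=1.96, p=0.5):
    """Sample size for a proportion, with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

# For a hypothetical 500-person workforce at a +/-5% margin of error:
print(required_responses(500))   # 218 completed surveys
```

Note what this implies for the low completion rates discussed above: at a 25% participation rate, a 500-person workforce returns only about 125 surveys, well short of the 218 needed for a 5% margin of error.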

Professional survey developers take the employees' primary language, educational level, culture, demographics, and other factors into consideration when designing a survey. Should you find yourself in the role of survey designer, you will need to consider the same factors as you build the survey document and its communications. If possible, test your survey with a small group of employees (a focus group) representing a cross section of the larger employee population. This will give you some sense of how the survey, its communications lead-in, and its instructions will be perceived by the larger group. If need be, the survey and its communications materials can be updated based on feedback from the focus group.

If this survey is going to be repeated periodically in the future, you may want to include a “comments” section to allow for participant feedback on the survey itself. This feedback may prove helpful in increasing participation in future surveys.

1 comment:

  1. Sometimes, follow-up is required to understand what is needed to make scores change. For example, if you score low on internal communications, you need to get to the "why" and the "how to improve." This data is hard to get at in a survey format, so follow-up is required.