Survey Says...
Finding out if you are successful is easy for some things: did your football team win? Check the score. Did people like your product? Check the sales or reviews. But measuring the success of your strategic communications efforts can be more nebulous, particularly when the campaign you are running is top-of-mind advertising rather than one targeted to a specific event or outcome.
How we as strategic communications professionals measure our results is the topic for this week, and in my blog post, I will break down one of those measurement tools: the survey.
At my current employer, we use surveys to gauge the annual results of each department among the relevant constituents. This is my first time using any type of measurement device in the workplace; at my previous place of employment, our success or failure was measured by whether we reached our fundraising goal at the end of the campaign. But at a community college, the public relations and marketing efforts we put out are more difficult to define as successes or failures. While you can measure registration and application campaigns by counting the students who applied or registered, brand awareness campaigns are harder to measure.
That’s where a survey can prove a useful tool.
Melissa Hurley of PR News wrote an article about using surveys to gauge the effectiveness of your public relations or marketing efforts. According to Hurley, “a thoughtful, well-engineered survey can be an excellent vehicle for generating media opportunities and gathering data to support your organization’s key messages” (Hurley, 2013).
The key steps in a successful survey are planning, developing, and launching. Hurley breaks down each step in her article, beginning with planning.
Phase One – Plan
Determine the Goals: Start by developing a theme or story you’d like to tell. Make sure every survey question is designed to help you tell that overall story, regardless of whether the results come back in your favor or not (Hurley, 2013).
Our surveys at my current employer are designed to gauge the effectiveness of the PR and Marketing Department. Each department on campus must have strategic goals set each year that can be measured, and the survey is the instrument. So the goal of our current surveys is to assess whether we met those goals.
However, if we were to conduct a survey of the general public to assess our marketing and public relations effectiveness, our goal would be to see how favorably the community viewed our college and what our level of brand awareness was in the community.
Define Success: What are the results you hope to see? Discuss the data that you want to gather and draft anticipated proof points. Brainstorm ways you can package the information you gather to share with the media (Hurley, 2013).
For our current surveys, we will define success as 90% of those surveyed having a favorable opinion of our department’s public relations and marketing efforts. That threshold is based on the targets we set in our strategic plan.
For a new community survey, first-year success would be measured by participation. To my knowledge, we have not conducted a community-wide survey before, so I would consider it successful if we were able to reach a certain percentage of the community. According to Customer Thermometer, a survey response rate of 50% or higher is considered excellent, while a range of 5-30% is the most common (Willott, 2019).
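To make those benchmarks concrete, here is a minimal sketch, with entirely hypothetical numbers, of how a response rate could be calculated and compared against the figures Willott cites:

```python
# Minimal sketch with hypothetical numbers: calculating a survey response rate
# and comparing it to the benchmarks cited by Customer Thermometer
# (5-30% is most common; 50% or higher is excellent).

invitations_sent = 4000   # hypothetical: survey invitations distributed
responses_received = 600  # hypothetical: completed surveys returned

response_rate = responses_received / invitations_sent * 100

if response_rate >= 50:
    label = "excellent (50% or higher)"
elif 5 <= response_rate <= 30:
    label = "within the most common 5-30% range"
else:
    label = "outside the ranges cited"

print(f"Response rate: {response_rate:.1f}% - {label}")
```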
Identify the Audience: Decide who will help you reach your survey goals. Customers or unbiased individuals? Consumers or professionals? This will ensure that your survey reaches the right people, as well as help clarify your results (Hurley, 2013).
Currently, our audience is internal. We conduct a survey among faculty and staff in the fall, and among students in the spring. They are consumers who are biased toward our school because they attend or work here.
A community survey’s target audience would be high school students and community members ages 18-40; that age range makes up 93% of our current students.
Select the Platform: The best thing about running a survey? It doesn’t have to be expensive. Many low-cost or free options exist, such as Survey Monkey (Hurley, 2013).
Internally, we use a platform that is typically used in institutional settings. (In fact, I took a Troy survey recently that was the same platform that we use.)
I would have to research whether that platform could be used externally, but other organizations I have been involved with have used Survey Monkey with good results. It is an excellent, inexpensive option.
In 2021, it goes without saying (but here I am saying it anyway) that the platform would be digital and mobile device compatible.
The second phase after planning is developing.
Phase Two – Develop
Construct the Questions: Craft the survey to ensure that questions are phrased to yield interesting data without leaning towards bias (in terms of industry, profession, location or age). If you’re using a survey vendor, a representative can also take a look at your survey and provide advice on phrasing (Hurley, 2013).
One thing I was able to do in my new role was to revamp our current faculty survey. It included several questions that were simply not relevant to the 90% favorability goal we are measuring against, and others solicited information that was not useful to our marketing efforts. Since it was an internal survey, I instead included more questions about the process faculty and staff go through when working with our department. I have to say I am very excited to see the results.
If an external survey were conducted, we would include questions gauging respondents' awareness of the College, awareness of our program offerings, and likelihood of attending. We could also use the survey to ascertain where respondents go for information; that could later be used to target those specific areas or platforms in future marketing efforts.
Build the Answers: Instead of allowing respondents to select “all that apply” or the “top” priorities, ask them to rank items in order of importance. Ranking items will help you gain more interesting data and valuable insight into what your respondents care about the most (Hurley, 2013).
This is effective only if your goal is to see how the options rank against one another; a forced ranking reveals relative priority rather than whether each item matters on its own. A small sketch of how ranked responses could be rolled up appears below.
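To illustrate why ranked answers yield richer data than "all that apply" checkboxes, here is a small sketch, using hypothetical items and responses, of how rankings could be aggregated into an average rank per item:

```python
# Minimal sketch: aggregating ranked responses by average rank.
# The items and rankings below are hypothetical.

from statistics import mean

# Each respondent ranks the items 1 (most important) to 3 (least important).
rankings = [
    {"Affordability": 1, "Program offerings": 2, "Campus location": 3},
    {"Affordability": 2, "Program offerings": 1, "Campus location": 3},
    {"Affordability": 1, "Program offerings": 3, "Campus location": 2},
]

items = rankings[0].keys()
average_rank = {item: mean(r[item] for r in rankings) for item in items}

# Lower average rank = higher priority for respondents.
for item, rank in sorted(average_rank.items(), key=lambda kv: kv[1]):
    print(f"{item}: average rank {rank:.2f}")
```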
Consider the Structure: When developing the content, take into account the flow and order of the questions and ultimately how you’d like to open and close the survey. Be mindful of your audience’s time; you don’t want to receive incomplete surveys at the expense of the survey length (Hurley, 2013).
Our current surveys can be completed in about five minutes. Customer Thermometer also stresses brevity, recommending that surveys stay short, with only a few questions and a few answer choices (Willott, 2019).
An external survey would likely be the same length to be respectful of the respondents’ time.
Cover Demographics: Everything from a respondent’s professional title to their location can help uncover additional demographic-based trends when you do your final analysis. Don’t neglect these questions when developing the survey (Hurley, 2013).
We do not use our internal surveys to gather demographic data, as we have that for our current faculty/staff and students already.
For an external survey, including location would be key so that we could use the results to target specific schools or districts in our marketing.
After you have done all your planning and development, it’s time to go live with your survey.
Phase Three – Launch and Publicize
Run a Beta Test: In order to weed out design flaws or tricky questions, have an internal team take the survey (Hurley, 2013).
For our internal survey, we only submit the questions/answers and another department is responsible for building the survey. Our department takes the survey first to find any issues with flow, incorrect selection options, or any other technical issues. We would do the same with a community-wide survey.
Leverage Analysis: Data analysis tools can yield a whole new perspective on your results and surface unique trends (Hurley, 2013).
After looking at the results of last year’s survey, I have already been able to use the data collected in decision-making.
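As a rough sketch of what that analysis might look like, the snippet below uses hypothetical column names and ratings to show how a data analysis library such as pandas could turn raw responses into a per-group favorability figure to check against the 90% goal:

```python
# Sketch of the kind of analysis a data tool makes easy
# (column names and ratings are hypothetical).
import pandas as pd

# Hypothetical raw export: one row per response.
responses = pd.DataFrame({
    "group":  ["faculty", "faculty", "staff", "staff", "staff"],
    "rating": [5, 4, 3, 5, 2],  # 1-5 scale; 4 or 5 counts as favorable
})

responses["favorable"] = responses["rating"] >= 4
summary = responses.groupby("group")["favorable"].mean() * 100

# Percent favorable by respondent group, to compare against the 90% goal.
print(summary.round(1))
```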
Package the Results: Depending on the data your survey yields, you may be able to choose from a wide variety of internal and external formats for the content. Options could include a company white paper, formal report, webinar, or infographic. Additionally, consider who will tell your survey’s story (Hurley, 2013).
As director of my department, I would be responsible for telling our data’s story. The format we currently use to distribute the data is a spreadsheet. I would like to convert that into a more easily readable infographic with next year’s results.
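As one possible starting point for that infographic, here is a hedged sketch using matplotlib, with hypothetical question labels and percentages, that turns the spreadsheet summary into a single readable chart:

```python
# Minimal sketch: turning the spreadsheet summary into a simple chart
# (question labels and percentages are hypothetical).
import matplotlib.pyplot as plt

questions = ["Timeliness", "Quality of materials", "Communication", "Overall satisfaction"]
percent_favorable = [92, 88, 95, 91]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(questions, percent_favorable, color="steelblue")
ax.axvline(90, color="gray", linestyle="--", label="90% goal")
ax.set_xlabel("Percent favorable")
ax.set_xlim(0, 100)
ax.legend()
fig.tight_layout()
fig.savefig("survey_summary.png", dpi=150)
```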
Deliver the Data: Consider the most impactful way to release the survey results. Should you provide them as an exclusive to a top-tier reporter, announce them via a press release or blog post, or time them with a particular event or news item? Work to evaluate the biggest PR opportunity (Hurley, 2013).
To my knowledge, we have not released our internal results college-wide. That would potentially be a helpful tool to let our peers know more about the inner workings of our department.
If we conducted an external survey and the results were indeed excellent, they could be released to our local print and television media.
As mentioned before, this is my first experience receiving feedback from a survey. Last year’s results have been eye-opening, and I am looking forward to using this year’s results to create new opportunities based on hard data.
References:
Hurley, M. (2013, January 15). The PR Pro’s Guide to Successful Surveys—A Checklist. PR News. https://www.prnewsonline.com/the-pr-pros-guide-to-successful-surveys-a-checklist/
Willott, L. (2019, July 12). Average Survey Response Rate – What You Need to Know. Customer Thermometer. https://www.customerthermometer.com/customer-surveys/average-survey-response-rate/