You might wonder why this post is here - well, it seems the most accessible for the audience I would like to share it with. So if this seems a little 'off topic' from my usual posts, please forgive me! This is simply my personal, practical experience and in no way reflects my employer's perspective on surveys or the results of said survey.
In April and May 2011 we sent a survey to our alumni from the University. We selected equal numbers of recipients from different age groups among those for whom we have email addresses, with no bias as to country. In the UK, we sent a printed survey to those for whom we only have postal addresses.
We first tested the robustness of the survey by sending it out to alumni who are staff. They came back with many helpful suggestions on how to improve the survey, as well as some feedback of their own (from a different perspective, of course).
The purpose of the survey was to assess our communications methods and messages, our events, and a little on fundraising. We wanted to make sure we were 'doing the right thing' and to improve where we could.
We used an integrated survey tool that is part of our web system, NetCommunity, because it links directly to our database, Raiser's Edge. This had some advantages, but many disadvantages too.
For the email survey we did not want to ask questions that alumni would already answer when registering on the site, so we did not collect employment information there. For the postal survey we collected some of that information, but entering it into the system required manual input.
We had a little criticism that the questionnaire didn't pick up much new data, but its purpose was opinion-based, not data collection. Make sure you don't ask too much of one survey.
Strengths: links directly into Raiser's Edge, and encouraged more individuals to register online (from the email survey).
Weaknesses: the postal survey meant a lot of manual data input, and interrogation of the data is extremely basic. Though we could record who completed a survey, to find out what they actually answered we had to input manual attributes. To get anything more meaningful than the 'top line' response (see picture), you need to export the data into a spreadsheet or some other tool and manipulate it to find the real meat of your results (the way we did it, anyway). Very time-consuming.
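If a spreadsheet feels unwieldy, even a few lines of a scripting language can tally an exported file. A minimal sketch in Python, assuming a hypothetical CSV export (the column names and answers here are invented; a real NetCommunity export would have its own layout):

```python
import csv
import io
from collections import Counter

# Hypothetical extract of an exported survey file; a real export
# would have many more rows and its own column names.
exported = """respondent_id,age_group,events_rating
1,18-25,Good
2,26-40,Excellent
3,41-60,Good
4,60+,Poor
5,41-60,Good
"""

rows = list(csv.DictReader(io.StringIO(exported)))

# Tally one question's answers to get beyond the 'top line' figure
ratings = Counter(row["events_rating"] for row in rows)
for answer, count in ratings.most_common():
    print(f"{answer}: {count}")
```

The same loop works for any exported column, which makes it easy to repeat per question without rebuilding spreadsheet formulas.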
Opportunities: it would have been better to have all the data input in the same way, so mixing methods (email and post) could be improved. In addition, think about the level of data you want from a survey, then look very carefully at the method you use. Specialist survey companies charge a lot of money for good reason: they do all the work and can provide you with lots of fabulous data.
Threats: asking the wrong questions in the first place. If we haven't asked exactly the right questions, we won't actually learn anything, and we may misinterpret the data. Data can also be skewed: we sent the email survey worldwide, but three quarters of respondents were from overseas, so their perspective on events in London, for example, would not be fully representative.
The key is in knowing not just what you want to ask, but to what level you want to analyse your responses. Are there key differences by age, by location, perhaps by subject? Set a clear set of objectives beforehand and understand that different segments will respond differently. We selected equal numbers from each age group, but unsurprisingly it was the older alumni who were more inclined to answer.
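Checking response rates per segment is a quick way to see that skew. A small sketch, with entirely made-up figures (equal invitations per age group, older alumni replying more readily, as we found):

```python
from collections import Counter

# Hypothetical figures: equal numbers invited per age group,
# but older alumni reply more readily.
invited = {"18-25": 500, "26-40": 500, "41-60": 500, "60+": 500}
responded = Counter({"18-25": 30, "26-40": 55, "41-60": 90, "60+": 140})

for group in invited:
    rate = responded[group] / invited[group]
    print(f"{group}: {responded[group]} of {invited[group]} ({rate:.0%})")
```

Seeing the rates side by side makes it obvious when one segment is dominating the results and the overall figures need to be read with that in mind.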
Use the right tool for the job; don't just go with what you have because it is easiest or cheapest.
And finally, make sure you act on anything you asked in the survey that requires a response; for example, if someone has offered an internship or a paper, follow up, thank them, and engage as relevant.
Further reading:
How to run a survey
Some interesting info
Alumni surveys, an overview
Some example surveys:
Survey from Cornell
Colorado - Dental Medicine
Robert Gordon University
If you have additional suggestions, please add them to the comments box below.