Survey design for idiots
October 6, 2017
I am writing this with all of the expertise of someone who has designed one survey. If you have designed a survey before, this is not for you. If you have never designed a survey before, hopefully this will help you avoid some simple but consequential errors.
I found designing that survey, and seeing the direct relationship between the quality of the survey design and the quality of the data, to be (painfully) informative. I have recorded a few lessons I learned in case they may be of use to someone reading and so that I don’t forget them.
The single largest take-home message: the little things matter. Something small that you might overlook can have very large implications for the quality of the data you end up with. Yes, the slightly varied wording of that question will change the quality of your data. Yes, you need to account for people who are not sure how to answer the question in a way that won't distort the data, e.g. by providing an 'N/A' option. Yes, responders will lose interest quickly, so make it short and to the point.
Here are lessons I learned:
Don’t overlap brackets, such as age ranges. If the options are, say, 25–35 and 35–45, how is a 35-year-old supposed to answer that question? Split them as 25–34 and 35–44 instead. The brackets are now clearly defined and your data is more accurate.
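To make the point concrete, here is a quick sketch (in Python, with invented bracket values, since the original screenshots aren’t reproduced here) of why overlapping brackets corrupt your data:

```python
# Overlapping brackets: a 35-year-old matches two options,
# so two respondents of the same age may answer differently.
overlapping = [(18, 25), (25, 35), (35, 45)]

# Non-overlapping brackets: every age matches exactly one option.
fixed = [(18, 24), (25, 34), (35, 44)]

def matching_brackets(age, brackets):
    """Return every bracket whose (inclusive) range contains the age."""
    return [b for b in brackets if b[0] <= age <= b[1]]

print(matching_brackets(35, overlapping))  # [(25, 35), (35, 45)] - ambiguous
print(matching_brackets(35, fixed))        # [(35, 44)] - unambiguous
```

With the fixed brackets, each respondent has exactly one honest answer, so the responses can be compared directly.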
Don’t expect the data to provide information that you didn’t design the survey to yield. I could rephrase this as: know exactly what your survey is testing before you launch it. For example, the survey I designed was, in part, to test the adherence of a company to service-level standards, which vary across the categories ‘urban’, ‘rural’ and ‘remote’. The survey did have a geographic component (it asked for postcode and federal division), but it never asked for the categories mentioned above. Simply substituting a question or asking one extra question could have improved the accuracy and usefulness of the data and saved hours of manual coding.
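For illustration, this is roughly what the manual coding looks like (the postcode-to-category table below is a made-up stand-in, not real data): every postcode has to be looked up by hand after the survey closes, whereas one extra question would have captured the category directly.

```python
# Hypothetical lookup table. In practice this mapping has to be
# researched and built by hand, postcode by postcode.
POSTCODE_CATEGORY = {
    "3000": "urban",
    "3550": "rural",
    "0872": "remote",
}

def categorise(postcode):
    """Map a postcode to a service-level category, flagging gaps."""
    # Postcodes missing from the table still need manual research.
    return POSTCODE_CATEGORY.get(postcode, "unknown - code manually")

responses = ["3000", "0872", "9999"]
print([categorise(p) for p in responses])
```

Asking respondents to pick ‘urban’, ‘rural’ or ‘remote’ themselves would have made this whole step unnecessary.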
Don’t launch a survey without double-checking everything. Once a survey is live, you’re committed to what you have put up. Iron out the creases by doing a test run or two. This will save you a lot of time on the analysis end. (Thanks, Rohan)
I hope to never make these simple mistakes again, and I hope this post prevents others from making them, too.