Over the past few decades, the neuroscience community has seen huge growth in new types of experiments and in methods for analysing data.
But there is no magic wand for data analysis: a large, flexible toolbox of methods can lead to the equivalent of baking a cake with salt instead of sugar. The ingredients look sensible and might give great-looking results, but you didn’t get what you thought you were making. Or we could use the correct ingredients but pour them into the toaster rather than the oven. What’s wrong with that? It still cooks, doesn’t it?
Although you wouldn’t last long making these mistakes as a chef, it’s often far less obvious when analogous mistakes are made in science. In my research, I develop analysis methods for functional neuro-imaging. It’s always fun to enable new experiments that were previously out of reach, but equally important – though less glamorous – is to build proper diagnostic tests that validate the method itself. This step is often overlooked in the excitement of tackling the big questions in brain science.
Neuro-imaging is a particularly difficult field to work in, since it requires collaboration across statistics, physics, psychology, engineering, neuroscience and many other disciplines. No one can be an expert in everything, but we can at least know which common mistakes to watch out for.
With that in mind, NeuRA hosted a workshop on Skeptical Neuro-imaging Analysis: a course to keep our researchers at the forefront of statistical methods. The workshop was presented by a range of local experts as well as two international speakers, DuBois Bowman (Emory University, USA) and Roland Henry (UCSF, USA).
A key theme from all speakers was to appreciate the assumptions we make at every step of the scientific process, and what we can reasonably expect to gain from different imaging modalities. For example, an experiment mapping the white-matter pathways of the brain becomes more accurate when it incorporates information about the heartbeat – otherwise, cardiac pulsation distorts the measurement.
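To make the heartbeat example concrete, here is a minimal sketch of one simple way such a correction can work: estimate the cardiac phase of each acquired image volume from recorded R-peak (heartbeat) times, then flag volumes acquired near systole, when pulsatile brain motion is strongest. The function names, the phase window, and the approach itself are illustrative assumptions, not the specific method discussed at the workshop; real pipelines use more sophisticated cardiac gating or physiological-noise regression.

```python
import numpy as np

def cardiac_phase(volume_times, r_peak_times):
    """Estimate cardiac phase (0..1) for each acquired volume, as the
    fraction of the current R-R interval elapsed at acquisition time.
    Volumes outside the recorded cardiac trace get NaN."""
    phases = []
    for t in volume_times:
        prior = r_peak_times[r_peak_times <= t]  # most recent R-peak
        nxt = r_peak_times[r_peak_times > t]     # following R-peak
        if len(prior) == 0 or len(nxt) == 0:
            phases.append(np.nan)
            continue
        phases.append((t - prior[-1]) / (nxt[0] - prior[-1]))
    return np.array(phases)

def flag_systolic_volumes(phases, window=(0.0, 0.2)):
    """Flag volumes acquired early in the cardiac cycle (near systole),
    when pulsatile motion most distorts diffusion measurements.
    The 0.0-0.2 window is an illustrative choice."""
    return (phases >= window[0]) & (phases < window[1])

# Example: heartbeats at 0, 1, 2, 3 s; four volumes acquired in between.
r_peaks = np.array([0.0, 1.0, 2.0, 3.0])
vols = np.array([0.1, 0.5, 1.1, 2.9])
flags = flag_systolic_volumes(cardiac_phase(vols, r_peaks))
```

Flagged volumes could then be excluded or down-weighted before fitting the white-matter model, which is one way the extra physiological information can protect the measurement.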
Overall, the workshop successfully put researchers into a more skeptical mindset when conducting their own research.