Sounds like a good idea to me:
Perhaps we need a new field of "cognitive forensics" for analyzing and investigating motivated scientific error, bias, and intellectual misconduct. The goal would be to develop a comprehensive toolkit of diagnostic indicators and statistical checks that could be used to detect acts of irrationality and to make it easier to apprehend the culprits. (Robin's recent post gives an example of one study that could be done.) Another goal would be to create a specialization, a community of scholars who had expertise in this subfield, who could apply it to various sciences, and who could train students taking advanced methodology classes.
Of what components would cognitive forensics be built? I'd think it would have a big chunk of applied statistics, but also contributions from cognitive and social psychology, epistemology, history and philosophy of science, sociology of science, maybe some economics, data mining, network analysis, etc.
It seems like this is an idea whose time may have come. The damage done by falsified academic research can be tremendous: when people build ideas on faulty results, every further line of research that rests on them becomes uncertain.
As we learned in my Law and Economics class, there are two approaches a watchdog group of this type could take to be effective:
- You could try to make the organization large enough to apprehend any and all offenders (a very expensive proposition) or
- You could randomly select papers to review, but make the penalties for poor conduct high enough to dissuade people from committing an offense (a much cheaper proposition).
If people believe there is a credible threat of discovery and real consequences for cheating, academics have an incentive to be more careful and honest in their work. Dr. Poehlman's example may be a step in the right direction.
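The trade-off between the two approaches comes down to the standard expected-penalty model of deterrence: cheating is deterred when the audit probability times the penalty exceeds the gain from cheating. A minimal sketch, with entirely hypothetical numbers chosen for illustration:

```python
def expected_penalty(audit_rate: float, penalty: float) -> float:
    """Expected cost of cheating: probability of being caught times the penalty."""
    return audit_rate * penalty

# Approach 1: audit nearly every paper, with a modest penalty (expensive).
broad = expected_penalty(audit_rate=0.95, penalty=10_000)       # 9500.0

# Approach 2: audit a small random sample, with a much larger penalty (cheap).
random_audit = expected_penalty(audit_rate=0.05, penalty=200_000)  # 10000.0

# A researcher who expects to gain less than the expected penalty is
# deterred under either scheme, but the second needs far fewer audits.
gain_from_cheating = 8_000
print(gain_from_cheating < broad)         # True
print(gain_from_cheating < random_audit)  # True
```

This is why the random-audit scheme can be much cheaper: the deterrent effect depends on the product of audit rate and penalty, so a low audit rate can be offset by a sufficiently severe penalty.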