15 July 2012
Rosemary Redfield, from the University of British Columbia in Canada, has published an interesting paper on revamping the university introduction to genetics course.
She suggests that the course needs to change from the traditional historical perspective to one that begins with a foundational knowledge of modern genetics and then transitions into analysis. In her view, instructors should always keep in mind the current applications of genetics and how best to prepare students in light of those applications.
The historical perspective, it was thought, would walk students through the many steps and questions that scientists asked as they discovered genetics. However, as Dr. Redfield points out:
Unfortunately, this wasn’t working as planned; although students learned to solve our genetic analysis problems, their ability to think scientifically didn’t noticeably improve and they didn’t seem to understand much genetics.
And while making the students walk through the process of discovery is a good idea, in theory, it was not what was happening in practice:
Pioneer geneticists treated these processes as “black boxes” whose rules they deduced, but our students appeared to avoid this challenge by simply memorizing the rules and problem-solving rubrics that well-meaning instructors provided.
The students were able to regurgitate the information that they were supposed to study, but they did not show higher-level thinking on the topics. Take meiosis, for example:
And although they could reproduce the stages of meiosis, map genes in three-factor crosses, and diagram meiotic recombination in complex inversion-heterozygotes, most had no idea how or why homologous chromosomes pair and recombine.
This problem is endemic in the university setting, and particularly in the sciences. While students may earn a "Doctor of Philosophy" in their given field, most programs do not include classes that address the philosophical underpinnings of that field.
In general, many students are taught the vocabulary of their field and how to be laboratory workhorses, but rarely do they engage with higher-level questions, such as: "What is the 'end' or purpose of my field?"; "How does my research fit within the grand scheme of scientific advancement?"; or "Why does the body behave in this way?" This is, of course, a generalization, but many scientists have reported this trend in their own graduate training.
An advanced degree usually means that one is training to become a specialist. At one time, the specialist had a bit more clout than he or she does today. The idea was that this specialist would have a repertoire of knowledge in a particular field and would, ideally, become the resident expert in that field at a particular university. However, the internet has posed a problem for this traditional view of the academic expert.
Today, facts are cheap. I don’t need to seek out a geneticist to tell me the phases of mitosis or the latest research in genetics. I can just google it.
The expert is no longer needed for his or her mental database. Now, the expert is needed to discern and assess ideas. This is a far different thing from memorizing vocabulary, tracing processes, and working math problems.
This is not to say that the specialist of yesterday did not do these things, and it’s not to say that people should not memorize vocabulary. But it does mean a shift in the roles and expectations of the specialist, and perhaps that means a shift in priorities in the classroom.
Redfield’s suggested curriculum seems to be an acknowledgment that students need to be more than an expensive database. Her curriculum begins with the fundamental concept of how genotype determines phenotype, allowing students to build from their prior knowledge of DNA. When students study mitosis and meiosis, rather than memorizing the specific steps in the processes, students are to consider “first the problems mitosis and meiosis must solve (how to get the right chromosomes into the daughter cells) and then their molecular solutions . . .” Finally, analysis incorporates prior learning of phenotypes and inheritance, thus making a more holistic and less fragmented curriculum.
Redfield believes that an introductory genetics class should cover issues that the students will actually encounter in real life. She offers examples from the news:
- Is genetic testing a wise thing to do?
- Is it a sound financial investment?
- Should I have full access to my genetic information?
- Should athletes be tested for genetic modifications?
- Did my genes make me gay?
- Are genetically modified foods safe?
- Are cloned animals ethical?
- How different are human races, and how different are we all from chimpanzees and gorillas?
Many of these questions are ones bioethicists have been contending with for years. Bioethics is a multi-disciplinary field that is usually occupied by philosophers and healthcare professionals, but in the last twenty years it has seen an influx of lawyers, scientists, and people from many other disciplines.
Redfield suggests that it is the role of the scientist to address ethical questions. However, something Redfield does not state in her paper is that scientists are rarely trained in anything that would be helpful in assessing ethical issues, such as moral philosophy or rhetoric, and most programs do not include the history and philosophy of science.
Scientists usually avoid humanities courses (bioethics is typically taught by the humanities department), just as humanities students avoid chemistry. Yet these questions require something more than technical knowledge of the field. Furthermore, to make ethical declarations one must draw from a moral foundation, which places scientists in the position of providing a moral compass for society.
This is not to say scientists should not contend with bioethical issues. I believe they should. Sometimes the ethical nuances of a particular technique can be best unpacked by looking at the technical process. For example, many bioethicists will put adult stem cell research in a different ethical category from embryonic stem cell research because of the differences in how the cells are retrieved and the subsequent consequences of the technique.
Overall, training scientists to be critical thinkers—to analyze and assess ideas—is good. It promotes a greater understanding of context and of how a particular scientific topic relates to other fields of study. It is all too easy to compartmentalize specific areas of study, such as the steps of meiosis, without ever taking a critical view of the topic in the context of genetics (or any field) as a whole.
Taking a critical view of a subject promotes interdisciplinary interaction and an assessment of how discoveries and research in other fields can inform the scientist's own.