An interesting article on expertise, by Philip E. Ross, in the August issue of Scientific American, came to my attention via M. Deric Bownds's gem of a blog, Deric Bownds' Biology of the Mind. Dr. Bownds is Emeritus Professor of Molecular Biology and Zoology and Past Chair of the Department of Zoology at the University of Wisconsin in Madison. Ross's discussion of expertise in Scientific American certainly resonates with my experience of expertise as it began to emerge during my education and training as a psychologist.
My earliest efforts as a graduate student led to a kind of micro-mastery of subject matter in a piecemeal way. As far as anyone else might have been concerned at that time, I was an excellent student who was progressing well in my training as a clinician. While I knew that some of my fellow students were not doing as well as I was doing academically, I was also aware that there were some who seemed to understand what we were studying far more deeply than I did. All of them, as I recall, had considerable experience as clinicians prior to entering our graduate program, while I had no prior clinical experience.
It was clear to me that these students had a facility with clinical data that I lacked. During discussions of case material, they frequently offered observations that I understood but could not arrive at on my own, whether by dint of reason, sheer will, or both. Although these more capable students weren't expert clinicians, one obvious difference between their ability and my own was that they seemed to know what to look at and what to discard without sifting through every word of the clinical material and testing every permutation of the data in a completely conscious, rational, deliberative manner. In contrast to my own efforts, their ability to recognize and organize salient elements in clinical data had much in common with the cognitive processes of experts as described in the SA article.
In that article, author Ross discussed chess players, noting that research suggests that increased expertise does not necessarily lead the expert chess player to examine more possibilities when looking at the board and deciding on a move. In some cases it may, but often experts examine far fewer possibilities and deliberate less than less-expert players do before deciding on a move. Ross explained that experts do not rely "so much on an intrinsically stronger power of analysis as on a store of structured knowledge" that tells them where they are in a game, including what must have come before and what will probably happen next.
Like expert chess players, the most capable students I encountered at the beginning of my graduate training seemed to draw upon stores of structured knowledge that oriented them to clinical material, while I was attempting to find my way, relying on a laborious process of trial and error. I would generate hypotheses and test them with little sense of whether one hypothesis versus another was more or less likely to hold up with additional data.
At the time, I was troubled to find that no amount of effort or determination had any appreciable immediate effect on my success as I struggled to see and hear clinical material in ways that were not yet available to me. The experience felt very much like a kind of blindness or deafness, one that is now familiar to me when I encounter it in other clinicians.
Over the course of time, the process of going down many clinical dead ends and repeatedly walking through clinical material consciously and effortfully eventually led me to a new kind of sight that can be startling even to me. I often find myself thinking with eerie precision about things that patients will say to me before they actually say them, and I often know things about patients before they share the information with me in any deliberate or obvious manner. This strikes me as similar to the chess player who can look at a snapshot of a chess board and see where the game has been, as well as where it is going.