Wednesday, April 6, 2016

FWS Student Highlights: Cameron Hatcher

In the past 20 years, technological advances have made music increasingly available in America (Schwartz & Fouts, 2003). In 2001, it was reported that Americans spent more money on music than on prescription drugs (Huron, 2001). More recently, research has shown that music listening (whether intentional or incidental) averages more than five hours a day for Americans (Levitin, 2006; McCormick, 2009).

Given that music is more ubiquitous today than at any other point in history, it is not surprising that psychologists have examined a number of important questions concerning music in everyday life (e.g., identity formation, social behavior, developmental functions). In contrast, fewer studies have examined how listening to music affects creativity, and no studies have examined how self-defined music preference (e.g., “I love this song”) affects creativity. Additionally, no studies have utilized modern-day technology (e.g., mobile applications like Spotify, noise-cancelling headphones) to simulate realistic music-listening experiences for participants. Therefore, the goal of this research is to determine whether listening to music that individuals self-define as music they “love” or “hate” affects creativity. To assess creativity, participants will complete the Alternate Uses Task (AUT), a commonly used creative task, while listening to music they love, music they hate, or in silence (control). The AUT measures ideational fluency—the number of non-redundant ideas one can generate about a given topic. Findings may suggest that listening to self-defined preferred music is beneficial in situations that demand idea formation, that listening to self-defined non-preferred music is disruptive, or that listening to music—regardless of preference—is disruptive to creative cognition. Depending on the results of this study, future studies may then begin to assess the effects of music listening in applied domains (e.g., across different creative tasks, worker or student productivity, assessing the quality of written work in work/school-related settings).
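For readers unfamiliar with fluency scoring, here is a minimal sketch of how counting non-redundant ideas might work. The function name and the normalization rules are purely illustrative—they are not the study's actual scoring protocol, which involves trained human raters:

```python
def ideational_fluency(responses):
    """Count non-redundant ideas: answers that differ only trivially
    (case, surrounding whitespace) collapse to a single idea."""
    seen = set()
    for r in responses:
        idea = " ".join(r.lower().split())  # normalize case and whitespace
        if idea:
            seen.add(idea)
    return len(seen)

# e.g., alternate uses for a brick
uses = ["Doorstop", "doorstop ", "paperweight", "build a wall"]
print(ideational_fluency(uses))  # 3 distinct ideas
```

In practice, judging whether two differently worded responses express the "same" idea requires human raters and a coding manual, which is exactly why independent scorers are used later in the project.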
I have been working as a research assistant in the Applied Performance Research Lab for nearly three years. Conceptualization for this project began about a year ago, when I stumbled upon music psychology as a sub-discipline within applied cognitive research. I fervently combed the literature for about a month before deciding to design and conduct my own study, from data collection through analysis and dissemination of the results (e.g., honors thesis, conference posters, potential journal publication). Last spring I focused solely on conducting a thorough review of the literature, designing my study, obtaining IRB approval, applying for OSCAR’s URSP summer grant, and preparing for data collection. Although the study is fairly straightforward (conceptually), I was surprised at the amount of time I spent designing and piloting it over the summer. For instance, I created online questionnaires in Qualtrics, built data-tracking and data-scoring workbooks in Excel, purchased the measure of creativity (the Alternate Uses Task), constructed a third form of that measure (and piloted it to assess the form’s reliability and validity), created a fake account in Spotify, planned participant recruitment, created a counterbalanced order of conditions for the within-subjects design, created databases in SPSS using syntax, and developed a data analysis plan—all while writing drafts of my honors thesis (and all before data collection began). After data collection, entry, and auditing (summer/fall), I trained two research assistants to independently score the tests of creativity for originality/creativeness, a step typically taken when scoring or ranking qualitative data to guard against experimenter/expectancy bias. In order to train the research assistants effectively, I wrote a coding/scoring manual. This week, I received the scored results, and I am finally in the process of cleaning and analyzing my data.
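As an aside on the counterbalancing step: with three listening conditions, a full counterbalance cycles participants through all six possible condition orders so that order effects average out across the sample. A minimal sketch (the condition labels come from the study; the assignment function itself is hypothetical, not the actual study code):

```python
from itertools import permutations

# The study's three within-subjects conditions
CONDITIONS = ("love", "hate", "silence")

# All 3! = 6 possible presentation orders
ORDERS = list(permutations(CONDITIONS))

def order_for(participant_id):
    """Cycle through the six orders so each appears equally often
    as participants are enrolled (round-robin assignment)."""
    return ORDERS[participant_id % len(ORDERS)]

for pid in range(6):
    print(pid, order_for(pid))
```

With larger numbers of conditions a full counterbalance becomes impractical (n! orders), and designs typically fall back on a Latin square, which balances each condition across each serial position without enumerating every order.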
I hope to have analyzed the results by the end of this week. Stay tuned!
One thing I have discovered this week (I’m about to get a little reflective) is how quickly time moves and how important it is to plan ahead in the world of research. I once heard someone say that the “re” in “research” means “re-do.” Yes. Research rarely moves at the pace one needs it to (for jobs, grant or conference deadlines, defense deadlines, graduation, etc.), so it is important to balance planning with patience. On the subject of planning, I will be applying to Experimental and Quantitative Psychology doctoral programs this fall (2016). Because these programs place an emphasis on experimental design and statistics, the advanced training will provide me with the flexibility to stay in academia and consult with different researchers on a variety of projects. Independently designing my own study and writing an honors thesis have certainly prepared me for a future in academia.