Just last month, Moore and Abetz (2019) published a study using Reddit to investigate how parents communicate regret about having had children. As with other studies that use Reddit as a data source, the authors selected threads that met certain criteria: in this case, each thread needed to contain at least thirty comments (i.e., to show robust conversation) and be archived (i.e., closed to changes, whether editing or deleting existing comments or creating new ones). For some research projects, this last criterion might be problematic, as Reddit threads are archived three to six months after they are first posted. But because this research was not time-sensitive, choosing only closed conversations ensured the data would remain consistent after collection.

There are three methodological choices that I find particularly interesting about this study. First, Moore requested and received institutional review board (IRB) approval to collect and analyze Reddit data. From what I have seen, most institutions (including Abetz’s) punt when asked to approve research that relies on public data, claiming they have no jurisdiction over, and therefore do not review, proposals built on publicly posted data. Moore’s institution, the University of Utah, not only decided this request was within its jurisdiction, but affirmed that publicly posted data is acceptable to use in research without having to gain the informed consent of its authors. Second, the authors chose to hand-code the selected comments for their emotional content instead of relying on automated tools. I wonder whether this is a sign that they did not trust the ability of these tools to properly detect regret, whether they simply weren’t well-versed in automated coding, or both.
Last, the authors took care not to presume a commenter’s gender based on the gender of their partner, but did make inferences about a commenter’s gender based on biological references, i.e., whether a person referred to having been pregnant or having had a vasectomy, even though these statuses are not exclusive to any one gender.
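The thread-selection step described above is simple enough to sketch in code. This is an illustrative toy, not the authors' actual pipeline: the field names are hypothetical stand-ins for the thread metadata a real collection tool (such as the Reddit API via PRAW) would provide.

```python
# Sketch of the two selection criteria: at least thirty comments
# (robust conversation) and archived status (closed to changes).
# Field names below are hypothetical, not Moore and Abetz's.

def select_threads(threads, min_comments=30):
    """Keep only threads with robust, closed conversations."""
    return [
        t for t in threads
        if t["num_comments"] >= min_comments  # robust conversation
        and t["archived"]                     # no new/edited/deleted comments
    ]

# Toy data standing in for collected thread metadata.
candidates = [
    {"id": "a1", "num_comments": 52, "archived": True},   # qualifies
    {"id": "b2", "num_comments": 12, "archived": True},   # too few comments
    {"id": "c3", "num_comments": 87, "archived": False},  # still open to changes
]

selected = select_threads(candidates)
print([t["id"] for t in selected])  # → ['a1']
```

The virtue of the archived criterion shows up here: once a thread passes the filter, its comment set is frozen, so the dataset cannot drift between collection and analysis.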
Meanwhile, Glăveanu et al. (2018) focused on the visual language of Redditors, investigating memes related to the ongoing European refugee crisis, and the tenor of the conversations surrounding those memes, in order to better understand how and what participants thought about the increasing presence of these refugees in their towns, cities, and nations. The authors selected four memes: two that presented positive viewpoints about refugees and two that presented negative viewpoints. The comments on these memes were coded as positive or negative toward refugees, then grouped thematically and analyzed in the context of the larger discussion. Here, the authors do not indicate whether any of these steps were performed manually by the researchers or automatically using sentiment analysis programs; further detail in this area would make the findings more understandable, and also more replicable. This is a particular concern given that the data were collected in November 2016, which seems like it may have been a unique moment to be performing research on narratives concerning refugees. I wonder whether the same study performed earlier or later would have produced very different findings. I also wonder whether a better approach would have been to analyze how many memes were positive and how many were negative, and then to select a sample that more closely represented the overall distribution of viewpoints, rather than going for an artificial “balance.”
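To make concrete what an automated alternative to manual coding might look like, here is a minimal lexicon-based polarity coder. This is not the authors' method (they do not say how coding was done), and the word lists are invented for illustration; a real study would use an established tool such as VADER or LIWC rather than this toy.

```python
# Toy lexicon-based sentiment coder, standing in for the kind of
# automated coding the authors might have used. Word lists are
# invented for illustration only.

POSITIVE = {"welcome", "help", "support", "compassion"}
NEGATIVE = {"invasion", "threat", "burden", "crime"}

def code_comment(text):
    """Label a comment 'positive', 'negative', or 'neutral' toward refugees."""
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(code_comment("We should welcome and support them"))  # → positive
print(code_comment("They are a burden and a threat"))      # → negative
```

Even this crude sketch illustrates why the manual-versus-automatic question matters for replicability: an automated coder's lexicon and thresholds can be published and rerun exactly, whereas hand-coding depends on coder training and inter-rater checks.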
Moore, Julia, and Jenna S. Abetz. 2019. “What Do Parents Regret About Having Children? Communicating Regrets Online.” Journal of Family Issues 40(3):390.
Glăveanu, Vlad Petre, Constance de Saint-Laurent, and Ioana Literat. 2018. “Making Sense of Refugees Online: Perspective Taking, Political Imagination, and Internet Memes.” American Behavioral Scientist 62(4):440–57.