If you email it, they will comment
Today I made an appearance on everyONE, the PLoS ONE blog. I relayed my experience using email to solicit comments on my lab’s recent PLoS ONE paper about the antidepressant Zoloft’s accumulation in yeast cell membranes. Here I’ll present supplemental materials that include the minutiae (and extended discussion) that didn’t make it into the guest post due to space limits.
To recap, here are the take-home messages of, and response rate data for, my sociological experiment:
- 8 out of 166 (~5%) university professors agreed to move post-publication review from the email world to my article’s comment thread
- Professors with whom I had prior contact replied at nearly thrice the rate of strangers
- The 8 solicited professor comments placed my article in the top 0.1% of PLoS ONE articles
- Commenting is not a favorable process, to borrow a phrase from thermodynamics; one must supply energy to make the reaction go forward
Now let’s dig into the numbers. There were 122 non-responders. 12% (15/122) of the “no replies” were actually autoreplies or outright bounces from defunct email addresses. In the food processing industry, they call unavoidable losses like these “shrink.”
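None of the figures so far are mysterious; they reduce to a few divisions. A quick sketch, using only the numbers reported above:

```python
# Response-rate arithmetic for the email experiment (all figures from the post).
emails_sent = 166      # professors solicited by email
commenters = 8         # converted an email exchange into a public comment
non_responders = 122   # never replied at all
dead_addresses = 15    # autoreplies or bounces among non-responders ("shrink")

print(f"email-to-comment conversion: {commenters / emails_sent:.1%}")         # ~4.8%
print(f"shrink among non-responders: {dead_addresses / non_responders:.1%}")  # ~12.3%
```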
What about the 14 professors who engaged in multiple rounds of email colloquy but never commented? Why didn’t these professors surmount the seemingly tiny hurdle separating emailing from commenting? It wasn’t for lack of interest, because two of the 14 non-commenters volunteered to chat over the phone, which is certainly worth more than a comment. Some professors may prefer private over public correspondence, period.
In addition to the eight successful cases of email-to-comment conversion, there are six other comments on my PLoS ONE paper, for a total of 14. Two of those don’t really count: one is a press-coverage digest generated by PLoS ONE, and the other is a digest of relevant tweets about the paper that I posted myself. That leaves 12 bona fide comments.
Only one of the 12 was unsolicited and anonymous, and, not surprisingly, a bit odd: “I followed this discussion from In the Pipeline. I’m not sure this is related, as the information seems self-reported, and therefore anecdotal. The sample size is quite small, as well. The reference is from a study published in the Canadian Journal of Psychology about the rare side effect of yawning as a cause of orgasm in female study patients taking clomipramine, which can be found at the link below.”
“In the Pipeline” is a blog maintained by noted science blogger Derek Lowe, which brings me to the most interesting comment of the 12. Lowe reviewed my paper, and his post garnered a whopping 25 comments! And it isn’t just quantity: the “In the Pipeline” comment thread is more complex than my paper’s, with later comments responding to or amplifying earlier ones, and multiple posts by the same commenter. Interestingly, most of the commenters posted pseudonymously, as is the custom in many online comment forums, and yet the content didn’t suffer one iota.
However, the “In the Pipeline” comments are invisible in the official PLoS count, and, regrettably, isolated from the discussion already in progress on the PLoS site, even though after the fact Lowe was kind enough to copy and paste them into a comment for me. Alas, I don’t have a good answer to comment-thread balkanization. I wish it were as easy as highlighting text in an email, and then clicking a “send to comment thread” button that would instantly cross-publish at the PLoS ONE comment thread, or any science comment thread of one’s choosing.
With more online comment forums adopting a common infrastructure, e.g., DISQUS, the fragmentation should lessen over time (I hope). Yet even with a stable commenting identity that is portable across a wide swath of blogs, there’s no way to recreate overnight the informed, loyal readership of “In the Pipeline,” which I’m sure took years to cultivate. But I’m not asking for miracles, just more efficient coordination, and, where that fails, more technology that facilitates cross-blog pollination.
So how many comments are enough? 25? 100? 200? Put another way, what is the natural comment ceiling for a given research article? Obviously the answer is complex, and depends on, among other things, the popularity of the research area, the reputation of the authors, and the reputation of the journal. However, as I argued in my everyONE post, the 90-9-1 engagement rule is a good rule of thumb. 90-9-1 also jibes with my paper’s article level metrics.
Let’s assume that the 440 PDF downloads of my paper are a proxy for university- and industry-based readers, as I argued in a previous post, and as others have recently argued. For that many downloads, I would expect several dozen comments. If you add the 12 bona fide comments residing at PLoS ONE to the 25 comments residing at “In the Pipeline,” that comes out to 8.4% (37/440), almost exactly the ~9% of occasional contributors predicted by the 90-9-1 rule.
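The back-of-the-envelope check is easy to reproduce. A minimal sketch, treating PDF downloads as the engaged readership per the assumption above:

```python
# Does the observed commenting rate line up with the 90-9-1 rule's ~9%?
pdf_downloads = 440      # proxy for serious (university/industry) readers
plos_comments = 12       # bona fide comments at PLoS ONE
pipeline_comments = 25   # comments on Derek Lowe's "In the Pipeline" post

total_comments = plos_comments + pipeline_comments
rate = total_comments / pdf_downloads
print(f"{total_comments} comments / {pdf_downloads} downloads = {rate:.1%}")  # 37 / 440 = 8.4%
```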