IMC 2014 Decision on Public Reviews
Recent IMC conferences have provided, with each published paper, an almost-verbatim but anonymized
review written by the reviewers, along with a short response by the authors to the review.
For examples, see the programs from IMC 2011, IMC 2012, and IMC 2013.
(To access the reviews for IMC 2012 and IMC 2013 papers, click on the "review" link
next to each paper. Reviews for IMC 2011 papers are appended at the end of each paper's PDF.)
The stated goal of this exercise was to "make the review process more transparent
by making public the PC's rationale for accepting the paper, the main concerns
of the reviewers, and the authors' response to those concerns".
For IMC 2014, we decided not to continue this practice of publishing
anonymized public reviews and author responses. This note is a brief synthesis
of the reasons for our decision. Our decision was informed by many discussions
we had (in person, via social media, and via e-mail) with nearly 40 members of
the broad IMC community, spanning senior researchers and students, academics,
and industry researchers.
In our discussions, we asked people if they felt the stated goals were being
achieved, and if they perceived clear benefits. Below we summarize the comments
that we received.
- "...public reviews provide a view into what the PC looked for in a paper" -- one of the stated goals in the CFP.
In an ideal world, this could be a strong point in favor of keeping the process
of publishing reviews alive. Unfortunately, this is only true if reviews are
well-structured and well-thought-out; most conference paper reviews, including
those at IMC, are detailed but not necessarily well-structured. More importantly,
anonymized reviews reflect the submitted version of the paper; so the reader
often has to reverse-engineer what the submission may have looked like and what
a reviewer could have been complaining about. Much of what is in the review may
not make sense with respect to the camera-ready version of the paper because the
most glaring issues that reviewers identify will likely have been fixed.
It seems that the majority of our community members do not read the public reviews;
this too may be due to the unstructured nature of the content.
- "...public reviews hold authors accountable" -- an oft-stated benefit.
Unfortunately, most author responses are run-of-the-mill and don't go into
sufficient detail about what was or was not addressed and why.
- "...this practice improves review quality" -- another oft-stated benefit.
There is no strong evidence of this. Generally, reviewers do not seem
to have this in mind when doing reviews.
- "...public reviews could inform people on how to write good IMC papers" -- this is often echoed as a benefit for students.
Given that we already have over 100 papers with public reviews from the
past three years (IMC 2011, IMC 2012, IMC 2013), there is a solid body
of examples to aid newcomers and students. Thus, it doesn't seem necessary
to repeat this exercise each and every year.
- "Do you read them?"
We asked everyone this question, but unfortunately an overwhelming majority
answered "no". Anonymized public reviews take a significant amount of work to
generate (especially when it comes to coordinating with the ACM about printing),
and hence it seems a shame for volunteers to spend time on this.
In conclusion, it seems that the benefits are really not clear.
What we ideally need is a public summary, where one reviewer writes a one-pager
about the paper, carefully synthesizing reviewer comments and balancing against
the end product. However, this places a significant additional load on an already
strained PC! Also, SIGCOMM tried this a few years ago (in 2006).
It did not seem to succeed in its goals, and was quickly discontinued.
On balance, there appeared to be no strong reason to keep the experiment going.
Thus, we decided to discontinue the process, at least for this year.
Aditya Akella and Nina Taft
IMC 2014 Program Co-Chairs