Leland Teschler, Executive Editor
On Twitter @ DW_LeeTeschler
There’s a controversy these days about the number of scientific papers that end up being retracted. The science journal Nature estimates that the early 2000s saw only 30 retraction notices annually, but by 2011 that number had climbed to over 400 per year and continued to rise. More troubling was the wording of the retraction notices. Many of them have become vague to the point where it’s tough to distinguish a retraction because of an error from one arising from deliberate scientific misconduct.
Though most of the retractions seem to arise in papers covering psychology and biology, engineering topics haven’t been immune. Retraction Watch, a website devoted to highlighting scientific retractions, reports that about 40% of the retractions in its database come from one source, the Institute of Electrical and Electronics Engineers (IEEE). The vast majority of these retractions are abstracts from IEEE conferences that took place between 2009 and 2011. In all, says Retraction Watch, IEEE retracted more than 7,300 abstracts, and most of the authors are in China.
We asked the IEEE about all this, and they responded with a prepared statement that said, in part, “Several years ago, we identified a number of conference papers published in the IEEE Xplore digital library that did not meet our guidelines. We took immediate action to ensure continued research integrity and subject matter alignment, and, after review, subsequently cleared or retracted papers as appropriate. Since that time, we have reviewed and increased the rigor of our processes to better ensure we detect articles that do not meet our standards…. We have formed a dedicated committee of IEEE volunteers and staff serving as subject matter experts and ‘gatekeepers’ for incoming conference content. This committee now provides an additional level of review….”
More illumination of the issue comes from a presentation by the chair of the IEEE conference quality management committee, Lance Fung, an emeritus professor at Murdoch University in Australia. Fung’s description of the problem makes it clear the IEEE was being victimized by a sizeable group of scammers. He says the issues included “widespread and rapidly growing misconduct, mostly with a ‘pay to publish’ type of model” where conference paper authors would pay a conference registration fee just to have their paper published, and then never show up at the conference.
It’s also clear from Fung’s presentation that it wasn’t just authors who were problematic. Some of the conferences organized back around 2009 and 2010 were themselves scams. He notes quality issues that included acceptance of papers that were obviously generated by machines, papers that treated subjects wildly outside the scope of the conference, and paper-stuffing behavior where a conference with 100 attendees would publish 1,000 papers in its proceedings.
The process of reviewing questionable papers from that era must have had its humorous moments. Fung notes that some papers had technical content resembling undergraduate lab reports, while others were only a couple of pages long, and many of them suspiciously referenced only publications by the authors themselves.
Fortunately, it looks as though IEEE’s policing has had an impact. The organization rejected 140 conferences in 2011 but only a few dozen last year.
However, the lesson is clear: It’s probably wise to treat newly published technical papers as internet hoaxes until proven otherwise.