This is to report on a couple of recent publications relevant to IFIP’s focus on “information processing”. The first is a set of recommendations by the World Association of Medical Editors (WAME) on “Chatbots and Generative Artificial Intelligence in Relation to Scholarly Publications” (https://wame.org/page3.php?id=106).
This short list of recommendations (starting with “Chatbots cannot be authors”) is an early attempt to rein in the understandable enthusiasm chatbots have caused in the world of academic publishing. The non-intentional language of chatbots, or “botfo”, needs to be handled with care: as a recent paper put it, chatbots “are unaccountable, and can’t think, judge or be jailed.”
The second is an editorial on nuclear risk published in over 130 journals in the week of 19 May. The nuclear age is continuing apace, increasingly with the help of digital technology and now artificial intelligence, and in the health sector we have looked to the World Health Organization to review and report on the health aspects of nuclear weapons. It is often forgotten that the health consequences of nuclear weapons are not only the obvious ones that occur when a weapon is actually exploded, but also the significant harms arising from the development, storage, testing and upgrading of weapons.
The editorial, a product of the Nobel Prize-winning International Physicians for the Prevention of Nuclear War and leading journal editors, calls on WHO to renew its mandate to report on nuclear weapons and health. WHO’s previous mandate lapsed in 2020, and it hasn’t produced a report on the subject since 1987.
An Open Access version of the editorial is freely available at the BMJ: https://www.bmj.com/content/389/bmj.r881.
Chris Zielinski
WG 9.2 on Social Accountability