Thorny Roses: Investigating the Dual Use Dilemma in Natural Language Processing

04/17/2023
by Lucie-Aimée Kaffee, et al.

Dual use, the intentional, harmful reuse of technology and scientific artefacts, is a problem yet to be well-defined within the context of Natural Language Processing (NLP). As NLP technologies advance and become increasingly widespread in society, their inner workings have grown ever more opaque. Understanding dual-use concerns and potential ways of limiting them is therefore critical to minimising the potential harms of research and development. In this paper, we conduct a survey of NLP researchers and practitioners to understand the depth of the problem, their perspectives on it, and the support currently available to them. Based on the results of our survey, we offer a definition of dual use tailored to the needs of the NLP community. The survey revealed that a majority of researchers are concerned about the potential dual use of their research but take only limited action to mitigate it. In light of the survey results, we discuss the current state of, and potential means for, mitigating dual use in NLP, and propose a checklist that can be integrated into existing conference ethics frameworks, e.g., the ACL ethics checklist.

research · 08/31/2022
Efficient Methods for Natural Language Processing: A Survey
Getting the most out of limited resources allows advances in natural lan...

research · 11/10/2019
Not All Claims are Created Equal: Choosing the Right Approach to Assess Your Hypotheses
Empirical research in Natural Language Processing (NLP) has adopted a na...

research · 05/03/2022
Meta Learning for Natural Language Processing: A Survey
Deep learning has been the mainstream technique in natural language proc...

research · 08/26/2022
What Do NLP Researchers Believe? Results of the NLP Community Metasurvey
We present the results of the NLP Community Metasurvey. Run from May to ...

research · 08/27/2023
Examining User-Friendly and Open-Sourced Large GPT Models: A Survey on Language, Multimodal, and Scientific GPT Models
Generative pre-trained transformer (GPT) models have revolutionized the ...

research · 08/28/2016
What to do about non-standard (or non-canonical) language in NLP
Real world data differs radically from the benchmark corpora we use in n...

research · 06/22/2022
Enhancing Networking Cipher Algorithms with Natural Language
This work provides a survey of several networking cipher algorithms and ...
