The Murky World of ‘Extremism’ Research

‘Extremism experts’ are playing with a heady concoction of colonial thought and unethical practice that endangers the subjects, the researcher and the field.

 

As an academic exploring ‘extremism’ and counter-extremism, research ethics are a constant feature of my work. Having spent years conducting interviews and ethnographies with groups classified as ‘extremist’, such ethical discussions represent a good way of approaching sometimes daunting fieldwork.

 

As such, I was interested to read Julia Ebner’s recent accounts of her research into far-right groups. Ebner, as an ‘extremism expert’, detailed her time ‘infiltrating far-right networks’ by juggling five different identities, as well as describing the ‘adrenaline rush’ she felt when conducting undercover work.

 

Ebner cast an archetypal image of a researcher in this field – of a hero against the bad guys, conducting daring subterfuge to bring back Promethean knowledge for the good of our security. The interviewers painted her as not only noble in intent, but lucky enough to experience a ‘thrill in pulling the wool over people’s eyes and potentially putting herself in danger’.

 

But, whilst such daredevil, devil-may-care framings may titillate those wanting to understand ‘extremism’, they sit uneasily with those familiar with academic practice. These interviews highlight some of the serious problems that have plagued research into ‘extremism’: the sensationalism, the bypassing of research ethics, the drawing of dubious conclusions, and the use of securitised narratives. Whilst Ebner and other ‘extremism experts’ may largely operate outside of academia, their work has significant implications for scholars operating in similar fields. If research into such groups is to have any kind of credibility, it must be grounded in practice that places academic ethics at its heart.

 

Academic Ethics

Academic ethics bind researchers to a set of practices designed to ensure safety, good conduct and the potential for replication and validation of findings. They are often messy, confusing and irritating to navigate for the researcher. But, for better or for worse, they offer a guide which comes into its own in discussions on ‘extremism’ and ‘terrorism’.

 

The field of studies into extremism and terrorism has been consistently dogged by accusations of intellectual poverty, a lack of adequate research and significant methodological shortcomings. Researchers have often used closed or classified sources that are impossible to replicate, place within time or location, in a practice which ‘casts a dark cloud over the entire terrorism studies community’. In a field which ‘attracts phoneys and amateurs as a candle attracts moths’, ethics procedures offer an interesting, though sometimes flawed, riposte (Ranstorp 2009: 26, 28-30).

Trust

Ebner uses deception as a means of gathering information and ethnographic data. When considering research into difficult groups, it may seem like a sensible option to use such covert techniques. How else, surely, would you get potentially difficult actors to engage? Yet, for academic researchers, such work should be almost entirely off-limits.

 

All academic research is subject to a practice referred to as the Participant Protection Model (PPM), in which respect for those being studied is sacrosanct. Any research, no matter its subject, should give the participant the knowledge that they are part of a study. This comes from the Nuremberg Code and subsequent Helsinki Declaration, which sought to prevent the abuses of previous projects – Tuskegee, Milgram, Zimbardo and Laud Humphreys – being repeated. In doing so, it put the wellbeing of the participant above all else, and enshrined the maxim of, first and foremost, do no harm.

 

By withholding from a person the knowledge that they are being studied, we undermine the delicate researcher-researched notion of trust. This might not seem like an issue when working with groups that have unpleasant or dangerous views. But, if we ignore certain safeguards for some groups, then we allow the removal of safeguards for others and become the arbiter of who deserves ethics and who does not. Ethics must apply to us all equally or they cease to be meaningful.

 

But trust also allows for the verification of research and the continuation of future study. By engaging with so-called ‘extremist’ groups openly as a researcher, I have been able to develop a working relationship whereby I can build further networks, follow up on issues missed and verify data from past interviews.

 

As an example, I was once faced with two separate interviewees detailing contradictory timelines of events. Armed with conflicting data and able to go back and check with both individuals, I worked with the interviewees to piece together events and correct earlier misinformation. This not only led to a far more accurate output, it also revealed several relevant reasons why such information had been misremembered.

 

This practice, of course, carries its own risks – interviewees might decide to withdraw their consent after reading my interpretation – but the reward of checking the accuracy of work with the subject far outweighs them.

 

We also have a duty of care to future researchers. The research process does not stop at the end of our own projects: just as we stand on the shoulders of giants by drawing on their work, so later scholars will build on ours. ‘Burning’ participants through deception not only makes your own work largely unverifiable, it also makes the field more difficult – why would a group engage with a researcher if they had previously been deceived by another? Rigorous academic work is stifled, and the whole field suffers.

 

Often, one of the strongest hands you can play in research is openness and a willingness to learn. By approaching clandestine groups with transparency, I found that not only was I often given a positive response, but the approach would sometimes yield engagement well beyond my initial request. Honesty is appreciated everywhere, and the vulnerability this involves is often recognised and rewarded.

Thrills

Another problem raised by Ebner’s interviews is the perceived ‘thrill’ of undercover work. Whilst this might sell newspapers and books, it causes problems for the field. Discussions on ‘extremism’ can carry the stench of neo-colonialism, replicating Islamophobic tropes and supporting oppressive security structures. In a field which is already guilty of drawing heavily on racist language, the researcher becomes complicit, their own thrills equivalent to ‘the satisfaction of the Great White Hunter who has bravely risked the perils of the [urban] jungle to bring back an exotic specimen’ (Lee 1995: 76).

 

In contrast, most academics will openly admit that research is largely incredibly dull – even in trendier subjects such as ‘extremism’. As part of my doctoral research, I spent many months away from my home and my loved ones, in cities I barely knew, where my only sustained human contact was with my research subjects. Many days were spent trudging from one event to another, in the sometimes-futile hope that I might gain a contact or interview.

 

Even when I succeeded in securing an interview, the result might turn out to be a dud, the discussion disappointing or useless in substance. For instance, some individuals who seemed important to the project would refuse to give much away or engage fully during an interview. Whilst this was initially dispiriting, the dullest of interviews could sometimes be more informative because of what they didn’t discuss, or else create new leads and connections later on. Encouraging the expectation of excitement risks missing vital information and nuances that can later be critical to accurate findings.

Power

In choosing to trick the research subject and flippantly bypass research ethics, we expose ourselves and the research subject to danger. We open the engagement up to risks from counter-terror legislation, racism or violence – potentially enabling their (or our) arrest.

 

‘Extremism’ is an incredibly politically charged and value-laden term. Who or what is extreme is defined entirely by those doing the labelling (McNeil-Willson et al. 2019). By treating ‘extremists’ as subject to different ethics than others, we become unwitting components of a governmental security apparatus that has problematised all manner of groups and activists under its security lens. An academic’s aims are therefore not the same as those of the state – the very notion of social science is the critical probing of power. To unquestioningly support state authorities in social science research is, almost by definition, a contradiction.

 

Furthermore, to conduct security work for the state under the guise of academic research exposes other researchers to danger. The death of Giulio Regeni and the current imprisonment of Patrick George Zaky in Egypt; the arrest of Matthew Hedges in the UAE; the imprisonment of Fariba Adelkhah, Roland Marchal, Kameel Ahmady and Kylie Moore-Gilbert in Iran – all were justified because the line between academic and state was seen by authorities as blurred. It is for the safety of ourselves and others that we must resist becoming agents of the state or complicit in its agenda.

 

This is even more urgent in the current political context. Moves by the UK Government to play towards a hard-right populist base – from the deportation of members of the Windrush Generation to the stripping of citizenship – have created a ‘hostile environment’ towards human rights and cemented a hierarchy of rights based on skin colour. As such, researchers from minority backgrounds or from outside the UK are placed in precarious and dangerous positions whereby conducting research into contentious issues – or even simply conducting research outside the UK – may result in their loss of rights. In an international environment where governments are attacking both academia and minorities, it is imperative that all researchers are aware of the implications of their work on those with less privilege, and their positionality within it.

Using Ethics

This is not to say that ethics procedures should limit or control the topics of research. Ethics processes are there to find answers for how to safely conduct research, not to police our activities. Ethics boards are imperfect – a messy process of negotiation, of give and take, creating organic consideration of what could go wrong for the research subject and the researcher themselves.

 

Nor is this an attack on Ebner. Her work here is emblematic of wider problems in the field of study. While ‘extremism experts’ operate within security-focussed think tanks, their work sits uneasily on the edge of academic research and impacts directly on our ability to engage with difficult actors and ask necessary questions.

 

Nor is all undercover work necessarily wrong. Important undercover work is and has been conducted by journalists and activist groups such as Hope Not Hate. But their work grows out of anti-racist activism and is credible because it is not implicated in the securitised ‘counter-extremism industry’ that is so often guilty of replicating structural racism.

 

The freedom of academics is undoubtedly degraded by the image of a duplicitous researcher. Such actions sustain criticism of the field as methodologically suspect, securitised, implicated in government agendas and lacking adequate oversight. Good research practices and ethics procedures therefore aren’t here to stop research – they are here to prevent the dangerous excesses of the security state, to keep us safe and to enable important research to continue.

 

We can and must change the current state of play. The tricking of research subjects into giving information – no matter their views – must end, along with the uncritical amplification of problematic government counterterror narratives. Current ‘extremism experts’ are playing with a heady concoction of colonial thought and unethical practice. But by participating in the building of research ethics and by drawing on good ethnographic practice and critical understandings of security, we can support all of academia in asking some of the most crucial questions facing society.

 

~

 

Issues in this article are raised in the following chapter: McNeil-Willson, R. (due 2020). ‘Here Be Dragons: Navigating the Problems of Researching “Terrorism” and Critical Terrorism Studies’, in Charles, L., Pappe, I. & Ronchi, M., Researching the Middle East: Cultural, Conceptual, Theoretical and Practical Issues, Edinburgh: Edinburgh University Press. 

Dr Richard McNeil-Willson

Dr Richard McNeil-Willson is a Research Associate at the Robert Schuman Centre for Advanced Studies, European University Institute, Florence. He specialises in discussions over ‘extremism’ and counter-extremism in Europe and is the main researcher for the European Commission BRaVE project. @mcneilwillson
