Youssef with Curly Hair: Navigating the Ethics of Digital Trauma Archives and AI Imagery
Introduction
Social media platforms have evolved from mere tools of communication into vast repositories of collective memory, especially in times of conflict. Archiving traumatic events on these platforms weaves a complex web of emotional responses, shared experiences, and collective trauma. This essay shows how digital content can encapsulate widespread grief and come to represent collective trauma symbolically.
The Context of Collective Trauma in Digital Archives
Collective trauma is generally defined as the large-scale psychological effect that major events such as wars, natural disasters, and other catastrophes have on whole groups of people. The impact is not confined to individuals; it also disturbs the identity and collective memory of the community while, at the same time, reinforcing internal ties. Committing these experiences to a social media platform gives people an outlet to process their grief collectively and to document events.
In times of conflict, social media becomes a powerful tool for making the victims of atrocity visible and human. According to Kaplan (2018), digital archives of conflict help preserve historical discourse and foster a commiserating public. Yet, as much as these archives conserve harrowing experiences, they also perpetuate trauma by keeping those memories in constant circulation. After all, merely publishing a post on social media archives it in less obvious ways: anything shared can be retrieved at any moment in the future with a few keywords, so every post becomes an archival record by default the instant it is published.
The Case of "Youssef with Curly Hair"
The video (Alaa Shaath | علاء شعث, 2023) that went viral during the most recent Palestinian-Israeli conflict, featuring "Youssef with curly hair," is one of the most striking examples of collective suffering preserved on social media. Viewers were deeply moved by his mother's words, "Youssef has curly hair, white, and beautiful," as she desperately searched for him. The tragedy culminates when Youssef's mother discovers, at the hospital where her husband worked, that Youssef had been killed in the latest Israeli bombing.
In the video, Youssef's mother is seen desperately searching for her son while repeatedly mentioning his curly hair. The phrase, which evokes the innocence and beauty lost in the conflict, channeled viewers' reactions into grief. The phrase and the video were widely shared on social media, amplifying the shock that viewers felt as a group.
The repeated phrase "Youssef has curly hair, white, and beautiful" became a symbol of shared sorrow and collective mourning. This phenomenon illustrates how digital content can transcend individual experiences and become part of a larger, shared narrative of suffering.
Yet here is the real question: was it the fact that this moment was captured and circulated digitally that allowed for such mass collective mourning, or was it the deeply humanizing detail in the mother's words that struck a universal chord? The tension between viral digital circulation and the resonance of a tangible human detail invites reflection on what truly allows certain stories to transcend individual experience and become shared narratives of collective suffering.
Psychological and Sociological Perspectives
Such digital content has a significant psychological impact. Regardless of whether they are directly involved in the conflict, viewers can suffer secondary trauma. According to Morozov (2019), witnessing traumatic events through media can lead to feelings of helplessness, anxiety, and sadness similar to those experienced by direct witnesses. This mode of sharing, in which content is amplified without sufficient context or space for reflection, deepens the psychological toll. Yet while the risk of re-traumatization is real, shared digital content also enables a form of collective coping, because trauma can be processed together.
From a sociological perspective, the Youssef case illustrates how digital archiving can produce a cohesive story. Halbwachs (1992) argues that collective memory is socially constructed and is often maintained through shared narratives and symbols. Here, Youssef's curly hair and the corresponding phrase took on symbolic meaning, strengthening the collective memory of the conflict and its human cost.
Archival Challenges and Ethical Considerations
Though digital archives help record and preserve collective memories, they present many challenges. Continual re-traumatization from exposure to traumatic material on one side, and desensitization on the other, are major hazards. Another ethical matter concerns how people are represented in situations of trauma.
The consent and dignity of the people whose experiences are documented in these archives have to be treated as a priority. The American Psychological Association advises, in cases where an individual's images and stories are shared widely, to "consider the potential long-term psychological effects on the person whose image or story is being disseminated." In Youssef's case, it was therefore incumbent on those sharing the video to narrate his story tactfully, without exploitation or further harm.
While digital archiving provides a powerful site of collective memory and a sense of solidarity around traumatic events, it also raises a major concern about "compassion fatigue" (Moeller, 1999). The term refers to the emotional exhaustion viewers experience when they are constantly exposed to images of human suffering. In this relentless flow, such images can overwhelm viewers and lead to desensitization, making the pain of others seem banal, stripping images of their emotional impact, and undermining any impulse to act.
Images of horror, ubiquitous and often unfiltered on social media, bombard audiences with human suffering in its most raw and unmediated forms. Tester (1997) holds that exposure of this kind produces "over-familiarity" with horrific content: it becomes routine, and the audience's empathetic response is dulled. Through repetition, the images come to feel inevitable, drained of the shock value and urgency they might once have held.
This phenomenon presents serious difficulties for digital archiving, especially when the aim is to build an engaged and empathetic audience. On one hand, such material can re-traumatize victims and those close to them, forcing them to relive their pain. On the other hand, continuous exposure to disturbing content triggers emotional defense mechanisms in viewers far from the scene, numbing them and, over time, hardening them against others' suffering until they are no longer sensitive to the gravity of an event. This is especially serious in the context of conflicts like the Israeli-Palestinian one, with its immense human tragedy and its need for global awareness and action.
Furthermore, the normalization of such images has far-reaching implications for how societies process and respond to collective trauma. As images of violence and loss sear themselves into public consciousness, the events they depict risk being treated as inevitable rather than as crises requiring urgent and sustained intervention. The symbolic power of digital content can indeed be a force for solidarity and collective action, but that same power can be worn down when audiences become habituated to scenes of horror.
This desensitization also complicates the responsibilities of those who archive and share such content. While the intention may well be to bear witness and to preserve these moments for the historical record, there is a hair-thin line between raising awareness and inducing emotional exhaustion through sustained exposure.
The potential for compassion fatigue therefore warrants a more reflective approach to digital archiving, one that honors the need to document and share while also attending to the psychological well-being of the audience.
Compassion fatigue is thus central to the complexity of archiving and sharing content in the age of social media. The "Youssef with Curly Hair" moment shows that the power of digital content to generate collective mourning and solidarity is undeniable. At the same time, the risks of desensitization underline the need for far more care in how traumatic imagery is shared, curated, and archived. It is a delicate balance between documenting these historical events and limiting the psychological toll on the audience. Only then can these archives serve the purpose of their establishment without diminishing the humanity of the people they feature.
AI-Generated Imagery and Ethical Concerns
Generative AI opens new ways for content creation to become an active tool of archiving and memorialization, from the visual to the textual: from video tributes to specific moments of mass violence (Makhortykh, 2019) to digitally created images and drawings that give public presence and visibility to ongoing and past suffering (Lundrigan, 2020). For instance, an AI-generated picture captioned "All eyes on Rafah" was shared over 41 million times on Instagram to rally support for the displaced people besieged and under attack in the area (BBC News Arabic, 2024). From such levels of engagement, one can deduce that AI has the potential to increase public participation and to extend access to knowledge about mass atrocities, helping researchers and the general public connect meaningfully with historical events.
Yet the very technology that can amplify authentic memory practices also raises major ethical risks. Generative AI can be used to distort or deny historical events, including mass atrocities, whether deliberately or accidentally. Deliberate misuse involves producing large volumes of content that distorts facts or denies atrocities, whether by fabricating visual "evidence" or by forging documents that mislead the public. These might be manipulated pictures that look identical to real ones but omit crucial information, or that alter the story entirely by swapping faces or changing key features of the image to confuse the viewer. In such a situation, it may become important to return to the oral testimony of trusted eyewitnesses, since such first-hand accounts provide a grounding truth that digital manipulation cannot easily shake.
Additionally, AI systems are subject to 'hallucinations': gaps in information that the system fills with generated content for which there is no real fact (Beutel, Geerits, & Kielstein, 2023). In the archiving of violence, this could mean fabricated details about victims or perpetrators that then enter the record of how certain events took place. If such fabrications are not brought under control, they will continue to reverberate; AI-generated content thus risks continually distorting historical facts and breeding mistrust of authentic sources of information.
That AI can create content nearly indistinguishable from human-made material raises further questions about the credibility of information related to mass atrocities. Misleading AI-generated content could be mistakenly accepted as fact, furthering false claims and fueling the already serious problems of denial and distortion in historical narratives. Even if platforms and users could recognize and remove extremist or overtly misleading claims, the sheer volume of subtly inauthentic AI content would wear away trust in historical sources over time. In atrocity memorialization, where accuracy and informational integrity are paramount, this erosion of trust is particularly dangerous (Vaccari & Chadwick, 2020).
Beyond these ethical concerns, one can hardly ignore the psychological consequences of AI-created imagery. The widespread use of AI to produce painful or distressing images could leave audiences desensitized and contribute to the compassion fatigue described above. The normalization of AI-generated trauma imagery also makes it less likely to stir meaningful action or empathy. Viewers grow increasingly distrustful, hard-pressed to tell what is real from what is fabricated, which further muddles public discourse around key issues.
Conclusion
The "Youssef with Curly Hair" case exemplifies how digital archives on social media can turn a single story into shared experiences of collective trauma. In allowing for collective grief and helping to maintain collective historical memory, such platforms are essential in showcasing times of tragedy. However, they pose a potential danger of desensitizing the audiences by way of repeated exposure to suffering, which in itself can cause "compassion fatigue."
AI-generated imagery opens a new set of ethical questions. While AI can enhance memorialization, it also risks further distorting historical facts and eroding trust in authentic sources. The likelihood that AI will continue to churn out misleading content, together with the psychological effects of normalizing traumatic imagery, raises questions about how these tools are shaping public empathy and action.
These concerns may lead some to argue that a return to orality, the trusted word of the witness, offers a stronger guarantee of authenticity in memory. In any case, the balancing act that digital archives must master in preserving collective trauma is to document atrocity and violence while protecting the psychological well-being of both the subjects and the audience. Only then can these archives serve their intended purpose without exploiting or diminishing the human experiences they document.
Ultimately, the hand-drawn picture of Youssef with blond, curly hair and wings that accompanies this article (Figure 1 | الرسم 1) is one manifestation of how communities push back against the overwhelming and sometimes impersonal character of digital content. Simple, humanized depictions like this stand out from the sea of digital imagery as poignant reminders of lost innocence and of the human lives behind cold statistics, offering a softer, more intimate way to process grief and remember the victims.
Figure 1. Youssef with Curly Hair (sally_samir_, 2023)
الرسم 1. يوسف شعره كيرلي (سالي سمير 2023)
References
Alaa Shaath | علاء شعث [@3laashaath]. (2023, October 20). يوسف، ٧ سنين، شعره كيرلي أبيضاني وحلو | Youssef, sab’a saneen, sha’ro curly abyadani w helou [Youssef, seven years old, his hair is curly, fair, and beautiful] [Video attached] [Post]. X. https://x.com/3laashaath/status/1715348796534649079?s=48
American Psychological Association. (2017). Ethical principles of psychologists and code of conduct. https://www.apa.org/ethics/code
BBC News عربي [BBC News Arabic]. (2024, May 29). كل العيون على رفح: حملة تضامنية تحظى بتفاعل أكثر من 40 مليون شخص [All eyes on Rafah: A solidarity campaign drawing engagement from more than 40 million people]. https://www.bbc.com/arabic/articles/c511609vx5go
Beutel, G., Geerits, E., & Kielstein, J. T. (2023). Artificial hallucination: GPT on LSD? Critical Care, 27, 1–3. https://doi.org/10.1186/s13054-023-04425-6
Halbwachs, M. (1992). On Collective Memory. University of Chicago Press.
Kaplan, A. (2018). Digital archives and collective memory. New Media & Society, 20(9), 3241–3258.
Lundrigan, M. (2020). #Holocaust #Auschwitz: Performing Holocaust memory on social media. In H. Earl & S. Gigliotti (Eds.), A Companion to the Holocaust (pp. 639–655). Chichester: Wiley.
Makhortykh, M. (2019). Nurturing the pain: Audiovisual tributes to the Holocaust on YouTube. Holocaust Studies, 25, 441–466. https://doi.org/10.1080/17504902.2018.1468667
Moeller, S. D. (1999). Compassion Fatigue: How the Media Sell Disease, Famine, War and Death. New York and London: Routledge.
Morozov, E. (2019). Trauma and media: The psychological impact of viewing traumatic events. Psychology Today, 31(3), 45–56.
sally_samir_. (2023, October 26). اسمه يوسف..شعره كيرلي،وأبيضاني وحلو..! #ليسوا_أرقامًا | Esmo Youssef.. sha'ro curly, w abyadani w helou..! #laysou_arqaman [His name is Youssef.. his hair is curly, fair, and beautiful..! #TheyAreNotNumbers] [Image attached] [Post]. Instagram. https://www.instagram.com/p/Cy3c6PWNYz6/
Tester, K. (1997). Moral Culture. London, Thousand Oaks, California: Sage.
Vaccari, C., & Chadwick, A. (2020). Deepfakes and disinformation: Exploring the impact of synthetic political video on deception, uncertainty, and trust in news. Social Media + Society, 6(1), 1–13. https://doi.org/10.1177/2056305120903408