AI, Ethics and CSR: Rebuilding Human Connection in the Age of Automation
As AI reshapes marketing and call centers, CSR must evolve toward relational sustainability: protecting empathy, trust, and meaning in automated workplaces.
SOCIAL MONITORING · MANAGEMENT
LYDIE GOYENETCHE
11/3/2025 · 8 min read


Across industries, a silent transformation is reshaping the modern workplace.
From Silicon Valley to Paris and Madrid, customer service departments and call centers are replacing human voices with conversational AI systems designed to process thousands of interactions simultaneously. According to Goldman Sachs, nearly 300 million jobs worldwide could be partially or fully automated by 2030, with the service sector—particularly sales and customer support—at the forefront of this shift. In France, an OECD report (2024) estimates that 27% of jobs are highly exposed to automation, while another 36% will see their core tasks deeply transformed by AI. In call centers, productivity gains can reach up to 60%, but the human cost remains largely invisible.
These numbers reveal a deeper logic—the systematic optimization of relational labor.
A telemarketing company that replaces fifty employees with a Mistral-powered chatbot may cut operational costs by 70%, yet in doing so, it also dismantles part of what once made conversation meaningful: reciprocity, tone, hesitation, and empathy. What was once a human encounter becomes a data flow. Unsurprisingly, indicators of client satisfaction often decline, employee burnout increases, and long-term loyalty erodes. Gallup research has found that only 21% of employees worldwide feel engaged in their work, a sobering backdrop to an era of hyper-automation.
This shift is not only economic; it questions the very nature of intelligence within organizations.
When algorithms anticipate needs, write prospecting emails, or handle customer interactions, the line between technological assistance and cognitive substitution begins to blur. Yet, as developmental psychologists such as Donald Winnicott and Sylviane Giampino remind us, human intelligence does not emerge from performance but from emotional security. It thrives in environments where trust and recognition precede efficiency. To replace these conditions with automated protocols is to risk undermining the symbolic structure of work itself.
In this context, corporate social responsibility (CSR) can no longer be limited to offsetting job losses or publishing carbon-neutral reports. It must expand to include what we might call psychological and relational sustainability—the responsibility to maintain emotional and cognitive safety within workplaces increasingly mediated by machines. An organization that automates its interactions without reflecting on their human implications may end up destroying its most valuable asset: the collective capacity to connect, imagine, and care.
The challenge, therefore, is not merely economic but civilizational.
Between the offices of Paris and San Francisco, where models like Mistral and ChatGPT are refined, and the open spaces of Lyon or Madrid, where their impact is already tangible, one urgent question emerges: what place will business grant to human fragility in an algorithmic world obsessed with performance? The CSR of tomorrow must be written within this tension, between technology and care, optimization and attention, algorithmic intelligence and relational wisdom. For only by reconciling these dimensions can companies ensure that intelligence, whether human or artificial, remains at the service of connection, not its disappearance.
AI Ethics and Business
Thinking as Containment: Winnicott and the Need for Emotional Security
To think is first to be held. Before it becomes a cognitive process, intelligence is an act of trust — a movement toward the other and toward the world. Donald Winnicott, the British pediatrician and psychoanalyst, described this dynamic through the concept of holding: the psychological function of containment provided by the caregiver that allows the infant to feel secure enough to explore. In organizational life, this “holding environment” becomes the emotional and symbolic framework that allows employees to think, take risks, and innovate. A team that feels supported — psychically “held” — develops adaptive and creative intelligence. Conversely, an anxious or fragmented environment generates rigidity, fear of error, and disengagement.
This insight has become urgent in a world where the automation of communication is accelerating.
Call centers that once relied on human dialogue now depend on conversational bots that respond in milliseconds. In digital marketing, automated email workflows target prospects with behavioral precision, but without emotional resonance. When communication becomes a mechanical sequence rather than a space of encounter, the psychological holding environment collapses. Employees and customers alike begin to feel reduced to data points in a predictive system rather than participants in a shared meaning.
The Cognitive Ecology of Safety: Giampino and the Conditions for Thought
French psychologist Sylviane Giampino extends Winnicott’s theory into the modern social and educational landscape. Her research demonstrates that intellectual development thrives in emotional security — when the need for safety is acknowledged and met. Transposed to organizations, this means that true innovation depends not on constant optimization but on the quality of psychological climate.
A company obsessed with metrics, automation, and efficiency may increase output, yet it risks eroding the invisible ecosystem that sustains human intelligence: belonging, trust, and symbolic recognition. In sales and marketing, this erosion manifests subtly, in the disappearance of personalized outreach, the cold precision of automated campaigns, and the loss of genuine listening that once defined client relations. Giampino’s findings remind us that without relational safety, language itself becomes impoverished. Words lose their weight, and communication turns transactional rather than transformative.
The Skin of Thought: Anzieu and the Symbolic Envelope of Work
Didier Anzieu’s concept of the Skin-Ego deepens this understanding. He proposed that the human psyche is organized like a skin — an envelope that both protects and connects. Physical touch, emotional recognition, and early sensory experiences shape our ability to contain emotions and construct meaning. In the workplace, this psychic “skin” translates into a sense of symbolic cohesion — rituals, shared narratives, and moments of collective presence that give texture to organizational life.
When a company replaces its relational interfaces with automation — when emails are generated by templates and customer responses are parsed by algorithms — it risks thinning this symbolic skin. The collective loses its sense of warmth and continuity. The organization becomes porous, overexposed, and emotionally uncontained.
From Optimization to Care: A New Ethic of Responsibility
Through Winnicott’s holding, Giampino’s emotional ecology, and Anzieu’s Skin-Ego, a common truth emerges: human intelligence is relational before it is computational. It does not arise from data but from connection. If companies wish to integrate AI ethically into their operations, they must first rebuild this invisible framework of trust — a culture that supports thought before measuring it, that protects before it performs.
Without such a framework, the automation of communication becomes the automation of disconnection. And in that silence — between an unanswered call and an algorithmic reply — something essential is lost: the shared humanity that once made both work and marketing acts of presence rather than systems of prediction.
AI or the Illusion of Optimization?
The Illusion of Performance: When Rationalization Becomes Dehumanization
Across industries, automation has quietly redrawn the map of customer relations.
In call centers from Bangalore to Chicago, AI-driven chatbots now handle up to 80% of customer inquiries, according to Deloitte (2024). A McKinsey report estimates that 45% of customer service tasks can be fully automated, allowing some companies to cut operational costs by 60–70% while maintaining round-the-clock responsiveness.
But beneath the surface of these performance metrics lies a silent erosion: the disappearance of human reciprocity. What once carried hesitation, warmth, or empathy is now replaced by precision, speed, and predictability. Employees who once listened now monitor dashboards; conversations have become metrics. Efficiency increases—but the symbolic fabric of dialogue unravels.
In the U.S. alone, the contact center workforce declined by nearly 25% between 2019 and 2024, as firms reallocated budgets toward AI infrastructure. Yet surveys from Gartner reveal that customer satisfaction drops by 18% when users interact primarily with automated systems, despite faster response times. This paradox defines the new economy: streamlined communication, but an impoverished relationship.
The Hollowing of Language: Emotional Standardization in Digital Marketing
The same pattern unfolds in web marketing.
AI-powered systems like HubSpot, Brevo, or Salesforce Einstein now automate over 70% of outbound email campaigns, generating “personalized” content through behavior-based triggers. Conversion rates rise modestly, but depth and nuance decline. The language of marketing has become syntactically rich but semantically poor—efficient, optimized, emotionally neutral.
This shift transforms empathy into simulation. What appears as personalization is often just algorithmic mimicry: variables standing in for genuine understanding. In B2B and B2C alike, clients sense this artificiality. A 2023 Accenture survey found that 62% of consumers prefer slower, human-led communication when making complex purchasing decisions, even if automated options are available. In the quest for frictionless efficiency, companies risk losing the emotional friction that once generated trust.
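To see how thin this “personalization” can be, consider a minimal sketch of a behavior-triggered email of the kind such platforms automate. The template, field names, and data below are illustrative assumptions, not any vendor’s actual API; the point is that the machinery is string substitution, not understanding.

```python
# A minimal sketch (hypothetical names, not a real vendor API) of
# behavior-triggered "personalization": a template plus merge variables.

TEMPLATE = (
    "Hi {first_name}, we noticed you viewed {product} "
    "{days_since_visit} days ago. Ready to take the next step?"
)

def personalize(contact: dict) -> str:
    """Fill merge fields from tracked behavior. No understanding is
    involved: every contact with the same variables gets the same text."""
    return TEMPLATE.format(
        first_name=contact["first_name"],
        product=contact["last_viewed_product"],
        days_since_visit=contact["days_since_visit"],
    )

if __name__ == "__main__":
    print(personalize({
        "first_name": "Ana",
        "last_viewed_product": "CRM Pro",
        "days_since_visit": 3,
    }))
```

The “empathy” here lives entirely in the copywriter’s template; the system merely substitutes variables, which is precisely the algorithmic mimicry described above.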
The Collapse of the Organizational “Holding Environment”
This technological rationalization doesn’t just affect clients—it reshapes the inner life of organizations.
Winnicott’s concept of holding, the safe psychological space that allows thinking to emerge, has no equivalent in the algorithmic enterprise. In workplaces dominated by KPIs, automation dashboards, and predictive analytics, employees report rising anxiety and emotional fatigue. A 2024 Gallup global study revealed that 59% of workers feel disconnected from their company’s mission, a trend that has deepened alongside the expansion of automation in administrative and communication tasks.
When dialogue is replaced by scripts, and empathy by data, the organization ceases to contain. Employees no longer feel seen or supported. Cognitive engagement drops, creativity stagnates, and collaboration gives way to silent compliance.
Giampino’s insight becomes prophetic here: intelligence does not thrive under pressure but within affective security. Without it, organizations may become operationally efficient yet psychically hollow—machines that think without feeling.
Toward a CSR of Relationship: Restoring the Symbolic Dimension of Work
In this landscape, Corporate Social Responsibility (CSR) must evolve.
Beyond carbon footprints and ethical sourcing, CSR must now embrace what we might call relational sustainability: the preservation of human presence within automated systems. Companies that invest solely in optimization risk eroding their social and symbolic capital—the trust that binds teams, clients, and society together.
Reintroducing slowness into customer relations, valuing genuine conversation, and protecting employees from algorithmic exhaustion are no longer luxuries; they are strategic imperatives. Studies by MIT Sloan (2024) show that companies integrating human-led communication alongside automation enjoy 23% higher long-term customer retention than those relying solely on AI channels.
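In practice, “human-led communication alongside automation” often comes down to something as unglamorous as a routing rule. The sketch below is a deliberately simple illustration, with hypothetical intents, thresholds, and field names, of how a contact center might reserve automation for routine requests and escalate fragile conversations to a person.

```python
# A minimal sketch, under assumed thresholds, of hybrid routing:
# bots handle routine requests; complex or emotionally loaded
# conversations go to a human. All names and cutoffs are illustrative.

ROUTINE_INTENTS = {"track_order", "reset_password", "opening_hours"}

def route(intent: str, sentiment: float, prior_contacts: int) -> str:
    """Return 'bot' or 'human' for an incoming customer query.

    sentiment: -1.0 (very negative) to 1.0 (very positive), from any classifier.
    prior_contacts: how often the customer already reached out on this issue.
    """
    if intent not in ROUTINE_INTENTS:
        return "human"   # unfamiliar or complex request
    if sentiment < -0.3:
        return "human"   # frustration deserves a person, not a script
    if prior_contacts >= 2:
        return "human"   # repeated contact signals an unresolved problem
    return "bot"

# Example: a third follow-up about a late order reaches a person.
print(route("track_order", sentiment=-0.5, prior_contacts=3))  # -> "human"
```

The design choice matters more than the code: the rule encodes an organizational commitment that distress and persistence are met with presence rather than a script.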
True innovation, then, lies not in the machine’s ability to simulate speech, but in an organization’s courage to preserve meaning, presence, and empathy amid the digital noise.
AI can assist thought—but if it replaces it, the human capacity to symbolize, connect, and care begins to fade. And with it, the possibility of building workplaces that are not only intelligent, but truly alive.
Conclusion
In an age defined by automation, the challenge facing organizations is no longer simply technological — it is existential. The promise of artificial intelligence lies in its capacity to extend human potential, not erase it. Yet the current trajectory of automation in customer relations, marketing, and management reveals a drift toward disembodied efficiency, where connection is sacrificed for speed and symbolic depth gives way to statistical accuracy.
The figures speak for themselves. Global AI adoption in enterprises surpassed 55% in 2024 (PwC), and the generative AI market is projected to exceed $1.3 trillion by 2032. But alongside this rapid expansion, employee disengagement has reached its highest level in a decade, and the proportion of consumers who perceive brands as “authentic” has fallen below 30% (Edelman Trust Barometer, 2024). These trends suggest that what is being optimized operationally may be impoverished relationally.
Reclaiming balance requires a paradigm shift — from a model of automation for productivity to one of augmentation for presence. This is not nostalgia for pre-digital communication; it is an ethical and strategic necessity.
A truly responsible company will not seek to simulate empathy through algorithms, but to anchor technology in the living experience of care. It will use AI to lighten cognitive overload, not to erase human dialogue; to enhance accessibility, not to suppress diversity of voice; to facilitate learning, not to replace the slow maturation of thought.
This reorientation invites a new frontier for Corporate Social Responsibility — one that includes psychological sustainability. Just as environmental CSR seeks to protect ecosystems from overexploitation, relational CSR must protect human ecosystems from cognitive exhaustion and emotional fragmentation. The sustainability of tomorrow depends as much on empathy as on energy efficiency.
If Mistral and ChatGPT have taught us anything, it is that intelligence, stripped of its relational roots, becomes sterile. True thought — like true innovation — begins not in data centers, but in trust, recognition, and the fragile dance of human presence. The ethical question is no longer whether machines can think, but whether humans will continue to feel.
To preserve that capacity is not to resist progress, but to guide it — toward a future where technology amplifies what makes us most human: our ability to connect, to imagine, and to care.

