Darknet
We will help you deal with data leaks of yours emerging on the Darknet.

Between research and practice - how we shape digital change in a human way through dialogue.
Digital transformation. Artificial intelligence. Cybersecurity. Industry 4.0. Hardly a day goes by without new buzzwords, new tools, new risks – and new promises. In the daily confrontation with new technologies, whether in the public debate, in companies or in everyday private life, one thing becomes particularly clear: There is rarely a lack of innovation, but often a lack of orientation. While artificial intelligence, automation and digital systems are developing rapidly, many people remain unclear about what these developments mean for them in concrete terms – professionally, socially, but also emotionally.
We have set ourselves precisely this task: Research with substance, but without an ivory tower. For over 20 years, our Corporate Research Team has been driving forward future-oriented topics, far beyond purely technical perspectives. Our specialty: Thinking outside the box and talking at eye level. Because technology affects us all. So everyone should be able to have their say.
This is where I come into play: In my work as a psychologist and senior researcher in the Research team, which has been at the forefront of cybersecurity research for over two decades, it is precisely this interface between human and machine that concerns me. More precisely: The relationship between the two. Because as much as we talk about technical potential, risks, advances and new areas of application, the central question remains unanswered: How is technology changing us – as individuals, as organizations, as a society?
For more than eight years, my research has focused on the psychological dimensions of artificial intelligence, in particular communicative AI: how people build trust in machines, how this trust influences the way AI is used, and what consequences arise from this. Recently, this work has concentrated especially on the field of mental health, an area in which physical, psychological and technological safety are extremely critical.

Trust is not a simple state, but a complex interplay of expectation, experience, interaction and context. It becomes particularly exciting when people begin to see artificial systems not just as tools, but as communicative, sometimes even social actors. In projects such as the Artificial Intelligence Quotient (AIQ), in which we have investigated the cognitive performance of chatbots, or in current studies on so-called synthetic relationships, we are addressing precisely these questions: How does an artificial character act in a conversation? What expectations does it raise? And what happens when people start to enter into relationships with it – whether in the workplace, in care, in education or in the private sphere? How does this change the way we move in cyberspace – and in the interpersonal space?

The theory that technologies are becoming increasingly emotionally connoted – for example, when voice assistants are given a name, a face or respond in a way that suggests empathy – is not new. However, it is only in recent years that the psychological and ethical questions associated with this have become clear. What does it mean, for example, when a single person finds a conversation partner in an AI character who is available day and night, never contradicts them and seems to pay perfect attention? What responsibility do developers bear in such scenarios – and how do we deal socially with this shift in traditional forms of relationships?
Our work is not about presenting simple solutions. Rather, we try to open up spaces for dialogue and reflection. Technology is not a law of nature, but can be shaped. And design begins with language, with the metaphors we choose, with the images we use, with the values we convey. That’s why we also work with creative formats: In workshops, for example, we let participants build a simple companion bot or simulate relationship dialogues with AI avatars in order to make central concepts such as autonomy, attachment or control not only theoretical but also emotionally tangible.
In recent years, the demand for our knowledge has grown steadily, whether as a source of inspiration for strategic decision-making, as a critical voice in public debates or as a sparring partner in innovation processes. There is a particular demand for private learning formats that not only inform, but also inspire further thought, open up perspectives and enable emotional and intellectual engagement with technology in a protected space.
Keynotes and presentations form a central part of our work. Whether at major business forums, industry-specific events, internal management conferences or inspiring diversity events, our contributions are designed to convey scientifically sound content in an understandable and relevant way. It is not only about presenting current research, but also about activating new thought processes in organizations that are looking for orientation.

A particular milestone was our appearance at TEDx, which not only provided us with an international stage, but also gave us lasting visibility online. This presence shows that our topics are in tune with the times and that our perspectives have an impact beyond academia and shape social discussions.
Editorial teams also regularly draw on our expertise in documentaries and media reports. Swiss television in particular has repeatedly incorporated our research, for example in programs on the darknet, artificial intelligence or digital twins. This collaboration makes it possible to bring complex psychological and technological topics to a broader public – in an accessible and differentiated way.
Podcasts have established themselves as another important medium for addressing specific target groups directly. Whether in an international context or locally anchored – we regularly take part in discussion formats in order to classify current trends, discuss controversial issues and convey new perspectives. In doing so, we always focus on the respective audience: A business podcast requires different examples than an educational channel – but the standard remains the same.
Workshops that go beyond the mere transfer of knowledge are particularly effective and popular. While keynotes inspire reflection, workshops enable real experience. In a typical setting, participants build a simple companion bot, for example, and experience directly how it feels to engage with an artificial actor. This experience often changes their own attitude: What previously seemed abstract suddenly becomes tangible. This creates fertile ground for further discussions about the design, limits and potential of such systems.

Finally, guest lectures at universities are an important part of our work. We value the opportunity to engage in dialogue with the next generation of decision-makers, designers and researchers. The exchange with students not only opens up fresh perspectives, but also keeps our research alive where it should have a long-term impact – in the minds that shape our future.

Back to our core business: a psychological perspective is crucial, especially in the world of cybersecurity, which is often thought of in purely technical terms. Security does not start with the system, but with people – with their expectations, their behavior, their perception of risk and trust. If you understand how people make decisions and deal with uncertainty, you can develop systems that are not only secure, but also ethically responsible and user-centered.
We believe that scientific findings only have an impact if they are communicated in an understandable, relevant and practical way. Our values are reflected in everything we do: In research, in consulting, in lectures, workshops and the daily exchange with our customers. We are not interested in buzzwords or technological overpowering, but in genuine discussion at eye level – the actual meaning of dialogue.
What we stand for:

This attitude characterizes our daily work and is the reason why many companies, organizations, media and educational institutions approach us time and again.
At scip AG, from the Red Team to the Blue Team to the Titanium Research Team, we do not see our role as experts who provide ready-made answers, but as sparring partners for companies, organizations and interested parties who want to embark on a journey to better understand technology and make it more meaningful. If you would like to find out more about our philosophy, we invite you to read our articles in the media, podcasts or specialist publications, or to talk to us directly. Because trust, whether between people or between people and AI, is not created by technology alone. It is created through dialogue. And that’s exactly what we want to have with you.
Our experts will get in contact with you!
Marisa Tschopp