House for Journalism and the Public Sphere

“We talk about AI as if it were a divine gift”

Everyone should “speak fluent tech” in order to understand what companies do with our data. Tactical Tech teaches this digital fluency worldwide. The founding duo Stephanie Hankey and Marek Tuszynski explain how this works. Maria Exner, director of Publix, conducted the interview.

Tactical Tech, a Publix resident, develops successful workshops and exhibitions to educate people about digital media and technology. You can experience their work at the Open House on June 29.

Not a day goes by without headlines about AI companies, about Elon Musk, the owner of the short-form messaging platform X, or about Meta CEO Mark Zuckerberg. Does that mean the public is well informed about what unfolds in the digital sphere?

Stephanie Hankey: Everyone knows a bit. Everybody has a different piece of the puzzle. Some people are very knowledgeable about the apps they use every day but know little about anything else. We try to help people fill in the rest of the picture themselves by providing spaces where they can pose questions they have not had the chance to explore before.

Tactical Tech has been doing this work worldwide for over 20 years. Since you started, the amount of time we spend on devices, software and digital media has increased significantly. Given this context, are the educational programs that are available sufficient?

Hankey: Perhaps let’s take a step back: Why do we need digital fluency programs to begin with? You might say we need other things to increase the understanding of technology in politics: we need to work on policy, or we need to invest in innovative solutions that pose fewer risks for users. However, the technology is changing at a much faster pace than the policy. We cannot wait for people in decision-making positions to change the system. We need to make sure that people aren’t just confronted with these problems but have the tools to navigate them and engage in the conversation. People cannot comment on something they don’t understand.

If that is the goal, how close are we to it? How good is the digital literacy of the populace?

Marek Tuszynski: The concept of digital literacy rests on the assumption that certain demographic groups need to become ‘literate,’ as if they were illiterate before. We believe that everyone who uses technology, whether professionally or privately, is familiar with certain aspects and less so with others. How ‘native’ does someone feel in digital spaces and in dealing with digital media? How well versed are they in technological, political and economic aspects of the technology they use? That is what we are interested in.

What does a higher level of understanding and knowledge protect against? What potential dangers threaten those who do not yet ‘speak tech’ fluently?

Tuszynski: One risk is the erosion of privacy and, with it, one’s own security. For some people, further questions come into play: how much do they want to disclose, and whom do they trust? You have to ask: What happens to my data? Who makes money from it afterwards? And what is the balance between the risks I take and the benefits I derive from them – compared to the benefits others derive from my risks? In practical terms, it’s important to know what you’re using and why.

How can this be achieved?

Hankey: The problem is the sheer volume of people who need critical media literacy. Addressing a few thousand people is not enough. We need to reach millions. We will only make real systemic progress when we integrate these topics into school curricula and find ways to reach older people at a structural level. We can look to the existing social infrastructure in Europe and Germany, which is hugely important – libraries, museums and schools, for example. One way to reach people en masse would be to use social media to educate people about social media. But you’d have to enforce this, or it would require a lot of investment if it were to be done independently.

Individual Tactical Tech exhibitions have reached thousands of people in the past. You now work a lot with partners such as libraries. How does that work?

Tuszynski: Our team at Tactical Tech consists of around 20 people here in Berlin. That is not a lot. So we work with existing institutions that are having to reinvent themselves anyway. They don’t know how the digital sphere works – but their audience expects them to know. These are, for example, libraries, schools, museums, community centers, associations and clubs. They may have few resources, but they exist and will continue to exist. We help them to become agents of change, adapting our existing materials to their needs and developing new things in collaboration with them. All our materials and tools are Creative Commons and open source.

Hankey: To name one example, we equipped one library in Sweden and trained their staff. This library passed on that knowledge to 60 other libraries. Another partner in the Netherlands has practically made our work known throughout the whole country. We therefore have an impact far beyond our own reach.

So you are rarely present when people engage with the work of Tactical Tech. How do you know whether the workshops or exhibitions are having an impact?

Hankey: We test every new idea ourselves extensively with the intended target group, for example with school classes, whose feedback is then incorporated into further development. For ‘The Glass Room’ exhibition, which we showed in Europe and the US from 2016 to 2019, visitors were surveyed immediately after their visit and then again six months later. What we found is that they had not only installed different apps but had also measurably changed their attitude towards technology. We are currently seeing another effect in the development of our new ‘AI and You’ program: people are simply grateful and relieved that they can talk to someone about the topic and ask their questions.

In other words, people are using AI tools, but have a lot of questions in their heads that they don’t have an outlet for?

Tuszynski: The discourse and narrative around technology are dictated by tech corporations at every level. For example, we talk about social media as if it were a neutral term, a concept that has been eternally embedded in society. But that is not the case. The same goes for AI. We talk about AI as if it were a miraculous entity that was discovered in a cave somewhere and then given to the public, and now we have to understand it because it is some sort of divine gift. If you replace terms like ‘social media’ or ‘AI’ with what they are – products of international corporations backed by large language models and immense computing power – it sounds very different.

From this perspective, is teaching media skills no longer neutral? Does it instead mean teaching people how to use the products of certain corporations more safely?

Tuszynski: It is an attempt to repair broken tools, if you will. Corporations participate in such measures as long as it is good for them – and when the political winds shift, they evade all accountability, as Facebook recently did. We as NGOs, and many educational institutions as well, absorb the damage that these companies are causing with their products. We are essentially the post-market research departments, drawing attention to risks long after they have manifested. In this respect, the behavior of the tech companies is similar to that of the oil or plastics industry – including the lobbying, the same attempts to influence scientific studies, the same attempts to dilute critical narratives. I’m afraid it will take 40 years to figure this out too.

At the Publix Open House on June 29, you will be offering a workshop that takes a playful look at the polarizing effects of social media, as well as a ‘Data Detox Minibar.’ What exactly can visitors expect?

Hankey: The minibar will offer practical tips in the style of recipes. Because yes, theories and concepts are important, but people also want to know specifically what they can do. That’s what our step-by-step guides to digital detox are for. Our workshops are always an invitation to participants to experiment and try things out. We want to create ‘aha!’ moments.

What is the most important insight you have gained while working on this topic?

Tuszynski: That we spend too much time reacting to what Big Tech dictates. Can tools from companies that don’t give anyone access to the algorithms and data ever be democratic tools? My answer is no. We end up fighting over which black box is better than the other. We are wasting time – and wasting time is great for their business. Lots of democratic institutions that would have the power and resources to define technologies are struggling with black boxes that will never open. They are never going to be good. Instead of discussing ‘How good or bad is AI? How good or bad is social media?’, we should ask ourselves what kind of technology a democratic society really needs.
