What does it mean to use technology ethically?
In today’s world, digital tools and AI shape how we think, connect, and share, which is why this question opened our online workshop. Together, participants explored real-life dilemmas and tested interactive activities.
Using a live word-cloud activity, participants associated ethical technology with ideas like privacy, moderation, truthfulness, careful sharing, and “do no harm”. The exercise surfaced an important tension: people often describe AI itself as unethical, even though the problems usually arise from how tools are used, not from the tools themselves.
The discussion highlighted four main challenges: misinformation and deepfakes, the balance between data privacy and surveillance, growing dependence on digital tools, and the environmental impact of AI, which is often overlooked but increasingly relevant.
Participants suggested everyday steps such as protecting personal data with strong passwords and two-factor authentication, updating devices regularly, and being mindful about oversharing. They also emphasised the importance of verifying online content through source-checking, reverse-image searches, and spotting tell-tale signs of manipulated media.
These practices were underpinned by key values that participants highlighted as essential for online life: truth, privacy, responsibility, moderation, and fairness. These values serve as guides before posting, sharing, or creating content with AI.
CASE DISCUSSIONS
After these discussions, participants were given different scenarios to examine together.
One scenario focused on AI-generated influencers: fictional personas that can attract real audiences. While some saw creative potential, most agreed that transparency is essential: if audiences believe these personas are real, their trust is misplaced and their consent is never truly given.
In another case, participants examined the use of AI in writing drafts. Many agreed that AI can be a useful tool for brainstorming and structure but warned that relying too much on it risks replacing critical thinking and raises questions of academic integrity. A common ground emerged around responsible, transparent use: disclose assistance when appropriate, verify facts, and ensure the final work reflects the author’s own reasoning and voice.
This workshop was part of our European Solidarity Corps project Connecting Minds for Digital Tomorrow! It was not only a space to exchange ideas but also a step toward shaping resources that empower young people to act responsibly, confidently, and ethically online. All of these insights will feed directly into the Digital Safety Handbook for Youth that we are currently developing.