
Protect, empower, inspire: is the tech industry changing old habits?

By V. Hillman, P. Bello, D. Siu, G, M. Eloholma, G. Pereira, K. Struyf, S. Nys, M. Ngounou


The season of tech conferences has been in full swing since the start of the year, and, encouragingly, more academic and civil society representatives are mixing with industry and policymakers than in previous years. Rising to the opportunity, eight organisations with one common goal gathered in June at the MyData 2023 conference to assess and, equally, inspire the industry on how to design digital products with children’s wellbeing in mind.


Is the tech industry changing its unethical practices towards children yet?


The quick answer is mostly no. The Federal Trade Commission’s recent fine against Edmodo, an education technology provider, for unlawfully using children’s personal information for advertising and commercial purposes, and the shocking revelations that Meta’s (formerly Facebook) Instagram connects paedophile networks to child-sex content, continue to corrode public trust and call into question the effectiveness of current efforts to scrutinise digital technology companies, hold them accountable, and protect children. These incidents reveal persistent gaps in ensuring the safety and security of online platforms and child-appropriate technology design.


Striking a delicate balance between empowering communities online and protecting vulnerable users remains a formidable challenge. Since the start of the year, tech conferences have been in full swing, and verbal promises abound from Big Tech (typically the convenings’ own sponsors). Yet by the time one leaves these venues, more news has emerged about unethical practices than about concrete changes in the right direction. The most recent story was that Microsoft illegally collected personal information from children through its popular Xbox video console without parental consent. The question therefore arises: do we just keep naming the problems, or will appropriate scrutiny and tighter oversight finally keep tech that targets children in check?


During the MyData 2023 conference, eight diverse organisations* focusing on children’s rights and well-being in the digital world joined forces to give a ‘crash course’ to the industry on how to design equitable, inclusive, and empowering digital products, and to ensure that children’s special needs, rights, and freedoms in the digital world become the industry’s lex omnium.


Action for Changing Minds

The objective of our collective ‘crash course’ was to launch a mind-changing initiative: assess knowledge and attitudes, educate, and issue a call for culture change among businesses targeting children with digital offerings. Certainly, a one-off effort is not enough to achieve comprehensive impact. Yet it is a way to build bridges between stakeholders across sectors and fields of expertise for a common cause. The course lasted approximately 90 minutes and consisted of [i] a timed survey for rapid responses (see Figures 1 and 2), [ii] ‘speed-dating’, and [iii] a design-assessment workshop to enable discussion of the risks and opportunities the digital realm presents for children. Around 30 attendees took part in this activity-dense session.


Figure 1: Rapid responses from workshop participants on risks related to data collection from children


There is greater awareness of the risks from data collection behind the interface – the only thing the users see.


Figure 2: Rapid responses from workshop participants on the most important aspects of governing children’s data


Easier said than done: understanding children’s fundamental rights requires detailed discussion before the industry can fully grasp what they entail.


Underlying principles for the future


1. Protect: Ensuring Safety in Digital Spaces

It takes a (digital) village to raise a child. Informing and educating children about the dangers online should be done in a balanced manner and with a positive tone. It is crucial to identify all ‘actors’ or ‘stakeholders’ in a child’s life concerning their digital experiences and opportunities, including their responsibilities and roles. The table below summarises initial inputs from the workshop participants.



Figure 3: Actors and responsibilities in the digital world where children can be active participants

2. Empower: Fostering Independence and Resilience

Empowerment lies at the core of supporting children’s growth and development. The workshop participants highlighted several avenues for empowering children in the digital world:

  • Moving from protection to empowerment is not easy. When new technology is introduced to children or existing software gains new functionality (say, adding ChatGPT to an existing application), updating the accompanying socio-ethical, legal, and other safeguards must become a priority. Meredith Broussard contends that “[algorithmic] models decay…every computational system needs to be updated, staffed, and tended”, and so must the principles and guardrails used for user protection and empowerment.

  • Empower the child’s circle of trust, especially their family and teachers. Beyond empowering the child directly, we must acknowledge that data-processing and manipulation mechanisms (such as nudging and hyper-nudging techniques) can be hard for children to understand. Children, families, and teachers need guidance on these topics so that they can become more resilient.

  • Digital platforms can incorporate educational components that promote critical thinking, media literacy, and digital citizenship, enabling children to navigate the online landscape confidently. Education has to start with the basic concepts: what does personal data mean in a child’s everyday life? But this is not only education in the classroom; it also means education in the home.

  • Enabling children to have a say in their online experiences, such as through customisable or sandbox-like environments, empowers them to make decisions and explore within defined boundaries.

  • Acknowledge children’s resilience, but equally encourage them to ask for help and ensure that adults provide such easy-to-reach help, for example when a child comes across an online advertisement or cookie notification and doesn’t know what to do with it.

  • It is equally important for adults to acknowledge when they are wrong and seek guidance, setting an example for children to follow.

  • Creating digital spaces that are mindful of children’s cognitive limits and not overwhelming them with excessive information helps promote focused learning, creativity, and well-being.

  • Industry, developers, and designers must ask who is at the centre of these designs: the child or the system? In developing new technologies, industry players must demonstrate critical consciousness by balancing convenience and privacy, and avoid forcing people to trade personal data and privacy for a convenient user experience. Service providers need to empower children and their caretakers to understand terms and conditions so that they can make conscious decisions about the implications of sharing personal data.

3. Inspire: Cultivating Curiosity and Imagination

Workshop participants agreed on the importance of honouring and promoting children’s open and creative view of the digital world. The message was clear: don’t take away the inspiration children hold by guiding them in narrow pathways through data manipulation and imposing decisions. Industry actors should provide dedicated spaces for children to experiment, cooperate with others, and be free to make mistakes. Children and their parents should not fear that personal data about themselves could be misused for behavioural manipulation.

Key elements highlighted as a result of the “crash course” include:

  • Industry actors must enable multi-modal, child-appropriate designs, and inclusive functionalities that address all abilities and needs when creating digital spaces for children.

  • Understanding children’s perspectives, interests and language and giving them tools to elaborate on their own views and ways of self-expression is key to developing respectful services.

  • Children can handle diversity well; setting up communities helps them build inclusive collaboration skills.

  • Stay present and raise awareness of the personal beliefs and biases that may influence digital products. The industry should be self-reflective and self-critical about its designs and subjective viewpoints.

  • Stimulate a culture that encourages children to have a voice in the digital world. Ask them: what consequences, risks, and opportunities do they foresee? What solutions do they want to see?

* The organisations involved in the session were: MyData (MyData4Children and MyData Literacy thematic groups), Education Data Digital Safeguards, Fujitsu, Young Justice Leaders, Heder, mIKs-it, Afroleadership, and TIEKE Finnish Information Society Development Centre.

This blog post was originally published on MyData.org.





