
Standards are coming. Are education technology companies prepared?

Updated: Mar 28, 2022

Ahead of Safer Internet Day, EDDS founder and LSE Visiting Fellow Dr Velislava Hillman asserts that the newly proposed IEEE standard for an Age-Appropriate Digital Services Framework should be seen as a stepping stone towards the development of meaningful frameworks and standards for governing and regulating the edtech sector.



Photo credit: Media on Six

When regulation is not in place, what’s at stake?

When Mithridates VI, ruler of northern Anatolia from 120 to 63 BC, created the ‘mithridatium’, a medicine made from more than 50 ingredients (!), it was considered a cure for all illnesses for centuries. It wasn’t until 1540 in England that mithridatium and other medicines were subjected to evaluation under the Apothecaries Wares, Drugs and Stuffs Act. And it took several more centuries, and numerous catastrophic events, including the 1950s thalidomide disaster, for rigorous global standardisation to be introduced.


When many in the education technology (edtech) industry boast about improving learning through personalised programs based on artificial intelligence (AI), algorithmic analytics and ever more data collection from students (while some researchers even push for personalising education along genetic lines!), one cannot help but worry about the consequences of this still largely unregulated market.


What are the risks of AI in education?

Many edtech products today have moved beyond offering mere access to content and connectivity. Data collection and algorithmic modelling propel user profiling and control in ways that students, and even their teachers, may neither be aware of nor understand. Edtech applications for learning, instruction and assessment use nudging and hyper-nudging technologies to influence and manipulate users. Google’s search engine has long used this manipulative technique, steering users to click within the first page of results rather than go beyond it, even though all choices remain available and no obvious harm is in the way. Google’s algorithmic configuration is set to prime users, and that priming is engineered by Google’s architects without users’ knowledge.

Similarly, the AI-based edtech platform Century Tech uses hyper-nudging techniques, steering students towards what to learn and how. The platform steers students’ behaviour in a predictable way; they merely react to its nudges and stimuli. It thus exercises a ‘soft’ form of behavioural control precisely because users cannot become consciously aware of being manipulated. Meanwhile, the technology builds a substantial reservoir of detailed knowledge about students (more than they realise). How this data reservoir is deployed to regulate, by design, what students learn is anyone’s guess, given the lack of clear standards and regulation of such mechanisms.
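To make the mechanism concrete, here is a deliberately simplified, hypothetical Python sketch of a hyper-nudging loop. It does not describe Century Tech’s actual system; the engagement model and the ‘top-k’ filter are invented for illustration. The point is structural: the platform ranks content by what the student is predicted to do, then presents only the top of that ranking, so every ‘choice’ the student sees has already been pre-shaped by the model.

from dataclasses import dataclass

@dataclass
class LearningItem:
    topic: str
    difficulty: float  # 0.0 (easy) to 1.0 (hard)

def predicted_engagement(item, profile):
    # Toy model (an invented rule): students are predicted to engage most
    # with items slightly below their estimated ability.
    return 1.0 - abs(item.difficulty - (profile["estimated_ability"] - 0.1))

def nudge(catalogue, profile, k=3):
    # Rank the whole catalogue by predicted engagement, then silently
    # discard everything except the top k items.
    ranked = sorted(catalogue,
                    key=lambda i: predicted_engagement(i, profile),
                    reverse=True)
    return ranked[:k]

catalogue = [
    LearningItem("fractions", 0.3),
    LearningItem("algebra", 0.6),
    LearningItem("geometry", 0.5),
    LearningItem("statistics", 0.8),
]
student = {"estimated_ability": 0.55}

# The student sees three 'choices', all pre-selected by the model.
for item in nudge(catalogue, student):
    print(item.topic)

Nothing in the interface signals that the hidden items exist, which is what makes this form of steering so difficult for students or teachers to detect.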


Many edtech products make direct links with neuroscience, looking to wire students to technologies that can monitor their moods and stress levels, collect their personal thoughts, track their heart rates and gaze, and gather biological, neural and behavioural data. Symanto is a psychographic text analytics tool whose AI is used in universities to monitor student behaviour and support achievement and mental well-being. Other software tools claiming to help students with personal and emotional struggles include Solutionpath, Stream and Gaggle, to mention a few. These companies gather metrics on attendance, library use, grades, online learning activities, students’ search history and more. The collected data is used to make inferences about students and to decide whether they are ‘well’ or not.
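As an illustration of the kind of inference pipeline described above, consider this hypothetical Python sketch. The metric names, weights and threshold are all invented and do not describe any named vendor’s system; they show how a handful of hard-coded, invisible design decisions can determine whether a student is labelled ‘well’ or not.

def wellbeing_score(metrics):
    # Every weight here is an opaque design decision made by the vendor
    # (all values below are invented for illustration).
    weights = {
        "attendance_rate": 0.4,          # fraction of sessions attended, 0-1
        "library_visits_per_week": 0.1,  # normalised 0-1
        "grade_average": 0.3,            # normalised 0-1
        "late_night_logins": -0.2,       # treated as a 'risk signal'
    }
    return sum(weight * metrics.get(name, 0.0)
               for name, weight in weights.items())

def classify(metrics, threshold=0.4):
    # A single number, chosen by the vendor, flips the label.
    return "well" if wellbeing_score(metrics) >= threshold else "flagged as unwell"

student = {
    "attendance_rate": 0.9,
    "library_visits_per_week": 0.5,
    "grade_average": 0.7,
    "late_night_logins": 0.8,
}
print(classify(student))  # the student never sees why

A change to a single weight, or to the threshold, reclassifies students; yet none of these choices is visible to the people being classified.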


No one knows who is behind these inferential decisions; what happens once a student is marked as ‘unwell’; who makes use of these evaluations; and how they may affect students later in their lives. Gaggle, for instance, hires part-time workers to monitor students’ content. Indeed.com provides reviews from Gaggle’s ‘safety representatives’. While these merit special attention and further research, the reviews suggest that working as a safety representative at Gaggle is a low-wage job, considered a ‘side gig’. One reviewer says: ‘The basic lowest level representative (which you will start at) only has access to review documents. It’s a[s] simple as reading a sentence and determining if it is urgent or harmful content. If you put in the hours and prove a near 100% accuracy rate then you will be given additional responsibilities.’ Not only are technologies such as Gaggle, Symanto or Century unregulated; they also show no substantial evidence of benefiting students. The lack of regulation leaves a blank cheque for businesses with capacities for human surveillance and manipulation.


Lack of standards favours businesses, not individuals

Access to edtech is often equated with enhanced learning. But time and again, evidence shows that although some marginal improvement exists, the technologies favour the already well-off far more than the most disadvantaged and displaced populations. While technologies can afford numerous opportunities for learning, clear standards and frameworks are needed to unpack not only which edtech products contribute to educational processes and how, but also who benefits from them and in what way. The lack of regulation around the transparency of edtech products, their objectives and their effects leaves the customer in the dark with a ‘medicine’ that promises to heal all ills. So far, no one knows the ingredients, except the edtech companies’ own engineers. And when it comes to regulation and standards, the industry maintains a laissez-faire attitude.


Regulation begins with meaningful quality standards, and it must begin now

We mustn’t wait for disasters to happen in education before drawing up meaningful standardisation and benchmarking. Today, a third of the Internet’s users are children, yet the digital technologies market continues to produce designs that do not anticipate child users. Simply forbidding children from using technologies is like forbidding them from riding a bicycle on the street. Appropriate standards must guide engineers, data scientists, user experience designers, marketers and developers in their product development and optimisation.


One step in this direction is the Institute of Electrical and Electronics Engineers’ (IEEE) 2089-2021 standard, developed upon the 5Rights principles for children’s rights. The framework sets out rules, terms and conditions to ensure that companies provide age-appropriate digital services where their users are children. The standard fills “a gap between global efforts to ensure that young people are catered for by design in the digital world, and a lack of practical guidance for how to achieve this”. However, such efforts must be envisaged for the edtech industry, too. Moreover, further work is needed towards creating a safe learning space, with a dedicated independent governing and regulatory body that ensures the edtech industry adheres to such standards, prioritises learners’ best interests and genuinely contributes to pedagogy and curriculum.



This post first appeared on the Media@LSE blog of the London School of Economics and Political Science. It represents the views of the author alone.
