ECIAIR 4th European Conference on the Impact of Artificial Intelligence and Robotics - keynote

AI (in)justice in education – challenges for policy, governance, and society


[The following is a transcription and recording of EDDS founder Veli Hillman's keynote presentation at the 4th European Conference on the Impact of Artificial Intelligence and Robotics, supported by École de management de Normandie.]



If our future generations – children in schools today – are to benefit from AI in any shape or form, be that in school, work, or life, there are at least three concerns we should be thinking about from now. We can see these as stemming from something very basic and human, the need for security – a need that also manifests in every aspect of life, from politics and education to the state of our planet.


Before I get to my three points, I want to lay out the objectives of this presentation.


First, I am going to paint a rather pessimistic picture that emerges from my research in education, as educational institutions continue to grow their dependence on data-driven and algorithmic systems. When I first started researching with children – from the age of 6 to young people under 18 – around 2011, I looked with the eyes of an idealist. I loved the stories of Seymour Papert, Mitch Resnick, and even Sir Ken Robinson about how school kills creativity and assimilates computers to fit the archaic mould of schooling, instead of breaking away from it and adapting to the innovations. School eschewed technologies and made them dull and boring. Around 2014–15, the promise of personal devices coming into classrooms was on the horizon, and I researched opportunities for children's self-navigated learning and agency just as Lev Vygotsky saw it – with computers instead of broomsticks.


This creativity, curiosity, and agency are still happening – children are the most wonderful thing on earth! Papert warned that schools neutralise the “threat to the status quo by assimilating a foreign body, in this case the computer, to something that fits better with the (educational) system’s structural form” – that the computer was “schoolified”. However, something entirely different is happening. Bad different.


It is the computers, or rather the advancing systems and their private owners, that are assimilating education, converting its entire ecosystem into data nodes and means of production, and giving the owners or beholders of these systems the capacity to redefine education altogether.


Much of the critical debate surrounding algorithmic systems and EdTech in education reaches the point of blaming Silicon Valley businesses and culture, and the few powerful Big Tech companies that own the foundations of all things digital. But these digital systems, as a product of and a means to the political capitalist economy, provide something else – something that has motivated my research more recently. Before I get to that gloomier part of my presentation, there’s a silver lining.


Second, I would like to propose a direction of thinking towards a bigger picture with more positive overtones, and hopefully not deflate the spirit in which we are all gathered here today.


First, the bad news.


Recently I was inspired by a couple of manifestos – the Public Service Media and Public Service Internet Manifesto and the Media Manifesto (Fuchs and Unterberger, 2021; Fenton et al., 2022) – to think about what we want for and out of education technologies, and advancing algorithmic systems more broadly, for education and for our children and young people.


Declarations such as these manifestos call for setting things right with media at a time when the internet and the untamed (untameable?) power of algorithmic platforms and data-driven systems have enabled unprecedented societal polarisation and continue to cause mis- and disinformation harms. To safeguard democracy and universal human rights (UN, 1948), such manifestos call for Public Service Media and a Public Service Internet that aim for quality and fairness.


But how can quality and fair media exist at all without quality education to begin with?

How can democracy be upheld in a society whose educational institutions are at risk of being gripped by the very same powerful algorithmic systems that grip and undermine media institutions?


How can humanity and what is uniquely human be defended if we are losing human autonomy – that is, the space of the self – “by making submission to tracking a requirement of daily life, retrofitting the self’s domain of action onto a grid of data extraction”?


Democracy cannot be upheld without enlightened individuals and independent, critical thinkers. Democracy cannot exist in societies that are unable to understand or even want it because their education and upbringing are gripped by opaque and manipulative algorithmic systems –

systems that are capable of influencing, even (re)producing, in Bourdieusian terms, who will be the engineer and who the ‘mere technician’.

So, what happens exactly in our educational institutions?

Who decides what (digital or other) systems and instruments operate there?

What do these institutions produce – or what are they meant to produce?


Digital technologies, owned by private businesses, with the very few Big Tech providers at the bottom layers of the stack (to use engineering jargon), are becoming the main infrastructures – the arteries of every educational process, from the cradle to the workforce. They may not necessarily have envisioned this infrastructure capture, but they are certainly hungry for markets and customer loyalty, and what better way than grabbing those as early as possible – right from school?


Those who own the infrastructures then have the power to set the rules. They have the capacity to become not only the dominant pedagogic power but also one with regulatory and normative authority.


Here is what I mean.


Industrial societies brought about the factory-like schooling that many tech evangelists condemn today. Learning happens anywhere, at any time; industry changed that by introducing compulsory education. Education is different from learning. Education is the product of industrial society:


knowledge is packaged into marketable skills, into a commodity that is necessary later in one's life to exchange for other commodities such as food, shelter, clean air, and travel (actually quite natural things that today are luxuries demanding substantial financial wealth).


This industrial organisation manifesting in educational institutions is imposed on most societies today – be they Bulgarian, British, or Bolivian. Digital technologies not only reinforce and sustain this organisation; they claim possession. To use Papert's word, they are now assimilating not only media, but education, and even political infrastructures.


They design, dictate, and promise to produce outputs (a workforce) or outcomes (grades, degrees), while simultaneously beginning to decide which of these will qualify as necessary and valid. However, becoming dependent on such structures can disable societies and impoverish individuals. People can become

… helpless to recognise evidence unless it has been certified by a professional – be he a television weather commentator or an educator; organic discomfort becomes intolerably threatening unless it has been medicalised into dependence on a therapist; neighbours and friends are lost unless vehicles bridge the separating distance (created by the vehicles in the first place). In short, most of the time we find ourselves out of touch with our world, out of sight with those for whom we work, out of tune with what we feel. (Illich, 2009/1978: 11)

In mercantile societies, Illich and Verne write (1976), “the idea of education underwent a first transformation, as it came to mean the manipulation of one individual by another… Industrial societies transformed the idea of education a second time; by education they meant the manipulation of children by adults using a programmed instrument called the school”. In compulsory education, “we are no doubt witnessing a further reduction of the idea of education, this time for the exclusive benefit of the capitalists of knowledge and the professionals licensed to distribute it”.


Today, the capitalists of knowledge are the owners of algorithmic platforms and data-driven systems. They are not licensed. Yet they monopolise the purpose of educational institutions. Their systems can be programmed to shape the kind of workforce or social tiers societies should have.


To guarantee that the skills taught in schools today will convert into secure jobs tomorrow is a difficult task. In our digitalised world, the guarantors become the algorithmic systems that not only enable the tracking and prediction of labour supply and demand, but also claim future labour markets by providing their own version of skilling and reskilling. We now see algorithmic systems and the constant generation of data providing new capabilities:

from data about children used to inform educational processes, to credentialing and the development of worker pipelines.

We see examples with Amazon, Google, Meta, even Roblox, the popular games platform – all flocking to promise that the skills our children obtain in their warehouses today will guarantee them employability tomorrow. Mainstream schools, in the meantime, rush to show progressiveness and adopt all sorts of technologies, becoming entirely dependent on digital systems to tell them how they are doing. Universities, too, are rushing to expand into the fields of AI, while some are dropping the arts and humanities altogether.


Algorithmic systems and the datafication of education are opening doors for the orchestration of something entirely different. The European Commission, for instance, envisions harvesting skills intelligence through the development of a ‘permanent online tool for real-time information’ for ‘all interested stakeholders’ to tailor careers and inform education policy based on industry demands.


The algorithmic predictions and the promises of solution stacks across educational ecosystems are now influencing policies. Perhaps they have always done so in one way or another. But the opportunity presented by the data economy is something entirely new.


Policies support further data collection on students – at all times, anywhere, and for just about anything – for at least one concrete reason: the promise algorithmic systems make of providing prediction with precision.


Politics often depend on and are influenced by the fallacy of the economic promise of algorithmic big data/Big Tech systems: anything that increases the chances of output is a sure investment magnet, counting politicians in that equation. Whatever guarantees votes is an agenda worth fighting for. Regardless of the label, many of the decisions reflect a politician's desire and need for votes that will keep him in office for at least another mandate.


In another study I conducted last year in the US, this logic was prevalent. I looked at how policy works in tandem with corporations and envisions the set-up of data pipelines across education and children's ecologies to align with industry's labour (or skills) demands. In short, the promise goes, data from one end (the educational) and data from the other end (the corporate) will meet in the middle, with algorithmic systems identifying what kinds of skills education needs to produce.


Focus on productivity, however, often comes at the expense of individuality and personal creativity. In the current arrangement, it makes no sense for a student to focus on drawing comics in middle school if the economic demand is for web service administrators. And so the subject of arts drops out of education; STEM is pushed instead. This doesn't happen only in Virginia. The same is evident here in the UK, where universities are dropping the arts and humanities; the European Union, too, envisions the set-up of systems that can create data pipelines to align workforce demand and supply.


Politics are to a great extent expressed in the values politicians adopt. In the case of Virginia in the United States, for instance, the values manifest in economic growth. And so education is pushed to focus on productivity – a workforce that can quickly come out of the school gates and fill corporate demands, which in this case were the demands of Amazon's newly set-up headquarters in the state. In fact, former Virginia Economic Development Partnership President and CEO Stephen Moret famously explained how Virginia won Amazon's new HQ2: he pitched his state's educated workforce. And so Amazon Future Engineer spills over into middle- and high-school courses, while little is said about whether children want this, or indeed whether there is any guarantee that Amazon will still exist by the time these kids come out of their schooling. It is actually already looking shaky.


But there is a third condition we are now starting to investigate collectively – unfortunately, some more than others. Planetary security is also needed for the other two systems – politics and the economy – to work.


So an even bigger fallacy, or disappointment, emerges: none of these political and business promises, underpinned by the promise of AI or Artificial General Intelligence, rely entirely on themselves, but rather on supposedly predictable variables, which lead to the development of ever more unpredictable technological outcomes. We cannot prepare for and ensure security against something we don't understand, and we cannot know how it will impact us now or in the future. So there is no way to guarantee either of the previous two promises – job security and political stability.


Instead, we should consider focusing on three other factors and conditions to ensure that, wherever the marriage of politics and industry leads societies, it is not over the edge of a cliff.


And so, on a more positive note, here are three propositions on how we can steer the conversation and our thoughts.


Before policymakers continue to encourage AI adoption in education and schools become entirely dependent on data-driven systems, we need to consider:

First, what kinds of people should we try to be to ensure we put AI to good use? We must ask what we as humans can or should do to improve our own attitudes, increase kindness, reduce ego, and break away from greed before we get our hands on advancing technologies whose worst-case capabilities we hardly understand.

We made fire and stopped some deadly bacteria from spreading; yet we also use fire to kill. We are designing self-driving cars, but we also have combat military drones. We created the World Wide Web to connect with the world and meet other cultures, yet the most successful online businesses are among the most harmful and degrading (as in pornography and social media).


To break this into actionable steps, for a start we need to return to the question of what education is meant to be about and to do. We hear more talk about education technologies and data collection in education than we hear about the content of what children learn or what they even think about it.

Second, what is the planetary and societal cost of AI systems? Is economic growth a logical objective – even a possible one – and until when?

What scalable alternatives have we come up with to prevent deforestation, the depletion of natural resources, or even the digitalisation and conversion of every single human action into data?


The same questions should govern policy development with regard to identifying what it is that technologies are trying to solve in education. If it is mere efficiency – digitalising national assessments, or issuing students digital passes every time they need to go to the toilet while in school – what improvement do we achieve, and whom does it benefit? And what is the alternative to these new ways of doing things?

And third, what real alternatives do we leave open for progress that does not depend solely on AI?

With this presentation, my goal therefore becomes a call for a collective, conscious estimation of the trade-offs in education between algorithmic systems' capacity to undermine educators, exploit children's vulnerabilities, and damage the environment, and the learning improvements and economic guarantees these systems promise – promises that so far attract more confusion and critique than they provide substantial evidence.


The societal, economic, and ecological costs are likely to surpass these systems' worth. To avoid paying that big price, we need to overcome two further challenges:

Lead with consciousness and honesty

Our policies should address questions such as: what is the problem to which an algorithmic system is the solution? If this question can be answered honestly by educational administrators, EdTech and AI providers, educators, and policymakers, then there is a way forward for having AI systems mediate, facilitate, and influence education.


If the problem is to save teachers time in marking essays, an algorithmic system cannot claim to ‘personalise’ learning; it merely offers efficiency. Advancing algorithmic systems in education should display honest results about their contribution and role in educational ecosystems. AI systems will not advance in the classroom through manipulation. They will only exacerbate the deterioration of trust that has set in as surveillance and datafication have become normative practices in educational institutions.


Progress doesn’t have to be entirely dependent on the digital

Such systems must be questioned; their owners and their motives – scrutinised. The good intentions behind educational institutions must be defended through meaningful governance and oversight in the face of advancing systems becoming their main means of production and distribution.


If digital technologies become the main source of education, then they have the capacity to define and impose their own grammatical rules – to decide what progress should look like. Vernacular values become displaced by the commodity-driven values designed into algorithmic platforms and data-driven systems. We have seen their disastrous impact when they mediate societies' media and communications – which is why Fuchs and Unterberger, and Fenton and colleagues, wrote their manifestos.


Children on whom algorithmic systems are imposed unwittingly risk becoming the oppressed co-operators who will only continue to sustain these systems' very power. In school, learning maths with a digital platform is what counts as learning maths. When learning maths with a teacher, through music, or by cooking also counts, children have open channels for real alternatives.


Alternatives create friction. Alternatives are the product of creativity. Without friction and creativity there is no progress, only subsistence and oppression.


If algorithmic platforms and data-driven systems get a grip on one’s individual sovereignty and begin to determine human lives from the cradle to the grave, the existence of democracy is in real jeopardy.


And so, “if democracy is to be replenished in the coming decades,” as Shoshana Zuboff declares, “it is up to us to rekindle the sense of outrage and loss over what is being taken from us” (Zuboff, 2019: 521). It is not just “personal information” that is taken from us, and now from our children:


What is at stake here is the human expectation of sovereignty over one’s own life and authorship of one’s own experience. What is at stake is the inward experience from which we form the will to will and the public spaces to act on that will. (521-522)


Thank you very much.


References

Fuchs, C. & Unterberger, K. (eds.) (2021). The Public Service Media and Public Service Internet Manifesto. London: University of Westminster Press. DOI: 10.16997/book60

Hillman, V. & Bryant, J. (2022). Families' perceptions of corporate influence in career and technical education through data extraction. Learning, Media and Technology. DOI: 10.1080/17439884.2022.2059765

Illich, I. (2009/1978). Disabling Professions. London: Marion Boyars.

Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.