The state of cybersecurity in education: the responsibilities of the EdTech sector towards children




The growing dependence of primary and secondary education on digital technologies has led to increased cybercrime in UK schools. Unlike other information and communication technology sectors, the EdTech industry tends to escape critical research enquiry with regard to its state of cybersecurity. EdTech businesses work in a fast-paced, relatively unregulated environment, and their cybersecurity measures remain largely unknown. In a new Media@LSE Working Paper, on which this post is based, Velislava Hillman focuses on the state of cybersecurity in education by addressing EdTech businesses directly, to map the challenges and identify the needs for safety and security in education.
As discourse around digitalised education continues to gloss over issues of data collection, those demanding data on one end (policymakers) and those collecting it on the other (EdTech providers) tend to remain unchallenged. Talking about ‘education data’ or ‘data privacy loss’ in the abstract, without addressing the businesses that create the problems in the first place, only further erodes individual and collective responsibility for children’s privacy in a digitalised education. Solely reiterating the problems risks desensitising audiences to the exploitative practices of the businesses driving data capitalism and further legitimating their dominance and power. As education grows progressively more dependent on EdTech, there has been a drastic shift away from meaningful discourse around the purpose of education and the growing cybercrime in schools. This has gone hand in hand with an explosion of unquestioned, industry-led designs for children’s education without any clear code of good practice or benchmarks, and without any mandate or external validation of who meets them.
This research was an attempt to address directly the issues emanating from digitalising education by meeting its providers: the EdTech businesses. Addressing them through cybersecurity – rather than, say, pedagogy – allowed them to point concretely to present and tangible issues emerging as a result of their proliferation in schools, and even to discuss actionable solutions, many of which were proposed by the businesses themselves. This research was therefore not about the pedagogic value of EdTech or its contribution to learning outcomes, curriculum, or children’s wellbeing. Instead, it aimed to start an honest conversation with EdTech companies about what they do and how they protect children’s privacy and basic rights, and, at the same time, to exchange knowledge and align their priorities with children’s benefits and fundamental rights.
Cybercrime in education continues to grow in frequency and magnitude. Between 2021 and 2022, around 41% of primary schools and 70% of secondary schools in the UK experienced cyber breaches. Given schools’ reliance on digital technology, cybercrime disrupts education and deprives children of their fundamental right to it, while increasing other risks such as the loss of privacy. Cybercrime leads to the loss of sensitive information about children and teachers and leaves systems unusable. The cost of cybercrime in primary and secondary education is estimated to exceed the cost of cyber breaches in any other sector globally.
To identify the roles, responsibilities, and gaps that lead to cyber insecurities in education, this research used two methods. First, in-depth interviews were conducted with industry representatives – EdTech founders, security and software experts, and chief executive and/or technology officers. Interviews were also conducted with representatives of the National Cyber Security Centre (NCSC); the IASME Consortium, which delivers the Cyber Essentials cybersecurity scheme; the National Institute of Standards and Technology (NIST), which provides the NIST Cybersecurity Framework (NIST CSF) and other frameworks (NIST 800-171/800-53); and the National Initiative for Cybersecurity Education (NICE). Second, publicly available literature (written in English) was analysed to identify the dominant discourse surrounding cybersecurity in digitalised education.
The report findings, along with a blueprint proposal for cybersecurity underpinning data privacy protection in primary and secondary education globally, will be covered during a special event on 8 February 2023, from 10:00 to 11:00am GMT, via Zoom. For more information, contact Velislava Hillman.
This article represents the views of the author and not the position of the Media@LSE blog, nor of the London School of Economics and Political Science.
Featured image: Photo by Annie Spratt on Unsplash
For the past ten years, Dr Hillman has researched at the intersection of learning, digital technologies, and children and young people, with a focus on their personal perspectives, uses and experiences. She has researched in the area of learning and creativity through the use of digital tools in formal and informal learning environments. During her fellowship at the Berkman Klein Centre for Internet and Society at Harvard University between 2018 and 2019, Dr Hillman investigated the implications of data collection in K12 education for children’s basic rights and freedoms to self-expression. More recently, Dr Hillman’s interests lie in the integration of AI systems in schools, data-driven decision-making, and the role and participation of children and young people in increasingly digitised learning environments. As a Visiting Fellow, she is focusing on identifying what kinds of AI systems are used in compulsory education; learners’ experiences, perceptions and attitudes towards AI systems; the role of such systems in instruction, learning and assessment processes; and the rights and freedoms of children and young people in relation to AI systems’ diagnostics and learner profiling.