The Edtech Leviathan

In June 2021 Google and BYJU’S announced a partnership to provide education services in India. A press release from BYJU’S said, “the simplicity, flexibility, and security of Google ‘Workspace for Education’ and BYJU’S unmatched content pedagogy come together on the ‘Vidyartha’ platform to aid effective learning at school…this integration of Google ‘Workspace for Education’ with BYJU’S will offer a collaborative and personalized digital platform for classroom organization, available for free to participating educational institutions” (Byjus 2021).

By offering education content gratis and supporting ‘personalized learning’, Google and BYJU’S see themselves as facilitating the transition of Indian education from the traditional brick-and-mortar classroom to a virtual learning space, potentially benefiting millions of Indian students during and beyond the pandemic. This paper examines the implications of this tie-up in the context of the commercialization of education and the increasing concentration of power in monopolistic corporations. We argue that private platforms in the unregulated ed-tech sector are incentivized to prioritize growth above all else, and that their programs are sharply opposed to the socially transformative aims of education. We conclude that there is a need to regulate the ed-tech sector and offer some domain-specific considerations.

Platform Power

The term ‘personalized learning’, also called student-centered or context-adaptive learning, is sold as a technologically inflected alternative to the flawed ‘one-size-fits-all’ approach of traditional school education. As a counterpoint, we will critically assess the ways in which the implementation of AI-driven personalized learning models could adversely affect learning and the education system. We will also examine how the platformization of education, and its venture capital funding, could seriously undermine the status of education as a not-for-profit public service.

Platforms are infrastructures of value creation, capture and distribution. They facilitate interactions among various actors (including consumers, producers, advertisers, service providers, and suppliers), harvest data from such interactions, and generate data-based intelligence for optimizing value (ILO 2021). Platforms have established themselves as critical socio-economic infrastructures, covering areas such as information search, social networking, transport, and e-commerce. The building block of these vast platform empires is data; in digital markets, market power is synonymous with data power. As digital markets expand, platforms have grown to become immensely large and powerful entities rivaling the power of states.

Google is a preeminent platform that acts as an information organizer for most people through its search engine (Statcounter 2021). Its cloud offerings provide a range of functionalities such as web browsing, language translation, data storage, e-mailing, and location and navigation services. Its Android operating system controls access to most mobile phones. In each of these products, it has the largest, or among the largest, market share. Together, this combination of products and services offers users potent incentives to join and participate in the Google ecosystem. Consequently, Google is a leader in collecting and harvesting user data, building algorithms to process it and provide value-added services to users, and securing a digital advertising windfall for itself. Over 80% of the revenue of Alphabet (Google’s parent company) comes from its highly sophisticated online advertisement business (Graham and Elias 2021). This business depends on the targeted delivery of customized ads to users, enabled by its collection and processing of user data. Google’s cloud services, data power, and digital computing infrastructure make it a world leader in developing Artificial Intelligence (AI).

In the past decade, Google has also established itself as a powerhouse in the American public education system (Singer 2017a). Google’s proprietary Chromebook gives teachers and students access to its web-based applications like Google Docs, Google Calendar, and G-mail. In 2016, Chromebooks accounted for 58% of the mobile devices shipped to schools in the US (Bouchrika 2021). By partnering with BYJU’S, Google is seeking to extend its dominance to the Indian ed-tech market, and cash in on “the next billion users”.

BYJU’S, the world’s largest ed-tech company by valuation, has had a meteoric rise. It has already acquired nine companies in its short, 10-year history (Banerji 2021a). Perhaps better known for its high-profile celebrity endorsements and kit sponsorship of the Indian men’s cricket team, the company has extended its ed-tech footprint from the test-prep coaching space to offer content for students across the K-12 range. It offers interactive video lessons through its app, and claims that the app aids learning by simplifying concepts and making them easier to understand. Google’s extensive user base and popular cloud applications will allow much quicker expansion of BYJU’S products across India.

The Google BYJU’S collaboration portends a domination of the ed-tech space through an identical model of offering gratis products/services, grabbing market share, and harvesting user data. The data security and privacy concerns explode in the education context, where the data subjects are children, incapable of offering consent. Electronic Frontier Foundation’s research has revealed the troubling extent to which Google used its Chromebook devices to spy on children in the US, collecting far more information than was necessary (EFF 2015). Further, Google is facing several lawsuits globally for abusing its dominant market power to privilege its own products over competitors on its platforms. Likewise, egregious business practices dog BYJU’S sales efforts; its associates appear to tailor sales pitches to potential clients based on their socio-economic background, and ensnare unsuspecting parents into unwanted long-term loans (Singh 2021). The concerns about data privacy and the monopolistic practices employed by platform companies are intimately linked. By offering the content on the Vidyartha platform for free, Google and BYJU’S are relying on a tried and tested predatory pricing strategy of platformization to corner market share.

The collaboration will also strengthen the competitive advantage of Google’s products, as children will get used to Google Docs, G-Mail, and other Google apps. This is similar to Microsoft’s model of ‘free’ teacher training for its MS Office, in collaboration with schools and education systems, which had helped cement its dominance of the office suite business (Di Cosmo and Nora 1998). As the anti-trust scholar Lina Khan has elucidated, a focus on short-term price effects fails to capture modern forms of anti-competitive conduct in the platform economy (Khan 2017). With the coming together of the infrastructure platform (Google Workspace for Education) and educational content (BYJU’S course material) businesses, the likely result will be reduced consumer choice and vendor lock-in. Any regulatory intervention must, therefore, take a long-term structural view of the data economy, being attentive to the risks of predatory pricing and the manner in which integration across distinct business lines may prove anti-competitive.

It is also important to appreciate the circumstances of this collaboration. The pandemic is forcing education institutions and school systems towards digital and online education, amid the hype about the potential of digital technologies to ‘break through’ the physical school model (KPMG 2017). Considering that India has the second largest education system in the world, the data harvesting potential is high, and Indian students will become guinea pigs, providing their data to fine-tune algorithms for personalized learning – training data at best and collateral damage at worst.

Venture capital is an important fuel for platform capitalism. Venture capital funded companies such as BYJU’S are expected to provide very high returns on investment to repay the substantial capital invested. While even under traditional funding models a company’s primary duty is to its shareholders, under venture capital funding, companies are disproportionately incentivized to prioritize growth and profits above all else, including the well-being and holistic development of the learner. Such conditions violate the spirit of the Unni Krishnan judgment, in which the Supreme Court held that entities providing education cannot be for-profit. These misaligned incentives, which position sales and marketing as a core competency of ed-tech firms, are at least partially responsible for the emergence of flashy trends such as edu-tainment, personalized learning, gamified education, and BYJU’S ‘fun learning’ model, all of which are of questionable pedagogical value.

Algorithmic intermediation in education

Personalized learning solutions claim to offer radically new and context-adaptive ways to improve students’ academic performance and grasp of concepts. Claims to revolutionize, re-imagine, or hack education through technology are not new (Toyama 2011) and have invariably fallen short of their promises. Advertisements for ed-tech platforms – like BYJU’S – often draw on the familiar cultural trope of ‘Sharma ji ka beta’ (Dutta 2015), capturing the imagination of parents keen on securing their children’s future prospects early on (John 2020). However, such pressures can seriously hurt the mental and physical health of young children, who are pushed by parents and relatives to become ‘super achievers’. Periodic news reports of student suicides in Kota (and elsewhere) are visible and grave reminders of the academic pressures faced by students in the country. By widely targeting students, Vidyartha could make the Kota ‘coaching factory’ model (Johri 2015) seem trivial in comparison. The Kota model caters mostly to parents who want their children to do well in competitive examinations for admission to professional courses or specific careers. BYJU’S content, available for all grades and subjects, will cater to every parent who has a child in school or college. Coaching classes of yore posed physical and monetary constraints; by contrast, the negligible cost of ed-tech apps, predatory pricing, huge discounts, free services, and the lure of ‘competitive advantage for my child’ are likely to expand the coaching classes market.

Such marketing has contributed to the meteoric growth of Whitehat Jr (recently acquired by BYJU’S). Coding seekho, duniya badlo (learn coding, change the world) is Whitehat Jr’s clarion call, deceiving parents into believing that knowledge of coding enables children as young as six years of age to develop apps that will have investors lining up (TDH 2020). These kinds of ads tap into the aspirational Indian middle-class ambitions of ‘the wealthy life’ with the promise of lucrative Silicon Valley jobs, or the chance to become the next Sundar Pichai or Elon Musk. However, a preoccupation (Banerji 2021b) with coding and computer science education distorts the wide exposure that children need at a young age (Singer 2017b). While an understanding of computer science is important, education must equally encourage critical reflection about the role of technology in society, at an age-appropriate stage.

The lack of transparency in algorithmically-mediated learning brings another host of issues. Algorithms tend to be black boxes, and their operation is neither neutral nor unproblematic. Rather, they play an active and generative role in educational processes and reflect their designers’ and developers’ biases, both in the rules framed to guide the algorithms and in the data sets they process. Reliance on algorithmic decision-making has been the basis of discrimination by educational institutions. Recently, UK students were graded by an algorithm (Katwala 2020); this caused an uproar when students from disadvantaged backgrounds received disproportionately lower grades, reflecting the implicit bias in the process. Similar biases have been repeatedly identified in AI models dealing with criminal justice (Heaven 2020), credit scoring, and facial recognition.

Algorithmic biases and errors can have serious long-term consequences for children's welfare and well-being. In the Indian context, where caste, gender, class, and religion are axes of marginalization, the reliance on the past to predict future possibilities (through a blinkered reliance on data-driven models) can create a situation where students from traditionally marginalized castes are driven toward vocational training, as the data will suggest that they are better off there, while their upper caste/class peers are afforded the privilege to continue with their mainstream education, which can offer better-paying and secure employment opportunities (Kasinathan 2020). Many teachers and administrators hold implicit beliefs about the ‘non-educability’ of marginalized groups (Namrata 2011), and there is a caste/class divide between teachers and students in government schools. AI will fortify these biases; its discriminatory approach will be pushed as ‘scientific’, formalizing gender/caste/class discrimination into the education system. Thus, NITI Aayog’s vision of using AI to preemptively identify students who are expected to drop out and recommend vocational education for them (Niti Aayog 2018) could end up reinforcing the precarity of historically-marginalized and low-income communities, and convert structural disadvantages into formal criteria for discrimination and exclusion.

Algorithmic intermediation in education is especially dangerous because of the foundational role that education plays in a child’s life. Drawing from the educator Paulo Freire’s definition of the process of conscientization, education is meant to instill a disposition for free thinking and critical inquiry, and aid in the development of a moral compass in learners. It must facilitate individual development and empowerment, including enabling the individual to transcend their own limitations/biases and be able to work for a new individual and collective future. In contrast, the bounded operation of algorithms reduces complex social phenomena to rule-based, definable components. For instance, predictive engines are deemed to be working well if they can influence or predict a user’s next move with some degree of accuracy. The use of recommendation engines or other types of predictive analytics in educational contexts, therefore, carries the risk of reinforcing regressive belief systems among students rather than challenging them.

We have seen these dynamics play out at a grand scale on social media platforms, whose algorithms prioritize sensational and polarizing content based on its profitability (Nguyen 2020). Such a fate must not be allowed to befall educational platforms used by children (Kannan 2021). Such platforms need to provide diverse and even divergent/contrarian exposure for learning and development. Increasingly, scholars have warned against implementing AI and untested technologies in certain fields because the risks are disproportionate to the potential gains, which are not easily realizable, as in the case of facial recognition technology (Clark 2021) or DNA profiling (Ramanathan 2015). Similarly, using AI for personalized learning can undermine the foundational goals of education.

Even if online education or personalized learning does help improve academic performance according to the narrow metric of a test score, it overlooks the learner’s overall development as a socially-mediated process. Beyond classroom instruction, the school environment serves a variety of developmental functions in the life of a young individual. It is a space where learners pick up important life skills such as the ability to collaborate, play, deliberate, and disagree with their peers.

Progressive and far-reaching perspectives on education have been articulated excellently in the National Curriculum Framework 2005 ‘Position Paper on the Aims of Education’ (NCERT 2006). The position paper makes it clear that education needs to be a rich and diverse experience for every learner, and that education must create vital links between children’s experiences at home, in their community, and what the school offers them. It also emphasizes the need to promote and nourish a wide range of capacities and skills, such as literary and artistic creativity, and the need to expose students to ways of life other than their own as worthy of respect. These aspects of a holistic education are deemed irrelevant in the personalized learning paradigm, which delivers knowledge in narrow and isolated learning capsules. Van Dijck and Poell have dubbed this the process of learnification, wherein “the social activity of learning is broken into quantifiable cognitive and pedagogical units” (Djick and Poell 2015). By focusing solely on syllabus material, this model reinforces the test-prep mentality of rote memorization to crack an exam. Further, by negating the creative and collaborative contribution of students, this paradigm also reduces teachers to passive participants who have little control or ownership over educational processes. The implementation of personalized learning, by diluting the teacher’s role and responsibilities, has been shown to similarly undermine the role and agency of teachers in the American public education system (Kim 2019).

Philosophers of education have highlighted the socially transformative role of education in building a thriving democracy and sustaining culture and community values. Dewey elaborated on the critical importance of experiential learning (Dewey 1938). He underscored the importance of hands-on learning that draws connections with the learner’s own lived experience, rather than the passive absorption of concepts alien to the student’s life world. Such a learning experience would equip the collective to apply the principles learned in school to real-world situations, and to organize society around these mutually agreed upon principles. Freire (1970) wrote about the importance of dialogue and communication in the pedagogical process. He notes that one-sided narrative instruction by teachers who treat students as empty containers or receptacles to be filled up produces docile, obedient, and unquestioning subjects. Deshpande (2020) explains that the social and political role of university campuses, “as exemplary sites of social inclusion and relative equality, is arguably even more important than the scholastic role.” Thus, the value of education lies in cultivating a politically conscious citizenry and safeguarding democratic values. These values are undermined by personalized learning models which isolate and insulate students from their surroundings and peers.

Platform and AI regulation

The emancipatory ideals of education can only be achieved by seeing it as a public good that needs to be universally and equitably provisioned, as envisaged in the Indian Right to Education Act, 2009 (RTE 2009). The ed-tech sector must be considered sui generis within the platform economy, given the socio-political importance of education in a democracy. The RTE, enacted under Article 21A of the Indian Constitution, affirms free and compulsory education for every child from the age of 6 to 14 years as a fundamental right. Education as a public good must be distributed on the basis of democratic principles of equity, and ought not to be left to the logic of the market. Ed-tech services also need to conform to the accepted curricular aims and frameworks of the country. AI must be cautiously implemented, perhaps more as a pedagogical support tool for the teacher (France 2020) rather than being used for direct student learning. The algorithms used to process data must be made available for public scrutiny (auditable AI) for the assumptions they make (explainable AI), the educational aims they serve, and the biases they hide. Closed-source algorithms are black boxes which hide the curricular and pedagogical assumptions they make; these cannot be validated for alignment with curricular frameworks or accepted aims of education. Hence, algorithms used in education must be open source. Even open-source AI must be selectively and sparingly used, and only after thoroughly considering the potential risks involved (Zimmermann 2021).

Recently, China has mandated its large and burgeoning private coaching/tutoring industry to operate on a non-profit basis (Koenig 2021). Such a step would eliminate venture capital funding and the consequent pressures to form monopolies and to hoard and harvest data. This move will also ease the intense academic pressures on children and the financial burdens on parents. China’s action is based on the idea that private tutoring is essentially an educational service and hence cannot be a commercial activity. China’s new policy stance is an important point of reference for India, as there are resemblances between the two economies in terms of size and the positions of significance that the private tutoring/coaching industry occupies within them. Notably, China also recognizes that the ed-tech sector must be regulated to serve the political goals of education. ‘Not for profit’ need not mean that an activity should not generate a revenue surplus, but that there can be no economic return on investment for the enterprise owners. Unfortunately, in a move in the opposite direction, the Ministry of Education has proposed the ‘National Educational Alliance for Technology (NEAT)’ scheme, inviting private ed-tech vendors to advertise their proprietary ‘adaptive learning’ solutions on its site.

The digital technology sector is predisposed towards the formation of mono/oligopolies and anti-competitive practices like predatory pricing and integration across lines of business, including through mergers and take-overs. To avoid this, some regulatory mechanisms need to be put in place. Firstly, the structural separation of the infrastructure, data/content, and AI layers is crucial to prevent monopolistic practices. Secondly, data collection must be stringently regulated to ensure the safety and well-being of children. The provisions in the Personal Data Protection Bill pertaining to children’s data (Section 16) are necessary, but instead of relying on individualistic consent-based models, which would de facto vest ownership rights over students’ data with platforms, community data ownership models should be implemented (MEITY 2020). Data about students and teachers and their learning transactions must belong to the school and the parent community, even if hosted on data platforms. Access to this data for AI processing and providing services to schools must be regulated, including by making it a not-for-profit enterprise.

It is important to question the dominant techno-utopian and mythologizing discourse that has been built around ed-tech. Buzzwords such as “student-centered tech” or “context-based learning” tend to conceal the underlying vested commercial and political interests. Audrey Watters writes, “‘Re-imagining’ is a verb that education reformers are quite fond of. And ‘re-imagining’ seems too often to mean simply defunding, privatizing, union-busting, dismantling, outsourcing” (Watters 2020). In the absence of any regulatory oversight, ed-tech will evolve into yet another digital kleptocracy, with disproportionate power and influence concentrated in the hands of select large private actors unaccountable to the public, possessing the power to influence and steer education policy and curriculum. This would hollow out the public education system, and take us further away from achieving the educational promise of social transformation.

Author information: Gurumurthy Kasinathan, Director, IT for Change; and Amshuman Dasarathy, Research Assistant, IT for Change.

The article was carried by Economic and Political Weekly, Jan 1, 2022.


Notes

  1. Unni Krishnan, J.P. vs State Of Andhra Pradesh. 1993 AIR 2178

  2. The Right of Children to Free and Compulsory Education Act, 2009

  3. Ministry of Education, “Ministry of HRD announces National Educational Alliance for Technology (NEAT) Scheme for better learning outcomes in Higher Education,” 2019, Ministry of Education.

  4. IT for Change, “IT for Change’s Feedback to the Draft Digital Markets Act, 2020,” (2021).


References

  1. Banerji, Olina (2021a): “How world’s largest edtech Byju’s makes rivals an offer they can’t refuse,” 14 June, The Ken, (viewed on 30 September).

  2. Banerji, Olina (2021b): “WhiteHat Jr turns white knight as Byju’s – Disney fairytale stumbles,” 2 September, The Ken, (viewed on 30 September).

  3. Bouchrika, Imed (2021): “How Google conquered the classroom: The Googlification of schools worldwide,” 1 March, Research, (viewed on 30 September).

  4. Byjus (2021): “BYJU’s and Google Partner to offer ‘Learning Solution’ for Schools,” 2 June, Byjus, (viewed on 30 September).

  5. Clark, Mitchell (2021): “Students of color are getting flagged to their teachers because testing software can’t see them,” 8 April, The Verge, (viewed on 30 September).

  6. Deshpande, Satish (2020): “Online education must supplement, not replace, physical sites of learning,” 27 May, The Indian Express, (viewed on 30 September).

  7. Dewey, John (1938): Experience and Education, Kappa Delta Pi.

  8. Di Cosmo, Roberto and Dominique Nora (1998): “Hijacking the World: The Dark Side of Microsoft,” Di Cosmo, (viewed on 30 September).

  9. Djick, J. Van and T. Poell (2015), “Higher Education in a networked world: European Responses to U.S. MOOCs,” Consumer Social Responsibility eJournal, Vol 9, pp 2674 – 2692.

  10. Dutta, Nilanjana (2015): “13 traits ‘Sharmaji ka Beta’ has that you don’t,” 20 May, Storypick, (viewed on 30 September).

  11. EFF (2015): “Google deceptively tracks students’ internet browsing, EFF says in FTC Complaint,” 1 December, EFF, (viewed on 30 September).

  12. France, Paul Emerich (2020): “Three tips for personalizing in a pandemic,” 6 July, Paul Emerich France, (viewed on 30 September).

  13. Freire, Paulo (1970): Pedagogy of the Oppressed, Brazil.

  14. Graham, Megan and Jennifer Elias (2021): “How Google’s $150 billion advertising business works,” 18 May, CNBC, (viewed on 30 September).

  15. Heaven, Will Douglas (2020): “Predictive policing algorithms are racist. They need to be dismantled,” 17 July, MIT Technology Review, (viewed on 30 September).

  16. ILO (2021): “Platform labour in search of value: A study of workers’ organizing practices and business models in the digital economy – Executive Summary,” 7 September, ILO, (viewed on 30 September).

  17. John, Rachel (2020): “Pushy Indian parents now have a new goal for six-year olds – coding,” 11 October, The Print, (viewed on 30 September).

  18. Johri, Ankita Dwivedi (2015): “Inside Kota coaching factory – Why an 18-yr-old signs off her note: ‘Mein aaj apni life khatam karti hoon,’” 26 November, The Indian Express, (viewed on 30 September).

  19. Kannan, Ramya (2021): “Explained | How will the UK’s Children’s Code impact digital space norms?,” 12 September, The Hindu, (viewed on 30 September).

  20. Kasinathan, Gurumurthy (2020): “Making AI Work in Indian Education,” IT for Change, (viewed on 30 September).

  21. Katwala, Amit (2020): “An algorithm determined UK Students’ grades. Chaos ensued,” 15 August, Wired, (viewed on 30 September).

  22. Khan Lina (2017): “Amazon’s Antitrust Paradox,” The Yale Law Journal, Vol 126, pp 710 – 805.

  23. Kim, E. Tammy (2019): “The messy reality of personalized learning,” 10 July, The New Yorker, (viewed on 30 September).

  24. Koenig, Rebecca (2021): “Online Tutoring in China was booming. Then came a dramatic shift in regulations,” 26 July, Ed Surge, (viewed on 30 September).

  25. KPMG (2017): “Online Education in India: 2021,” KPMG, (viewed on 30 September).

  26. MEITY (2020): “Report by the Committee of Experts on Non-Personal Data Governance Framework,” 2020, Ministry of Electronics and Information Technology.

  27. Namrata (2011): “Teachers’ beliefs and expectations towards marginalized children in classroom setting: A qualitative analysis,” Procedia – Social and Behavioral Sciences, Vol 15, pp 850 – 853.

  28. NCERT (2006): “National Focus Group on Aims of Education,” 2006, NCERT.

  29. Nguyen, C. Thi (2020): “Echo Chambers and Epistemic Bubbles,” Episteme, Vol 17, Issue 1, pp 141 – 161.

  30. Niti Aayog (2018): “National Strategy for Artificial Intelligence #AIforAll,” 2018, Niti Aayog.

  31. Ramanathan, Usha (2015): “How the Committee that drafted the DNA Bill ignored a note of dissent,” 26 July, The Wire, (viewed on 30 September).

  32. Singer, Natasha (2017a): “How Google took over the classroom,” 13 May, The New York Times, (viewed on 30 September).

  33. Singer, Natasha (2017b): “How Silicon Valley pushed coding into American classrooms,” 27 June, The New York Times, (viewed on 30 September).

  34. Singh, Akanksha (2021): “Hard sells and ‘toxic’ targets: How Indian edtech giant Byju’s fuels its meteoric rise,” 20 August, Rest of World, (viewed on 30 September).

  35. Statcounter (2021): “Search Engine Market Share Worldwide,” (viewed on 30 September).

  36. TDH (2020): “WhiteHatJr. Says Wolf Gupta Ad with 20 Cr. Job Package at Google Was Fake to High Court Delhi, India,” 25 November, TDH, (viewed on 30 September).

  37. Toyama, Kentaro (2011): “There are no technology shortcuts to good education,” Educational Technology Debate, (viewed on 30 September).

  38. Watters, Audrey (2020): “The Ed-Tech Imaginary,” 21 June, Hack Education, (viewed on 30 September).

  39. Zimmermann, Annette (2021): “Stop building bad AI,” 21 July, Boston Review, (viewed on 30 September).

