    Governments Recommended Unsafe EdTech Products During Pandemic

    Online learning, or EdTech, products endorsed by many governments across the world enabled the surveillance of children, says Human Rights Watch. The products monitored or had the capacity to monitor children, in most cases secretly and without consent, and in many cases harvested their data.

    Digital tools used for learning during the COVID-19 pandemic swept up data from children and about children, exposing their digital footprints, an analysis by the rights group Human Rights Watch reveals.

    HRW’s report says that the overwhelming majority of the education technology (EdTech) products it analysed, endorsed by 49 governments of the world’s most populous countries, surveilled or had the capacity to surveil children in ways that risked or infringed on their rights.

    On Tuesday, HRW released technical evidence and easy-to-view privacy profiles for 163 EdTech products recommended for children’s learning during the pandemic.


    HRW says that the evidence underpins an earlier report by the organisation titled, “‘How Dare They Peep into My Private Life?’: Children’s Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic.”

    The analysis showed that of the 163 products reviewed, 145 (89 per cent) surveilled or had the capacity to surveil children outside school hours and deep into their private lives.

    Harvesting data

    These EdTech products “appeared to engage in data practices that put children’s rights at risk, contributed to undermining them, or actively infringed on these rights,” the report says.

    Many of the products were found to harvest detailed personal information about children and their families.

    “These products monitored or had the capacity to monitor children, in most cases secretly and without the consent of children or their parents, in many cases harvesting data on who they are, where they are, what they do in the classroom, who their family and friends are, and what kind of device their families could afford for them to use,” the report notes.

    Few governments checked whether the EdTech products they rapidly endorsed during the COVID-19 pandemic were safe for children to use, and many put children’s rights at risk or violated them directly. Of the 42 governments that provided online education by building and offering their own EdTech products, 39 made products that handled children’s personal data in ways that risked or infringed on their rights.

    Kept in the dark

    Human Rights Watch found that the data surveillance took place in educational settings where children could not reasonably object to such surveillance. Most companies did not allow students to decline to be tracked, and most of this monitoring happened secretly, without the child or their family’s knowledge or consent. In most instances, it was impossible for children to opt out of such surveillance without giving up on their formal education during the pandemic.

    “Children, parents, and teachers were largely kept in the dark about the data surveillance practices we uncovered in children’s online classrooms,” said Hye Jung Han, children’s rights and technology researcher and advocate at HRW. “By understanding how these online learning tools handled their child’s privacy, people can more effectively demand protection for children online.”

    Algorithmic bias

    One explanation for such conduct is a lack of knowledge of children’s rights, stemming from inadequate sensitisation or training, says an analytical piece in The Conversation. The article highlights the Responsible Data for Children (RD4C) initiative, launched by The GovLab and UNICEF to help avoid unintended negative consequences for data subjects and beneficiaries.

    Developers do not design for or implement the additional protections required for children and their data. “The danger posed by algorithmic bias is especially pronounced for children and other vulnerable populations,” the article in The Conversation says. “These groups often lack the awareness or resources necessary to respond to instances of bias or to rectify any misconceptions or inaccuracies in their data.”

     

    Image: UNICEF
