OTTAWA — One of Canada’s leading experts on online child protection says she is “very optimistic” that a panel advising cabinet ministers on tackling online harm can chart a way forward to protect minors from online sexual exploitation.
Lianna McDonald, executive director of the Canadian Centre for Child Protection, is one of about 10 people named last week to the panel of experts tasked with helping the government draft new child protection laws.
Heritage Minister Pablo Rodriguez and Justice Minister David Lametti are preparing to reintroduce a bill to tackle online harm, including racist and anti-Semitic abuse. The first version was introduced in the closing days of the last Parliament and died on the order paper when the election was called, without having been debated in the House of Commons.
The addition of Ms. McDonald to the new advisory panel signals that tackling online child abuse will be an important part of the upcoming bill.
She said there was an urgent need to find a way to force tech companies to quickly remove indecent images of children as reports of victimization continue to mount.
The centre’s tip line, Cybertip.ca, saw a 37 percent increase in reports of online child victimization last year. The median age of victims reporting online victimization and non-consensual sharing of intimate images is 14, with some as young as 12.
The tip line also saw a 79 percent increase in reports of “sextortion” — extorting money or sexual favors from minors with threats of revealing evidence of their sexual behavior.
Ms. McDonald advised UK officials on the new UK Online Harms Bill, which she says is a “game changer” in this area.
Canadian officials have been studying UK legislation that would impose a “duty of care” on tech companies, requiring them to quickly remove child abuse images from their platforms or face significant fines.
Ms. McDonald said a voluntary approach that asks companies to remove lewd and exploitative images isn’t working, and millions of explicit photos and videos are still circulating.
“It can fester for decades,” she said.
The Canadian Centre for Child Protection runs its own program, called Project Arachnid, that searches the internet for images of child abuse. It has detected millions of exploitative images, prompting takedown notices to tech companies and leading to the removal of six million images and videos since 2017.
Sean Litton, executive director of the Tech Coalition, which was formed to fight child exploitation online, said the Arachnid program plays a valuable role, but the vast majority of exploitative content is discovered and taken down by the industry itself.
“The tech industry has a very special responsibility to ensure their platforms are safe for children,” he said. “Members of the Coalition take this very seriously and are proactive in identifying, removing and reporting such content.”
Members of the coalition include Facebook, Google, Twitter, Yahoo, Discord and TikTok.
YouTube says it spends significant time and resources removing violating content as quickly as possible, taking down more than 1.8 million videos that breached its policies between April and June last year. This includes videos with sexual or obscene themes aimed at young minors and families.
But Ms. McDonald said companies sometimes resist requests to remove exploitative images, including images of known victims of sexual exploitation dating back decades.
She said the centre has had to argue over whether certain graphic and suggestive images of young children met companies’ guidelines for removing harmful material.
“Videos of children being abused do not always fit a clear internal code. We had to go back and forth with companies to get the videos taken down,” she said.
Ms McDonald noted that tech platforms have been slow to remove not only images but also online discussions about child sexual exploitation.
She said that although many pedophiles communicate on the dark web, much of the material resides on the open web and could be quickly discovered and removed if the tech companies agreed to act.
The House of Commons Ethics Committee released a report last year saying Canadians whose images are posted to Pornhub or other online streaming platforms without their consent should have the right to have them removed immediately.
The report also recommended holding online platforms accountable when they fail to ensure material is promptly removed, and taking steps to ensure that those depicted in pornographic content are at least 18 years of age and have consented to its publication.
The committee conducted its study on sites like Pornhub, owned by Montreal-based MindGeek, after a New York Times article alleged that the site hosted videos of child sexual abuse and exploitation.
Pornhub said it has “strict” policies to “tackle and take down unwanted content” that far exceed those of other major platforms. The company said it is at the forefront of fighting and eliminating illegal content, and that it bars uploads from unverified users.
A spokesperson said anyone who appears in posted material without consent should have the right to have it removed immediately, as is Pornhub’s policy.
The Liberals had promised to reintroduce the online harms bill within 100 days of the fall election, a deadline that has already passed.
A consultation launched last summer suggested its passage could be fraught with pitfalls, with some respondents saying the law could chill freedom of expression online and infringe the right to privacy.
Some members of the new expert panel were among the critics of the first draft of the bill.
The panel will hold workshops and conduct additional consultations, including with online platforms, over the next two months, and Minister Rodriguez said the bill would be introduced as soon as possible thereafter.