
‘Ground is shifting’ for social media giants, says federal justice minister pushing Online Harms Act | CBC News


Warning: This story deals with suicide. If you or someone you know has been the victim of sexual extortion, or is struggling with mental health, you can find resources for help at the bottom of this story.


Too many young people are dying because of crimes mediated by online platforms, says federal Justice Minister Arif Virani, and he believes his Online Harms Act — introduced in Parliament in February — can help put a stop to it.

One of those people was Prince Edward Islander Harry Burke, who died by suicide a year ago this month during a sextortion attempt. The 17-year-old’s story was brought up in the P.E.I. legislature last month as MLAs voted to have the day of his death marked as Cyberbullying Awareness Day.

“Keeping people safe online, kids like Harry, is long overdue,” said Virani.

“That’s a needless death. It never needed to happen. The same goes for Amanda Todd, Rehtaeh Parsons, and the list is long, and unfortunately it’s growing.”

The Canadian Centre for Child Protection says every day it receives about 10 reports concerning sextortion, where young people are extorted for money after sharing intimate images. There is no way of knowing how many cases there actually are in Canada, or how many of them end tragically.

In a majority of these cases, as in Burke’s, teens are identified as potential targets and contacted through social media by people pretending to be someone they are not.

Their son was a victim of sextortion. They’re speaking out to protect others

The family of P.E.I. teen Harry Burke is speaking about their son’s death by suicide, which happened less than 12 hours after he shared intimate pictures with a new contact on Snapchat and became a victim of sextortion. Carl Burke and Barbie Lavers hope to raise awareness and protect other teens from sexual predators.

“What the platforms need to realize is that people will no longer tolerate it. I certainly will no longer tolerate it as minister of justice and father of two youngsters. Neither will Canadian parents,” said Virani.

“I think social media giants are seeing that the ground is shifting, particularly among western democratic nations.”

Who is responsible for content?

The act, currently awaiting second reading, would make social media companies explicitly responsible for designing products so that they are safe for young people to use.

How companies will do this is not spelled out, but it is a big change in culture for an industry that has for decades been guided by the American Communications Decency Act of 1996. That law defined interactive computer services as carriers that are not responsible for what people publish on them.

It will take years to understand how the Online Harms Act is going to work, says Dalhousie law Prof. Suzie Dunn. (Submitted by Suzie Dunn)

“There’s a significant culture around not making social media companies liable for the content that’s on their website,” said Suzie Dunn, a law professor at Dalhousie University in Halifax.

“But more and more lawyers and governments are trying to find ways to encourage social media to act more responsibly — and in some cases be financially responsible for some of the harms that occur on their sites.”

Australia and some European countries have also drafted laws to make social media companies more liable for content. This is part of the shifting ground Virani is talking about.

When he started working on this bill in 2020, Virani said, he had concerns that social media companies would refuse to acknowledge being subject to Canadian-made laws. But with so many countries moving in this direction, he is finding the social media giants are now willing to work with authorities in Canada.

An international problem

Sexual extortion of youth is just part of a much larger social media problem that Canada is facing. Crime on the street is down, but crime mediated through online platforms, often perpetrated by people outside the country, is growing quickly.

Between 2014 and 2022, reports to police of extortion in all its forms rose 300 per cent. Fraud was up almost 70 per cent, and indecent and harassing communications rose even more. Uttering threats was up 27 per cent.

Breaking and entering, on the other hand, is down more than 20 per cent. Canadians are now more likely to be victims of fraud than of in-person crimes.

That issue recently drew a couple of dozen seniors to an RCMP workshop in the small community of Crapaud, 40 kilometres west of Charlottetown.

Over the last decade, RCMP on Prince Edward Island have presented dozens of workshops on how to avoid fraud and scams online. (Jane Robertson/CBC)

“It’s important to stay on top of all the latest scams, and hear about the scams and how they work,” said Paul Stevens, one of those who attended.

It took the two RCMP officers presenting the workshop 90 minutes just to outline the various ways fraudsters try to pry money from Canadians.

Cpl. Gavin Moore, an RCMP communications officer, said it’s hard for police to do anything about a fraud once it has occurred.

“It does bring about a whole different set of challenges in that they’re operating outside of the Canadian jurisdiction,” he said. “To address this, it’s key that we inform Islanders as best we can.”

Crimes mediated through online platforms come with special challenges for solving them, says RCMP Cpl. Gavin Moore. (Tony Davis/CBC)

RCMP started presenting workshops like the one in Crapaud a decade ago, and they have done more than 70 presentations to both community groups and businesses.

The need is obvious: The force issued a news release Tuesday saying Islanders reported $193,100 in cryptocurrency scam losses to the RCMP last year, with 10 of the cases involving thefts of over $4,000.

Resolving cases is more difficult

Prevention is key because, as the number of these often international crimes grows, clearance rates have collapsed.

In 2014, 40 per cent of extortion reports were cleared. In 2022 the rate was 11 per cent. Police cleared 30 per cent of fraud reports in 2014. That rate also fell to 11 per cent in 2022.

The perpetrators are well aware of their odds of getting caught, said Simon Fraser University criminology Prof. Richard Frank.

“If they’re abroad, then they’re safe,” said Frank. “There’s no real risks to them committing these frauds or attempts at fraud.”

Online but local

The rise in online-mediated crime is not always international in nature. Threats and harassing communications can also come from people who are known to the victims. These too are on the rise, they too are often mediated through online platforms, and they too fall under measures like the ones laid out in Virani's new act.

It is time to rethink what constitutes violence, says Jane Ledwell. (Submitted by Jane Ledwell)

“There are way more ways of accessing people’s privacy, accessing people’s information, reaching people through multiple ways, social media and phones, and all of the ways that we’re exposed to the world,” said Jane Ledwell, executive director of the P.E.I. Advisory Council on the Status of Women.

And women are often the target of this harassment, sometimes from former partners who still feel entitled to their attention, said Ledwell.

Racialized people are a target too, said Sobia Ali-Faisal, executive director of the P.E.I. advocacy group BIPOC-USHR. The acronym stands for Black, Indigenous, and People of Colour United for Strength, Home, Relationship.

“People seem a lot more comfortable being racist online,” said Ali-Faisal. “That people are more comfortable expressing racist views online creates this sense that ‘Everybody else is feeling this way.’ Someone who has these views, and they share them, they get a lot of people sharing similar views that might result in them being more comfortable expressing those views in real life as well.”

Developing comfort in saying racist things online can spill over into racist actions on the street, says Sobia Ali-Faisal. (Tony Davis/CBC)

Expressions that actually reach a criminal level are not nearly as common as racist slurs yelled on the street or posted in a Facebook comment, but they all play a part in making racialized people feel unsafe, said Ali-Faisal.

Ledwell believes it is time to revisit where the line between inappropriate language and criminal action is drawn.

“There is a real concern in the community, when it comes to gender-related violence, about those things that don’t quite meet the threshold of criminal behaviour,” she said.

“What constitutes inappropriate behaviour, what constitutes abuse, what constitutes violence? It’s really important to rethink some of those things.”

‘Threading the needle’

It's a lot for one act of Parliament to fix while also balancing broader concerns.

“We’re trying to find this balance in terms of protecting free expression and ensuring we’re protecting children,” said Virani. “I feel we’re threading the needle quite well with this bill.”

With international crimes difficult to prosecute, prevention through education is currently a favoured tool for the RCMP. (Jane Robertson/CBC)

While protecting children is a major focus of the Online Harms Act, hate crimes will also be more clearly defined. Platforms will be required to do more to block unwanted communications and to improve how objectionable content is reported and handled.

Having material removed from the web would also be made easier. Under current conditions, pictures and videos can keep circulating on the internet long after their source has been identified and prosecuted.

“Even after Amanda Todd’s perpetrator was prosecuted, her victimization continues 10 years after the fact because those images continue to circulate,” said Virani.

More than a decade after her death, the pictures used to extort Amanda Todd can still be found on the internet. (CBC/The Fifth Estate)

The act also tries to ensure that people will know when a ‘person’ they are communicating with is not real.

“We’ve got a responsibility on Facebook and other online platforms to identify an inauthentic communication that’s generated by a robot,” said Virani.

Private versus public

In its efforts to protect free expression between individuals, the act does leave a large loophole.

“The Online Harms Act doesn’t address issues that occur on private channels, like within a DM or a text message,” Dunn noted.

“There are aspects of harmful behaviour that also need to be addressed by the government through criminal law or civil law that are more internal, so certain things like extortion might be happening on private channels.”

In a typical sextortion incident, or with online threats or harassment, the initial contact may be through public channels but the real harm happens when the conversation moves to private channels.

Social media companies have been willing to work with Canada on the Online Harms Act, says Justice Minister Arif Virani. (Sean Kilpatrick/The Canadian Press)

Age-appropriate features required in the bill may indirectly address this problem, said Virani. For example, platforms could make it harder for an adult to interact with a minor’s account, or learn who the young person’s connections are.

“You can eliminate a situation where it becomes knowable to the world at large who is within your network,” said Virani.

The idea is that if you've sent intimate images to an extortionist who doesn't have your contacts, that person can't credibly threaten to send them to your friends and relatives. And threats to post them publicly would be covered by the act.

Many details to be worked out

How quickly change can come is another question.

The act lays out broad principles of responsibility, but the details will be in the regulations and will evolve under the guidance of three new bodies created by the act: the Digital Safety Commission, the Digital Safety Ombudsperson, and the Digital Safety Office of Canada.

These bodies will spell out specifically what the responsibilities laid out in the act look like in practice. That is likely to take some time, said Dunn.

“It will take a few years once this bill has passed to see what the safety plans are,” she said. “The companies are going to have this expectation to assess their platforms, to identify the types of harm that exist on their platforms and then report back about what kind of mitigation strategies they’re doing and whether those… have been effective.”

WATCH: Video from CBC’s Power & Politics in February, when Virani’s bill was introduced in Parliament:

New online harms bill proposes changes to Criminal Code | Power & Politics

The Liberal government introduced its long-promised online harms bill Monday, proposing new regulatory bodies and changes to a number of laws to tackle online abuse. Justice Minister Arif Virani discusses the new bill. Plus, Emily Laidlaw, Canada Research Chair in cybersecurity law, weighs in.

While some harms that must be protected against are specifically outlined in the act, such as content that sexually victimizes a child, some other issues — such as impersonation — are not. Whether a person will be allowed to present themselves as someone they are not on social media has yet to be determined.

That those details are not written into the act is partly by design. The regulatory bodies will have to keep up with a constantly changing landscape, said Virani.

He said that’s how it needs to be, because any effort to define and fix all the problems today will soon be left behind by new technology and changing internet culture.


If you or someone you know is struggling, here’s where to get help:

