Opinion

It’s a much-needed step to protect the integrity of our electoral process, but whether it is capable of accomplishing that important objective before the next federal election is an open question.

On Nov. 4, Canada’s chief electoral officer, Stéphane Perrault, released a report entitled “Protecting Against Threats to the Electoral Process.” In that document, he makes four key recommendations aimed at “addressing emerging threats arising from artificial intelligence and deepfakes.” He suggests that the “impersonation provision” of the Canada Elections Act be expanded to apply to the misrepresentation of candidates and other key participants in the electoral process through the manipulation of their voice or image without their consent.
He also suggests that the “misleading publications” provisions of the Canada Elections Act be amended so that they also apply outside an election period, both inside and outside Canada, and explicitly protect party leaders, leadership contestants and nomination contestants. More importantly, Perrault’s report recommends that “all paid and unpaid electoral communications (image, audio, video or text) distributed during a regulated pre-election and election period, or a contest, that have been generated or manipulated by AI should include a clear transparency marker.” The phrase “electoral communications” would include all communications to the public made by or on behalf of a political entity, including a registered third party.
In addition, it would include communications by any other entity whose purpose is to influence electors to vote or not to vote, or to vote for or against a candidate or party. The requirement would also apply to nomination and leadership contests during the contest period. Finally, Perrault recommends that “In order to ensure that accurate information is being distributed about when, where and how to register and vote ... platforms that have AI-generated chatbots or search functions should be required to indicate in their responses where users can find official or authoritative information.”

Any person who has spent any time on social media lately has seen firsthand how innovations in the technology have made it possible to easily manipulate a person’s voice, and to replace someone’s face in a video with the face of a different person. The software enables ill-intentioned people to spread false information and sow confusion among voters, and that is one of the key threats that Perrault says needs to be confronted. Indeed, his report says that “AI images of people doing things they never did, audio of them saying things they never said or created videos can threaten democracy and make it difficult for a voter to know what is real and what is a deepfake.”
He also cautions that “foreign state actors could leverage the power of technology to create deepfakes in order to influence or undermine the electoral process.” Last week, Perrault told the media that he is “hoping to convince” the Trudeau government to expand Bill C65, an electoral reform bill currently being considered by the House of Commons, to include provisions that reflect his recommendations. Given the seriousness of the threat posed by deepfakes and AI to the integrity of our elections, and the fact that neither the current version of the Canada Elections Act nor Bill C65 addresses the issue, our Liberal government should not require convincing that stronger, more detailed provisions are necessary.
In fact, such laws are overdue. While Canada has been largely asleep at the wheel on the issue, jurisdictions around the world have been amending their respective election laws to include anti-deepfake and anti-AI provisions. Passage and implementation of Perrault’s recommendations by Canada’s Parliament would simply allow Canada to catch up with those jurisdictions.
We would be more in line with other democracies, but only if the provisions are enacted before the next federal election. Given the slow pace of our legislative process and the reality that an election could be triggered at any time — it must occur no later than next October — it is doubtful that new anti-deepfake and anti-AI provisions can be passed and implemented before Canadians vote to elect their next national government. That troubling reality means Canada is unlikely to have any laws in place to protect voters from deepfakes and other AI-based manipulation before and during the next federal election.
And that means the integrity and validity of our next election are at risk.