MONTRÉAL, Nov. 12, 2024 /CNW/ - The Canadian Institute for Advanced Research (CIFAR) is proud to announce that it will play an essential role supporting the Canadian Artificial Intelligence Safety Institute (CAISI), a highly anticipated initiative unveiled today by the Government of Canada. This new institute is a pivotal step in ensuring the safe and responsible development and deployment of artificial intelligence (AI) technologies, reinforcing Canada's position as a global leader in AI.
Recent studies have shown that a lack of confidence in AI safety is currently blocking the adoption of AI technologies by many Canadian businesses, putting their competitive advantage and productivity at risk. By addressing the societal, technical and ethical challenges around AI, CAISI will play a crucial role in building trust in AI technologies and promoting their responsible adoption among Canadian businesses and institutions.
Building on its leadership role since 2017, when the Government of Canada identified CIFAR as a partner for the Pan-Canadian AI Strategy (making Canada the first country to implement such a strategy), CIFAR will now implement the applied and investigator-led research stream of the Canadian AI Safety Institute. Funded by a $27 million contribution to CIFAR from the Government of Canada, this research stream will be steered by CIFAR in collaboration with the National AI Institutes, and will take a multidisciplinary approach, engaging experts from diverse fields to address and mitigate both the short- and long-term risks of AI systems. By partnering with research institutions across the country, it will build on the research Canada's experts have already begun in areas of AI safety, such as detecting AI-generated content, contributing to the evaluation of advanced AI models, ensuring the safe adoption of AI in high-risk applications and ensuring privacy in AI systems.
"The Canadian Artificial Intelligence Safety Institute will propel Canada to the forefront of global efforts to use AI responsibly, and will be a key player in building public trust in these technologies. In a world that's evolving quickly and full of unknowns, Canadians can be confident that we will always take the necessary steps to ensure the AI they use is safe, responsible, and trustworthy," says the Honourable François-Philippe Champagne, Canada's Minister of Innovation, Science and Industry.
The CAISI applied and investigator-led research stream will support priorities identified by CAISI, while considering key international reports such as the International Scientific Report on the Safety of Advanced AI, developed as part of a global initiative following the inaugural international AI Safety Summit in 2023. Yoshua Bengio, a Canada CIFAR AI Chair and Founder and Scientific Director at Mila, was appointed to lead the creation of this report.
To ensure a comprehensive and collaborative effort to understand and mitigate AI risks, CIFAR CAISI research activities will be guided by a Scientific Committee, which will include researchers and representatives from CIFAR, the National Research Council of Canada, and the country's three National AI Institutes: Mila in Montréal, the Vector Institute in Toronto and Amii in Edmonton. The program will leverage the extensive expertise of these renowned institutes.
Stephen Toope, President and CEO of CIFAR, expressed his enthusiasm for the initiative, saying: "As the world grapples with the tremendous potential and the risks of AI, CIFAR is pleased to play a leading role in Canada's response. The applied and investigator-led research stream at CIFAR will draw upon the strengths of Canada's robust AI scientific community in order to advance world-leading research on AI safety, for the benefit of all."
The launch of the Canadian AI Safety Institute marks a significant milestone, as Canada joins a select group of countries with their own AI safety institutes, including the United Kingdom and the United States. The announcement comes just days before the inaugural meeting of the International Network of AI Safety Institutes in San Francisco, and ahead of the AI Action Summit in Paris this February, solidifying Canada's commitment to the safe development and adoption of AI technology.
"Canada has, over the course of decades, built a strong network of Canadian AI researchers, thanks in part to the Pan-Canadian AI Strategy," comments Elissa Strome, Executive Director, Pan-Canadian AI Strategy at CIFAR. "Our world-leading scientists are highly engaged in the scientific questions that inform the development of safe and responsible AI. This initiative will ensure that Canada continues to be a leader in the ongoing global scientific response to the development and deployment of this new and powerful technology."
CIFAR is currently seeking to fill three open positions for Members at Large on its Scientific Committee. Applications are accepted until December 16, 2024.
About CIFAR
The Canadian Institute for Advanced Research (CIFAR) is a globally influential research organization proudly based in Canada. We mobilize the world's most brilliant people across disciplines and at all career stages to advance transformative knowledge and solve humanity's biggest problems, together. Through our 15 research programs and the Pan-Canadian AI Strategy, CIFAR convenes close to 400 of the world's top researchers from 141 institutions in 20 countries. This community has included 23 Nobel Prize laureates, 3 Turing Award winners, 37 recipients of the Gerhard Herzberg Canada Gold Medal for Science and Engineering and 2 recipients of the Social Sciences and Humanities Research Council (Canada) Gold Medal. Our early-career programs attract and support some of the most exciting young researchers in the world today. We are supported by the governments of Canada, Alberta and Québec, as well as Canadian and international foundations, individuals, corporations and partner organizations.
SOURCE Canadian Institute for Advanced Research
Media Contact: Eric Aach, [email protected], 514-569-3594; Anna Woodmass, [email protected], 416-571-2147