On December 1, 2020, I had the pleasure of attending IP Osgoode’s Bracing for Impact conference series. Due to the COVID-19 pandemic, this instalment of the series was held online as a webinar. This year’s theme was cyber challenges to human rights, increasingly one of the critical issues the global legal community has been attempting to tackle. The conference began with Professor D’Agostino’s opening remarks, which introduced IP Osgoode and the distinguished panel of expert speakers.
Professor D’Agostino explained that the purpose of the webinar was to shine a spotlight on global leaders in cyberspace working in the areas of national security, law enforcement, and the war on terror, as effectively tackling crime and terror requires advanced technological tools. As we address these challenges, complex issues relating to public policy and human rights protection arise.
York University’s Head of Research & Innovation, Professor Amir Asif, gave opening remarks on York University’s social and environmental impact and commitments. He noted that York University had been recognized by Times Higher Education (THE) in its Impact Rankings, placing 33rd out of 767 universities for how it has tackled serious economic, social, and health challenges at home and around the world. Professor Asif identified cyber challenges to human rights as one of the key emerging global concerns. He added that York University’s future goals include global digital connectivity and exploring every possibility for collaboration and problem-solving to enhance the way we think, learn, and create. He then introduced Professor Irwin Cotler.
Professor Irwin Cotler is the Founder and Chair of the Raoul Wallenberg Centre for Human Rights and a former Minister of Justice and Attorney General of Canada. He spoke about the global pandemic, global digital authoritarianism, the global assault on media freedom, and the dystopian use and abuse of cyber technologies, all of which threaten our democracies’ values and fundamental freedoms. Professor Cotler referred to “The Right to Privacy,” the article by Samuel Warren and Louis Brandeis, which anticipated the violations of privacy rights that could arise from the use of mechanical devices. This issue is especially concerning because the law struggles to keep pace with rapidly emerging technologies.
Furthermore, Professor Cotler elaborated on his concept of the dystopian use of cyber technologies by describing the proliferation of state-sponsored cyber warfare, including the weaponization of cyber technologies to attack elections, power grids, healthcare institutions, and oilfields, and the rise of ransomware attacks targeting democratic institutions. In addition, malicious actors have weaponized social media to repress and silence dissent at home and to spread disinformation that destabilizes democracies and democratic institutions abroad. This is a significant issue because disinformation spread by foreign agents is likely to destabilize Canadian democracy and polarize Canadian society, which calls for an immediate and comprehensive response. Hence, complex questions regarding the regulation and liability of emerging technologies must be addressed. Professor Cotler also stressed the importance of unambiguous laws for effective compliance and human rights protection.
After Professor Cotler’s conclusions, Dr. Aviv Gaon introduced Ms. Laila Paszti from Norton Rose Fulbright. Ms. Paszti discussed the ethical issues created by AI and whether the law can alleviate this tension. The complexity of AI solutions creates public mistrust, especially because AI systems increasingly make critical decisions that affect people’s lives, such as screening them for employment, credit, or mortgages, or producing medical diagnoses. Moral questions regarding the integrity of and consent to input data, where and how these technological solutions should be deployed, and what type of oversight we should have over these uses must be adequately addressed. She also discussed the role of voluntary codes of conduct and best practices alongside formal regulation. For instance, privacy by design, which requires companies to consider privacy impacts early in the design stage, could be applied to the design of AI solutions.
After Ms. Paszti’s insights, Mr. Omri Timianker, the President and Co-founder of Cobwebs Technologies, spoke about the human rights implications of AI-powered web intelligence for national security and law enforcement purposes. Mr. Timianker emphasized the need for guidance on law and ethics in a world where a technology company may have to deal with clients in multiple jurisdictions with different regulations. Such a set of rules, if appropriately established, would allow tech companies to make ethical decisions. An ethics seal might be the right way to incentivize companies to self-regulate.
Lastly, Ms. Cornelia Kutterer, the Senior Director of Rule of Law & Responsible Tech and European Government Affairs at Microsoft, shared her insights on how Microsoft approaches ethical and responsible AI. Placing people at the centre of these technologies’ development from an early stage is a crucial factor in establishing a responsible approach to AI. Attending the Bracing for Impact webinar truly prompted me to reflect on how emerging technologies affect rights and freedoms. The protection of human rights should be the international legal responsibility of all governmental and private parties.
Written by Elif Babaoglu. Elif is a contributing IPilogue editor and an avid privacy and tech-law enthusiast with a particular focus on artificial intelligence.