AI and Human Rights: Navigating Technological Responsibility

Authors

  • Bakhtawar Ashraf, Lawyer; Managing Partner, Khan & Shah Attorneys; Visiting Faculty, University of Azad Jammu & Kashmir, Pakistan

Keywords

Artificial Intelligence (AI), Human Rights, Ethical Concerns, Legal Personhood, Free Speech, Intellectual Property, Prejudice, Policy Measures, Human Rights Violations

Abstract

The initial concerns raised by social scientists regarding Artificial Intelligence (AI) technologies have now expanded to include their potential effects on human rights. AI has created tensions affecting human rights, making it crucial to acknowledge these issues and seek solutions. Unfortunately, there are no global statutory rules or conventions regulating AI technologies, and this legal uncertainty leaves individuals whose rights are violated by AI without recourse. AI has had more negative than positive effects on liberties, notably impacting information privacy, equality, freedom of speech and expression, the right to assemble, and employment freedom. AI also affects intellectual property rights and can perpetuate bias and discrimination. A pertinent question is whether AI systems should be granted legal personhood, allowing them to be held accountable for their actions; the answer remains unclear. This paper follows an analytical research methodology to emphasise that, as AI's impact on individual rights grows, governments must develop a regulatory framework. It focuses on human rights and the threats posed by AI technologies, particularly examining the legal personhood and responsibility of AI tools. The paper concludes that addressing the relationship between AI and human rights is complex, requiring intentional collaboration among governments, AI system users, and developers.

Published

2024-08-15

How to Cite

Ashraf, B. (2024). AI and human rights: Navigating technological responsibility. UCP Journal of Law & Legal Education, 2(2), 54–72. Retrieved from http://58.27.199.232/index.php/ucpjlle/article/view/299