Title: LAW AND TECHNOLOGY IN AMEFRICAN PERSPECTIVE: AUTONOMY, ALGORITHMIC BIAS AND RACIALITY
Author: BIANCA KREMER NOGUEIRA CORREA
Contributor(s): MARIA CELINA BODIN DE MORAES - Advisor; CAITLIN SAMPAIO MULHOLLAND - Co-advisor
Cataloging date: 13/MAY/2022
Language(s): PORTUGUESE - BRAZIL
Type: TEXT
Subtype: THESIS
Notes: All data contained in the documents are the sole responsibility of the authors. The data used in the descriptions of the documents are in conformity with the systems of the administration of PUC-Rio.
Reference(s):
[pt] https://www.maxwell.vrac.puc-rio.br/projetosEspeciais/ETDs/consultas/conteudo.php?strSecao=resultado&nrSeq=58993&idi=1
[en] https://www.maxwell.vrac.puc-rio.br/projetosEspeciais/ETDs/consultas/conteudo.php?strSecao=resultado&nrSeq=58993&idi=2
DOI: https://doi.org/10.17771/PUCRio.acad.58993
Abstract:
This thesis analyzes the effects of so-called new technologies on non-white bodies and experiences in the exercise of their autonomy, more precisely the algorithmic biases derived from Artificial Intelligence (AI) systems applied to digital products and services. In a global scenario of intense connectivity, combined with sophisticated AI techniques and the predatory use of personal data, dynamics of racial discrimination are being reproduced, reinforced, and concealed in platforms and search engines, in monitoring policies, and in access to products and services. There is a persistent belief in the neutrality of law and technology. In the Brazilian scenario, this belief remains allied with the myth of racial democracy, narcissistic pacts, and the denial of racism, so that the confrontation of racial inequalities through techno-regulation, algorithmic governance, or even in light of ethical-legal challenges remains hollowed out. To explore the phenomenon of algorithmic racial bias, the thesis proposes a reflection on the effects of coloniality at the intersection between law and technology, based on the political-cultural category of Amefricanity developed by Lélia Gonzalez. It starts from the premise that law and new technologies continue to be read and built under the sign of whiteness, behind a supposed neutrality and formal equality: a place of privilege tied to a raciality that remains unidentified.
Under the mantle of formal equality maintained by law, the supposed indifference of algorithms and automated systems to the racial identity of individuals reproduces a perverse use of ethnic-racial characteristics as a mechanism of exclusion. The normative construction of law and the ethical values that underpin the construction of technological governance are, in turn, produced from the experience comprised within the zone of being. From the Amefrican perspective rooted in the Brazilian experience, the thesis seeks to offer a narrative that re-establishes the role performed by law and the ethical-legal challenges regarding the processes of violence found in the zone of non-being within the digital environment.