Draft:Fundamental Rights Impact Assessment
Submission declined on 11 September 2025 by WeirdNAnnoyed (talk). Article is duplicative of Human Rights Impact Assessment and could just be a short additional section of that article. Sources do not sufficiently establish that this is a uniquely distinct topic, and most of them focus narrowly on AI (if this is really a more general topic then more general applications should also be discussed). And speaking of which, I suspect the article was made by (or at least partly by) an LLM.
This draft has been resubmitted and is currently awaiting re-review.
Fundamental Rights Impact Assessment (FRIA) is a process used to identify the potential impact of a specific technology, product, solution or technological environment on fundamental rights.[1] It focuses on fundamental rights, which differ from human rights[2][3][4] in that they are a category of rights recognised at the national level by legal systems, typically in constitutions. By contrast, human rights are recognised at the international level and do not impose direct obligations on citizens and companies. This distinction is crucial to correctly framing the difference between the FRIA and the Human Rights Impact Assessment (HRIA),[5][6][7] and to avoiding treating the former as a subcategory of the latter.
The main characteristics of a FRIA are: (i) a prior assessment approach; (ii) a focus on a specific technology application and the individual rights it may affect; (iii) a risk-based approach centred on the context of use; and (iv) an iterative, circular structure that follows the product/service throughout its lifecycle.
A FRIA has some similarities with a Human Rights Impact Assessment (HRIA),[5][6][7] but differs in its objectives and, consequently, its methodology. In particular, the HRIA has mainly been used in response to critical situations in which harm to human rights has already occurred, and as part of corporate due diligence.[8] In contrast, the FRIA, as implemented in the EU Artificial Intelligence Act,[9] is a prior assessment to be carried out before an innovative solution is deployed in the real world, adopting a by-design perspective.[10][11][12] Furthermore, the HRIA is primarily a policy tool: it provides companies with an assessment of potential impacts and a list of possible measures to prevent or mitigate them, leaving companies free to decide which measures to adopt and to what extent to reduce those impacts. The FRIA, in contrast, is used by legislators as a mandatory assessment whose results must be used to prevent or mitigate technology-related risks.
Current studies and practices in the field of FRIA are still in their infancy, with the main developments taking place in the field of digital technology. This is in line with, and a consequence of, the EU legislator's focus on the impact on fundamental rights in recently adopted regulations, such as the Digital Services Act (DSA) and the AI Act.[13] In the context of AI regulation,[14][15] the importance of the FRIA in counteracting the potential negative impact of AI on society has been recognised.[16] Some methodological approaches adopt a broader scope, referring to algorithms in general[17] and considering a variety of impacts not limited to fundamental rights.[18]
References
- ^ European Union Agency for Fundamental Rights (2020). Getting the future right – Artificial intelligence and fundamental rights.
- ^ "Human Rights and Fundamental Rights (ChFR and ECHR) - Max-EuP 2012". max-eup2012.mpipriv.de. Retrieved 2025-09-12.
- ^ Palombella, Gianluigi (2007). "From Human Rights to Fundamental Rights: Consequences of a conceptual distinction". Archiv für Rechts- und Sozialphilosophie. 93 (3): 396–426. doi:10.25162/arsp-2007-0027. ISSN 0001-2343. JSTOR 23680856.
- ^ "Human rights and fundamental rights". A treatise on environmental law. Vol. 3. Lisbon Public Law Editions. 8 March 2024. pp. 22–76. ISBN 978-989-9179-03-5.
- ^ a b Danish Institute for Human Rights. Human rights and impact assessment.
- ^ a b CPDP Conference. 2022. Assessing the impact on fundamental rights in AI applications.
- ^ a b https://documents1.worldbank.org/curated/en/834611524474505865/pdf/125557-WP-PUBLIC-HRIA-Web.pdf [bare URL PDF]
- ^ United Nations Office of the High Commissioner for Human Rights. Guiding Principles on Business and Human Rights. https://www.ohchr.org/sites/default/files/documents/publications/guidingprinciplesbusinesshr_en.pdf
- ^ Mantelero, Alessandro (2024). "The Fundamental Rights Impact Assessment (FRIA) in the AI Act: Roots, legal obligations and key elements for a model template". Computer Law & Security Review. 54: 106020. doi:10.1016/j.clsr.2024.106020.
- ^ Rubinstein, Ira (2012). "Regulating Privacy by Design". Berkeley Technology Law Journal. 26: 1409. SSRN 1837862.
- ^ Prifti, Kostina; Morley, Jessica; Novelli, Claudio; Floridi, Luciano (2024). "Regulation by Design: Features, Practices, Limitations, and Governance Implications". Minds and Machines. 34 (2): 13. doi:10.1007/s11023-024-09675-z.
- ^ Almada, Marco (2023). "Regulation by Design and the Governance of Technological Futures". European Journal of Risk Regulation. 14 (4): 697–709. doi:10.1017/err.2023.37.
- ^ Malgieri, Gianclaudio; Santos, Cristiana (2025-04-01). "Assessing the (severity of) impacts on fundamental rights". Computer Law & Security Review. 56: 106113. doi:10.1016/j.clsr.2025.106113. ISSN 2212-473X.
- ^ Senado Federal-Coordenação de Comissões Especiais, Temporárias e Parlamentares de Inquérito. 2022. Relatório final comissão de juristas responsável por subsidiar elaboração de substitutivo sobre Inteligência Artificial no Brasil. Brasília.
- ^ https://www.dpdenxarxa.cat/pluginfile.php/2468/mod_folder/content/0/FRIA_en_def.pdf [bare URL PDF]
- ^ Ortalda, Alessandro (2023-09-12). "More than 150 university professors from all over Europe and beyond are calling on the European institutions to include a fundamental rights impact assessment in the future regulation on artificial intelligence". Brussels Privacy Hub. Retrieved 2025-09-12.
- ^ Selbst, Andrew D. "An Institutional View of Algorithmic Impact Assessments". Harvard Journal of Law & Technology. 35. https://jolt.law.harvard.edu/assets/articlePDFs/v35/Selbst-An-Institutional-View-of-Algorithmic-Impact-Assessments.pdf
- ^ "Fundamental Rights and Algorithms Impact Assessment (FRAIA) - Report - Government.nl". 31 July 2021.