Abstract
This paper explores the ethical implications of predictive justice, shifting the focus from technical functionality and regulation to the normative stance judges should adopt when interacting with algorithmic tools. Predictive systems built on data analytics and statistical modeling promise efficiency and consistency in judicial decision-making. However, this approach risks marginalizing the moral weight of judgment by abstracting it from the unique and irreducible dimension of each individual case. The concern is that algorithmic justice, by bypassing the relational and experiential core of legal responsibility, reduces judgment to a technical operation devoid of ethical depth. Emphasizing the need for a renewed ethical awareness within judicial practice, the paper argues for a conception of justice that resists the standardization of legal responses and preserves the singularity of human experience. Without this ethical dimension, the judiciary may become estranged from the moral obligations that confer legitimacy on its authority.
