Because people think differently from machines. An algorithm writes off people's feelings whenever there is a net gain. A bare correlation is all it needs, e.g. "this neighborhood produces net losses, so avoid it entirely to save time."
A human is more sensitised to the feelings such actions produce, particularly when they resonate with personal memories or lived connections.
If you were redlined for reasons that shouldn't apply to you, you immediately see something the algorithm can't, and that's frustrating. Once enough people are outraged, a collective stance forms, and if that stance resonates with the existing moral code, it gets absorbed into what we consider ethics.