I agree with you that most technology of any kind can be used for good and evil. Human intent is good at appropriation; we can find ways of using most things as a means to our ends.

However, I disagree with your argument that technology has no morality embedded in it. Ransomware, software that holds personal data hostage, is written with the explicit intent to extort. That is a human moral value embedded in code, and a software behavior that is difficult to appropriate for any purpose other than harm. Yes, human intent was necessary for that harm, but once released, the software acts alone.

Guns don’t act alone. If they were to — imagine a closed-loop drone with the ability to target civilians — I would view that drone as embodying and enacting evil. That a human originally designed it with evil intent doesn’t absolve the drone of evil.

Some software technology is more neutral. Programming languages, for example, are perhaps the most neutral: they are designed for creation, regardless of intent. Some of my social scientist friends argue, however, that even programming languages embody questionable human intent: the desire to strip context and meaning from information.

All that said, my thesis is not that code is evil. That was a thesis I was rebutting. My thesis is that code is power, and that in a democracy, power must be democratized.

Professor of programming + learning + design + justice at the University of Washington Information School. Trans; she/her. #BlackLivesMatter.