9. The Critical Engineer notes that written code expands into social and psychological realms, regulating behaviour between people and the machines they interact with. By understanding this, the Critical Engineer seeks to reconstruct user-constraints and social action through means of digital excavation.
Every piece of technology we design has ideologies (usually our own) embedded within it. Too often, people are so caught up with the *point* of the system they're trying to build that they never consider the unintended social implications of their design choices. In the context of educational technologies, we see this constantly. Sometimes the design choices support ideologies we may want to pass on to children (e.g., giving them a box to "justify their explanations" as a way of signaling that we value justifying our ideas), but many others we do not. For example, if we let children choose whether their avatar represents them as a boy or a girl, and we notice that girls choose boy avatars only when playing math games, do we have a social responsibility to intervene in their internalized stereotyping? Are we enabling dangerous stereotypes that can be harmful to students' self-image while they're learning? More concretely, for the most part, our educational technologies tell students that we encourage competition, reward only the correct answer (regardless of process), don't formally support or care about collaboration, and believe that learning happens in a vacuum.
While I completely agree that the Critical Engineer should seek to reconstruct social action, there's also a more immediately available and equally important first step - understanding how the seemingly innocent design choices we make *do* shape powerful aspects of users' sense of identity, especially for students who don't fit into the Straight White Upper Middle Class Western Male box.