Monday, August 7, 2017

Beyond Coding and Computer Science in Schools: The Need for Computer Ethics

MIT's Moral Machine
For the past three years in Minnetonka, we have been teaching computer coding in our schools starting in kindergarten. Last year we expanded the curriculum beyond coding instruction alone to include computer science principles in our lessons. This year we are adding maker spaces and integrating them with our coding (more on this in a future post).

Recent rapid advances in technology, and the stories about them in the news, have me thinking about what should come next in our coding program and what will be needed in the future of computer science in schools. I believe that computer ethics will need to be added to and integrated into our teaching. Computer ethics is the branch of practical philosophy concerned with how computing professionals should make decisions regarding professional and social conduct. We need to start presenting our students with the complex issues and dilemmas they will face in their futures (if they aren't facing them already) to get them thinking about these problems and the bigger picture beyond lines of code. From advances in medical technology to robotics, today's students will be faced with all sorts of new problems that will require them to devise innovative solutions in ways we haven't had to consider before.

A great example to illustrate this instructional need is the self-driving car programming dilemma pictured above: in an unavoidable accident, who should the car be programmed to allow to die? It is a morbid yet necessary decision. The nuances of this question and all of its possible scenarios would make for great classroom discussion and debate. (Does the age or number of people involved change the program? What about the social status of the individuals involved? What if the pedestrian is jaywalking? etc.) MIT has a great "Moral Machine" scenario website for this. Having students do their own research will yield even more resources, such as this recent article, Here's How Tesla Solves A Self-Driving Crash Dilemma.
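
To make this concrete for students who are already writing code, a teacher might sketch something like the simplified exercise below (a hypothetical example of my own, in Python, nothing like real autonomous-vehicle software), where the ethical choice has to be written down as an explicit rule:

    # A hypothetical, heavily simplified classroom exercise (not real
    # autonomous-vehicle software). It only makes the ethical choice visible
    # as rules that students can read, debate, and change.

    def choose_outcome(passengers, pedestrians, pedestrian_is_jaywalking):
        """Return which group this imaginary car is programmed to protect."""
        # Rule 1: protect whichever group has more people?
        if pedestrians > passengers:
            return "protect pedestrians"
        if passengers > pedestrians:
            return "protect passengers"
        # Rule 2: when the numbers are equal, does jaywalking change the answer?
        if pedestrian_is_jaywalking:
            return "protect passengers"
        return "protect pedestrians"

    # Students can change the rules above and argue about the results:
    print(choose_outcome(passengers=1, pedestrians=3, pedestrian_is_jaywalking=False))
    print(choose_outcome(passengers=2, pedestrians=2, pedestrian_is_jaywalking=True))

The point of an exercise like this isn't the code itself; it's that students see that someone has to decide what those rules are, and then have to defend (or rewrite) them.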
"It is *because* some ethical choices are difficult, or difficult to understand as ethical choices, that they need to be taught to students." (Source)
We need to have these discussions with students and get them thinking about these complex issues. They need to become aware of these ethical dilemmas so that they can face (and solve) the even more complex issues they will come across in the future. One nice resource for this is the University of Notre Dame’s John J. Reilly Center for Science, Technology and Values. For the past four years, it has published an annual "list of emerging ethical dilemmas and policy issues in science and technology." Besides autonomous cars, the issues include "robotics, neuroscience, education and medical management." (Source) Again, having students research and find these issues will turn up countless options, such as The Ten Commandments of Computer Ethics and this discussion board on Ethical dilemmas faced by software engineers. As they leave our schools to head out to the next stages of their lives and careers, we need our students--our future leaders--to understand and consider the ethics involved in their actions, decisions, and inventions.

