The idea for a social impact statement originated with Ben Shneiderman's 1990 address to the Computers and Society special interest group of the Association for Computing Machinery. He proposed a model based on the environmental impact statement that would enable software designers to assess the social impact of the systems they design early enough to incorporate changes as those systems are built.
Since then the SIS has caught on as a fine teaching tool at several institutions across the country. Faculty at the University of Maryland, George Washington University, and the University of Texas at Austin have all used variations on this approach to help students understand the impact of the software systems they design. In 1996, I presented a paper at the Computers and Society conference describing my use of the SIS in a class; that paper was published in the conference proceedings.
The social impact statement was first suggested by Shneiderman [9] at a SIGCAS conference, and Huff and Jawer [5] provided some rudimentary suggestions for beginning an SIS. The basic idea is modeled on the environmental impact statement: an organized and coherent look at the social and ethical implications of a computing system.
The idea for including an SIS in a class on social and ethical issues in computing came from a desire to have students see ethical and social issues as real problems that they might have to face in their careers. By identifying these problems in real systems--with some guidance from the instructor--they will have a chance to locate them within the complexity of the technical issues of the system. This exercise provides students with the skills of locating these issues and thinking carefully about them, and the luxury of having the time to do so.
The social impact statement I describe here is designed as a class project, to be done by teams of students looking at real computing systems on or near campus. I suspect these skills could be taken out of the classroom and profitably applied to systems design; indeed, the class project is built around them precisely because many authors believe they will be useful in system design. But this description limits itself to the use of the SIS as a teaching tool.
As the major project paper for the Technology and Society class, I asked students to do a social impact statement on a real system on campus. With a little telephone work, we found 12 systems on campus with faculty or staff who were willing to help students understand the systems they had in their charge. We called the faculty or staff in charge of these systems the students' "clients" and students were encouraged to deal with them in the professional manner that such a relationship suggested. Some examples of the systems include: CAI programs in the medical school, the card access system to the dorms, cafeteria, and library, the database maintained by the health office, the modem pool, the campus public information system, and an emergency room patient intake system.
The list of relevant issues for any client's system was generated from the topic space suggested by the ImpactCS report on teaching social and ethical issues in computing [6]. This list provides a good starting place for matching social and ethical issues with the concrete aspects of the system under study. The list is obviously not comprehensive, but it does serve a useful function in beginning an inquiry. In addition to using this list to identify issues, students were also acquainted with Collins and Miller's Paramedic method [2], which begins with a comprehensive listing of the stakeholders involved in any ethical decision. Armed with a list of stakeholders for a system and a list of possible social and ethical issues, students can usually generate a quite plausible and comprehensive set of concerns with which to begin their inquiry.
Some of the systems were quite complex, and suggested more social and ethical issues than could be dealt with in a single class project. The campus public information system, for instance, brought up issues of privacy, reliability, property rights, the use of power, honesty, and a host of other issues associated with networks. Students needed to focus quickly on one or two of these issues in order to make any headway in such a large topic space. One criterion used to help narrow the scope was a quick look at which issues had been given the most attention by the clients; students were then encouraged to concentrate on those issues their clients had not considered. Other methods also exist for paring the list of issues. For instance, one can investigate only those issues that seem the most dangerous, those in which the client is most interested, or those that would be easiest to investigate. Of course the criteria one might use in a class project (e.g. can it be finished by novices in a few months?) will differ at times from those one would use in industry.
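The enumerate-then-narrow process described above can be sketched in a few lines of code. This is only an illustration: the stakeholder and issue lists here are hypothetical stand-ins for what a real stakeholder analysis and the ImpactCS topic space would produce, and the "already considered" set stands in for what initial client interviews would reveal.

```python
from itertools import product

# Hypothetical starting lists for one system (say, the card access system).
# In practice these come from a stakeholder analysis and the ImpactCS topic space.
stakeholders = ["students", "residence staff", "system operators", "administration"]
issues = ["privacy", "reliability", "equal access", "use of power"]

# Enumerate every stakeholder/issue pairing as a candidate concern.
candidates = [(s, i) for s, i in product(stakeholders, issues)]

# Narrow the list: concentrate on pairings the client has NOT already
# considered (a hypothetical set, gathered in the initial interviews).
already_considered = {("students", "privacy"), ("administration", "reliability")}
to_investigate = [c for c in candidates if c not in already_considered]

print(len(candidates), len(to_investigate))  # 16 candidates, 14 remaining
```

Other narrowing criteria from the text (perceived danger, client interest, ease of investigation) would simply be different filter predicates over the same candidate list.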
There are several goals of an SIS, and they flow from the practical nature of the undertaking. An SIS is not an attempt at publishable social science research, or a theoretical deconstruction of the meanings inherent in a computer system (though it may borrow from both of these approaches). It is a tool to allow students (and hopefully systems designers) to locate and deal practically with the social and ethical implications of the technology with which they are working.
A primary goal is to provide surprises about how the system works and the consequences of its operation. By surprises, I mean new information that is not included in the standard story of how the system works. Safety engineers call these latent errors [7], because they lie unnoticed in a system until an accident or critical incident arises. But the idea of latent errors need not apply only to safety issues. It can apply as well to issues of equal access, privacy, property rights, quality of life, or any area where hidden or unnoticed aspects of the system can pose ethical challenges. Thus, the methods used to produce an SIS must be apt for uncovering these surprises--simply reading the specification sheets will not do. In addition, the practitioner doing an SIS must be aware of the political issues involved in pointing out surprises to designers, managers, and operators of a system. Even if it is your job to do so, you must be politic in pointing out oversights to the people you will need to trust to implement fixes.
A secondary, but still important, goal is to give the clients some practice in thinking about the ethical and social aspects of their own system. In a way, the process of thinking about the issues is as important as the product. Even an extended inquiry into a system cannot be sure of turning up all the issues (or even the most important ones). But sensitizing the designers, managers, and operators of a system to the social and ethical issues, and giving concrete examples of potential difficulties in the system, may make them more aware of the issues and more likely to detect other issues when they arise. One way of doing this sensitizing is to provide clients with a document that will be useful in future modifications of the system, and that will point them to the literature describing the particular problems they face.
The SIS class project was based on library and empirical research the students did, presented in class, and revised in their final documents. In their library research, students were asked to locate discussions of the likely ethical and social issues associated with their client's system. As an entry to the literature, they used whatever references their clients could suggest, selections from the set of readings for the course, and other references that the instructor or other faculty members could provide. They then expanded on this reference base using standard bibliographic tools. One purpose of the literature search was to help them compile a set of readings to recommend to their clients, but it also helped to prepare them for the kinds of issues they would be dealing with in the empirical stage of their project.
Students were asked to use three different empirical techniques to locate and analyze the ethical and social issues: interviews with principal informants, field observation of the system in use, and the construction of day-in-the-life scenarios. The purpose of using different techniques was both to triangulate on issues and to attempt to uncover hidden inconsistencies or oversights that might go unquestioned if only one technique were used.
Interviews. We lectured on and gave students practice in interview techniques derived from expert systems interviewing [3]. This allowed students to do an interview that gave them a view of the system as their principal informants (the experts) saw it. Students were given practice in taking an expert's verbal description of a system and unpacking it to determine the criteria the expert was using to describe the system. They were also given practice in constructing an interview protocol that led logically from basic issues to critical functions of the system. In addition, we talked about the ethical issues inherent in doing interviews (e.g. confidentiality, respect, informed consent).
Groups were asked to do at least 3 and no more than 7 interviews. Some groups did follow-up interviews. They were asked to interview the designers and managers of the system, but also to include, where possible, lower-level operators and clients of the system (and other important stakeholders in the system). These multiple views of the system can help piece together a more complex picture of the social and ethical concerns than a view from only one perspective.
Field Observation. We lectured on and gave students practice in designing coding systems and taking field notes for careful, real-time observation of the system in use [see 8 & 11 for suggestions]. The purpose of these observations was to allow students to get a better feel for the chaos and complexity of actual system use--as opposed to the description elicited in the interviews. Thus, these field observations were of necessity exploratory and not highly structured (e.g. no group used prepared coding sheets to count specific behaviors). Groups were asked to do 3 to 5 hours of real-time observation. For some groups (e.g. the modem pool, the card access system) on-line activity reports were already collected, and these were used as a part of the field observation. We asked students to be respectful of people's privacy, but also to attempt to get as wide and varied a set of observations as time would allow.
Day-In-The-Life Scenarios. This technique, taken from human-computer interaction methods, involves taking the data from observations and interviews and using it to construct a "story-line" for a unit of the system over a unit of time. This could consist of tracking the actions of a person over a day (or over a reboot cycle), or tracking a piece of personal data from its collection to its purge from the system. If the units tracked, and the time over which they are tracked, are chosen carefully, they can point out critical information that may be overlooked in interviews or observations. The day-in-the-life scenario is a less structured version of what is called "task analysis" in HCI circles. Other task analysis methods (e.g. construction of task frequency tables) would be useful in some projects, but the primary purpose here is to acquaint students with a flexible method for illuminating gaps in their knowledge. Careful construction of these scenarios allows students to see gaps in the story provided by their current data (e.g. what happens during shift changes?).
The combination of these three methods allowed for cross-checking among the various stories offered by each method, and helped to promote a more comprehensive view of the use and operation of the system, and of the procedures associated with system use and operation. If one were to do an SIS as part of a system design, the analysis might be more or less comprehensive than this, depending on the time and resources available and on the importance of the social and ethical issues involved. Part of class discussion centered on how to make these difficult choices while still producing a product on time and within budget. These discussions are more lively after students have some experience with their analysis, and thus some idea of the importance of the issues they uncovered and the difficulty involved in the analysis.
The final report consisted of 6 sections:
Students were encouraged to focus on the client as the report's audience, and to produce a report they felt would be helpful to the client.
Executive Summary. This should be a one- or two-page summary of the report. It should include a description of the report and of the system, a discussion of the significant issues discovered, and a list of the top recommendations highlighted on the page. Each of these should be keyed to page numbers in the longer report. The idea is to provide a summary that an executive can read in 5 to 10 minutes to get the basic information about the report. Summarizing information in this way is in itself a useful skill for computer science students to learn.
Description of the System. This description should include the physical, logical, procedural, and social elements of the system. The physical structure includes the machines and other hardware involved, the networks, and the physical facilities in which the system is housed (e.g. the offices). The logical structure includes the data structures and software structures involved in the system. The procedural elements of the system include the ways in which data is gathered, collated, stored, backed up, and reported. They also include procedures for maintenance, repair, and replacement of the system, and any other relevant organizational procedures (e.g. those related to privacy protection or safety). The social elements of the system include a description of personnel and their relationships to each other and to other relevant stakeholders.
Analysis of the Results. This section includes a discussion of those concrete aspects of the system that lead to specific concerns. These may include any single aspect of the system or interactions between aspects of the system (e.g. procedures that assume technical maintenance even though personnel are not trained). Patterns of use, patterns of oversight or error checking, specific hardware or software concerns, or specific organizational procedures are all candidates for inclusion in this section. The analysis of these specific concerns should highlight the specific risks associated with them, the probability of those risks occurring, and the likely harm or ethical concerns associated with those risks. Finally, the concrete advantages of resolving the specific concerns should be described.
Recommendations. This section should contain a set of recommendations that address each specific concern mentioned in the previous section. There should be at least two action options for each specific concern, and those options should be evaluated in terms of the client's goals and the ethical or social concerns involved. In most cases the client's goals will be multiple: for example, maintaining accurate records, guarding privacy, and minimizing cost. The effect of each option on this suite of goals should be noted. The options recommended should be carefully constructed to avoid simple black-and-white choices (e.g. safeguard privacy vs. disregard privacy) and to emphasize the best available options for dealing with the issue. Technical fixes (e.g. use a different backup method) should be included as options, but should not be the only options listed. For instance, procedural changes or personnel training could also be recommended.
Reader's Guide. This should be a prose introduction to the most balanced and readable discussions of the issues that confront the client. It should include at least one item (e.g. a reader or an advanced article) that will serve as a window to further literature. This section can be organized as an annotated bibliography or as a prose review with references.
Methodological Appendix. This section should contain a rationale for the particular methods chosen, and a detailed and concrete description of those methods. The individuals interviewed should be noted (though privacy issues are important here), as should the specific questions asked in the interviews and any changes made to the interview protocol as needs arose. The description of the field observation should include a description of, and a rationale for, the observation sites and times. It should also include a description of the significant events looked for, a description of the significant events discovered, and a list of any changes made to the observation protocol as needs arose. The description of the day-in-the-life scenarios should include a rationale for the choice of those particular perspectives and time frames, a description of the information from which they were compiled (e.g. interviews, manuals, etc.), and finally, the detailed scenarios themselves.
Some students have difficulty getting started on these large projects, so I recommend that you include interim deadline dates. Useful deadlines might be initial contacts, initial literature searches, protocols for interviews and observations, and a first draft of the report. The interview and observation protocol deadlines are useful because the instructor can use them to make sure the data gathering is well organized and ethically thoughtful.
It takes time for students to get used to the difficulty of these issues, and they will need the entire term available to them to gather the information needed, and to backtrack to collect data needed to clarify questions that arise late in the process. I suggest that students try to have their data collection done no later than three-quarters of the way through the term. During the write-up phase, they will almost certainly want to go back for some follow-up interviews or observations, and it is good to have some time in which to do these.
Since these are often large projects, students can work in groups to complete them. However, it is reasonable to ask that every student be involved in every phase of the project (e.g. literature search, data collection, analysis, and write-up). One way to deal with the perennial issue of unequal group participation is to ask students to rate each other (and themselves) on scales for effort, reliability, and quality of contribution. These ratings can then be averaged into each student's grade, so students who loaf can be identified and have their grades adjusted accordingly. It is also useful for the instructor to reserve a "vote" in the rating, in case the entire group colludes in giving each other high (or low) ratings.
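The peer-rating arithmetic above can be sketched concretely. This is a minimal illustration under stated assumptions: the 1-5 scale, the equal weighting of the three scales, and the treatment of the instructor's reserved "vote" as simply one more rater are all choices of this sketch, not prescriptions from the text.

```python
def participation_score(peer_ratings, instructor_rating):
    """Average 1-5 ratings on (effort, reliability, quality of contribution),
    treating the instructor's reserved "vote" as one more rater in the pool."""
    all_ratings = list(peer_ratings) + [instructor_rating]
    per_rater = [sum(r) / len(r) for r in all_ratings]  # mean across the 3 scales
    return sum(per_rater) / len(per_rater)              # mean across raters

# Four group members (including the student) all give top marks --
# possible collusion -- while the instructor's vote damps the average:
peers = [(5, 5, 5), (5, 5, 5), (5, 5, 5), (5, 5, 5)]
instructor = (3, 3, 3)

score = participation_score(peers, instructor)
print(round(score, 2))  # 4.6 rather than a colluded 5.0
```

A course might then scale this score into a small percentage of the final grade; how heavily to weight it, and whether the instructor's vote should count for more than one rater, are policy choices left to the instructor.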
There is both a pedagogical and an ethical issue involved in determining whether to share with clients the reports students have generated. This is most intense with those sub-par reports that might convince clients that they have no real difficulties to face in their systems. On the one hand, a good report can be quite helpful to a client, and it is motivating for students to know that their reports will actually be used by their clients. On the other hand, a poor report might do more damage than good. One way to resolve this issue is for the instructor to meet with the client after the client has read the report, and to point out both the strengths and weaknesses of the report. If time permits, the client's reaction to the report could even be used as part of the students' grade.
When I used this project in my Technology and Society class, students all appreciated the way the SIS connected them with real issues in real systems. Most said it was the most important part of the class, the part they learned the most from, and the part they would most remember.
1) Booth, P. An Introduction to Human-Computer Interaction. Hillsdale, NJ: Lawrence Erlbaum, 1989.
2) Collins, R. & Miller, K. Paramedic ethics for computer professionals. Journal of Systems and Software, January, 1-20, 1992.
3) Hart, A. Knowledge Acquisition for Expert Systems, 2nd Ed. New York: McGraw-Hill. 1992.
4) Huff, C. W. and T. Finholt (Eds.) Social Issues in Computing. New York: McGraw-Hill. 1994.
5) Huff, C.W., and Jawer, B. Toward a Design Ethic for Computing Professionals. In C.W. Huff and T. Finholt (Eds.) Social Issues in Computing (pp. 691-713). New York: McGraw-Hill. 1994.
6) Huff, C.W., & Martin, C.D. Computing Consequences: A Framework for Teaching Ethical Computing. Communications of the Association for Computing Machinery, 38(12), 75-84. 1995.
7) Reason, J. Human Error. Cambridge: Cambridge University Press. 1990.
8) Rosenthal, R. & Rosnow, R. Essentials of Behavioral Research: Methods and Data Analysis. New York: McGraw-Hill, 1991.
9) Shneiderman, B. Human Values and the Future of Technology: A Declaration of Empowerment. Computers & Society, 20(3):1-6. October, 1990.
10) Shneiderman, B. Designing the User Interface. Reading, MA: Addison-Wesley. 1992.
11) Spradley, J. P. Participant Observation. New York: Holt, 1980.
12) Westrum, R. Technologies & Society. Belmont, CA: Wadsworth Publishing. 1991.
The class in which I used this approach was taught at The George Washington University in the Spring of 1995 while I was on sabbatical there. I wish to thank Dianne Martin for her support and cooperation in pulling off the projects, particularly her willingness to contact personnel at GWU and to convince them to participate in the projects. Part of this work was supported by NSF grant DUE-9354626 to Dianne Martin. In addition, I am grateful to The George Washington University and to St. Olaf College for their generous support during my sabbatical.