The International System Safety Society (ISSS) is still suffering from the impacts of the 2011 Budget Control Act, which imposed "sequestration" measures on many of the Society's government-contractor customers. The Act resulted in immediate and significant decreases in membership and in attendance at the annual Conference. Over several years, our member base fell by more than half, from approximately 1,200 paying members to about 500. Many system safety programs were "defunded," forcing people to move to new fields where a Society membership no longer seemed necessary. In addition, funding cuts and an apparent distaste for "conferences" within government agencies slashed the budgets that had supported engineers' Conference attendance.
I have been asked to provide a summary and status report on our involvement with the Arizona State University (ASU) initiative to introduce the topic of "design for safety" into engineering courses. I will attempt to do that here, but I wish to point out that this effort has the potential to change some fundamental aspects of how we understand the goals and operation of the International System Safety Society (ISSS). In my opinion, this is the right time to re-think the vision of the ISSS to reflect an expanded global role. The ASU initiative is just one piece of a multi-part effort to reposition the ISSS as the "go-to" organization in the field of system safety engineering and management. The effects of sequestration have made it clear that depending upon government projects is unreasonable and risky for the ISSS, and that doing so fails to meet the much broader needs of global industry, and of mankind. I believe it is time for the Society to step up and assert that we are the leading organization in the field of system safety, by whatever name various organizations give that field.
The discussion concerning possible future directions for the ISSS has continued since the 2015 Conference in San Diego, California. It centers on the question of whether, in this time of dwindling membership and financial resources, the ISSS should continue to operate as it has for the past three or four decades, or try something new and bold for the future. In my opinion, now is a great time to strive for something new and bold. Since I joined the Society in the 1980s, it has been my contention that this organization promotes the most important approach to ensuring the safety of systems and products, large and small. The founders and members of the Society have crafted a spectacularly successful approach for identifying and mitigating potential hazards and risks early in the design and development process, where appropriate and cost-effective solutions can still be integrated into the overall design. As I have mentioned on numerous occasions, I have observed a tendency for industries and organizations to begin implementing system safety principles in their standards, requirements and processes, only to eventually revert to their old ways of depending upon compliance with detailed, design-based standards, rather than trusting the analysis-based approach that has proven effective on literally trillions of dollars' worth of programs, from small and relatively simple systems to the most complex and innovative projects in the world.
Lately, I have been wondering about the apparent decrease in interest in the International System Safety Society (ISSS), judging by the sharp drop in membership and conference attendance over the past decade or so. Fairly obviously, the government's sequestration policy, which took effect in 2013 under the 2011 Budget Control Act, had a lot to do with the decrease. However, while the overall economy has slowly recovered and government budgets have largely been restored, our membership has remained flat or declined slightly from year to year.
Software Safety vs. Software Reliability

While looking back through Vol. 56, No. 1 (Summer 2020) of the Journal of System Safety, I finally took the time to read Nathaniel Ozarin's article, "Lessons Learned in a Complex Software Safety Program." The article is interesting and thought-provoking, comparing what actually occurs while implementing a system safety program with the idealized descriptions found in documents such as MIL-STD-882, the JSSSEH and AOP-52. While I found the article informative, I noted that the author consistently characterizes the "software safety problem" as a "reliability" problem, focused on finding and preventing "failures" and ensuring high "reliability."

Some Thoughts on the Probabilistic Criteria for Ensuring Safe Airplane-System Designs

We have been employed in the risk sciences for a combined total of 86 years, including 62 years in reliability engineering and safety engineering positions at The Boeing Company. For many of those years, Yellman was the designated "Risk-Analysis Focal" (person) for Boeing's 707, 727, 737 and 757 airplane models. For several decades, the United States government has published the same criteria, created by the U.S. Federal Aviation Administration (FAA), intended to ensure that the systems on large (transport-category) aircraft are designed to be safe [Refs. 1 and 2]. We believe, however, that these criteria have failed to prevent certain aircraft accidents, and that the reasons for this should be better understood. We hope that this discussion will contribute to such an understanding by examining the part potentially played in those accidents by the FAA's probabilistically defined criteria.
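To make the probabilistic framing concrete, here is a minimal sketch of how a per-flight-hour probability criterion scales to a fleet-level expectation. The 1×10⁻⁹-per-flight-hour figure reflects the commonly cited interpretation of "extremely improbable" for catastrophic failure conditions in FAA guidance; the fleet size and utilization numbers below are purely illustrative assumptions, not data from any actual program.

```python
def expected_events(prob_per_flight_hour: float, fleet_hours: float) -> float:
    """Expected number of occurrences of a failure condition across a
    fleet's total accumulated flight hours (rate x exposure)."""
    return prob_per_flight_hour * fleet_hours


# Assumed criterion: catastrophic failure conditions must be "extremely
# improbable," often read as on the order of 1e-9 per flight hour.
catastrophic_rate = 1e-9

# Illustrative fleet exposure (assumed numbers): 5,000 aircraft flying
# 3,000 hours per year over a 20-year service life.
fleet_hours = 5_000 * 3_000 * 20  # 3e8 total flight hours

# Even at the criterion rate, a large fleet accumulates a nonzero
# expected count of catastrophic-condition occurrences.
print(expected_events(catastrophic_rate, fleet_hours))
```

The point of the sketch is that a per-hour criterion is only as protective as the exposure it is multiplied against: a rate that sounds vanishingly small can still yield a meaningful expected count over a large, long-lived fleet, which is one lens for examining why probabilistically defined criteria alone may not prevent every accident.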