Psychological Safety in Technology and Software Development
Ever since Google’s Project Aristotle in 2013, we’ve recognised that psychological safety is the foundation of high-performing teams. This is an ever-evolving collection of articles, papers, and resources on psychological safety in relation to technology, software development teams, engineering, and the technology industry.
Psychological Safety and the Only Pyramid Scheme That Works – a great article by Evan Smith.
This erupted this week: changes announced at Basecamp and the way Jason and David lead and run the company have definitely polarised opinion. A lot of that debate has been focussed on how psychologically safe people, including people from diverse or disadvantaged backgrounds, may feel about the changes. There’s a great discussion going on at the psych safety community about this, head over and let us know what you think.
And in almost a mirror image of Basecamp, Jen Rubio of Away is taking on the challenge of transforming a “toxic” culture by encouraging feedback, communication, and a focus on values.
In episode 36 of o11ycast, Charity and Liz speak with Jacob Scott of Stripe about the need for SRE teams and customer happiness, and highlight “Safety I” and “Safety II” – the approach of trying to stop things going wrong vs the approach of trying to make more things go right:
If you’re in tech, you might also want to sign up to the Blameless SRE newsletter:
This, from Timid Robot: Psychological Safety and the Unix Room. I particularly like the thought of positive feedback loops of excitement:
Resilience Engineering is in some ways the macro-application of concepts including psychological safety at the scale of large and complex sociotechnical systems. Find out what I mean at this talk at Continuous Lifecycle London in May:
The Easy Agile podcast is great. Catch Ep.7 with Sarah Hajipour, Agile Coach, talking about Agile spreading across organisations, and how psychological safety is fundamental to an Agile Transformation.
Here’s a great slide deck on Blameless Retrospectives, by Red Hat’s Jen Krieger:
And here is a super article on Opensource.com about the psychology behind blameless retrospectives:
Daniel Truemper, an engineering manager in Berlin, wrote this great piece summarising psychological safety in engineering teams.
Resilience Engineering is a multi-disciplinary field of applied research that spans systems thinking, safety, ergonomics, behavioural psychology and more. In this talk at Continuous Lifecycle London in May, I’ll be talking about resilience engineering and how psychological safety is a fundamental component of organisational resilience:
Here’s the YouTube recording of a panel discussion I attended last week with Godel Technologies and Footasylum to talk about high performance technology teams and psychological safety. Some awesome Q&A towards the end:
If you’re in technology and/or a leadership role, you might be interested in this talk and panel session I’m doing on the 25th March: Psychological Safety & the pillars of a tech team.
The importance of Psychological Safety to the Scrum Master role.
The Supermanagers podcast – in this episode: Simon Stanlake, SVP of Engineering at Procurify, describing the power of the retrospective in building psychological safety.
Related to retrospectives, here’s PagerDuty’s awesome guide to Blameless post-mortems, which includes a thorough background to safety culture, cognitive biases, and “human error”.
And further still, because retrospectives and post-mortems are so crucial in building psychological safety, here’s a great interview by Mandi Walls with John Allspaw about the practice of dealing with technical incidents (spoiler: psychological safety is crucial).
On Thursday 25th March, I’ll be speaking about psychological safety in technology teams at this live panel facilitated by Godel, and joined by Andy Norton and Tristan McCarthy of Footasylum:
A fantastic piece from Brian Proffitt at Red Hat’s Open Source Program Office: Culture and engagement in community onboarding:
This is an excellent article by Jonathan Smart (of “Sooner, Safer, Happier”) about the Boeing 737 Max issues and how they stemmed from a lack of psychological safety at all levels of the organisation: https://itrevolution.com/lack-of-psychological-safety-at-boeing/
This is a great piece by Marty de Jonge about psychological safety and dynamics in Agile teams, nicely pulling in Google’s Project Aristotle team performance factors and a whole load of Lean principles.
In this episode of the Virtual Coffee Podcast, Tom Cudd talks about his experiences with DevOps and how it ties in with psychological safety, and about the dangers of “hero culture” in the workplace.
Great article about psychological safety in technology (and particularly testing) teams: “If you don’t have that safety, you’re going to stick to what you know. So you’re not going to try that new automation language, for instance, or, you’re not going to dive into that code.” Bring psychological safety into your test team.
Better cybersecurity starts with openness and honesty. A super video by cybersecurity expert Nadya Bartol.
This is a fascinating debate over at Hacker News about psychological safety in tech teams. Highly recommend a read: https://news.ycombinator.com/item?id=26860743
By Lisa Bradburn: The Necessity Of Building Psychological Safety In An Agile Environment.
Excellent work by Ibrahim Cesar about psychological safety in technology teams, in the context of the pandemic. “It’s okay, not to be okay. Sometimes just affirming ourselves and reaffirming it, gives us a breath to keep working, making us a little safer.” [Translated from Portuguese]
AI, ethics, and diversity: “Before you even think about the products that you’re building or the research that you’re doing, you need to start imagining: how can you work with people at the margins to shape this technology?” A powerful piece about Timnit Gebru, who is known for foundational work in revealing AI discrimination, developing methods for documenting and auditing AI models, and advocating for greater diversity in research.
Here’s a nice article from Techrepublic about psychological safety in cybersecurity and information security: Cybersecurity: Don’t blame employees—make them feel like part of the solution
And here’s the piece referenced in the above article: Psychological Safety and Information Security.
“Those found responsible have been sacked”: some observations on the usefulness of error. This is a really great paper and I love this point: “Error is useful not in spite of its misapplication, but because of it. We need to take error seriously not because it is an accurate assessment but because it is inaccurate; inaccurate in particular sorts of ways that serve individual and organizational needs.”
Strong link vs weak link: I find this concept fascinating. Originally from a book called “The Numbers Game”, about football (soccer): what matters more, how good your best player is, or how good your worst player is? Soccer is a weak-link game, whilst basketball is a strong-link game. But what about other teams? Software engineering teams, for example.
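The distinction can be made concrete with a toy Monte Carlo sketch (all numbers here are hypothetical, purely for illustration): model a weak-link game as one where the team’s outcome is set by its worst individual performance, and a strong-link game as one set by its best. The simulation shows that coaching the weakest member pays off in a weak-link game, while upgrading the star pays off in a strong-link game.

```python
import random

random.seed(42)

def simulate(team, mode, trials=5_000):
    """Toy model: each trial, every member performs at a level drawn
    around their skill; the team outcome is the weakest individual
    performance (weak-link) or the strongest (strong-link)."""
    total = 0.0
    for _ in range(trials):
        performances = [random.gauss(skill, 0.1) for skill in team]
        total += min(performances) if mode == "weak" else max(performances)
    return total / trials

# Hypothetical team: one star (0.9), average colleagues (0.6), one struggler (0.3).
team = [0.9, 0.6, 0.6, 0.6, 0.3]
coached_worst = [0.9, 0.6, 0.6, 0.6, 0.5]   # invest in the weakest member
upgraded_best = [1.1, 0.6, 0.6, 0.6, 0.3]   # invest in the star

for mode in ("weak", "strong"):
    base = simulate(team, mode)
    worst = simulate(coached_worst, mode)
    best = simulate(upgraded_best, mode)
    print(f"{mode}-link: base={base:.2f}, "
          f"coach worst={worst:.2f}, upgrade best={best:.2f}")
```

Under these assumptions, raising the floor barely moves a strong-link outcome and raising the ceiling barely moves a weak-link one, which is the crux of the book’s question applied to software teams.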
Here’s a great chapter in the Spring 2021 issue of the DevOps Enterprise Journal, about psychological safety at Boeing and the 737 MAX. Apologies – you have to provide personal details to read it, but it is free at least.
Recently, Jabe Bloom and I recorded a “Transformation Live” conversation about psychological safety and organisational resilience – please let us know what you think!
This was amazing – an outpouring across the internet in a show of solidarity for the “intern” who accidentally sent a test email to all HBO Max customers. Whilst blaming the intern is not exactly a classy move, what was brilliant was to see how it inspired thousands of people to show up and admit their big SNAFUs, almost like the internet got a little bit more psychologically safe, for a day or so!
I love this piece by Nora Jones about the power of storytelling in incident reviews – why it’s so critical to consider people and their stories fundamental to this practice. Don’t just look for the technical “root cause” (there isn’t one), but celebrate the insight generation, dissemination and training by people in our organisation.
This is a nice writeup from the Container Solutions folks about psychological safety as a “Cloud Native” pattern, emphasising the point that psychological safety is imperative for both coming up with new ideas and learning from experiments, both of which are essential for cloud native transformations in technology organisations.
This is a fab Twitter thread by Hal Pomeranz about the importance of psychological safety in information security teams dealing with incidents. “We are the good people. The ones who are trying to figure out what happened and make things better. This is a team effort that is going to require everybody’s help. Nobody is to blame, we are all just trying to fix this mess we find ourselves in.”
This is excellent! Heuristics for Effective Software Development: A continuously evolving list. And the most important factor, at number 1? Without psychological safety, respect, and trust, none of the following is possible. I love every point though, and I feel like this should be read by every manager, everywhere, not just those involved in software development; for example, see point 20: Give people the environment and support they need, then get out of the way.
I love this: How Complex Systems Fail: a Short Treatise on the Nature of Failure in Complex Systems. There are so many points in here that resonate deeply with me, such as “All practitioner actions are gambles.” None of us truly know what the result of our actions will be, and sometimes things don’t go to plan. Judging these “failed” gambles post-hoc as “poor” judgements is not valuable, and the converse is also true – success is rarely assigned to a gamble, but to some “expert” judgement.
It seems that Gerald Weinberg was aware of psychological safety back in 1971 when he wrote “The Psychology of Computer Programming”. Check out this fab thread by Vicki Boykis on Twitter.
This was an amazing experience! I was super pleased to be invited onto PagerDuty’s “Page it to the limit” podcast with the excellent Mandi Walls, published this week. We talked all about psychological safety including the origins and history of the concept, Timothy Clark’s 4 Stages model, Westrum’s cultural typologies, organisational power structures: ‘old power’ (hierarchy & control) vs ‘new power’ (egalitarianism & diffusion) and more. Give it a listen and let me know what you think!
This is a great article from Keisha Morant at POCIT (People Of Colour in Tech) with software engineers Ademusoyo Awosika-Olumo and Taylor Poindexter about their career journey and the transition from junior to senior developer. This point from Taylor is spot on: “Focus on fostering true psychological safety in your workplace and recruit outside of your usual recruiting channels to find amazing Black talent. Usually, when I hear of Black people leaving tech, it’s because, in some way, shape, or form, they did not feel protected or valued.“
Virpi Oinonen is a business cartoonist, storyteller & communications pro, and here’s a good set of illustrations about Why Organisations Must Change. I particularly like this quote from Nassim Nicholas Taleb, author of Antifragile and agitator of Resilience Engineering scholars: “Avoiding small mistakes makes the big ones more severe.” I have a feeling that Erik Hollnagel, Sidney Dekker or Steven Spear may have made this point before, but it’s true nonetheless.
Speaking of Steven Spear, in the most recent episode of the Idealcast, Gene Kim and Dr Spear explore COVID-19 and Just-In-Time Supply Chains, Chaos Engineering, and the Soviet Centrally Planned Economy. This is in-depth, and worth a listen (maybe twice), particularly if you’re into complexity theory and similar, and it also repeatedly highlights the importance of psychologically safe environments in order to raise concerns and generate innovative ideas.
Sebastian Straube wrote this excellent comprehensive guide discussing the influence of psychological safety on product management. I particularly like that he draws conflict styles into the conversation.
This looks pretty awesome if you’re into design or UX: Magnify, the inclusive design and research conference. Working to magnify and prioritise those left out or who face the most barriers in UX. Subjects include:
- How might we ensure psychological safety is created for participants in research sessions?
- How might we shift the power dynamics in design to empower citizens who are marginalised by society?
- How might we advocate for and receive stakeholder investment for inclusive projects?
Chris Weston at IDC shared this interesting writeup of the IDC Digital Leadership Community Roundup – Creating and Leading High Performing Teams. Some interesting takeaways, but this caught my eye in particular: “The concern in this post-COVID world, however, is that every single creative and development action must be laser focused on a business objective and revenue. As a result, there is a tendency to focus on safe creative endeavours. While this is okay from a budgeting and short-term perspective, it can be stifling. Rather, a creative culture should be encouraged, consisting of smaller projects that result in quick successes or rapid and understandable failures.”
This is a really interesting and thought-provoking piece by Howard Seidel about transparency in organisations (on this occasion, Netflix) and psychological safety. Does “radical” transparency enhance or damage psychological safety, or do we need a degree of privacy in order to “vent” sometimes?
From Netflix to Twitter: in this piece by Kate Conger at The New York Times she describes what happened when Dantley Davis joined Twitter as VP of Design in 2019. According to the article, Davis stated that “Twitter was too nice”, and that the overly “kind” atmosphere in the organisation stifled honest feedback. Opinion appears to be split as to whether this approach ultimately will benefit the culture and performance of the organisation or damage it along with broader psychological safety of its people.
This is an excellent article by John Looney (@john_p_looney) about Psychological Safety in (Tech) Operation Teams. He uses the fictional case of a new engineer on a team to show how even small interactions can build or destroy psychological safety, and the resultant effects of that. Really worth a read.
Really interesting research here by Kristen Toohill at William James College, Northeastern University, “Gamers Don’t Die, They Respawn” – about psychological safety and gamers: the gamers’ “magic circle” is a psychologically safe space, entered into by the players, in which failure and learning are expected. The theory is that this increase in general psychological safety hypothetically leads to greater specific epistemic curiosity behaviours at work, which in turn leads to increased creativity or innovation.
I like this Tweet from Bernd Schiffer, Agile coach, about making explicit the difference between estimates and commitments. We should make it easier and safer for people to suggest estimates and provide forecasts without worrying that they will be held to them as “deadlines”.
Relevant to the toxic culture pieces above, this tweet from John Cutler is an excellent summary of the traits and impacts of toxic leaders, and how the behaviour perpetuates and even amplifies itself in organisations.
The State of DevOps Report 2021 came out recently, and yet again emphasised the importance of culture, and specifically psychological safety, to digital and organisational transformation. I’ve summarised the report here, or you can read the entire report here (registration required).
Whilst this paper is not open access, the abstract tells you everything you need to know 🙂 “Maintaining Resilience in Times of Crisis: Insights From High-Reliability Organizations” – Organisations that conduct work in high-risk contexts may be able to model the success of HROs by keeping learning foremost, investing time and resources into team training, supporting a climate of psychological safety, coaching employees to keep performance objectives in focus, and practicing systems thinking and accounting for complexity in resource allocation.
I was talking to community member Subash Rajcoomar about his upcoming talk on psychological safety for Agile Mauritius on 14th September, and it looks great. Worth checking out if you’re free.
This episode of “The Rabbit Hole: The Definitive Developer’s Podcast” talks about Psychological Safety and Google’s Project Aristotle. Worth a listen if you work in or around software and technology.
And here is a piece from the Scrum Master Toolbox exploring a possible retrospective format for building an understanding of psychological safety in the team, and the actions that lead to it.
This is a good, concise and actionable article on The Enterprisers Project (aimed at CIOs and IT Leaders) – 5 ways leaders can boost psychological safety on teams by Eeva Raita, Head of Strategy & Culture at the International tech company Futurice.
If you’re in technology, this is a very interesting piece by Dr Nicole Forsgren (of Accelerate and DORA), proposing a framework for examining developer productivity via five dimensions of Satisfaction and well-being; Performance; Activity; Communication and collaboration; and Efficiency and flow (SPACE). Forsgren describes a refreshingly progressive approach that highlights the importance of the social and individual aspects of productivity, and points out that measuring productivity should not be a management tool, but a tool for teams and individuals to use themselves.
I wrote an opinion piece recently about Digital Transformation and Psychological Safety, which turned a little bit into speculation about the various organisational dysfunctions that “digital transformations” are often intended to mitigate. Fundamentally, any organisational transformation will fail unless the need for psychological safety is recognised and addressed. I expanded on these organisational dysfunctions on my blog.
This is excellent – The engineering principles of the Artsy.net tech team explicitly include psychological safety: “At it’s core, engineering is the practice of learning. To learn effectively and to be productive, engineers must feel safe asking questions and discussing mistakes.” I also really like their statement “Being Nice is Nice – There’s always a nice way to handle a situation, and we strive for that.” It’s often stated that psychological safety is not about being nice, and whilst that’s true, that doesn’t mean we shouldn’t strive to be nice.
This is an awesome piece with Nora Jones, founder and CEO of Jeli.io. I have a great deal of respect for Nora – she’s an incredible leader, a great technologist, and a lovely human being. Amongst a lot more, this piece specifically talks about technological incidents and how psychological safety is fundamental to learning from them: “My goal, as the person asking him questions after the incident, is not to interrogate him. It’s to make him feel like an expert. And so by making Jona feel like an expert, he’s gonna feel a lot more psychologically safe. He has a safe space with me to talk about this incident.“
Last week (or maybe the week before), we mentioned SLAM teams. Worth diving into a little deeper – SLAM stands for Small, Lean, Autonomous, and Multidisciplinary. Here’s a great piece that highlights the power of SLAM teams in fostering agility, alignment, collaboration, and speed.
Episode 18 of the Made Tech “Making Tech Better” podcast covers, alongside much more, psychological safety in software testing. Worth a listen if you’re in tech, and one of the first times I’ve heard The Goal referenced in the same podcast as The Fearless Organization 🙂
Staying in tech for the moment, here’s a comprehensive piece on Incident Review and Postmortem Best Practices by Gergely Orosz. I love this point: “Encourage participants to recount their experiences as they happened, not as they think they should have responded in hindsight.“
Kimberly Johnson and Christopher Porter spoke to Gene Kim on Leadership at DevOps Enterprise Summit 2021 (registration required), and Kimberly made some great points highlighting the importance, indeed, the foundational necessity, of psychological safety for high performance organisations, learning from mistakes, generating ideas and making better decisions.
In this InfoQ article, Jessica DeVita (Netflix) and Nick Stenning (Microsoft) share some of what they’ve learned from the research community in learning from production incidents, and offer some advice on the practical application of this work. Notably:
- More and more software systems are becoming “safety-critical”
- Software teams generally analyze failure in ways that are simplistic or blameful
- There are many intuitive traps teams may fall into, particularly the idea that human error is to blame for incidents
- The language used by investigators and facilitators is crucial to learning
- Teams can protect learning by keeping repair planning discussions separate from investigation
I’ve written about psychological safety and information security previously, but here’s the first study I’ve found that explicitly shows that psychological safety contributes to discussions on security concerns in development teams. Software Security Culture in Development Teams: An Empirical Study
I know Joe Fay, and he’s a lovely guy. He’s also just written this excellent piece about the future of work and organisations that aligns with my belief that organisational change is only ever successful through experimentation and learning: “Scenario planning and constant experimentation works for delivering software and services. Applying it on a societal scale might be a daunting prospect, but is probably our best option.”
I also know Andrew Doran, who is also a great guy, and he’s written this excellent piece (actually a follow-up piece) about his experience of how work has changed, continues to change, and may look in the future. Really worth a read, and I want to highlight his prediction on how high performing organisations will:
- Hire the best people;
- Set some ground rules about what is expected, e.g. be where the client wants them to be, and to participate in-person on team-wide/company-wide event days;
- Have an understanding that being in an open-plan office is more about relationship building than getting work done;
- Provide a variety of activity-based workspaces in their offices;
- Let staff decide where the best place is to be productive on any given day.
The Lean Startup framework was conceptualised by Eric Ries, and is defined as “an institution of people organized to make a new product or service in incredibly uncertain circumstances.” The practice of applying small and rapid experiments in order to quickly understand customer needs is central to the Lean Startup methodology, and this great article by Adil Addiya articulately describes why a culture of psychological safety is fundamental to the approach. Lean startup and Psychological safety. How to accelerate innovation by eliminating fear?
Here’s a fab piece from Charity Majors, about the pressures of being on-call in a tech team. I’ve personally witnessed the long-term damage to psychological safety on a team that can be wrought by ill-applied on-call processes and procedures. This article highlights the need to identify and resolve systemic, technological and organisational dysfunctions before attempting to “solve” them through having people on-call. Being on-call is psychologically expensive and can be physically hard too, particularly for anyone with caring responsibilities, neurodiversities, or simply going through a hard time.
This paper: The Values Encoded in Machine Learning Research, is excellent. Abeba and their colleagues point out, articulately and with strong evidence, that “societal needs are typically very loosely connected to the choice of project, if mentioned at all, and that consideration of negative consequences is extremely rare.” This paper reflects the (increasingly?) polarised spectrum of sociological and “hard science” approaches: ML and AI are becoming increasingly popular as tools in organisational management and change, but we’re in for a rough time if we ignore the sociological aspects and unintended consequences of such approaches.
This is an excellent article by Jennifer Riggins on the problem of diversity (or the lack of it) in open source communities. I love this point: “…when you do something to welcome one group, you enrich the community experience for everyone. Helping one person be successful, helps everyone be successful.“
Things to do and try: I did a little skateboarding as a kid, and never got very good, but I could do an ollie. This is a great talk by the godfather of modern street skating and all-round nice guy Rodney Mullen (who invented the flatground ollie) about iteration and innovation. This is such an amazing and inspiring talk about respect of your peers, failure and coming back from it, about learning from others, non-attachment to goals, and creation as a source of joy through sharing it with a community. I don’t know what this video will inspire you to do, but I’m sure it will inspire something.
This is a fab podcast: Greater Than Code #220: Safety Science and Failure As An Opportunity For Growth with Josh Thompson. Josh talks safety science and the art of rock climbing, transferring knowledge from experts to non-experts, and seeing failure as an opportunity for teams to learn.