Something I appreciated about working at AWS is that consultants were not thrown to the wolves without training. The most useful piece was a series of mock customer meetings that modeled likely real-world interactions with both business and IT stakeholders.
Proverbial soft skills are difficult to teach, but they’re critically important, and this kind of simulated experience does a decent job of prepping new hires for the challenges they’ll face. And framing it around an actual project might just result in useful artifacts or marketable solutions. Win-win!
I’m hoping to eventually replicate this practice at my new gig, and thus wanted to capture a rough outline of how I’d structure such a training. Perhaps it makes a case for my prior post?
Participants
Project Team (Builders)
Customer Team (Customers) – Proxy for a real customer
Coach (cannot be on either the Project or Customer team)
Sponsor (can be same as coach, but usually will be manager / executive)
Planning Work
(drafted by Project Team, reviewed by Coach, and approved by Sponsor)
Define relevance to company objectives
Identify participants
Capture project title and description
Identify objectives
Skills to be learned
Artifacts to be built
Identify dependencies / deadlines
Scope guardrails for level of effort
Template
Participants: (identify every role)
Title: (one liner)
Description: (should clearly connect to company objectives)
Expected Outcomes: (list final artifacts here as ends, not means)
Skills To Be Learned: (be as specific as practical, acknowledging some flexibility may be required)
Customer Meeting Guidelines
All are scheduled and led by the Builders
All are attended by the Builders & Customers
Customer team should “play the part” (within reason)
Coach and Sponsor are optional but active participation should be minimized
Meetings are held synchronously and are recorded
Three Phases
1. Discover
Builders review pre-work document
Builders develop agenda and questions for discovery session
Coach reviews agenda and questions (can be async) and gives go-ahead for meeting
Builders schedule and lead discovery meeting
Builders capture notes and document requirements
Builders follow up with Customers post-meeting to gather more info as needed
Builders meet with coach to discuss meeting, review artifacts, and plan next steps (can be async)
2. Design
Builders design architecture/approach based on discovered requirements
Builders capture design and a build plan in a scope document
Coach reviews scope document (can be async) and gives go-ahead for meeting
Builders schedule and lead design review meeting
Customers give go-ahead to proceed (or iterate with Builders until ready)
Builders meet with coach to discuss design meeting, review artifacts, and plan next steps (can be async)
3. Demonstrate
Builders execute on the build plan
Builders engage Customers and Coach as needed to address blockers (can be async)
Coach reviews completed artifacts (can be async) and gives go-ahead for meeting
Builders schedule and lead demonstration meeting
Customers give go-ahead that build is complete (or iterate with Builders until ready)
Builders address feedback and deliver final artifacts to Customers
Builders meet with coach to discuss demo meeting, review artifacts, and capture final thoughts / next steps
A couple of months ago, on a whim, I applied to a training fellows program. I ended up not seriously pursuing it because of an already full “day job” workload, but to be considered I had to write an argument for why I’d make a good trainer. Thought I’d post it here for posterity:
I’ve been communicating technical topics to varied audiences my entire career across a number of dimensions. As both an individual contributor and manager I have:
Sat alongside Air Force crew members in flight to teach them how to operate a map display I implemented
Designed and led classes for technical writers to teach them computing fundamentals (including complex cryptography topics) and how they relate to electronic voting systems
Run workshops within Amazon Web Services on my approach to passing cloud certifications (I’ve also written about that here)
Mentored numerous junior and mid-level developers on how to make the transition to technical leadership (see thoughts I wrote on it here)
Embraced “learning in public” by writing on technical topics as I explored them, for example, Know Thyself (and my blog in general)
Further, the last 5 years of my career have been in a direct consulting capacity, where I routinely interact with both business and IT leaders at the C-suite equivalent level in state and local government to show them how cloud technology can transform and improve the services they offer their residents. I’ve done this both as a leader within a large organization (AWS) and as CTO of a small organization. I have extensive recommendations on LinkedIn that back this up.
None of this experience would matter, though, if I didn’t want to see others learn and grow. I’ve worked to cultivate a joy that comes from seeing others succeed vs just enjoying what my own two hands can build, which I wrote about here.
For many organizations (mine included), November is performance review season. I know it’s not everyone’s cup of tea, but I quite enjoy both getting and giving constructive feedback. So much so that apparently I like writing about how I like performance reviews every November as well.
Today’s post is about specific questions for collecting feedback from stakeholders about a person you’re reviewing. Giving such feedback is uncomfortable for many people; it can feel like tattling or speaking ill of someone behind their back. But as a manager, my perspective is inherently limited (and usually biased). I require input from those who are working with my team day-to-day in order to fairly evaluate them and give them the feedback they need (both growth opportunities and especially praise they may not otherwise hear).
I’ve used a variety of approaches in the past, with mixed success. But I recently formulated a pair of questions in conversation with a career coach that I think can coax out the desired information without making stakeholders have too icky a feeling about giving it. They’re delightfully simple, and can be answered quickly and discreetly via BCC-ed email or even DM:
For reasons not worth getting into here, I’ve been waxing nostalgic (a phenomenon to which I’m apparently rather vulnerable; I mean, seriously). In particular, I took a brief mental stroll down memory lane to think of key leaders who influenced my career trajectory in a positive way. People who took a chance on giving me more responsibility than I’d had previously.
This is far from an exhaustive list, but thank you to the following folks:
Greg, who hired me to my first full-time coding gig despite me not even having yet graduated from my undergraduate computer science program.
Rick, who got me internal funding to implement an idea I had, giving me my first taste of leading a project completely my own.
Cathy, who brought me to San Diego without having seen me in action, only my reputation with a particular set of skills, which her project desperately needed.
Lori, who first promoted me to a management position when my prior boss left the company, and trusted me to scale an engineering organization to meet our goals.
James, who gave me my first singleton leadership position, and helped me think beyond my team and begin operating at a broad organizational level.
Taj, who twice helped me step up into broader responsibilities, and who first challenged me to consider business implications of technical decisions.
Abby, who recruited me into my current role, and has taught me much about how to be a true partner at the executive level.
In all these cases, the individuals didn’t just give me advice. They made opportunities happen and put me in places that caused growth. That’s what makes a mentor a mentor.
If you’re on the other side of a mentoring relationship now, don’t just pontificate. Open doors. Delegate. Trust. Support. Praise. Endeavor to be on someone else’s list.
Add this post to the cacophony, I guess. Today I want to cover a number of common interview mistakes, and approaches both interviewees and interviewers can take to mitigate them. That’s right, it’s a both/and responsibility. I know there have been times I’ve come to an interview unprepared to give the candidate my best, and the result was possibly missing out on a good hire. Don’t do that.
Mistake 1: Regurgitating resume content
As an interviewee, if you’re asked to introduce yourself, give an overview of your background, or describe the projects you’ve worked on, be sure to bring something unique to your answer beyond what’s on your resume. Otherwise you’re just wasting time that’s better spent on other topics. One idea: have an elevator pitch for yourself.
And interviewers? Avoid this by 1) reviewing the candidate’s resume ahead of time, so you don’t feel the need to ask for an overview, 2) gently interrupting if an answer veers into rote regurgitation, or 3) avoiding these kinds of questions altogether. Better to just jump into the meat of the conversation, because interview time is precious. Speaking of…
Mistake 2: Wasting too much time giving introductions
While there’s value in easing into a conversation that’s high-stakes, it does no one any favors if a big chunk of time is spent on introductions that are not data-rich. That goes both for the interviewee and the interviewer. When I led interviews at Amazon I had a script so that I could get through my boilerplate in a crisp 60 seconds. Say hello, set some ground rules, describe the objective of the interview, and then jump in.
Mistake 3: Too much “we” and not enough “I”
Interviews are not the time for false humility. We know doing anything of reasonable complexity requires a team, but what was your role specifically? It’s easy to fall into this trap; if you’re giving the interview, don’t let your candidate go too long before redirecting them to talk more about themselves. If they struggle to give details about their own actions, that’s a concern! Closely related to this next mistake:
Mistake 4: Talking in generalities
This is a fatal mistake of which I’ve written before. Twice. Seriously, tell real stories that actually happened. As an interviewee, this is where preparation is key. Stories are much easier to tell when you’ve captured the details ahead of time. The sorts of behaviors employers are going to ask about aren’t rocket science (unless you’re interviewing for NASA, I suppose). Do a bit of research on the company’s values and prep stories that align to them.
As an interviewer, this is another case where it’s totally okay to interrupt and redirect if you think a candidate is drifting into theory. But help them along by phrasing questions effectively (e.g. “tell me about a time when…”) and perhaps even setting expectations in your intro that you want real examples. If you actually do care about a theoretical answer, say so explicitly.
Mistake 5: Telling the same story twice
Good interview teams are going to talk about the stories they heard during a debrief, so there’s no reason to tell the same story to two different interviewers. This gets easier if you’ve done your prep work. At absolute worst, if a story is rich enough to answer multiple questions, make sure you tell it from different angles.
And interviewers, coordinate ahead of time on the questions you’re going to ask. It’s a waste of time for there to be duplication between interview sessions. It’s a completely avoidable but all too common error that doesn’t speak well of your company when it happens.
Mistake 6: Telling stories not aligned to the level of the role
Candidates should understand the requirements of the role, especially as it relates to the degree of seniority, and be able to give examples that illustrate the required complexity and significance. If you’re asked about a time you failed, and you’re interviewing for a leadership role, you better be able to share a situation that involved real consequences. And if you’re aiming to be a senior engineer, I need to hear about more than just the cool code you wrote… how did you lead?
Interviewers, you need to be familiar with expectations for the role as well, so you can identify these “weaksauce stories” when you hear them. Sometimes follow-up questions can save them, but often I find that it’s better to step in and ask if the candidate has another example.
My favorite way to suss out the significance of a story is to ask a candidate about the people they were talking to at the time: only other engineers at their level? Related roles like product managers or salespeople? Managers? Directors? What about people outside their company, were they interacting with customers directly? Vendors? Other stakeholders? Answers to these questions will tell you a lot about where a person fit in their prior organizations and if they’ll be suited for the role you’re offering.
Mistake 7: Focusing too much on technology
Obviously having the technical skills required of the role is important, but just as critical is the leadership you show in applying those skills. So be sure to talk about those parts of the job too. Keep your stories balanced.
Interviewers also have responsibility here, to keep the questions they ask balanced between technical and behavioral. Amazon did a great job of this; in my experience they put more weight on behaviors aligned to their Leadership Principles during their interviews than any other tech company. Having done over 150 interviews during my tenure there (about half as a Bar Raiser), it’s a lesson I won’t soon forget.
Mistake 8: Failing to discuss results
I love the STAR method for structuring stories: Situation, Task, Action, Result. But these parts of a story are not equally important: they grow in significance from left to right. Results are the most critical part of a story to tell, so make sure you don’t spend too much time describing the situation alone. What was the outcome? If you can share specific numbers, even better (you did prepare your stories, right?).
Interviewers can easily help their candidates here by always asking about results if they’re not shared proactively. Don’t commit this next common error:
Mistake 9: Rushing too quickly to the next question
Stories have layers; take the time to dig into them (10 minutes per story is my rule of thumb) with good follow-ups before moving on to your next prepared question. Sometimes the perfect follow-up emerges organically, but if it doesn’t, keep these classics at the ready:
Also, don’t be afraid of a bit of silence to be sure the candidate is completely done with their story. In graduate school I was trained to pause for 8 full seconds, a practice I still use today. Especially don’t fill silence by doing the following:
Mistake 10: Leading the witness
Interviewers, keep your questions open-ended, and resist the urge to prompt candidates on the “correct” (from your perspective) answer. Let them give the answer they want to give, because that’s what you’re there to evaluate. I recognize this can be difficult, especially if a candidate is struggling; human nature is to want to help. Gentle nudges are okay, but it’s easy to give too much.
Full disclosure: I’m bad at this one. It goes against what are normally healthy collaborative impulses. At least be aware of the tendency. Speaking of fighting against otherwise healthy tendencies:
Mistake 11: Providing too much real-time feedback on answer quality
This last one might be controversial, but I stand by it. Candidates sometimes want to get feedback on how they’re doing, but doing so is tricky territory. For one, it’s not a good use of time. Second, off-the-cuff judgments on candidates are rarely as helpful as thoughtful consideration, so an answer given in real-time might not be a good one. Finally, you don’t want to give a false impression that you might have to go back on later if you decide not to proceed with them.
I’m not saying to be completely stoic, nor am I saying not to give some gentle redirections if you’re not getting what you need (as I’ve said throughout this post), but if asked directly, keep judgments on answer quality to yourself. Depending on the situation, I either use some version of “I heard what I needed to” and move on, or ask a specific follow-up if I think there’s more to discover.
I did tons of hiring during my time at Amazon. I haven’t done as much in my new role, but that’s starting to change (any open roles I’ll put on my LinkedIn profile if you’re interested in checking them out).
There’s a lot I could say about the work of both identifying good candidates and presenting yourself well when you’re looking for a job. But I’ll keep it brief today.
And second, during interviews, my number one thing: tell me what you did, not what you would do. Unless I specifically ask for a theoretical answer, I want to hear actual stories of actual work and actual outcomes. Experience is evidence.
Today’s cautionary reminder to know your audience is something of a sequel to Left Hand, Meet Right Hand. It involves a cold email from a recruiter I got two days ago. Which isn’t a rare occurrence by any means, but what was out of the ordinary was that 1) it was from my former employer, despite there being absolutely no indication the sender realized I was a recent ex-Amazonian, and 2) the jobs being offered were at or below the level I’d been hired at back in 2019, a full five years ago. Needless to say, I’m not interested (and I’m not just saying that because my current boss sometimes reads this blog).
Look, I recognize that this email was probably auto-generated from a LinkedIn search, but it’s a recruiter’s entire job to not only find, but adequately entice, qualified candidates. The poor person was hoist by their own petard with the boilerplate about “raising the bar” and “becoming an industry leader.” Failing to do even a modicum of homework is neither frugal nor customer obsessed.
It’s not like it would be that hard. Even if the automation was solely LinkedIn based, my entire work history is right there and it’s pretty obvious I haven’t been a mid-level software engineer in ten years. But an Amazon employee could easily do even better, given that there’s robust internal tooling for querying data on current and past employees. I should know, because I wrote some of it. In fact, from memory I bet I could write a Python script to cross-check a list of potential job candidates against Amazon’s employee lists.
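To illustrate, a cross-check like that doesn’t need internal tooling at all; at its simplest it’s just a set intersection over normalized names. Here’s a minimal, hypothetical sketch: the file paths and the `name` column are assumptions for illustration, not any real Amazon system.

```python
# Hypothetical cross-check: flag candidates who already appear in an
# employee list. Both inputs are CSVs with a "name" column (an assumed
# format for this sketch).
import csv


def normalize(name: str) -> str:
    """Lowercase and collapse whitespace so 'Jane  Doe' matches 'jane doe'."""
    return " ".join(name.lower().split())


def load_names(path: str, column: str = "name") -> set:
    """Read one column of a CSV into a set of normalized names."""
    with open(path, newline="") as f:
        return {normalize(row[column]) for row in csv.DictReader(f)}


def flag_existing_employees(candidates_path: str, employees_path: str) -> list:
    """Return the sorted names that appear in both files."""
    return sorted(load_names(candidates_path) & load_names(employees_path))
```

A real version would need fuzzier matching (nicknames, name changes), but even this naive pass would have caught the mismatch described above.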
Thanks for the chuckle, my recruiter friend. But do better. Open up your browser, go to https://<redacted_wiki_domain>.com/view/Jud_Neer and you’ll find all the resources and documentation you need to avoid this error in the future.
Open rebuke is better than secret love. Faithful are the wounds of a friend.
We’re in the midst of performance review season, a process I enjoy. Really! Of course performance conversations can happen throughout the year, but there’s something especially valuable about a concentrated time of reflection and intense discussion. Describing strengths, celebrating successes, identifying growth opportunities, rooting out behaviors that are holding someone back; these are all reasons I became a manager in the first place.
This responsibility can feel like a chore, if not a terrible annoyance. Pointing out shortcomings or negative behaviors in colleagues is uncomfortable at best, and if not done with grace and from a foundation of trust, can be damaging and career limiting. But when the feedback is honest, timely, actionable, and includes both positives and critiques, it is a great gift to the receiver. An act of love, even. We don’t use that word often in the workplace, and that’s unfortunate. Genuine human connection is the foundation of anything worth doing. It’s good for you, it’s good for your colleagues, and it’s also good business.
If you don’t know where to begin giving feedback, I recommend the SBI framework for guidance. And no matter if you discuss in person or provide in writing, you should give your feedback some thought ahead of time (and maybe take a few notes). I promise that it gets easier the more experiences you have giving it, though it will always be an emotional process, and that’s a good thing.
Amazon promotions require “best reasons not to promote” to be documented, both from a manager and from any colleague who provides formal feedback. Arguably it was the most important part of the process, because it demonstrated that input came from individuals who could see the candidate clearly enough to speak honestly about both their strengths and their shortcomings.
When coaching candidates for promotion, I recommended they write their own version of that section, and then we’d review theirs alongside my own. Why? Because if you shy away from your deficiencies, you have no counterargument to them. They’re going to be found out anyway by any competent promotion evaluators, so why not get ahead of them.
I don’t pretend this is easy, especially for people who have battled insecurities or have lacked encouraging support throughout their careers. But it’s essential for making meaningful career progress. When advocating for yourself, look your shortcomings straight in the eye.
There isn’t a day that goes by that I don’t miss my Dad, but this week I’ve been particularly reminded of him being the reason I rarely feel insecure in naming my professional weaknesses. What a tremendous gift from a parent, the words “I love you” and “I’m proud of you” spoken aloud, simply and frequently. I must have heard those words hundreds of times. Thousands. So often that their truth got into my bones. I believed him then, and I still do now, even though he’s gone. Thanks Dad.
I don’t have a ton of tech writers that I read regularly, but one that I do is Gergely Orosz. His newsletter, The Pragmatic Engineer, is incredible, full of insights and advice for folks at any point in their technical career journey.
A recent two-part installment discussed in detail the pluses and pitfalls of trying to measure developer productivity, a notoriously difficult problem in software engineering. It’s one I’ve been thinking quite a bit about recently, as I try to balance the business need to understand how much value we deliver per dollar spent against the risk of devolving into a joyless culture of mediocrity that treats its team like coding robots (which, it must be said, they’re not).
If you’re in the same position as me, I’d encourage you to subscribe to the newsletter and give the articles a read-through, but if you’re short on time, I absolutely love this simply summarized single objective measure:
Weekly delivery of customer-appreciated value is the best accountability, the most aligned, the least distorting.
Yup, that sums it up. Other measures matter, but nothing beats screamingly happy stakeholders.