Kurt Nelson: Hey, Groovers, welcome back to Behavioral Grooves. I'm Kurt Nelson, and I'm

Tim Houlihan: Tim Houlihan. In this episode, we get curious about what it means to really find your groove at work, and how one UX researcher is applying behavioral science in surprising ways inside a tech giant.

Kurt Nelson: Surprising ways, that's right. Tim, our guest is Alexis Mook, user research manager at IBM with a PhD in experimental psychology. She seems to be built for breaking products, that is, breaking them for the better. Alexis helps design teams build tools that are not just beautiful, but usable and bias aware.

Tim Houlihan: We covered a lot of ground with Alexis, but if we had to narrow it all down, I'd say that there are three big insights that stuck with us, right? First, I'd say the confirmation bias in design is real and dangerous, right? When product owners test their own designs, it skews the results. So Alexis advocates fiercely for third-party testing to reduce bias and protect the integrity of the insights.

Kurt Nelson: Yeah. And secondly, she reminded us that influence isn't just for end users. Her real magic is in how she navigates internal stakeholders, teaching product managers and developers the value of behavioral research and standing up for science in an environment, Tim, I think you might have experienced this once or twice, in an environment that doesn't always understand it. Oh, yes. Basically, she's figured out a way to be a voice of truth with powerful people in an organization who aren't used to hearing this one word. Tim, what is that one word? So that way, they're not used to hearing this word: no. No, they don't know the word no, or at least they don't hear it told to them when they're trying to have things done. Right? Yeah. Third,

Tim Houlihan: Third, we dug into the purpose paradox. Now, Alexis told us about some high-impact ideas from her academic research on wrongful convictions and how they could apply to the tech world. It's kind of cool stuff, and in doing so, she discovered a more immediate and personal sense of purpose: mentoring young researchers and shaping user experiences that touch millions of people.
Kurt Nelson: Yeah, so we think you're gonna love her sharp insights and her joyful presence, and we hope that it inspires you to reflect on your own groove. And we'd also like to express our gratitude to our producer, Caroline Schaefer, for introducing us to Alexis, who I think is her twin sister, actually, and not just a friend, or...

Tim Houlihan: Like, some kind of... even closer than that. Yeah, clone. Clone.

Kurt Nelson: Yeah. I mean, maybe Caroline cloned herself, or Alexis cloned herself, I don't know, one of the two. But yeah, they sound, they sound so similar. It's crazy. It is crazy. But thank you, and it was like we were talking with Caroline on the show. Yeah, yeah. All right, so sit back, grab a cup of black coffee, just like our guest Alexis likes, and enjoy our conversation with the bias-busting, joy-bringing Alexis Mook.

Tim Houlihan: Alexis Mook, welcome to Behavioral Grooves.

Alexis Mook: Thanks for having me.

Tim Houlihan: It's a pleasure to have you here. We have to start by saying thank you to our producer, Caroline Schaefer, for introducing us, and we're really looking forward to this conversation. We're gonna start with a speed round. So first and foremost, we need to know: do you prefer coffee or tea?

Alexis Mook: Coffee. I'm chugging some right now, right?

Tim Houlihan: All day, dude, all day.

Alexis Mook: Yeah, it doesn't affect me that much, honestly. So yeah, I can have it pretty much all day if I want.

Kurt Nelson: All right, do you have it black? Do you have it with cream and sugar?

Alexis Mook: Black, black. I'm one of those. Yeah.

Kurt Nelson: Oh my gosh, I'm... Crazy. Cool. All right, Alexis, would you prefer to have dinner with your favorite artist, actor, or musician?

Alexis Mook: Ooh, that's a tough one. I'll go... I watch more TV, I think, than I do either of the others, so I'll go an actor or actress. Okay, all right. Off the top of my head, I don't know who it would be, though. There's so many options.

Tim Houlihan: Are you, are you streaming anything right now?

Alexis Mook: That's... my gut says Zac Efron, and I don't know why. I, like, grew up with him, but that's probably a bad choice. I feel like, if I was given time, I'd come up with something better.

Kurt Nelson: I don't know, Zac seems like he'd be an interesting guy to have dinner with, you know? Yeah, he would be fun, right?

Alexis Mook: He's really big in sustainability and stuff too. Like, he's, like, smart too. I think we'd have a lot to talk about.

Kurt Nelson: Oh, see, I didn't know that. There you go.

Tim Houlihan: I think it's a good qualification for a dinner guest just to have some intellectual capability. Yeah, absolutely.

Kurt Nelson: Which rules Tim or me out. So...

Tim Houlihan: We would... we don't make good dinner guests. Okay, Alexis, would you prefer... excuse me, not a prefer. If you could have one superpower for an hour, what would that superpower be?

Alexis Mook: This is so funny. I was just talking about this. Yeah, like, I was somewhere, and I was like, I wish I could just be there already. And I think it's teleportation. Like, I would just, like, boop around to different countries for an hour and check them out and come back. I prefer it to flying. In my mind, I think flying is going to be comparable to running; I feel it's going to exert some energy on my part, whereas with teleportation I'll just, like, be able to boop around town and do whatever I want.

Tim Houlihan: So does that appeal to your lazy self, or your I-can't-get-enough-done-in-one-day self?

Alexis Mook: Yeah, definitely the latter. It's just, it's my, um, my need to fit as much in as possible in as little time as possible.

Kurt Nelson: Oh, I love it. I love it. I've always thought about teleportation. Like, you know, you're driving for four or five hours, and you go, yeah, if we had that thing from Star Trek, I'd already be there.

Alexis Mook: Yeah, I would have the whole day in this new location once I got there.

Kurt Nelson: Well, and think about what that would do for work. I mean, you could have work in person, but you could live anywhere, because it would be like, oh, all right, I have to go to my job in Alaska now. Excuse me.

Alexis Mook: Literally perfect, right? Yeah.

Tim Houlihan: Are you dissing the people who already live in Alaska there?

Kurt Nelson: No. Oh my gosh, only you would take it down that way. I'm, like, going, how great it would be to work in Alaska. There you go. All right, let's just forgive Tim and me bickering here. All right, last speed round question: do we really need UX to design products? I mean, don't people usually just figure stuff out anyway, and it's just, it's all fine? Is that...

Alexis Mook: People don't... I thought you were gonna go a totally different direction with that, like, AI and stuff. And I was like, oh, I don't know if I'm ready for this yet. But no. People? No. Especially the products I'm working on; we're in, like, a really complex domain. So a lot of our stuff... I have a long-winded answer, I guess, which is: it depends what type of product you're on. If you're on a new product and you're starting from scratch, that's, like, a different vibe. At IBM, we have some really, really old legacy products, and when you get in there and you try to update those, it is just chaos.

Unknown: Right?

Alexis Mook: Some people have been using them for decades and, like, it makes sense to them, but no new workers coming in can use them. So we're like, oh, we're trying to fix them for the new workers, and then the older workers, or people who've been doing it for a long time, are like, no, I like it the way it is. And it's like... I've only been on one of those, but it was frustrating as a user researcher, because it's like, who do I cater to? Like, ideally, it should be an easy product, so anyone can come in and use it. But if the experts who've been using it for years like it the way it is... I don't know. It's really a battle between...

Kurt Nelson: Yeah, the status quo bias versus, like, what is required for new people. Well, you just mentioned IBM. So you work at IBM in UX. Can you describe what you do, what your job is, for the listeners here?
Alexis Mook: Yeah, absolutely. So I'm currently a user research manager. I manage a team of researchers along with two other managers. Our team's quite big, so we split up who reports to who. Together, we kind of decide, like, the team strategy and who should be placed where, and kind of who's the best fit for certain teams and stuff like that. Prior to that, I was a user researcher, and so I'm used to the work, and I am probably the most recent manager who had been doing the work, so sometimes my fellow managers will, like, ask me questions about the research, because I was, like, just there. Like, this time last year I was still a researcher, so I'm almost coming up on my one year in management. But yeah, as a user researcher, traditionally at IBM we're embedded in a design team, so we work with UX designers, content designers, visual designers, as well as our three-in-a-box team, which includes product managers and developers. So we all work together to try to build new products for our target audience.

Tim Houlihan: So what's the difference between the designer job and the researcher job when it comes to UX?

Alexis Mook: Well, the researchers are cooler, clearly. That's a shout-out to my designer friends. But no, I mean, the designers are the ones actually in the design tools, like Figma or, let's see, InVision, stuff like that. They're the ones building the screens, whether it's a prototype or even, like, wireframing, or something that's, like, not full fidelity, they call it. So it'd be, like, just boxes on a screen trying to tell you the steps that need to happen. Whereas I have none of those skills, I'll be quite honest with you. And some researchers do, I need to put that out there; like, I don't want to throw all of them under the bus. But, you know, I come from academia as well, so, like, I'm, like, a researcher researcher. There's some who kind of do a little design and a little research, and I think we're all valuable here and there, depending on the situation, I believe. But yeah, so they're the ones actually kind of designing it, and then what I do is I take their designs and then I break their hearts by showing it to users and telling them where it's wrong.

Tim Houlihan: So you're a heartbreaker.

Alexis Mook: Yeah. And that's, I'll say, the behavioral science side of this. Like, why we have UX researchers in there has a lot to do with biases. So, like, this example just happened to me the other day, where one of my designer friends who's not on my team, she was like, yeah, I'm running a usability test today. And I was like, you are? Because I know they have researchers on their team; I know them personally. And I was like, on your own design? And she was like, yeah, like, what's the problem with that? I was like, I mean, other than the, you know, very apparent bias of you showing them your design that you want them to like? Nothing, I guess. She was like, I can do it. And I was like, you can try. But, like, everyone has biases. It can show whether you're verbalizing it or not. Like, you might nod your head, you might smile a little. There's a lot of body language that happens that I think they're kind of unaware of. And I'll say, like, I'm not totally against designers doing research, ever, at all. Like, there's a time and a place. But it should be a third party, in my opinion; it should not be the person who, like, actively designed it. There's just a potential of them, you know, hinting them along the right path, or giving them information they might not have had. If me, a third party, is in there doing it for them, then we'll see that struggle point more clearly, you know? So, yeah, a lot of it is me poo-pooing on other people's parades.

Kurt Nelson: Well, we'll come back to your heartbreaking and other people's parades here in a minute. But I do want to talk a little bit about understanding your journey to this, because, as you said, you were in academia. You had done research, you had a PhD, right? So what led you into this line of work?

Alexis Mook: I never wanted to teach. I've always really liked research. So, right around my senior year of college, I was being encouraged to apply to PhD programs by my mentor at the time, Aaron Mitchell, shout-out to him. I was just so busy with classes and stuff. I was, like, preparing my application, but I was like, I don't have time to make a good application, if that makes sense. I was like, I'd be able to get this in, but I think if I take a year, my GRE scores will be better, like, I'll write a better personal essay, because I'll have time to put into it, yada yada. So I decided to take this gap year where I was trying to get a research job. So I was, like, looking at potential research jobs, and I was only able to find a position in a lab at Georgia State. It was unpaid, and it was just to fill that year with some research for my resume, right? But that lab was a legal psych lab, which was something that always interested me. Legal psychology is just, like, how aspects of psychology impact the legal system. So I did a year there where I was doing some really cool work on, like, fMRI studies, where she was looking at auditory hallucinations in people who are on a spectrum of schizophrenia. I know I like researching, I'm sure I'll have fun, and legal psych seemed very interesting to me. So through that process, I was accepted to FIU, where they have a legal psych program, which is pretty niche. There's a few around the country, but it's very cool work. My mentor did eyewitness decision-making work and alibi research, so that's mostly what I did while I was there, because I worked under him primarily. And then someone was, like, talking about, like, eyewitness memory, you know, victim memory, a lot of memory stuff. So, I mentioned my degree is in experimental psychology, but I think if I were to give myself a title, I would call myself a cognitive psychologist, from the work I've done. But, you know, tomato, tomahto, I'm a psychologist.

Tim Houlihan: I mean, you get to decide, because you've got it. So, and I just want to clarify: FIU, Florida International University, yes, is what we're talking about here. You did a lot of work with fMRIs, and...

Alexis Mook: Just in that year. That was at Georgia State.
How does that inform 275 00:15:52,000 --> 00:15:55,300 what we know about our behaviors and how people actually 276 00:15:55,840 --> 00:16:01,260 Alexis Mook: I mean, I think a lot of that data is almost just 277 00:16:02,520 --> 00:16:06,240 giving validity to current theories, right? So it's like, 278 00:16:06,600 --> 00:16:10,560 you know, just throw someone in an fMRI and see how they react, 279 00:16:10,560 --> 00:16:14,640 and then draw conclusions off of it, right? Like, like, she had a 280 00:16:14,700 --> 00:16:18,480 hypothesis, you know, as science goes, and she was attempting to 281 00:16:18,480 --> 00:16:21,120 back up some of her hypotheses, which were, like I mentioned, it 282 00:16:21,120 --> 00:16:24,740 was, it was, it was pretty cool stuff. It was like, um, some 283 00:16:24,740 --> 00:16:27,860 people are like, pre schizophrenic, where they're not 284 00:16:27,860 --> 00:16:32,780 fully hallucinating things. They might not be hearing or seeing 285 00:16:32,780 --> 00:16:37,940 things, but their imaginations are really, really vivid, so 286 00:16:37,940 --> 00:16:41,680 they sometimes have a hard time differentiate, differentiating 287 00:16:41,680 --> 00:16:46,360 between reality and a dream or reality and a daydream, like 288 00:16:46,360 --> 00:16:48,940 stuff like that. So she was studying this very niche 289 00:16:48,940 --> 00:16:54,340 population to see if they had, I forget what exactly her goal 290 00:16:54,340 --> 00:16:58,780 was. It was something about auditory imaging like it. 291 00:16:58,780 --> 00:17:02,520 Basically, she was trying to prove that them imagining a song 292 00:17:02,700 --> 00:17:07,200 would react the same way in the brain as hearing the song. And 293 00:17:07,320 --> 00:17:11,400 to my knowledge, that's what they found. So wow, it's kind of 294 00:17:11,400 --> 00:17:13,200 cool stuff. Yeah, that 295 00:17:13,200 --> 00:17:14,520 Tim Houlihan: is very cool. Yeah. 296 00:17:14,580 --> 00:17:17,820 Kurt Nelson: Okay, so Alexis, let's go back to the journey. 297 00:17:17,820 --> 00:17:20,900 And you are, you're at, you're at, FIU, you're getting all 298 00:17:20,900 --> 00:17:25,640 this, yeah. How does the jump go? Then into to get to IBM, 299 00:17:25,940 --> 00:17:27,140 yeah. So, like 300 00:17:27,140 --> 00:17:28,820 Alexis Mook: I said, I knew I always want to do applied 301 00:17:28,820 --> 00:17:31,880 research when I got there, because it's an applied program, 302 00:17:31,880 --> 00:17:35,480 I kind of assumed they, like would set you up for that, and 303 00:17:35,480 --> 00:17:38,360 not no shade on them. But they're all academics, right? 304 00:17:38,360 --> 00:17:41,800 Like the actual professors, none of them went into that and so, 305 00:17:42,040 --> 00:17:45,580 so they're there to teach you the science, and not necessarily 306 00:17:45,580 --> 00:17:49,960 teach you how to use it outside of academia. So I was kind of 307 00:17:49,960 --> 00:17:52,840 doing my own leg work on like, what, what roads I could take. 308 00:17:53,080 --> 00:17:55,420 There were some cool things that people from the program have 309 00:17:55,420 --> 00:17:59,320 done. A lot of them go into trial consulting, which I 310 00:17:59,320 --> 00:18:01,800 thought was kind of cool. Like, I was eyeing that up for a 311 00:18:01,800 --> 00:18:06,540 little bit, and then a lot of them went to the FBI, which I 312 00:18:06,540 --> 00:18:11,400 also thought was kind of cool, right? 
That is, yeah, it was for 313 00:18:11,400 --> 00:18:14,400 a while I was really gung ho on the FBI, and then I was, just 314 00:18:14,400 --> 00:18:17,760 happened to be friends with one of the former graduates who went 315 00:18:17,760 --> 00:18:21,320 there, and she told me that she didn't like it at all. And 316 00:18:21,320 --> 00:18:24,740 chicks like, she was like, it's actually super boring, which I 317 00:18:24,740 --> 00:18:29,540 could not imagine. The FBI, like, that sounds so cool. So 318 00:18:29,960 --> 00:18:32,600 Tim Houlihan: hasn't she seen any of the police dramas on TV? 319 00:18:32,660 --> 00:18:33,320 Yeah, police 320 00:18:33,320 --> 00:18:36,140 Kurt Nelson: dramas movies. That's always exciting. And, 321 00:18:36,140 --> 00:18:37,880 Alexis Mook: you know, they it was like a fellowship. So they 322 00:18:37,880 --> 00:18:40,940 weren't, they were paying, like, a very low salary to live in DC 323 00:18:40,940 --> 00:18:43,600 and stuff. And I was just like, I was like, I don't know, like, 324 00:18:43,660 --> 00:18:46,780 it seems really cool, but what if I just look at other stuff? 325 00:18:47,200 --> 00:18:52,600 And then one of my old colleagues, he was graduating 326 00:18:52,720 --> 00:18:54,940 about the time I was in my second year or third year, I 327 00:18:54,940 --> 00:18:58,780 forget, but he told me to look into UX, because he was looking 328 00:18:58,780 --> 00:19:02,220 into UX, and he sounds right up your alley, because it's applied 329 00:19:02,220 --> 00:19:06,540 psych in a like, active environment where, like, your 330 00:19:06,540 --> 00:19:10,560 research is making an impact, like, instantaneously, like, and 331 00:19:10,560 --> 00:19:14,340 that's, I'd say, the biggest issue with some of our legal 332 00:19:14,340 --> 00:19:17,460 psych research is, like, we can sit here and tell you guys how 333 00:19:17,460 --> 00:19:20,720 to interview someone all day long based on Our research, but 334 00:19:20,720 --> 00:19:24,920 it's all usually not implemented for like, 20 years, you know, 335 00:19:24,920 --> 00:19:28,580 like trying to, trying to talk to, you know, police officers or 336 00:19:29,240 --> 00:19:32,840 or anyone in, like, the government and whatever. So I 337 00:19:32,840 --> 00:19:36,380 was really keen on trying to make an immediate impact, and 338 00:19:36,380 --> 00:19:40,340 like getting, like, quick and dirty research done and helping 339 00:19:40,460 --> 00:19:43,780 a product move along. I started just interviewing around. And 340 00:19:43,780 --> 00:19:47,200 then, you know, once you know I was completed, my dissertation 341 00:19:47,200 --> 00:19:52,060 was like, really on the job market. Networking like crazy 342 00:19:52,060 --> 00:19:54,820 was a big thing too, just like talking to people, understanding 343 00:19:54,820 --> 00:19:57,820 how they got into it. The more people you talk to, the more you 344 00:19:57,820 --> 00:20:01,080 learn, the more prepared you are for an interview. Right? So a 345 00:20:01,080 --> 00:20:05,280 lot of that really helped me. And, yeah, and then I just, I 346 00:20:05,280 --> 00:20:08,880 was, I was on the final round with a couple companies, and 347 00:20:08,880 --> 00:20:13,200 then it was IBM, and I was, like, sold. I really liked the 348 00:20:13,440 --> 00:20:16,140 manager who I was speaking to at the time. He's since left IBM, 349 00:20:16,140 --> 00:20:18,420 but like, really hit off with him. 
Really hit off with the 350 00:20:18,420 --> 00:20:21,560 researcher who was part of a, like, they gave me a project as 351 00:20:21,560 --> 00:20:24,680 part of the interview, like a research study, to do quick and 352 00:20:24,680 --> 00:20:28,280 easy for them and then show them how I would have they're not 353 00:20:28,280 --> 00:20:30,620 like looking for research to be completed. They're looking to 354 00:20:30,620 --> 00:20:33,740 see, like your thought process of how you would have developed 355 00:20:33,740 --> 00:20:34,220 a study. 356 00:20:38,180 --> 00:20:40,360 Kurt Nelson: Hey, Groovers, we want to take a moment away from 357 00:20:40,360 --> 00:20:43,240 our conversation to thank you for listening to Behavioral 358 00:20:43,240 --> 00:20:45,820 Grooves. If you enjoy the conversations we're having and 359 00:20:45,820 --> 00:20:49,420 want to help us keep the groove going, here are a few simple 360 00:20:49,420 --> 00:20:51,940 ways that you can support the show. First off, 361 00:20:52,000 --> 00:20:54,640 Tim Houlihan: subscribing to our sub stack is a great way to stay 362 00:20:54,640 --> 00:20:57,940 connected with us between episodes. The weekly newsletter 363 00:20:57,940 --> 00:21:01,200 provides you with cool insights that are beyond the episodes, 364 00:21:01,200 --> 00:21:03,540 and they get delivered straight to your inbox, 365 00:21:03,600 --> 00:21:07,200 Kurt Nelson: and if you haven't already leaving a review or a 366 00:21:07,200 --> 00:21:10,440 rating of the podcast on a platform like Apple or Spotify 367 00:21:10,440 --> 00:21:14,100 or YouTube, helps other curious minds discover us. And there's 368 00:21:14,100 --> 00:21:17,640 two great things about that. One, it gives us a boost. And 369 00:21:17,640 --> 00:21:20,040 two, it costs nothing, 370 00:21:20,100 --> 00:21:23,420 Tim Houlihan: and it only takes a second, but it makes a huge 371 00:21:23,420 --> 00:21:26,600 difference for us. Plus, we love hearing from you, so don't be 372 00:21:26,600 --> 00:21:29,720 shy. Leave us a review or give us a quick thumbs up. 373 00:21:30,020 --> 00:21:32,660 Kurt Nelson: We're coming up on 500 episodes, and we're doing 374 00:21:32,660 --> 00:21:34,880 this because we love the conversations we have with our 375 00:21:34,880 --> 00:21:35,840 guests. Yeah, 376 00:21:35,840 --> 00:21:38,540 Tim Houlihan: we also want to do it because we love bringing you 377 00:21:38,540 --> 00:21:41,680 insightful behavior, changing content every week, and we hope 378 00:21:41,680 --> 00:21:48,340 that some of those insights will help you find your groove. It's 379 00:21:48,340 --> 00:21:51,220 good to see the smile on your face that says, Yeah, this is 380 00:21:51,220 --> 00:21:55,480 actually a good gig here at Behavioral Grooves headquarters, 381 00:21:56,260 --> 00:22:01,680 global headquarters, of course, we we try to focus on human 382 00:22:01,680 --> 00:22:04,500 behavior, right? And, and it's the, how do you find your 383 00:22:04,500 --> 00:22:06,960 groove? How do you, you know, what are the why do we do the 384 00:22:06,960 --> 00:22:12,180 things that we do? And as a UX researcher, tell us how your 385 00:22:12,180 --> 00:22:18,660 work works, to try to understand and deal better with how humans 386 00:22:18,660 --> 00:22:20,240 actually behave. 
You know, 387 00:22:20,240 --> 00:22:22,700 Alexis Mook: where there's this big debate in the field, 388 00:22:22,700 --> 00:22:27,080 actually over whether we should be called a user researcher or a 389 00:22:27,080 --> 00:22:31,100 product researcher or a design researcher like and no one knows 390 00:22:31,100 --> 00:22:33,560 what the best title is. But because, like, what do 391 00:22:33,560 --> 00:22:38,120 Tim Houlihan: you say? What's your vote on that? So I once 392 00:22:38,120 --> 00:22:41,800 Alexis Mook: heard someone say we shouldn't be called user 393 00:22:41,800 --> 00:22:46,060 researchers, because the only time you call someone a user is 394 00:22:46,120 --> 00:22:51,820 if they're, like, addicted to a drug, or one other example that, 395 00:22:51,820 --> 00:22:55,720 like, was also bad. And I was like, Yeah, you're right. Like, 396 00:22:56,200 --> 00:22:59,200 Kurt Nelson: they like, yeah. I'd never even that didn't occur 397 00:22:59,200 --> 00:23:00,420 to me, but yes, 398 00:23:00,420 --> 00:23:03,000 Alexis Mook: so I've actually started at work trying to either 399 00:23:03,000 --> 00:23:06,180 call them like customers. If we actually have customers who use 400 00:23:06,180 --> 00:23:08,580 the product or be on a new product, I try to call them like 401 00:23:08,760 --> 00:23:12,420 a target audience, like I try to use different words. I don't 402 00:23:12,420 --> 00:23:15,600 know if it matters at the end of the day, but after hearing that, 403 00:23:15,660 --> 00:23:20,240 I was like, that is very that makes sense. I personally think 404 00:23:20,240 --> 00:23:24,560 we're more I would call us product design at IBM, and then 405 00:23:24,560 --> 00:23:27,320 I hope none of my design friends yell at me for that, because, 406 00:23:27,320 --> 00:23:31,100 like, we do do design research as well, but at the end of the 407 00:23:31,100 --> 00:23:33,560 day, we're all building a product together, right? And a 408 00:23:33,560 --> 00:23:36,860 lot of our research, especially early on, is in like, go to 409 00:23:36,860 --> 00:23:41,080 market fit and like, like pricing studies and like feature 410 00:23:41,080 --> 00:23:44,620 analysis, like, we'll run some pretty complex statistics on 411 00:23:45,580 --> 00:23:49,180 different features to understand which ones are weighted higher 412 00:23:49,780 --> 00:23:52,960 and what should be prioritized on a roadmap. So, so, yeah, I 413 00:23:52,960 --> 00:23:56,680 would, I would consider as product researchers, but I'll be 414 00:23:56,680 --> 00:23:59,200 called whatever I'm called. I'm not going to be offended. 415 00:23:59,620 --> 00:24:02,220 Tim Houlihan: Yeah. So how does this? How does this get to 416 00:24:02,640 --> 00:24:08,100 understanding or advising or influencing 417 00:24:08,280 --> 00:24:12,240 Alexis Mook: our behaviors? Yeah, so that's what I'm trying 418 00:24:12,240 --> 00:24:15,480 to get to, is like while, at the end of the day, my job is 419 00:24:15,480 --> 00:24:21,140 building a product, my job as a user researcher is making sure 420 00:24:21,140 --> 00:24:27,920 that product is like, intuitive, easy to use and quick for our 421 00:24:27,920 --> 00:24:33,440 customers who are using it. So I guess my job is to influence the 422 00:24:33,440 --> 00:24:38,540 customers to be able to do their job faster given the design of 423 00:24:38,540 --> 00:24:42,700 the product. 
And like I said, also, I think a lot of my job is 424 00:24:42,700 --> 00:24:47,620 influencing those around me to not build a crazy bias product. 425 00:24:49,480 --> 00:24:52,120 Kurt Nelson: Yeah, yeah. Well, I think I think that's an 426 00:24:52,120 --> 00:24:56,740 interesting piece, right? Because oftentimes our jobs are 427 00:24:56,740 --> 00:25:00,480 not just like focusing in on the end product. The. Result, 428 00:25:00,480 --> 00:25:04,380 whatever it would be, but it is the people that we're working 429 00:25:04,380 --> 00:25:08,700 with that have maybe a different perspective, or don't have the 430 00:25:08,700 --> 00:25:14,400 perspective that we have, and so it's influencing them as well, 431 00:25:14,400 --> 00:25:19,020 through the science, through the experience that you have, 432 00:25:19,020 --> 00:25:23,000 through all of those facets. So in your job, when you're working 433 00:25:23,060 --> 00:25:27,020 in this, obviously, it's looking at the final design and making 434 00:25:27,020 --> 00:25:30,140 sure that it's as easy to use, intuitive to use, as possible, 435 00:25:30,140 --> 00:25:33,380 so that people don't have to have all those friction points. 436 00:25:34,280 --> 00:25:38,240 But with the people that you're working with, what are some of 437 00:25:38,240 --> 00:25:41,080 the what are some of the things you already brought up? One like 438 00:25:41,080 --> 00:25:43,840 the person that you mentioned that I'm going to put my own 439 00:25:43,960 --> 00:25:47,320 research design out there. But what are some other kind of 440 00:25:47,320 --> 00:25:50,740 roadblocks that you run into that you think you need to help 441 00:25:50,740 --> 00:25:53,380 people see a different way of doing things? 442 00:25:54,220 --> 00:25:57,880 Alexis Mook: Yeah, so up Friday, I want to also say I feel like 443 00:25:57,880 --> 00:26:01,800 this is a very important time for me to be sharing the value 444 00:26:01,800 --> 00:26:05,100 of research to a large crowd, because there's a lot of 445 00:26:05,100 --> 00:26:09,660 companies that are cutting a lot of research departments, ours 446 00:26:09,660 --> 00:26:13,560 included. It was not my org, but it was another one parallel to 447 00:26:13,560 --> 00:26:17,520 us, and I think they pretty much cut all their US researchers, 448 00:26:17,640 --> 00:26:21,620 and they were very, very smart and talented people. So, you 449 00:26:21,620 --> 00:26:23,960 know, that's a bummer to see at your own company. And I see it 450 00:26:23,960 --> 00:26:27,140 other places too. I think now more than ever, we're trying to, 451 00:26:27,140 --> 00:26:32,120 like, argue our value, yeah, but I see it every day, quite 452 00:26:32,120 --> 00:26:34,760 honestly, like, and I don't know, like, if I should just 453 00:26:34,760 --> 00:26:38,360 start showing people more more transparently, but I literally, 454 00:26:38,360 --> 00:26:42,880 I had to give a presentation on sample size last week. So like, 455 00:26:43,120 --> 00:26:47,020 like, I any type of study they're running this. This even 456 00:26:47,020 --> 00:26:50,320 happened in a meeting this week. They're like, why, why do you 457 00:26:50,320 --> 00:26:54,640 only talk to six people? Why don't you do up to 10? And it's 458 00:26:54,640 --> 00:26:58,660 like, there is a plethora of work that shows you only need 459 00:26:58,660 --> 00:27:02,100 five to eight people in qualitative work to distill like 460 00:27:02,160 --> 00:27:06,240 85% of themes you're going to get anyways. 
If we if we were to 461 00:27:06,240 --> 00:27:10,800 do 12 to 10 people, we would have spent more money and more 462 00:27:10,800 --> 00:27:15,540 time to give you probably the same result. And then don't even 463 00:27:15,540 --> 00:27:19,260 get me into quantitative work, because almost no one knows how 464 00:27:19,260 --> 00:27:24,620 to do that, except the PhDs in the program. And we actually, me 465 00:27:24,620 --> 00:27:29,780 and two of my other coworkers who also have PhDs and different 466 00:27:29,780 --> 00:27:32,960 things, we started a quantitative guild where, at 467 00:27:32,960 --> 00:27:35,840 first we were hoping it to kind of be how you guys described 468 00:27:35,840 --> 00:27:38,000 this at the beginning, like we're just gonna be a bunch of 469 00:27:38,000 --> 00:27:41,140 like nerds getting together and talking quant stuff like it that 470 00:27:41,140 --> 00:27:44,740 will be fun. And then when people joined, we saw, like, the 471 00:27:44,740 --> 00:27:48,760 whole audience was like, novices who were like, We need to learn 472 00:27:48,760 --> 00:27:52,540 how to do quant work. So then it came into like, now I'm like, 473 00:27:52,540 --> 00:27:58,840 teaching a stats class to people, and I'll say, like, 474 00:27:58,840 --> 00:28:03,120 there are, I oftentimes it's up to the researcher to know these 475 00:28:03,120 --> 00:28:06,300 things and be the one speaking up to them in meetings when 476 00:28:06,300 --> 00:28:10,080 you're deciding what method to use, or what sample it is, and 477 00:28:10,140 --> 00:28:13,620 how to run the statistics, or if statistics are even needed, like 478 00:28:13,740 --> 00:28:17,280 that's usually on the researcher to know. I will say I have 479 00:28:17,280 --> 00:28:21,500 worked with some really smart PMS who also, you know, they've 480 00:28:21,500 --> 00:28:25,040 taken a stats class. I know the basics, and I personally like 481 00:28:25,040 --> 00:28:27,980 that stuff because it is kind of quantity, and I think it's kind 482 00:28:27,980 --> 00:28:31,220 of fun, but not everyone does. So again, it all kind of just 483 00:28:31,220 --> 00:28:34,160 depends on what products you're on and what phase of like, 484 00:28:34,160 --> 00:28:36,740 development they're in on, like, what you'd actually be really 485 00:28:36,740 --> 00:28:37,640 doing day to day. 486 00:28:38,000 --> 00:28:40,600 Tim Houlihan: So if you could wave a magic wand and have 487 00:28:40,600 --> 00:28:46,780 everybody understand this is the value that the UX researchers 488 00:28:46,780 --> 00:28:50,440 bring to the table. What would those say? 
Two things be that 489 00:28:50,440 --> 00:28:52,540 they have the largest misconceptions about, 490 00:28:52,900 --> 00:28:55,600 Alexis Mook: oh, my goodness, I would love to have actual data 491 00:28:55,660 --> 00:28:59,860 on, like, product usage with and without a researcher on the 492 00:28:59,860 --> 00:29:03,480 team, or something like, like having them fully live through 493 00:29:03,480 --> 00:29:06,600 the development cycle with a researcher helping them and 494 00:29:06,600 --> 00:29:10,440 without one and see the outcome like that would be, that would 495 00:29:10,440 --> 00:29:13,500 be my dream, like a parallel universe where they're seeing 496 00:29:13,500 --> 00:29:15,900 both experiences at the same time, kind 497 00:29:15,900 --> 00:29:19,380 Kurt Nelson: of like this wonderful Life where, like, 498 00:29:19,380 --> 00:29:21,680 here's your life with your life, with your life, 499 00:29:21,680 --> 00:29:25,520 Alexis Mook: literally, yes, that would be awesome. Because I 500 00:29:25,520 --> 00:29:29,120 do think I it depends on the team. As I said, like, I do 501 00:29:29,120 --> 00:29:33,440 think it's hard to see, like, without actually having 502 00:29:33,440 --> 00:29:37,040 experienced both of those like, I could imagine why it'd be 503 00:29:37,040 --> 00:29:39,860 like, Well, why would we need a researcher? Like, I can do 504 00:29:39,860 --> 00:29:43,840 research like PMS, consider talking to two customers 505 00:29:43,840 --> 00:29:47,800 research, and it's like, it's like you were they're also, 506 00:29:47,800 --> 00:29:49,660 like, their buddies too. It's like, you're just, like, 507 00:29:49,660 --> 00:29:51,760 chatting with your friend. That's not research. 508 00:29:52,660 --> 00:29:54,460 Tim Houlihan: That's a nice conversation, but that's not 509 00:29:54,460 --> 00:29:55,600 really, yeah, and like, 510 00:29:55,660 --> 00:29:58,360 Alexis Mook: some, like, I said, some of them are really good. 511 00:29:58,360 --> 00:30:01,320 And like, they, I know some of them. Am do good research, for 512 00:30:01,320 --> 00:30:04,020 sure, but I'm like, if there's some one off conversations with 513 00:30:04,020 --> 00:30:07,080 customers, that's not really research, like, are you 514 00:30:07,080 --> 00:30:09,480 following a script? Like, are you asking them all the same 515 00:30:09,480 --> 00:30:13,320 questions? Like, I doubt it. I think you're just riffing. And 516 00:30:13,320 --> 00:30:16,080 then you're like, come back with all these ideas. And it's like 517 00:30:16,380 --> 00:30:19,320 those ideas aren't invaluable by any means, but it's like those 518 00:30:19,320 --> 00:30:22,940 ideas need validated if that's how you got them, you know, 519 00:30:23,480 --> 00:30:23,840 yeah, 520 00:30:25,040 --> 00:30:27,140 Kurt Nelson: how are you bringing in with your 521 00:30:27,140 --> 00:30:29,960 background, and obviously the research methodologies of 522 00:30:29,960 --> 00:30:31,700 bringing in, but are you bringing in any of the 523 00:30:31,700 --> 00:30:35,780 psychology or the behavioral science that you studied into 524 00:30:35,780 --> 00:30:37,700 this work that you're doing? I 525 00:30:37,700 --> 00:30:41,800 Alexis Mook: mean, there's a ton of biases that I'll bring up 526 00:30:41,860 --> 00:30:45,280 that, like, I know, just from my background, right? Like, yeah, 527 00:30:45,280 --> 00:30:48,640 like, availability bias, like, we're like, talking about 528 00:30:48,640 --> 00:30:56,500 sampling bias the other day, you know, in my sample size. 
So 529 00:30:57,520 --> 00:31:03,660 yeah, I say like, social psych and certain examples of why and 530 00:31:03,660 --> 00:31:06,660 when things could be bias, like comes up a lot. Quite honestly, 531 00:31:06,900 --> 00:31:11,760 I do say, though, like a lot of our team, I think trying to be 532 00:31:11,820 --> 00:31:15,300 unbiased has been drilled into their heads so, like, they know 533 00:31:15,300 --> 00:31:18,300 they they know what they shouldn't do. Usually, I think 534 00:31:18,540 --> 00:31:21,560 in this, I see this everywhere all the time. I feel like, you 535 00:31:21,560 --> 00:31:26,360 guys probably can relate. But someone's like, well, you know, 536 00:31:26,420 --> 00:31:32,660 I'm, I'm gonna be as unbiased as possible. And I'm like, I'm 537 00:31:32,660 --> 00:31:36,140 like, you can't be unbiased. Like, that's I like, I've given 538 00:31:36,140 --> 00:31:38,660 up on being unbiased. It's not gonna happen. I'm incredibly 539 00:31:38,660 --> 00:31:42,040 biased. Like, if anything, I'd rather you come to me say I'm 540 00:31:42,040 --> 00:31:46,480 incredibly biased, but I think this easier to work with, and 541 00:31:46,660 --> 00:31:49,600 then I'm like, okay, cool. Like, like, let's see what someone 542 00:31:49,600 --> 00:31:50,260 else thinks. Well, 543 00:31:50,620 --> 00:31:54,640 Tim Houlihan: give us an example of you mentioned confirmation 544 00:31:54,640 --> 00:31:58,900 bias. Can you give us an example a real world you're in a 545 00:31:58,900 --> 00:32:01,800 meeting, you're having a conversation, you're presenting 546 00:32:01,800 --> 00:32:06,900 some ideas, when does of psychological bias come to the 547 00:32:07,200 --> 00:32:08,700 forefront of the discussion? 548 00:32:10,200 --> 00:32:15,900 Alexis Mook: Yeah? Yeah. It's sometimes it's tough because 549 00:32:15,960 --> 00:32:20,660 sometimes it's from a source that is powerful and important, 550 00:32:20,720 --> 00:32:29,240 and often told no. So sometimes I can't call it out, depending 551 00:32:29,240 --> 00:32:34,040 on the situation. Like, I can. I have a pretty big like, I have a 552 00:32:34,040 --> 00:32:38,780 pretty strong backbone. Like, I I'll tell someone how it is. Not 553 00:32:38,780 --> 00:32:42,820 all researchers are that way. Sometimes I'm like, I've told 554 00:32:42,820 --> 00:32:45,880 some of my researchers, pull me into the meeting, and I'll yell 555 00:32:45,880 --> 00:32:49,600 at them for you, like, so, but it will be like, there, there 556 00:32:49,600 --> 00:32:53,320 was this product, and it was an acquisition. So this guy, like, 557 00:32:53,380 --> 00:32:58,120 literally built it from the ground up. It was his baby, and 558 00:32:58,120 --> 00:33:01,380 he was just any I, luckily, I was not the researcher on the 559 00:33:01,380 --> 00:33:03,840 team. I just worked with one of the researchers on the team, and 560 00:33:04,200 --> 00:33:08,520 anything she presented, he just tore apart. And was like, That 561 00:33:08,520 --> 00:33:10,680 can't be that way because of this. Like, that can't be that 562 00:33:10,680 --> 00:33:14,100 way because of this. Like, like, that's totally wrong because, 563 00:33:14,160 --> 00:33:17,340 oh, don't even get me into you talk to the wrong people. That's 564 00:33:17,940 --> 00:33:23,120 everybody's favorite thing to say. And I just said the other 565 00:33:23,120 --> 00:33:26,660 day, I wish I had $1 every time someone said that in a playback. 
566 00:33:26,720 --> 00:33:30,260 So like, let me dig into that a little deeper, because that's 567 00:33:30,260 --> 00:33:35,720 that is the form of bias is like they're so, so certain in their 568 00:33:35,720 --> 00:33:39,500 thoughts of how this project should have played out, that 569 00:33:39,500 --> 00:33:43,240 when I'm presenting them research where the people I 570 00:33:43,240 --> 00:33:47,560 selected were selected with a screener that we built together 571 00:33:47,560 --> 00:33:51,100 as a team. He he came up, they came up with all the questions, 572 00:33:51,100 --> 00:33:55,780 like I off. This happened later in my career, but I find this 573 00:33:55,780 --> 00:33:59,740 very valuable. You have the PMs pick who you're going to talk to 574 00:34:00,280 --> 00:34:05,100 out of the list of people who were qualified, and then you do 575 00:34:05,100 --> 00:34:10,260 the study, and then they will still, at the end say, Well, I 576 00:34:10,260 --> 00:34:13,140 just don't really think we are talking to the right people. And 577 00:34:13,140 --> 00:34:17,280 it's like you only think that because the results did not 578 00:34:17,280 --> 00:34:21,260 validate your current thoughts. Like, why don't we take a minute 579 00:34:21,260 --> 00:34:24,500 to process, what if this was the reality? What if I did talk to 580 00:34:24,500 --> 00:34:28,820 the right people, like, let's, let's imagine this. So that 581 00:34:28,820 --> 00:34:31,760 comes up a ton. And, you know, that's just confirmation bias to 582 00:34:31,760 --> 00:34:34,340 a T. It's just they're being told something that is 583 00:34:34,340 --> 00:34:36,800 conflicting with what they thought they'd hear. You know, 584 00:34:37,580 --> 00:34:38,180 yeah, 585 00:34:38,840 --> 00:34:41,200 Kurt Nelson: and it's really interesting when you said, you 586 00:34:41,200 --> 00:34:46,180 know, oftentimes people in power and they don't have that voice 587 00:34:46,180 --> 00:34:49,300 that's telling them, No, this is that's, that's not right, 588 00:34:49,300 --> 00:34:52,240 because that's just how it is. And I see it actually in the 589 00:34:52,240 --> 00:34:57,040 work that we do as well. And so it is a common trait, I think in 590 00:34:57,040 --> 00:35:02,280 many organizations. I mean, the people do. Tend to move up in 591 00:35:02,280 --> 00:35:05,460 the ladder. Are very sure of themselves, and they have very 592 00:35:05,700 --> 00:35:09,240 confident and they kind of push their way through many times, 593 00:35:09,840 --> 00:35:14,460 which can be very, very good for an organization. It does have 594 00:35:14,460 --> 00:35:17,580 its downsides, and those downsides are sometimes when the 595 00:35:17,580 --> 00:35:21,680 truth is shown to them, it's hard for them to accept, if it 596 00:35:21,680 --> 00:35:23,900 is against what they currently believe. 597 00:35:24,200 --> 00:35:26,720 Alexis Mook: So that's funny. You brought that up. We were 598 00:35:26,720 --> 00:35:32,480 just talking about this at lunch the other week, where chicken or 599 00:35:32,480 --> 00:35:38,120 egg, I'm like, Do you only get to a high leadership position 600 00:35:38,420 --> 00:35:42,400 because you have these traits that are like, I'm dominant and 601 00:35:42,400 --> 00:35:44,740 I'm just going to trench forward, and I kind of don't 602 00:35:44,740 --> 00:35:47,560 care who I hurt in the way and like, I'm just going to go up to 603 00:35:47,560 --> 00:35:52,420 the top. 
Or do you get to the top, and you have to be that way 604 00:35:52,480 --> 00:35:55,540 to like, interact with those even higher than you, and then 605 00:35:55,540 --> 00:35:59,260 you like change your personality to become even more like. I know 606 00:35:59,260 --> 00:36:01,860 they've said like, they share, like, Would it be, like, 607 00:36:01,860 --> 00:36:05,220 sociopathic tendencies or something like that? So funny. 608 00:36:05,520 --> 00:36:08,520 We had that conversation at lunch the other day, and I was 609 00:36:08,520 --> 00:36:11,280 like, I was like, I feel like it's probably a little bit of 610 00:36:11,280 --> 00:36:14,100 both, right? I imagine you probably kind of have some of 611 00:36:14,100 --> 00:36:17,460 those traits to begin with, but then they might get bolstered 612 00:36:17,520 --> 00:36:20,420 the more and more people around you are bolstering them for you, 613 00:36:20,420 --> 00:36:24,080 right? So, yeah, that's, that's funny that you brought that up. 614 00:36:24,080 --> 00:36:26,000 Kurt Nelson: I do love that, because I think there is, 615 00:36:26,000 --> 00:36:29,120 that's, it's an interesting conversation, right, to have 616 00:36:29,120 --> 00:36:33,140 around lunch table or wherever, and that, that there is 617 00:36:33,140 --> 00:36:35,960 probably, it's probably not a chicken or egg, it's probably 618 00:36:36,440 --> 00:36:39,740 something that can be, you know, a little bit of both, right? 619 00:36:39,740 --> 00:36:41,380 Yeah, there's that aspect 620 00:36:41,380 --> 00:36:43,480 Alexis Mook: of HR and nurture. It's a nature 621 00:36:43,600 --> 00:36:46,240 Kurt Nelson: and nurture kind of component, right? Oh, yeah. Tim 622 00:36:46,240 --> 00:36:48,040 will love that. Oh, I'm 623 00:36:48,040 --> 00:36:52,060 Tim Houlihan: always into, always love that discussion. I 624 00:36:52,060 --> 00:36:56,680 guess one of the, one of the things that I think our 625 00:36:56,680 --> 00:36:59,920 listeners might be interested in hearing about, though, is on a 626 00:36:59,920 --> 00:37:03,900 day to day basis. You you have this look on your face. You have 627 00:37:03,900 --> 00:37:08,220 a that is joyful, right? You have this sense of happiness 628 00:37:08,220 --> 00:37:11,280 about you, like you found your groove. Is kind of what it 629 00:37:11,280 --> 00:37:14,880 sounds like. Is that set of assessment you caught me on a 630 00:37:14,880 --> 00:37:15,420 good day? 631 00:37:18,000 --> 00:37:21,320 Alexis Mook: Yeah, I will say this move to management, I've 632 00:37:21,320 --> 00:37:25,160 really enjoyed, I really love my researchers. They're really cool 633 00:37:25,160 --> 00:37:27,500 people. They're very good at their job, which makes my job 634 00:37:27,500 --> 00:37:31,640 easy. So, like, having a really good team like this, I think I 635 00:37:31,640 --> 00:37:34,460 found my groove in managing, which I've always, I've always 636 00:37:34,460 --> 00:37:37,400 sought leadership positions in other organizations I've been 637 00:37:37,400 --> 00:37:40,840 in. Like, it's kind of, I'd say that is part of my nature. Like, 638 00:37:40,840 --> 00:37:45,460 I don't, I don't think that was something that I never had, but, 639 00:37:45,880 --> 00:37:48,820 but, yeah, like, before, prior, like, doing the actual research. 
640 00:37:48,820 --> 00:37:53,440 You know, there are times, like, for every pro, there's, you 641 00:37:53,440 --> 00:37:57,040 know, a double-edged sword of, like, what I'm getting in 642 00:37:57,100 --> 00:38:00,400 applied research that I didn't have in academia. Like, they're 643 00:38:00,400 --> 00:38:03,600 flip sides of the coin, right? So, like, yeah. I don't 644 00:38:03,600 --> 00:38:06,540 have to be as theoretical. But then all of a sudden, I'm, like, 645 00:38:06,600 --> 00:38:09,240 doing something quick and dirty that, kind of, like, 646 00:38:09,360 --> 00:38:12,060 is barely research, just because the stakeholder needs 647 00:38:12,060 --> 00:38:16,260 it. And then there are other times, like, when I'm teaching 648 00:38:16,260 --> 00:38:19,200 people sample size, I'm like, I feel like we should kind of all 649 00:38:19,200 --> 00:38:23,840 know this; like, this is kind of important. And, like, I do 650 00:38:23,840 --> 00:38:28,280 sometimes miss the more meaningful work. So yeah, like, 651 00:38:28,940 --> 00:38:32,240 especially being at IBM, especially during the rise of 652 00:38:32,300 --> 00:38:37,760 AI, I feel like my work is somewhat impactful, but the work 653 00:38:37,760 --> 00:38:41,860 I did at FIU was research to mitigate wrongful convictions. 654 00:38:41,920 --> 00:38:45,760 So there are, like, a lot of times I miss that. That was really 655 00:38:45,760 --> 00:38:51,460 cool research. It made me feel good as a person, you know. And 656 00:38:52,120 --> 00:38:54,940 like I said, it's not that I'm never getting that at IBM. It's 657 00:38:54,940 --> 00:39:00,300 just not as, like, societally impactful, I guess. Or I guess 658 00:39:00,300 --> 00:39:04,440 you could argue otherwise, with AI coming up. I don't know, but, 659 00:39:04,980 --> 00:39:05,160 well, 660 00:39:05,160 --> 00:39:06,960 Tim Houlihan: But you're influencing a lot of people. 661 00:39:06,960 --> 00:39:09,780 There's a lot of users out there. There are a lot of 662 00:39:09,780 --> 00:39:12,660 customers that end up being influenced by your 663 00:39:12,660 --> 00:39:15,060 Alexis Mook: work, right? Yeah, and, as I said earlier, 664 00:39:15,060 --> 00:39:18,840 in a quick fashion, right? Like, 665 00:39:19,020 --> 00:39:23,660 an insight I developed on Friday of the week before could be in 666 00:39:23,660 --> 00:39:26,420 the product, like, the next Friday, and then people could be 667 00:39:26,420 --> 00:39:29,600 using it next week. And I'll say one thing we're trying to get 668 00:39:29,600 --> 00:39:32,120 better at, and some teams are really good at this, is 669 00:39:32,720 --> 00:39:38,540 instrumentation. So, like, actually tracking usage from the 670 00:39:38,540 --> 00:39:43,060 back end: clicks, where they went, all that stuff. You can 671 00:39:43,060 --> 00:39:46,780 kind of see the impact of your research when a product is 672 00:39:46,780 --> 00:39:49,780 instrumented like that. And that stuff's very cool to me too, but 673 00:39:49,780 --> 00:39:52,480 also that's because it's just more data and numbers that 674 00:39:52,480 --> 00:39:56,440 I can play with. So like I said, I'm more on the nerdy numbers 675 00:39:56,440 --> 00:39:59,320 side than some other researchers. But, yeah, 676 00:39:59,320 --> 00:40:02,640 that, Kurt, is really cool. And, like I said, the impact's 677 00:40:02,640 --> 00:40:06,960 quicker than it would be in academia.
So for every pro over 678 00:40:06,960 --> 00:40:09,360 there, there's a con over here, and all that good stuff. But 679 00:40:09,780 --> 00:40:12,480 I'll say, in general, like, I'm on a really good team right 680 00:40:12,480 --> 00:40:16,860 now, so it makes my life easier. And I really like helping 681 00:40:16,860 --> 00:40:20,600 people, and being a manager is mostly just helping and teaching 682 00:40:20,600 --> 00:40:24,740 and guiding them to be good researchers. So I never wanted 683 00:40:24,740 --> 00:40:27,740 to be a teacher-teacher, but I feel like I like this position 684 00:40:27,740 --> 00:40:31,400 where I'm more like mentoring from a distance. But yeah, 685 00:40:31,400 --> 00:40:34,160 it's like, yeah, I think I found my groove, at least 686 00:40:34,160 --> 00:40:34,700 for now. 687 00:40:35,960 --> 00:40:38,060 Kurt Nelson: Well, we've had conversations 688 00:40:38,060 --> 00:40:42,640 with a couple different people on the show about big-P purpose 689 00:40:42,640 --> 00:40:46,600 versus little-p purpose. And it sounds, to a degree, like at FIU 690 00:40:46,600 --> 00:40:49,540 there was some big-P purpose: it's wrongful 691 00:40:49,540 --> 00:40:53,800 convictions, we're working to fix this. But at IBM, it's like, 692 00:40:53,920 --> 00:40:56,080 we're getting to work with these 693 00:40:56,080 --> 00:40:59,380 people, and I'm having this mentorship ability, and 694 00:40:59,380 --> 00:41:02,640 leading, and doing some, you know... You 695 00:41:02,640 --> 00:41:05,460 are impacting a lot of people with the products and 696 00:41:05,460 --> 00:41:08,040 different things you're making, but the work that you're really 697 00:41:08,340 --> 00:41:12,060 kind of glomming onto is, like, I'm making a difference in, you 698 00:41:12,060 --> 00:41:15,720 know, the smaller circle of people. Would you agree? Or am 699 00:41:15,720 --> 00:41:17,280 I way off base there? 700 00:41:17,280 --> 00:41:20,600 Alexis Mook: No, absolutely. Yeah. And I'll say, like, 701 00:41:20,780 --> 00:41:24,860 that's when I have a good day, right? Like, when I come home 702 00:41:24,860 --> 00:41:27,800 and I'm like, I've had a good day, it's because one of my 703 00:41:27,800 --> 00:41:32,060 researchers was having some sort of problem or an issue with a 704 00:41:32,120 --> 00:41:35,060 teammate, or just needed help with something, and I, like, 705 00:41:35,720 --> 00:41:39,560 taught them, well, not taught, but, like, guided them in what I 706 00:41:39,560 --> 00:41:42,040 thought they should do, and, like, next steps. And then they 707 00:41:42,040 --> 00:41:45,340 tell me, like, Oh, that made so much sense. Like, you 708 00:41:45,340 --> 00:41:47,800 helped me so much. Thank you. Like, that's a 709 00:41:47,800 --> 00:41:52,480 happy feeling, right? And I kind of get that, like, it's not 710 00:41:52,480 --> 00:41:55,780 every day, but, like, I get that kind of frequently. And it's 711 00:41:55,780 --> 00:41:59,680 even beyond the researchers, to, like, my larger team. We work 712 00:41:59,680 --> 00:42:05,040 with design managers, design directors, our VP of design, you 713 00:42:05,040 --> 00:42:08,580 know. They're a really supportive community as well. And, you know, 714 00:42:08,580 --> 00:42:13,140 like, sometimes they'll come to me and be like, Hey, your 715 00:42:13,140 --> 00:42:17,340 researcher said this, but, like, that seems wrong.
Like, I'm not 716 00:42:17,340 --> 00:42:19,920 trying to throw them under the bus, but, like, walk me through 717 00:42:19,920 --> 00:42:24,500 it. And then I can be like, Well, yeah, you 718 00:42:24,500 --> 00:42:28,460 shouldn't be doing the usability test yourself. So, yeah, 719 00:42:28,460 --> 00:42:32,000 we're helping a larger team as well. So there are 720 00:42:32,000 --> 00:42:35,840 a lot of good days, some bad days, you know? But 721 00:42:36,620 --> 00:42:39,380 Kurt Nelson: As is always the case with managing, 722 00:42:39,440 --> 00:42:39,740 yeah. 723 00:42:39,740 --> 00:42:41,800 Alexis Mook: It's like, if one of my researchers is having a bad 724 00:42:41,800 --> 00:42:45,820 day, I end up having a bad day, usually, because I 725 00:42:45,820 --> 00:42:50,260 want them to be, you know, happy and effective and, like, enjoying 726 00:42:50,260 --> 00:42:54,580 their work as much as they can, right? Like, I don't know if I 727 00:42:54,580 --> 00:42:58,780 should say this, but I will: I'm a big believer that you 728 00:42:59,140 --> 00:43:03,660 work to live, not live to work, right? So, like, I just 729 00:43:03,660 --> 00:43:06,720 want to make sure they're, like, having as great of a work 730 00:43:06,720 --> 00:43:10,140 experience as they can, while understanding it's just work, 731 00:43:10,140 --> 00:43:13,680 right? So I'm constantly telling them to, like, take vacation. 732 00:43:13,740 --> 00:43:17,160 Like, on a stressful day, I'm like, 733 00:43:17,160 --> 00:43:20,100 take Friday off. Please. Like, do not burn 734 00:43:20,100 --> 00:43:21,200 out on me. I need you. 735 00:43:23,060 --> 00:43:26,360 Tim Houlihan: Absolutely. I want to switch to a totally 736 00:43:26,360 --> 00:43:30,380 hypothetical question here, Alexis. If you were stranded on 737 00:43:30,380 --> 00:43:34,700 a desert island, let's say for a year, and you could bring 738 00:43:34,760 --> 00:43:39,380 a listening device, but it only had two musical artists on it, 739 00:43:40,280 --> 00:43:41,680 which two would you select? 740 00:43:41,740 --> 00:43:44,560 Kurt Nelson: Their catalogs of music. You get everything 741 00:43:44,560 --> 00:43:45,100 they've done, yeah. 742 00:43:45,940 --> 00:43:48,940 Alexis Mook: Everything they've ever done, yeah, not just, like, one song? 743 00:43:48,940 --> 00:43:55,360 Okay, this is good. This is good. Miley Cyrus. She's an 744 00:43:55,360 --> 00:44:01,380 inspiration. I saw her live at ACL, and it changed my life. And 745 00:44:01,560 --> 00:44:05,820 let me think, who do I listen to a lot of, a whole album, too? 746 00:44:05,820 --> 00:44:09,780 That's me digging deep in my brain. I might go 747 00:44:09,780 --> 00:44:15,420 Blink-182. Okay, get a little pop, a little punk, yeah. And 748 00:44:15,420 --> 00:44:17,940 they have so much music, it would keep me entertained for a 749 00:44:19,080 --> 00:44:23,120 Tim Houlihan: long time. So still variety-seeking. Yeah, of 750 00:44:23,120 --> 00:44:27,020 course. Of course it does. I love that. Well, Alexis, it is 751 00:44:27,020 --> 00:44:29,480 absolutely a pleasure to have you as a guest on Behavioral 752 00:44:29,480 --> 00:44:32,180 Grooves. Thanks for spending time with us today. 753 00:44:32,180 --> 00:44:34,100 Alexis Mook: No, thank you guys so much. It was awesome talking 754 00:44:34,100 --> 00:44:38,120 with you. As I said, I've heard the podcast before.
I've heard good 755 00:44:38,120 --> 00:44:41,380 things through our friends. So yeah, happy to help you guys 756 00:44:41,440 --> 00:44:42,820 out. Thank you. 757 00:44:51,280 --> 00:44:53,260 Kurt Nelson: Welcome to our grooving session, where Tim and I 758 00:44:53,260 --> 00:44:55,840 share ideas on what we learned from our discussion with Alexis, 759 00:44:55,840 --> 00:44:59,560 have a free-flowing conversation, and groove on whatever else 760 00:44:59,560 --> 00:45:04,380 comes into our user interface brains. Yeah. 761 00:45:04,380 --> 00:45:06,840 Tim Houlihan: We are kind of user-interface-centric, aren't 762 00:45:06,900 --> 00:45:10,200 we? Like, we're looking for ways to connect with the 763 00:45:10,200 --> 00:45:14,640 world through our own user interface, our hands and 764 00:45:14,640 --> 00:45:15,660 eyes and 765 00:45:16,680 --> 00:45:18,660 Kurt Nelson: ears. Not where I was going, but yeah, that's 766 00:45:18,660 --> 00:45:19,380 okay. 767 00:45:21,960 --> 00:45:24,080 Tim Houlihan: That's how I'm seeing it. How are you seeing 768 00:45:24,800 --> 00:45:27,320 Kurt Nelson: it? Just seeing it as, like, you know, we're 769 00:45:27,320 --> 00:45:32,300 focused outward on people, and our brains are all about the 770 00:45:32,300 --> 00:45:37,760 user internally, and how we do that. So, yeah. A little 771 00:45:37,760 --> 00:45:39,860 Tim Houlihan: green man that sits in the back of the brain, 772 00:45:39,860 --> 00:45:40,220 and, 773 00:45:41,540 --> 00:45:43,840 Kurt Nelson: yeah, I mean, if you've seen enough Disney 774 00:45:43,840 --> 00:45:47,020 movies, you know that there's a little guy inside your head 775 00:45:47,020 --> 00:45:50,620 running it. Anyway. 776 00:45:51,520 --> 00:45:56,740 Tim Houlihan: Okay, so where are you on this? You want to... 777 00:45:56,740 --> 00:45:59,200 Kurt Nelson: So I think we talked about this up front, right? I think 778 00:45:59,200 --> 00:46:01,200 there are a couple different things, and one of the 779 00:46:01,200 --> 00:46:06,720 big things is about how confirmation bias really can 780 00:46:06,720 --> 00:46:15,900 have a negative impact on any type of UX design work. Right? 781 00:46:15,900 --> 00:46:20,580 This idea that our preconceived notions 782 00:46:22,260 --> 00:46:24,320 kind of determine 783 00:46:24,320 --> 00:46:28,280 how we view the research that we do, or even the 784 00:46:28,280 --> 00:46:32,540 design that we do. And that, I think, is really an important 785 00:46:32,540 --> 00:46:36,260 piece to take into consideration. And I loved 786 00:46:36,260 --> 00:46:40,780 Alexis's comments about making sure that you have a third 787 00:46:40,780 --> 00:46:45,040 party, an unbiased third party, coming in and looking at this 788 00:46:45,040 --> 00:46:45,400 stuff. 789 00:46:45,820 --> 00:46:49,000 Tim Houlihan: Yeah, it's because we tend to favor information 790 00:46:49,000 --> 00:46:53,260 that confirms our own preconceptions, right? It's what 791 00:46:53,260 --> 00:46:58,060 helps us go... I mean, I think sometimes confirmation bias has 792 00:46:58,060 --> 00:47:02,280 been called one of the biggest problems, the mother 793 00:47:02,280 --> 00:47:04,980 Kurt Nelson: of all biases, as I like to say, yeah. 794 00:47:06,360 --> 00:47:09,180 Tim Houlihan: As Kurt refers to it, the mother of all biases.
795 00:47:09,780 --> 00:47:14,820 But I think it was research from, what was it, Lord, 796 00:47:14,820 --> 00:47:19,320 Ross, and Lepper in 1979 that showed how people, when shown 797 00:47:19,380 --> 00:47:24,260 the same data, the same story, interpreted it 798 00:47:24,260 --> 00:47:27,200 differently based on their prior beliefs. And 799 00:47:27,200 --> 00:47:30,740 Kurt Nelson: they interpreted what the paper said and what 800 00:47:30,740 --> 00:47:33,080 they took out of it. So, and I can't remember 801 00:47:33,080 --> 00:47:36,800 exactly what it was, it was pro-gun versus, you know, gun 802 00:47:36,800 --> 00:47:40,040 control, and they read the same paper and they said, Oh yeah, 803 00:47:40,040 --> 00:47:44,080 see, the paper supports my view, whether that view was, we 804 00:47:44,140 --> 00:47:47,260 need to take guns out of people's hands and have more 805 00:47:47,260 --> 00:47:51,040 control, or, no, we need to have, not necessarily 806 00:47:51,040 --> 00:47:54,760 more guns, but fewer restrictions on guns. Same paper, 807 00:47:54,760 --> 00:47:57,880 same information. But the way 808 00:47:57,880 --> 00:48:01,800 that our brains operate, at a subconscious level, is to 809 00:48:01,800 --> 00:48:05,160 actually filter the information, or taint that 810 00:48:05,160 --> 00:48:08,520 information in how we read it, so that those things that 811 00:48:08,520 --> 00:48:12,600 support our pre-held beliefs actually get enlarged, get 812 00:48:12,600 --> 00:48:16,860 exaggerated within our brain, and those things that go against 813 00:48:17,160 --> 00:48:21,260 those pre-held beliefs get minimized or shrunk within our 814 00:48:21,260 --> 00:48:24,320 brain. Thus we get two different interpretations. 815 00:48:24,560 --> 00:48:28,820 Tim Houlihan: So if the product owner is going to say, We need to 816 00:48:28,820 --> 00:48:31,700 do some research to make sure that this new user interface is 817 00:48:31,820 --> 00:48:35,000 going to have the kind of impact that we want, it's really 818 00:48:35,000 --> 00:48:38,900 important for that product owner to actually separate the 819 00:48:38,900 --> 00:48:42,220 research from the product ownership team, so that the 820 00:48:42,220 --> 00:48:46,240 research is done independent of that, so that whatever comes 821 00:48:46,240 --> 00:48:48,940 back isn't easily dismissed because it doesn't agree with 822 00:48:48,940 --> 00:48:52,060 their priors, or easily confirmed because it does agree 823 00:48:52,060 --> 00:48:53,440 with their priors. Or, 824 00:48:53,500 --> 00:48:55,840 Kurt Nelson: you know, I think Alexis even brought up this idea 825 00:48:55,840 --> 00:49:00,120 that, hey, they had a boss who said, I'm going to 826 00:49:00,120 --> 00:49:03,180 determine the participant list, the people that we are going 827 00:49:03,180 --> 00:49:07,620 to talk to for this study, and then the results didn't 828 00:49:07,620 --> 00:49:12,360 line up with his expectations, and even after that, he was, I 829 00:49:12,360 --> 00:49:16,560 think, going back and forth. And so we tend to do these things, 830 00:49:16,620 --> 00:49:21,620 and they happen at a level that is below our consciousness at 831 00:49:21,620 --> 00:49:25,160 many points, and so we don't realize that it's going on. And 832 00:49:25,160 --> 00:49:28,940
And 832 00:49:25,160 --> 00:49:28,940 so if you ask us somebody, it's like, no, I'm not biased on 833 00:49:28,940 --> 00:49:31,640 this, or no, I'm not going to that's not going to impact this, 834 00:49:31,820 --> 00:49:33,740 when, in fact, it actually does so. 835 00:49:34,280 --> 00:49:36,680 Tim Houlihan: One of the other things that I like to call 836 00:49:36,680 --> 00:49:40,780 attention to, that we want to emphasize here, is this idea of 837 00:49:40,780 --> 00:49:44,620 the relationship between overconfidence and power, and 838 00:49:44,620 --> 00:49:48,700 that it regrettably all too common to see these two 839 00:49:48,700 --> 00:49:52,420 correlated, that the higher you go in the organization, the more 840 00:49:52,420 --> 00:49:55,480 power you have, the greater your overconfidence as well. And 841 00:49:55,480 --> 00:49:59,560 overconfidence isn't always a good thing. It can get in the 842 00:49:59,560 --> 00:50:04,860 way. Of making good decisions, right? So, so I think that this 843 00:50:04,860 --> 00:50:08,460 is something that Alexis kind of pointed to. And there was a 844 00:50:08,460 --> 00:50:12,960 study by fast sivanathan, Mayer and Galinsky from 2012 that 845 00:50:12,960 --> 00:50:16,860 actually showed how power increases confidence more than 846 00:50:16,860 --> 00:50:21,860 competence. Power is let me just say that, again, power increases 847 00:50:21,860 --> 00:50:24,980 confidence more than it increases competence, 848 00:50:25,040 --> 00:50:28,100 Kurt Nelson: so just more than confidence increases confidence. 849 00:50:28,160 --> 00:50:29,600 So, yeah, yes, right. 850 00:50:29,780 --> 00:50:32,420 Tim Houlihan: So the higher you go in the organization, doesn't 851 00:50:32,420 --> 00:50:34,460 necessarily mean you're smarter, 852 00:50:35,000 --> 00:50:38,240 Kurt Nelson: you're not more competent, yet you're more 853 00:50:38,240 --> 00:50:42,160 confident, confident. And actually the there's an that's 854 00:50:42,160 --> 00:50:44,920 an interesting point you make up, because there might be more 855 00:50:44,920 --> 00:50:49,480 competent people inside of the organization who have an 856 00:50:49,480 --> 00:50:53,620 expertise in this particular line of business, this 857 00:50:53,620 --> 00:50:59,080 particular research, whatever it would be, who don't have that 858 00:50:59,080 --> 00:51:03,480 same confidence because they don't have the power or the 859 00:51:03,480 --> 00:51:08,220 level of authority within the organization, whereas that 860 00:51:08,220 --> 00:51:14,160 senior vice president or that President, or even the executive 861 00:51:14,160 --> 00:51:19,380 director, will have more confidence, And that is counter 862 00:51:19,380 --> 00:51:24,320 intuitive and counterproductive many times, because you're going 863 00:51:24,320 --> 00:51:29,660 to overpower right so, and it's hard to speak truth to power if 864 00:51:29,660 --> 00:51:32,120 you don't feel confident about it. And the person that you're 865 00:51:32,360 --> 00:51:36,500 speaking to is very confident. It's that, as you said, 866 00:51:36,500 --> 00:51:41,200 overconfidence can lead to misalignments where I can't be 867 00:51:41,200 --> 00:51:45,580 wrong. I'm feel really confident about this, and so you don't 868 00:51:46,540 --> 00:51:51,400 take information from others, and so people feel intimidated 869 00:51:51,460 --> 00:51:56,860 to speak their truth to you. And then you also have the, you 870 00:51:56,860 --> 00:51:59,740 know, confirmation bias coming in then too. 
So they 871 00:51:59,740 --> 00:52:03,660 Tim Houlihan: play together. So if you are a leader that feels 872 00:52:03,840 --> 00:52:08,640 immune to error, this is an opportunity to add some humility 873 00:52:08,640 --> 00:52:12,300 to your game, improve the way that you lead, and 874 00:52:12,420 --> 00:52:15,660 improve your decision making, simply by being open to 875 00:52:15,660 --> 00:52:19,200 the idea that maybe, in this case, your gut isn't 876 00:52:19,200 --> 00:52:22,220 right. Right? Yeah. And 877 00:52:22,220 --> 00:52:25,040 Kurt Nelson: lastly, I mean, we talked about this multiple 878 00:52:25,040 --> 00:52:28,520 times, kind of just in passing, but there's this idea of having a third 879 00:52:28,520 --> 00:52:32,000 party be involved for their objectivity, right? Yes: 880 00:52:32,060 --> 00:52:36,680 separating design from research, separating the review of 881 00:52:36,680 --> 00:52:41,860 findings from the research itself. You know, there's all of 882 00:52:41,860 --> 00:52:46,240 this that comes into it, and we all have blind spots. We all 883 00:52:46,240 --> 00:52:52,600 have biases that we may not be aware of, and that 884 00:52:52,600 --> 00:52:57,940 can be mitigated to a certain degree if we can bring in 885 00:52:58,540 --> 00:53:03,360 objective third-party people at the right moments. 886 00:53:03,840 --> 00:53:09,000 Tim Houlihan: Yeah, absolutely. Pronin, Gilovich, and Ross wrote 887 00:53:09,000 --> 00:53:11,760 about that bias blind spot really beautifully in a 888 00:53:11,760 --> 00:53:16,320 fantastic paper. I mean, Tom Gilovich, Lee Ross, 889 00:53:16,320 --> 00:53:18,960 both of those guys were just monsters, and they wrote such 890 00:53:18,960 --> 00:53:21,500 great stuff. They asked such great questions. 891 00:53:21,560 --> 00:53:24,800 Kurt Nelson: And I love this idea; I think in 892 00:53:24,860 --> 00:53:29,300 that study, there was this idea that we can spot the blind spots 893 00:53:29,300 --> 00:53:33,440 in other people. It's like, oh my gosh, that person, 894 00:53:33,500 --> 00:53:38,900 that friend of yours, who is always going through, you know, 895 00:53:38,960 --> 00:53:43,120 a new partner, and it's always something that the partner did, 896 00:53:43,120 --> 00:53:47,260 and you're looking at them going, Dude, it's you. Ten 897 00:53:47,260 --> 00:53:51,580 examples of this, and there's one constant, and 898 00:53:51,580 --> 00:53:54,280 it's not the partners or anything that they're doing. 899 00:53:54,280 --> 00:53:59,140 It's you, right? And that's just a silly takeaway, but I mean, we 900 00:53:59,140 --> 00:54:01,860 can see it in others much more than we can see it in ourselves, 901 00:54:01,980 --> 00:54:05,880 which goes to the point of why we need that third party, because 902 00:54:05,940 --> 00:54:10,020 they're going to be able to see past the biases that we have. 903 00:54:10,020 --> 00:54:13,440 Now, they have their own biases, but those aren't instrumental to 904 00:54:13,440 --> 00:54:17,160 what we're trying to do in some of these things. So, 905 00:54:17,160 --> 00:54:20,780 Tim Houlihan: takeaways: smart organizations and good leaders 906 00:54:20,840 --> 00:54:25,460 separate their product ownership from their product research.
907 00:54:25,640 --> 00:54:28,580 Those are separate teams, and they're the most successful when 908 00:54:28,580 --> 00:54:30,980 they occupy different spaces and they have 909 00:54:30,980 --> 00:54:32,120 different objectives. Yeah. 910 00:54:32,720 --> 00:54:35,840 Kurt Nelson: Yeah. So brand and product development are separate 911 00:54:35,840 --> 00:54:39,680 from the people doing the user research and understanding, 912 00:54:39,680 --> 00:54:42,820 because I might go, No, this is the best way that this product is 913 00:54:42,820 --> 00:54:45,460 going to be, and we're going to do it, and the researchers are 914 00:54:45,460 --> 00:54:48,040 going, No, that's not how people see it, right? That's not how it 915 00:54:48,040 --> 00:54:50,860 works. The other piece is that there's an aspect of 916 00:54:50,860 --> 00:54:53,140 psychological safety. We've talked about psychological 917 00:54:53,140 --> 00:54:57,580 safety a lot, but we need to make sure that people can speak 918 00:54:57,580 --> 00:55:00,900 truth to power. That overconfidence bias, that 919 00:55:00,900 --> 00:55:04,860 idea that, hey, the more power I have, the more confident I am: 920 00:55:05,160 --> 00:55:10,500 have some humility and let researchers speak truth 921 00:55:10,500 --> 00:55:12,060 to power, right? Yeah. 922 00:55:12,120 --> 00:55:15,060 Tim Houlihan: You might not be wrong, 923 00:55:15,240 --> 00:55:18,120 right? You're a leader. You might not be wrong, but you 924 00:55:18,120 --> 00:55:21,740 might be wrong. The "I might be wrong" is a really powerful 925 00:55:21,920 --> 00:55:27,260 strengthening device to actually help the team go, Wow, maybe the 926 00:55:27,260 --> 00:55:29,480 boss isn't always right. Right? 927 00:55:29,480 --> 00:55:31,100 Kurt Nelson: And it's the Thinking in Bets idea that we've 928 00:55:31,100 --> 00:55:33,560 talked about from Annie Duke. And I love this piece, and I 929 00:55:33,560 --> 00:55:36,740 think we were talking about this right before we got on to 930 00:55:37,160 --> 00:55:39,980 record this. You talked about, I think it was Roy 931 00:55:39,980 --> 00:55:43,180 Baumeister, who we had a conversation with, and he said the 932 00:55:43,180 --> 00:55:47,680 best days are those days where I find that something I thought 933 00:55:47,680 --> 00:55:51,880 was wrong, and now I know the truth. This 934 00:55:51,880 --> 00:55:56,740 idea of disagreeing with your own prior beliefs, your own 935 00:55:56,740 --> 00:55:59,740 perspective, that's a shift in mindset. And if you can have 936 00:55:59,740 --> 00:56:03,480 that shift in mindset, I think that's a really great piece of 937 00:56:03,480 --> 00:56:07,560 saying, I don't want to be right about my past. I want to be 938 00:56:07,560 --> 00:56:11,340 right about what's right now. I want to be correct. What was it, 939 00:56:11,340 --> 00:56:14,280 would you rather be right or know the truth? I think that was 940 00:56:14,280 --> 00:56:17,520 what Annie said, something along those lines. Let's 941 00:56:17,520 --> 00:56:18,960 know the truth, right? Yeah. 942 00:56:20,460 --> 00:56:23,240 Tim Houlihan: Alexis also kind of challenged us in this area of 943 00:56:23,240 --> 00:56:27,200 thinking about how middle managers sort of bear this burden of 944 00:56:27,200 --> 00:56:30,620 being the translators, like they have to carry this message up 945 00:56:30,620 --> 00:56:34,100 the line.
And I think that that's a hard part of the job, 946 00:56:34,100 --> 00:56:37,160 but it's also an important part of the job. And 947 00:56:37,220 --> 00:56:41,620 they might go about doing that by saying, 948 00:56:41,620 --> 00:56:44,080 Well, we've got the evidence: we used structured 949 00:56:44,260 --> 00:56:47,740 interview questions, we avoided leading 950 00:56:47,740 --> 00:56:51,340 questions, we set this up in a very clinical manner, and so we 951 00:56:51,400 --> 00:56:56,980 ended up collecting really good data. And, you know, if you do 952 00:56:56,980 --> 00:56:59,680 have questions about how to have those difficult conversations with 953 00:56:59,680 --> 00:57:02,400 your bosses, I just want to recommend the book Crucial 954 00:57:02,400 --> 00:57:07,440 Questions. A fantastic team of writers put that together, and 955 00:57:07,440 --> 00:57:12,000 Crucial Questions is a really crucial conversation. 956 00:57:12,060 --> 00:57:15,600 Kurt Nelson: Crucial Conversations, you mean. I've 957 00:57:15,600 --> 00:57:18,060 never heard of the Crucial Questions book. Is that... no? 958 00:57:18,060 --> 00:57:22,220 Cool, yeah. 959 00:57:22,220 --> 00:57:23,840 Tim Houlihan: Do you want to have those conversations? You want to prep yourself for having those tough 960 00:57:23,840 --> 00:57:27,200 conversations with your boss? Crucial Conversations is a great 961 00:57:27,200 --> 00:57:28,640 book to get you ready. 962 00:57:28,940 --> 00:57:30,860 Kurt Nelson: Yeah. So, Tim, I think we can wrap this up and, 963 00:57:30,860 --> 00:57:36,560 again, reiterate this, right? What struck me most is that I 964 00:57:36,560 --> 00:57:42,760 don't care how well-meaning you are as a leader; there are a 965 00:57:42,760 --> 00:57:47,860 number of things that can unintentionally derail your good 966 00:57:47,860 --> 00:57:51,700 intentions. And it might be that, you know, you might be 967 00:57:51,760 --> 00:57:55,480 really good at your job, but you're holding too tightly to 968 00:57:55,600 --> 00:57:59,380 your own expectations, your own beliefs, about what things should 969 00:57:59,380 --> 00:57:59,440 be. 970 00:58:00,040 --> 00:58:03,300 Tim Houlihan: Yeah, Alexis made this crystal clear also: it's 971 00:58:03,300 --> 00:58:07,320 not just about 972 00:58:07,320 --> 00:58:10,320 research methodology. That's right, exactly. It's about human 973 00:58:10,320 --> 00:58:12,480 behavior. Yeah, right. Ultimately, it's really about 974 00:58:12,480 --> 00:58:15,540 human behavior. And so the more that we build systems that 975 00:58:15,540 --> 00:58:19,140 recognize our biases, the more effective our teams and our 976 00:58:19,200 --> 00:58:20,180 organizations will 977 00:58:20,180 --> 00:58:23,180 Kurt Nelson: be. Yeah, because I can tell you that, you know, with 978 00:58:23,180 --> 00:58:28,880 you and me, our methods for research are... yeah, yeah. Let's make 979 00:58:28,880 --> 00:58:32,420 this good. Here we go. All right. So if you're a leader, if 980 00:58:32,420 --> 00:58:34,940 you're a designer, or if you're just someone who cares about 981 00:58:34,940 --> 00:58:39,620 doing really good work, pause just a second. Take a 982 00:58:39,620 --> 00:58:45,700 couple breaths and ask yourself, Am I listening to the data, or 983 00:58:46,420 --> 00:58:52,660 am I just looking for validation that I'm right?
And we want to 984 00:58:52,660 --> 00:58:56,800 make sure that you're listening to the data. And lastly, we want to 985 00:58:56,800 --> 00:59:03,660 encourage you to listen to the data, or maybe listen to your 986 00:59:03,660 --> 00:59:06,960 heart. Oh, what would this be, Tim, if I asked you to check out 987 00:59:06,960 --> 00:59:11,400 our YouTube site and not just listen to us, but get to see our 988 00:59:11,400 --> 00:59:14,880 ugly mugs? If you haven't already, you can do your own 989 00:59:14,880 --> 00:59:19,200 research and say, Do I like the YouTube channel better than I 990 00:59:19,200 --> 00:59:26,060 like it just in my headphones? Do I like seeing Tim and his big 991 00:59:26,060 --> 00:59:30,680 old smile and Kurt's bald head? That might lead 992 00:59:30,680 --> 00:59:32,300 you to say, I think I'd rather just 993 00:59:32,300 --> 00:59:36,140 Tim Houlihan: read the newsletter. It might lead you to 994 00:59:36,140 --> 00:59:39,560 just say, I'm just going to subscribe on Substack, because 995 00:59:39,560 --> 00:59:40,600 that's enough for me. 996 00:59:41,860 --> 00:59:44,920 Kurt Nelson: Or maybe it just means, I'm going to join the 997 00:59:44,920 --> 00:59:48,160 groove community, the Behavioral Grooves community on 998 00:59:48,160 --> 00:59:51,640 Facebook, because, you know, I can interact, but I don't have 999 00:59:51,640 --> 00:59:56,200 to hear their voices, nor do I have to see them. But I 1000 00:59:56,320 --> 00:59:59,860 can actually have a conversation. We do. I love 1001 00:59:59,860 --> 01:00:04,020 the Facebook community, and I love how it's expanding, and how 1002 01:00:04,200 --> 01:00:09,240 we are having conversations and reply-and-response pieces that are 1003 01:00:09,240 --> 01:00:11,700 going on there. It's fantastic. Absolutely, 1004 01:00:11,940 --> 01:00:15,060 absolutely. All right, so if this episode sparked something 1005 01:00:15,060 --> 01:00:18,000 in you, we hope that you'll take some insight from our 1006 01:00:18,000 --> 01:00:22,580 conversation with Alexis and go out there and find your groove.