The Sales Engagement Podcast

Episode · 3 years ago

Why Data Science Is Critical To Growth In Engagement w/ Pavel Dmitriev

ABOUT THIS EPISODE

We're talking all things data and testing with our VP of Data Science, Pavel Dmitriev. Pavel explains the how and why behind using data science and A/B testing to grow engagement without the guesswork. Tune in!

Welcome to the Sales Engagement Podcast. This podcast is brought to you by Outreach.io, the leading sales engagement platform, helping companies, sellers, and customer success engage with buyers and customers in the modern sales era. Check out salesengagement.com for new episodes, resources, and the book on sales engagement, coming soon. Now let's get into today's episode.

Everybody, welcome to the Sales Engagement Podcast. I'm Mark Kosoglow, VP of Sales at Outreach. On this podcast we talk all things sales engagement, and one thing that's really interesting to me is how we get better at doing sales engagement, not just how we do more of it. With me today I have Pavel Dmitriev, who's our VP of Data Science at Outreach. Pavel, say hi to everybody.

Hi, how is everyone?

So tell me a little bit about what you do. VP of Data Science, is that what you do here?

Yeah, so I essentially lead data science. We have an awesome team here with some data scientists, some data engineers, and some app developers, and we are trying to build features into Outreach which will make the life of reps easier: automate some of the stuff that they don't like doing, make them more effective, improve the things that they are doing, and really help improve that whole sales process.

Right. So it's always interesting to me. Manny Medina, our CEO, told me one day, "Hey man, I'm about to hire some guys that are going to go deep on data and on science and experimentation." I was like, okay, sounds cool. But then, when I met you and Efe, who were kind of our first two people on that team, it became clear to me how important this was. So as you're looking at what you have done in the past with data science, how do you see it applying to sales engagement and what we do here at Outreach?

I think data science will really become critical to optimizing sales engagement, because without data science, what happens is that a VP of Sales comes in, sets up the process, the content, and so on, and then that's what it is. It just stays like that. Things do not improve, or they improve in some kind of very ad hoc way, based on anecdotes, best practices, and so on. It's not clear at all, when someone goes and changes the sequences or some content, whether it actually helps or not.

Yeah.

So what data science can do is bring scientific rigor and measurement to that process, so that when we make changes, when we try to improve something, we can actually verify whether it really improves things or not, and then over time accumulate learning about what kinds of things work, which we can do more of, and what kinds of things are not working, which we don't need to do. What that enables is that the whole sales process keeps constantly improving over time, rather than just remaining static as it is now.

You used to be at Microsoft, right?

Yeah.

How long were you there for?

Eight years.

Eight years. And there's this interesting story that you told me, I think about a developer at Bing, that really opened your eyes... well, it probably didn't open your eyes, but it opened other people's eyes to the fact that, hey, this is important. Tell me that story again.

Yeah, this was an amazing story that actually really opened my eyes to the power of the specific technique called A/B testing. So what happened is, a developer at Microsoft had an idea about how to improve the way ads are displayed in the search results, and the improvement was very simple. He just wanted to take the first sentence from the text of the ad and put it into the title.

It's very simple, literally just a couple of lines of code; it takes like an hour, probably, to develop. So he had the idea, but you know, there are hundreds of developers and PMs, everyone has ideas, and it all goes into this backlog and gets prioritized. This idea did not make it to the top, and he waited for a month, and two months, and three months, and after six months he was like, this is never going to make it. So he pretty much just did the hack over a weekend, in his free time, and then started this scientific experiment called an A/B test. And what happened is that immediately an alert fired in our system. Bing would generate an alert whenever there was some kind of strange movement, and this time the alert was about: we are making too much money. This can't be true; we've never seen a feature which makes so much money, and this is a trivial change. So what was wrong? Well, it turns out nothing was wrong. This feature actually increased Bing's revenue by twelve percent, which was around a hundred million dollars a year at that time. The six-month delay cost Bing fifty million dollars, and no one would have imagined that this feature would do it. Everyone prioritized it low, and it was somewhere down in the backlog.

Yeah, I think we see that a lot with sales leaders. Bringing that back to sales engagement: a rep has an idea, a sales leader has an idea, and it gets pushed to the bottom. Or they have an idea that they think is important, it gets prioritized, they test it, and it kind of gets them off kilter from where they should have been going. But what's interesting to me, though, is you've been here for six months, you've talked to sales leaders, you work with our sales team.

What is something you're seeing that sales leaders do wrong with A/B testing? Because most sales leaders understand A/B testing (hey, I should be testing something different against this thing that we've always done), but give me a couple of things that you feel people have done wrong as sales leaders when it comes to A/B testing, from a scientist's perspective.

Yeah. I think some of it has to do with just being able to properly run an A/B test. It means you have to have enough data, and you actually need to have statistical tests to determine what works and what doesn't. We can't just look at the difference, because that could be due to noise. So just properly configuring and running an A/B test is actually not as easy as it seems, and when we analyzed the data of Outreach and of Outreach customers, we found that very few A/B tests are actually run correctly. Most of them are invalid for one reason or another. So that's one thing.

And another thing is, I feel there's a little bit of underappreciation of the power of A/B testing. Everyone is aware of A/B testing; sales leaders know it exists and they value it. However, they often think of it as just the thing you use to maybe see which subject line on a certain email is better, or what kind of call to action to put into a specific email template. But the power of A/B testing is actually a lot more. You can use it to answer high-level business questions, such as whether video in emails is effective or not, something that we talked about before. You cannot do that with just one A/B test, but you can do it with an A/B testing initiative: a set of coordinated A/B tests run across different scenarios and different aspects of the product.

That really enables sales leaders, I think, to answer the questions they really care about, those higher-level business questions, not just small questions about whether, in a specific type of sequence, in a specific step, something is better than something else.

Right. So this is one thing that really struck me when we met and we started talking: the idea of getting your sales team better and better through testing and experimentation, which is probably the more technical term. It's a lot of work. It's not easy; it's hard, and a lot of people are kind of dabbling in it and doing it sort of right, sort of wrong, maybe mostly wrong. What are the consequences of doing it incorrectly?

The consequences can be very severe, because the results of an A/B test are generally used to make some decision, maybe about changing the process or the strategy, taking the result of this A/B test and incorporating it across, you know, your whole company. And if that is based on wrong or incomplete data, then potentially that decision may be wrong, and depending on how widely it's going to be used, that may actually have really bad consequences.

Yeah. And going back to stories, you told me another story from when you were at Microsoft, about developers' intention of creating good features, and how many of those turned out to be features that actually helped the users or not. Tell me that story.

Yeah, that's kind of the flip side of the Bing story that we just talked about. On the one hand, you know, often good features do not get prioritized very highly. On the other hand, when we looked at all of the features that had actually been built and shipped to users, and we evaluated them using A/B tests...

We found that only a third of them were actually good, features that really benefit users. Another third were just neutral, and the final third were actually harmful: they would lose you revenue or degrade the user experience. And that was really eye-opening, and very humbling too, that when developers and PMs have ideas, though they all believe those ideas are good, they are actually as likely to hurt users as to help them.

I always thought this was so crazy, right? Because on the sales floor, a couple of reps get talking, and then they bring over a couple of their other buddies for lunch, and then they talk to their manager, and they all have the best intention of doing something that helps the sales team. But actually their idea has just as much of a chance to harm the sales team as it does to help the sales team. We just have to get away from our gut, right? We can't rely on our guts anymore as salespeople in order to get things right, because now we have data, we have data science, we have machine learning, we have these things that we can use. Making wrong decisions based on your gut... I don't think people are going to tolerate it much longer. Do you?

Yeah, no, absolutely not. And I would say that there is actually a very natural combination: using your intuition, your experiences, to come up with ideas. We don't want to throw those ideas away. The only real difference that the scientific technique of A/B testing makes is that it allows us to treat those ideas as hypotheses rather than the absolute truth, and then actually test those hypotheses. And in the process of testing those hypotheses, we learn a lot more, which gives rise to new ideas, and that kind of virtuous circle is what A/B testing enables.

Yeah, I think it's interesting.
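Pavel's earlier point, that you need a statistical test rather than just eyeballing a difference, can be sketched in a few lines. This is a minimal illustration only (not Outreach's actual implementation), assuming a standard two-sided two-proportion z-test on reply counts:

```python
import math

def two_proportion_z_test(replies_a, sends_a, replies_b, sends_b):
    """Is the difference in reply rates between variants A and B
    larger than random noise would explain? Returns (z, p_value)."""
    p_a = replies_a / sends_a
    p_b = replies_b / sends_b
    # Pooled reply rate under the null hypothesis of "no difference".
    p_pool = (replies_a + replies_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 100 replies out of 1,000 sends vs. 120 out of 1,000:
# variant B *looks* better, but is it significant at 0.05?
z, p = two_proportion_z_test(100, 1000, 120, 1000)
print(z, p)
```

With these numbers the p-value comes out well above the usual 0.05 threshold, so the apparent two-point lift could easily be noise, exactly the trap Pavel describes.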

I think some people think they're good at picking winning ideas. Like, if I'm a sales manager and five of my reps bring me an awesome idea, I feel like my gut tells me, boom, this is the right idea. And that's just not the case. There's no magic wand; nobody has some kind of magic intuition or magical wondrous gut that just tells you it's right. The only way really to know is the data. As a data scientist, you appreciate that, I'm sure.

Yeah, absolutely. The only real way to know is to test it.

So one thing that you taught me... I remember our friend Efe, who's one of the engineers on your team, did a presentation, and he got up there and wanted to talk about how big a sample size you need in order to notice a specific size of effect that a variant in a test can have, that making a change can have. So walk me through that thing that you guys taught me: if I want to see a change of one size, my sample size has to be this big; if I want to see a change of another size, my sample size has to be that big. Can you explain that for us?

Yeah, that's one of the aspects of setting up a test correctly: having the right amount of data, in statistical terms, the sample size. And the interesting, and maybe a little bit counterintuitive, way to think about it is that the bigger the difference we care about detecting, the fewer samples we need, and the smaller the difference we care about, the bigger the number of samples we need.

Yeah, give me an example. So, for example, if I write an email and that email goes from a ten percent reply rate to an eleven percent reply rate, how big does my sample size need to be for the difference between those two tests to be relevant, and for me to trust it?

Yeah. So there is actually an exact formula to calculate it, but in this case it's about a ten percent relative difference, going from ten to eleven percent, and ten percent is kind of in the middle: it's not very small, it's not very large. It would probably end up being something in the tens of thousands of deliveries. But on the other hand, if you're testing something you expect to bring the reply rate from ten percent up to twenty percent, which is a hundred percent improvement, you really only need a few hundred deliveries.

So if I'm a sales leader running an A/B test in a sales engagement platform, or a marketer running it in marketing automation, and I run, say, two thousand people through a test, most people would say that's a good test. But if my reply rate or my engagement rate has gone from ten to twelve percent, we can't trust those results, because the sample size just isn't big enough to say that's certainly what made the difference.

Yeah, most likely that is not going to be what you call a statistically significant difference.

But in the same vein, if I run a test and after a couple hundred people I see that my reply rate has gone from ten to twenty percent, that actually might be statistically significant, because the change is so large. The sample size can be super tiny, and we can still know that we can trust those results.

Yeah, exactly. And the reason this happens is that there is always noise in the data. There are different people receiving deliveries, say emails of one type or another, and they may have slightly different preferences, and just by chance it may happen that, within that first couple of hundred, there were a few customers who would just be more likely to reply, regardless of what was sent to them, compared to the other group.

So that kind of noise can happen, and it can cause some difference. However, you don't expect noise to cause a very big difference. So if we see a really big difference, we need less data to convince us it's not noise, while if the difference is small, we need to observe more data to really convince us that it's real, that it's not just some random variation that caused it.

Now, one of the things that you've been tasked with here at Outreach is developing machine learning features inside our platform, which is what we call Amplify, that allow us as sales leaders to be more scientific, right? We need help. I'm no scientist. I actually remember I had Chem 14 at Penn State University. It was a four-hour lab every Thursday evening, and I think it was a twelve-week semester, so I did probably ten experiments. Guess how many experiments matched my calculation for the number of grams that was supposed to result from the chemical reactions I was making? Zero. It would be like, you're going to produce twenty-three grams of sodium chloride, and I'd have zero grams, every time, right? So I'm not a scientist at all, but I'm starting to understand and appreciate the value of science. So here at Outreach, tell us about one thing that you've created that is helping sales leaders become more scientific, so that they can actually get off of their gut and get onto the science bandwagon.

Yeah. One thing that we did is we developed a feature that makes A/B testing in Outreach more scientific.

We always had A/B testing, but it didn't actually have science behind it. And you ran a study: what percent of all the A/B testing at Outreach was statistically significant, or was run the right way?

We actually found that it was less than one percent.

So us, one of the experts in email, who developed one of the first sales-ready A/B testing tools, we were only getting it one percent right. Right. So you created something to help us with that, which helps everybody.

Yeah. So what we did is we created this experience that we call guided A/B testing. What it does is that, behind the scenes, it runs statistical tests, and it also runs some other checks to ensure that the experiment is actually valid and correct, and then it will alert the user in the app when the experiment has a winner. In addition to that, it tries to prevent people from breaking experiments, because the reason we found many of those experiments weren't correct is that users would just break them: they would come in and stop a certain template from sending emails in the middle of the test, and then start it again, or something like that. So now we have these warnings: whenever you try to break a currently running experiment, a window will pop up with a red sign saying, do not do that.

It still hasn't popped up on me.

No, actually it has; it's just harder now. So I really think that for sales engagement, Pavel and his team are doing some unbelievable things to help us as sales leaders bring the science into the art of sales. We like to say that Amplify's job is to bring science to the art of sales, and I think he's doing a great job with it. I want to thank you for your time today on the podcast, Pavel. What's the best way for people to get in touch with you if they have questions?

They can email me, Pavel Dmitriev at outreach.io, or connect with me on LinkedIn.

Yeah, great. So if you want to talk to the master, he doesn't have a lot of time, because he's off running experiments to make us better.

But he'll always get back to you and let you know some stuff, and of course we'll keep you apprised of what we're doing here at Outreach to make things easier for you. Thanks for being on today, Pavel.

Thank you.

And that's it for this one. Hey, we'll talk to you on the next Sales Engagement Podcast.

This was another episode of the Sales Engagement Podcast. Join us at salesengagement.com for new episodes, resources, and the book on sales engagement, coming soon. To get the most out of your sales engagement strategy, make sure to check out Outreach.io, the leading sales engagement platform. See you on the next episode.
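As a footnote to the sample-size discussion in the episode: the "exact formula" Pavel alludes to can be sketched with the standard power-analysis approximation for comparing two proportions. This is an illustrative sketch, not the formula Outreach uses:

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Deliveries needed per variant for a two-proportion test to
    reliably detect a change in reply rate from p1 to p2
    (normal approximation, two-sided test)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for a two-sided 5% test
    z_beta = z.inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Small lift (10% -> 11% reply rate): tens of thousands of deliveries.
print(required_sample_size(0.10, 0.11))
# Big lift (10% -> 20%): only a few hundred per variant.
print(required_sample_size(0.10, 0.20))
```

The numbers line up with the episode: detecting a one-point lift takes on the order of fifteen thousand deliveries per variant, while a ten-point lift needs only about two hundred.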
