Podcasts

Ethical AI, chatbots and digital coaching

Digital
Episode: 7

2020-07-28
Decoding AQ with Ross Thornley Feat. Nicola Strong

Show Notes

Nicola Strong is a trainer and facilitator, the director of Strong Enterprises Limited, and a senior consultant at the Institute for Ethical AI, Oxford Brookes University. Nicola and host Ross Thornley talk about supporting people through change, the debate between augmenting with AI versus going fully autonomous, and Nicola's extensive research into chatbots. The pair also discuss the builders' and the users' responsibility when interacting with technology, and how tech can perhaps do the gathering and filtering and allow humans to be more effective and less overworked.


Timestamps

  • 1:09: Nicola’s studies at Surrey University
  • 2:58: How beginning her career from an academic start point shaped Nicola’s career
  • 3:52: How technology has changed and evolved work over the past few decades
  • 6:03: How digitization is affecting coaching, learning and training
  • 9:28: Nicola’s research and discoveries about chatbots
  • 14:27: Ethical dilemmas with chatbots 
  • 20:50: Saberr’s CoachBot
  • 23:30: AXA's Amber chatbot
  • 26:42: How humans can prepare for e-coaching
  • 28:06: Tips and advice to get started in this technology

Full Podcast Transcript

Episode 8 - Decoding AQ with Ross Thornley Feat. Nicola Strong - Ethical AI, chatbots and digital coaching

Intro

Hi, and welcome to Decoding AQ, helping you to learn the tools, mindsets, and actions to thrive in an ever-changing world.

Ross  

Hi and welcome. I am so excited to be joined today by Nicola Strong. She's a coach, explorer, and learning expert, and I had the pleasure of first meeting her and sharing a lovely dinner after CogX in London. And then Nicola kindly came and sat on our panel at our June event called Unlearn. And we've been getting to know each other, so welcome.

Nicola

Thank you, Ross.

Ross 

I always like to start off with a little bit of the journey that people have been on. And for you, I think what intrigues me was that, some decades ago, shall we say, you studied Change Agent Skills and Strategies, I believe that was at Surrey University. And I just wanted to start there: what was that, was it something quite pioneering at the time, or had it been running for a while? Give me a little bit of context to where that started at university for you.

Nicola  

I was introduced to the Change Agent Skills and Strategies Masters by a very good friend of mine who said to me, this is life-changing. It's a really robust degree. And it will give you, I think he said, a good grounding in what are considered to be good models, theories, and practices in change management. So I thought to myself, let's do it. And I found myself experiencing probably one of the most important lessons, and that was what it actually felt like to change.

So what I was inviting other people to do was something that perhaps I hadn't actually really experienced myself. So I was made to experience change while doing the degree and then, obviously, reflect on that with my fellow participants. And that I thought was really valuable. I think it brought a sort of grounded, guiding clarity to what I thought was intuitively right. And I think it gave me a much better idea about how to support people through change.

Ross  

I love that. I like this balance between, you know, the academic world that looks at theory, and then learning by doing, the practical application. And this opportunity you had of experiencing what you were learning about, through the journey you were taking, is quite a nice, subtle piece to recognize and reflect on. And I'm interested in how this has shaped your thinking since then. So you went, you did, you experienced, as many of us do when we go into our careers being led from an academic standpoint. How has that shaped you? And if it didn't, why?

Nicola  

I think it did. It was a tough journey, I'd say, but I have a better awareness than I had of biases, of boundaries of knowing, of paradigms and paradigm shifts. And I think when you're asking people to adapt, when you're asking people to make changes in their lives or in their work, it's about having that awareness. So I think it was very valuable, and it has definitely made a difference to the way I work.

Ross  

Have you seen, in the last couple of decades, a difference that has come about because of technology? We hear a lot in the media, and I've certainly experienced this acceleration that's coming about because of everything being digitized: oh, we've got a digital strategy, all of these pieces that are affecting every area of work. How have you seen that evolve and change?

Nicola  

Well, I mean, I started off working as a trainer, a facilitator, and an enthusiast in change rather than a change agent. And I've certainly, in my world, seen a dramatic change in terms of the way people learn. I mean, obviously, I started off with lots of face-to-face work; I would do courses that were five days long. But of course, I got much more interested in virtual teams and remote teams. And there was an isolation in that.

And I think what technology has done, of course, on the positive side, is bring people together and allow for more of a community of practice, a sort of hive thinking, to happen. And I think that's very, very exciting. But it's not necessarily something that we would have done perhaps even 10 years ago. So I think it's revolutionized the accessibility and the personalization of learning and communicating, involving much larger numbers of people. And I like that because it brings an inclusion into the sort of paradigm of what learning is. So yes, there's just been huge change. And I know a lot of people have questioned it, but I think I've seen lots of good things that have come out of it.

Ross  

A lot of this is about our own perception and mindset. Whilst digital technology can be the reason why change is happening, it can also be the tool and support to deal with that change. So whether that change requires new learning, and digital can be part of how that's deployed, is quite an interesting point in itself, isn't it: the existing activity requires change, and then digital enables that change to take place in somebody. In terms of what you touched on there about training and learning, and that aspect of change and adaptation, what are your views on what we should expect? We've done this quick visit to our past that has brought us to today. But if we took a view of the next few years, and you've done a fair bit of research in this, which we'll get to in a bit, what's your vision of tomorrow in terms of how these tools and how digitization are affecting coaching, learning, training, those areas?

Nicola  

I'm with Professor David Clutterbuck on this one. I know there's an interest in having totally autonomous systems that are our assistants or our coaches as isolated entities. But I think that'll swing back, once we've visited that end of the pendulum swing, to something far more augmented: there will be coaches who will, I suppose, augment their practice with supportive autonomous systems of some sort. And I think that's much healthier, from my own point of view, in terms of how we're implementing AI. And I'm very excited about AI, artificial intelligence, and all the other various digital options that we have available to us. I do, however, think that we are going to be augmenting rather than going autonomous.

Ross  

Interesting. And I think you described this pendulum shift, and I see that a lot: we have to go and explore the edge and the fringe to then realize whether that's the place we want to be. Just because we can automate something, or we pursue expanding that to replace, we then find, do we want to? It might be because it relates to the outcome, you know, so augmenting might deliver a better outcome. But it can also be about choosing how we want to do this, how we as a recipient want to learn, and how a knowledge entity, human or machine, chooses to deliver that. And I think that's a key part of the future, for me anyway.

Nicola  

I so agree with you, Ross. I really do. There are some things that I've been looking at where, actually, it's quite interesting. I mean, I used to look at people, you know, eyeball to eyeball, and we'd have a conversation; coaching would be all about listening to somebody with full attention, and looking at them with that curiosity, with that interest. But of course, now we're seeing the eyeball purely as a mechanism for accessing the mind. So it's quite interesting how we've transferred our focus from the sort of normal human interfaces into things that are far more penetrating.

Ross  

And trying to look deeper. I guess the eye really is the window to our souls, you know, if that's the way into the mind, and we're trying to discover and understand a little bit more of what influences us, our beliefs and set mindsets, and how we evolve through these transitions of thriving and surviving, of growth and coping, you know, all of those lovely little moments in the ribbon of change, from anxiety to excitement and everything in the mix. I'm fascinated to get this opportunity to share with some of our listeners about your research, and particularly in the area of chatbots. These have become something that I'm sure many of our audience have experienced, and they'll have their opinion of very bad experiences and maybe some surprising ones.

And of course, technology is accelerating. What I'm interested in is this difference between chatbots as they might have been deployed early by organizations in customer services, as a sort of replacement for the FAQ and question-and-answer kind of process, and chatbots moving into coaching. So can you tell us and share with us a little bit more about what the research was, what maybe some of the highlights of that are, and what we can learn from what you've discovered?

Nicola  

I think my research has... I have various pockets of curiosity that I follow, and sort of delve into and wander around in. I think it started off when I had to do a talk for the Association for Business Psychology. They asked me to do a talk and I said, well, okay then, I'll do one on: if a robot had a personality, would it improve our productivity? That I thought was pretty important, because that's where the focus is for me. And then I got more curious about, well, okay then, if we look at, say, Heather Knight's work around robots and comedy. She did some really exciting work around that. She was looking at robots and AI reading human behavior. And she was saying things like human behavior tracking, body language, behavior systems, task-based interactions, gaze, personality, they were all things that could be decoded in some form.

And then I thought, well, actually, I need to do the same with humans. How do humans read an AI or a robot? And I thought to myself, well, of course, there's work being done around the uncanny valley, you know, how we visually perceive a system, how it moves, how confident we feel about the way it moves around. Whether it really understands us, whether it could make sense to us, whether it could demonstrate an understanding of who we are, and what sort of intelligence it would display. Did it have any common sense? And was it safe to work with? I think some of the robots that I had stood next to in presentations, people found alarming or frightening. And then, of course, there's its functionality and its level of autonomy. And of course, you've got the question of to what extent the bot or the autonomous system is in charge. So I looked at those two areas: where an AI is looking at a human, and where a human is looking at an AI. And that, I think, really started to resonate in terms of where I went in further research, to then looking at the wider landscape. And I do recommend that people look at some of the fascinating research that's been done on some of the products that are now available to all of us.

You can see how they're using elements of that to try and build the relationship, the coaching relationship, in a way that the human then feels supported, because that's what it's all about. That's what you're trying to achieve. And with that in mind, I've been working with a wonderful chap in South Africa, Dr. Nicky Terblanche, and he's been working on a chatbot called Vici. She's... and I call her 'she', isn't that interesting, because of the name.

Ross  

I'd be interested, you know, has anybody done any research into the names and personalities of chatbots, and how many of them are female? Because every one I seem to come across appears to be female, you know, they have female names. Amber, you know, the chatbot with AXA. You mentioned Vici. We've been working on one, and our working title is Ada. That's just an interesting slight segue.

Nicola  

Definitely. And there are some people that have real issues around that, around gender and chatbot voices, their demeanor, the language, and of course, obviously, their name. So that's a huge conversation we must have on another occasion. What Vici is doing is, she's acting as a coach, she's an e-coach. So she's looking at conversation, she's looking at ways that she can support you in a traditional coaching scenario. She's right at the beginning of her research. Nicky, who has been developing this, has been working in this area for some years; he's a professional coach at the University of Stellenbosch in South Africa. So he's very keen that Vici has truly coaching behavior, a series of behaviors. That's very exciting. I would recommend you exploring that. I know he's coming up to Vici 3.0. The other one which I've been exploring is Echoborg, or I am Echoborg, which is a conversational bot.

And I am Echoborg is much more philosophical. It's supposed to be a recruitment bot, but in fact the person speaking to this particular AI is invited to consider the philosophy of life, to consider the relationship between humans and intelligent machines. So it's taking a far more interesting conversation into value sets, into the philosophy of life and humanity. I've been experimenting with that too. And I think also, working with some of the guys at Saberr, I've been very curious to see how Saberr and CoachBot are working in organizations now, developing their coaching bot. So yeah, quite a variety, quite a few.

Ross  

I want to hop on the Echoborg piece, because I had the opportunity at an event that we were both at in London, the AI and robotics conference. And it was fascinating to see what kind of person turned up in the interview. So we had the audience, and they would go and be interviewed by this AI, and of course, you might know that person, Dave or Peter or Sally or whoever goes up, and do they behave as they would towards me, or towards someone else, or are they manifesting a part of their personality because they feel it's either what's expected, or they want to test the system in a different way? So I found some of them being very rude and obtuse, in a way I would be so cringing at if it was another human.

On the other side, I'm thinking, how can you speak to someone like that? Because I'm putting the emotional feelings into the machine on its behalf, and thinking, if you were that rude, I'd send you out. And on the other side, there were people who showed compassion, irrelevant of whether it was a human; they just showed compassion, because that's who they wanted to be and how they wanted to behave. And so I found, coming back to your first comment, it's not just about how the AI interacts with us, it's how do we interact with it? Those were two very interesting viewpoints on that. And I guess that balances with the fact that we talk about ethics a lot in AI, and how do we build ethical AI? What I observed in that example was unethical behavior in the human towards the AI, and one of the dangers of that is if we're using every interaction as a learning point, not just from the databases that we give it and the curriculum we give it, but how people interact with it. That's the interesting part of the journey, I guess, with coaching specifically, and with these e-coaches, as you put it.

Nicola  

I think you make a really important point. I think we do need to think about how we interact with these systems, because they are remembering, they're recording every aspect of the way that we're talking, not just what we say. I can assume that they're not restricted to the text of my conversation with you now. I have a feeling that, you know, very soon we're going to be looking at intonation, we're going to be looking at the sort of language, the way that I speak; it's all part of the way I am, and that's what they're seeking to understand. So I think we do need to be more patient with these new systems, and understand that a system is going to use every piece of data it can in order to respond.

And I am worried about that. One of the areas that I found really interesting, almost as a continuation of the research element, is that I've been curious about how you can support people with disabilities. And in some ways there are ethical challenges and accessibility challenges there, because the sort of information that the system receives from somebody can sometimes be seen as an anomaly. So somebody who's trying to interact with a system who may not, in fact, be able to speak clearly, or has maybe a facial expression that can't normally be read by a facial recognition system, or may only have a limited number of words available to them in their vocabulary, will all have an impact on how we're using AI for people with disabilities. And I think it's so important to value somebody's difference in that way and to respond to it. But at the same time, you need a lot of data to do that, which means you're going straight into the sort of GDPR and privacy issues; you're going into a whole load of sensitive data that you wouldn't normally draw from somebody. And I think that conversation must be had soon. I think Scope, the charity, are looking at this in their big hack, and I'm really excited about and supportive of their work there. That conversation has to happen.

Ross  

It does. And I think it's also each individual's responsibility to behave in an appropriate manner, you know. I find it fascinating, you know, when I communicate with Alexa, for example, I'll be saying please and thank you, because that's how I was brought up. You know, my parents told me that's the way you communicate. I've said before in a couple of other podcasts, you know, I have a different voice when I'm talking to my pet dogs, to my mum, to my wife, to other people; as you said, the intonation changes, the language I use changes, because it's contextually aware.

And I think we have a responsibility when we are interacting with technology to think about it as: how would I interact with my child or my grandchild? What is appropriate or not appropriate? Because it's learning from me. And it's not just the responsibility of the builders, it's the responsibility of the users, because they are part of its education system. And so I think that is an interesting future that I see about the ethics: if we view it as a child, how would you treat it if it was a vulnerable individual that doesn't know right from wrong? How do you help it, not just rely on its creators at the beginning? So I think that's perhaps a whole other big conversation we could have.

Nicola

Conversation number two?

Ross

Conversation number two, yeah. Take note. I'm interested to see and share: where today are we getting some successes? Where are digitized versions of coaching, and we've been talking about chatbots in particular, helping their audiences, helping employees? And what could we take away if we're in a company and we want to maybe give more accessibility, more reach, more support? We're seeing stress and the impact of change, and we want to leverage the training and coaching opportunities of these tools. Who would we look to? What's working well? Maybe you could share a couple of stories or examples there.

Nicola  

Do you mean a particular product, or a particular company?

Ross  

Yeah, any company, anywhere, where perhaps you've seen through your research or your observations that it's benefiting employees using these types of technologies. It would be great to hear if you've got examples.

Nicola  

Yeah, sure. The one that I am really impressed with, actually, is Saberr's CoachBot. I know they're working in NatWest, Siemens, KPMG, and Vodafone. What they're doing is a typical coach bot that's augmenting the professional coaches in the banking area. What CoachBot has been doing is looking at team performance. How they do it is that the bot will be informed of all the people involved in the team, and the bot will send out messages inviting them to make it clear what they want from learning, or what they want from working together on the team, or from developing values.

So essentially, the bot collates all the data about what the team thinks it would like to talk about; it will then summarize that and send it off to the coach, the coach will review it, and it will inform the coach on how to make the next team intervention. Once the coach, and we're talking about a human coach here, has decided where they think the focus could be, they will then send a message out to all of the team, who can again submit any data that they think might be useful to the coach in looking after their interests.

So there's a lovely sort of symmetry between the human coach wanting to support the team, and the sort of ongoing support that the CoachBot can bring to the team in their day-to-day activities around how they align that to the values of the organization. And it's been hugely successful. It's improved team performance, it's improved clarity of expectations, it's improved the overall team dynamic. So there's a real ongoing feedback mechanism that was missing before in some of the team coaching that was done just face to face.

Ross  

I think that's an interesting part of this future that we're going to adapt and change to, where we are augmenting human and technology together and recognizing the unique abilities of both. And in the example you gave there, perhaps technology can do the gathering and the filtering, and then allow the human to be more effective, rather than overworked, overwhelmed, or diluted. It reminds me of a similar piece that AXA in Malaysia did with a chatbot called Amber. The challenge they had was a very high HRBP ratio, you know, in terms of the number of business partners in HR per employee; I think it was well over 400 employees per HR business partner.

So their ability to know who was at risk, from change, from issues, from any support that was needed, was very limited. And with the sporadic pulse surveys, or these types of things that might try and capture it, a lot was going through the gaps. They weren't able to do it with the limited resources they had. And their chatbot wasn't a CoachBot as such; all it really was, was using AI to get that, as you put it, feedback loop.

So it was sentiment analysis that then allowed them to effectively categorize those that it felt were at risk. It was then a kind of 1 to 11 ratio, where the people, the human part, could go in and put the interventions in. So rather than the technology trying to do the interventions at this stage, it was just being able to understand that, and they had some huge benefits that they talked about in terms of retaining a massive proportion of employees who were predicted at risk. You know, risk from burnout, stress, overwhelm, needing support to go through that transition that was happening.

And so I think that's an interesting part, isn't it, as we go through this evolutionary journey, where the pendulum at one end is fully automated, you know, and autonomous, to where we both perhaps believe it should be, which is in partnership and in harmony with those things. I'd love to close out, round out, on a couple of areas if I can. We touched on this balance between how not only technology evolves, but how we as humans evolve in a technology-driven world. What are the key aspects that you see around adaptability, the best of humans, that they can work on? You know, in our pre-conversation we talked about resilience; I felt I'd picked up a bit of a cold travelling yesterday, and you said, ah, that'll build up your resilience for Christmas, you know, these types of things. What things on the human side can our audience maybe focus on and work on to help prepare them for what's coming ahead, in your view?

Nicola  

Is that in terms of the augmented coaching?

Ross  

Yeah, so if that's coming, and we're now going to face a future where there are going to be more digital coaches and e-coaches, how might I prepare myself for that? How do I prepare my workforce for it? How might I start to think about what I should be doing? You mentioned curiosity and looking at the research and these bits. What could I practically do to help me understand what's happening, and how can I think about what's even possible with these types of tools?

Nicola  

You've already said some key ones, I think. Resilience, and I think there's a frame of mind here. The idea of being open and having a mindset where you're perhaps open to an unusual way of doing something. The idea of following a process or a procedure is less likely these days; we're much more likely to need the ability to see through our current perceptions about how the world works. There's an awful lot that's being said, and this is very much a Western view; the rest of the world doesn't hold so much of this view of technology. Optimism for what technology can do and optimism for what we can do as humans, I think that's important, and to be open to notice that.

Ross  

So perhaps the mindset is key: be open, be curious, have hope and optimism. Those are really great ways that someone can show up. If someone was an HR leader, you know, and they were hearing this and going, oh wow, that's interesting, I hadn't even come across these e-coaches or what was possible with these chatbots, what sort of tips or advice to get started would you give them with these technologies? Is it read this paper, read this article, or go and play with this particular online tool? Where might they begin that journey?

Nicola  

I'm glad you asked me this question beforehand, because it is actually quite a deep question. I think there are a couple of things that would be useful. I'm of the view that if you ask somebody, the person whom you want to support, what they are doing already for themselves, are they using any support tools, that will give you a very good idea of the extent to which your staff or your team are already open to this sort of tool or platform. I think it's also worth looking at the current coaching landscape; there's a lot going on in the professional coaching arena around this, a lot of movement and curiosity and papers, so certainly do some research around that. So talk to your team and ask them where they're at, and then I would look at the coaching landscape. And the last one is, of course, that digital chatbots are changing all the time; there are new things coming in with extraordinarily exciting ways of seeing life.

And so I would recommend, when you're ready, when you have that idea of where you think you're coming from, it's worth looking at the landscape to see where you're going to. The other thing, on a practical note, is I would have in mind the digital platform and the longevity of it. They're changing all the time: they're being bought out, they're being updated, their usefulness changes. So it's worth looking at how sustainable your digital platform is. I would also be curious about the values, the biases, the cultural norms of your organization: which of those you feel are important, and which you would perhaps not want to pursue, because those will be elongated, they'll be exposed, they'll be stretched by the digital system, and awareness of that will be useful.

I would also be mindful of the accountability issues when implementing a coach bot. Does the accountability sit with the organization? Does it sit with the individual? Does it sit with the data holder? Does it sit with the health and safety officer? There are some really important questions around that. And I think finally, with the increased importance of understanding GDPR, the privacy of data, and the dangers of losing that data, I'd have a quick look at what your company policies are around data security, the relationship you'd have with the coach, and the personalization issues that might come out of that.

Ross  

It's a great list. And I think it's a reflection of where we're at, in the phase where there are perhaps more questions than answers at the moment. And for me, this is for the pioneers at the moment. It's in the deceptive or the disruptive kind of phase, before it then becomes hugely accessible and democratized, so that anyone can use it with all of those things taken care of. So I would encourage people to have the conversation, as you said, ask the questions, and listen to what's happening at the moment.

And as always, from my perspective, you know, if we want to improve something, we have to measure it. So a key component of that for us is measuring the aspects of adaptability and of change, to be able to deploy it in the right way. It's been a real pleasure. If people want to look you up, you're on LinkedIn as Nicola Strong: an amazing speaker, a curator of research, and a builder of this harmony between technology and humans, helping us all create the sort of environments we want to be part of. I've really enjoyed it, and I look forward to seeing you again very soon. And just a thank you and a virtual handshake, before the haptics can make it real. So thank you.

Nicola  

Thank you very much for inviting me. It's always a pleasure to chat with you. I always have an interesting conversation. So yeah, thank you very much.

Voiceover  

Do you have the level of adaptability to survive and thrive in the rapid changes ahead? Has your resilience got more comeback than a yo-yo? Do you have the ability to unlearn in order to reskill, upskill, and break through? Find out today: uncover your adaptability profile and score your AQ. Visit AQai.io to gain your personalized report across 15 scientifically validated dimensions of adaptability. For a limited time, enter code "Podcast65" for a complimentary AQme assessment. AQai: transforming the way people, teams, and organizations navigate change.

Outro

Thank you for listening to this episode of Decoding AQ. Please make sure you subscribe in your favorite podcast directory, and we'd love to hear your feedback. Please leave a review, and be sure to tune in next time for more insights from our amazing guests.