The Digital Dose

Reflections of the 47th International Mental Health Nursing Conference

September 22, 2023 · Prof Rhonda Wilson & Oliver Higgins · Season 1, Episode 4

Imagine a world where artificial intelligence is not just a convenience, but a necessity for healthcare. How do we ensure trustworthiness in AI? How do we equip everyone with the necessary skills to navigate digital health resources? These are some of the crucial questions we tackle in our engaging conversation with Jette Sørensen from Denmark. She brings to the table thought-provoking insights from the recent Australian College of Mental Health Nurses Conference, where the buzz was all about the surge of interest in digital health literacy within the mental health field.

Critical thinking and reflection are key when using AI, chatbots, or any other digital health tool, and we explore this in depth with Jette. We investigate the role of public health institutions in ensuring everyone can access digital health resources, and discuss the courage shown by mental health nurses as they embrace digital mental health interventions. With the anticipation of future advancements in this exciting space, we close with the question: are we ready for what the next 12 months will bring? Join us for this fascinating exploration of digital health literacy in mental health, and the promise it holds for the future.

Brown, Benhamou, May, & Berk (2020). Machine Learning Algorithms in Suicide Prevention: Clinician Interpretations as Barriers to Implementation. The Journal of Clinical Psychiatry, 81(3). DOI: https://dx.doi.org/10.4088/JCP.19m12970

Support the Show.

Follow us at @digitaldosenews


Oliver Higgins:

Hello and welcome, everybody, to the Digital Dose podcast. This week we will be reflecting on the recent Australian College of Mental Health Nurses Conference, held down in Melbourne. As usual, I'm joined by the illustrious Professor Rhonda Wilson and a special guest.

Jette Sørensen:

I'm Jette from Denmark, and I've had the pleasure of being a guest at the conference. It was a very good conference. In mental health, I'm at Newcastle University, together with Rhonda and all her fantastic team.

Rhonda Wilson:

The conference was amazing. I was overwhelmed, overloaded, absolutely saturated with new ideas and with meeting colleagues face to face. It was a very exciting conference, and I have to say there was a real buzz about digital mental health at this particular conference.

Rhonda Wilson:

It was the talk around the coffee, around the morning teas, over lunch, around the posters. There was a lot of good discussion and a lot of really engaged thinking about what digital health means in the mental health, and particularly the mental health nursing, context. So I thought that it was a really fantastic conversation. I think people are ready for it. I think clinicians are mindful that digital health is a real thing and that it is racing along; it's becoming quite prolific, and it is infiltrating people's lives in so many ways that mental health nurses realise they've got to get on board and understand a lot more about their own digital literacy, and about access to digital health services and digital mental health services for the population, the people they care for.

Oliver Higgins:

Yeah, I couldn't agree more. I think, overall, there was a huge appetite for anything digital health. There were quite a number of digital health talks given; it wasn't massive, but there were quite a few, and every one of those generated some level of interest and discussion. I know that throughout, both you and I were frequently approached by people with questions concerning digital health, the components of AI, and health delivery, and very much an appetite to understand further how this interacts with their practice. What does digital health mean for mental health nurses? How can we leverage all these ideas to actually create new solutions, to deliver the high level of care that we are capable of delivering to more and more people?

Oliver Higgins:

So I know that we both presented on some AI components around our previous podcast, around the commercial determinants, and there was a lot of interest, with people there trying to understand further what was actually occurring in the space. But what I found, and I'm sure you would have found similar, is that afterwards I was getting very specific AI-related questions, and it was interesting to see where people were at, because they're clearly embracing AI, they're embracing digital health, but they're trying to put the pieces together: how do I do this in my practice? I look after people with X, Y; how do I bring that in, or how do I apply that? So there's a huge, huge appetite, and I think it's only going to grow as we see technology and its uptake increase in the coming year. People are very curious about it and want to know more and increase their knowledge of this area, so I think that's a good thing.

Jette Sørensen:

There was some more talk about it, and you had a lot of talks, and you also, Rhonda, had a lot of talks about it, a lot of questions. But I think it's a good way to network and to help each other worldwide to get along with these questions about AI and how we use the technology in the best way to meet people's needs. Yeah, I think so as well.

Rhonda Wilson:

I think one of the themes of the questions and comments that came my way was: can we trust it, can we trust AI? And that was a really, really interesting discussion. Oliver, you're a bit of an expert on trust and trustworthiness.

Jette Sørensen:

Yeah, yeah, can I supplement with a little bit? Because actually I'm a little bit scared about it, and I think I sit in front of one of the men who can answer my question or discuss it with us, because I'm not alone in feeling a bit scared. Yeah, you're not alone at all.

Oliver Higgins:

I think we'll touch on it briefly, and we probably should do a whole podcast just devoted to this topic, because it's really pressing that we understand trust. If you look at Singapore, the European Union, and I think even Australia, there are various frameworks coming out saying that AI must prove its trustworthiness. We've broken this down in an article that will actually be published very soon, talking about the fact that when we look at our clinical tools, so, you know, we're mental health nurses, so we talk about the HoNOS, but, you know, in general practice, the Waterlow, you know, name a tool, we don't talk about those tools being trustworthy. We look at their rigour and validity. So when we actually start looking at AI and start using the term trust, which is an inherently human construct, it's actually very difficult to consider the AI trustworthy, and I think this is the conversation that we need to be having more and more as we go forward. Yes, you can consider it trustworthy within the domains of integrity and reliability, looking at it that way, but when you come to the empathetic components of trust, I trust that you will do what you say you do, you know, and these developed relationships, it's really important that we don't seek that from a machine. It's a very complicated machine, but it's a machine nevertheless; it's maths and statistics that sit underneath. So I think part of our conversation about the way in which we utilise AI going forward will be that we need to understand how the AI works. Can it explain how it came to its decision? Can it show me why it thinks? You know, I'm even anthropomorphising it right there, which we inherently do, but can it show the process by which the machine arrived at its outcome? There is other work by Lily Brown, which I'll note in the show notes, talking about this.
You know, it's so important for clinicians to actually understand how it arrived at the outcome that it did to make that recommendation, and I think this is part of what we need to delve into more.

Oliver Higgins:

In digital spaces, not just AI, but any digital tool: how do we actually show how the tool works, its complexities? How is the data stored? How is it used? How does it come to help the person? Is it just an interpretation of a previous tool? Chatbots are a really great example: you can get very complicated chatbots and you can get very straightforward ones. There is research to show that if you're up front in the first place and say you're using a chatbot, people are actually more willing to use it, because they know they're interacting with a machine at that level. And there are some great CBT chatbots that just go through the basics and check in with you every day, and it means you can have one-on-one sessions.

Jette Sørensen:

Yeah, that's right. Use it.

Rhonda Wilson:

When you need it and when it suits you, which is really critical, and that's the flexibility, I suppose, that digital offers. Some of the other comments that came to me were around how we train AI and the quality of the training data. There was quite a bit of discussion at the conference about training data and how we can ensure that it is clinically trustworthy as well, so that was another theme.

Oliver Higgins:

Yeah, I think that's really important, and we'll see it more and more. I think that came up in my talks as well, people asking very similar questions, and they relate back to that trustworthiness component. Because if you don't know how the data is classified, can you actually trust the outcomes? If it's classified by non-clinical people, or if it's bad data to start with, it's garbage in, garbage out; you're going to perpetuate more of whatever is there. Along similar lines, I got asked a question about systems being biased, and it's the same answer: if you've got an inherently biased system in the way you do your practice and you get the data, the data is going to perpetuate that bias, because that's the data that's there and being used. So we have to weigh the quality of all of that when we start building models. Just because we have a model, an outcome, or an intervention doesn't make it any better than current practice if our current practice is already biased.

Rhonda Wilson:

And we attracted some outside interest as well. I think I've never been to a conference before where I was sought out by a non-delegate who heard that I was going to be speaking at a particular conference in Melbourne and drove from out of town just to come and meet us and talk to us about our work, and actually did that two days in a row. So it was really, really interesting to hear that there is such interest, that people are really eager to hear more and more about digital health, and digital mental health in particular. We had a couple of interactions that were really quite unexpected.

Oliver Higgins:

I think the interesting conversation that one ultimately led to was about ethics: how we use, how we build, how things get implemented. It repeated this sort of saying that just because you have a system and you can use it, should you actually use it? I should rephrase that: when we look at utilising something in practice that doesn't actually suit the needs of the person, we come back to the conversation: what does the person need? Are you disclosing to the person that this is AI? Where is the transparency for them to actually understand?

Oliver Higgins:

So, similar to clinicians, who need transparency in how the system works, and any digital system, not just AI, the people who are the end users also need that transparency, and need to know. I mean, I would hate to think you'd perpetuate somebody sitting there using a chatbot for weeks on end thinking they're actually talking to somebody, disclosing things and building a relationship, for all intents and purposes, via this medium, only to suddenly realise that this isn't what they perceived it to be. And so their trust in the system, their trust in the person, there's a whole gamut we could explore just in that space, and it's great that people are seeking us out to have those conversations.

Jette Sørensen:

Yeah, and I think it's a good point, Oliver. But also, you know, you need to have your critical thinking and your reflection together. Not that you can't trust the AI or the chatbot, but you need to, you know, read what the answer actually is, and think about what question you give the chatbot and what the answer is. Because you need to ask a question about, you know, an area I know a lot about. So if I give the chatbot a question, I need to reflect, and I need to read the text critically to see: is it actually right? Is it on topic? Does it actually write about the right thing?

Rhonda Wilson:

Yeah, I think that's a really important point, Jette, because, you know, our clinical judgment, and critiquing the evidence that underpins the AI itself: there's never an occasion for us to suspend our clinical judgment as health professionals. AI and digital solutions need to be critiqued, and so our clinical judgment is really, really important.

Jette Sørensen:

Like everything else, when you read an article, yeah, you also need to be critical. Is it transferable? Do you know the article? What is it about? These things, like you talked about a few minutes ago.

Oliver Higgins:

So yeah, it comes back to having that discussion around digital literacy. It's as much your critical thinking and those skills that we have, but it's applying that: just because ChatGPT said it, or a chatbot, or whatever tool you're using, you wouldn't accept that magical thinking from anywhere else. So it's applying that filter, applying that digital literacy, digital health literacy, which I think is really key. I think that's what people are really crying out for: to actually have that understanding, and have a shared understanding of what that means, not just for mental health nurses, which is of course where we're at, but, I think, for greater interdisciplinary spaces, and to understand how these things relate and work, and how they affect people as we move forward with these types of interventions.

Rhonda Wilson:

And I guess I think that there's a responsibility for health services and the public health sector to ensure that they are assisting with people's professional development in terms of their digital health literacy as well, both health professionals and the general public. These public health institutions actually have a responsibility to make sure that they're bringing everybody along with the development that's required to be able to access these resources.

Rhonda Wilson:

I was really, really taken aback by the courage of mental health nurses, how far along they were in starting to be brave enough to embrace, I suppose at a very early entry point perhaps, but still embrace, digital mental health interventions. And I do think that takes a bit of courage, because it is stepping bravely into the unknown a little bit, a place where our profession hasn't gone before. So I really noticed that people were very interested to take some of those emerging steps and to start to think, okay, well, how do we do this? And to recognise their need to be part of the discussion, that they need to join the discourse, to ensure that we have quality mental health professional and mental health nursing contributions to the development of digital health resources. What do you think about that, Oliver?

Oliver Higgins:

I couldn't agree more. I think there were several people that approached me, and they had some great ideas. At first I was thinking, well, that's a bit of an out-there idea, but stopping and having a conversation with them, putting some context in and understanding what we already know, you could see that they liked the idea; they were moving forward, they could see the next steps. And I think, just as we're talking now, I can't wait for next year.

Oliver Higgins:

If these are the questions that are being asked now, and, as you pointed out, people are brave enough to come and ask and to work through this, we're going to see them move forward, but we're also going to get other people coming in with more ideas and more viewpoints on what digital health means, what all these components mean for mental health, and I think it's such an exciting time to be working in this space. So I really can't wait to see what's going to emerge over the next 12 months. The last 12 months in this space have been huge, and we now have people who see and understand the importance. So, with this rapid rate of technological change, what are we going to see next time? Are we going to see two or three sessions devoted to digital health? Are we going to find that digital health is so broad we actually have to categorise it into AI, interventions, policy, and, exactly, the ethics components of those things?

Oliver Higgins:

So there is a lot to unpack. But if we don't have that level of enthusiasm, if we don't approach it and don't encourage the people that are actually being brave, then we're going to be left in the dark, and it won't be in 10 years, it'll be in two years. In three years things will have changed, or tools will be put into place, and then we'll go: well, that doesn't suit, or that doesn't apply, or how does that work? Because we weren't at the table, you know, driving and making sure that high-quality care is being delivered in a way that mental health nurses can deliver it.

Jette Sørensen:

Yeah, but sorry, I think at next year's mental health nursing conference you should probably have a panel discussion, or you should have a little workshop about digital health. You know, combining talks and some workshops so people can get in touch with some of these things.

Rhonda Wilson:

Maybe that's a great idea. We'll give a bit of a shout out to Dr Kathy Daniels, the scientific director for the conference, because I think that's a great suggestion to take forward: whether there might be merit in having a masterclass or a deep-dive session into digital mental health. But I've come away from the conference really, really inspired, really motivated by all of the ideas that I've heard, the exchange of information, even just the friendly nature of this particular conference, and I'm looking forward very much to the conference again in Perth this time next year. If you're an international listener, or somebody from Australia, please keep the Australian College of Mental Health Nurses Conference on your radar. It's going to be in Perth, Australia, a beautiful place to visit, and there are some beautiful projects coming from that area as well. So I'm inspired to keep powering on in digital mental health in particular, and I think we've got much to look forward to in this very positive future.

Rhonda Wilson:

I'm really, really thrilled to see the way that Australian mental health nurses are bravely embracing some digital mental health ideas, and that makes me very proud indeed. This has been a fabulous podcast, joined by my co-host, Oliver Higgins, and our international guest, Jette Sørensen, from Denmark. We'd like to sign off and thank you for listening to the Digital Dose podcast. We look forward to talking to you again soon.

Exploring the Impact of Digital Health
Digital Health Literacy in Mental Health