Description of the video:
All right. Well, welcome to our colloquium. We are so happy to have Dr. Brian Mustanski here. I have an actual thing that I'm reading for you, even though I would normally introduce you a different way, but here we go. It's formal. Dr. Brian Mustanski earned his Ph.D. in psychology from Indiana University in 2004. He was an engaged student and took seriously the opportunity to grow both his research and clinical skills. And I mean that, right? Like you were, from the get-go, here to learn and to grow. And I mention that for all of those who are here early in their career and thinking about how do I have a career that looks like this one day. Well, after Brian's time at IU, he went to UIC, where in 2008 he started IMPACT, the LGBT Health and Development Program, which is a home for research that seeks to improve the health of the LGBT community and increase understanding of the development of sexual orientation and gender identity. He brought this program to Northwestern, where it became a university-level institute known as the Institute for Sexual and Gender Minority Health and Wellbeing. And in 2025, with a nod to their roots, it was renamed the Impact Institute. Dr. Mustanski is also the founding co-director of the NIH-funded Third Coast Center for AIDS Research in Chicago. And as the son of an educator, Brian also noted important gaps in the sexuality education of LGBTQ youth. To address those gaps, he created innovative digital media education projects like Queer Sex Ed for LGBTQ teens and Keep It Up, developed specifically for gay and bisexual men. And for those of you who aren't familiar with sex education in the United States, when you ask young people what they learned, usually only around six to eight percent of them recall having any information about how two women have sex together or how two men have sex together. It's just really pretty non-existent. So to create that is critical. Dr. 
Mustanski has led the longest longitudinal cohort study of LGBT youth and an innovative adolescent extension of the National HIV Behavioral Surveillance Study called Shy Guys, which was funded by the CDC and engaged young men ages 13 to 18. He drew from these and other cohort studies to create RADAR, a dynamic dyadic network cohort study of over 1,000 adolescent and young adult men who have sex with men that examines HIV and substance use through a multi-level perspective. Brian served as the 46th president of the International Academy of Sex Research and is now the associate vice president for social and behavioral research at Northwestern University. He has been a principal investigator of over $70 million in federal and foundation grants and has authored or co-authored more than 375 peer-reviewed scientific publications related to LGBTQ health, adolescent health, mental health, HIV prevention, substance use, and sexual health and behavior. In 2017, NBC News selected him from 1,600 nominees as one of 30 changemakers and innovators making a positive difference in the LGBT community, and they've got a great interview with him online. Dr. Mustanski's work has made it possible for thousands of young people to receive free HIV testing and for those who test positive to be supported in their care. His research participants have described being in his studies as sometimes the first time that they were provided with kind care, which speaks both to the kind of research team that he's built, as well as how far we still have to go to meet young people's needs. All of this is incredibly impressive. But as someone who's known Brian for more than 20 years, I can also share that in addition to being an exceptional researcher and educator, Brian is also a thoughtful, self-reflective, lovely, and fun human being, and a devoted friend and caring colleague to so many. Please join me in welcoming Dr. Brian Mustanski. I could keep going for a while. Thank you, Debbie. 
I'm going to have to find an opportunity to pay it forward and introduce Debbie at some event in the future. Well, thanks for having me. It's a real pleasure to be back here with you. I'm going to try to share a little bit about the arc of my career, share some findings from a couple of studies, and then try to leave a fair bit of time for questions. I think we have at least 15 minutes built in for questions. But also, we're not a huge group. So if there's something, as I'm going through a slide, that you want to ask a clarifying question about, feel free to just raise your hand and we'll address it as we go. All right. So just a little bit about my career path. I did my undergraduate degree at Northwestern University. And during that time, I was introduced in some undergraduate psychology classes to research on behavior genetics. When I was an undergrad, I wasn't sure if I wanted to go into genetics or possibly psychology. And there was a psychology class where they introduced work on behavior genetics. And I was like, oh, wow, this is the fusion of two things that I can't decide between. So why don't I just lean into that? And I was very fortunate to get accepted to the Ph.D. program in psychology here at Indiana University because it combined many things. One, they had some great work on behavior genetics, including these very large studies of twins in Finland. And then also the Kinsey Institute was here. And so I had an opportunity to do work on sexuality and sexual health and learn a lot about providing sexual health services. I worked in a clinic there, providing care for folks experiencing sexual problems. And then also, it was a time when a lot of behavioral scientists were starting to get engaged in HIV research. And there was some HIV research happening at the Kinsey Institute that I was fortunate to be able to get involved in. 
And so as I was finishing graduate school, I was still doing work in behavior genetics, publishing papers using Finnish twin studies, but also getting involved in HIV research. And all the work I'd done on twins was really focused on adolescents and young adults and understanding the development of substance use and health behaviors using that approach. I wanted to continue working with young people, and so I matched for my residency at the Institute for Juvenile Research in the Department of Psychiatry at the University of Illinois at Chicago. And there I had the opportunity to be mentored by some senior faculty who were engaged in HIV research, including Geri Donenberg, who's now the director of the Office of AIDS Research at NIH, and started to do more work on HIV in young gay men. It felt like it was really combining my interest in sexual health with my experience in adolescent health, and so I started to get involved in some very early studies of HIV in young gay men. I went to a conference focused on HIV in young people and presented some of the work I was doing, and I was shocked that at a conference focused on young people, I was maybe the only person focused on young gay men. The majority of the folks were focused on heterosexual young people, with a big focus on heterosexual young women. And I actually gave a talk at the conference where I showed some of the CDC data that about 80 percent of HIV infections in young people in the United States are in young gay men. And people were very disturbed by that. They were like, why are you talking about this? This is not the face of AIDS in our country. Why are you focusing on young gay men? You can't study them. It's too hard. How are you going to find them? How are you going to get parental permission? No, we need to keep the dialogue focused on young women. Bush was president at the time. There was a big focus on HIV in young girls. 
And I was like, this is not going to work for me. This is the biggest group of HIV infections. I was a younger gay man myself at that time, not so much anymore. But, you know, I was like, if no one else is going to do this, or if not that many other people are going to do this, this is what I'm really going to focus my work on. And so that has really shaped my career for the next 20 years. I've done epidemiological studies. I've done studies developing interventions, primarily using digital health approaches as a way to reach people that we might not be reaching through traditional approaches like sex education in schools or clinics or community organizations. As Debbie mentioned, when I brought my research program to Northwestern, it eventually became a university-wide institute, now called the Impact Institute, that's focused on supporting research on LGBT health. We're a very community-engaged, community-based institute. My largest study, called RADAR, which is a cohort study of young gay men, is embedded in the community. We actually run it out of Center on Halsted, which is the LGBTQ community center in the city. So happy to talk about all of these things. And because I came to this work because I wanted to make a difference in addressing HIV in young gay men, I ultimately started getting involved in implementation research, because it really wasn't enough to just publish our findings and say this intervention works. We needed to understand how we can actually get services to people who need them, which is a different kind of question. So, not knowing everyone's backgrounds, I thought I would give a few minutes on just: what is implementation science? Why do we need it? Why did I get drawn into this particular scientific sub-discipline in the last decade? 
And so I'll share a couple examples with you about what the point of implementation science is and some examples of how we've done it. So many of you may know, HIV is still a continuing issue in the United States. There was some evidence that maybe rates had been declining in our country, but actually there was a little bit of an increase in diagnoses in 2022. Possibly, the CDC had estimated, those might have been cases that had been delayed in diagnosis because of COVID and lack of access to testing. But the 2023 data that just came out is actually putting the number even higher. And so it's possible that we're not seeing declines in HIV cases in the United States. It's been, unfortunately, very stable for a long time. And it's stable despite us having highly effective interventions that prevent transmission, that prolong people's lives, and highly effective testing. We also know that HIV cases are not randomly distributed amongst people. Men who have sex with men continue to represent about two-thirds of new HIV diagnoses in the United States, so over 25,000 each year in gay and bisexual men. And layered on top of that are some racial and developmental patterns. The largest group of new cases in men who have sex with men each year is 25- to 34-year-olds, so young adults. But for about a decade, we saw increases in cases in 13- to 24-year-olds, which might now actually be starting to go down. That's still the second largest group in terms of new cases in the United States. So we see these developmental patterns that we really must attend to in terms of new cases. And then there are also racial and ethnic disparities. 2022 was the first time that there were actually more Latino men who have sex with men diagnosed with HIV than Black men who have sex with men. We are seeing perhaps some trends toward declines in HIV cases in Black MSM. 
But we're seeing some big increases in Latino and Hispanic men who have sex with men. So there are racial disparities, developmental disparities, and sexual orientation disparities that are all overlapping. Now, I mentioned before, we're at this point where science has produced tests that are highly effective. We have fourth-generation point-of-care HIV tests that can usually detect an infection within a few weeks. We have PrEP, which is highly effective. We now have injectable PrEP where you only have to get a shot every six months, and there were essentially no HIV infections in people who got the shot. And then we know that treatment is highly effective, not just for prolonging the lives of people living with HIV, but also for preventing transmission: when people are treated for HIV to the point that they become undetectable, meaning you can't find the virus, they cannot transmit the virus to others. So treatment also works as a form of prevention. So we have all these highly effective interventions and tools, but we're not seeing declines in cases. And I think that really speaks to a problem of implementation. And I'll show you just a couple examples of some of the implementation challenges. This is some data from two of my studies: the SMART study, which had over 1,300 teenage gay and bisexual boys in it, which you can see in the blue line, and an earlier study called Guy2Guy, which was a text messaging-based intervention. We see that among sexually active teenage gay and bisexual boys, only about one in five have ever had an HIV test in their lifetime. So these are sexually active young men. We know this is the second highest risk group in terms of new diagnoses. And only about one in five of them have ever had an HIV test in their lifetime. As soon as people turn 18, look, over 90 percent, according to the NHBS data, have had an HIV test. So there's a clear implementation challenge. 
We're not reaching teenagers with HIV testing, and we really need to figure out what the barriers are and what the solutions are to help improve rates of testing in teenagers. I mentioned PrEP, pre-exposure prophylaxis, medication that's highly effective at preventing HIV infection. If it's taken as a pill, it's a daily pill. If it's taken as a shot, it's once every six months. We have seen big increases. Here you can see data. If you're interested in HIV epidemiological data, AIDSVu at Emory University is a fantastic resource. You can get maps of countries and states and counties; it's a really great source of data. We have seen these substantial increases in PrEP uptake in the United States, but it's really not reaching the people who are at the highest risk. So when you look at Black or African-American people, they represent 38% of HIV cases in the United States, but only 15% of the people who are taking PrEP in the United States. Similarly, Hispanic people represent 32%, about a third, of HIV cases but only 18% of the community that's taking PrEP. Meanwhile, white people represent 24% of HIV cases but 63% of the people who are taking PrEP. So there are some real disparities and inequities in terms of how our HIV services are being delivered in this country. And implementation research is really one of our ways to figure out what the barriers and solutions are to addressing these implementation challenges. And, you know, one way to think about it is we have these interventions, we have treatment, we have PrEP, nearly 100% effective. Of course, nothing is always 100% effective, but these are fairly close to 100% effective in terms of preventing transmission. Now think of an intervention that's that effective, and then imagine, say, 50% of clinics decide that they're going to offer PrEP. And actually, that's not so far off, at least in the early days of PrEP. 
With health departments, for example, for the first few years we were only seeing about one in five offering PrEP. And then within those clinics, say only 50% of the providers decide to offer it to their patients, even though their clinic offers it. Then of the patients the clinician offers it to, only half decide to take it. And then maybe only half of those patients keep taking it. And actually, the average duration of people's oral PrEP use tends to be only a few months. People go off it fairly quickly, despite it having very few side effects. So if you take a 100% effective intervention and cut it at 50% four times, you get about a 6% population impact. And these are all challenges with implementation of this highly effective intervention. And so implementation science is asking the questions: What's not happening? Why is it not happening? How do we get it to happen? How would we even know if it's happening in terms of intervention delivery? And it's really about opening up this black box: we have these effective interventions, we want to get to 100% population impact, but we need to be a little more explicit in our understanding of what's happening and the barriers and facilitators to the delivery of this highly effective intervention. Answering that question of being a little more explicit about how delivery is happening is really the charge of implementation science. So I'm going to start with the takeaways of my talk. I'm going to spoil the rest of the next half hour and just tell you some of the lessons that I've learned, and we'll come back to them. One of the challenges, and this is a very commonly quoted statistic, is that it actually takes 17 years for a health innovation to reach routine practice. 
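The cascade arithmetic just described can be sketched in a few lines of Python. This is an illustration only; the 50% stage values are the hypothetical figures from the talk, not measured data. Four successive halvings of a fully effective intervention leave roughly a 6% population impact.

```python
# Hypothetical stage fractions from the talk's cascade example.
efficacy = 1.00              # intervention is ~100% effective
clinic_adoption = 0.50       # half of clinics decide to offer it
provider_offering = 0.50     # half of providers in those clinics offer it
patient_uptake = 0.50        # half of patients offered it start taking it
patient_persistence = 0.50   # half of those keep taking it

# Each stage multiplies down the share of the population actually protected.
population_impact = (efficacy * clinic_adoption * provider_offering
                     * patient_uptake * patient_persistence)
print(f"Population impact: {population_impact:.1%}")
```

The point of writing it out is that no single stage looks catastrophic on its own; it is the multiplication across stages that erodes a near-perfect intervention down to a single-digit impact.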
And only a small percentage of effective health interventions ever even reach routine practice. So we take a very long time from the discovery of a health innovation before it's getting delivered in practice. We need implementation strategies for health systems to improve the reach of effective interventions, fidelity, and sustained delivery of them. We need to understand how health systems work: public health systems, clinical systems. And we also know that we need to address the needs of patients and consumers as well. Do they know about the intervention? Do they know how to access it? Do they know how to pay for it? Are they motivated to use it? Are they motivated to sustain their use of it? If they want to discontinue it, do they have a strategy for how to discontinue effectively? And so, you know, implementation science in its early days really tended to focus mostly on health systems. A lot of the work came from VAs, Veterans Administration clinics, which is a closed kind of system that has a lot of advantages for understanding how to implement something. And then as implementation science has grown, it's also focused more, I would say, on patients and consumers and their understanding. And we published this paper in 2024 in Implementation Science on this concept of adjunctive interventions. So we really classify things as implementation strategies: how are you delivering in the system? Adjunctive interventions are really about how you get consumers and people who are going to use that intervention to want to use it, to understand how to use it, to benefit from it. And, you know, if you're trained as a behavioral scientist, a lot of the theories that we have in behavioral science are about how to change people's motivation, how to instill self-efficacy in using a health intervention. Many of those also work within health systems, right? How do you get clinicians to feel motivated to deliver an intervention? 
How do you get a health system to prioritize a particular innovation? I'm going to talk more about implementation science, and I'm going to get into a little bit of the jargon. One of the things I'll say is, if you're interested in learning more about implementation science, I co-lead the Implementation Science Coordination Initiative, which is a national resource for folks who are doing HIV implementation science in the United States. And we have tons of trainings and videos and handouts and tools. I'm going to show something from a logic model in a second; we have a logic model builder. So feel free to check that out. There are a lot of free resources that are available to everyone on that website. So if I catch your interest in something, but you're like, wow, you blew through that really quickly, go to the website. You can learn more about all of these concepts there. So, some key definitions as I talk about my own implementation research. There's a lot of jargon in our field, but just to define a few things. Implementation science is really the scientific field of studying methods to promote the adoption and integration of evidence-based practices and interventions into routine clinical and public health services. So that's the scientific field; we're studying methods. Implementation research is often really more about the scientific study of the particular strategies. So you might do an implementation research study of what the barriers to implementation are, or what strategies might improve implementation in practice. And then strategies, we use this word in a very precise way to mean approaches, techniques, or methods to enhance the adoption, implementation, or sustainability of an evidence-based intervention, program, or practice. So sometimes I'm going to be talking about the effectiveness of an intervention, and other times I'm going to be talking about the implementation strategies. 
The strategies are about how you deliver the intervention. Put another way: the innovation, the practice, the policy is the thing. Effectiveness research looks at whether the thing works at improving health. So say the thing is a pill to lower your cholesterol; the pill, the statin, is the thing. Effectiveness research is a study where you're randomizing people to get the statin or a placebo and seeing which one lowers your cholesterol better. If the statin works better, it's effective. That's effectiveness research. Implementation is just delivering the thing, doing the thing. Implementation research helps us understand how to help people do the thing. Implementation gaps are where, when, and for whom the thing is not being done. Implementation determinants, we talk about those as the context, the barriers, the facilitators, the conditions that make it harder or easier to do the thing. Strategies are the actions we take to help do the thing, and outcomes are how we know how well or how much we're doing at doing the thing. So those are some definitions. I think the big thing to take away from this is the difference between the thing and the strategies, or the effectiveness of the thing and the effectiveness of, say, the strategies. So we have the thing. We study determinants; this is often something you might do early in implementation research. What's getting in the way of helping people do the thing? What's helping them do the thing? What are the strategies that might improve the delivery of the thing? And then what are the outcomes that would tell us, you know, what percentage of providers are writing a prescription? What percentage of patients are filling that prescription? Those would be the implementation outcomes. We're not looking at whether the thing works anymore. We're just looking at how it's being delivered. Implementation science loves its theories, models, and frameworks. In fact, we actually call them TMFs because we have so many of them. 
There was a review done recently; there are actually over 150 theories, models, and frameworks that have been described in the implementation research literature. Fortunately, in this excellent paper, they broke them down into clusters. So there are the process models that describe the process of moving research into practice. Many of you might be familiar with EPIS: the Exploration, Preparation, Implementation, Sustainment model. Then there are the things that tell you about the context, the barriers or facilitators. We call those determinants frameworks. A really widely used one, which we use a lot, is called CFIR, the Consolidated Framework for Implementation Research. Then you have all the classic theories. If you're a behavioral scientist, you're thinking about stages of change and things like that; those classic behavioral theories are very relevant to implementation as well. We have some implementation-specific frameworks too. ERIC, for example, is one around different types of strategies that you can use for implementation. And then you have your approaches to evaluating how your implementation project is going. One that's very widely used, and that I use a lot, is called RE-AIM, which is an acronym covering the reach of the intervention, the effectiveness of the intervention, the adoption of the intervention, et cetera. So as you're thinking, if I want to learn more about implementation science, sometimes it gets a little overwhelming because you see all these theories, models, and frameworks. How do these even fit together? At Northwestern, we developed this implementation research logic model as a way to put it all together. And on our website, we have a logic model builder. 
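As a rough mental model of the logic model structure being described, here is a minimal sketch in Python. The class and field names are my own illustration, not the actual schema of the implementation research logic model or its builder: determinants (barriers and facilitators) inform the choice of strategies, which are applied to an innovation ("the thing") and evaluated with implementation outcomes.

```python
from dataclasses import dataclass, field

@dataclass
class ImplementationLogicModel:
    """Illustrative container mirroring the determinants -> strategies ->
    innovation -> outcomes flow described in the talk."""
    innovation: str                                      # "the thing" being delivered
    determinants: list = field(default_factory=list)     # barriers/facilitators (e.g., from CFIR)
    strategies: list = field(default_factory=list)       # e.g., drawn from ERIC
    outcomes: list = field(default_factory=list)         # e.g., RE-AIM: reach, adoption, ...

# Hypothetical example values, for illustration only.
model = ImplementationLogicModel(
    innovation="Digital HIV prevention program",
    determinants=["limited clinic staffing", "technology maintenance costs"],
    strategies=["train providers", "provide interactive assistance"],
    outcomes=["adoption by clinics", "reach among eligible patients"],
)
print(f"{model.innovation}: {len(model.strategies)} strategies planned")
```

The design point the sketch tries to capture is that each component is a distinct, enumerable list: the logic model forces you to name your barriers, your strategies, and your outcomes explicitly rather than leaving them implicit.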
So if you're interested in designing an implementation research study, you can use our logic model builder to build out the study, and it'll actually output graphics for you that are ready to go into a grant proposal. And actually, several NIH requests for proposals have suggested that people use our logic model builder as part of their application. So it really lays out, first, what the barriers or facilitators around implementation are, and then what the strategies are. And these are from ERIC, the big domains of different types of implementation strategies. I find ERIC to be particularly helpful when I'm thinking about studying implementation because it lays out many of the options you might use, if not all of them. I mean, everyone starts with training, right? I have a problem; there aren't enough PrEP prescriptions; we're going to train the providers to do PrEP prescriptions. Training, we call it "train and pray": we're going to teach people to do something, and we're going to pray that they're going to do it. That is probably the most widely used implementation strategy. But there are other things you can think about, like interactive assistance, supporting clinicians, or the financial strategies that might support implementation. And then you have the innovation, the thing, the pills, practices, principles, products, and then you have the outcomes of the implementation research project. So that kind of helps put it all together. And as I mentioned, on the Implementation Science Coordination Initiative website, we have some trainings about how to develop a logic model. We have this interactive IRM tool that you can use to build a logic model for a bunch of different types of studies. So that's my Implementation Science 101 in 10 minutes. Now I'm going to talk a little bit about how I've applied it in my own work. 
So as I mentioned earlier, a lot of my intervention work has been around using digital technologies to train, motivate, and give skills to young people to improve their sexual health. This has been, at least as it relates to HIV, a substantial investment from the NIH. So actually, in NIH RePORTER, which you can't go to today because it's down, I think, but if you ever were to search, they have one category called telehealth, which is actually their category for anything that's mobile technology or digital technology. And if you combine telehealth and HIV, you'll see that NIH invested $108 million in just three fiscal years. So we're talking about over $30 million a year in development and studying the effectiveness of digital HIV interventions. Huge investment. We see that as the beginning of the process, right? You apply for funding to test an intervention idea, and things might get into this intervention science loop: I'm developing an intervention, and say it doesn't work. You do the trial; it's not effective. You might say, well, that's because we needed to add this other component. We'll go back to the start. Some things get stuck in this loop, and that's good, because we don't want to scale them if they don't work. Maybe we need to keep optimizing. Then you say, okay, it's effective. And a lot of times in HIV, things move into these compendiums, these lists of the effective interventions that met the criteria. CDC has a list of criteria if you want to be a best-evidence HIV intervention: you must have designed your study with this rigor, the results have to be this strong, and you get into the compendium. CDC has one. HRSA has one for HIV services. And there are 43 digital HIV interventions in the CDC compendium for HIV prevention and 14 in the HRSA compendium for HIV care delivery. So now you're in this compendium. 
And in the ideal scenario, now you're in the compendium and everyone's going to start using your intervention, right? It's on the list. So all the clinics are going to start delivering it, and this digital intervention is reaching patients and having the outcome that you hoped it would. Who votes that this is the way things work in the real world? No, what we see is a lot of challenges, a lot of cones, a lot of breaks in the road. And that's really the job of implementation researchers: to say there are all these challenges along the way, so how can we identify those challenges and how can we create solutions? So if you work at a university and you develop this digital intervention, how do you get it out of the university, right? Like, is IU going to deliver this digital intervention to everyone around the world? Probably not, right? IU's technology platforms were not designed for delivering things in that way. They're designed for students and research. So how do you make it scalable? Maybe it has to be available in multiple languages. It has to be secure, maybe in a different way. Those are implementation questions. How do you maintain the technology, right? Like, I did this sex education program, it's highly effective, but now there's a new browser that it doesn't work with, and everyone decides they want to use that browser, and now nobody can use it anymore. You've got to update the technology. You've got to keep maintaining it so it continues to be secure, which often means updates as we find out about vulnerabilities in different technologies. You also might be in a situation, as I was in the middle of a large study of HIV prevention in adolescents, where the FDA approved PrEP for teens right in the middle of my trial. So we had all this content that said, when you're a little older, you might consider getting on PrEP. And now they can get it today. 
And so, obviously, we have to update the content, because we're not going to falsely inform people about an intervention. So you can't just set it and forget it, put it up on a website and expect it to take care of itself. There has to be maintenance. How do you train organizations to deliver it? How do you reach the end user? How do you support recipients to learn about the intervention and want to use it? These are all implementation research questions, and ones that implementation science can do a lot to address. The problem, as I see it, and we wrote about this in a paper in Current HIV/AIDS Reports, we call it the package could not be delivered, on the state of digital HIV interventions in the United States, is that we can solve all these problems, and there's actually a lot of fabulous implementation research looking at each of these cones along the way, but there's still this chasm: our funders, our health system, are not set up to support the delivery of them. So community organizations often can't get funding to deliver digital interventions. There is no CDC funding to support the delivery of an effective digital intervention on a national level. So we still have some policy problems that we need to push our public health agencies to address, because there's only so much researchers can do. And we have to be realistic about that, right? We can't promise to solve problems that are actually inherent within our health system. But there's a lot that we can do. So I'm going to give you some examples of some implementation research that I've done with the Keep It Up intervention. And this was funded by multiple divisions within the National Institutes of Health as a hybrid type 3 cluster randomized trial. 
Now, I'll talk a little bit more about hybrid trials later if there's time, but a type 3 trial is really focused on how you implement it; while we're implementing it in practice, let's just see if it still works. So you're getting a little bit of effectiveness data, but you're really focused on implementation. Keep It Up, this intervention, has been a long journey for me. I actually got an R34 in 2007. So talk about that 17-year path; this is real. And I was trying to move as fast as I could; I was not delaying in any way. But in 2007, I got an R34 to develop a digital HIV prevention intervention for young gay and bisexual men. We did a lot of focus groups with diverse young men. We did focus groups or individual interviews with staff who delivered HIV testing services. One of the things that was very cool that came out from the young people is they were like, we do not want an HIV intervention where you say: session one, HIV is a virus, and here's how it's transmitted; session two, here's how condoms work; session three, and so on. We need this to be about our lives. We've got a lot going on in our lives. How does HIV prevention fit into our lives? We don't necessarily need the facts; we kind of already know that. Let's skip to the context and the realities of, like, should I use a condom with my boyfriend? We were talking about that at breakfast today. Those are real questions that people have. It's not, should I use a condom? It's, should I be using it in this particular setting? We published the results of that. It showed a reduction in condomless sex compared to a very active control condition, which was sex education. So we did sex education in one arm, and we did this very motivational, interactive digital intervention in the other. We saw a reduction in risky sexual behavior in the Keep It Up arm. I then applied for an R01 and didn't get it. I was like, okay, we are ready to move this on, people; let's do the full effectiveness study.
And I had to submit it several times. And it was like, well, we'd prefer you use this questionnaire rather than that questionnaire. There was a lot of nitpicking, which is how the review process works. But in the meantime, the City of Chicago health department said, we're looking for homegrown interventions that address young gay and bisexual men. We don't have interventions; there are no evidence-based interventions. Does anybody have anything? I said, well, I just developed this, and we have some evidence that it works. And so they actually funded our community partner to deliver this as a service in the city of Chicago. We enrolled over 600 young gay and bisexual men who got the intervention. There was no control group, but we studied it. We did the evaluation; our role was to deliver the technology and to do the evaluation. We actually found it worked better when it was delivered in the community than when it was delivered by us through the university. I then finally got the R01 I'd been wanting, which NIDA and NIMH funded, to do a three-city effectiveness trial where it was delivered through community-based testing sites in New York, Chicago, and Atlanta. It was actually one of the first studies of its kind to have a biomedical endpoint. We did rectal and urethral STI testing in the study, and we did it through the mail. We mailed people kits, and they self-swabbed. Actually, there was some great work at IU happening at the time that helped us realize that was something we could do, that people could self-collect. And this ended up being the first digital study in the HIV space that showed effects on a biomedical endpoint. The guys in the Keep It Up arm had 40% fewer rectal STIs than the guys in the sex education arm. So highly effective, one of the first studies of its kind to show effectiveness on a biomedical endpoint. It was then delivered as a service again, through some funding from the ViiV Foundation, in Jackson, Mississippi.
A service organization in Jackson, Mississippi wanted to deliver it. They sought out funding to get it, and we were like, sure, we'll help you deliver it. So we did another implementation study. And then the question was, okay, we've shown this works in several different ways and through several different deliverers. How do we implement it? I'm fortunate to have a lot of inroads with leadership at the CDC's Division of HIV Prevention, and I was like, okay, we really have to figure out how to get these things implemented. And they're like, no, you need to go figure that out. Write another grant; study the implementation. And so the main study that I'm going to show you results from is a large implementation study looking at how to implement these digital interventions. Before I get into that, I was just going to show you a little bit about what Keep It Up includes. Keep It Up is informed by the information, motivation, behavioral skills theory of health behavior change. It includes a web series, like an online soap opera where you're following a group of young gay men. It's designed for people who just tested HIV negative; that's why we call it Keep It Up. They just got tested, they tested negative, and the goal is for them to keep up their negative status. It was really informed by the reality that our community partners said, we have no resources to do risk reduction with guys who test negative. When someone comes in and tests negative, we say, congrats, you're negative; we'll see you again in six months or whenever you come in for a test. And so we were thinking, well, let's develop something online, and they could at least send people to that as a way to promote prevention. And it has a lot of interactive exercises; I'll show you some pieces of it. If you go to kiu.northwestern.edu or scan that code, you can actually see some of the videos and content from the Keep It Up intervention.
And one of the things that is really key in implementation research is the concept of adaptation and how you adapt things. Keep It Up does not look the same way it did in 2007. In 2007, you know, it had graphics that looked like this. It was very low-res video; I mean, frankly, we shot it on a Flip cam in the dining room at my house. My administrative assistant is holding the microphone. We had a very small budget. We, I think, did a great job with what we had. But we had to update it. We had to keep it relevant. We had to improve the quality of the content. And so I was just going to share a little bit of it with you. These are the different modules in the intervention. And the young people we talked to when we were developing it, as I said before, said, we don't want just the HIV education; we want this to be integrated into our lives. So each module is actually a different kind of context or setting in the lives of young people. It's about, how does HIV fit into dating? What if you're hooking up with somebody from the internet? What if you're at a bar and you used drugs or got too drunk and you're now trying to navigate having sex with somebody? How do you integrate prevention into those different settings? And so that's kind of the larger content.
As we've been adapting and keeping Keep It Up current, one of the things that's been really helpful for us, and I credit my colleague Dennis Lee, who actually wrote a paper about this, is that for all of the modules we have a crosswalk of how each fits into our behavioral theory, the information-motivation-behavioral skills model, and how each module addresses some of those components. So when we're updating the platform or reshooting the videos, we're not just taking the exact same script from the first video and shooting it with a better camera; we're updating it to be a little more current. But any time you're making updates, you have to have a belief about the underlying theory, the underlying mechanism of action, so you keep it relevant to that. We use a framework that's used a lot in eHealth by my colleague David Mohr called Trials of Intervention Principles, which means when we're making updates, we classify things: these are core components that should not be changed, and these are things that must be changed. Like, you know, the browser doesn't work anymore; if you don't update it, it's not going to work, so you must update that. Or there's a change in medical information, and you must make an update. Then there are other things, like how we make people feel vulnerable in a way that would make them want to engage in a prevention program; those might stay the same as we refine and continue to update the intervention. So I thought I would just show you some examples of some of the videos. One of the things that I absolutely love is man-on-the-street videos, or person-on-the-street videos. I include them in all of my interventions. It never lands the same with actual actors. I've worked with so many production companies who, as soon as I say we're going to do person-on-the-street videos, they're like, okay, well, we'll do a casting call, we'll hire the actors. And I was like, no, we're on the street with my dog.
People stop to pet my dog, and we're going to ask them questions, and they're going to sign a consent to be in the video. And the things that they say are just so powerful that I don't think there's any way to recreate them with actors. So this is the same module in Keep It Up 1.0 in 2007 and in the current version. How many examples in that video were there where you could see, like, if I had actors doing it? I mean, that one where the guy turns and says, that's a stalker. I love it. And those two are just amazing. So, you know, that's part of our theory: you connect to them, you want to hear what they're saying. And these are the first videos that people see in the intervention. So, again, they get started and it's not, HIV is a virus and here's how it's transmitted. It's hearing from other people like them talking about the realities of being in relationships, why you'd want to be in one, why you'd want to be single. We're not your mother; we're not going to tell you to lock it down with your boyfriend. There are a lot of different ways that people seek fulfillment, and we're going to try to help you understand how to address all of those forms of fulfillment in the context of also addressing HIV. This is the more refined, newer version. Maybe better, maybe not. You can see it's different people talking, but you get really the same flavor. The production quality is a little better; the video quality is a little better. But, you know, we couldn't just hire actors to say the scripts of what those other guys said. It certainly wouldn't land the same way. One of the other things we built into this is an HIV transmission risk calculator. This was a new approach. In Keep It Up 1.0, we gave people a lot of facts about the risks of different sexual behaviors. As technology has developed, there are ways to do things in a much more interactive way now.
So you can actually click on these things, and it'll give you your epidemiological risk of HIV, the per-act transmission risk, which we've confirmed with the CDC. So you could say, I'm HIV negative, and I'm going to put my eggplant in someone's peach, and it'll tell me my risk of HIV transmission. Then I might say, well, I have an STI, so that actually significantly increases my infection risk. My partner is undetectable; that's going to lower the risk. We're going to use a condom; you can see the risk. I'm going to take PrEP. And we actually find people spend a fair bit of time on this calculator, because we have data on how much time they spend on each page. And they actually do a lot of, oh, wait, so I guess if I'm going to do this with this type of person, maybe I should instead do that, because my risk would be much less and it might still, you know, be pleasurable. And that's what it's all about: we don't want people to sacrifice their pleasure, but we want to encourage them to reduce their risk. So, as I mentioned, there are these compendiums. Keep It Up was one of the first digital interventions added to the CDC compendium of evidence-based HIV prevention interventions. And then our question was, okay, this works. We've shown it works in a bunch of ways. How do we implement it? And so that was really the focus of the Keep It Up 3.0 study. And so we thought, what are the ways that HIV prevention interventions get delivered in the United States? The most traditional way is a health department or the CDC puts out a funding announcement, and a bunch of community organizations apply and say, fund us to deliver this intervention. And that's the classic approach; we call it the CBO-based approach. How are technology interventions delivered, or how are technologies delivered?
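The calculator logic described here, a per-act base risk multiplied by relative-risk modifiers for STIs, viral suppression, condoms, and PrEP, can be sketched roughly as follows. All the numbers and names below are illustrative placeholders, not the CDC-confirmed figures the actual Keep It Up calculator uses.

```python
# Rough sketch of a per-act HIV risk calculator like the one described.
# All base risks and multipliers are illustrative placeholders, NOT the
# CDC-confirmed figures used in the actual intervention.

BASE_RISK_PER_ACT = {
    "receptive_anal": 0.014,   # hypothetical per-act risk for the negative partner
    "insertive_anal": 0.001,
}

MODIFIERS = {
    "partner_has_sti": 2.5,        # an STI raises transmission risk
    "partner_undetectable": 0.01,  # undetectable viral load lowers it sharply
    "condom": 0.2,                 # condom use reduces per-act risk
    "on_prep": 0.01,               # PrEP protects the negative partner
}

def per_act_risk(act: str, factors: list[str]) -> float:
    """Multiply the base per-act risk by each applicable relative-risk modifier."""
    risk = BASE_RISK_PER_ACT[act]
    for factor in factors:
        risk *= MODIFIERS[factor]
    return risk

# Example: receptive anal sex with a condom while on PrEP.
print(f"{per_act_risk('receptive_anal', ['condom', 'on_prep']):.6f}")
```

The multiplicative structure is what lets users explore trade-offs interactively: toggling one factor on or off immediately shows how the combined risk changes.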
You don't go to your local health department to download an app, right? You just do it on your phone yourself. You can do it at two in the morning; you can do it whenever you want. We call that the direct-to-consumer approach. So what we did is we identified 66 U.S. counties that had very high levels of young gay and bisexual men, and we randomized them two to one, so twice as many counties were randomized to the community approach. And then within those counties, we sent out announcements saying, we have money to give you if you implement Keep It Up as a service for young gay men who come in for HIV testing. We're going to give you $40,000 to implement this over two years in your organization, and we're going to deliver the technology, we're going to train your staff, we're going to do all that. Literally, what we want you to do is, when someone comes in and tests negative, if he's a young gay man, offer him Keep It Up, and we're going to give you some money to do that. So we thought that's very pragmatic; that's kind of the way health prevention interventions are often delivered in the United States. The other approach was, we're just going to do it all from Chicago. We'll do advertising on Grindr and other apps. We'll recruit people. We'll mail them an HIV test kit; we already showed we could do that. We'll mail them an STI test kit. We'll deliver the results. We'll refer them to care if they test positive. We'll deliver the intervention. We'll collect the outcomes. And our question is, which works better, or which works better in which ways? And one of the things that you'll see in implementation research is that a lot of times you have a lot more outcomes than you might have in a clinical trial. A clinical trial is like, did the cholesterol go down? How many HIV cases were there? That's really the only thing that matters.
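The two-to-one cluster randomization of the 66 counties can be sketched like this. The county labels, seed, and function name are invented for illustration; the study's actual randomization procedure may well have used stratification or other refinements not shown here.

```python
import random

# Sketch of a 2:1 cluster randomization, as in the study design: of 66
# counties, twice as many are assigned to the community-based (CBO) arm
# as to the direct-to-consumer (DTC) arm. County labels are placeholders.

def randomize_clusters(counties, seed=2020):
    """Shuffle the cluster list and assign the first 2/3 to CBO, the rest to DTC."""
    counties = list(counties)
    random.Random(seed).shuffle(counties)  # reproducible shuffle
    cut = 2 * len(counties) // 3
    return {"CBO": counties[:cut], "DTC": counties[cut:]}

arms = randomize_clusters([f"county_{i:02d}" for i in range(66)])
print(len(arms["CBO"]), len(arms["DTC"]))  # prints: 44 22
```

Randomizing at the county (cluster) level rather than the individual level is what makes this a cluster randomized trial: every eligible person in a county gets the same delivery approach.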
In implementation research, different stakeholders care about different things. Some people care about the cost. Some care about whether you're reaching the highest-risk people. Some care about fidelity. So you'll often look at a lot of different outcomes, and we were interested in those in comparing these two approaches. What I'm not going to talk too much about is the fact that this was my dream study. Literally, my whole career had been building to do this study. I convinced NIH to give me this enormous grant, including me telling them, you give me the money, and then I'm going to run a grant competition for other people to get money to deliver the intervention. I can't even tell you who those people are yet, because I haven't done the request for proposals yet. When it got a great score, the NIH was like, you want us to give you money, and then you're going to give it away to other people, and we don't even know who those people are yet? I was like, exactly. So this was my dream study. It was really designed to address these questions. And then COVID hit. We had just gotten the sites selected, funded, and trained, and then their clinics were closing and reopening and they had capacity restrictions. So we continued along. I'm not going to go into all the details of how the study had to be salvaged, but there were some limitations that you have to roll with. We still learned a ton about how to implement an intervention like this. Our primary outcome was cost per infection averted, because it combines information about cost and about effectiveness, and because you can avert more bad outcomes in people who are at higher risk, it also integrates information about the riskiness of the groups enrolled using these different strategies. So that was our primary outcome.
Our secondary outcomes were really about the context: why might some community organizations do a better job of implementing than others? Maybe some do a better job of enrolling people, or of enrolling people who are at higher risk. Some might do a better job at fidelity in delivering the intervention. And so we did interviews with these 22 community organizations over time. Before they got started: tell us about your organization; how much leadership buy-in is there for this? Is the leader the only person who likes it, while everybody else on the staff thinks we have a lot of better things to do with our time? Or is the whole organization excited about delivering the intervention? We used the CFIR, the Consolidated Framework for Implementation Research; they have an interview guide to collect that information from organizations. This paper recently came out in AIDS and Behavior. Using an interesting analytic approach, which I'm not an expert in, we identified that some of the barriers were around the self-efficacy of the staff at delivering it, whether people were feeling confident or not that they could actually refer people to the intervention, and their motivation. Staff turnover is a big one. We really had to optimize our systems for the fact that there's so much turnover in these community organizations. So how do you train people quickly? How do you have things at people's fingertips so that, when someone leaves and somebody else comes in, they know how to find the way to enroll somebody into the intervention? Some aspects of their infrastructure capacity were important predictors. So if you want to read more about why some community organizations did better than others, this is a great paper. We put together a ton of materials, but we didn't want to just do train-and-pray. We didn't want to do one training and then, congratulations, you know how to do Keep It Up.
So we also put together this dashboard. This is a highly sped-up video of it, because the full video is quite long; it's on our website if you want to see it. The dashboard really organizes things by role: are you a project director? Are you a coordinator? Is all you're doing recruiting, so you just need to put names in the system and never need to follow them after that? Here's your to-do list. You come to the website, you get to work in the morning, you log in, and here are the people you need to follow up with. So we really tried to facilitate the practice of delivering the intervention so that it was right at people's fingertips, and they didn't necessarily even have to know that much about Keep It Up in order to successfully enroll people and track them through the intervention. Again, you can see how we developed this platform, and more about it, on the Keep It Up website. Here's the primary outcome paper, which actually just came out, I think, a few weeks ago. We did a costing analysis. We had a whole team of health economists who did micro-costing. They interviewed each of the clinics to understand the costs that went into delivery. So we gave them $40,000, but maybe it cost them $100,000 because they were using a lot of other in-kind resources. Or maybe we gave them $40,000 and they only spent a little bit of it on delivering the intervention. And so we did micro-costing with all of the clinics, as well as for the direct-to-consumer approach, where they interviewed the staff in Chicago. We were also interested in demographic differences, given what I said about differences in HIV epidemiology in the United States. This is comparing the direct-to-consumer approach to the CBO approach. You can see the direct-to-consumer approach enrolled more than twice as many participants.
So 1,468 young men enrolled directly through the approach we ran from Chicago, and 656 participants enrolled through 22 community-based organizations. So the direct-to-consumer approach was more efficient in terms of reaching people. There were some differences in terms of age: the age split in the DTC arm was more balanced, whereas the CBOs were slightly more concentrated in the 24-to-29-year-old group. The CBOs did a better job of recruiting Black and Hispanic participants than we did in the direct-to-consumer arm, so there was a higher proportion of minority participants in the community-based approach. Now, when you start multiplying these out, though, you actually see the DTC recruited more Black participants in absolute numbers, because it recruited more participants overall; it was just a higher proportion in the community-based approach. And the DTC actually did a better job of recruiting bisexual participants. One of our primary outcomes was how we did in terms of increasing PrEP use. Because of the challenges of COVID, where originally we were going to do a one-year follow-up, we had to limit that to just a three-month follow-up, because we needed more time in the field. At baseline, the arms were actually quite similar; the community group just had a slightly higher rate of guys who were already using PrEP. At three months, both arms increased in their use of PrEP, and that increase was bigger in the community arm. The way we interpret that is that Keep It Up does a good job of motivating people to get on PrEP. But if you're already embedded and connected to a community organization, which maybe can actually prescribe it to you and even give you a starter pack right there on the spot, uptake is a lot easier than if, now that you've decided you want it, you have to find a clinic and get yourself there.
So we think it works in both arms, but there are a lot of benefits to bringing it into the community, because people can get the prescription directly. We also found that the intervention effects were larger in people who actually finished the intervention; of course, not everybody finishes things. So we did this model of HIV infections averted using rectal gonorrhea. There's an approach to convert rectal gonorrhea rates into an estimate of what the HIV incidence in the cohort would have been. We used this modeling approach to compare the number of infections averted per 100 person-years in both arms, and then we used the cost data we collected to look at cost per infection averted. The CBO arm cost about a million dollars to avert one HIV infection, whereas direct-to-consumer cost about $170,000. This is not per person; the per-person cost is in the hundreds. This is the cost of averting an HIV infection. So we see that going through the community organizations is tremendously more costly, not just in the money we were giving out, but in the other in-kind costs that were brought to the table in delivering the intervention. Keep in mind, though, these clinics were opening and closing; they had capacity restrictions. We estimate from our experience in Jackson, Mississippi, that the cost of delivery is significantly lower outside of COVID, which, of course, brings down the cost per infection averted. So we actually think it's probably a lot less costly than that, but I think the relative ratio of cost between the two arms is probably right. Also, our staff sometimes couldn't go out of the office because of COVID, so we were experiencing similar issues. But let's put this in context. The CDC estimates that anything that costs less than $494,000 per infection averted is cost-saving; it actually saves the health system money, given the lifetime cost of HIV care.
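The arithmetic behind a cost-per-infection-averted comparison can be sketched with made-up inputs. Only the $494,000 cost-saving threshold comes from the talk; the costs, incidence rates, and person-years below are invented for illustration and are not the study's numbers.

```python
# Sketch of a cost-per-infection-averted calculation with made-up inputs.
# Only the $494,000 cost-saving threshold is taken from the talk.

CDC_COST_SAVING_THRESHOLD = 494_000  # dollars per infection averted

def infections_averted(incidence_control, incidence_intervention, person_years):
    """Infections averted = (control rate - intervention rate) * exposure time.
    Rates are per person-year (e.g., modeled from rectal gonorrhea data)."""
    return (incidence_control - incidence_intervention) * person_years

def cost_per_infection_averted(total_cost, averted):
    """Total delivery cost (grant money plus in-kind) per infection averted."""
    return total_cost / averted

# Hypothetical arm: 1,000 person-years, modeled incidence drops from 3% to 2%.
averted = infections_averted(0.03, 0.02, 1_000)       # about 10 infections averted
cpia = cost_per_infection_averted(1_700_000, averted)  # about $170,000 each
print(cpia < CDC_COST_SAVING_THRESHOLD)  # prints: True -> cost-saving
```

Note why this metric favors reaching higher-risk groups: a larger control-arm incidence makes the numerator of infections averted bigger, which pushes the cost per infection averted down for the same spending.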
They also say that something is cost-effective at less than $100,000 per quality-adjusted life year. PrEP, for example, lands around here, or at least it used to; I think the cost has come down. That suggests the direct-to-consumer model is for sure cost-saving to the health system, and definitely also cost-effective. Whether the community-based approach is also cost-effective would require additional data on quality of life that we don't have. Also putting this in perspective, these are CDC estimates of the cost per infection averted for behavioral interventions and for PrEP in high-risk heterosexuals. We're talking almost $25 million per infection averted for PrEP in high-risk heterosexuals, and over $15 million for averting an HIV infection using a behavioral intervention in high-risk heterosexuals. So while the numbers for the Keep It Up intervention, focused on men who have sex with men, may sound high, you can see they're just a drop in the bucket. And I don't think anyone would say we shouldn't do anything about HIV in heterosexuals in our country, right? So clearly, we also have to contextualize these cost data. For the sake of moving on to questions, I'm going to skip through some of the other slides; they can be available to people if you reach out to me. We were talking about SMART studies earlier; I also led a multilingual intervention for teenage gay and bisexual boys. I'm happy to answer questions about that, but I'm going to wrap it up so we have time for questions. And thank you. This is our newsletter for the HIV Implementation Science Coordination Initiative, and that's the website again, where you can also sign up for the newsletter if you're interested. So, questions? Thank you, Brian. That was an excellent talk. Thank you. I'm just curious to learn more about the implementation strategies.
We were talking implementation research. What were those strategies? And what were your implementation outcomes? I saw your effectiveness outcomes, which are secondary in a type 3. So I'm just curious: what were your implementation strategies? I know you mentioned train-and-pray, which we always start with. It's step one. Yeah. We have to do it. Yeah. I heard you mention practice facilitation, so can you talk a little bit more about that, and what was your implementation outcome, like your primary outcome? Yeah, yeah. Great question. So in this particular study, we did use cost per infection averted as our primary outcome, because we felt like that integrates some important implementation data: cost, and also reach, and reach to high-risk populations. The cost per infection averted is almost always lower if you're reaching higher-risk people, because there are going to be more cases in high-risk people. And so if one strategy reaches the highest-risk people, it's probably going to have a lower cost per infection averted, unless it's incredibly costly. So we felt like that wrapped up a bunch of different outcomes we were interested in. Then we were interested in using RE-AIM more broadly. So we were interested in how many people were reached using each of the strategies in the same time period, and reach to key populations, Black and Latino in particular, given the epidemiology of HIV. We were interested in adoption. Adoption is very hard to define for the direct-to-consumer strategy, because we were delivering it, but we were interested in understanding adoption at each of the sites. They applied to us, right? So they had said that they wanted to do the program. And we, by the way, got more applicants than we could fund; it's not like we just funded all the sites that applied. We screened some sites out.
But, you know, how many of them, once they were funded, actually delivered it and started routinely offering it to participants? What's interesting about this study is that I think it's different from thinking about strategies like the ERIC approach, which we also used; I can share a paper with you where we talk about the ERIC-type strategies for the CBOs. We had this sort of meta-level question: what is the delivery approach, direct-to-consumer versus community-based? And so it's a little different than when you've already identified the delivery strategy and now you're trying to optimize delivery using, say, ERIC approaches. Within each arm, we then had our implementation strategies in terms of how we delivered. So we really encouraged community organizations to implement an audit-and-feedback type approach. We encouraged them to use their own data, taught them how, and did check-ins with them to see if we could help them further: okay, how many young MSM did you test for HIV these last two weeks? How many of them enrolled? Do you see differences by staff person? One staff person did 25 HIV tests this week and enrolled 20 people, and another staff person did 25 HIV tests and enrolled one person. You might need to talk to those staff people, and maybe your one person could be a role model to the other. So we encouraged them to use an audit-and-feedback strategy. And then we did the facilitation approach of using the dashboard that really laid out, this is what you should do. And then we also had an online learning management system. If a new staff member came in, or at the beginning of the whole thing, there was a self-paced online learning system they could go to. How do you create advertisements? Some of it was specific to Keep It Up, but some of it was, how do you promote HIV testing to young gay men?
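The audit-and-feedback idea, having an organization compare enrollment rates across its own staff, could be as simple as the report below. The staff names and counts are invented; the 25-tests example numbers mirror the hypothetical ones in the talk.

```python
# Sketch of the audit-and-feedback report described: compare each staff
# member's Keep It Up enrollment rate among young MSM tested for HIV.
# Staff names and counts are invented for illustration.

tests_and_enrollments = {
    "staff_a": (25, 20),  # (HIV tests of young MSM, Keep It Up enrollments)
    "staff_b": (25, 1),
}

def enrollment_rates(data):
    """Return enrollments per test for each staff member, for feedback."""
    return {name: enrolled / tested for name, (tested, enrolled) in data.items()}

for name, rate in enrollment_rates(tests_and_enrollments).items():
    print(f"{name}: {rate:.0%} of tested clients enrolled")
```

A gap like 80% versus 4% is exactly the kind of signal the audit-and-feedback strategy surfaces: the high performer can model the offer for colleagues.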
Because if you're not testing enough of them, it doesn't matter what we have to offer; you need to get them through the door for testing. So we had training on how to do outreach on social media to young gay men. We had ads that they could take and use. And so that learning management system was our approach. We purposely did it that way, versus, like, a Zoom meeting or something, because we knew staff turnover in these clinics was so high that you need people to be able to come in and learn it whenever they need to. And then there was one last piece; it'll come to me. If you're interested in digital health, there's also another really good paper: my colleague Nanette Benbow just published a paper in Implementation Science Communications using the PRECIS system, which scores implementation studies on how pragmatic they are across a bunch of different dimensions. She scored it for both of the delivery approaches, and it's a paper where you learn a lot about the strategies, because we talked about how pragmatic they were compared to the way things are normally done. Yeah, thank you again for the talk. Let me put it, like, from my point of view, something I have to tell you. In some countries we have a lot of problems with HPV, you know; it's a virus. The first thing they say is, don't have sex, and that didn't work, of course. The second thing, they have people use condoms as well. Then some countries, like Australia, started giving the vaccine to the girls and the boys, and it decreased a lot. They said that HPV is much worse in women than in men. What do you think for the future? Will PrEP be the same? Because, as you showed in the data, there's a window of age where you have this really increased number of positives. Do you think that will be the future? Will we have this six-month PrEP for everybody in that population?
Or do you think something else? I think that would be lovely. I mean, I think that could be a game changer; that could end the epidemic, if we really created systems that would allow and empower young people in that age group to get access to injectable PrEP. There are almost no side effects. There were very few cases in the study that became infected; one was resistant to that particular drug. But, you know, we would probably almost eliminate new cases in young people if people got that shot every six months. But then, we had a COVID vaccine that was highly effective, and people have a lot of beliefs about some kind of injection being put in their body, and about the government telling them to do it. Do they trust the government telling them that they should get this shot? And this is another vulnerable group that's been told by the government that they don't exist, or that they should not exist. And even if those sentiments weren't there, there are also just the real implementation challenges, like how do we get this out to communities? There is some very cool work happening with mobile vans going into communities. But I think we really need an all-hands-on-deck approach to make the kind of radical change it would take to have something like that work: social marketing, community norms, community health centers that people can easily go into, where you don't have to make an appointment three months in advance but can just walk in the door and get your shot and your STI testing. So I think it's achievable. I really do think we could end the epidemic, at least in the United States, and in some other countries that have integrated health systems it will probably be even easier. But we have an uphill battle, to be honest.

That was awesome. I'm an epidemiologist, so this is outside the scope of what I do.
And I just found this so informative and well presented. Thank you for that. I had a question; actually, I have two questions. One is a very broad question, which is: how did you conceptualize this career that you've made? And I'm astonished by your productivity, too. You seem very normal, and you've published 365 papers; that's unreal. It's obviously incredibly meaningful and valuable work. When you were earlier in your career, how did you get going? What was the motivation, and what was the strategy that kept you able to get where you planned?

Yeah, I mean, a couple of things I'd say. One is that I love writing. I really do. I love writing papers, and I love reading my students' and my mentees' papers. That helps a lot in writing a lot of papers. It doesn't mean I always feel like writing, but I do make it part of my practice, where I'm writing every week, and I really try to keep that up, because all the forces in academia and the world are designed to keep us from writing, right now and through most of our careers. And I do firmly believe, particularly when I receive taxpayer money to do these projects, that I have an obligation to share the results, and share them in a timely way. I don't just say that; I really do believe it. That's obvious in what you've done, too. I also think that I am very driven by the mission of what I'm doing. I get outraged by things, and then I'm like, I'm going to go do something about it. Another example: my first grant, when I was a young professor, was from the American Foundation for Suicide Prevention to do a two-year study of suicide risk in LGBT youth. It took me 10 months to get IRB approval for my two-year study, because the IRB was like, you need parental permission. And I was like, no, I don't; here are the rules, and here's how you can work within them. And then they were like, we don't like this advertisement.
You can't use this word; you can't use that. And I was like, you know what, I'm going to start studying this. And I actually learned a lot about research ethics and got an R01 to study the ethics of LGBTQ youth being in research. So I think a lot of what I do is always thinking about: what are the problems, and how do I use science to address them? I don't lead a federal agency; I don't have that. But I have science as my tool, and so I just apply it to all the things that piss me off, which is an unlimited supply of things.

One of your slides had, and I'm sorry, I forget what you called it, circles within circles, and it was like a hundred percent, 50%, 50%. I was thinking about that. So I'm a vascular epidemiologist, and I study basically physicians' delivery of antiplatelets and statins and other guideline-directed medicines, which they don't do. But I sort of think of myself as, that's where I stop, and then there's the next step. What do you think about that? You don't stop there; you seem to cover all the roles.

I don't believe in science being a relay sport. I really don't. I think that maybe it works in some areas, but I just don't think there's somebody waiting to grab the baton to take it to the next level. And that sucks, because that's not how we're trained, and it means oftentimes you have to learn a lot of things when you thought you were done learning and now have to learn something else. But it also is incredibly rewarding to see something through from development to at least implementation research, if not eventual implementation. So, yeah, I think everything we're learning from implementation research is that there just aren't people waiting to grab the baton; you have to find those people, and you have to actively find them and then hand them the baton, maybe forcefully, and convince them to take it.
They're not going to be out there looking to grab things and move them to the next level. So I do really firmly believe that. The last thing I want to say is that I went through my last few slides very quickly, but I also have a team of nearly 100 people that I work with. So it's not just me doing all these things; it's a super collaborative environment, and literally none of these projects would have been possible without this amazing team of people I get to work with. That also makes it fun, because they're cool people, too. Thank you for your questions.

To be fair, you also built that team, right? Those people didn't exist at Northwestern; you went to Northwestern and you created it. Yeah. That's part of the story: it is about the players, and you've got a wonderful team. You mentioned Dennis. Yes. Catherine. Catherine. And Michael. On and on. Here at IU. Mm-hmm. Yeah. That also was part of the strategy, too.

So let's give Dr. Mustanski another round of applause. It was a wonderful opportunity to have him back in Bloomington. He said that he hasn't been to Bloomington in several years, but welcome back, and this is just a small thing to welcome you back. So if we can do a quick photo here, you in the middle... There will be other opportunities to engage with Dr. Mustanski: at 2 p.m., the Cedar activity. What's the room? The Oak Room. The Oak Room. So if you want more opportunities to engage with Dr. Mustanski, come to the Cedar event at two, and Sex Salon tonight, fun at seven o'clock. And where is that? The Bishop. The Bishop, Sex Salon, seven o'clock. Thank you so much.