Design and Methods.
Table 8.1 is derived from 'Three Approaches to Case Study Methods in Education: Yin, Merriam, and Stake' by Bedrettin Yazan, licensed under CC BY-NC-SA 4.0. 5
There are several forms of qualitative case studies: 1,2
• discovery-led case studies
• theory-led case studies
• single and collective case studies 2,9
In intrinsic, instrumental and illustrative case studies, the exploration takes place within a single case. In contrast, a collective case study includes multiple individual cases, and the exploration occurs both within and between cases. Collective case studies may include comparative cases, whereby cases are sampled to provide points of comparison for either the context or the phenomenon. Embedded case studies are increasingly common within multi-site randomised controlled trials, where each of the study sites is considered a case.
Multiple forms of data collection and methods of analysis (e.g. thematic, content, framework and constant comparative analyses) can be employed, since case studies are characterised by the depth of knowledge they provide and their nuanced approaches to understanding phenomena within context. 2,5 This approach enables triangulation between data sources (interviews, focus groups, participant observations), researchers and theory. Refer to Chapter 19 for information about triangulation.
Advantages of using a case study approach include the ability to explore the subtleties and intricacies of complex social situations, and the use of multiple data collection methods and data from multiple sources within the case, which enables rigour through triangulation. Collective case studies enable comparing and contrasting within and across cases.
However, it can be challenging to define the boundaries of the case and to gain appropriate access to the case for the 'deep dive' form of analysis. Participant observation, a common form of data collection, can lead to observer bias. Data collection can also be lengthy, and substantial time, resources and funding may be required to conduct the study. 9
Table 8.2 provides an example of a single case study and of a collective case study.
| Title | Nayback-Beebe, 2012 | Clack, 2018 |
|---|---|---|
| Aim | 'The purpose of this phenomenological qualitative case study… was to gain a holistic understanding of the lived-experience of a male victim of intimate partner violence and the real-life context in which the violence emerged.' | 'in-depth investigation of the main barriers, facilitators and contextual factors relevant to successfully implementing these strategies in European acute care hospitals' |
| Research question(s) | 'What is the lived experience of living in and leaving an abusive intimate relationship for a white middle class male?' | '(1) what are the main barriers and facilitators to successfully implementing CRBSI prevention procedures?; and (2) what role do contextual factors play?' |
| Design | A single, intrinsic qualitative case study. Following Yin's case study approach, the authors wished to uncover the contextual conditions relevant to the phenomenon under study – living in and leaving an abusive intimate relationship as a white, middle-class male. The researchers wanted to understand and explore the contextual conditions related to female-to-male perpetrated intimate partner violence. | A qualitative comparative case study of 6 of the 14 hospitals participating in the Prevention of Hospital Infections by Intervention and Training (PROHIBIT) randomised controlled study on catheter-related bloodstream infection prevention. The case study examined contextual factors that affect the implementation of an intervention, particularly across culturally, politically and economically diverse hospital settings in Europe. |
| Setting | United States of America. Insights from the case study provide nurses with an understanding that intimate partner violence occurs in the lives of both men and women, and an awareness of this in inpatient and outpatient settings. | European acute-care hospitals participating in the PROHIBIT randomised controlled trial. |
| Data collection | Three in-depth interviews conducted over one month. The participant was a 44-year-old man who met the following inclusion criteria: self-reported survivor of physical, emotional or verbal abuse, harassment and/or humiliation by a current or former partner; the violence occurred in the context of a heterosexual relationship; was in the process of leaving or had left the relationship. | Data were collected before and after the implementation of the intervention, and included 129 interviews (133 hours) with hospital administration, IPC and ICU leadership and staff, and telephone interviews with onsite investigators, alongside 41 hours of direct observations. |
| Analysis | Existential phenomenology following Colaizzi's method for data analysis. | Thematic analysis was inductive (first site visit) and deductive (second site visit), with cross-case analysis using a stacking technique; cases were grouped according to common characteristics, and differences and similarities were examined. |
| Findings | Theme 1: Living in the relationship – confrontation from within. Theme 2: Living in the relationship – confrontation from without. Theme 3: Leaving the relationship – realisation and relinquishment. Overarching theme: Living with a knot in your stomach. | Three meta-themes were identified: implementation agendas; resourcing; boundary spanning. |
Qualitative case studies provide a study design with diverse methods to examine the contextual factors relevant to understanding the why and how of a phenomenon within a case. The design incorporates single case studies and collective cases, which can also be embedded within randomised controlled trials as a form of process evaluation.
Qualitative Research – a practical guide for health and social care researchers and practitioners. Copyright © 2023 by Darshini Ayton is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
Samantha L Thomas, Hannah Pitt, Simone McCarthy, Grace Arnot, Marita Hennessy, Methodological and practical guidance for designing and conducting online qualitative surveys in public health, Health Promotion International, Volume 39, Issue 3, June 2024, daae061, https://doi.org/10.1093/heapro/daae061
Online qualitative surveys—those surveys that prioritise qualitative questions and interpretivist values—have rich potential for researchers, particularly in new or emerging areas of public health. However, there is limited discussion about the practical development and methodological implications of such surveys, particularly for public health researchers. This poses challenges for researchers, funders, ethics committees, and peer reviewers in assessing the rigour and robustness of such research, and in deciding the appropriateness of the method for answering different research questions. Drawing and extending on the work of other researchers, as well as our own experiences of conducting online qualitative surveys with young people and adults, we describe the processes associated with developing and implementing online qualitative surveys and writing up online qualitative survey data. We provide practical examples and lessons learned about question development, the importance of rigorous piloting strategies, use of novel techniques to prompt detailed responses from participants, and decisions that are made about data preparation and interpretation. We consider reviewer comments, and some ethical considerations of this type of qualitative research for both participants and researchers. We provide a range of practical strategies to improve trustworthiness in decision-making and data interpretation—including the importance of using theory. Rigorous online qualitative surveys that are grounded in qualitative interpretivist values offer a range of unique benefits for public health researchers, knowledge users, and research participants.
Public health researchers are increasingly using online qualitative surveys.
There is still limited practical and methodological information about the design and implementation of these studies.
Building on Braun and Clarke (2013) , Terry and Braun (2017) and Braun et al . (2021) , we reflect on the methodological and practical lessons we have learnt from our own experience with conducting online qualitative surveys.
We provide guidance and practical examples about the design, implementation and analysis processes.
We argue that online qualitative surveys have rich potential for public health researchers and can be an empowering and engaging way to include diverse populations in qualitative research.
Public health researchers mostly engage in experiential (interpretive) qualitative approaches ( Braun and Clarke, 2013 ). These approaches are ‘centred on the exploration of participants’ subjective experiences and sense-making’ [( Braun and Clarke, 2021c ), p. 39]. Given the strong focus in public health on social justice, power and inequality, researchers proactively use the findings from these qualitative studies—often in collaboration with lived experience experts and others who are impacted by key decisions ( Reed et al ., 2024 )—to advocate for changes to public health policy and practice. There is also an important level of theoretical, methodological and empirical reflection that is part of the public health researcher’s role. For example, as qualitative researchers actively construct and interpret meaning from data, they constantly challenge their assumptions, their way of knowing and their way of ‘doing’ research ( Braun and Clarke, 2024 ). This reflexive practice also includes considering how to develop more inclusive opportunities for people to participate in research and to share their opinions and experiences about the issues that matter to them.
While in-depth interviews and focus groups provide rich and detailed narratives that are central to understanding people's lives, these forms of data collection may sometimes create practical barriers for both researchers and participants. For example, they can be time consuming, and the power dynamics associated with face-to-face interviews (even in online settings) may make them less accessible for groups that are marginalized or stigmatized (Edwards and Holland, 2020). While some population subgroups (and contexts) may suit (or require) face-to-face qualitative data collection approaches, others may lend themselves to different forms of data collection. Young people, for example, may be keen to be civically involved in research about the issues that matter to them, such as the climate crisis, but they may find it more convenient and comfortable using anonymized digital technologies to do so (Arnot et al., 2024b). As such, part of our reflexive practice as public health researchers must be to explore, and be open to, a range of qualitative methodological approaches that could be more convenient, less intimidating and more engaging for a diverse range of population subgroups. This includes thinking about pragmatic ways of operationalizing qualitative data collection methods. How can we develop methods and engagement strategies that enable us to gain insights from a diverse range of participants about new issues or phenomena that may pose threats to public health, or look at existing issues in new ways?
Advancements in online data collection methods have also created new options for researchers and participants about how they can be involved in qualitative studies ( Hensen et al ., 2021 ; Chen, 2023 ; Fan et al ., 2024 ). Online qualitative surveys—those surveys that prioritize qualitative values and questions—have rich potential for qualitative researchers. Braun and Clarke (2013 , p. 135) state that qualitative surveys:
…consist of a series of open-ended questions about a topic, and participants type or hand-write their responses to each question. They are self-administered; a researcher-administered qualitative survey would basically be an interview.
While these types of studies are increasingly utilized in public health, researchers have highlighted that there is still relatively limited discussion about the methodological and practical implications of these surveys ( Braun and Clarke, 2013 ; Terry and Braun, 2017 ; Braun et al ., 2021 ). This poses challenges for qualitative public health researchers, funders, ethics committees and peer reviewers in assessing the purpose, rigour and contribution of such research, and in deciding the appropriateness of the method for answering different research questions.
Using examples from online qualitative surveys that we have been involved in, this article discusses a range of methodological and practical lessons learnt from developing, implementing and analysing data from these types of surveys. While we do not claim to have all the answers, we aim to develop and extend on the methodological and practical guidance from Braun and Clarke (2013) , Terry and Braun (2017) and Braun et al . (2021) about the potential for online qualitative surveys. This includes how they can provide a rigorous ‘wide-angle picture’ [( Toerien and Wilkinson, 2004 ), p. 70] from a diverse range of participants about contemporary public health phenomena.
Figure 1 aims to develop and extend on the key points made by Braun and Clarke (2013) , Terry and Braun (2017) and Braun et al . (2021) , which provide the methodological and empirical foundation for our article.
Figure 1: Methodological considerations in conducting online qualitative surveys.
Online qualitative surveys take many forms. They may be fully qualitative or qualitative dominant—mostly qualitative with some quantitative questions (Terry and Braun, 2017). There are also many different ways of conducting these studies—from using a smaller number of questions that engage specific population groups or knowledge users in understanding detailed experiences (Hennessy and O'Donoghue, 2024), to a larger number of questions (which may use market research panel providers to recruit participants) that seek broader opinions and attitudes about public health issues (Marko et al., 2022a; McCarthy et al., 2023; Arnot et al., 2024a). However, based on our experiences of applying for grant funding and conducting, publishing and presenting these studies, there are still clear misconceptions and uncertainties about these types of surveys.
One of the concerns raised about online qualitative surveys is how they are situated within broader qualitative values and approaches. This includes whether they can provide empirically innovative, rigorous, rich and theoretically grounded qualitative contributions to knowledge. Our experience is that online qualitative surveys have the most potential when they harness the values of interpretivist ‘Big Q’ approaches to collect information from a diverse range of participants about their experiences, opinions and practices ( Braun et al ., 2021 ). The distinction between positivist (small q) and interpretivist (Big Q) approaches to online qualitative surveys is an important one that requires some initial methodological reflection, particularly in considering the (largely unhelpful) critiques that are made about the rigour and usefulness of these surveys. These critiques often overlook the theoretical underpinnings and qualitative values inherent in such surveys. For example, while there may be a tendency to think of surveys and survey data as atheoretical and descriptive, the use of theory is central in informing online qualitative surveys. For example, Varpio and Ellaway (2021 , p. 343) explain that theory can ‘offer explanations and detailed premises that we can wrestle with, agree with, disagree with, reject and/or accept’. This includes the research design, the approach to data collection and analysis, the interpretation of findings and the conclusions that are drawn. Theory is also important in helping researchers to engage in reflexive practice. The use of theory is essential in progressing online qualitative surveys beyond description and towards in-depth interpretation and explanations—thus facilitating a deeper understanding of the studied phenomenon ( Collins and Stockton, 2018 ; Jamie and Rathbone, 2022 ).
The main assumptions about online qualitative surveys are that they can only collect 'thin' textual data, and that they are not flexible enough as a data collection tool for researchers to prompt or ask follow-up questions or to co-create detailed and rich data with participants (Braun and Clarke, 2013; Terry and Braun, 2017; Braun et al., 2021). While we acknowledge that the type of data collected in these studies differs from that of in-depth interview studies, these surveys may be a more accessible and engaging way to collect rich insights from a diverse range of participants who may otherwise not participate in qualitative research (Braun and Clarke, 2013; Terry and Braun, 2017; Braun et al., 2021). Despite this, peer reviewers can question the depth of information that may be collected in these studies. Assumptions about large but 'thin' datasets may also mean that researchers, funders and reviewers take (and perhaps expect) a more positivist approach to the design and analytical processes associated with these surveys. For example, the multiple topics and questions, larger sample sizes, and the generally shorter textual responses that online qualitative surveys generate may lead researchers to approach these surveys using more descriptive and atheoretical paradigms. This approach may focus on 'measuring' phenomena, using variables, developing thinner analytical description and adding numerical values to the number of responses for different categories or themes.
We have found that these assumptions can also shape the review processes associated with these types of studies, with critiques coming from both positivist and interpretivist positions. Positivist critiques focus on whether the samples are 'representative', and on the flaws associated with 'self-selecting convenience' samples. Critiques from interpretivist colleagues question why such large sample sizes are needed for qualitative studies, seeing surveys as a less rigorous method for gaining rich and meaningful data. For example, we have had reviewers query the scope and depth of the analysis of the data that we present from these studies because they are concerned that the type of data collected lacks depth and does not fully contextualize and explain how participants think about issues. We have also had reviewers request that we return to the study to collect quantitative data to supplement the qualitative findings of the survey. They also question how 'representative' the samples are of population groups. These comments, of course, are not unique to online qualitative surveys, but they do highlight the difficulty that reviewers may have in situating these types of studies within broader qualitative approaches. With this in mind, we have also found that some reviewers ask for additional information to justify both the use of online qualitative surveys and why we have chosen them over other qualitative approaches. For example, reviewers have asked us to justify why we chose an online qualitative survey, and to explain what we may have missed out on by not conducting in-depth interviews or quantitative or mixed-methods surveys instead.
While there is now a general understanding that attributing ‘numbers’ to qualitative data is largely unhelpful and inappropriate ( Chowdhury, 2015 ), there may be expectations that the larger sample sizes associated with online qualitative surveys enable researchers to provide numerical indicators of data. Rather than focusing on the ‘artfully interpretive’ techniques used to analyse and construct themes from the data ( Finlay, 2021 ), we have found that reviewers often ask us to provide numerical information about how many people provided different responses to different questions (or constructed themes), and the number at which ‘saturation’ was determined. Reviewer feedback that we have received about analytical processes has asked for detailed explanations about why attempts to ‘minimize bias’ (including calculations of inter-rater reliability and replicability of data quality) were not used. This demonstrates that peer reviewers may misinterpret the interpretivist values that guide online qualitative surveys, asking for information that is essentially ‘meaningless’ in qualitative paradigms in which researchers’ subjectivity ‘sculpts’ the knowledge that is produced ( Braun and Clarke, 2021a ).
As well as providing a 'wide-angle picture' [(Toerien and Wilkinson, 2004), p. 70] of phenomena, online qualitative surveys can also: (i) generate both rich and focused data about perceptions and practices, and (ii) have multiple participatory and practical advantages—including helping to overcome barriers to research participation (Braun and Clarke, 2013; Terry and Braun, 2017; Braun et al., 2021). For researchers, online qualitative surveys can be a more cost-effective alternative (Braun and Clarke, 2013; Terry and Braun, 2017)—they are generally more time-efficient and less labour-intensive (particularly if working with market research companies to recruit panels). They are also able to reach a broad range of participants—such as those who are geographically dispersed (Braun and Clarke, 2013; Terry and Braun, 2017), and those who may not have internet connectivity that is reliable enough to complete online interviews (a common issue for individuals living in regional or rural settings) (de Villiers et al., 2022). We are also more able to engage young people in qualitative research through online surveys, perhaps partly due to extensive panel company databases but also because they may be a more accessible and familiar way for young people to participate in research. The ability to quickly investigate new public health threats from the perspective of lived experience can also provide important information for researchers, providing justification for new areas of research focus, including setting agendas and advocating for funding (or policy attention). Collecting data from a diverse range of participants—including those who hold views that we may see as less 'politically acceptable', or inconsistent with our own public health reasoning about health and equity—is important in situating and contextualizing community attitudes towards particular issues.
For participants , benefits include having a degree of autonomy and control over their participation, including completing the survey at a time and place that suits them, and the anonymous nature of participation (that may be helpful for people from highly stigmatized groups). Participants can take time to reflect on their responses or complete the survey, and may feel more able to ‘talk back’ to the researcher about the framing of questions or the purpose of the research ( Braun et al ., 2021 ). We would also add that a benefit of these types of studies is that participants can also drop out of the study easily if the survey does not interest them or meet their expectations—something that we think might be more onerous or uncomfortable for participants in an interview or focus group.
For knowledge users, including advocates, service providers and decision-makers, qualitative research provides an important form of evidence, and the ‘wide-angle picture' [( Toerien and Wilkinson, 2004 ), p. 70] on issues from a diverse range of individuals in a community or population can be a powerful advocacy tool. Online qualitative surveys can also provide rapid insights into how changes to policy and practice may impact population subgroups in different ways.
There are, of course, some limitations associated with online qualitative surveys ( Braun et al ., 2021 ; Marko et al ., 2022b ). For example, there is no ability to engage individuals in a ‘traditional’ conversation or to prompt or probe meaning in the interactive ways that we are familiar with in interview studies. There is less ability to refine the questions that we ask participants in an iterative way throughout a study based on participant responses (particularly when working with market research panel companies). There may also be barriers associated with written literacy, access to digital technologies and stable internet connections ( Braun et al ., 2021 ). They may also not be the most suitable for individuals who have different ways of ‘knowing, being and doing’ qualitative research—including Indigenous populations [( Kennedy et al ., 2022 ), p. 1]. All of these factors should be taken into consideration when deciding whether online qualitative surveys are an appropriate way of collecting data. Finally, while these types of surveys can collect data quickly ( Marko et al ., 2022b ), there can also be additional decision-making processes related to data preparation and inclusion that can be time-consuming.
There are a range of practical considerations that can improve the rigour, trustworthiness and quality of online qualitative survey data. Again, developing and expanding on ( Braun and Clarke, 2013 ; Terry and Braun, 2017 ; Braun et al ., 2021 ), Figure 2 gives an overview of some key practical considerations associated with the design, implementation and analysis of these surveys. We would also note that before starting your survey design, you should be aware that people may use different types of technology to complete the survey, and in different spaces. For example, we cannot assume that people will be sitting in front of a computer or laptop at home or in the office, with people more likely to complete surveys on a mobile phone, perhaps on a train or bus on the way to work or school.
Figure 2: Top ten practical tips for conducting online qualitative surveys.
Creating an appropriate and accessible structure
The first step in designing an online qualitative survey is to plan the structure of your survey. This step is important because the structure influences the way that participants interact with and participate through the survey. The survey structure helps to create an ‘environment’ that helps participants to share their perspectives, prompt their views and develop their ideas ( Braun and Clarke, 2013 ; Terry and Braun, 2017 ). Similar to an interview study, the structure of the survey guides participants from one set of questions (and topics) to the next. It is important to consider the ordering of topics to enable participants to complete a survey that has a logical flow, introduces participants to concepts and allows them to develop their depth of responses.
Before participants start the survey, we provide a clear and simple lay language summary of the survey. Because many individuals will be familiar with completing quantitative surveys, we include a welcoming statement and reiterate the qualitative nature of the survey, stating that their answers can be about their own experiences:
Thank you for agreeing to take part in this survey about [topic] . This survey involves writing responses to questions rather than checking boxes.
We then clearly reiterate the purpose of the survey, providing a short description of the topic that we are investigating. We state that we do not seek to collect any identifiable data, that we are interested in participants' perspectives, that there are no right or wrong answers, and that participants can withdraw from the survey at any time without giving a reason.
Similar to Braun et al . (2021) , we start our surveys with questions about demographic and related characteristics (which we often call ‘ participant/general characteristics ’). These can be discrete choice questions, but can also utilize open text—for example, in relation to gender identity. We have found that there is always a temptation with surveys to ask many questions about the demographic characteristics of participants. However, we caution that too many questions can be intrusive for participants and can take away valuable time from open-text questions, which are the core focus of the survey. We recommend asking participant characteristic and demographic questions that situate and contextualize the sample ( Elliott et al ., 1999 ).
We generally start the open-text sections of these surveys by asking broad introductory questions about the topic. This might include questions such as: 'Please describe the main reasons you drink alcohol', and 'What do you think are the main impacts of climate change on the world?' We have found that these types of questions get participants used to responding to open-text questions relevant to the study's research questions and aims. For each new topic of investigation (topics are based on our theoretical concepts and overall study aims and research questions), we provide a short explanation of what we will ask participants. We also use tools and text to signpost participant progress through the survey. This can be a valuable way to avoid high attrition rates, where participants exit the survey because they are fatigued and unclear about when the survey will end:
Great! We are just over half-way through the survey.
We ask more detailed questions that are more aligned with our theoretical concepts in the middle of the survey. For example, we may start with broad questions about a harmful industry and their products (such as gambling, vaping or alcohol) and then in the middle of the survey ask more detailed questions about the commercial determinants of health and the specific tactics that these industries use (for example, about product design, political tactics, public relations strategies or how these practices may influence health and equity). In relation to these more complex questions, it is particularly important that we reiterate that there are no wrong answers and try to include encouraging text throughout the survey:
There are no right or wrong answers—we are curious to hear your opinions .
We always try to end the survey on a positive note. While these types of questions depend on the study, we try to ask questions that enable participants to reflect on what could be done to address or improve an issue. This might include their attitudes about policy, or what they would say to those in positions of power:
What do you think should be done to protect young people from sports betting advertising on social media?

If there was one thing that could be done to prevent young people from being exposed to the risks associated with alcohol, cigarettes, vaping, or gambling, what would it be?

If you could say one thing to politicians about climate change, what would it be?
Finally, we ask participants if there is anything we have missed or if they have anything else to add, sometimes referred to as a ‘clean-up’ question ( Braun and Clarke, 2013 ). The following provides a few examples of how we have framed these questions in some of our studies:
Is there anything you would like to say about alcohol, cigarettes, vaping, and gambling products that we have not covered?

Is there anything we haven't asked you about the advertising of alcohol to women that you would like us to know?
Considering the impact of the length of the survey on responses
The length of the survey (both the number of questions and the time it takes an individual to complete the survey) is guided by a range of methodological and practical considerations and will vary between studies (Braun and Clarke, 2013). Many factors will influence completion times. We try to give individuals a guide at the start of the survey about how long we think it will take to complete (for example, between 20 and 30 minutes). We highlight that it may take people a little more or less time and that people are able to leave their browser open or save the survey and come back to finish it later. In our first few online qualitative surveys, we asked lots of questions because we felt less able to prompt or ask follow-up questions of participants. However, we have learned that less is more! Asking too many questions may lead to more survey dropouts, and may significantly reduce the textual quality of the information that you receive from participants (Braun and Clarke, 2013; Terry and Braun, 2017). This includes considering how the survey questions might lead to repetition, which may be annoying for participants, leading to responses such as ‘like I’ve already said’, ‘I’ve already answered that’ or ‘see above’.
Providing clear and simple guidance
When designing an online qualitative survey, we try to think of ways to make participation in the survey engaging. We do not want individuals to feel that we are ‘mining’ them for data. Rather, we want to demonstrate that we are genuinely interested in their perspectives and views, and we use a range of mechanisms to do this. Because there is no opportunity to verbally explain or clarify concepts to participants, there is a particular need to ensure that the language used is clear and accessible (Braun and Clarke, 2013; Terry and Braun, 2017). If language or concepts are complex, you are more likely to receive ‘I don’t know’ responses to your questions. We need to remember that participants have a range of written and comprehension skills, so inclusive and accessible language is important. We also try never to assume a level of knowledge about an issue (unless we have specifically asked for participants who are aware of and engaged in an issue—such as women who drink alcohol) (Pitt et al., 2023). This includes avoiding highly technical or academic language and not assuming that the individuals completing the survey will understand concepts in the same way that researchers do (Braun and Clarke, 2013). Clearly explaining concepts or using text or images to prompt memories can help to overcome this:
Some big corporations (such as the tobacco, vaping, alcohol, junk food, or gambling industries) sponsor women's sporting teams or clubs, or other events. You might see sponsor logos on sporting uniforms, or at sporting grounds, or sponsoring a concert or arts event.
At all times, we try to centre the language that we use on the population from which we are seeking responses. Advisory groups can be particularly helpful in framing language for different population subgroups. We often use colloquial language, even if it might not be seen as the ‘correct’ academic language or terminology. Where possible, we also try to define theoretical concepts in a clear and easy-to-understand way. For example, in our study investigating parent perceptions of the impact of harmful products on young people, we tried to clearly define ‘normalisation’:
In this section we ask you about some of the perceived health impacts of the above products on young people. We also ask you about the normalisation of these products for young people. When we talk about normalisation, we are thinking about the range of factors that might make these products more acceptable for young people to use. These factors might include individual factors, such as young people being attracted to risk, the influence of family or peers, the accessibility and availability of these products, or the way the industry advertises and promotes these products.
Using innovative approaches to improve accessibility and prompt responses
Online qualitative surveys can include features beyond traditional question-and-answer formats ( Braun and Clarke, 2013 ; Terry and Braun, 2017 ). For example, we often use a range of photo elicitation techniques (using images or videos) to make surveys more accessible to participate in, address different levels of literacy, and overcome the assumption that we are not able to ‘prompt’ responses. These types of visual methodologies enable a collaborative and creative research experience by asking the participant to reflect on aspects of the visual materials, such as symbolic representations, and discuss these in relation to the research objectives ( Glaw et al ., 2017 ). The combination of visual images and clear descriptions helps to provide a focus for responses about different issues, as well as prompting nuanced information such as participant memories and emotions ( Glaw et al ., 2017 ). We use different types of visuals in our studies, such as photographs (including of the public health issues we’re investigating); screenshots from websites and social media posts (including newspaper headlines) and videos (including short videos from social media sites such as TikTok) ( Arnot et al ., 2024b ). For example, when talking about government responses to the climate crisis, we used a photograph of former Australian Prime Minister Scott Morrison holding a piece of coal in the Australian parliament to prompt participants’ thinking about the government’s relationship with fossil fuels and to provide a focal point for their answer. However, we would caution against using any images that may be confronting for participants or deliberately provocative. The purpose of using visuals must always be in the interests of the participants—to clarify, prompt and reflect on concepts. Ethics committees should carefully review the images used in surveys to ensure that they have a clear purpose and are unlikely to cause any discomfort.
Thinking carefully about your criteria for recruitment
Determining the sample size of online qualitative studies is not an exact science. The sample sizes for recent studies have ranged from n = 46 in a study about pregnancy loss (Hennessy and O’Donoghue, 2024) to n = 511 in a study with young people about the climate crisis (Arnot et al., 2023b). We follow ‘rules of thumb’ [(Braun and Clarke, 2021b), p. 211] which try to balance the needs of the research and data richness with key practical considerations (such as funding and time constraints), funder expectations, discipline-specific norms and our knowledge and experience of designing and implementing online qualitative surveys. However, we have found that peer reviewers expect much more justification of sample sizes than they do for other types of qualitative research. Robust justification of sample sizes is often needed to prevent any ‘concerns’ that reviewers may raise. Our response to these reviews often reiterates that our focus (as with all qualitative research) is not to produce a ‘generalisable’ or ‘representative’ sample but to recruit participants who will help to provide ‘rich, complex and textured data’ [(Terry and Braun, 2017), p. 15] about an issue. Instead of focusing on data saturation, a contested concept which is incongruent with reflexive thematic analysis in particular (Braun and Clarke, 2021b), we find it useful to consider information power to determine the sample size for these surveys (Malterud et al., 2016). Information power prioritizes the adequacy, quality and variability of the data collected over the number of participants.
Recruitment for online qualitative surveys can be influenced by a range of factors. Monetary and time constraints will impact the size and, if using market research company panels, the specificity of participant quotas. Recruitment strategies must be developed to ensure that the data provides enough information to answer the research questions of the study. For our research purposes, we often try to ensure that participants with a range of socio-demographic characteristics are invited to participate in the sample. We set soft quotas for age, gender and geographic location to ensure some diversity. We have found that some population subgroups may also be recruited more easily than others—although this may depend on the topic of the survey. For example, we have found that quotas for women and those living in metropolitan areas may fill more quickly. In these scenarios, the research team must weigh up the timelines associated with recruitment and data collection (e.g. How long do we want to run data collection for? How much of our budget can be spent on achieving a more equally split sample? Are quotas necessary?) versus the purpose and goals of the research (i.e. to generate ideas rather than data representativeness), and the study-specific aims and research questions.
There are, of course, concerns about not being able to ‘see’ the people who are completing these surveys. There is an increasing focus in the academic literature on ‘false’ respondents, particularly in quantitative online surveys (Levi et al., 2021; Wang et al., 2023). This will be an important ongoing discussion for qualitative researchers, and we do not claim to have the answers for how to overcome these issues. For example, some individuals may falsely state that they meet the inclusion criteria to access the survey, while others may not understand or may misinterpret the inclusion criteria. There is also a level of discomfort about who judges, and how, whether someone is a ‘legitimate’ participant. However, we can talk practically about some of the strategies that we use to ensure the rigour of data. For example, we find that screening questions can provide a ‘double-check’ in relation to inclusion criteria and can also help to ensure that there is consistency between the information an individual provides about how they meet the inclusion criteria and their subsequent responses. For example, in a recent survey of parents of young people, a participant stated that they were 18 years old and were a parent to a 16-year-old and a 15-year-old; their overall responses were inconsistent with being a parent of children these ages. Similarly, in our gambling studies, people may tick that they have gambled in the last year but then in subsequent questions say they have not gambled at all. This highlights the importance of checking data across all questions, although time and cost constraints mean that comprehensively scanning the data for such responses is not always feasible, and some of these participants may be overlooked.
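To illustrate, a screening consistency check of this kind can be sketched in a few lines of Python. The question names, answer wording and matching rule below are hypothetical examples for illustration, not taken from any actual survey instrument:

```python
# Hypothetical sketch: cross-checking a screening answer against a later
# free-text response. The field names ("gambled_last_year",
# "gambling_frequency") are illustrative assumptions.

def flag_inconsistent(responses):
    """Return IDs of participants whose screening answer ('yes, I have
    gambled in the last year') contradicts a later free-text response."""
    flagged = []
    for r in responses:
        screened_in = r.get("gambled_last_year") == "yes"
        says_never = "not gambled" in r.get("gambling_frequency", "").lower()
        if screened_in and says_never:
            flagged.append(r["id"])
    return flagged

sample = [
    {"id": "p1", "gambled_last_year": "yes",
     "gambling_frequency": "I bet on sports most weekends"},
    {"id": "p2", "gambled_last_year": "yes",
     "gambling_frequency": "I have not gambled at all this year"},
]
print(flag_inconsistent(sample))  # flags p2 for manual review
```

A simple keyword rule like this will of course miss many inconsistencies; it is only a first pass that narrows down which responses to read closely.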
Ensuring that there are strategies to create agency and engage participants in the research
One of the benefits of online qualitative surveys compared to traditional quantitative surveys is the scope for participants to explain their answers and to disagree with the research team’s position. One indication that participants feel able to do this comes in their responses when asked for any additional comments at the end of the survey. For example, in a survey about women’s attitudes towards alcohol marketing, one participant concluded the survey by writing: ‘I think you have covered everything. I think that you need to stop shaming women for having fun’. Other participants demonstrate their engagement and interest in the survey by reaffirming the perspectives they have shared throughout the survey. For example, in a study with young people on climate, one participant responded at the end that ‘it’s one of the few things I actually care about’, while another commented on the quality of the survey questions, stating, ‘I think this survey did a great job with probing questions to prompt all the thoughts I have on it’.
We also think that online qualitative surveys may lead to less social desirability bias in participants’ responses. Participants seem less wary about communicating politically incorrect opinions than they might be in a face-to-face interview. For example, at times, participants communicate attitudes that may not align with public health values (e.g. supporting personal responsibility, anti-nanny-state, and neoliberal ideologies of health and wellbeing) that we rarely see communicated in in-depth interview or focus group studies. We would argue that these perspectives are valuable for public health researchers because they capture a different community voice that may not otherwise be represented in research. This may show where there is a lack of support for health interventions and policy reforms and may indicate where further awareness-raising needs to occur. These types of responses also contribute to reflexive practice by challenging our assumptions and positions about how we think people should think or feel about responses to particular public health issues. Examples of such responses from our surveys include:
"Like I have already said, if you try to hide it you will only make it more attractive. This nanny-state attitude of the elite drives me crazy. People must be allowed to decide for themselves."
Ethical issues for participants and researchers
Researchers should also be aware that some of the ethical issues associated with online qualitative surveys may be different from those in in-depth interviews—and it is important that these are explained in any ethical consideration of the study. Providing a clear and simply worded Plain Language Statement (in written or video form) is important in establishing informed consent and willingness to participate. While participants are given information about who to contact if they have further questions about the study, this may be an extra step for participants, and they may not feel as able to ask for clarification about the study. Because of this, we try to provide multiple examples of the types of questions that we will ask, as well as providing downloadable support details (for example, for mental health support lines). A positive aspect of surveys is that participants are able to easily ignore recruitment notices to participate in the study. They are also able to stop the survey at any time by exiting out of the browser if they feel discomfort without having to give a reason in person to a researcher.
While the anonymous nature of the survey may be empowering for some participants (Braun and Clarke, 2013; Terry and Braun, 2017; Braun et al., 2021), it can also make it difficult for researchers to ascertain whether people need any further support after completing the survey. Participants may also fill in surveys with someone else present and may be influenced in how they respond to questions (with the exception of some studies in which people may require assistance from someone to type their responses). Because of this, some researchers, ethics committees and funders may be more cautious about using these studies for highly sensitive subjects. However, we would argue that the important point is that the studies follow ethical principles and take the lack of direct contact with participants into account in their ethical considerations. It is also important to ensure that the platforms used to collect survey data are trusted and secure. Here, we would argue that universities have an obligation to investigate and, where possible, approve survey providers to ensure that researchers are using platforms that meet rigorous standards for data and privacy.
It is also important to note that there may be responses from participants that may be challenging ( Terry and Braun, 2017 ; Braun and Clarke, 2021 ). Online spaces are rife with trolling due to their anonymous nature, and online surveys are not immune to this behaviour. Naturally, this leads to some silly responses—‘ Deakin University is responsible for all of this ’, but researchers should also be aware that the anonymity of surveys can (although in our experience not often) lead to responses that may cause discomfort for the researchers. For example, when asked if participants had anything else to add to a climate survey ( Arnot et al ., 2024c ), one responded ‘ nope, but you sure asked a lot of dumbass questions’ . Just as with interview-based studies, there must be processes built into the research for debriefing—particularly for students and early career researchers—as well as clear decisions about whether to include or exclude these types of responses when preparing the dataset for analysis and in writing up the results from the survey.
The importance of piloting the survey
Because of the lack of ability to explain and clarify concepts, piloting is particularly important (Braun and Clarke, 2013; Terry and Braun, 2017; Braun et al., 2021) to ensure that: (i) the technical aspects of the survey work as intended; (ii) the survey is eliciting quality responses (with limited ‘nonsensical’ responses such as random characters); (iii) the survey responses indicate comprehension of the survey questions; and (iv) there is not a substantial number of people who drop out of the study. Typically, we pilot our survey with 10% of the intended sample size. After piloting, we often change question wording (particularly to address questions that elicit very short text responses), adjust the length of the survey and sometimes refine definitions or language to improve comprehension. Researchers should remember that changes to the survey questions may need to be reviewed by ethics committees before launching the full survey. It is important to build in time for piloting and revision of the survey to ensure you get this right, as once you launch the full survey, there is no going back!
Preparing the dataset
Once the full survey is launched, the quality of data and types of responses you receive in these types of surveys can vary. There is very limited transparency in published papers around how datasets are prepared (more familiar to some as ‘data cleaning’), including the decisions about which (if any) participants (or indeed responses) were excluded from the dataset and why. Nonsensical responses can be common and can take a range of forms (Figure 3). These can include random numbers or letters, a chunk of text that has been copied and pasted from elsewhere, predictive text or even repeated emojis. In one study, we had a participant quote the script of The Bee Movie in response to questions.
Figure 3: Visual examples of nonsensical responses in online qualitative surveys.
Part of our familiarization with the dataset [Phase One in Braun and Clarke’s reflexive approach to thematic analysis ( Braun and Clarke, 2013 ; Braun et al ., 2021 )] includes preparing the dataset for analysis. We use this phase to help make decisions about what to include and exclude from the final dataset. While a row of emojis in the data file can easily be spotted and removed from the dataset, sometimes responses can look robust until you read, become familiar and engage with the data. For example, when asked about what they thought about collective climate action ( Arnot et al ., 2023a , 2024c ), some participants entered random yet related terms such as ‘ plastic ’, or repeated similar phrases across multiple questions:
“ why do we need paper straws ”, “ paper straws are terrible ”, “ papers straws are bad for you ”, “ paper straws are gross .”
Participants can also provide comprehensive answers for the first few questions and then nonsensical responses for the rest, which may be due to question fatigue [(Braun and Clarke, 2013), p. 138]. Therefore, it is important to go closely through each participant’s responses to ensure they have attempted to provide bona fide answers. For example, in one of our young people and climate surveys (Arnot et al., 2023a, 2024c), one participant responded genuinely to the first half of the survey before the quality of their responses dropped dramatically:
“I can’t even be bothered to read that question ”, “ why so many questions ”, “ bro too many sections. ”
Some market research panel providers may complete an initial quality screen of data. However, this does not replace the need for the research team’s own data preparation processes. Researchers should ensure they are checking that responses are coherent—for example, that they do not give information that is contradictory or not credible. In our more recent studies, we have increasingly seen responses cut and pasted from ChatGPT and other AI tools—providing a new challenge in assessing the quality of responses. If you are seeing these types of responses, it might be an opportunity to think about the style and suitability of the questions being asked. For example, the use of AI tools might suggest that people are finding it difficult to answer questions or feel that they have to present a ‘correct’ answer. We would also note that because of the volume of data in these surveys, the preparation of data involves multiple members of the team. In many cases, decisions need to be made about participants who may not have provided authentic responses across the survey, and the research team should make clear in any paper their decisions about whether to include or exclude participants from the study. This is a careful balancing act, which can require assessing the quality of a participant’s responses across the whole dataset to determine whether they contribute to the research.
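As a rough illustration of this kind of data preparation, the following Python sketch flags participants whose free-text answers warrant a closer manual read. The heuristics and thresholds are illustrative assumptions only, and are no substitute for reading and engaging with the data:

```python
# Illustrative sketch (not the authors' actual procedure): simple
# heuristics for flagging a participant's responses for manual review.
# Thresholds (min_length, max_repeats) are arbitrary assumptions.

def flag_low_quality(answers, min_length=5, max_repeats=3):
    """answers: list of one participant's free-text responses.
    Returns a list of reasons the participant should be hand-checked."""
    reasons = []
    # Very short answers (e.g. single characters) often signal low effort.
    if any(len(a.strip()) < min_length for a in answers):
        reasons.append("very short answer")
    # Answers with no letters at all (random numbers, emoji runs).
    if any(not any(ch.isalpha() for ch in a) for a in answers):
        reasons.append("no alphabetic characters")
    # The same text pasted across many questions.
    texts = [a.strip().lower() for a in answers]
    if any(texts.count(t) > max_repeats for t in set(texts)):
        reasons.append("same text repeated across questions")
    return reasons

participant = ["paper straws are terrible"] * 5 + ["12345"]
print(flag_low_quality(participant))
```

Any participant the function flags would still be read in full before a decision to exclude, consistent with the balancing act described above.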
Navigating the volume of data and writing up results
Finally, discussions about how to navigate the volume of data that these types of studies produce could be a standalone paper. In general, principles of reflexive practices apply to the analysis of data from these studies. However, as a starting point, here are a few considerations when approaching these datasets.
We would argue that online qualitative surveys lend themselves to some analytical approaches over others—for example, reflexive thematic analysis rather than grounded theory or interpretive phenomenological analysis (though the method can be used with these approaches) (Braun and Clarke, 2013; Terry and Braun, 2017).
While initial familiarization, coding and analysis can focus on specific questions and associated responses, it is important to analyse the dataset as a whole (or as clusters associated with particular topics) as participants may provide relevant data to a topic under multiple questions ( Terry and Braun, 2017 ). We initially focus our coding on specific questions or a group of survey questions under a topic of investigation. Once we have developed and constructed preliminary themes from the data associated with these clusters of questions, we then move to looking at responses across the dataset as we review themes further.
Researchers should think carefully about how to manage the data—which may not be available as ‘individual participant transcripts’ but rather as a ‘whole’ dataset in an Excel spreadsheet. Some may prefer qualitative data analysis software (QDAS) to manage and navigate data. However, many of us find that Excel (and particularly the use of labelled Tabs) is useful in grouping data and moving from codes to constructing themes.
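For those who prefer a scriptable alternative to spreadsheet tabs, grouping a flat export (one row per participant, one column per question) into topic clusters for coding can be sketched as follows. The question IDs, topic clusters and responses here are hypothetical:

```python
# Minimal sketch of reshaping a "whole dataset" export into per-topic
# groupings for coding. Question IDs and clusters are hypothetical.
from collections import defaultdict

rows = [
    {"id": "p1", "q1_ads": "Ads are everywhere on TikTok",
     "q2_ads": "Influencers push betting apps", "q3_policy": "Ban them"},
    {"id": "p2", "q1_ads": "I mostly see them during sport",
     "q2_ads": "The odds are in every broadcast", "q3_policy": "Tax them"},
]

# Map each topic of investigation to its cluster of survey questions.
clusters = {"advertising": ["q1_ads", "q2_ads"], "policy": ["q3_policy"]}

by_cluster = defaultdict(list)
for row in rows:
    for topic, questions in clusters.items():
        for q in questions:
            # Keep the participant ID with each excerpt so responses can
            # later be traced across the whole dataset when reviewing themes.
            by_cluster[topic].append((row["id"], q, row[q]))

print(len(by_cluster["advertising"]))  # 4 responses in the advertising cluster
```

Keeping the participant ID alongside each excerpt supports the move described above from coding within question clusters to reviewing themes across the whole dataset.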
As with all rigorous qualitative research, coding and theme development should be guided by the research questions. A clear record of decision-making about analytical choices (and being reflexive about these) should be kept. In any write-up, we would recommend that researchers are clear about which survey questions they used in the analysis [researchers could consider providing a supplementary file of some or all of the survey questions—see, for example Hennessy and O’Donoghue (2024) ].
In writing up the results, researchers should still seek to present a rich description of the data, as demonstrated in the presentation of results in the following papers ( Marko et al ., 2022a , 2022b ; McCarthy et al ., 2023 ; Pitt et al ., 2023 ; Hennessy and O’Donoghue, 2024 ). We have found the use of tables with additional examples of quotes as they relate to themes and subthemes can be a practical way of providing the reader with further examples of the data, particularly when constrained by journal word count limits [see, for example, Table 2 in Arnot et al ., (2024c) ]. However, these tables do not replace a full and complete presentation of the interpretation of the data.
This article offers methodological reflections and practical guidance around online qualitative survey design, implementation and analysis. While online qualitative surveys engage participants in a different type of conversation, they have design features that enable the collection of rich data. We recognize that we have much to learn and that while no survey of ours has been perfect, each new experience with developing and conducting online qualitative surveys has brought new understandings and lessons for future studies. In recognizing that we are learning, we also feel that our experience to date could be valuable for progressing the conversation about the rigour of online qualitative surveys and maximizing this method for public health gains.
H.P. is funded through a VicHealth Postdoctoral Research Fellowship. S.M. is funded through a Deakin University Faculty of Health Deans Postdoctoral Fellowship. G.A. is funded by an Australian Government Research Training Program Scholarship. M.H. is funded through an Irish Research Council Government of Ireland Postdoctoral Fellowship Award [GOIPD/2023/1168].
The pregnancy loss study was funded by the Irish Research Council through its New Foundations Awards and in partnership with the Irish Hospice Foundation as civil society partner [NF/2021/27123063].
S.T. is Editor in Chief of Health Promotion International, H.P. is a member of the Editorial Board of Health Promotion International, S.M. and G.A. are Social Media Coordinators for Health Promotion International, M.H. is an Associate Editor for Health Promotion International. They were not involved in the review process or in any decision-making on the manuscript.
The data used in this study are not available.
Ethical approvals for studies conducted by Deakin University include the climate crisis (HEAG-H 55_2020, HEAG-H 162_2021); parents’ perceptions of harmful industries on young people (HEAG-H 158_2022); women and alcohol marketing (HEAG-H 123_2022) and gambling (HEAG 227_2020).
Arnot, G., Pitt, H., McCarthy, S., Cordedda, C., Marko, S. and Thomas, S. L. (2024a) Australian youth perspectives on the role of social media in climate action. Australian and New Zealand Journal of Public Health, 48, 100111.
Arnot, G., Pitt, H., McCarthy, S., Cordedda, C., Marko, S. and Thomas, S. L. (2024b) Australian youth perspectives on the role of social media in climate action. Australian and New Zealand Journal of Public Health, 48, 100111.
Arnot, G., Thomas, S., Pitt, H. and Warner, E. (2023a) Australian young people’s perceptions of the commercial determinants of the climate crisis. Health Promotion International, 38, daad058.
Arnot, G., Thomas, S., Pitt, H. and Warner, E. (2023b) ‘It shows we are serious’: young people in Australia discuss climate justice protests as a mechanism for climate change advocacy and action. Australian and New Zealand Journal of Public Health, 47, 100048.
Arnot, G., Thomas, S., Pitt, H. and Warner, E. (2024c) Australian young people’s perspectives about the political determinants of the climate crisis. Health Promotion Journal of Australia, 35, 196–206.
Braun, V. and Clarke, V. (2013) Successful Qualitative Research: A Practical Guide for Beginners. Sage, London.
Braun, V. and Clarke, V. (2021a) One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qualitative Research in Psychology, 18, 328–352.
Braun, V. and Clarke, V. (2021b) To saturate or not to saturate? Questioning data saturation as a useful concept for thematic analysis and sample-size rationales. Qualitative Research in Sport, Exercise and Health, 13, 201–216.
Braun, V. and Clarke, V. (2021c) Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern-based qualitative analytic approaches. Counselling and Psychotherapy Research, 21, 37–47.
Braun, V. and Clarke, V. (2024) A critical review of the reporting of reflexive thematic analysis in Health Promotion International. Health Promotion International, 39, daae049.
Braun, V., Clarke, V., Boulton, E., Davey, L. and McEvoy, C. (2021) The online survey as a qualitative research tool. International Journal of Social Research Methodology, 24, 641–654.
Chen, J. (2023) Digitally dispersed, remotely engaged: interrogating participation in virtual photovoice. Qualitative Research, 23, 1535–1555.
Chowdhury, M. F. (2015) Coding, sorting and sifting of qualitative data analysis: debates and discussion. Quality & Quantity, 49, 1135–1143.
Collins, C. S. and Stockton, C. M. (2018) The central role of theory in qualitative research. International Journal of Qualitative Methods, 17, 160940691879747.
de Villiers, C., Farooq, M. B. and Molinari, M. (2022) Qualitative research interviews using online video technology—challenges and opportunities. Meditari Accountancy Research, 30, 1764–1782.
Edwards, R. and Holland, J. (2020) Reviewing challenges and the future for qualitative interviewing. International Journal of Social Research Methodology, 23, 581–592.
Elliott, R., Fischer, C. T. and Rennie, D. L. (1999) Evolving guidelines for publication of qualitative research studies in psychology and related fields. British Journal of Clinical Psychology, 38, 215–229.
Fan, H., Li, B., Pasaribu, T. and Chowdhury, R. (2024) Online interviews as new methodological normalcy and a space of ethics: an autoethnographic investigation into Covid-19 educational research. Qualitative Inquiry, 30, 333–344.
Finlay, L. (2021) Thematic analysis: the ‘good’, the ‘bad’ and the ‘ugly’. European Journal for Qualitative Research in Psychotherapy, 11, 103–116.
Glaw, X., Inder, K., Kable, A. and Hazelton, M. (2017) Visual methodologies in qualitative research: autophotography and photo elicitation applied to mental health research. International Journal of Qualitative Methods, 16, 160940691774821.
Hennessy, M. and O’Donoghue, K. (2024) Bridging the gap between pregnancy loss research and policy and practice: insights from a qualitative survey with knowledge users. Health Research Policy and Systems, 22, 15.
Hensen, B., Mackworth-Young, C. R. S., Simwinga, M., Abdelmagid, N., Banda, J., Mavodza, C. et al. (2021) Remote data collection for public health research in a COVID-19 era: ethical implications, challenges and opportunities. Health Policy and Planning, 36, 360–368.
Jamie, K. and Rathbone, A. P. (2022) Using theory and reflexivity to preserve methodological rigour of data collection in qualitative research. Research Methods in Medicine & Health Sciences, 3, 11–21.
Kennedy, M., Maddox, R., Booth, K., Maidment, S., Chamberlain, C. and Bessarab, D. (2022) Decolonising qualitative research with respectful, reciprocal, and responsible research practice: a narrative review of the application of Yarning method in qualitative Aboriginal and Torres Strait Islander health research. International Journal for Equity in Health, 21, 134.
Levi, R., Ridberg, R., Akers, M. and Seligman, H. (2021) Survey fraud and the integrity of web-based survey research. American Journal of Health Promotion, 36, 18–20.
Malterud, K., Siersma, V. D. and Guassora, A. D. (2016) Sample size in qualitative interview studies: guided by information power. Qualitative Health Research, 26, 1753–1760.
Marko, S., Thomas, S., Pitt, H. and Daube, M. (2022a) ‘Aussies love a bet’: gamblers discuss the social acceptance and cultural accommodation of gambling in Australia. Australian and New Zealand Journal of Public Health, 46, 829–834.
Marko, S., Thomas, S. L., Robinson, K. and Daube, M. (2022b) Gamblers’ perceptions of responsibility for gambling harm: a critical qualitative inquiry. BMC Public Health, 22, 725.
McCarthy, S., Thomas, S. L., Pitt, H., Warner, E., Roderique-Davies, G., Rintoul, A. et al. (2023) ‘They loved gambling more than me’. Women’s experiences of gambling-related harm as an affected other. Health Promotion Journal of Australia, 34, 284–293.
Pitt, H., McCarthy, S., Keric, D., Arnot, G., Marko, S., Martino, F. et al. (2023) The symbolic consumption processes associated with ‘low-calorie’ and ‘low-sugar’ alcohol products and Australian women. Health Promotion International, 38, 1–13.
Reed, M. S., Merkle, B. G., Cook, E. J., Hafferty, C., Hejnowicz, A. P., Holliman, R. et al. (2024) Reimagining the language of engagement in a post-stakeholder world. Sustainability Science.
Terry, G. and Braun, V. (2017) Short but often sweet: the surprising potential of qualitative survey methods. In Braun, V., Clarke, V. and Gray, D. (eds), Collecting Qualitative Data: A Practical Guide to Textual, Media and Virtual Techniques. Cambridge University Press, Cambridge.
Toerien, M. and Wilkinson, S. (2004) Exploring the depilation norm: a qualitative questionnaire study of women’s body hair removal. Qualitative Research in Psychology, 1, 69–92.
Varpio, L. and Ellaway, R. H. (2021) Shaping our worldviews: a conversation about and of theory. Advances in Health Sciences Education: Theory and Practice, 26, 339–345.
Wang, J., Calderon, G., Hager, E. R., Edwards, L. V., Berry, A. A., Liu, Y. et al. (2023) Identifying and preventing fraudulent responses in online public health surveys: lessons learned during the COVID-19 pandemic. PLOS Global Public Health, 3, e0001452.
Retention of Nursing Assistants in Assisted Living Using Stay Interviews: A Qualitative Case Study
109 pages. Posted: 27 June 2024
Southeastern University
Date Written: October 07, 2021
The anticipated impact of an aging population on the future availability of nursing staff is a significant challenge for senior living communities around the world. Poor retention of nursing staff has driven managers to hire quickly, denying residents of long-term care communities the care they need. This study used an exploratory, qualitative case study design to learn what motivates resident care attendants (RCAs) to stay employed in assisted living communities. Richard Finnegan's five stay interview questions were used to interview nine RCAs from various assisted living communities in Louisiana. The interview data were coded, evaluated, and examined using thematic analysis. The results showed that the employees enjoy working with seniors, admire their leaders and coworkers, prioritize self-care, and appreciate opportunities for professional development. The findings suggest that innovative retention strategies are required to retain RCAs in the senior living industry. As more job choices and career opportunities emerge for people working in health care, senior community administrators need to re-evaluate best practices for retaining and attracting a quality nursing staff.
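The abstract above describes coding interview data and examining it with thematic analysis. As a minimal, purely illustrative sketch (the participant IDs, codes, and codebook below are invented for illustration, not the study's actual data or procedure), coded interview excerpts can be grouped under candidate themes and tallied by how many participants contributed to each:

```python
from collections import defaultdict

# Hypothetical (participant_id, code) pairs produced during open coding
# of interview transcripts. All names and codes here are invented.
coded_excerpts = [
    ("RCA1", "enjoys working with seniors"),
    ("RCA1", "admires leaders"),
    ("RCA2", "enjoys working with seniors"),
    ("RCA2", "professional development"),
    ("RCA3", "prioritises self-care"),
    ("RCA3", "enjoys working with seniors"),
]

# A simple hypothetical codebook mapping low-level codes to candidate themes.
codebook = {
    "enjoys working with seniors": "meaningful work",
    "admires leaders": "workplace relationships",
    "professional development": "growth opportunities",
    "prioritises self-care": "wellbeing",
}

def group_by_theme(excerpts, codebook):
    """Group coded excerpts under themes and count how many distinct
    participants contributed to each theme."""
    theme_participants = defaultdict(set)
    for participant, code in excerpts:
        theme = codebook.get(code, "uncategorised")
        theme_participants[theme].add(participant)
    return {theme: len(people) for theme, people in theme_participants.items()}

print(group_by_theme(coded_excerpts, codebook))
```

Counting distinct participants per theme (rather than raw excerpt counts) is one common way of gauging how widely shared a theme is across a small sample; real thematic analysis, of course, involves interpretive work that no tally can replace.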
Rapid aging and increasing care demands among the elderly population present challenges to China's health and social care system. The concept of aging in place has prompted the implementation of integrated community care (ICC) in the country. This study provides empirical insights into the practice of integrated care policies and approaches at the community level. Data were collected through six months of participatory observation at a local community health service center in a southern Chinese city. Semi-structured interviews were conducted with the multidisciplinary community care team to gather frontline formal caregivers' perceptions of ICC and to better understand the obstacles and opportunities involved. Qualitative analysis revealed four themes: the ICC delivery model and development strategies within the community care scheme, the person-centered guiding principle, and the challenges and struggles encountered by formal caregivers within China's current ICC system. The case study serves as a notable example of the pivotal role of primary care in the successful implementation of elderly care within a community setting. A private organization-led approach to medico-social integrated care in the community holds significant potential as a service delivery model for addressing a wide range of elderly care issues.
Keywords: aging in place; elderly care service; healthy ageing; integrated community care.
Employees who use AI as a core part of their jobs report feeling more isolated, drinking more, and sleeping less than employees who don’t.
The promise of AI is alluring: optimized productivity, lightning-fast data analysis, and freedom from mundane tasks. Companies and workers alike are fascinated (and more than a little dumbfounded) by how these tools allow them to do more and better work faster than ever before. Yet in the fervor to keep pace with competitors and reap the efficiency gains of deploying AI, many organizations have lost sight of their most important asset: the humans whose jobs are being fragmented into tasks that are increasingly automated. Across four studies, employees who used AI as a core part of their jobs reported feeling lonelier, drinking more, and suffering from insomnia more than employees who didn't.
Imagine this: Jia, a marketing analyst, arrives at work, logs into her computer, and is greeted by an AI assistant that has already sorted through her emails, prioritized her tasks for the day, and generated first drafts of reports that used to take hours to write. Jia, like everyone who has spent time working with these tools, marvels at how much time she can save by using AI. Inspired by its efficiency-enhancing effects, Jia feels she can be far more productive than before, and she focuses on completing as many tasks as possible in conjunction with her AI assistant.
The case, involving a supplement intended to reduce cholesterol, has drawn attention to how companies are allowed to self-report claims about their products.
By River Akira Davis and Hisako Ueno
Reporting from Tokyo
A Japanese pharmaceutical company is investigating 80 deaths possibly linked to a yeast-containing supplement it sells in Japan, the country's health ministry said Friday, a shocking increase from an earlier disclosure that has focused attention on how supplements are regulated.
The company, Kobayashi Pharmaceutical, in March had reported five deaths potentially linked to its CholesteHelp rice and red-yeast pills. Japanese government health officials said the supplement, which claimed to help reduce cholesterol, contained puberulic acid, a highly toxic compound that is a product of mold.
In response to the sudden surge in reported deaths, Health Minister Keizo Takemi said it was “extremely regrettable” that Kobayashi Pharmaceutical had not updated the ministry sooner. The company, which is based in Osaka, had not provided new information on deaths potentially linked to CholesteHelp since March.
Since then, Kobayashi Pharmaceutical has received reports that 1,656 people sought medical advice for CholesteHelp-related health concerns, and 289 people have been hospitalized, the company reported. CholesteHelp has been recalled in Japan and China, the only countries the supplement was sold in, according to a spokeswoman for Kobayashi Pharmaceutical.
Mr. Takemi said the government would step in to take a more active role in investigating, after allowing the company to self-report its findings. “We cannot leave Kobayashi Pharmaceutical alone to handle it anymore,” he said.
Kobayashi Pharmaceutical was founded in 1919. While it is not one of Japan’s top pharmaceutical companies, it produces a variety of supplements and health products such as hand warmers and air fresheners, some of which are sold in the United States and elsewhere in Asia.
Quality-control guidelines related to supplements and other products making health claims were established in Japan in 2015. Those regulations are perceived to be less stringent than Japan’s rules governing prescription medications. Companies are typically responsible for self-reporting compliance rather than undergoing state screenings.
In the United States, where the dietary supplement market is booming, organizations like the American Medical Association have urged the Food and Drug Administration to introduce stricter rules to ensure supplement safety. Dietary supplements marketed for weight loss and muscle building have been linked with a number of deaths in the United States.
At a news conference in March when the potentially CholesteHelp-related deaths were first disclosed, the president of Kobayashi Pharmaceutical, Akihiro Kobayashi, apologized for not providing information sooner and said he had “no words.”
River Akira Davis covers Japan, including its economy and businesses, and is based in Tokyo. More about River Akira Davis
Hisako Ueno is a reporter and researcher based in Tokyo, writing on Japanese politics, business, labor, gender and culture. More about Hisako Ueno