Directing Collaborative Research Practice in a Global Arena

Research study presented at the 2014 Midwestern Regional Conference of the Comparative and International Education Society (MCIES). Please join us as we discuss how researchers are using collaborative and qualitative technologies to shape their research processes.

For additional questions, please email me at nsabir@indiana.edu

Abstract:

Information technologies have rapidly reshaped the scope of international education research practices, as contemporary software developments allow for global collaborative research projects. This study evaluates value-based judgments about technologies considered relevant to the educational research process. To explore the role of these digital tools, bounded within the context of researcher development and ongoing qualitative research projects, this participatory action research study examines the reflexive journaling of ten doctoral students across a 14-week period. The shift in international education research paradigms and the constant updating of expectations in practice prompt a need to better understand which technological resources are valued. The thematic journal analysis revealed a call for: (1) open educational resources across information management systems, used in the literature review and theory-building stages; (2) resources that promote accessible, collaborative, and transparent information collection and analysis; and (3) technologies that integrate reflective practices and commentary throughout the entire research process. Digital tools that reflected these characteristics were highly valued in conducting collaborative research in a global arena. The findings from this study provide groundwork for international education researchers to successfully navigate the technological logistics of conducting collaborative research endeavors.

 

Suggested citation:

Sabir, N. (2014, October). Directing collaborative research practice in a global arena. Presentation at the 2014 Midwestern Regional Conference of the Comparative and International Education Society, Bloomington, IN.

Importance of Teamwork in Mixed-Methods Research Projects

With the implementation of survey instruments there is little movement in quantitative data and minimal opportunity for varying interpretations of responses or question items (Bryman & Burgess, 1994). With qualitative instruments integrated into multimethod studies, the case is not as clear-cut, and difficulties may therefore arise in interpreting and evaluating qualitative data (Manderson, Kelaher, & Woelz-Stirling, 2001). Hence, during constant data collection phases, the management of information can become problematic when data are qualitative, collected by more than one researcher, and intended for multiple users (Bryman & Burgess, 1994).

When two researchers working on individual projects combine data within a single research endeavor, teamwork becomes crucial to the success of data analysis. Teamwork paired with reflexivity leads to improved productivity, effectiveness, and more robust research, and overall higher quality (Barry et al., 1999). At the qualitative stage specifically, West (1994) reports that teamwork enhances the rigor of the methodological design, analysis, and interpretive elements of a research project.

Additionally, teams can foster deeper conversations and higher levels of conceptual thinking than researchers working alone, thereby enriching the coding and analysis process at each stage (Barry et al., 1999). Benefits include: integrating differing perspectives and greater ease in identifying bias (Liggett et al., 1994); better standardization of coding and improved accuracy in theme creation and application (Delaney & Ames, 1993); and advancing the overall analyses to a higher level of abstraction (Olesen, Droes, Hatton, Chico, & Schatzman, 1994). In an effort to make the data analysis process more rigorous and reduce personal bias, teamwork is crucial to the multiphase research model.

During the analysis phase of both the quantitative data and the qualitative information, the team aspect is crucial to the development of coding schemes and interpretations of the information. The multidisciplinary discussions will frame the two main analysis phases, sharpening the researchers' attention to themes they might not have individually considered.

 

References

Barry, C. A., Britten, N., Barber, N., Bradley, C., & Stevenson, F. (1999). Using reflexivity to optimize teamwork in qualitative research. Qualitative Health Research, 9(1), 26-44.

Bryman, A., & Burgess, R. G. (Eds.). (1994). Analyzing qualitative data. New York, NY: Routledge.

Delaney, W., & Ames, G. (1993). Integration and exchange in multidisciplinary alcohol research. Social Science and Medicine, 37, 5-13.

Friedman, T. (2005). The world is flat. New York, NY: Farrar, Straus & Giroux.

Liggett, A. M., Glesne, C. E., Johnston, A. P., Hasazi, B., & Schattman, R. A. (1994). Teaming in qualitative research: Lessons learned. Qualitative Studies in Education, 7, 77-88.

Manderson, L., Kelaher, M., & Woelz-Stirling, N. (2001). Developing qualitative databases for multiple users. Qualitative Health Research, 11(2), 149-160.

Olesen, V., Droes, N., Hatton, D., Chico, N., & Schatzman, L. (1994). Analyzing together: Recollections of a team approach. In A. Bryman & R. G. Burgess (Eds.), Analyzing qualitative data (pp. 111-128). London, UK: Routledge.

West, M. A. (1994). Effective teamwork. Leicester, UK: BPS Books.

 

IRB approval & cloud storage

The Institutional Review Board (IRB) approval process drives even the best faculty and researchers a little batty. The entire process can be confusing, convoluted, and inconvenient, depending on the institution and potential partners. The common pushback against IRB approval processes is grounded in the belief that it is the researchers, not the members of the IRB, who hold the specialized experience and knowledge required to make final decisions (Howe & Dougherty, 1993).

Our university recently underwent changes to move the document creation and submission process online. While the green effort was noble in attempting to streamline the application process and digitize documentation, the constant shifts and recent changes proved to further irk faculty. In fact, one of my personal mentors was so troubled by the system that we called our local IRB office and refused to let them hang up until we had completed the application process.

While annoying, it is important to remember where the IRB process stemmed from: Recall that only a few decades ago we were leaving people's syphilis untreated in the name of science (Gjestland, 1954) and convincing them that they were responsible for electrocuting individuals (Milgram, 1963). Oh, how we have progressed in standardizing research methodology!

While irksome, the IRB is in place to ensure that all participants are protected and appropriately informed of their rights. Over the past two decades IRBs have transformed the conduct of research endeavors involving human subjects. Researchers are no longer able to implement research projects without weighing the risks they are asking participants to assume (Edgar & Rothman, 1995). Because people hold a special status, even when their participation in research projects is a “means to a higher end, they also deserve to be treated with a particular kind of moral regard or dignity” (Pritchard, 2002).

IRBs have often expressed concern about information exchanged, stored, and analyzed via cloud storage (Carrell, 2011; Kumbhare, Simmhan, & Prasanna, 2011). In fact, in a recent application for IRB approval our research team received considerable pushback when proposing to host research files on a cloud-based server. Through conversations with IRB members, our research team was able to reach a compromise and ensure secure storage of data.
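For teams facing similar pushback, one compromise worth proposing is client-side encryption: files are encrypted locally before they ever reach the cloud, so the provider only ever holds ciphertext. Below is a minimal sketch of that idea in Python, assuming the third-party cryptography package (pip install cryptography); the file name and key handling are illustrative, not a description of our actual IRB agreement.

```python
# Hypothetical sketch: encrypt research files locally before uploading,
# so only ciphertext ever reaches the cloud provider.
# Assumes the third-party "cryptography" package (pip install cryptography).
from pathlib import Path

from cryptography.fernet import Fernet


def encrypt_file(path: Path, key: bytes) -> Path:
    """Write an encrypted copy of `path` and return the new file's location."""
    cipher = Fernet(key)
    ciphertext = cipher.encrypt(path.read_bytes())
    out = path.with_name(path.name + ".enc")
    out.write_bytes(ciphertext)
    return out


if __name__ == "__main__":
    # Keep the key OFF the cloud (e.g., on an encrypted local drive).
    key = Fernet.generate_key()
    encrypted = encrypt_file(Path("interview_transcript_01.txt"), key)  # illustrative file name
    print(f"Upload {encrypted} to the cloud; keep the key and plaintext local.")
```

Decryption is symmetric (cipher.decrypt with the same key), so collaborators only need the shared key, never the cloud provider's cooperation.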

References:

Carrell, D. (2011, January). A strategy for deploying secure cloud-based natural language processing systems for applied research involving clinical text. In Proceedings of the 44th Hawaii International Conference on System Sciences (HICSS) (pp. 1-11). IEEE.

Edgar, H., & Rothman, D. J. (1995). The institutional review board and beyond: Future challenges to the ethics of human experimentation. Milbank Quarterly, 73(4), 489-506.

Gjestland, T. (1954). The Oslo study of untreated syphilis: An epidemiologic investigation of the natural course of the syphilitic infection based upon a re-study of the Boeck-Bruusgaard material. Acta Dermato-Venereologica, 35(Suppl. 34), 3-368.

Howe, K. R., & Dougherty, K. C. (1993). Ethics, institutional review boards, and the changing face of educational research. Educational Researcher, 22(9), 16-21.

Kumbhare, A. G., Simmhan, Y., & Prasanna, V. (2011, November). Designing a secure storage repository for sharing scientific datasets using public clouds. In Proceedings of the Second International Workshop on Data Intensive Computing in the Clouds (pp. 31-40). ACM.

Milgram, S. (1963). Behavioral study of obedience. The Journal of Abnormal and Social Psychology, 67(4), 371-378.

Pritchard, I. A. (2002). Travelers and trolls: Practitioner research and institutional review boards. Educational Researcher, 31(3), 3-13.

 

Keeping prosthetics squeaky clean

After much discussion I realized that mere exposure to CAQDAS tools isn't enough to remain competitive in a technocentric research environment. There is a need to stay updated on the latest, and greatest, tools out there. More than knowing the tool itself, researchers need to be able to leverage tools in a unique, efficient, and critical manner. So here is a brief list of steps anyone can take to stay current with CAQDAS.

Gain hands-on, practical skills as soon as you can! The longer you spend getting to know a software package, the easier it is to use in an actual research project, and working in the software environment progressively becomes less stressful.

Research what is out there. There are some great resources available, and sometimes a simple Google search can yield amazing results. For example, the CAQDAS Networking Project is a great networking source that houses a repository of support and tutorials. Another example is comparative research papers, like Ness' work on “Computer Assisted Qualitative Data Analysis Software (CAQDAS)”. Some big considerations are software capabilities, general assumptions, and potential licensing costs. Additionally, you can search for alternative open-source or open educational research software options that could assist you.

Beta test new versions, and actively give feedback and review products. Much of software development and evolution keeps the end user in mind. Reference management systems such as Mendeley are embedded in the social science research process, in that user feedback is actively considered when upgrading. So if you have an issue or an idea, share it! Start a conversation about the changes you wish to see. You just might be surprised to find that a developer took the time to listen to or read your thoughts.

Use any free training you have at your disposal! Universities or departments often have access to internal technology training systems built into the university structure. For example, here at IUB we have IT training sessions and online tutorials via Lynda.

If you don’t have organizational access to these kinds of resources, look into developer-created guides, walkthroughs, and videos. You would be surprised at the number of YouTube videos and forums that can help you answer queries and introduce you to new ways of leveraging the tools. Another set of great resources are blogs (e.g., the NVivo Blog) that review researcher tools, and even microblogs like Twitter.

Talk to your peers! You will be amazed at how much you can learn from your colleagues. Attend conferences, paying close attention to the types of tools and methods utilized. It is also important to note what limitations are brought up and how final results are conveyed.

Subscribe to updates and newsletters. Let the information come to you. Everyone hates having an inbox full of irrelevant emails that get tossed before they are even opened. But here’s the thing: you don’t have to read every email a developer sends; you can follow up on updates while you are searching for answers. Use newsletters as aggregators so you can read up-to-date information on your own time (see the sketch below for a scriptable take on this).
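As a concrete way to make the information come to you, most developer blogs and newsletters expose RSS feeds that a short script can skim. Here is a minimal sketch, assuming the third-party feedparser package (pip install feedparser); the feed URLs are illustrative placeholders, not verified addresses.

```python
# Hypothetical sketch: skim the newest posts from developer blogs/newsletters
# so updates come to you. Assumes the "feedparser" package (pip install feedparser).
import feedparser

# Illustrative feed URLs; substitute the real feeds you follow.
FEEDS = [
    "https://example.com/nvivo-blog/feed",
    "https://example.com/mendeley-blog/feed",
]

for url in FEEDS:
    feed = feedparser.parse(url)
    print(feed.feed.get("title", url))
    for entry in feed.entries[:3]:  # only the three newest posts per feed
        print("  -", entry.get("title", "(untitled)"))
```

Run it whenever you are actually looking for answers, rather than letting updates pile up unread in your inbox.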

CAQDAS as a researcher’s prosthetic

Much of the Konopasek (2008) article resonated with my general approach to qualitative research, and the integration of technological tools into said processes. Konopasek (2008) argues that grounded theory methodology can align with qualitative research, and with qualitative data analysis at large, when used for non-deductive research projects. In grounded theory, the investigator remains the “primary instrument of data collection and analyses assumes an inductive stance and strives to derive meaning from the data” (Merriam, 2009, p. 29).

Coming from a hard analytic background that consistently uses algorithmic software to assist with data analysis, we are used to contextualizing technological tools as simply tools that do what they are told! Any data analysis tool, and its capabilities, should be critically considered before being fully integrated into the research design. Konopasek (2008) claims that “humans, not machines, do the crucial work of coding and retrieving” (p. 2) and that qualitative data analysis is more than a “careful reading of data” (p. 3). Comparative educationalists can work with a specialized computer program in which information is manipulated. In this manner the computer program can be considered a direct extension of a researcher’s thinking.

While a computer program can provide valuable insight into data trends, programs are often limited in creativity, flexibility, and their handling of uncertainty. Issues of ambiguity, flexibility, creativity, expanded vocabulary, and ethics all need to be considered when coding data (Saldaña, 2009, p. 29). It is the duty of the researcher to manipulate knowledge to add meaning; through these manipulations researchers can “show differences and similarities, emerging patterns, [and] new contexts” (Konopasek, 2008, p. 5).

 

References

Konopasek, Z. (2008). Making things visible with ATLAS.ti: Computer assisted qualitative analysis as textual practices. Forum: Qualitative Social Research, 9(2).

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. (pp. 29-31). San Francisco, CA: John Wiley & Sons.

Saldaña, J. (2009). An introduction to codes and coding. In The coding manual for qualitative researchers (pp. 1-31). London: Sage Publications.

 

Issues with Online Data Collection

In recent years the use of online data sources has led to new developments, including the standardization of resources and a widening of scope, even though not all critical information is visible to the researcher. However, online conversations are constantly evolving and perceptions online are always in flux. Seymour (2001) claims that the internet releases the interview stage “from its imprisonment in time and place” (p. 158).

There are issues that cross over between virtual spaces and traditional data collection methods, stemming mainly from the effects of the medium on the data gathering step. Traditionally, qualitative researchers collect documents/artifacts, interviews, and observations. In traditional data gathering humans are responsive and adaptive, whereas in electronic media data are often constructed via computer assistance. There are now electronic forms of all these sources, where illustrations, programs, and networks serve as artifacts. Data collection differs with the nature of the medium; computer-mediated communication has a unique effect on information construction.

There are also ethical considerations to weigh. Merriam (2009) discusses four main ethical concerns with online research: obtaining informed consent; ensuring confidentiality and security; determining what is public and what is private; and debriefing participants. Many of these issues were brought up during the class, the most prominent being privacy and publicity, gaining informed consent, and issues of “reality.”

For the issue of public versus private information, the discussion centered on expectations for participants. For instance, if the researcher reflects in a public space about the participants, it might endanger their identities. Additionally, while participants often agree to having their information published as part of the study, they might be less comfortable with having their information blogged about. Personally, I think this is an especially important issue for in-service teachers who wish to engage in action research, since self-reflection is a critical piece of the action research process (Carr & Kemmis, 2003).

 

References

Carr, W., & Kemmis, S. (2003). Becoming critical: Education, knowledge and action research. London, UK: Routledge.

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation (pp. 156-163). San Francisco, CA: John Wiley & Sons.

Seymour, W. S. (2001). In the flesh or online? Exploring qualitative research methodologies. Qualitative Research, 1(2), 147-168.

 

Digital Tools (Visual Representation)

This graphic represents many of the conversations noted from class. As we discuss the potential and possible uses of digital tools in (qualitative) research, an image of the research process, and where each tool plays a part, keeps popping into my head. I finally had to create a skeleton adapted from several sources (Coffey, Holbrook & Atkinson, 1996; Davidson & di Gregorio, 2011; Gratton & O’Donnell, 2011; Matthews & Cramer, 2008; Miles & Huberman, 1994; Paulus, Lester, & Dempster, 2013) as well as in-class discussion notes.

When we talk about information and research, there are really two different types of data: researcher-generated information and participant-generated information. For example, Silverman (2011) writes that certain kinds of data, such as focus groups or interviews, could not exist without the researcher’s facilitation. So as you look over the image below, consider the types of information and how different entities can facilitate or create meaning.

When recording field notes and conversations, you can go old school with pen and paper, jazz it up with video or audio recordings, or try new tech such as audio-recording pens (e.g., LiveScribe). The abundance of smartphone applications that allow you to record audio is amazing, regardless of platform. This shift and expansion of technological tools means that “new sites of study and ways of engaging with participants” (Hine, 2008, as cited in Paulus, Lester & Dempster, 2013, p. 70) are growing ever more prominent and accessible. So not only do we have new ways of constructing meaning, but we can also gather information across greater distances (e.g., Gratton & O’Donnell, 2011; Matthews & Cramer, 2008). Through archival tools and web conferencing, new sites for study are being revealed to educational researchers.

These ideas of creating knowledge through data reminded me of a classroom discussion on revamping Miles and Huberman’s (1994) list of computer uses in qualitative research.  This idea of shifting from traditional ethnographies to netnographies, using “ethnographic approaches to study unique digital spaces” (Paulus, Lester & Dempster, 2013, p. 76), really pushed me to think about online communities, social networks, virtual worlds, and serious games as new research environments.

[Figure: Digital tools mapped across the stages of the qualitative research process]

References:

Coffey, A., Holbrook, B. & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online 1(1). Retrieved from: http://www.socresonline.org.uk/1/1/4.html.

Davidson, J. & di Gregorio, S. (2011). Qualitative research and technology: In the midst of a revolution. In Denzin & Lincoln (eds.) The Sage Handbook of Qualitative Research, 4th edition. (pp. 627-643). London, UK: Sage.

Gratton, M. & O’Donnell, S. (2011). Communication technologies for focus groups with remote communities: a case study of research with First Nations in Canada. Qualitative Research 11(2) 159-175.

Matthews, J. & Cramer, E.P. (2008). Using technology to enhance qualitative research with hidden populations. The Qualitative Report 13(2), 301-315.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. London, UK: Sage.

Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.

Silverman, D. (2011). Interpreting qualitative data: Methods for analysing talk, text, and interaction. London, UK: Sage.

Digital diaries

Blogs are becoming ever more popular among researchers and teachers. They are used as a means of archiving thoughts, starting conversations, and housing resources. In the course where I teach teachers, we talk about the importance of instructors having blogs and websites. Not only are they a source of information, but they are also valuable tools for establishing digital identities. Overall, blogs have emerged as creative spaces that allow users to carry on a “persistent conversation” (Paulus, Lester & Dempster, 2014, p. 15) and represent a multi-user environment that is not platform-restrictive.

More importantly, these digital diaries are more than simple text. They allow for hyperlinking of content and embedding of visual tools (such as videos or images), and entries are sometimes video- or audio-based. Many of the blogs kept by teachers and educational researchers are very personal expressions and accounts. They incorporate happenings, ideals, and current issues of concern. Their postings of text and visuals allow blog visitors to gain a richer illustration of the blogger’s life experiences. These interactive communicative diaries are often linked to social networks and “respond to new conditions” (Willi, Melewar, & Broderick, 2013, p. 103) of identity creation. In other words, they allow the blogger to develop a holistic online identity centered around their diary postings, pertaining to issues they deem of value.

So we talked a bit about online identities and blogs…what about the other aspects? Well, there is this notion centered on reflexivity, and using blogs as tools to streamline that process. Let’s start by first defining reflexive practices. Paulus, Lester and Dempster define this as “the process of intentionally attending to the perspectives, attitudes and beliefs that shape how you design a research study and make sense of your data” (2014, p. 13). In other words: how are you, as a person, impacting a situation? This doesn’t have to be in the realm of data collection or research design, though it is often related to those two areas.

Let’s ground this fuzzy concept with an example of action research. A new 3rd grade teacher is welcoming his class during the first week of school and notices that two of his (international) students aren’t socializing with their peers, speaking up, or participating in class, and they even refuse to make eye contact. Now, there are several ways an inexperienced teacher might reflect on these issues. He might think that something is wrong with the students: maybe they are just shy, or perhaps they are having trouble adjusting to the new classroom. He might not even stop to consider that the students come from a different school culture, and that instead of changing them he might have to change his approach.

Had this teacher kept a journal, he might be able to look over what changes positively impacted his classroom and adjust accordingly. Additionally, if he kept an online blog, perhaps his online community of teachers might have been able to offer suggestions and possible resources. And so we come back to this concept of changing our practices based on our reflections, and realizing how we impact our designs. These reflections don’t always have to be grounded in research.

So we have reflective practices, establishing digital footprints, and creating conversations.

Let me shift gears and tell you about a great way to aggregate your diary posts and notice trends, themes, and issues of importance. Oftentimes when I sit down to blog, I start writing about a topic and it takes an interesting turn…sometimes I run off on tangents. At the end of each blog entry I push all my talking points into a word cloud and use those as my tags. This allows me to self-evaluate what I am really talking about, and sometimes it enlightens me to the topics that are most prevalent in my entry. Consider using word cloud tools to thematically analyze, code even, your digital diary. There are some great websites out there, and Wordle and Tagxedo are two of the best I have found. Both are fairly simple to use and allow for a customizable experience. Simply copy and paste your text into the box and create! Wordle is simple, clean, and classic (the Wordle gallery even showcases a plethora of other examples), while Tagxedo allows for more customization. Make sure you have Microsoft Silverlight and Java up to date. If you prefer a scriptable route, see the sketch after the examples below.

Here is an example of what a word cloud of this post might look like:

Using Wordle

[Image: word cloud of this post generated with Wordle]

Using Tagxedo

[Image: word cloud of this post generated with Tagxedo]
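For those who would rather script this step than paste text into a website, here is a minimal sketch using Python’s third-party wordcloud and matplotlib packages (pip install wordcloud matplotlib); the input file name is illustrative.

```python
# Hypothetical sketch: build a word cloud from a blog post and suggest tags.
# Assumes the "wordcloud" and "matplotlib" packages.
import matplotlib.pyplot as plt
from wordcloud import WordCloud

with open("blog_post.txt", encoding="utf-8") as f:  # illustrative file name
    text = f.read()

# Common English stop words are filtered out by default.
cloud = WordCloud(width=800, height=400, background_color="white").generate(text)

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()

# The highest-frequency terms double as candidate tags for the post.
top_terms = sorted(cloud.words_, key=cloud.words_.get, reverse=True)[:10]
print("Suggested tags:", ", ".join(top_terms))
```

Unlike Wordle or Tagxedo, this route needs no Silverlight or Java, and the frequency list gives you tags you can paste straight into your blogging platform.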

References

Harricharan, M., & Bhopal, K. (2014). Using blogs in qualitative educational research: An exploration of method. International Journal of Research & Method in Education. (In press).

Paulus, T. M., Lester, J. N., & Dempster, P. G. (2014). Digital tools for qualitative research. Thousand Oaks, CA: Sage.

Willi, C. H., Melewar, T. C., & Broderick, A. J. (2013). Virtual brand-communities using blogs as communication platforms and their impact on the two-step communication process: A research agenda. The Marketing Review, 13(2), 103-123.

Looking to make the most out of your blog? Check out Bolles’ blog post on using blogs as a research tool. Not only does he write about his personal experiences, but he has also used the text-based blogging feature to transcribe his video blog.

15 years of going in circles

During a meeting today it was brought to my attention that the issues revolving around technology integration remain the same, nearly 15 years after a call for conversation. In 1999, Ertmer et al. described the role of technology in learning environments and proposed two frameworks of barriers that needed to be addressed for successful technology integration. The same can be said for the integration of technology into research methods and qualitative research project design. It is funny how authors, despite trying to have a cohesive conversation across disciplines, remain in the same place, as though the conversations haven’t shifted with the times or technological updates. It seems as though conversations and research agendas are moving in circles. All this got me thinking once more about users’ fundamental understanding of tools and what that can mean.

Conole and Dyke (2004) break down the notion of affordances in using technological tools to conduct research. This concept encompasses an ontological approach, which concerns possible uses, and epistemological approaches, which revolve around intended or actual uses.

[Image: a hammer]

Let’s take a hammer, for example: A hammer can be used for several purposes…perhaps you are using it to drive nails into a wall to hold pictures…or perhaps you are using it to weigh down your door so it won’t close. There are ideal uses you think of when someone mentions a hammer, and those ideals are constructed based on your personal familiarity with the tool…what often escapes us is the list of things one could possibly do with a hammer.

[Image: an emu]

Here’s another: Quick, think of a bird…

Can it fly? Most people think of flying birds when asked; but how does your mental model change when the first bird you think of is a penguin, kiwi, or emu?

When researchers discuss expanding the frameworks around technological tools, “rather than elaborating on how any one of these ‘affordances’ could be relevant to a learner or a practitioner, the authors tend to indulge in a certain amount of hopeful expectation that affordances and abilities will simply emerge” (Boyle & Cook, 2004, p. 297). This no longer seems to be much of a concern, and there aren’t persistent calls for more research, so perhaps the conversation has fizzled out a bit.

After reading discussions about technological affordances and best technology integration practices, I think the response remains the same: “It depends.”

References

Boyle, T., & Cook, J. (2004). Understanding and using technological affordances: A commentary on Conole and Dyke. ALT-J Research in Learning Technology, 12(3), 295-299.

Conole, G., & Dyke, M. (2004). What are the affordances of information and communication technologies? ALT-J Research in Learning Technology, 12(2), 113-124.

Ertmer, P. A., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers’ beliefs about the role of technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54-72.

Uncertainty Principle, Education (research) Style.

\*\ ^_^  /*/  <– Do a little happy dance!

 

Before I was an educationalist, I was a researcher in the sciences, and we had this idea that whatever we created and “discovered” needed to be revalidated. Basically, when we published results, any lab should have been able to replicate them. I grew as an experimentalist under the guidance and care of my faculty.

However, as I shifted gears, my paradigm and perspectives on research design didn’t move as fluidly. Traditionally I have carried a more constructionist epistemology, in which I believed that meanings and understandings were influenced by surroundings. In qualitative approaches in education, this generally means that social phenomena are engineered within social contexts. And while these phenomena may seem natural, they are, in reality, influenced in design, realized as artifacts, and constrained by the environment around them.

Think of a cell, yes, like the kind in your body. Cells change form when influenced by cold temperatures or wet environments. Sometimes if you poke at them, they bounce back, and sometimes your interactions puncture the fine membrane…and then you get to clean up the mess. Researching in educational environments is kind of the same.

As soon as you start watching how someone interacts with a system, they react to your inspection. When you attempt to change a participant’s environment or try interventions, sometimes the mere presence of the observer is enough to elicit a reaction.

So then how are we, as researchers, supposed to create systematic research and data that are replicable? Honestly, sometimes…most times, I don’t know. I think the beauty of qualitative methods is that they give space for representing unique contexts as “whole pictures,” often expressed from the perspective of the participants. However, when reading research papers that take a holistic approach, I often question how the role of the researcher impacts the nature of the study.

This all reminds me of the Heisenberg uncertainty principle, whereby you can’t measure both a particle’s position and its velocity at the same instant. The simple act of trying to capture information is disruptive. In fact, the more accurately you measure position, the more inaccurately you measure velocity (and vice versa). The very nature of interventions is disruptive to the natural state of participants, even in educational research. Some call it observer bias; others call it a threat to internal validity.
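For the physics-inclined, the trade-off is usually stated in terms of momentum (mass times velocity):

$$\Delta x \, \Delta p \ \ge\ \frac{\hbar}{2}$$

where $\Delta x$ is the uncertainty in position, $\Delta p$ the uncertainty in momentum, and $\hbar$ the reduced Planck constant: shrinking one uncertainty necessarily inflates the other.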

While there are methods and designs that can control for such things as the Hawthorne effect, they are, more often than not, not incorporated into descriptive qualitative research designs. These measures are not taken because the researcher is meant to be part of the interpretive instrument(s); but then how does that make this *waves hands around in the air* a) representative, b) accurate, c) able to maintain naturalistic integrity, and d) replicable?
