What package do you pick?

My research partner and I were discussing which programs we should use to code our open-ended survey data and interviews. We began listing all our options, from Google Docs to complex CAQDAS packages and then some. We soon discovered an issue: we didn't have a shared skill set, meaning that software my partner had experience with, I did not, and vice versa. So instead of trying to find common ground, we started looking into which packages offered trial versions long enough for us to complete data coding over the course of a semester.

That narrowed down our list some, but then we ran into another issue. I am a Windows/Linux user, and my partner is an Apple user who works off their mobile device more often than not. Because of these differences we couldn't really make a decision, so we agreed we would have to compromise. For the third time we rewrote our list and began the conversation anew, this time guided by functionality and a discussion by Taylor, Lewins, and Gibbs (2005).

Our first criterion was data set size; since ours is fairly small, this didn't knock anything off our list. The next was collaboration: we wanted to be able either to send documents back and forth via email or the cloud, or to collaborate directly without much hindrance. Surprisingly, we didn't think about our data types until the third conversation. Since we have text, audio, and PDF artifacts, we needed a CAQDAS package that supports coding audio directly in the platform. Our fourth criterion was based on using frequency and comparative matrices across our data types. Because we are doing a mixed-methods study, we are very concerned about convergence (or even divergence) across themes. We are also very interested in working with quantitative data (for minimal descriptive statistics) within the same package. Finally, we settled on two possible software options.

What a hassle! There had to be a simpler, comparative chart that would have let us check properties across various CAQDAS packages and cross off options that didn't meet our criteria. It turns out these charts are already floating around the web; we just didn't look hard enough. Here is an example from UNC that compares ATLAS.ti, MAXQDA, NVivo, and Dedoose, and another from a Stanford site that compares NVivo, HyperRESEARCH, Studiocode, ATLAS.ti, and TAMS Analyzer.


References:

Taylor, C., Lewins, A., & Gibbs, G. (2005, December 12). Debates about the software. Retrieved from http://onlineqda.hud.ac.uk/Intro_CAQDAS/software_debates.php

Digital Tools (Visual Representation)

This graphic represents many of the conversations noted from class. As we discuss the potential uses of digital tools in (qualitative) research, an image of the research process, and where each tool plays a part, keeps popping into my head. I finally had to create a skeleton, adapted from several sources (Coffey, Holbrook, & Atkinson, 1996; Davidson & di Gregorio, 2011; Gratton & O'Donnell, 2011; Matthews & Cramer, 2008; Miles & Huberman, 1994; Paulus, Lester, & Dempster, 2013) as well as in-class discussion notes.

When we talk about information and research, there are really two different types of data: researcher-generated information and participant-generated information. For example, Silverman (2011) writes that certain kinds of data, such as focus groups and interviews, could not exist without the researcher's facilitation. So as you look over the image below, consider the types of information and how different entities can facilitate or create meaning.

When recording field notes and conversations, you can go old school with pen and paper, jazz it up with video or audio recordings, or use new tech such as audio-recording pens (e.g., LiveScribe). The abundance of smartphone applications that record audio is amazing, regardless of platform. This shift and expansion of technological tools means that "new sites of study and ways of engaging with participants" (Hine, 2008, as cited in Paulus, Lester, & Dempster, 2013, p. 70) are growing ever more prominent and accessible. So not only do we have new ways of constructing meaning, but we can also gather information across greater distances (e.g., Gratton & O'Donnell, 2011; Matthews & Cramer, 2008). Through archival tools and web conferencing, new sites for study are being revealed to educational researchers.

These ideas of creating knowledge through data reminded me of a classroom discussion on revamping Miles and Huberman’s (1994) list of computer uses in qualitative research.  This idea of shifting from traditional ethnographies to netnographies, using “ethnographic approaches to study unique digital spaces” (Paulus, Lester & Dempster, 2013, p. 76), really pushed me to think about online communities, social networks, virtual worlds, and serious games as new research environments.

[Image: DigitalToolQualResearch — digital tools across the qualitative research process]

References:

Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1). Retrieved from http://www.socresonline.org.uk/1/1/4.html

Davidson, J., & di Gregorio, S. (2011). Qualitative research and technology: In the midst of a revolution. In Denzin & Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 627-643). London, UK: Sage.

Gratton, M., & O’Donnell, S. (2011). Communication technologies for focus groups with remote communities: A case study of research with First Nations in Canada. Qualitative Research, 11(2), 159-175.

Matthews, J., & Cramer, E. P. (2008). Using technology to enhance qualitative research with hidden populations. The Qualitative Report, 13(2), 301-315.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. London, UK: Sage.

Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.

Silverman, D. (2011). Interpreting qualitative data: Methods for analysing talk, text, and interaction. London, UK: Sage.

A “history” of digital innovation for conveying knowledge (and research)

This week the theme of digital natives comes up a lot. Like, a lot. Let's start by defining digital natives and what the term means to the research community. In 2001, Prensky coined the term for individuals who have spent most of their lives "surrounded by and using computers and videogames, digital music players, videocams, cell phones and all other toys and tools of the digital age" (p. 1). These interactions have fundamentally changed the way users interact with platforms and think critically about leveraging technological tools for their own purposes. Students these days lean on technology as a crutch and display unique digital literacies.

There exists, however, a dichotomy between the perceived usefulness of digital tools for conveying knowledge and what students are actually doing. Personally, I have noted these two clear realms of experience both in my teaching of pre-service instructors and in my attempts to integrate snazzy tools and techniques into my research project designs.

Have you ever met a teacher who can use their smartphone to play Words With Friends or Candy Crush, manage their daily life with integrated calendars and reminder apps, check their email, and leverage social networks for professional development, but who fails at using a presentation to communicate ideas effectively? There it is again: the idea that digital tools are useful in our daily lives but don't translate into valuable uses in professional practice.

Paulus, Lester, and Britt (2013) point out that if advisors, faculty, and teachers "are not using the tools in informed ways, it makes it less likely that the next generation will, either" (p. 649). So the baton passes to instructors to show students how they can use technological tools creatively and critically. Several texts, such as Joiner et al. (2013), Roberts and Wilson (2002), and Coffey, Holbrook, and Atkinson (1996), address the current value systems around using digital tools to convey and analyze information.

Even within my coursework I notice the perception that the human approach is best. The responsibility therefore seems to extend to current users to inform the community, and their students, about the ways in which digital tools can best be leveraged. I think early adopters should be models for future users and demonstrate efficient practices.

References

Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1). Retrieved from http://www.socresonline.org.uk/1/1/4.html

Joiner, R., Gavin, J., Brosnan, M., Cromby, J., Gregory, H., Guiller, J., … & Moon, A. (2013). Comparing first and second generation digital natives’ internet use, internet anxiety, and internet identification. Cyberpsychology, Behavior, and Social Networking.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1-13.

Roberts, K. A., & Wilson, R. W. (2002). ICT and the research process: Issues around the compatibility of technology with qualitative data analysis. Forum: Qualitative Social Research, 3(2). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/862/1872
