Directing Collaborative Research Practice in a Global Arena

Research study presented at the 2014 Midwestern Regional Conference of the Comparative International Education Society (MCIES). Please join us as we discuss how researchers are using collaborative and qualitative technologies to shape their research process.

For additional questions, please email me at nsabir@indiana.edu

Abstract:

Information technologies have rapidly shaped the scope of international education research practices through contemporary software developments that allow for global collaborative research projects. This study evaluates value-based judgments about technologies considered relevant to the educational research process. To explore the role of these digital tools within the context of researcher development and ongoing qualitative research projects, this participatory action research study examines the reflexive journaling of ten doctoral students across a 14-week period. The shift in international education research paradigms and the constant updating of expectations in practice prompt a need to better understand which technological resources are valued. The thematic journal analysis revealed a call for: (1) open educational resources across information management systems, used in the literature review and theory-building stages; (2) resources that promote accessible, collaborative, and transparent information collection and analysis; and (3) technologies that integrate reflective practices and commentary throughout the entire research process. Digital tools that reflected these characteristics were highly valued in conducting collaborative research in a global arena. The findings from this study provide groundwork for international education researchers to successfully navigate the technological logistics of collaborative research endeavors.

 

Suggested citation:

Sabir, N. (2014, October). Directing collaborative research practice in a global arena. Presentation at the 2014 Midwestern Regional Conference of the Comparative International Education Society, Bloomington, IN.

Real-Time Audio Reflection of Video Analysis for Action Research

A rite of passage for many teachers is a foundational video case analysis of their teaching, recorded and reviewed by external faculty and staff members. This practice is in place for pre-service teachers, teachers under review at new schools, and even associate instructors at the university level. While often viewed as a slightly intimidating process, the video review is integral in establishing action research ideals for teachers.

The reflection process is crucial not only to the teacher’s development but also to enhancing their instructional approaches. Many teacher preparatory programs strive to teach future teachers reflective practices that directly inform their actions (Hatton & Smith, 1995). Teachers need to think critically about, and learn from, their past experiences through meaningful reflective practices.

While reflections can take place through listening, speaking, drawing, and any other way imaginable, the most meaningful reflections often take place after watching yourself perform tasks. The idea is for teachers to video record themselves, capture objective descriptions of what happened, discuss feelings, ideas, and analysis, and discuss how they reacted as a result of the experience. The figure below represents the reflection process (adapted from Quinsland & Van Ginkel, 1984).

[Figure: The experience processing model (Quinsland & Van Ginkel, 1984)]

According to Quinsland and Van Ginkel (1984), processing is a practice that encourages one to reflect on, describe, analyze, and communicate their experiences. Processing and reflection not only allow for an enhanced learning experience but also contribute to the teaching and learning of future students. Past literature has shown that critical reflection increases learning, understanding, and retention (Daudelin, 1996). Additionally, it invokes a process of taking meanings and moving them into learning (Mezirow, 1990).

The process of reflection is critical to action research (Kemmis, 1985), and action research needs to be systematic (Gore & Zeichner, 1991; Price, 2001), creating questions and answering them in the teaching context. Historically, many teachers have used a variety of tools such as observation logs and reflective journals (Darling-Hammond, 2012).

This activity will walk you through that process:

The first step is to insert the video observation into ELAN. Give the video an initial viewing and add in annotations. Annotations should be reflections on your teaching and immediate methods; they can also be ideas that you wish to further explore and revisit.

The second step is to create an audio-based discussion. As the video plays, create an audio recording of your immediate reflections. During the second video run-through, stop the recording periodically to voice-record your thoughts.

Place the audio recording into the ELAN platform, synchronizing the waveform with the video observation. Once you mute the observational video you will be able to listen to your thought process overlaid on your observational data. Another way of looking at this overlay: the reflective audio file replaces the audio component of the video observation. This allows you to pair your analysis with the observation, reflecting on the moments of instruction.

Once the audio and video, with annotations, are embedded and synced, add a second layer of annotations based on the alignment between your audio reflections and the original observations. These can be areas for improvement, implications for future practices, and moments that surprise you. By integrating verbal, visual, and kinesthetic cues, teachers can establish retrieval systems that allow them to change practices on the fly.
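The alignment itself happens inside ELAN’s interface, but the underlying pairing logic can be sketched in plain Python. This is purely an illustrative sketch, not ELAN’s export format: the tier contents and (start, end, text) tuples below are hypothetical, standing in for a first-pass video annotation tier and a second-pass audio reflection tier.

```python
# Sketch: pairing first-pass video annotations with audio reflection
# segments by time overlap, to seed a second annotation layer.
# Annotations are modeled as (start_ms, end_ms, text) tuples.

def overlaps(a, b):
    """True if two (start, end, text) segments share any span of time."""
    return a[0] < b[1] and b[0] < a[1]

def align(video_notes, audio_notes):
    """Return (video_text, audio_text) pairs for overlapping segments."""
    pairs = []
    for v in video_notes:
        for a in audio_notes:
            if overlaps(v, a):
                pairs.append((v[2], a[2]))
    return pairs

video_notes = [(0, 5000, "opening instructions unclear"),
               (12000, 20000, "group work transition")]
audio_notes = [(3000, 8000, "I should slow down here"),
               (15000, 18000, "students self-organized well")]

for v_text, a_text in align(video_notes, audio_notes):
    print(f"{v_text} <-> {a_text}")
```

In practice you would export the annotation tiers from ELAN (for example, as tab-delimited text) rather than typing them by hand.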

These approaches will allow teachers to self-reflect and create keys that indicate needs for change. This systematic approach to identifying problems and providing solutions takes a critical approach to teacher-based action research. The benefit of using video- and audio-based reflections is the fluid and organic nature of reflection, which allows teachers to improve their instructional techniques effectively (Altrichter, Feldman, Posch, & Somekh, 2013).

References

Altrichter, H., Feldman, A., Posch, P., & Somekh, B. (2013). Teachers investigate their work: An introduction to action research across the professions. New York, NY: Routledge.

Darling-Hammond, L. (2012). Powerful teacher education: Lessons from exemplary programs. San Francisco, CA: John Wiley & Sons.

Daudelin, M. (1996). Learning from experience through reflection. Organizational Dynamics, 24(3), 36-48.

Gore, J. M., & Zeichner, K. M. (1991). Action research and reflective teaching in preservice teacher education: A case study from the United States. Teaching and Teacher Education, 7(2), 119-136.

Hatton, N., & Smith, D. (1995). Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education, 11(1), 33-49.

Kemmis, S. (1985). Action research and the politics of reflection. In D. Boud, R. Keogh, & D. Walker (Eds.), Reflection: Turning experience into learning (pp. 139-164). New York, NY: Routledge.

Mezirow, J. (1990). How critical reflection triggers transformative learning. In Fostering critical reflection in adulthood (pp. 1-20).

Quinsland, L. K., & Van Ginkel, A. (1984). How to process experience. The Journal of Experiential Education, 7(2), 8-13.

 

Organic writing process requires audio recorder?

During my digital tools class, the discussion moved from representing findings in innovative ways to constructing meaning via organic processes.

A small group discussion began with the idea of leveraging digital tools to support a non-linear writing process. The first issue brought to the table was this idea of non-linear itself, so let’s talk about that notion first (like any good writer, we must define our terms, no?). Consider the construction of a typical research article. What do you read first in a paper? The abstract, then the introduction, literature review, methodology, results, discussion, and finally the conclusion. More often than not, the paper is presented in this manner, but it is rarely written in this order. In fact, students are often encouraged to write the abstract last and to rewrite the introduction after the conclusions have been reached.

In this discussion of non-linear writing, a student brought up that their writing differed greatly between personal and academic pieces. They wanted to adapt their creative and fluid system of poetic writing to the construction of research papers. How do you do that? The easy answer: use the same medium across all your writings. What does that even mean? Well, if you discover that you write better by outlining sentences on paper and then digitizing your notes, do that. If you find that you think best on whiteboards that allow you to construct plans, reorganize, and scribble, then do that. One writing process that really struck a chord was the notion of writing by recording audio thoughts while out on contemplative walks, taking natural conversations and moving them into typed words.

 

On a side note, another idea that came up in the conversation about representing findings was video. An example of video representation is prominent in projects like the 1000 Voices project. This online archive collects, displays, and analyzes life stories of individuals with disabilities from around the globe. The site not only allows users to upload video but also encourages them to submit images, films, audio, text, or any combination of media. Recently one user even passed along personal art projects that told their story.

Another great example of video-based findings presentations is the PhD Comics 2 Minute Thesis contest. These short animated clips encompass the introduction, research questions, methods, and sometimes conclusions in the span of two minutes. One example talks about how distant reading techniques can be used to acquire information.

New Media Information Display

Throughout this discussion of possible new tools for data interpretation and display, remember that the purpose is to communicate, not to impress; don’t get caught up in how cool something looks. Think critically about whether the options below truly represent the best means of communicating your meaning.
With the rapid centralization of journal articles into interactive databases, there has been a steady push for incorporating new media in research articles as novel forms of data representation. Researchers often consider tables and other graphical displays as completing the discourse, report, or narrative. Typically, information is represented in graphs and charts such as bar charts, pie charts, and line graphs (Minter & Michaud, 2003). Other data types can include realistic artifacts such as diagrams, maps, drawings, illustrations, and photographs.
Cidell (2010) took this idea of content analysis and incorporated word clouds into the mix. While word clouds can be effective displays that allow viewers to see which terms are prominent, they don’t allow researchers to display complete phrases. This is where poetic representations (Cahnmann, 2003) can be useful in place of word clouds (for more information, see MacNeil, 2000; Sparkes & Douglas, 2007).
Russ-Eft and Preskill (2009) discuss some very interesting forms of information analysis, including drama, cartoons, photography, checklists, and videos. These image-based constructs of data analysis are further discussed by Banks (1998, as cited in Prosser, 1998).
Some interesting approaches to data analysis and display include the following: cartoons and photo stories, such as graphic novel representations, using recreated images to synthesize meaning and convey dialogue; enhanced audio elements, as elaborated by Silver and Patashnick (2011); and interpretive live action, as described by Carter (2004), which can include dance, plays, and other stage performances. Recently there has also been a push for multimedia video reports, much like the Africa Climate Change Resilience Alliance (ACCRA) project report. Additionally, other interactive elements such as infographics and webpages are becoming more commonplace. And lastly, reflective blogs have also proved to be a useful tool (Paulus, Lester, & Dempster, 2013).

 

References

Banks, M. (1998). Visual anthropology: Image, object, and interpretation. In J. Prosser (Ed.), Image-based research: A sourcebook for qualitative researchers (1st ed., pp. 6-19). Psychology Press.
Cahnmann, M. (2003). The craft, practice, and possibility of poetry in educational research. Educational Researcher, 32(3), 29-36.
Carter, P. (2004). Material thinking: The theory and practice of creative research. Carlton, Australia: Melbourne University Press.
MacNeil, C. (2000). The prose and cons of poetic representation in evaluation reporting. American Journal of Evaluation, 21(3), 359-367.
Minter, E., & Michaud, M. (2003). Using graphics to report evaluation results. University of Wisconsin-Extension, Program Development and Evaluation. Retrieved from http://learningstore.uwex.edu/Assets/pdfs/G3658-13.pdf
Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.
Russ-Eft, D. F., & Preskill, H. (2009). Communicating and reporting evaluation activities and findings. In Evaluation in organizations: A systematic approach to enhancing learning, performance, and change (2nd ed., pp. 399-442). New York, NY: Basic Books.
Silver, C., & Patashnick, J. (2011). Finding fidelity: Advancing audiovisual analysis using software. Forum: Qualitative Social Research, 12(1). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/1629/3148
Sparkes, A. C., & Douglas, K. (2007). Making the case for poetic representations: An example in action. The Sport Psychologist, 21(2), 170-189.

 

 

Keeping prosthetics squeaky clean

After much discussion I realized that exposure to CAQDAS tools simply isn’t enough to remain competitive in a technocentric research environment. There is a need to stay updated on all the latest, and greatest, tools out there. More than the tool itself, researchers need to be able to leverage the tools in a unique, efficient, and critical manner. So here is a brief list of steps anyone can take to stay current with CAQDAS.

Gain hands-on, practical skills as soon as you can! The longer you spend getting to know a software package, the easier it is to use in actual research projects, and the less stressful it progressively becomes to work in the software environment.

Research what is out there. There are some great resources available, and sometimes a simple Google search can yield amazing results. For example, the CAQDAS Networking Project is a great networking source with a repository of support and tutorials. Another example is comparative research papers like Ness’ work on “Computer Assisted Qualitative Data Analysis Software (CAQDAS).” Some big considerations are software capabilities, general assumptions, and potential licensing costs. Additionally, you can search for alternative open source or open educational research software options that could assist you.

Beta test new versions and actively give feedback and review products. Much of software development and evolution keeps the end user in mind. Reference management systems such as Mendeley are embedded in the social science research process in that user feedback is actively considered when upgrading. Having an issue or idea? Share it! Start a conversation about changes you wish to see. You just might be surprised to find that a developer took the time to listen to, or read, your thoughts.

Use any free training you have at your disposal! Often universities or departments have internal technology training systems built into the university structure. For example, here at IUB we have IT training sessions and online tutorials via Lynda.

If you don’t have organizational access to these kinds of resources, look into developer-produced guides, walkthroughs, and videos. You would be surprised at the number of YouTube videos and forums that can help you answer queries and introduce you to new ways of leveraging the tools. Another set of great resources are blogs (e.g., the NVivo Blog) that review researcher tools, and even microblogs like Twitter.

Talk to your peers! You will be amazed at how much you can learn from your colleagues. Attend conferences, paying close attention to the types of tools and methods utilized. It is also important to note what limitations are brought up and how final results are conveyed.

Subscribe to updates and newsletters. Let the information come to you. Everyone hates getting tons of emails and having an inbox full of irrelevant messages that get tossed before they are even opened. But here’s the thing: you don’t always have to read the emails sent from developers; you can follow up on updates while you are searching for answers. Use newsletters as aggregators so you can read up-to-date information on your own time!

Issues with Online Data Collection

In recent years the use of online data sources has led to new developments, including the standardization of resources and a widening of scope, even though not all critical information is visible to the researcher. Moreover, online conversations are constantly evolving and perceptions online are always in flux. Seymour (2001) claims that the internet releases the interview stage “from its imprisonment in time and place” (p. 158).

There are issues that cross over between virtual spaces and traditional data collection methods, stemming mainly from the effects of media on the data gathering step. Traditionally, qualitative researchers collect documents/artifacts, interviews, and observations. In traditional data gathering, humans are responsive and adaptive, whereas with electronic media, data is often constructed via computer assistance. There are now electronic forms of all these sources, where illustrations, programs, and networks are artifacts. Data collection differs with the nature of the medium; computer-mediated communication has a unique effect on information construction.

There are also ethical issues that need to be considered. Merriam (2009) discusses four main ethical concerns with online research: obtaining informed consent; ensuring confidentiality and security; determining what is public and what is private; and debriefing participants. Many of these issues were brought up during class, the most prominent being privacy and publicity, gaining informed consent, and issues of “reality.”

For the issue of public and private information, the discussion centered on expectations for participants. For instance, if the researcher is reflecting in a public space about the participants, then it might endanger the participants’ identities. Additionally, while participants often agree to having their information published as part of the study, they might be less comfortable with having their information blogged about. Personally, I think this is a crucial issue for in-service teachers who wish to engage in action research, since self-reflection is a critical piece of the action research process (Carr & Kemmis, 2003).

 

References

Carr, W., & Kemmis, S. (2003). Becoming critical: education knowledge and action research. London, UK: Routledge.

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation (pp. 156-163). San Francisco, CA: John Wiley & Sons.

Seymour, W. S. (2001). In the flesh or online? Exploring qualitative research methodologies. Qualitative Research, 1(2), 147-168.

 

“947 days ago”

Two years, seven months, and two days ago, PhD Comics produced a comic about smartphone autocorrections (see below) that included common words used by doctoral students.

[Comic: PhD Comics on smartphone autocorrect]

The reading on audio and visual transcription reminded me of this comic as the discussion moved from techniques (Paulus, Lester, & Dempster, 2013) toward accuracy (Johnson, 2011). In Johnson’s (2011) paper, comparing the listen-and-type method to voice recognition software transcription, he comments on issues of time and accuracy. These two criteria seem to be the sticking points when dealing with transcription in research studies. It was interesting to see the side-by-side comparison of the two techniques. But I must say, all said and done, that I felt slightly irked by the final result. In Table 1, shown below, the results of the transcribing process are displayed. In the grand scheme of things, ten minutes don’t make a huge difference. I mean, after all, we spend countless hours researching, implementing, and interpreting information.

[Table 1: Johnson (2011), transcription time and accuracy comparison]

During a recent research study, my research team toyed with the idea of outsourcing our transcriptions. There are sites that charge minimal fees for transcribing your audio, or even video, files (e.g., Outsource 2 India, which charges about $40 for an hour of audio transcription, or Scribe 4 You, which charges a penny a word). While both of these sites are highly ranked and reviewed, researchers should still do their due diligence and review the transcript for accuracy. These services only really work if you are seeking verbatim transcriptions without coding or additional notes. I think, in our research group, it came down to a couple of key questions:

1. Do we really need a transcription?

2. Okay, if we really need one, does it have to be verbatim?

3. Does it need coding and key observations noted? (For audio-only files, this might look like pauses, sighs, or “uhmm”; for video files there are other codes that might be of interest.)
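To make the outsourcing trade-off above concrete, here is a back-of-the-envelope sketch comparing the two pricing models mentioned earlier (about $40 per hour of audio versus a penny per word). The 130 words-per-minute speaking rate is an assumption for illustration, not a figure from either vendor.

```python
# Rough cost comparison of hourly vs. per-word transcription pricing.
# Rates come from the post; the speaking rate is an assumed estimate.

def per_hour_cost(audio_minutes, rate_per_hour=40.0):
    """Cost under an hourly pricing model (e.g., ~$40/hour of audio)."""
    return audio_minutes / 60 * rate_per_hour

def per_word_cost(audio_minutes, words_per_minute=130, rate_per_word=0.01):
    """Cost under a per-word model, assuming an average speaking rate."""
    return audio_minutes * words_per_minute * rate_per_word

minutes = 90  # e.g., a typical focus-group recording
print(f"hourly:   ${per_hour_cost(minutes):.2f}")   # $60.00
print(f"per-word: ${per_word_cost(minutes):.2f}")   # $117.00
```

For conversational audio at typical speaking rates, the per-word model comes out noticeably more expensive, which is worth checking before choosing a service.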

As it turned out, as a team we decided that we were more interested in learning about particular incidents than in reading the transcription through and then coding. So instead of taking the time to type the conversation word-for-word, we noted moments of interest and pulled out quotes (and notes) of particular interest. I suppose you could call this gist transcribing (Paulus, Lester, & Dempster, 2013). Our goal moved away from analyzing transcripts to really understanding a specific instance.

Also, by not outsourcing our transcription we had to listen to the conversations over and over and over and over again. It was super annoying at points, but at the end of the day we all began noticing nuanced moments we might have otherwise missed. I think this was to our advantage. Had we told someone (or a piece of software) to transcribe our data, we might have ended up with an autocorrect mistake! You never know when quals can quake and h-index values become jinxed.

References

Johnson, B. E. (2011). The speed and accuracy of voice recognition software-assisted transcription versus the listen-and-type method: A research note. Qualitative Research, 11(1), 91-97.

Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.

 

—===—
Published from a mobile device. Erroneous words are a feature, not a typo.

Digital Tools (Visual Representation)

This graphic represents many of the conversations noted from class. As we discussed the potential uses of digital tools in (qualitative) research and where each tool plays a part in the research process, an image kept popping into my head. I finally had to create a skeleton, adapted from several sources (Coffey, Holbrook, & Atkinson, 1996; Davidson & di Gregorio, 2011; Gratton & O’Donnell, 2011; Matthews & Cramer, 2008; Miles & Huberman, 1994; Paulus, Lester, & Dempster, 2013) as well as in-class discussion notes.

When we talk about information and research, there are really two different types of data: researcher-generated information and participant-generated information. For example, Silverman (2011) writes that certain kinds of data, such as focus groups or interviews, could not exist without the researcher’s facilitation. So as you look over the image below, consider the types of information and how different entities can facilitate or create meaning.

When recording field notes and conversations, you can go old school with pen and paper, jazz it up with video or audio recordings, or use new tech such as audio recording pens (e.g., LiveScribe). The abundance of smartphone applications that allow you to record audio is amazing, regardless of platform. This shift and expansion of technological tools means that “new sites of study and ways of engaging with participants” (Hine, 2008, as cited by Paulus, Lester, & Dempster, 2013, p. 70) are growing ever more prominent and accessible. So not only do we have new ways of constructing meaning, but we can also gather information across greater distances (e.g., Gratton & O’Donnell, 2011; Matthews & Cramer, 2008). Through archival tools and web conferencing, new sites for study are being revealed to educational researchers.

These ideas of creating knowledge through data reminded me of a classroom discussion on revamping Miles and Huberman’s (1994) list of computer uses in qualitative research.  This idea of shifting from traditional ethnographies to netnographies, using “ethnographic approaches to study unique digital spaces” (Paulus, Lester & Dempster, 2013, p. 76), really pushed me to think about online communities, social networks, virtual worlds, and serious games as new research environments.

[Figure: Digital tools across the qualitative research process]

References

Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1). Retrieved from http://www.socresonline.org.uk/1/1/4.html

Davidson, J., & di Gregorio, S. (2011). Qualitative research and technology: In the midst of a revolution. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 627-643). London, UK: Sage.

Gratton, M., & O’Donnell, S. (2011). Communication technologies for focus groups with remote communities: A case study of research with First Nations in Canada. Qualitative Research, 11(2), 159-175.

Matthews, J., & Cramer, E. P. (2008). Using technology to enhance qualitative research with hidden populations. The Qualitative Report, 13(2), 301-315.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. London, UK: Sage.

Paulus, T. M., Lester, J. N., & Dempster, P. (2013). Digital tools for qualitative research. London, UK: Sage.

Silverman, D. (2011). Interpreting qualitative data: Methods for analysing talk, text, and interaction. London, UK: Sage.

Digital diaries

Blogs are becoming ever more popular with researchers and teachers. They are used for archiving thoughts, starting conversations, and housing resources. In my teaching-teachers course we talk about the importance of instructors having blogs and websites. Not only are they a source of information, but they are also valuable tools in establishing digital identities. Overall, blogs have emerged as creative spaces that allow users to carry on a “persistent conversation” (Paulus, Lester, & Dempster, 2014, p. 15) and represent a multi-user environment that is not platform-restrictive.

More importantly, these digital diaries are more than simple text. They allow for hyperlinking of content and embedding of visual tools (such as videos or images), and are sometimes video- or audio-based entries. Many of the blogs kept by teachers and educational researchers are very personal expressions and accounts. They incorporate happenings, ideals, and current issues of concern. Their postings of text and visuals allow blog visitors to gain a richer illustration of the blogger’s life experiences. These interactive communicative diaries are often linked to social networks and “respond to new conditions” (Willi, Melewar, & Broderick, 2013, p. 103) of identity creation. In other words, they allow bloggers to develop a holistic online identity centered around their diary postings, pertaining to issues they deem of value.

So we talked a bit about online identities and blogs, but what about the other aspects? Well, there is a notion centered on reflexivity, and on using blogs as tools to streamline that process. Let’s start by defining reflexive practices. Paulus, Lester, and Dempster define this as “the process of intentionally attending to the perspectives, attitudes and beliefs that shape how you design a research study and make sense of your data” (2014, p. 13). In other words, how is “you as a person” impacting a situation? This doesn’t have to be in the realm of data collection or research design, though it is often related to those two areas.

Let’s ground this fuzzy concept with an example of action research. A new 3rd grade teacher is welcoming his class during the first week of school and notices that two of his (international) students aren’t socializing with their peers, aren’t speaking up or participating in class, and even refuse to make eye contact. There are several ways an inexperienced teacher might reflect on these issues. He might think that something is wrong with the students: maybe they are just shy, or perhaps they are having trouble adjusting to the new classroom. He might not even stop to consider that the students come from a different school culture and that, instead of changing them, he might have to change his approach.

Had this teacher kept a journal, he might be able to look over what changes positively impacted his classroom and adjust accordingly. Additionally, if he kept an online blog, perhaps his online community of teachers might have been able to offer suggestions and possible resources. And so we come back to this concept of changing our practices based on our reflections, and realizing how we impact our own designs. These reflections don’t always have to be grounded in formal research.

So we have reflective practices, establishing digital footprints, and creating conversations covered.

Let me shift gears and tell you about a great way to aggregate your diary posts and notice trends, themes, and issues of importance. Oftentimes when I sit down to blog, I start writing about a topic and it takes an interesting turn; sometimes I run off on tangents. At the end of each blog entry I push all my talking points into a word cloud and use those as my tags. This allows me to self-evaluate what I am really talking about and sometimes enlightens me to the topics that are most prevalent in my entry. Consider using word cloud tools to thematically analyze, even code, your digital diary. There are some great websites out there, and Wordle and Tagxedo are two of the best I have found. Both are fairly simple to use and allow for a customizable experience. Simply copy and paste your text into the box and create! Wordle is simple, clean, and classic (the Wordle site even showcases a plethora of other examples), while Tagxedo allows for more customization. Make sure you have Microsoft Silverlight and Java up to date.

Here is an example of what a word cloud of this post might look like:

Using Wordle

[Word cloud generated with Wordle]

Using Tagxedo

[Word cloud generated with Tagxedo]
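For a rough sense of what these word cloud tools do under the hood, here is a minimal sketch of the tag-extraction step: count word frequencies and keep the top terms as candidate tags. The stopword list and sample entry below are illustrative only; Wordle and Tagxedo apply their own (more sophisticated) filtering and layout.

```python
# Sketch: extract candidate tags from a blog entry by word frequency.
# A crude stand-in for the counting step behind word cloud tools.

import re
from collections import Counter

# Illustrative stopword list; real tools ship much larger ones.
STOPWORDS = {"the", "and", "a", "of", "to", "in", "is", "that", "are"}

def top_tags(text, n=5):
    """Return the n most frequent non-stopword terms in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

entry = ("Blogs are reflective tools. Reflective blogs archive thoughts "
         "and blogs start conversations about reflective practice.")
print(top_tags(entry, 3))
```

Running the frequencies over a full post and feeding the top terms back in as tags mirrors the self-evaluation workflow described above.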

References

Harricharan, M., & Bhopal, K. (2014). Using blogs in qualitative educational research: An exploration of method. International Journal of Research & Method in Education. (In press)

Paulus, T. M., Lester, J. N., & Dempster, P. G. (2014). Digital tools for qualitative research. Thousand Oaks, CA: Sage.

Willi, C. H., Melewar, T. C., & Broderick, A. J. (2013). Virtual brand-communities using blogs as communication platforms and their impact on the two-step communication process: A research agenda. The Marketing Review, 13(2), 103-123.

Looking to make the most out of your blog? Check out Bolles’ blog post on using blogs as a research tool. Not only does he write about his personal experiences, but he has also used the text-based blogging feature to transcribe his video blog.

15 years of going in circles

During a meeting today it was brought to my attention that the issues revolving around technology integration remain the same, nearly 15 years after a call for conversation. In 1999, Ertmer et al. described the role of technology in learning environments and proposed two frameworks of barriers that needed to be addressed for successful technology integration. The same can be said for the integration of technology into research methods and qualitative research project design. Funny how, despite trying to have a cohesive conversation across disciplines, authors remain in the same place, as though the conversations haven’t shifted with the times or with technological updates. It seems as though conversations and research agendas are moving in circles. All this got me thinking once more about users’ fundamental understanding of tools and what that can mean.

Conole and Dyke (2004) break down the notion of affordances when using technological tools to conduct research. This concept encompasses an ontological approach, which concerns possible uses, and epistemological approaches, which revolve around intended or actual uses.


Let’s take, for example, a hammer. A hammer can be used for several purposes: perhaps you are using it to nail pins into a wall to hold pictures, or perhaps you are using it to weigh down your door so it won’t close. There are ideal uses you think of when someone mentions a hammer, and those ideas are constructed based on your personal familiarity with the tool. What often escapes us is the list of things one can possibly do with a hammer.


Here’s another: quick, think of a bird…

Can it fly? Most people think of flying birds when asked; but how does your mental model change if the first bird you think of is a penguin, kiwi, or emu?

When researchers discuss expanding frameworks about technological tools, “rather than elaborating on how any one of these ‘affordances’ could be relevant to a learner or a practitioner, the authors tend to indulge in a certain amount of hopeful expectation that affordances and abilities will simply emerge” (Boyle & Cook, 2004, p. 297). This doesn’t seem to be treated as much of a concern, and there aren’t persistent calls for more research, so perhaps the conversation has fizzled a bit.

After reading discussions about technological affordances and best technology integration practices, I think the response remains the same: “It depends.”

References

Boyle, T., & Cook, J. (2004). Understanding and using technological affordances: A commentary on Conole and Dyke. ALT-J, Research in Learning Technology, 12(3), 295-299.

Conole, G., & Dyke, M. (2004). What are the affordances of information and communication technologies? ALT-J, Research in Learning Technology, 12(2), 113-124.

Ertmer, P. A., Addison, P., Lane, M., Ross, E., & Woods, D. (1999). Examining teachers’ beliefs about the role of technology in the elementary classroom. Journal of Research on Computing in Education, 32(1), 54-72.
