A “history” of digital innovation for conveying knowledge (and research)

This week the theme of digital natives comes up a lot, like a lot. Let’s start by defining digital natives and what the term means to the research community. In 2001, Prensky coined it to describe individuals who have spent most of their lives “surrounded by and using computers and videogames, digital music players, videocams, cell phones and all other toys and tools of the digital age” (p. 1). These experiences have fundamentally changed the way users interact with platforms and think critically about leveraging technological tools for their own purposes. Students these days lean on technology as a matter of course and display unique digital literacies.

There is, however, a dichotomy between the perceived usefulness of digital tools for conveying knowledge and what students are actually doing. Personally, I have noticed these two clear realms of experience both in my teaching of pre-service teachers and in my attempts to integrate snazzy tools and techniques into my research project designs.

Have you ever met a teacher who can use their smartphone to play Words With Friends or Candy Crush, manage their daily life with integrated calendars and reminder apps, check their email, and leverage social networks for professional development, yet fails at using a presentation to communicate ideas in an effective, enhancing manner? There it is again…the idea that digital tools are useful in our daily lives but don’t translate into valuable uses in professional contexts.

Paulus, Lester and Britt (2013) point out that if advisors, faculty, and teachers “are not using the tools in informed ways, it makes it less likely that the next generation will, either” (p. 649). So the baton passes to instructors to show students how they can use technological tools creatively and critically. Several texts, such as Joiner et al. (2013), Roberts and Wilson (2002), and Coffey, Holbrook, and Atkinson (1996), address the current value systems around using digital tools to convey and analyze information.

Even within my own coursework I notice the perception that the human approach is best. The responsibility therefore seems to extend to current users to inform the community, and their students, about the ways in which digital tools can best be leveraged. I think early adopters should be models for future users and demonstrate efficient practices.

References

Coffey, A., Holbrook, B., & Atkinson, P. (1996). Qualitative data analysis: Technologies and representations. Sociological Research Online, 1(1). Retrieved from http://www.socresonline.org.uk/1/1/4.html

Joiner, R., Gavin, J., Brosnan, M., Cromby, J., Gregory, H., Guiller, J., … & Moon, A. (2013). Comparing first and second generation digital natives’ Internet use, Internet anxiety, and Internet identification. Cyberpsychology, Behavior, and Social Networking.

Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1-13.

Roberts, K. A., & Wilson, R. W. (2002). ICT and the research process: Issues around the compatibility of technology with qualitative data analysis. Forum: Qualitative Social Research, 3(2). Retrieved from http://www.qualitative-research.net/index.php/fqs/article/view/862/1872

Perpetual Heart Break Machine

These past two weeks my students submitted their feedback evaluating the course and my instructional strategies and suggesting improvements. Looking over the initial feedback, I found the honesty of the comments somewhat heartbreaking. When I actually averaged their ratings, it wasn’t so bad. While most of the comments were negative, I realized they were not directed at me, per se, but at the structure of the course. So I moved from being slightly distraught about not being an effective or engaging teacher…to being distraught about not really being able to implement the changes they suggested.
For example, an overwhelming number of students hated the 3-hour-long 8 am class (not something I had control over) and suggested splitting the class across two weeks. While I cannot control the length or scheduled time, I feel I should be able to engross and entertain the students enough that they don’t notice the horribly early and long class.

Another suggestion that came up was to reduce the number of PowerPoint slides and the number of assignments. Again, this isn’t something I can directly control, as all class sections have the same workload. Recognizing that the workload can be overwhelming, I try to give the students additional ‘lab time’ so that they can work on projects. Even though I have been doing this from the beginning, students still commented that it was not enough. Every week I struggle to balance lecture, engaging discussions, and in-class time for projects. Generally a third to half of my class time is devoted to ‘lab time’ and it still doesn’t seem to be enough.

There was a hiccup in implementing the survey, so it was administered twice: the first time 18 of 23 students responded; the second time only 8 resubmitted their thoughts. Half of the questions were on a Likert scale (of 4 or 5 points). The first time (on the 4-point scale) the students rated the course as outstanding at 2.5 out of 4; the second time (on the 5-point scale) that rating improved to 3.5 out of 5. Next, the students evaluated whether the instructor was outstanding. The two administrations were close in response, at 2.7 out of 4 the first time and 2.9 out of 5 the second.
The following responses were drawn from the second survey administration. The students rated the clear communication of content a 2.5 out of 5 and the clarity of explanation of requirements/expectations/assignments a 3 out of 5. Students rated appropriate use of class time a 2.5 out of 5. On a more positive note, the students felt that the instructor was using teaching strategies creatively, rating it a 3.2 out of 5. Lastly, the students rated the timeliness of the instructor’s feedback a 2.3 out of 5.
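Because the two administrations used different scale lengths, comparing the raw means (2.5 of 4 versus 3.5 of 5) can be misleading. One quick way to sanity-check the comparison is to map each mean onto a common 0–1 range. Here is a minimal sketch in Python using the averages reported above; the assumption that each scale starts at 1 is mine, since the survey anchors aren’t recorded here.

# Normalize Likert means to a 0-1 range so ratings gathered on
# different scales (4-point vs. 5-point) can be compared directly.
def normalize(mean, scale_max, scale_min=1):
    # Map a mean rating onto [0, 1] given the scale's endpoints.
    return (mean - scale_min) / (scale_max - scale_min)

# "Course is outstanding" averages from the two administrations.
print(normalize(2.5, 4))  # first survey, 4-point scale  -> 0.50
print(normalize(3.5, 5))  # second survey, 5-point scale -> 0.625

# "Instructor is outstanding" averages.
print(normalize(2.7, 4))  # -> roughly 0.57
print(normalize(2.9, 5))  # -> 0.475

Read this way, the course rating did improve between administrations, while the instructor rating actually slipped a little, which the raw numbers alone obscure.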

When students were asked for recommendations to improve the course and suggestions for the instructor, a majority responded that the course had too many things due (n=6), with overwhelming due dates, and expressed a need to consolidate class resources. An additional student commented that the course housed too many resources and websites. On the class structure itself, students commented that the class was too long and too early in the morning (n=4). A few students mentioned that there should be less PowerPoint (n=3), fewer in-class discussions (n=1), and more time in class to work on projects, homework, and assignments (n=4). Additionally, the lectures needed more consistency from week to week (n=2), more explanation/instruction (n=1), and a greater focus on projects rather than in-class examples (n=1).
Three students commented that the instructor was doing alright. Another two students suggested that students not be graded as stringently. Lastly, one student commented that “the instructor could assist the students in a less sarcastic manner.” These last two points are the aspects I feel I have the greatest control over, and I will work diligently in the coming weeks to explain my grading approach and to come across as less sarcastic.

When students were asked to list things that they did not want changed, they mentioned the general content and diversity of technology introduced (n=9), the benefits of the projects, particularly the digital story (n=2), the real-world application of the teacher websites and ePortfolio (n=2), and the scaffolded nature of the course (n=1). Moreover, the students did not want to see changes in the structure, content, or method by which the PowerPoint slides were presented (n=4). As for instructor-specific touches, the students liked the text reminders (n=1), the ease and timeliness of receiving feedback (n=2), and that I do not allow my students to procrastinate on projects (n=2). Additionally, a student appreciated that the structure of the course allowed students to choose between independent and group projects.

The most meaningful aspects of the course were identified as the creation of the teacher and ePortfolio websites (n=8), the introduction of tools such as Google Drive, Skype, and Google Docs (n=6), and the use and discussion of the 3Es (four in my class: engagement, enhancement, effectiveness, efficiency) (n=5). Students also noted the content exploration tools/tasks (n=1) and production tools/tasks (n=2), as well as class projects such as the digital stories (n=2), case analysis (n=1), and webquests (n=1), as meaningful activities.
The least meaningful aspects of the course were said to be the digital story project (n=2) and all the video editing that accompanied it (n=8), along with the case analysis (n=5) and webquests (n=2); students commented in particular on these activities’ lack of application for early education teachers. Lastly, students noted that the teacher and ePortfolio websites (n=1), “busy work” and “time wasters” in class (n=1), in-class demonstrations (n=1), the GoAnimate (n=1) and Voki (n=1) clips, and the use of Google documents (n=1) were not useful to the class.

And for the random student who suggested a picnic: let’s have one when the weather gets a little warmer!

Grey literature on teaching pre-service educators @ IU

Welcome to the teaching diary of an Indiana University doctoral student. To teach is to learn and hopefully these ruminations will track my journey as I prepare future educators in Bloomington, Indiana to best use technology in their classrooms. This site is home to many musings and rants as I reflect upon not only my teaching methods but also what I have learned as an instructor.
I encourage any and all to become a part of the discussion.
