ISTE Standard IV Reflection

ISTE Standard IV: Assessment and Evaluation

There is an undeniable reality that churns under the surface of every classroom in America: we live in a world of high-stakes testing. School and student accountability stands as a keystone of current educational policy (Williamson & Redish, 2009, p. 78). As a fifth grade teacher, I swim through these treacherous undercurrents every day, and as a technology facilitator I see teachers show reluctance to deviate from their own ‘tried and true’ methods for fear of a decline in scores. While Texas has not yet moved to widespread computer-based testing (CBT) as it begins the transition from TAKS to STAAR, the state has begun developing CBT formats for language modifications on the STAAR-L assessments for English Language Learners, or ELLs (Texas Education Agency, 2010). However, the integration of technology into assessment and data analysis does not end with high-stakes testing. The need to vary assessment format according to content matter and student need offers a spectrum of possibilities for technology to become embedded, both in terms of how students are assessed and how that data can be utilized to determine best practices (Williamson & Redish, 2009, pp. 79-80; Solomon & Schrum, 2007, p. 168). While the ideas of varied assessment wove throughout my internship activities, I found myself particularly drawn to a few specific ideas: the development of formative assessment through technology integration, the constructivist application of assessment through the development of e-portfolios, and the analysis of both qualitative and quantitative data as a means of developing a comprehensive needs assessment for a campus.

In the course of my lesson development evolving from my work in EDLD 5364 and my professional development framework for formative assessment using interactive response systems, I worked with a wide variety of assessment vehicles and had the opportunity to evaluate the strengths and obstacles of each. One simple alternative we offered in our lesson development involved integrating word processing technology into the KWHL chart; this classic tool for accessing students’ prior knowledge, experiences, and conceptions transforms into not only a tool of analysis for the teacher but a means of communication between school and home (Pitler, Hubbell, Kuhn, & Malenoski, 2007, pp. 18-19). This real-time qualitative data made an additional leap forward with the use of interactive whiteboard technology or online collaborative products such as Google Docs, where students could contribute in parallel to a class document. Another option for constructivist assessment involved the creation of a multimedia product that could be shared with others. Our proposal: students investigating decomposition or strata could pull actual soil samples and take digital photos to create their own eBook showing three-dimensional models of soil core samples. A final option we offered included the development of assessments utilizing interactive response technologies to collect various types of data, including survey and collaborative responses, and to let students compare their understanding with that of their peers. I believe our final unit offered robust options for evaluating student understanding on a variety of levels.

Helen Barrett described the benefits of an e-portfolio not only as an archive of work product, but as an opportunity for authentic assessment; however, she separates in stark terms the benefits of an e-portfolio as “an assessment //of// learning versus an assessment //for// learning” (as cited in Solomon & Schrum, 2007, pp. 171-175). Quite simply, like many instructional and assessment tools, not all e-portfolios are created equal. My personal internship experience highlighted both the best and the worst of what e-portfolio assessment has to offer. Our entries into TK20 represent the original form of e-portfolio as described by Barrett, where the portfolio is more static, providing little opportunity for ongoing development of artifacts (p. 172). The artifacts themselves are mandated by the evaluator rather than the student or the student’s selected target audience, and the portfolios are generally outcome-driven, used to make high-stakes decisions about a student’s learning (p. 173). I found it an unfortunate realization during the past several months that this ‘new’ initiative actually reflects outdated concepts, resources, and goals in developing and utilizing a digital portfolio. Barrett proposes a much different view of the e-portfolio: that of ‘Web 2.0’ models, or ‘wiki-folios,’ that “have the potential to change with the pedagogy of interaction, especially as used within a paradigm of assessment for learning” (p. 173). Overall, the autonomy of this e-portfolio format works specifically toward supporting affective and strategic learning networks (Rose & Meyer, 2002, Chapter 7). I found my own level of interaction and engagement with the work product vastly more metacognitive and authentic than with the stilted, one-sided composition of the TK20 format.
As I used this interactive template for the development of my original professional portfolio, I experienced many of the benefits stemming from its use: its time flexibility encouraged more regular use; the artifacts offered a view of a more personal learning journey; and interactions with colleagues and mentors offered ongoing opportunities for learning and growth. The wiki that I created for my internship unquestionably turned out to be one of my best teachers.

Finally, I had a unique opportunity during my internship to participate in the district leadership training for our new data analysis and strategy development application, called DEEP Analysis. This application is used in conjunction with our current student information system and high-stakes assessment data reporting tool to create more comprehensive data analysis and to develop strategies for improvement. Even more importantly, this tool increases the district’s ability to work with more complex data, such as determining multi-year data trends (Williamson & Redish, 2009, p. 84). Inputting qualitative data from the staff allowed me to help make connections between specific programs and practices on our campus and their assessed outcomes. This provided adequate context to review the effectiveness of several campus technology initiatives, an activity necessary to provide adequate information to our district technology coordinator and our stakeholders (Williamson & Redish, 2009, pp. 87, 90).

My district stands out as a leader in continuous improvement and data-driven decision making. However, the challenges that plague other districts, such as inadequate funding and limited availability of appropriate assessment tools (Williamson & Redish, 2009, pp. 81-82), also work against advancement of these goals in my own backyard. I hope that the lessons learned in my internship will help me find reasonable solutions to these challenges so that each student can be assessed in authentic, meaningful ways.

References:

Pitler, H., Hubbell, E. R., Kuhn, M., & Malenoski, K. (2007). //Using technology with classroom instruction that works.// Alexandria, VA: Association for Supervision and Curriculum Development.

Rose, D. H., & Meyer, A. (2002). //Teaching every student in the digital age: Universal design for learning.// Alexandria, VA: Association for Supervision and Curriculum Development.

Solomon, G., & Schrum, L. (2007). //Web 2.0: New tools, new schools.// Eugene, OR: International Society for Technology in Education.

Texas Education Agency. (2010). //House bill 3 transition plan// (Publication No. GE11 601). Austin, TX: TEA Printing Office. Retrieved from []

Williamson, J., & Redish, T. (2009). //ISTE's technology facilitation and leadership standards: What every K-12 leader should know and be able to do.// Washington, DC: International Society for Technology in Education.