
AN INVESTIGATION OF TECHNOLOGY COMPETENCE OF SCHOOL-BASED ADMINISTRATORS IN THE TRI-COUNTY SECONDARY SCHOOLS IN THE SOUTHEASTERN PART OF SOUTH CAROLINA

by

Donald D. Simpson

MELISSA MCINTYRE-BRANDLY, Ph.D., Faculty Mentor and Chair

PATRICIA H. GUILLORY, Ph.D., Committee Member

MORRIS RAVENELL, Ed.D., Committee Member

Barbara Butts Williams, Ph.D., Dean, School of Education

A Dissertation Presented in Partial Fulfillment

Of the Requirements for the Degree

Doctor of Philosophy

Capella University

September 2011


© Donald D. Simpson, 2011


Abstract

This study investigated the level of technology competence of secondary principals and other school-based administrators (assistant principals, vice principals, or administrative assistants). It examined the relationship between the secondary principals’ level of use of computer applications and their previous computer use, computer training, and the perceptions and attitudes the school administrators held toward computers. The methodology was quantitative, and the study used a descriptive design to investigate the level of technology competence of secondary principals and other school-based administrators. Four null hypotheses guided the study: there is no statistically significant difference in the mean scale scores between the secondary principals and the other administrators (assistant principals, vice principals, or administrative assistants) for skill importance, for technology competence, or for frequency of use, and there is no statistically significant difference in the mean scale scores for the perceptions and attitudes of school-based administrators toward computer use and the use of computer applications. The survey instrument was the Technology Competence Survey for School-Based Administrators. The population was secondary principals and other school-based administrators located in the Tri-County area of the southeastern part of South Carolina. The results indicated that both principals and other administrators placed a high level of importance on technology skills, rated themselves as fairly to highly technologically competent, used technology frequently, and held positive perceptions and attitudes about technology. The findings were consistent with previous research showing that principals who modeled the use of technology, shared their learning, and actively learned about technology were more likely to have faculty and students who used technology in their daily practice (Page-Jones, 2008).


Dedication

This dissertation is dedicated to my family, friends, and mentors who helped,

guided, prayed, and cried with me throughout this process.

To my family, who constantly reminded me that with God’s help and with their prayers and support, I could and would make it through this process.

To my friends, who also encouraged me, gave me support, and were there to help

me keep my sights on my ultimate goal.

To my mentors, who shared their infinite wisdom, knowledge, insight, and

encouragement, and who also were there when things were a little rough and tough to

encourage me to continue until I reached my goal.

To all of you, I give my deepest thanks and gratitude.


Acknowledgments

First, I give thanks to God for providing me with the strength, wisdom, and knowledge to reach this point. I would also like to thank Him for placing the necessary people in the proper places to give me encouragement and support when needed on this dissertation journey.

Second, I give my wholehearted thanks to my committee: Dr. Melissa McIntyre-Brandly, Faculty Mentor and Chair, for her many hours of coaching, reading, input, and encouragement; and Dr. Patricia H. Guillory and Dr. Morris Ravenell, who also spent many hours guiding and encouraging me to continue on this course of work.

In addition to my committee, a great deal of thanks goes to Dr. Carolyn Rogers, who saw in me the ability to take on and accomplish such a task as acquiring a doctoral degree.

Thanks to my parents, mentors, family, and friends. Without you, I would not

have had the support system to make it through this challenging and rewarding process.

To all of you, I am forever indebted.

Table of Contents

Acknowledgments
List of Tables
List of Figures

CHAPTER 1. INTRODUCTION
Introduction to the Problem
Background of the Study
Statement of the Problem
Purpose of the Study
Rationale
Research Questions and Hypotheses
Significance of the Study
Definition of Terms
Assumptions
Limitations
Nature of the Study
Organization of the Remainder of the Study

CHAPTER 2. LITERATURE REVIEW
Technology and School Leadership
School Leadership for Information and Communication Technology (ICT)
Theoretical Perspectives in Technology Education
School Leadership and Technology Integration
Barriers to Technology Integration
School Leadership and Decision-Making Functions
School Administrators’ Usage of Technology and Computers
Transformational Leadership
Technology in Schools
School Administrators’ Management Functions
Technology Impact on Instruction
School Administrators’ Technological Training
Instructional and Academic Technology Leadership
School Administrators’ Technology Competencies
Summary

CHAPTER 3. METHODOLOGY
Statement of the Problem
Research Questions and Hypotheses
Research Methodology
Research Design
Population and Sampling Procedures
Instrumentation
Validity
Reliability
Data Collection Procedures
Data Analysis Procedures
Ethical Considerations
Summary

CHAPTER 4. DATA COLLECTION AND ANALYSIS
Descriptive Data
Data Analysis Procedures
Results
Summary

CHAPTER 5. RESULTS, CONCLUSIONS, AND RECOMMENDATIONS
Summary of the Study
Summary of Findings and Conclusions
Recommendations
Implications

REFERENCES

APPENDIX A. LETTER TO THE SUPERINTENDENT
APPENDIX B. LETTER TO THE PRINCIPAL
APPENDIX C. LETTER TO THE OTHER SCHOOL-BASED ADMINISTRATORS (ASSISTANT PRINCIPALS, VICE PRINCIPALS, OR ADMINISTRATIVE ASSISTANTS)
APPENDIX D. INFORMED CONSENT FORM
APPENDIX E. TECHNOLOGY COMPETENCE SURVEY FOR SCHOOL-BASED ADMINISTRATORS
APPENDIX F. DEMOGRAPHIC QUESTIONNAIRE

List of Tables

Table 1. Size of School Descriptive Summary
Table 2. Job Position Descriptive Summary
Table 3. Number of Years Worked as a School Administrator Descriptive Summary
Table 4. Computer Access Descriptive Summary
Table 5. Psychometric Results of the Research Survey
Table 6. Descriptive Statistics for Skill Importance Survey Items: Principals
Table 7. Descriptive Statistics for Skill Importance Survey Items: Other Administrators
Table 8. Descriptive Statistics for the Overall Skill Importance Scale
Table 9. Independent Samples t-Test Results for Skill Importance
Table 10. Descriptive Statistics for Technology Competence Survey Items: Principals
Table 11. Descriptive Statistics for Technology Competence Survey Items: Other Administrators
Table 12. Descriptive Statistics for the Overall Technology Competence Scale
Table 13. Independent Samples t-Test Results for Technology Competence
Table 14. Descriptive Statistics for Frequency of Use Survey Items: Principals
Table 15. Descriptive Statistics for Frequency of Use Survey Items: Other Administrators
Table 16. Descriptive Statistics for the Overall Frequency of Use Scale
Table 17. Independent Samples t-Test Results for Frequency of Use
Table 18. Descriptive Statistics for Perception & Attitude Survey Items: Principals
Table 19. Descriptive Statistics for Perception & Attitude Survey Items: Other Administrators
Table 20. Descriptive Statistics for the Overall Perceptions and Attitudes Scale
Table 21. Independent Samples t-Test Results for Perceptions and Attitudes

List of Figures

Figure 1. Box Plots for the Skill Importance Scale
Figure 2. Box Plots for the Technology Competence Scale
Figure 3. Box Plots for the Frequency of Use Scale
Figure 4. Box Plots for the Perceptions and Attitudes Scale


CHAPTER 1. INTRODUCTION

Introduction to the Problem

The implementation of systematic educational reforms and the attitudes of classroom teachers are crucial in determining the success or failure of an innovative curriculum (Barnes, 2005; Lloyd, 2003). Teachers and administrators must agree with the underlying philosophy of the curriculum if changes are to be made (Barnes, 2005; Peters & Carey, 2010). The use of technology in schools was only successful when initiated by classroom teachers (Barnes, 2005). School administrators must be able to understand the basic technologies that they ask their staff to utilize and their students to learn (Wenzel, 2009). Many technology programs required basic technology competencies that school administrators should have.

Congress passed the No Child Left Behind Act ([NCLB], 2002) that reauthorized

the Elementary and Secondary Education Act. President George W. Bush signed the Act

into law in January 2002. NCLB brought many significant changes and reforms to the

nation’s schools (Learning Point, 2007). NCLB emphasized the improvement of student achievement in academics with the use of technology in both elementary and secondary schools, building access and accountability through integration initiatives and parental involvement.

Helen Padgett (2009), president of the International Society for Technology in Education (ISTE, 2009), noted that there was substantial research evidence that technology had become a very important component of the success of the educational system. A review of research and data conducted and published in the previous five years by the ISTE confirmed that technology (a) improved student achievement in reading, writing, and mathematics; (b) improved school efficiency, productivity, and decision-making; (c) helped teachers meet professional requirements; (d) improved learning skills; (e) helped schools meet the needs of students; (f) promoted equity and access in education; and (g) improved workforce skills (Ed Tech Action Network, 2009; ISTE, 2009). According to ISTE, educational technology had a positive effect on student achievement; however, the correct implementation of the educational technology was the key (ISTE, 2009).

Current research on educational technology and student achievement showed significant gains. For example, in Michigan’s Freedom to Learn (FTL) program, students had significantly higher levels of engagement in their work and in using technology as a learning tool when compared with the national average (Lowther, Strahl, Inan, & Bates, 2007). The results were consistent for the 2004-2005 and 2005-2006 school years (Lowther et al., 2007; Ross & Strahl, 2005). In one FTL school, eighth grade mathematics achievement doubled from thirty-one percent to sixty-three percent between 2004 and 2005, and science achievement increased from sixty-eight percent to eighty percent between 2003 and 2004 (Lowther et al., 2007).

According to Barber (2004), the implementation and the development of

accountability systems was one of the most powerful trends in educational policy in the

last twenty years. The message that was conveyed to parents continued to be that they

should be satisfied with schools that improve test performance from year to year and

begin to question the quality of instruction in the schools that showed poor performance

(Volante, Cherubini, & Drake, 2008). NCLB required every state to develop standards, accountability systems, and standardized tests, as well as to mandate the option for students to transfer from schools that had low test performance to schools that had higher test performance. NCLB promoted competition between schools (Amrein & Berliner, 2003; Hursh, 2005; Volante et al., 2008).

Understanding the implementation of NCLB and its positive or negative impact on students, teachers, the school system, and the improvement of student achievement through technology was of the utmost importance for our nation’s school administrators. School administrators need to gain a thorough understanding of what

difficulties teachers may encounter and what students may face if working with

substandard equipment or support.

Technology impacted the instruction of students either directly or indirectly and had the potential to reform the teaching and learning process through effective classroom strategies (Blake, 2000). Literature examining the technology competence of school-based administrators and its implications for school improvement planning remained relatively sparse. However, research continuously demonstrated that school leadership had a direct impact on teacher beliefs and student achievement (Leithwood & Jantzi, 2006; Nettles & Herrington, 2007; O’Donnell & White, 2005; Volante et al., 2008; Waters, Marzano, & McNulty, 2003).

Background of the Study

Testerman and Hall (2001) contended that one of the critical educational

leadership challenges for school administrators was the successful application of

technology in education. Leadership in utilizing student assessment, evaluation data, and


the use of technology must be viewed as a pressing concern in our nation’s schools

(Noonan & Renihan, 2006; Popham, 2005; Shepard, 2000; Volante & Cherubini, 2007;

Volante et al., 2008). School administrators must possess a variety of skills and know various assessment methods and their purposes, the rudiments of technical assessment quality, how to embed assessment data, and how to use data to adjust curriculum and instruction, as well as be technologically competent (Gallagher & Ratzlaff, 2008; Volante et al., 2008). It was unfortunate that many of our school administrators were not required to complete a course in assessment, evaluation, and the use of technology as a part of their training in our colleges and universities (Lukin, Bandalos, Eckhout, & Mickelson, 2004;

Volante et al., 2008). School administrators must learn on the job. Finding ways to

strengthen school administrators’ leadership skills required an analysis of the key issues

at the school district and school levels. School administrators, in their roles as instructional leaders, must possess a level of technological competency that allows them to effectively utilize technology in the management of their schools (Blake, 2000).

Leadership is important in securing school reform (Noonan & Renihan, 2006). Harris (2002) noted that a central element of the leadership mandate in our schools focuses on student learning. Instructional leadership was one of the important roles of the school principal (Dufour, 2001; Fullan, 2001, 2003; Noonan & Renihan, 2006). There was empirical evidence that the leadership of school administrators is an important influence on a school’s effectiveness (Anderson & Dexter, 2003; Hallinger & Heck, 1996, 1998; Leithwood & Riehl, 2003). There was scarcely any research, however, that focused on the effectiveness of training for school administrators’ use of technology in school information systems (Noeth & Volkov, 2004).

Researchers noted that technologies enhance classroom instruction and school administration (Benton, 2002; Perez-Prado & Thirunarayanan, 2002; Ringstaff & Kelly, 2002; Roschelle, Pea, Hoadley, Gordin, & Means, 2000; Sivin-Kachala & Bialo, 2000; Smith, Ferguson & Caris, 2001). However, the majority of schools had yet to implement technologies beyond the basic level. Technology had the potential to improve the educational system and the growth of our children. School administrators today were faced with the dilemma of implementing technology on a massive scale while, at the same time, experiencing some anxiety due to their own inability to make effective use of technology (Benson, Peltier, & Matranga, 1999; Law, 2002).

There was a growing debate regarding effective instructional strategies in the educational system that had inspired advances in technology use (Hughes & Zachariah, 2001). The researchers further agreed that school administrators must be equipped to create the kinds of conditions that allow for technology use and must determine the most effective ways to provide access to technology.

The International Society for Technology in Education ([ISTE], 2002), organized in 2002, is a not-for-profit organization dedicated to supporting the use of information technology to aid teaching and learning for K-12 teachers and students. The ISTE (2002) was the premier membership association for educators and education leaders engaged in improving learning and teaching by advancing the effective use of technology in PK-12 grades and higher education. The ISTE (2002), in collaboration with the United States Department of Education, had developed the National Educational Technology Standards for Administrators (NETS-A). The NETS-A established technology standards for educators across the United States (Langlie, 2008). The standards were developed through input from experts and partner organizations, reviews and comments from the field, and oversight by an advisory board, to address the technology competencies needed by school administrators.

According to the NETS-A standards, school administrators must be able to (a) inspire and lead the development and implementation of a shared vision for comprehensive integration of technology; (b) create, promote, and sustain a digital-age learning culture that provides a rigorous, relevant, and engaging education for all students; (c) promote an environment of professional learning and innovation that empowers educators to enhance student learning through the infusion of contemporary technologies and digital resources; (d) provide digital-age leadership and management to continuously improve the organization through the effective use of information and technology resources; and (e) model and facilitate understanding of the social, ethical, and legal issues and responsibilities related to an evolving digital culture (ISTE, 2002).

NCLB had established student achievement outcomes and the evaluation of

instructional programs as the most important measures for determining success in public

schools. Therefore, school administrators must understand how to use evaluation

practices, theories, and technology competences to promote effective decision-making in

the selection and assessment of academic programs (Coleman & Dickerson, 2007).

Technology was playing a predominant role in the field of education, and it is essential that school administrators know and be able to utilize instructional technology, especially those technologies related to computer use for assessing and finding information and for creating and communicating new knowledge (Langlie, 2008; Valdez, 2009).

Statement of the Problem

It was not known to what extent school-based administrators were competent in

utilizing instructional technology, especially those technologies that were related to

computer use for assessing and finding information, and for creating and communicating

new knowledge. School districts all over the world were faced with increasing pressure to

implement technology to enhance administration, teaching, and learning (Gurr, 2001).

Principals were expected to be able to manage the explosive change through an

increasing reliance on technological information as well as to become key leaders in

managing schools. Computer technologies were entering school administration systems

and were affecting the work places and paces of administrators, teachers, and even

changing the whole nature and structure of the organization (Yu, Chang, & Tsai, 2009).

Yu et al. (2009) discussed the factors that affect the adoption of computer technology “inside the organization, school leader’s acknowledgement and support on computer technology, the level of the information department, the management skills of information personnel, and the possible resistances from administrators” (p. 2). There

had been very few studies that had empirically addressed and examined the topic of

administrative use of educational technologies in schools, the level of computer use by

principals, principals’ perceived computer competence, and the principals’ leadership

style (Afshari, Bakar, Luan, Samah, & Fooi, 2009). However, research on the role of the

school leaders and technology had been addressed in some studies (Gurr, 2000).


This study sought to determine the technology competence level of current

secondary principals in the Tri-County located in the southeastern part of South Carolina,

the technology competencies school administrators should possess, and to investigate the

factors that were associated with the concept of technology competence of school-based

administrators. School administrators were the key elements in the successful use of information and communication technology in education. Principals need to keep abreast of developments that enhance teaching and learning, as well as to increase their skills and knowledge of computer technology (Yu et al., 2009).

Principals did not have to be experts on all aspects of technology. However, they

needed to be able to make informed decisions and to be able to seek help when needed.

Principals must be able to mobilize their staffs to create a technology culture, provide

training opportunities, understand technological implementation in classrooms, procure

necessary resources, and reeducate themselves in technology (Yu et al., 2009). Because

of the significance of technology in the twenty-first century and the implementation of

NCLB, it was important for school principals to know what technologies were being used

in their schools by teachers and students. At the same time, principals must be able to address the fundamental issue of how to effectively integrate technology into the school curriculum (Slowinski, 2000).

Purpose of the Study

This study investigated the level of technology competence for secondary

principals and other school-based administrators (assistant principals, vice principals,

etc.), who were identified by the principal as proficient users of technology in the

schools. The study focused on the use of computer applications in administrative


functions of secondary principals. This study examined the relationship between the

level of use of computer applications by the secondary principals and previous computer

use, computer training, perceptions, and attitudes that were held by the school

administrators toward computers.

The Brockmeier, Sermon, and Hope (2005) study sought information about principals and their relationship with computer technology. Brockmeier et al. (2005) concluded that principals were central to achieving successful learning outcomes with technology. “Leadership in knowing how to best use technology in the teaching and learning process, facilitating its integration into the learning environment, and making it possible for teachers to adopt technology are tasks for the principals” (p. 55; Golden, 2004). Principals must attain a threshold of technology expertise, using technology to accomplish their tasks and facilitate its integration into teaching and learning (Slowinski, 2000).

Rationale

As technology became increasingly important to the field of education in the

United States, the technology competence of secondary principals and other school-based

administrators needed to be investigated to identify what specific technology skills and

knowledge they possess, and what competencies were associated with a successful

educational leader. Langlie (2008) contended that technology leaders would need to be prepared both in the general competencies associated with successful educational leaders and in attending to the qualities that school leaders value as important.

Dugger (2007) pointed out that there was an increase in the use of technology education in our nation’s schools, and educators were placing increasing importance on technology education as a part of the overall learning experience. Dugger also stated that a number of states now included technology education in the state framework, which showed that educators were placing increasing importance on technology education as part of the overall learning experience. This trend was instigated by research on the increasing need for a technologically literate populace (Dugger, 2007; ITEA, 2006; Meade & Dugger, 2004; Rose & Dugger, 2002).

Meade and Dugger (2004) conducted a study to determine the current state of

technology education and to place the data obtained in the context of the NCLB

requirements, the standards movement as well as the increasing need for a

technologically literate citizenry. The study found that more states were becoming informed about what technology and technological literacy encompass. In the spring of 2001, the International Technology Education Association ([ITEA], 2006) commissioned the Gallup Organization to research American citizens’ knowledge of technology and attitudes about technological literacy (Rose & Dugger, 2002). The results of the survey were positive in terms of the public’s acceptance of technological literacy. The survey concluded that (a) the public considers technology to be an important factor in everyday life, (b) the public’s definition of technology only encompassed computers and the Internet, and (c) schools should include the study of technology in the curriculum (Rose & Dugger, 2002).

Macaulay (2008) noted that in order to be effective leaders, principals must

possess knowledge and proficiency in technology skills and technology integration.

Bybee (2003) concluded that the “new standards for student assessment, professional development of teachers, and program enhancement direct our attention to central components of technology education and provide direction for those who made changes in policies, programs, and practice” (pp. 24-25). School-based administrators must begin

to implement changes based on their capacities and responsibilities within the education

system.

One of the many requirements of an effective school leader was to provide strong

technology leadership (Redish & Chan, 2001). Administrative leadership was one factor

that affected the success of integrating technology into schools (Byrom & Bingham,

2001; Redish & Chan, 2001). In the late 1990s, research had shown that schools with

effective technology programs had strong administrative leaders, who understood the

benefits of technology as well as who supported the technology programs (Redish &

Chan, 2001). According to the South East Initiatives Regional Technology in Education

Consortium ([SEIR*TEC], 2000), the most important factor that influenced successful

integration of technology within the schools was strong leadership.

Schools that had made progress toward technology adoption and integration were

led by school administrators who had a vision of what could be done through the use of

technology (Redish & Chan, 2001). According to Schmeltzer (2001), school leaders

today need more than just the basic competencies, such as word processing, email, and

other daily-used applications. School leaders must understand how technology improves instructional practices, develop strategies for helping teachers use technology in their classrooms, and use their mentoring and team-building skills to create a system of ongoing support for the entire educational community as it moves forward in using technologies (Schmeltzer, 2001). “Educating those who are in a position to make organizational decisions and point the way for others, will bring districts and schools closer to achieving their vision for technology and, importantly, for education as a whole” (Schmeltzer, 2001, p. 3).

Research Questions and Hypotheses

The following research questions and hypotheses guided this study:

R1 What is the difference in the mean scale scores for skill importance between

the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

skill importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H1 There is a statistically significant difference in the mean scale scores for skill

importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

R2 What is the difference in the mean scale scores for technology competence between the secondary principals and the other administrators (assistant principals, vice principals, or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

technology competence between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H2 There is a statistically significant difference in the mean scale scores for

technology competence between the secondary principal and the other administrators

(assistant principals, vice principals, or administrative assistants).


R3 What is the difference in the mean scale scores for frequency of use between

the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H3 There is a statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

R4 What is the difference in the mean scale scores for perceptions and attitudes of

school-based administrators toward computer use and the use of computer applications?

H0 There is no statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

H4 There is a statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.
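To make these hypothesis tests concrete, the sketch below illustrates how each null hypothesis could be evaluated with an independent-samples t-test, the analysis reported in Tables 9, 13, 17, and 21. It is a minimal illustration rather than the study’s actual analysis code; the data file, column names, and significance level are assumptions.

```python
# Illustrative sketch (not the study's analysis code): comparing mean scale
# scores between secondary principals and other administrators with an
# independent-samples t-test. Column names and alpha are assumptions.
import pandas as pd
from scipy import stats

ALPHA = 0.05  # assumed conventional significance level


def compare_groups(df: pd.DataFrame, scale: str) -> None:
    """Test H0: no difference in mean scale scores between the two groups."""
    principals = df.loc[df["position"] == "principal", scale].dropna()
    others = df.loc[df["position"] == "other_administrator", scale].dropna()

    t_stat, p_value = stats.ttest_ind(principals, others)

    decision = "reject H0" if p_value < ALPHA else "fail to reject H0"
    print(f"{scale}: t = {t_stat:.3f}, p = {p_value:.4f} -> {decision}")


# Hypothetical usage with a survey export:
# df = pd.read_csv("survey_responses.csv")
# for scale in ["skill_importance", "technology_competence",
#               "frequency_of_use", "perceptions_attitudes"]:
#     compare_groups(df, scale)
```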

Significance of the Study

As the nation became more dependent on technology, parents and their students

continued to expect the public school system to include the integration of computers in

the schools (Slowinski, 2000). Businesses and communities required effective leadership

in the area of technology from forward-thinking and insightful school leaders (Slowinski,

2000). Slowinski (2000) also pointed out that from the demands and expectations of the


general public, school-based administrators who implement technology effectively in

their schools and communities contributed to both the economy and education in the

United States.

According to O’Dwyer, Russell, and Bebell (2004), principals and other school-based administrators were among the most powerful factors in increasing technology use for teaching and learning. O’Dwyer et al. (2004) believed that it was necessary for administrators to generate an understanding of the organizational characteristics that were associated with the use of technology in the classroom as well as to effect policy changes. The Quality Indicators for Assistive Technology Services (2006) revealed that administrators and principals were change agents. Examining the competency of school-based administrators and the increase in technology use had the potential to lead to a greater understanding of policy differences, organizational practices, and how technology was used from school to school as a teaching and learning tool (Blake, 2000).

Chang, Chin, and Hsu (2008) concluded that principals’ technology leadership

was strongly needed for effective utilization of technology in the schools and with

teachers’ integration of educational technology. The principal’s role had come to include leading student learning, reflecting the vision of the building, and supporting and facilitating practices of leadership that created change and continual educational improvement in the schools (Chang et al., 2008; Orr & Barber, 2006), as well as serving as instructional and curriculum leader (Checkley, 2000; Serhan, 2007; Wu, 2004), leading technology integration (Chang & Wu, 2008; Hew & Brush, 2007; Holland & Moore-Steward, 2000), and serving as technology leader (Anderson & Dexter, 2005; Chang et al., 2008; Gurr, Drysdale, & Mulford, 2006; Yeh, 2003).


Studies had shown that technology leadership had a significant impact on the educational

system (Anderson & Dexter, 2005; Chang et al., 2008; Creighton, 2003; Scanga, 2004;

Wang, 2010).

In order to prepare all students for success in the information age, educators must recognize that the nation’s strength and security depend on remaining competitive in today’s global high-tech economy, as well as in the foreseeable future, and that education was the key factor that ensured school success and prepared educators for this new world of technology (Cavanaugh, 2001; National School Boards Association [NSBA], 2009). School-based administrators needed to do the vision thing; that is, to empower their teachers and students in new ways and to develop the educational technology vision of the school (Cavanaugh, 2001).

According to Holland and Moore-Steward (2000), a priority for school districts

was to provide a pool of competent school leaders in a technology-rich environment.

School-based administrators must have sufficient knowledge of technology to guide them

in their decision-making in technology planning and staff development (Holland &

Moore-Steward, 2000). School-based administrators also should know how to evaluate teachers in their use of technology and how to support staff in applying technology in meaningful learning activities (Holland & Moore-Steward, 2000).

Definition of Terms

There were a number of terms that were important to this study. As such, the

following terms will be operationally defined:


Educational technology. “The study and the ethical practice of facilitating learning and improving performance through developing and then implementing instructional processes and materials” (Reiser & Dempsey, 2007, p. 6).

Leader. An individual who significantly affects the thoughts, feelings, and/or

behaviors of a significant number of individuals (Ambler, 2006).

Leadership. Persons who by word and/or personal example markedly influence

the behaviors, thoughts, and feelings of a significant number of their fellow human

beings (Ambler, 2008).

Informatics. “Computing science – the science dealing with design, realization, evaluation, use, and maintenance of information processing systems, including hardware, software, organizational and human aspects, and the industrial, commercial, governmental, and political implications of these” (UNESCO, 2002, p. 12).

Informatics technology. “The technological applications (artifacts) of informatics in society” (UNESCO, 2002, p. 13).

Information and communication technology. “The combination of informatics technology with other, related technologies, especially communication technology” (United Nations Educational Scientific and Cultural Organization [UNESCO], 2002, p. 13).

Instructional leadership. Behaviors of a school leader (Alig-Mielcarek, 2003).

Instructional technology. The systemic and systematic application of strategies

and techniques derived from behavioral, cognitive, and constructivist theories to the

solution of instructional problems.


Secondary Principal. A person who is responsible to the superintendent for the

administration of the secondary program, Grades 9-12.

School-based administrator. “Personnel at a school with teaching experience who have the same responsibility of carrying out various administrative functions in managing a school; positions to include assistant principal, vice principal or administrative assistant” (Blake, 2000, p. 7).

Technology applications. “Software and systems, run on school equipment, which support important administrative and instructional functions” (National Center for Education Statistics [NCES], 2002a, p. 2).

Technology competency. The ability to select and apply contemporary forms of

technology to solve problems or compile information (Colorado State Government,

2005).

Technology integration. “The incorporating of technology resources and technology-based practices into the daily routines, work, and management of schools” (NCES, 2002b, p. 2).

Transformational leadership. A set of behaviors of individuals who accomplish

change (Valdez, 2004).

Assumptions

The following assumptions were present in this study:

1. School-based administrators had sufficient knowledge to recognize the

level of competency necessary to effectively perform certain functions in their job

responsibilities using technology.


2. The participants provided accurate and truthful responses to each item on the

survey.

3. The results from this study indicated whether the technology competence of

school-based administrators was at an acceptable level, which ensured effective and

efficient utilization of technologies in the educational environment.

Limitations

The following limitations were present in this study:

1. This study was limited to public school secondary principals and other

administrators (assistant principals, vice principals or administrative assistants). State

requirements for certification for all public school administrators are uniform, whereas,

all non-public school administrators may not have specified certification requirements.

2. This study was limited to the Tri-County school-based administrators located

in the southeastern part of South Carolina. Generalizing the results to other states may be

limited due to different certification requirements for school administrators and variable

fiscal priorities on the implementation of technologies.

3. The results of the study were limited by the availability of technology and the

application software and hardware that is available to the participants. Due to technology

acquisition strategies of the participants’ schools or school districts, the participants had

significantly different opportunities to be able to use and to know technology in the

performance of their administrative job responsibilities.

4. Data obtained from this study was dependent on the truthfulness and accuracy

of the participants.


5. The reported competence levels of the participants may have been inflated relative to their actual skills due to the instrument’s self-reporting nature.

6. Data collection was restricted to the Technology Competence Survey for

School-Based Administrators.

7. The number of participants in this study was limited to the school-based administrators who actually responded to the survey.

Nature of the Study

The methodology that was used in this study was quantitative. The study used a

descriptive design as a means to investigate the level of technology competence for

secondary principals and other school-based administrators (assistant principals, vice

principals, or administrative assistants). This type of study examined the extent to which differences in one or more variables were related to differences in one or more other variables (Leedy & Ormrod, 2005).

This research design used a Web-based survey to gather data relevant to the study. Leedy and Ormrod (2001) noted that “research is a viable approach to a problem when there are data to support it” (p. 94). A survey was used to estimate the percentage of the population that had specific attributes when the researcher collected data from a small portion of the total population (Dillman, 2000; Hardy, 2005; Wallen & Fraenkel, 2001). Online surveys were a very promising research tool for accessing and involving people (Buchanan, 2002; Herrero & Meneses, 2006; Nesbary, 2000).
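As a concrete illustration of this survey logic (not part of the original study), the short sketch below estimates a population percentage from a sample proportion and attaches a normal-approximation confidence interval; the attribute and the counts are hypothetical.

```python
# Illustrative sketch: estimating the percentage of a population with a given
# attribute (e.g., daily computer access) from a small sample of respondents.
# The attribute and counts are hypothetical, not data from this study.
import math


def proportion_estimate(successes: int, n: int, z: float = 1.96):
    """Return the sample proportion and a normal-approximation 95% CI."""
    p_hat = successes / n
    margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, (max(0.0, p_hat - margin), min(1.0, p_hat + margin))


# Hypothetical example: 48 of 60 responding administrators report daily computer use.
p, (low, high) = proportion_estimate(48, 60)
print(f"Estimated percentage: {p:.1%} (95% CI: {low:.1%} to {high:.1%})")
```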

This study sought to identify specific technology skills and knowledge that

school-based administrators should possess and to describe the appropriate level of

competence for each of the technology skill areas. Creswell (2009) suggested that quantitative research provided a numeric description of the opinions, trends, and attitudes of a population by studying a sample of that population. According to Creswell (2009), quantitative research tested objective theories by examining the relationship among variables.

Organization of the Remainder of the Study

This study was divided into five chapters. Chapter 1 presented the introduction to

the problem, background of the study, statement of the problem, purpose of the study,

rationale, research questions and hypotheses, significance of the study, definition of

terms, assumptions, limitations, and the nature of the study. The remainder of this study

will be divided up as follows: Chapter 2 will present the literature review – technology

and school leadership, school leadership for information and communication technology

(ICT), theoretical perspectives in technology education, school leadership and technology

integration, barriers to technology integration, school leadership and decision-making

functions, school administrators’ usage of technology and computers, transformational

leadership, technology in schools, school administrators’ management functions,

technology impact on instruction, school administrators’ technological training,

instructional and academic technology leadership, and school administrators’

technological competencies.

Chapter 3 will present the methodology, an introduction, research design and

procedures, population and sampling, instrumentation, data collection and procedures,

data analysis and procedures, and ethical considerations. Chapter 4 will present the

presentation and analysis of data, descriptive data, data analysis, results and summary.

Chapter 5 will present the summary, conclusions, and recommendations of the study.


The estimated timeline for the project will be 6 months from the approval date of the

dissertation proposal.

CHAPTER 2. LITERATURE REVIEW

Chapter 2 reviews the literature on the technology competencies of school-based administrators. The chapter presents technology and school leadership, school leadership for information and communication technology (ICT), theoretical perspectives in technology education, school leadership and technology integration, barriers to technology integration, school leadership and decision-making functions, school administrators’ usage of technology and computers, transformational leadership, technology in schools, school administrators’ management functions, technology impact on instruction, school administrators’ technological training, instructional and academic technology leadership, and school administrators’ technological competencies.

Technology and School Leadership

For over a decade, there was no clear-cut or agreed-upon definition of leadership in the literature (Ho, 2006; Goldring & Greenfield, 2002; Smylie, Conley, & Murky, 2002; West, Jackson, Harris, & Hopkins, 2000). However, despite the differences, there appeared to be some commonality in definitions of leadership. Ho (2006) believed that leadership involved exerting some type of influence on people in an attempt to impact their beliefs, values, and actions. Goldring and Greenfield (2002) suggested that leadership in education was an ambiguous and complex concept, and that the diffuse and highly fragmented nature of theory and research on school and school district administration and leadership reflected that conceptual fuzziness.

School administrators must have a firm foundation in skills relative to the

instructional process, technology knowledge as well as managerial and leadership skills

(Geer, 2002). According to Geer (2002), school administrators were the key to

successful technology planning and integration in the schools. However, school

administrators sometimes lacked the necessary technology skills and knowledge to help

them to achieve their schools’ technology goals (Geer, 2002). Without technology

education, it was difficult for school administrators to understand the process of

implementing and using technology, and to make wise decisions pertaining to technology

(Geer, 2002).

A study by Ertmer, Bai, Dong, Khalil, Park, and Wang (2002) found that there was

very little research that delineated best practices for preparing administrators to be

technology leaders. Ertmer et al. (2002) also found that most administrators were simply

acquiring technology skills and knowledge on the job, with some training that was

provided by different vendors and some colleges and universities. Mehlinger and

Powers (2002) concluded that graduate school programs generally were doing a poor job

in preparing school principals and superintendents to be technology leaders. Whatever the issues that needed to be addressed may have been, school-based administrators needed innovative approaches to gain the pedagogical and technical knowledge and skills in technology (Ertmer et al., 2002).

School administrators were being charged with the added responsibilities of managing and overseeing their local technology programs, and few administrators were prepared to do so (Moskowitz & Martabano, 2008). Because of the expanded areas of

responsibilities, school administrators were taking a more active role in the day-to-day

routines of managing the building level technology. Moskowitz and Martabano (2008)

contended that the instructional leader in the building, the school administrator, should

treat technology as no different from any other programs that he or she oversees.

Technology should be a part of good pedagogy and not an added, isolated program or tool in the school environment.

School administrators need to have knowledge and technology skills in the

utilization of technology for teaching and learning and the utilization of technology in

non-instructional processes of managing and leading schools (Geer, 2002). School

administrators must accept the challenge to create supportive conditions that would foster

innovative uses of computers (Geer, 2002). Technology was one way of increasing

learning efficiency. Educators must continue to develop and discover how to implement

new technologies into the learning environments and also focus their efforts on

facilitating learning (Geer, 2002).

Technology, by itself, was not a panacea for educational problems, but when

combined with applicable learning models, the overall quality of education was enhanced

(Geer, 2002). O’Dwyer, Russell, and Bebell (2004) stated that the responsibility for increasing the use of technology, and for the strategic decisions regarding the focus and range of professional development opportunities in the schools, rested solely on the shoulders of school administrators.

Perez and Uline (2003) contended that problem solving in our schools lay at the heart of educational administration. Therefore, school administrators, as problem solvers, must be able to gather, make sense of, and communicate information. This communication could be made through computer technology (Perez & Uline). Computer technology could offer administrators quick access to the tools and data needed to manage, report, and retrieve information.

Perez and Uline (2003) suggested that school administrators should recognize the

computer’s potential to support administrators’ practice. Perez and Uline (2003)

concluded that the more school administrators had authentic and risk-free opportunities to

practice using the computer, the more likely administrators would confidently and

effectively employ the tool.

School administrators were important to achieving successful learning outcomes

with technology (Brockmeier, Sermon, & Hope, 2005). Achieving the promise to

implement technology in the learning environment required leadership with vision and

expertise (Brockmeier et al., 2005; Golden, 2004; Slowinski, 2003). Leadership in

knowing how to use technology in the teaching and learning process, making it possible

for teachers to adopt technology, and facilitating technology integration into the learning

environment were technology tasks that were needed by school-based administrators

(Brockmeier et al., 2005; Golden, 2004; Slowinski, 2003).

Hope and Brockmeier (2002) reported two conclusions relevant to school administrators and computer technology. School administrators must have technology professional development, and they needed to be familiar with and have an understanding of computer technology in order to facilitate technology integration into schools. School administrators needed to become advocates for the instructional technology program (Moskowitz & Martabano, 2008). The best way for the


school administrator to take the lead would be, first, to have an understanding of what technology was in place and what purpose the technology would serve for the instructional or administrative goals and functions of the school building and, second, to establish maintenance and growth models supported by data, observation, and anecdotal evidence from the staff (Moskowitz & Martabano, 2008). Schmeltzer (2001)

concluded that administrators must be able to set reasonable expectations for technology

use as well as be able to understand how technology could be successfully implemented

in the schools. “Administrators must have a vision for education and a plan to make it happen” (Schmeltzer, 2001, p. 17).

According to Afshari, Bakar, Luan, Samah, and Fooi (2009), the need for

principals to cultivate broad-based, skillful participation in the work of leadership was

very important, if schools were to become more effective and efficient learning

communities. Educational leaders must recognize the importance of their role in the

implementation and utilization of technology. Administrators must be proficient in the

use of technology, and provide leadership in the use of technology for instructional,

administrative, and learning functions (Afshari et al., 2009).

School Leadership for Information and Communication Technology (ICT)

According to the United Nations Educational Scientific and Cultural Organization ([UNESCO], 2002), in order to define ICT, informatics (computer science) and informatics technology must be defined first. Informatics is a science that deals with the

realization, design, use, evaluation, and maintenance of information processing systems

that include hardware, software, human and organizational aspects, and the commercial,

governmental, industrial, and political implications (UNESCO, 2002). Informatics


technology is the technological applications (artifacts) of informatics in society

(UNESCO, 2002). ICT is defined as combining informatics technology with other,

related technology, for example, communication technology (UNESCO, 2002).

Therefore, ICT is used, applied, and is also integrated in all activities of working and

learning on the basis of conceptual understanding and methods of informatics (UNESCO,

2002).

Leadership was needed in the educational environment to bring about ICT

integration (Creighton, 2003; Flanagan & Jacobsen, 2003; Ho, 2006; Kirkman, 2000).

The purpose of technology leadership should be for school-based administrators to

influence teachers to use ICT in their instructional practices (Flanagan & Jacobsen,

2003). Flanagan and Jacobsen (2003) identified five common themes that technology

leaders needed in integrating effective IT: (a) student engagement, (b) shared vision, (c)

effective professional development, (d) equity of access, and (e) ubiquitous network.

The Calgary Board of Education ([CBE], 2000) Leadership Development

Program (LDP) analyzed the principal’s role as technology leader. This program

outlined the core competencies, role responsibilities and personal attributes for school-

based administrators. The LDP identified five role responsibilities as they related to

achieving the goals of ICT integration: (a) leader of learning, (b) leader of student

entitlement, (c) leader of capacity building, (d) leader of community, and (e) leader of

resource management (CBE, 2000; Flanagan & Jacobsen, 2003).

According to Flanagan and Jacobsen (2003), technology leadership must be more

than just resource acquisition for school-based administrators. Technology leadership

must have multiple dimensions given the complexity of schools as learning organizations.


School administrators must look at technology integration as a complex change that had

the potential to impact every aspect of the public school system as well as the school

(Thorburn, 2004).

School administrators must be change agents who need to provide support,

develop and cultivate learning communities, create incentives, and accurately assess local

needs (Thorburn, 2004). School administrators must remember that teachers are people

who need good staff development and adult learning (Paben, 2002; Thorburn, 2004). If school administrators could gain computer skills and practice them effectively, then they would be able to provide teachers with staff development, giving teachers a greater opportunity for success in helping the schools to learn and grow as professional organizations (Thorburn, 2004).

Gavin (2002) concluded that educational technology referred to a particular approach, whereas instructional technology referred to the use of such technological processes specifically for teaching and learning (Gavin, 2002). Administrators need to help create an environment that is conducive to a child-centered classroom supported by technology integration. Administrators should also influence and support teachers in using technology (Gavin, 2002).

Gavin (2002) believed that the role of the principal was to support the staff. If the

principal was to make a difference with children, then the principal must make a

difference with the people directly involved with the children. Technology could be used

as a motivational tool in the classroom only if teachers knew how to use it. Therefore,

the key element for school administrators was to provide the opportunity for teachers to

explore the tool of technology (Gavin, 2002). Administrators also needed to help


teachers integrate technology in creative ways in their classrooms in order to create

better learning environments.

Noeth and Volkov (2004) pointed out that administrators and teachers were key

technological interfaces in the schools. The administrators were responsible for bringing

technology into the district or school building, while the teachers were responsible for

bringing technology into the classrooms (Noeth & Volkov, 2004). If technological

implementations were to succeed, administrators and teachers must have the motivation,

knowledge, and skill to implement and utilize technology in effective ways to enhance

learning for all students (Noeth & Volkov, 2004). Noreth and Volkov (2004) contended

that administrators and teachers must be held accountable for the effectiveness of their

uses of technology to support an enhanced learning environment for the educational

community, and for the subject matter learning for the range of students who were found

in their classrooms.

Theoretical Perspectives in Technology Education

Since educational technology was viewed as both tools and processes, it was

important to examine the historical perspectives of technology (Roblyer & Doering,

2010). These perspectives or paradigms defined educational technology. Roblyer and

Doering (2010) discussed four perspectives that defined educational technology.

Perspective one was educational technology as media and audiovisual. This perspective

grew out of the audiovisual movement in the 1930s. Perspective two was educational

technology as instructional systems and instructional design. This perspective originated

with post-World War II military and industrial trainers. This view was based on the

belief that both nonhuman resources (media) and human resources (teachers) could be part of an


efficient system for addressing any instructional need. Perspective three was educational

technology as vocational training. This perspective originated with industry trainers and

vocational educators in the 1980s. This perspective asserted that schools were to prepare

students for the world of work by using technology and that vocational training could be

a practical means of teaching all content areas, such as mathematics, science, and language.

Perspective four was educational technology as computer systems. This perspective

began in the 1950s with computers and gained momentum in the 1960s when computers

were used instructionally (Roblyer & Doering, 2010).

Januszewski (1994) focused on the thought and activities of James D. Finn in

educational technology. In the early 1960s, the focus was on media such as film,

television, filmstrips, radio, and audio recordings. By the middle of the 1980s, the focus

had shifted to computers (Januszewski, 1994; 2001). Finn (1953) noted two

characteristics of a profession that were important to educational technology: (a) a body

of intellectual theory expanded by research, and (b) an intellectual technique

(Januszewski, 1994). In the late 1950s, Finn (1953) used the terms "technology" and

"technical" in his writings (Januszewski, 1994). Finn (1964) wrote that technology was

not just hardware or even hardware and materials. Technology was a way of organizing,

a way of thinking, and man-machine systems that included systems of organization,

patterns of use, and tests of economic feasibility. Januszewski (2001) considered

educational technology as a theoretical construct.

In 1994, the term educational technology changed to instructional technology

(Januszewski, 2001). Instructional technology and educational technology were used

interchangeably (Januszewski, 2001). Richey (2008) defined educational technology as


the ethical practice and study of improving performance and facilitating learning by

creating, managing and using the appropriate technological resources and processes.

According to Zuniga, Valdez, and Lu (2010), cognitivist and constructivist

theories began to have a major impact on design practices in instructional technology

from the 1990s to the present. Zuniga, Valdez, and Lu (2010) defined instructional technology as

"the theory and practice of design, development, utilization, management, and evaluation

of processes and resources for learning" (p. 4). Reiser (2001) defined instructional

technology as "the problem analysis, solution design, development, implementation,

management, and evaluation of instructional processes and resources to improve learning

and performance in education and at work" (p. 53). The Commission on Instructional

Technology (1970) defined instructional technology as "a systematic way of designing,

implementing, and evaluating the total process of learning and teaching in terms of

specific objectives, based on research in human learning and communication and

employing a combination of human and nonhuman resources to bring about more

effective instruction" (p. 199).

Whelan (2005) surveyed the past, present, and future of instructional technology.

Since the beginning of the 1900s, there had been growth of instructional technology in

media-technological innovations, theoretical advances, and core issues (Whelan, 2005).

During the twentieth century, growth could be seen from the early stereographs, radio,

film, and TV to personal computers and now computer-aided instruction and the Internet

(Whelan, 2005). During the same period, the theoretical paradigms that accounted for the

use of technologies in instruction had shifted as social, cultural, and technological needs

evolved.


Whelan (2005) discussed three principal families of theories about learning that

have had an impact on instructional technology over the last 100 years. The first theory

was behaviorism. Behaviorism was concerned with observable behavior rather than

inner mental experiences. According to Whelan (2005), behaviorism was reflected in the "drill

and practice" software, which was sometimes used in skill building. MacVie (2009)

believed that in behaviorism, behavior was shaped through reinforcement, positive or

negative. With positive reinforcement, the stimulus that was applied encouraged

positive behavior to happen again; with negative reinforcement, the stimulus that was

withheld discouraged the negative behavior from happening again. Therefore, learning was

defined as a change in behavior in the learner (MacVie, 2009).

The second theory was cognitivism. Cognitivism emphasized the importance of

perception and thought as the bases for understanding learning and human

behavior. Cognitivistic instructional design characterized a topic or subject

matter and transformed the subject matter into a set of cognitive tasks (Whelan, 2005).

The cognitivistic frameworks included troubleshooting, problem diagnosis, and discovery

tasks. MacVie (2009) believed that cognitivism came about as a response to behaviorism.

MacVie (2009) contended that people were rational and required active participation in

order to learn. In cognitivism, students dictated their own course of learning.

The third theory was constructivism. Constructivism was the process of

knowledge construction, with the learner in charge of his or her own learning

experience (Whelan, 2005). Constructivism allowed learners to build on prior knowledge

and thereby create their own understanding of concepts and ideas (Alonso, Manrique, &

Viñes, 2009; Karagiorgi & Symeou, 2005; Whelan, 2005). Karagiorgi and Symeou


(2005) pointed out that it was important to understand the various types of

constructivism, such as social, radical, evolutionary, post-modern, physical, and

information processing. Learning today was approached as a constructive, situated,

cooperative, self-regulated, and individually different process (Karagiorgi & Symeou,

2005). Constructivism, in our world today, provided a theory of cognitive growth and

learning that was applied to technology and became the guiding theoretical foundation

for the use of technology (Karagiorgi & Symeou, 2005).

School-based administrators must recognize that as their knowledge about

learning processes and cognition evolved, so would the applications to the field of

instructional technology. School-based administrators must look at past, present, and

future trends in order to diversify with technology.

Karagiorgi and Symeou (2005) were in agreement that instructional designers

were expected to be familiar with the epistemological underpinnings of several

theories and their consequences for the process of instruction. Karagiorgi and Symeou

(2005) agreed that constructivism over the last decade was the dominant theory that

supported construction of knowledge by the individual. Technology was a knowledge

construction tool that should confront the learner. MacVie (2009) believed that

constructivism acknowledged that knowledge was constructed based on experiences. All

learners had a different construction and interpretation of the process. The learner

interpreted his or her findings by integrating his or her past experiences into a situation

(MacVie, 2009). Learning was approached as a self-regulated, situated, cooperative,

constructive, and individually different process (Karagiorgi & Symeou, 2005).

Technology helped instructional designers to accommodate the constructivist perspective


in order to respond to the learning requirements in the twenty-first century (Karagiorgi &

Symeou, 2005).

The nature of an instructional model was the critical element in technology-

enhanced instruction (Alonso, Manrique, & Viñes, 2009; Tuckman, 2002).

Instructional designers must try to translate constructivism into instructional designs to

make use of the technology tools (Tuckman, 2002). Cey (2001) contended that critical

thinking, learning, and problem solving occurred only when education became learner

centered, authentic, collaborative, active, and personal. There must be a paradigm shift in

the role of educators and the use of technology in order to implement constructivist

strategies (Cey, 2001).

Saettler (2004) described four distinct paradigms that had emerged in educational

technology in this century. These paradigms had different philosophical and theoretical

orientations that had affected both the practice and theory of educational technology. The

paradigms were (a) the physical science or media view; (b) the communications and

systems concept; (c) the behavioral science-based view, comprising the behaviorist and

neo-behaviorist concepts; and (d) the cognitive science perspective (Saettler, 2004).

Nanjappa and Grant (2003) concluded that a complementary relationship

existed between constructivism and technology. Technology referred to

the environments and to the designs that engaged the learners, while constructivism was

the doctrine that stated that learning took place in context (Nanjappa & Grant, 2003). In

the constructivist domain, technology played an integral part in the learning environment

(Nanjappa & Grant, 2003). When you integrated constructivist methods with technology,


learners became more responsible for and active in the learning process (Grant, 2002;

Nanjappa & Grant, 2003).

Behaviorism and cognitivism were two dominant theoretical positions in the field

of learning with interactive courseware (Deubel, 2003). According to Deubel (2003),

discovery-learning materials were founded on cognitive models of information processing

and constructivism, while early computer-based materials were influenced by behaviorist

concepts. Whether designers chose to use a cognitive or behaviorist approach depended

upon the nature of the materials developed and the context in which the materials were used

(Deubel, 2003).

Johnson (2002) discussed the principles of the learning theories (behaviorism,

cognitivism, and constructivism) as they related to instructional technologies. In the

midst of the technology age, the learning theories were being examined and explored in

new ways (Johnson, 2002). Johnson (2002) believed that behaviorism was based on

observable changes in behavior and focused on a new behavioral pattern that repeated

until it became automatic. Cognitivism was based on the thought process behind

behavior. According to Johnson (2002), cognitivism referred to changes in behavior that

were observed and then used as indicators of what was happening inside the

learner’s mind. The constructivist theory was based on the premise that as human beings,

we all construct our own perspective of the world, through individual schema and

experiences (Johnson, 2002). Constructivism focused on preparing the learner to

problem solve in ambiguous situations (Johnson, 2002).


School Leadership and Technology Integration

Recognition of the central role of school leaders in the success of technology integration in

learning environments was gaining momentum nationally and internationally (Gibson,

2002). The importance of administrative support from school leaders in the integration of

technology, curriculum, and instruction was also understated and under-supported

(Gibson, 2002). According to Gibson (2002), one of the first steps in building a

successful technology program was for school administrators to create a supportive

environment conducive to maximizing technology integration into the curriculum.

NCLB brought many changes to American schools, which included accessibility to

technology (Perez & Normore, 2004). NCLB also stressed the importance of

providing technology integration for administrators, teachers, and students. School

administrators played a critical role in implementing and supporting the use of

educational technology in the schools (Perez & Normore, 2004). Administrators had the

task of making their staff feel comfortable about the effective and multiple uses of

technology by providing support, time, and clear expectations for the technology plan to

take place (Perez & Normore, 2004).

According to Lamb (2001), technology integration was more than just placing

equipment in classrooms and labs. People were the key to successful technology

integration and administrators were the key persons in providing leadership (Lamb,

2001). When it came to technology integration, school administrators were faced with an

uphill battle in making difficult decisions, working with entrenched staff, and changing

times (Lamb, 2001).


McCarthy (2009) contended that technology integration was a powerful tool that

increased motivation, communication, and hands-on active learning. McLester (2003)

believed that school administrators should keep staff focused and motivated when it came

to technology integration. School administrators were being held accountable for

evidence-based success, smart budgeting, a highly trained staff with opportunities for

ongoing learning, innovation, meeting the needs of clients, and clear

communication of goals and outcomes (McLester, 2003).

Perez and Normore (2004) believed that when school leaders and change agents

understood the impact that their staff's needs, teaching styles, curriculum goals, and

students' needs had on technology integration and use, technology could then reach its

potential in schools. School administrators had the responsibility of creating a

technology plan that was successfully adopted and implemented (Perez & Normore,

2004). School administrators must serve as role models in the use of technology

(Ditzhazy & Poolsup, 2002; Goddard, 2002; Perez & Normore, 2004).

Mentz and Mentz (2003) stipulated that schools were expected to equip learners

with the basic technological skills that were required in today's world. Therefore, there

was a demand for leadership to facilitate this process (Mentz & Mentz, 2003). School

administrators not only had to update their skills and knowledge, but also were required

to work towards the transformation of their roles as educational leaders (Mentz & Mentz,

2003). Leadership was required for the implementation and improvement of technology

in the schools (Mentz & Mentz, 2003).

According to Culp, Honey, and Mandinach (2003), NCLB recommended that all

eighth grade students should be technologically literate. NCLB also considered


technology as being an important source of support for teaching and learning across the

curriculum (Culp et al., 2003; Williams, 2006). Therefore, schools of the "Information

Age" must employ technology to better equip themselves and meet the needs of administrators,

teachers, students, and parents (Williams, 2006). Williams (2006) also claimed that

technology had the capability to be a "transforming" tool that enabled organizations and

individuals to gain advantages in the world of work and life.

Barriers to Technology Integration

Even though there were positive examples of technology being used to support

student learning and to foster positive changes, there were still some barriers school-

based administrators faced in integrating ICT. Flanagan and Jacobsen (2003)

summarized barriers as (a) pedagogical issues, (b) concerns about equity, (c) inadequate

professional development, and (d) lack of informed leadership. Pedagogical issues dealt

with educators developing a conceptual basis for applying technology as well as looking

to research for successful IT integration for an understanding of the relationship between

student learning, pedagogy, and technology (Flanagan & Jacobsen, 2003).

Concerns about equity dealt with giving all students the opportunities to acquire

the skills that were needed to participate in this new society (Flanagan & Jacobsen,

2003). Research had shown that women lagged behind their male counterparts in using

computers as tools for both work and recreation (Flanagan & Jacobsen, 2003; National

Telecommunication & Information Administration, 2000). Inadequate professional

development pertained to the lack of meaningful opportunities to acquire skills that were

needed to meet the ICT outcomes (Flanagan & Jacobsen, 2003). In many school

districts, technology funding had not been expanded for staff development, which had left


teachers to seek and finance their own professional development (Flanagan & Jacobsen,

2003). Some teachers resisted the pressure of implementing technology in their classrooms

because of the lack of opportunities to learn how to do it (Flanagan & Jacobsen, 2003).

A lack of informed leadership dealt with principals not being prepared for their

new role as technology leaders. Many principals had not had meaningful experiences using

computers with children and therefore lacked the required pedagogical experience and

vision to guide their teachers (Flanagan & Jacobsen, 2003). School administrators had

made decisions dealing with wiring and networking as well as purchasing equipment without

the understanding of how their decisions impacted student learning (Flanagan &

Jacobsen, 2003). When principals were unprepared to manage the issues surrounding ICT

networking, pedagogical judgments tended to take a back seat to technical and financial

considerations (Flanagan & Jacobsen, 2003).

Computer Anxiety

Baloğlu and Çevik (2009) claimed that school-based administrators should be

able to assume the role of technology leader and follow technological advancements

in their schools. However, an affective factor, computer anxiety, could

cause problems in the technology process (Baloğlu & Çevik, 2009). Computer anxiety

pertained to the apprehension and fear that were felt by persons when utilizing computer

technology (Baloğlu & Çevik, 2009). Anxiety was the most prevalent emotional problem

that was associated with computers (Baloğlu & Çevik, 2009). School-based

administrators who had limited encounters with computers were more likely to show

symptoms of anxiety and had more negative attitudes toward computers (Baloğlu &

Çevik, 2009).


Research by Kozma (2003) and Lan (2001) showed that school leadership could

reduce or remove barriers that teachers had in integrating technology successfully in the

classroom. The role of the school administrators was very important to successful

classroom technology integration (Dawn & Rakes, 2003; Kozma, 2003; Lan, 2001;

Pierson, 2001; Williams, 2006).

School Leadership and Decision-Making Functions

According to the National Conference of State Legislatures ([NCSL], 2010),

effective administrators led strong schools. Strong leaders were needed to maintain a

focus on student achievement, instructional improvement, and to inspire the staff and

students in their building to do the same (NCSL, 2010). School leadership was regarded

as one of the key factors in accounting for the difference between schools that fostered

student learning and underperforming schools (NCSL, 2010). Schools today required

school leaders who were trained; who understood the economic, social, and political

forces that influence education; who were committed to solutions and fresh ideas, and

were willing to take risks in implementing them; and who had a twenty-first century view

of education management (NCSL, 2010).

Chance and Chance (2002) noted that a primary function of leadership was

decision-making. Chance and Chance (2002) discussed the decision-making theories and

their relationship to school administrators. The decision-making theories were divided

into two categories: normative and descriptive. Normative theories offered ideal

processes or models for decision-making. Descriptive theories explained how decisions

came about in practice (Chance & Chance, 2002). School leaders must understand these

decision-making theories in order to determine to what extent and when others should be


involved in the decision-making, understand what effect time had on the decision-making

process, develop strategies that helped to prevent crisis in decision-making, and identify

what factors were within and outside the organization that impacted decision-making

(Chance & Chance, 2002).

Martin (2007) discussed four stages to successful decision-making. The first

stage of decision-making was for school administrators to determine salience, that is, which

factors to take into account. School administrators should look for less obvious but

potentially relevant factors. School administrators should always consider outside

influences on a solution rather than limit themselves to a narrow set of information.

Martin's (2007) second stage was analyzing causality, that is, how the many

salient factors related to one another. School administrators must consider

multidirectional and nonlinear relationships among variables. School administrators must

be able to look at test data and analyze what factors had a positive or negative effect on

the test scores.

The third stage of decision-making was envisioning the decision architecture, that is,

actually making a decision. According to Martin (2007), school administrators must look

at problems as a whole and examine how the parts fit together and how decisions affected

one another. When integrating technology, the school administrator must look at the

whole picture in making changes in the school. The fourth stage was achieving a

resolution, the outcome. Martin (2007) pointed out that school administrators must

creatively resolve tensions that occurred among opposing ideas and generate innovative

outcomes. School administrators must become leaders who embrace holistic thinking


rather than segmented thinking (Martin, 2007). Holistic thinking could easily and creatively

resolve the tensions that started the decision-making process (Martin, 2007).

School Administrators Usage of Technology and Computers

A broad and intensive use of technology and a strong technology infrastructure

were needed to create a twenty-first century educational system (State Education

Technology Directors Association [SETDA], 2007). An intensive use of technology was

also needed if schools were to prepare students to participate in a global economy

(SETDA, 2007). School administrators must rally to the call to action to integrate technology

as a fundamental building block into education by using technology comprehensively (a)

to develop proficiency in twenty-first century skills, (b) to support innovative teaching

and learning, and (c) to create robust education support systems (SETDA, 2007).

Slowinski (2003) reported that in order to be effective implementers of new ICT,

school leaders needed a level of ICT competence to perform the technology

roles well. School leaders must address the challenges of implementing new

technologies that included student management systems (Stuart, Mills, & Remus, 2009).

School leaders had a role in promoting ICT use among teachers for the benefit of increasing

educational outcomes (Baek, Jung, & Kim, 2008; Gosmire & Grady, 2007; Rakes &

Dawson, 2003; Stuart et al., 2009).

Quirk (2009) contended that the many forms of technology were very important

tools for learning, communicating, teaching, discovering, as well as expressing one’s self.

Moursund (2007) pointed out that the problem most school administrators faced was that

their knowledge and skills in information technology were very weak compared to their


skills and knowledge in other aspects of their jobs. A school administrator could not tell

whether a teacher was conducting an effective class using information technology

when the school administrator’s own educational experience had never included the use

of information technology.

School administrators must support and actively participate in school

improvement and school reform (Moursund, 2007). According to Blake (2001), in the

information age, school administrators, as professionals, needed computer applications.

These applications included (a) application software (i.e., word processing, database,

spreadsheet, and presentation software), (b) the Internet with e-mail, (c) student

information systems, and (d) other personnel and office productivity products. The

important technology trends for schools included virtual learning, data systems, and

mobile technologies (Gosmire & Grady, 2007; Johnson, 2004; Pruitt, 2005; Vail, 2005).

Students already had access to laptops, handheld devices (e.g., PDAs), and cell phones.

Therefore, schools and school administrators must learn to use these technologies.

Johnson (2004) suggested that school administrators needed to learn how to use these

technologies in order to enhance educational experiences. The essential tools that

enabled educators to use data to improve education and meet the demands of NCLB were

data management systems (Gosmire & Grady, 2007). Data management systems

allowed teachers, students, and parents access to information for various purposes:

checking assignments and current grades, seeing how well a student performed on a

specific content standard on a state test, and downloading class notes, among other tasks.

All of these tasks were accomplished with a data management system that interfaced with Web-based

applications (Gosmire & Grady, 2007).


The principals, as technology leaders, must be "knowledgeable enough" about

technology tools, must model the use of technology for administrative and managerial

tasks, and must make technology a routine part of their jobs (Gosmire & Grady, 2007).

Principals must also establish a context for technology and understand how technology

could be used to structure learning and empower teachers as well as help students to

become more technology literate (Brockmeier et al., 2005; Gosmire & Grady, 2007, p.

18). Hunnicutt (2008) discussed how school administrators played a key role in the

success of a school. School administrators must find ways to improve the teaching and

learning in their schools. Hunnicutt believed that the main goal of school administrators

was to become school leaders.

Transformational Leadership

Researchers had conducted a review of the concepts of leadership in educational

administration (Fullan, 2001; Leithwood & Riehl, 2003). Based on a review of years of

leadership research by top scholars in educational administration, these researchers had

concluded that there was a clear trend toward the accumulation of knowledge regarding

school leadership and its effects (Stewart, 2006).

Leadership had been, still is, and will continue to be, a major focus in the era of

school accountability and school restructuring. Researchers suggested that the study of

leadership had become increasingly eclectic, both methodologically and

philosophically (Bass, 1985; Burns, 1978; Riggio, 2009; Stewart, 2006). There always

will be a focus on leadership throughout the succeeding decades (Stewart, 2006). Fullan

(2001) claimed there was a short supply of effective leaders. Good leadership was needed


at all levels of the school system, leadership that effectively led schools

through change and advanced them even further than thought possible (Stewart, 2006).

According to Connor (2004), transformational leadership was leadership that

emphasized change in an organization through the use of empowerment, visioning, and

ethics (the end results). Transformational leadership was appropriate for cutback

management, task forces for problem solving, strategic planning, leadership and

professional development, grievance resolution as well as requests for proposals (Connor,

2004).

Ideally, all administrators should use all leadership forms in carrying

out their duties. However, only a few administrators were adept at using all

forms because of differences in interpersonal skills, decisiveness, willingness to bear

risk, and conceptual skills. Transformational leadership was of particular interest because of

the growing use of leadership methods and the newness and underutilization of this type

of leadership form.

Leadership before 1978 was often approached in the literature from an exchange

context (Connor, 2004). Leaders and their followers usually interacted with and influenced

each other's behavior. Burns (1978) introduced transformational leadership in his book,

Leadership. According to Burns, when leaders and followers helped each other to

advance to a higher level of morale and motivation, transformational leadership could be

seen.

Bass (1985) expanded upon Burns's original ideas to develop what was known

today as the "Transformational Leadership Theory." Bass defined transformational


leadership based on the impact that it had on followers. Bass suggested that

transformational leaders had trust, respect, and admiration from their followers.

Wagner (2009) defined transformational leadership as a leadership style that led

to positive changes in those who followed. Wagner also described transformational

leaders as being enthusiastic, energetic, and passionate. Not only were the

transformational leaders focused on helping every person of the group to succeed, these

leaders were also concerned and involved in the process as well (Wagner, 2009). Only

through the strength of their vision and personality were transformational leaders able to

inspire their followers to change their perceptions, expectations, and motivations to work

towards common goals (Wagner, 2009).

The Components of Transformational Leadership

Bass (1985) suggested that there were four different components of

transformational leadership: (1) intellectual stimulation – transformational leaders not

only challenged the status quo but also encouraged creativity among followers,

encouraging them to explore new ways of doing things and new opportunities to

learn; (2) individualized consideration – transformational leadership involved offering

support and encouragement to individual followers; (3) inspirational motivation –

transformational leaders had a clear vision that they were able to articulate to followers;

and (4) idealized influence – transformational leaders served as role models for followers (Wagner, 2009).

There were several qualities that the transformational leader should possess.

These qualities were charismatic in nature, which the transformational leader used to

achieve his or her organizational goals (Cherry, 2007). Cherry suggested that these

qualities included focusing attention on planned actions, encouraging risk, listening to


employee suggestions, providing feedback, demonstrating consistent trustworthy

behavior, and expressing concern for others.

Transformational Leader and Technology

Afshari et al. (2008) concluded that transformational leadership was one of the

most important factors that affected the integration of educational technology (Brooks-

Young, 2002; Ross, McGraw, & Burdette, 2001). According to Afshari et al. (2008), principals

must be able to integrate ICT into their daily practice and to provide positive and

consistent leadership for technology use in the teaching and learning process. Schools

must have leaders who could facilitate the change process as well as support a learning

community for technology integration.

Jung, Chow, & Wu (2003) conducted research on the role of transformational

leadership in enhancing organizational innovation and found that transformational leaders

were needed in regards to ICT implementation. Jung et al. study found a direct and

positive link between style of leadership, transformational and organizational innovation.

Transformational leadership had positive and significant relations with both an

innovative organizational climate and empowerment. Schepers and Wetzels (2005)

found in their study a positive relationship between technology use and transformational

leadership. Schepers and Wetzels concluded that a leader should facilitate events as well

as conditions that created a positive environment for technology adoption (training,

education, and organizational technical support).

Wilmore and Betz (2000) contended that principals played an integral role in

technology integration. According to Wilmore and Betz (2000), information technology

would only be successfully implemented in schools if the principal actively supported it,


learned as well, provided adequate professional development, and supported his/her staff

in the process of change. Effective change management and leadership skills were

essential for principals to ensure successful technology integration in their schools

(Wilmore & Betz).

Technology in Schools

There had been a successful push to provide technology in schools (Gahala,

2001). It was noted that many schools had computers in every classroom (Gahala, 2001).

Ninety percent of all schools were connected to the Internet, and about thirty-

three percent of teachers had Internet access in their classrooms (Gahala, 2001).

However, there were teachers who readily admitted that they were not making as much

use of technology as they could (Gahala, 2001). Although technology was prevalent in

the schools, there were several factors that affected how technology was used and

whether technology was used at all. These factors included technical support, placement

of computers for equitable access, new roles for teachers, effective goals for technology

use, time for ongoing professional development, teacher incentives, appropriate coaching

of teachers at different skill levels, availability of educational software, and sustained

funding for technology (Gahala, 2001). According to Gahala (2001), administrators

could take the following steps to promote technology use in the school: (a) pursue

funding strategies to provide the necessary technology, professional development,

technical support, equipment upgrades, and equipment maintenance to achieve

educational goals; (b) assess the school's technology use according to the seven

dimensions for gauging progress; (c) be aware of factors that affected the effective use of

technology for teaching and learning; (d) develop strategies for ensuring equitable use of


education technology for all students and teachers; (e) acknowledge the benefits of

plugging educators into technology: improved student performance, increased student

motivation, lower student absenteeism, and higher teacher morale; (f) understand the

implications of preparing teachers for the Digital Age; (g) ensure that the school is

providing professional development for effective technology use; (g) determine

expectations for teachers in regard to their use of technology in their classrooms; (h)

develop strategies for teaching the teachers and eventually winning teachers over; (i)

read about technology implementation strategies; (j) provide all teachers and

administrators with an Internet e-mail address; (k) use e-mail for all school

announcements; (l) provide a networked computer on the desk of every teacher and

administrator; (m) provide all teachers with on-site training in technology use; (n) ensure

that teachers have adequate time to practice new skills, explore software, and become

proficient with the school's technology; (o) involve teachers in identifying and pursuing

technology professional development that is appropriate to their needs and skills; (p)

encourage teachers to set their own technology integration goals as part of their

individual professional development plans; (q) ensure that adequate technical support is

available; (r) address any problems that arise with new uses of technology in the

classroom quickly and efficiently; and (s) use a variety of time and monetary incentives

as well as job requirements that encourage teachers to use technology in their classrooms.

According to Ed Tech Action Network ([ETAN], 2009), a review of research data

and case studies that had been published within the past five years by the International

Society for Technology in Education (ISTE) and the Consortium for School Networking

(CSN) confirmed that technology use in education yielded a broad array of meaningful


results: technology (a) improved student achievement in reading, writing, and

mathematics; (b) improved school efficiency, productivity, and decision-making; (c)

helped teachers meet professional requirements; (d) improved learning skills; (e) could

help schools meet the needs of all students; (f) promoted equity and access in education;

and (g) improved workforce skills.

School Administrators’ Management Functions

In the business world, information and knowledge management had been vital

tools in organizations. It was only in recent years that educational administrators had

started looking at how they could use technology and information to assist them in creating

effective learning environments (Petrides & Guiney, 2002). According to Petrides and

Guiney (2002), there had been a great deal of research literature on information management in

support of business, but scarcely any research literature regarding information management

to support educational learning.

Knowledge management could be used to support educational administration, which in

turn could support teaching and learning (Petrides & Guiney, 2002). Petrides and Guiney

(2002) noted that school administrators should be able to lead information-based

knowledge management efforts. As our society became increasingly information based,

school administrators, teachers, and students played an integral part in this process.

According to Schlögl (2005), the main purpose of information management was

to make available the right information at the right place and at the right time. Computer-

based information management and technology-oriented information management were

the primary means to this end (Schlögl, 2005). In technology-oriented information

management, educators were concerned with data management, IT management, and the


strategic use of IT (Schlögl, 2005). In data management, administrators were concerned

with data administration, which served as the planning and analysis function (Schlögl,

2005).

In database administration, administrators were concerned with a framework for

managing data on an operational level (Schlögl, 2005). In IT management,

administrators were concerned with the management of hardware, software, and IT

personnel (Schlögl, 2005). This included the level of information use, the level of information

systems, and the level of information infrastructure (Schlögl, 2005). The use of information

technology as a strategic resource was widely discussed; there were many publications

that dealt with the strategic relevance of information processing (Schlögl, 2005).

Friehs (2009) contended that knowledge management must not be mistaken for a

means of information exchange and data processing. The task of knowledge management

was to coordinate and organize personal and organizational knowledge. In other words,

knowledge management had to take care that the internal and external exchange of

knowledge of an organization (school) was facilitated and supported (Friehs, 2009). To

successfully implement knowledge management strategies and changes within the

school culture, administrators must be able to guarantee

acceptance and tolerance by everyone involved: faculty, staff, and students (Friehs,

2009). A cultural change could take place through (a) motivation for accepting changes,

(b) the development of new meanings for existing cultural concepts, (c) internalization of

new concepts and integration to existing culture, and (d) evaluation of change processes

(Friehs, 2009).


In implementing knowledge management activities, administrators wanted to (a)

find out how to improve the knowledge that was available; (b) find out which instruments

were adequate to develop, use, and distribute new knowledge in school; (c) clarify which

strategic, structural, process-related, technological, and/or cultural measures had to be

taken to introduce and implement knowledge management strategies; and (d) explore

which positive results were to be expected and where failure was possible (Friehs, 2009).

Friehs (2009) suggested that recommendable instruments included the Internet;

the school-based intranet; databases and database management; job rotation for teachers and

other school staff at different schools to gain new perspectives and create knowledge

networks; quality circles; communication platforms and communities of practice; training

as a multiplier of knowledge; mentoring programs; and storytelling to transfer

organizational knowledge by means of stories. Therefore, the supporting factors for

successful knowledge management strategies were a knowledge-based and knowledge-

oriented culture, the cooperation of top management (administrators), an

adequate technical and organizational infrastructure, a clear vision, and motivating

elements (Friehs, 2009). The aim of knowledge management was to improve teaching, learning, and

general working conditions (Friehs, 2009).

Technology Impact on Instruction

According to Keengwe (2007), technology permeated all sectors of our lives.

Educators were pushed to reform schools through technology (Becker, 2001; Keengwe,

2007). Over the past decade, there had been a push for school administrators, teachers,

parents, and students to use and integrate educational technology in the classroom


(Keengwe, 2007).

There was evidence that technology was changing the way teachers were teaching

in their classes. Sivin-Kachala and Bialo (2000), in their study about the effectiveness of

technology, reported consistent and positive patterns when students were

engaged in technology-rich environments: significant gains in achievement

in all subject areas, increased achievement in preschool through high school for

both special needs and regular students, and improved attitudes toward learning.

The Boster, Meyer, Roberto, and Ingle (2002) study examined the integration of standards-

based video clips into lessons that were developed by classroom teachers. Boster et al.

(2002) found increases in student achievement.

Thompson, Schmidt, and Davis (2003) pointed out that technology had the

potential for changing the way teachers taught as well as how students learned.

According to Cradler, McNabb, Freeman, and Burchett (2002), there was mounting

evidence that supported technology advocates’ claim that the twenty-first century

communication and information tools and traditional computer-assisted instructional

applications positively influenced student learning processes and outcomes. The Center

for Applied Research in Educational Technology ([CARET], 2005) had gathered research

and findings that emphasized the importance of using technology in conjunction with

collaborative learning methods and leadership aimed at technology planning for school

improvement purposes (CARET, 2005; Cradler et al., 2002). CARET (2005) contended

that technology was most effectively integrated into instruction when educators and

education decision makers (a) reviewed and analyzed the content of technology

applications to determine if the introduced skills and knowledge aligned with curriculum


content standards, (b) enabled students to acquire proficiency with the technology

application prior to the onset of the content standards based lesson, (c) supported the

development of instructional lessons and units that used technology to extend and

reinforce core curricula, and (d) developed detailed plans for infusing technology as a

tool to increase learning opportunities.

Cradler et al. (2002) pointed out that the key components of instructional

strategies that accompanied effective technology implementation were formative feedback

and collaborative activities. Leadership played a pivotal part in aligning available

technology resources with systematic school improvement goals (Cradler et al., 2002).

There must be an understanding of educators’ efforts for technology to positively

influence students’ academic performance (Cradler et al., 2002).

School Administrators’ Technological Training

Technology leadership was an emergent field within the diversified world of

educational leadership (Whiteside, 2005). School leaders, who were well versed in the

pitfalls and potential of information and communication technologies for our students,

were needed, if schools were to excel in the ―Information Age‖ (Whiteside, 2005).

Researchers had noted that an essential element of successful technology-based school

reform depended on strong leadership (Anderson & Dexter, 2005; Byrom & Bingham,

2001; Gibson, 2002; Martin, Gersick, Nudell, & Culp, 2002; Whiteside, 2005).

According to Whiteside (2005), the professional standards documents from the

Interstate School Leader Licensure Consortium ([ISLLC], 1996), the National Policy

Board for Educational Administration (2002), and the International Society for

Technology in Education (2002) emphasized the importance of technology related


administrative competencies. Whiteside (2005) pointed out that there were only a few

school districts that sufficiently trained school administrators to facilitate the effective uses

of technology or to use technology in a meaningful way to improve the effectiveness and

efficiency of their own administrative work (Dawson & Rakes, 2003; Consortium for

School Networking, 2004). Building-level administrators must take the initiative and the

responsibility for establishing prerequisites for the success of Internet-based courses and

learning. School-based administrators must have technology competencies that pertain to

their job and any technology application in their schools (Kearsley & Blomeyer, 2004a).

According to Kearsley and Blomeyer (2004a), there had been little research that

had focused its attention on preparing school administrators to manage online learning

programs. However, there had been an increasing interest in the public schools,

kindergarten through twelfth grade, in online learning (Clark, 2001; Kearsley, 2000;

Kearsley & Blomeyer, 2004b; U.S. Department of Education, 2000), and in preparing

teachers to teach online (Collison, Erlbaum, Haavind, & Tinker, 2000; Kearsley &

Blomeyer, 2004b; Ko & Rossen, 2001; Palloff & Pratt, 2001). There was extensive data

on technology usage by teachers and students. However, there was little data that

supported conclusions about technology competencies of most educational leaders and

school administrators (Kearsley & Blomeyer, 2004a). A key variable in school

administrators’ ability to understand and implement technology was the extent to which

school administrators were truly comfortable in using technology (Kearsley & Blomeyer,

2004a).

School administrators were very busy people and did not take time to learn

technology, even though they were very committed and interested in learning the


technologies in their schools (Kearsley & Blomeyer, 2004a). Technology training must

encompass the Technology Standards for School Administrators ([TSSA], 2001). According to

the TSSA (2001) critical competencies, school administrators must be able to develop

and communicate a plan for technology; identify curriculum and teacher training needs

that were related to technology; model the use of technology; implement procedures and

policies in regard to technology use; collect and analyze data on technology use; and

develop guidelines related to technology.

The National Conference of State Legislatures ([NCSL], 2010) confirmed that

effective administrators led strong schools. Strong leaders were needed to maintain a

focus on student achievement, instructional improvement, and to inspire the staff and

students in their building to do the same (NCSL, 2010). School leadership was regarded

as one of the key factors in accounting for the difference between schools that fostered

student learning and underperforming schools. Schools today required school leaders,

who were dynamic, talented, and well trained; who understood the economic, social, and

political forces that influenced education; who were committed to solutions and fresh

ideas and were willing to take risks in implementing them; and who had a twenty-first

century view of education management (NCSL, 2010).

According to Geer (2002), school administrators needed a comprehensive and a

thorough education in order to learn the necessary technology skills and knowledge.

Geer (2002) also denoted that whenever school administrators acted as technology

leaders, teachers as well as students successfully used and integrated technology in the

school curriculum. School administrators must have knowledge and skills in utilizing


technology for teaching and learning and utilizing technology in the noninstructional

processes of leading and managing their schools (Geer, 2002).

Instructional and Academic Technology Leadership

Albright and Nworie (2008) defined instructional technology as the field, focus,

or function of the service; academic technology as the campus organizations that

provided the services; and instructional technologists as the professional members who

provided the services. Albright and Nworie (2008) did not see any difference between

the terms instructional technology and academic technology, but they understood that the

newer digital tools were commonly described as academic technologies. The

Association for Educational Communications and Technology ([AECT], 2004)

definitions and terminology committee defined instructional technology as the theory and

practice of design, development, utilization, management, and evaluation of processes

and resources for learning.

According to Albright and Nworie (2008), definitions of instructional technology

usually emphasized the basic processes of teaching and learning and the instructional

contexts in which information was used. Therefore, educational leaders must be able to

do more than just develop learning objects and train faculty to use course management

systems. Educational leaders must be able to (a) develop, enhance, maintain, use, and

assess learning environments (both physical and virtual); (b) plan and develop curricula

(with or without the use of technology products); (c) train faculty in all aspects of

pedagogy; (d) research and develop related solutions of instructional problems; (e) assess

learning; and (f) evaluate courses and programs (Albright & Nworie, 2008).


The California Department of Education ([CDE], 2010) contended that

educational leaders played an important role in ensuring that these key elements were in

place: (a) a comprehensive curricular design that effectively integrates technology into

teaching, learning, and assessment; (b) sufficient and appropriate hardware and software

to effectively implement programs; (c) sufficient, timely support to maintain both

hardware and software; (d) ongoing professional development and coaching for

administrators, teachers, and other instructional staff to support effective integration of

educational technology into the school culture; (e) an understanding of the social, ethical,

and legal issues related to using technology; and (f) ongoing funding to support the

continued implementation of educational technology. Educational leaders must be

effective in preparing students for adult work and, at the same time, make education more

efficient and effective. Students needed training in order to use modern technology tools

(CDE, 2010).

School Administrators’ Technological Competencies

The Collaborative for Technology Standards for School Administrators ([TSSA],

2001) facilitated the development of a national consensus on what preschool through

twelfth grade school administrators should know and also be able to do to optimize the

effective use of technology. From the Collaborative, came the Technology Standards for

School Administrators. The Collaborative believed that comprehensive implementation

of technology was nothing more than a large-scale systemic reform that leadership played

a significant part in successful school reform. The Collaborative standards focused on


the leadership role in enhancing learning and school operations through the use of

technology (TSSA, 2001).

The Collaborative standards were considered as good indicators of effective

leadership for technology in schools. These standards did not define any level of

knowledge and skills (minimum or maximum level) that were required of a leader, and

were neither a comprehensive list nor a guaranteed recipe for effective technology

leadership (TSSA, 2001).

The twenty-first century school administrator needed to be a hands-on user of

technology. School administrators should not allow others to handle their e-mail,

manipulate critical data, or handle other technology tasks for them. Technology should

empower the school administrator who mastered its tools and processes,

allowing him or her to be creative and to manage any

available information (TSSA, 2001).

There were six standards that the TSSA (2001) addressed in the areas of (a)

leadership and vision; (b) learning and teaching; (c) productivity and professional

practice; (d) support, management, and operations; (e) assessment and evaluation; and (f)

social, legal, and ethical issues. Williams (2006) contended that effective leadership in the area of technology from insightful and forward-thinking school leaders was expected throughout the country and in communities. According to Williams (2006), technology was not an end unto itself but a means of promoting innovation toward the goal of school improvement. School improvement should include technology integration, with the goal

being the improvement of teaching and learning (Williams, 2006).


Stuart, Mills, and Remus (2009) concluded that if principals and other school

leaders were to be effective technology leaders, they must have a level of ICT

competence. There had been little research on the competencies of school leaders and

how these competencies impact technology leadership, their willingness to push for the

implementation, and the use of technology in their schools (Anderson & Dexter, 2005;

Stuart et al., 2009; Testerman, Flowers, & Algozzine, 2001). School leaders must take the initiative to actively promote and build support for technology in their schools (Stuart et al., 2009), and they needed to understand the technology and how it fitted within the whole school (Howell, 2005; Howell & Boies, 2004).

Stuart et al. (2009) found that most school leaders were not actively involved in the management of ICT and had limited hands-on experience with its management. School leaders needed hands-on experience (Gallivan, Spitler, & Konfaris, 2005; Schiller, 2003; Stuart et al., 2009). According to Stuart et al. (2009), there were many research approaches that assessed school administrators’ ICT competence.

Schiller (2003) developed an inventory of ICT competencies with applications

that included email, PowerPoint, spreadsheets, and word processing. According to Schiller

(2003), principals must assume a major responsibility for initiating and implementing

school change through the use of ICT and, as a result, must facilitate complex decisions about the

integration of ICT into learning and teaching. Schiller pointed out that there was little

known about the actual use of ICT by principals, their preferences for gaining new skills

and understandings, and their perceived competence. It was noted that most principals

had not been prepared for their role as technology leaders and had not had the


opportunities for meaningful experiences in using computers with children (Schiller,

2003).

Principals in the past had relied on their inexperienced peers and over-eager sales

people for guidance and advice when making decisions about ICT (Schiller, 2003).

Schiller’s inventory sought baseline data to determine principals’ extent of personal use of and concerns about ICT, their perceived levels of computer competency, and their perceived skills in the use of ICT. Schiller found that there were variations between principals in terms

of their use of ICT, their perceived competencies, and their preferences for learning about

ICT.

Flowers and Algozzine (2000) created an inventory, the Basic Technology Competencies for Educators Inventory (BTCEI), which measured the basic technology competence of

school administrators. Flowers and Algozzine (2000) found that the competence of

educational administrators was very low. Results from the BTCEI (a) provided

information to teacher education programs and professional development organizations;

and (b) helped researchers in the area of educational technology by providing a

measurable indicator of basic technology competencies for educators (Flowers &

Algozzine, 2000).

Summary

The success or failure of technology use depended more on human and contextual

factors than on the hardware or software (Egbert, Paulus, & Nakmichi, 2002; Valdez,

McNabb, Foertseh, Anderson, Hawkes, & Raack, 2000). School administrators must

shift their focus from just providing more computers in schools to investing in the


teachers. Teachers played a major role in how successful technology could be in

education (Thompson, Schmidt, & Davis, 2003).

School administrators were expected to serve as efficient managers, and to direct

the day-to-day operations of the school (Valdez, 2004). School administrators must

possess business management techniques as well as command authority in order to

operate their schools (Valdez, 2004). School administrators must be transformational

leaders. Lashway, Mazzarella, & Grundy (1995) contended that transformational leaders

possess behaviors to (a) identify and articulate an organizational vision, (b) foster

acceptance of group goals, (c) have high performance expectations, (d) provide

intellectual stimulation, and (e) develop a strong school culture.

Leadership, change, and technology should work together to maximize the

potential for effective use of technology (Valdez, 2004). Educational leaders were

expected to know as well as to utilize instructional technology, especially those

technologies that were related to computer use for accessing and finding information and

that created and communicated new knowledge (Valdez, 2004). School leaders must (a)

prepare students to function in an information-based Internet-using society, (b) make

students competent in using tools found in almost all work areas, (c) make education

more effective and efficient, (d) help students become technology literate, and (e)

consider increasing educator technology effectiveness and modeling it after nationally

accepted guidelines (ISTE, 2000; Valdez, 2004).


CHAPTER 3. METHODOLOGY

The purpose of this study was to investigate the level of technology competence

for secondary principals and other school-based administrators (assistant principals, vice

principals, or administrative assistants), who were identified by the principal as proficient

users of technology in the schools. Currently employed secondary principals in the Tri-

County, located in the southeastern part of South Carolina, were asked to determine their

use of computer applications in their administrative functions as secondary

administrators. This study also examined the relationship between the level of use of

computer applications by secondary principals and their previous computer use, computer

training, perceptions, and attitudes that were held by the school administrators toward

computers. Chapter 3 provides an introduction, statement of the problem, research

questions and hypotheses, methodology, research design and procedures, population and

sampling, instrumentation, validity and reliability, data collection procedures, data

analysis procedures, ethical considerations, and a summary.

Statement of the Problem

It was not known to what extent school-based administrators were competent in

utilizing instructional technology, especially those technologies that were related to

computer use for accessing and finding information, and for creating and communicating

new knowledge. School districts all over the world were faced with increasing pressure to

implement technology to enhance administration, teaching, and learning (Gurr, 2001).

Principals were expected to be able to manage the explosive change through an

increasing reliance on technological information as well as to become key leaders in

managing schools. Computer technologies were entering school administration systems


and were affecting the work places and faces of administrators, teachers, and even

changing the whole nature and structure of the organization (Yu, Chang, & Tsai, 2009).

Research Questions and Hypotheses

The following research questions and hypotheses guided this study:

R1 What is the difference in the mean scale scores for skill importance between

the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

skill importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H1 There is a statistically significant difference in the mean scale scores for skill

importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

R2 What is the difference in the mean scale scores for technology competence

between the secondary principal and the other administrators (assistant principals, vice

principals or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

technology competence between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H2 There is a statistically significant difference in the mean scale scores for

technology competence between the secondary principal and the other administrators

(assistant principals, vice principals, or administrative assistants).

R3 What is the difference in the mean scale scores for frequency of use between


the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants)?

H0 There is no statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H3 There is a statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

R4 What is the difference in the mean scale scores for perceptions and attitudes of

school-based administrators toward computer use and the use of computer applications?

H0 There is no statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

H4 There is a statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

Research Methodology

The methodology used in this study was quantitative. Creswell (2009) defined

quantitative research as “a means for testing objective theories by examining the relationship among variables” (p. 4). According to Jenkins (2009), a quantitative

research approach offered precise measurements and was considered the

preferred method for many researchers. In quantitative research, the aim of the

researcher was to determine the relationship between one thing (an independent variable)


and another (a dependent or outcome variable) in a population (Hopkins, 2000).

Atieno (2009) suggested that the quantitative research paradigm “is empirical in nature; it is also known as the scientific research paradigm” (p. 13). The quantitative

research paradigm used a method of deductive reasoning, which used measurable tools to

collect relevant data (Jenkins, 2009). In the quantitative research approach, there were

certain types of social research problems that called for specific approaches (Creswell,

2009). According to Creswell (2009), “if the problem calls for (a) identification of factors that influence an outcome, (b) the utility of an intervention, or (c) understanding the best predictors of outcomes, then a quantitative approach is best” (p. 18).

Sanchez (2006) pointed out that quantitative research generated statistics through large-scale survey research using questionnaires. This type of research

reached more people (Sanchez, 2006). Neill (2007) contended that the aim of

quantitative research was to classify features, count them, and then construct statistical

models in an attempt to explain what was observed. The researcher knew exactly and

clearly in advance what he or she was looking for (Neill, 2007). Aliaga and Gunderson

(2002) and Muijs (2004) defined quantitative research as “explaining phenomena by collecting numerical data that are analyzed using mathematically based methods” (p. 11).

Muijs (2004) noted that quantitative research was flexible because researchers could study an almost unlimited range of phenomena. Quantitative research

used statistics to analyze the data.

Quantitative data were collected on a wide number of phenomena through data

collection instruments like questionnaires and tests (Muijs, 2004). In quantitative

research, data were collected from someone or something (people or things) (Muijs,


2004). Quantitative researchers designed studies that allowed them to test their

hypotheses (Muijs, 2004). In quantitative research, the researcher collected relevant data

and used statistical techniques to decide whether or not to reject or provisionally accept

the hypothesis (Muijs, 2004). Muijs (2004) also suggested that accepting a hypothesis

was always provisional, as new data might emerge that would cause it to be rejected later on.

Quantitative research focused more on the ability to complete statistical analysis

(Mersdorf, 2009). Therefore, with quantitative studies, each participant was asked to

respond to the same questions (Mersdorf, 2009). According to Mersdorf (2009), surveys

and questionnaires were the most common techniques for collecting quantitative data.

More researchers were adopting web based survey collection for quantitative research

(Mersdorf, 2009). When using questionnaires, the researcher (a) gathered the responses

in a standardized manner, (b) collected information quickly, and (c) collected potential

information from a large portion of a group (Milne, 1999). Blake’s (2000) study examined

the level of technology competence of school-based administrators in schools in Florida

and investigated the factors that were associated with the concept of technology

competence. This study also used quantitative research and data collection methods.

Alden (2007) advocated the use of a quantitative versus a qualitative approach when precision and accuracy were needed. Alden (2007) used the terms questionnaires

(quantitative) and opinionaires (qualitative) in his study. The conclusion of this study,

according to Alden, was that questionnaires (quantitative) were the best choice for determining trends and ascertaining data for making organizational decisions.

Eveleth, Eveleth, O’Neill, and Stone (2006) supported implementing the

quantitative method when looking at testing methods using secure software on laptop


computers. This study involved gathering data by means of a survey to find out how useful students found laptops with secure software for completing exams.

The quantitative method, according to Eveleth et al., served well in ascertaining how many students found the use of laptop computers with the secure software useful and what improvements needed to be made.

Research Design

This study used a descriptive design as a means to investigate the level of

technology competence for secondary principals and other school-based administrators

(assistant principals, vice principals, or administrative assistants). This type of study

examined the extent to which differences in one or more variables were related to differences in another variable (Leedy & Ormrod, 2005). A descriptive study established

only the associations between variables, not the causality (Hopkins, 2000).

According to Picciano (2006), descriptive research used quantitative methods to

describe and interpret a current event, condition, or situation. Picciano (2006) contended that

quantitative research was flexible and was probably the most popular form of research in

education today. Descriptive research provided a descriptive analysis of a given sample

or population, presented quantitative data, and used hypotheses (Picciano, 2006).

Creswell (2009) showed that survey research provided a numeric or quantitative

description of attitudes, opinions, or trends of a population by studying the sample of that

population. Wasson (2002) pointed out that descriptive research involved the collection

of data in order to test hypotheses or answer questions concerning the current status of

the participants of the study. Johnson and Christiansen (2008) stated that descriptive

research was intended to provide an accurate description or picture of the status or characteristics


of a situation or phenomenon. The focus was on describing the relationships that existed among the variables, or on describing the variables that existed in a given situation, not on cause-and-effect relationships (Johnson & Christiansen, 2008).

Descriptive research was sometimes conducted to learn about the attitudes, opinions,

beliefs, behaviors, and demographics (e.g., age, gender, ethnicity, and education) of

people (Johnson & Christiansen, 2008).

Shuttleworth (2008) defined descriptive research design as a scientific method

that involved describing or observing the behavior of a subject without influencing it in

any way. According to Shuttleworth (2008), descriptive research design allowed

observation without affecting normal behavior. Descriptive research design was

considered as a valid method for researching specific subjects and as a precursor to more

quantitative studies (Shuttleworth, 2008). Gay and Airasian (2000) suggested that

quantitative research methods were based on analyzing and collecting numerical data.

Gall, Gall, and Borg (2003) pointed out that a questionnaire was used extensively

in educational research to collect data about observable phenomena, such as interests,

inner experience, and values. Questionnaires were considered as documents that asked

the same questions of all participants (Gall et al., 2003). According to Gall et al. (2003),

the purpose of the questionnaire was to collect data from a sample that had been selected

to represent a population to which the general findings of the data analysis could be

generalized. This emphasis on population generalization was characteristic of

quantitative research (Gall et al., 2003).

According to Neill (2007), quantitative research (a) involved analysis of

numerical data; (b) sought precise measurement and analysis of target concepts (e.g.,


used surveys, questionnaires, etc.); and (c) was more efficient, and able to test

hypotheses. In quantitative research, the researcher tended to remain objectively

separated from the subject matter (Miles & Huberman, 1994; Neill, 2007). A quantitative

research approach with a descriptive research design was appropriate for this study to

investigate the level of technology competence for secondary principals and other school-

based administrators (assistant principals, vice principals, or administrative assistants)

located in the Tri-County of the southeastern part of South Carolina.

Carter’s (2003) study examined the administrative perceptions and attitudes

toward technology-based education and how this affected the administrator’s support of

technology-based education. An attitudinal survey instrument was used to measure

administrator perceptions and attitudes toward technology-based education. This study

surveyed administrators in selected Georgia schools.

Rodriquez (2008) implemented descriptive research design to explore the

relationship between leadership style, schoolwork culture (e.g. planning, assessment and

staff development), and the achievement of students. Rodriquez’s study included fifty-

seven schools and had principals and teachers involved in the survey. Rodriquez (2008)

concluded that school leadership had an indirect impact on student achievement, while schoolwork culture (e.g., planning, assessment, and staff development) had a direct impact on student achievement. Descriptive methods were used as a means for collecting data.

Population and Sampling Procedures

The population was secondary principals and other school-based administrators

(assistant principals, vice principals, or administrative assistants). The setting of this

research study was twenty-four high schools, Grades 9-12, located in the Tri-County of


the southeastern part of South Carolina. The secondary principals and other school-based

administrators (assistant principals, vice principals, or administrative assistants) were

purposively selected. According to the American School of Professional Psychology

(2009), purposive sampling started with a purpose in mind and the samples were thus

selected to include people of interest and exclude those who did not suit the purpose.

According to Tongco (2007), the purposive sampling technique was a type of

non-probability sampling that was most effective when the researcher needed to study a

certain cultural domain with knowledgeable experts within. Johnson and Christiansen

(2008) contended that in purposive sampling, the researcher specified the characteristics

of a population of interest and then tried to find the individuals who had those

characteristics. Johnson (2010) defined purposive sampling as the researcher specifying

the characteristics of the population of interest and then locating individuals who matched

those characteristics.

Neill (2003) stipulated that in purposive sampling, units from a prespecified group were purposively sought out and sampled. This type of method did not

require or use randomization. Egan (2007) pointed out that purposive sampling aimed

to select samples based on criteria that were associated with the research. Therefore, the

researcher used purposive sampling in this research, which was aimed at investigating the

technology competence for secondary school principals and other school-based

administrators (assistant principals, vice principals, or administrative assistants).

Instrumentation

Based on the review of literature, a survey design was used in this research study to

collect data. According to the Fairfax County Department of Systems Management for


Human Services ([FCDSMHS], 2003), a survey was a means of gathering information

about a particular population by sampling some of its members. According to the

FCDSMHS (2003), the primary purpose of a survey was to elicit information, which,

after evaluation, resulted in statistical characterization of the population sampled.

Creswell (2009) showed that a survey design provided a numeric or quantitative

description of trends, attitudes, or opinions of a population by studying a sample of that

population. The researcher usually made claims or generalized about the population from

the sample results (Creswell, 2009).

Gall et al. (2007) pointed out that a survey allowed researchers to easily gather

data about phenomena from informed participants. A survey packet was compiled. The

survey packet included a cover letter, the informed consent form, and the survey. A

survey was administered to secondary principals and other school-based administrators

(assistant principals, vice principals, or administrative assistants). The researcher used a

survey to collect numerical data. Data were in the form of numbers and statistics.

The superintendents of the Tri-County school districts were sent letters (see

Appendix A) asking for their permission to conduct this research study in their school

districts at the high schools using a survey. The survey instrument was strictly

confidential, and names were not used when the research study was published or

reported. After receiving permission to conduct the research study in the school districts

from the superintendents, the secondary principals were sent letters (see Appendix B),

and other school-based administrators (assistant principals, vice principals, or

administrative assistants) were sent letters (see Appendix C) asking them to participate in

this research study. After consenting to participate in the research study, secondary


principals and other school-based administrators (assistant principals, vice principals, or

administrative assistants) were also sent an Informed Consent Form (see Appendix D) to

provide information that affected their decision about whether or not they wanted to

participate in this research project.

Technology Competence Survey for School-Based Administrators

The survey instrument that was used in this research study was the Technology

Competence for School-Based Administrators survey (Blake, 2000), which was used and

developed for schools in the state of Florida. This survey instrument was modified (see

Appendix E) and used with school administrators in the Tri-County located in the

southeastern part of South Carolina. Written permission was obtained to use the

Technology Competence for School-Based Administrators survey (Blake, 2000). The

modified survey instrument became the Technology Competence Survey for School-

Based Administrators (see Appendix E).

Part I of the survey instrument was a skills rubric containing 10 sections that

addressed the 10 technology skills areas under investigation. Each item consisted of four

statements that ranged from No Knowledge or Use to Advanced Knowledge or Use. The

participants were instructed to respond by placing an “X” in the box next to the statement

that best described his or her ability for each section.

Part II of the survey instrument was designed to determine the level of importance

that was placed on each of the skill areas. The administrators ranked the importance of

each skill area, in relation to their work as an administrator, using a 5-point

Likert-type scale (0-4). The following choices were: 4 = Essential; 3 = Very Important;


2 = Important; 1 = Somewhat Important; and 0 = Not Important. Administrators gave the

best description of their ability in each of these technology skill areas.

Part III of the survey instrument was designed to determine the frequency of

technology use by the administrators in the 10 specific technology applications. The 10

technology applications included: word processing, computer basics, database,

spreadsheet, Internet, desktop publishing, presentation, email, technology integration, and

smart board usage. The school-based administrator indicated the frequency with which he or she had applied these technology tools over the past year. The scale included

Daily (or almost daily), Weekly (or several times weekly), Monthly (or several times

monthly), and Never (or rarely).

Part V of the survey instrument was designed to provide an opportunity for the

participants to make any comments regarding the role of technology in the leadership and

administration of schools. Since this research study could benefit other administrators and other school districts, the school-based administrators in this study were asked to share

any thoughts that they had in regards to the role of technology in the leadership and

administration of schools. Possible topics included training for administrators,

integration of technology into the curriculum, or how technology benefited them in their

administrative or leadership roles. The survey instrument comprised 40 questions.

Demographic Questionnaire

A demographic questionnaire (see Appendix F) was developed and administered

by the researcher. The demographic questionnaire was designed to determine specific

demographic information of the participants, including (a) the participants’ school, (b) the

participants’ school district, (c) age of the participant, (d) ethnicity, (e) years of


experience, (f) highest level of education completed, and (g) primary position. The

participants marked Yes, No, or Don’t Know to indicate whether they had access, no access, or limited access to computers and the network. Using demographic information in a research study

provided a greater opportunity to disaggregate the information.

Validity

Moskal (2010) defined validity as the degree to which the evidence supported

the claim that interpretations of the data were correct and that the manner in which the interpretations were used was appropriate. Validity answered the question of truth: did the survey instrument measure what it was intended to measure?

Fraenkel and Wallen (2008) defined validity as referring to the appropriateness,

correctness, meaningfulness, and usefulness of the specific inferences researchers made

based on the data they collected. When establishing validity, the researcher was trying to

establish whether one could draw meaningful and useful inferences from scores on the

instrument (Creswell, 2009). According to Creswell (2009), establishing the validity of the scores in a survey helped to identify whether an instrument was a good one to use in survey research.

According to Blake (2000), a panel of five experts in the area of educational

technology had reviewed the construct and content validity of the survey instrument,

Technology Competence for School-Based Administrators. The panel of experts

consisted of an instructional technology professor, an educational leadership professor, a

district-level technology manager, a school-based administrator, and a school-level

technology facilitator. Trochim (2006) pointed out that construct validity referred to the

degree to which inferences could legitimately be made from operationalizations in a study


to the theoretical constructs on which the operationalizations were based. Therefore,

construct validity was an assessment of how well the researcher could translate his or her

ideas or theories into actual measures or programs (Trochim, 2006).

Shuttleworth (2009) contended that most researchers tested the construct validity

before the main research, that is, in the pilot study. Pilot studies established the strength

of the research and allowed the researchers to make any adjustments if needed

(Shuttleworth, 2009). Shuttleworth pointed out that construct validity was valuable in

the social sciences as well as in education. According to Shuttleworth (2009),

establishing good construct validity was a matter of experience and judgment, and of

building up as much supporting evidence as possible.

Reliability

Gall et al. (2007) contended that reliability referred to how similar the results of a research study were when it was completed by different researchers or in different settings.

The Technology Competence for School-Based Administrators survey (Blake, 2000) was

administered to a panel of experts and in a pilot evaluation to assess the external

reliability of the instrument. The pilot evaluation used twenty-five school-based

administrators, who were in the elementary, middle, and high school levels from the

central Florida area. However, out of the twenty-five participants, the majority were from high schools (21 of 25) (Blake, 2000). The pilot participants were asked to provide

feedback on the format and clarity of the instrument. The survey instrument was then

revised based upon the participants’ feedback.

Howell, Miller, Park, Sattler, Schack, Spery, Widhalm, and Palmquist (2005)

contended that without the agreement of independent observers who were able to replicate research procedures, or without research tools and procedures that yielded consistent measurements, researchers would be unable to satisfactorily draw conclusions, formulate theories, or make claims about the generalizability of their research. Therefore,

reliability was critical for many parts of the lives of people (Howell et al., 2005). Internal

reliability of the scales constructed for use in the Technology Competence for School-Based

Administrators survey (Blake, 2000) was calculated for the survey data from all the

respondents using the Cronbach Alpha reliability coefficient. In this study, the internal

reliability of the scales constructed from the Technology Competence, Skill Importance, and Technology Use survey data in the Technology Competence Survey for School-Based Administrators was calculated by using the Cronbach Alpha reliability coefficient.
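For readers unfamiliar with the computation, the following is a minimal Python sketch of the Cronbach alpha coefficient, using hypothetical item responses rather than the actual survey data; the analysis in this study was performed in PASW, not with this code.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items array of Likert scores."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                      # number of items in the scale
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 4-point Likert responses (rows = respondents, columns = items).
responses = [
    [4, 3, 4, 4],
    [3, 3, 3, 4],
    [2, 2, 3, 3],
    [4, 4, 4, 4],
    [3, 2, 3, 3],
]
print(round(cronbach_alpha(responses), 2))
```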

Data Collection Procedures

The South Carolina Department of Education (2010) provided a directory that served as the sampling frame of names, school addresses, and e-mail addresses for the four school

districts and twenty-four secondary schools in the Tri-county located in the southeastern

part of South Carolina. After the approval of the Institutional Review Board (IRB) of

Capella University, granting permission to collect research data from human subjects, the

survey instrument, Technology Competence Survey for School-Based Administrators, was

distributed to the four superintendents in the Tri-county by mail. The superintendent’s

letter (see Appendix A), asking for their permission to conduct the research in each high school in the school district, was sent to the superintendents along with a copy of the principal’s

letter (see Appendix B), a copy of the assistant principal, vice principal, or administrative

assistant letter (see Appendix C), a copy of an informed consent form (see Appendix D),

a copy of the survey instrument (see Appendix E), and a copy of the demographic


questionnaire (see Appendix F). The superintendent’s letter included a brief overview of

the proposed research study and explained any anticipated risks. The first mailings were

sent to the superintendents in September 2010.

After the approval of the superintendent of each school district, a letter was sent

to each principal that included the superintendent’s approval letter (see Appendix A), a

principal’s letter (see Appendix B), the letter for other school-based administrators (assistant

principals, vice principals, or administrative assistants) (see Appendix C), an informed

consent form (see Appendix D), a copy of the research survey instrument (see Appendix

E), a copy of the demographic questionnaire (see Appendix F), and a link to the survey

instrument online. The principal’s letter included a brief overview of the proposed

research study and explained any anticipated risks. Each principal was asked to send to

the researcher the name and e-mail address of his or her school-based administrator

(assistant principal, vice principal, or administrative assistant) who he or she deemed to

be the most technologically proficient on his or her staff. The researcher, in turn, sent

the school-based administrators the same information that was sent to the principals. The

first emails were sent to principals and other school-based administrators (assistant

principals, vice principals, or administrative assistants) in October 2010. Returned

surveys were recorded on a master list of schools. A follow-up facsimile (fax) was sent

to each non-responding principal or other school-based administrators (assistant

principals, vice principals, or administrative assistants) in December 2010. In January

2011, a mail-out was sent to all non-respondents. A general letter was sent to all non-

respondents asking for their help in completing the surveys. The survey materials were

sent to non-respondents by mail.


Through the Web-based survey link, each of the participants was able to review the informed consent form (see Appendix D) and had the option of opting out of the survey or checking the box to agree to participate in the survey research. This process of

agreeing to participate in the survey also meant that the participants had read the

informed consent form. A Web-based survey hosting service was used to administer and

collect the results of the survey instrument for analysis. All data collected were coded,

labeled, and uploaded into the statistical software package known as PASW (formerly

known as SPSS) statistical application for analysis.

To ensure confidentiality, the survey instrument and the responses were removed

from the online hosting service at the close of the survey. Garson (2009) defined survey

research as “the method of gathering data from respondents thought to be representative of some population, using an instrument composed of closed structure or open-ended items (questions)” (p. 1). According to Garson (2009), the least expensive data collection mode was administering Web surveys.

Data Analysis Procedures

The statistical software package known as PASW (formerly known as SPSS) was

used to score, code, and analyze the research data. The statistical software provided a

broad range of capabilities for the study. Descriptive statistics were used to summarize the data collected for the study and to present quantitative descriptions. Trochim (2006) contended that

descriptive statistics described the basic features of the data in the study. According to

Trochim (2006), descriptive statistics provided simple summaries about the sample and

measures. In actuality, descriptive statistics described what was or what the data showed.

Lane (2010) contended that descriptive statistics were used to summarize a


collection of data in an understandable and clear way. There were two basic methods of

descriptive statistics: numerical and graphical (Lane, 2010). When using the numerical

approach, researchers computed statistics such as the mean and standard deviation (Lane,

2010). In the graphical approach, researchers created a stem and leaf display and a box

plot (Lane, 2010).
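As an illustration of the numerical approach just described, the short Python sketch below computes the mean, median, mode, standard deviation, and response range for a hypothetical set of scale scores; it simply demonstrates the kinds of summaries produced in PASW and does not reproduce the study's data.

```python
import statistics

# Hypothetical mean scale scores for one group of administrators.
scores = [3.2, 3.5, 2.9, 3.8, 3.1, 3.5, 3.0, 3.6]

# Measures of central tendency.
print("Mean:", round(statistics.mean(scores), 2))
print("Median:", statistics.median(scores))
print("Mode:", statistics.mode(scores))

# Measures of variability.
print("SD:", round(statistics.stdev(scores), 2))
print("Range:", round(max(scores) - min(scores), 2))
```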

Descriptive statistics were used to interpret and analyze the data. Trochim (2006)

described descriptive statistics as typically describing what was or what data showed.

The data were collected and recorded on PASW for Windows® (Version 18.0)

spreadsheet. Measures of central tendency, the mean, median, and mode were used in

this study to describe typical scores that reflected how the data were similar. The

standard deviations and variances for each group were reported for all hypotheses. In this

research study, numerical methods were used. Measures of variability (standard

deviations and response ranges), and measures of central tendency (mean, median, and

mode) were used for the discussion of trends and that produced analysis across each of

the constructs that were under investigation. Comments that were provided by the

participants in Part V were compiled and categorized to determine the frequency of

topics that were mentioned and that provided additional data for discussion.

The data collection instrument, Technology Competence Survey for School-Based

Administrators, modified with permission of the Technology Competence for School-

Based Administrators survey (Blake, 2000), was used to determine mean scale scores for

the dependent variables of technology competence, skill importance, and frequency of

use. Mean scale scores were used in order to avoid eliminating surveys that contained one or more incomplete responses on individual items. For technology


competency, the range of the mean score was 1.00-4.00, based on a 4-point Likert-type scale; scores of 3.00-4.00 indicated a high competency level and scores of 1.00-2.00 indicated a low competency level. For the area of skill importance, a 0.00-4.00 scale was used; scores of 0.00-2.50 indicated low importance and scores of 2.51-4.00 indicated high importance. On the survey instrument, frequency of use was based on a 4-point Likert-type scale ranging from 1.00-4.00; scores of 1.00-2.00 depicted a low frequency of use and scores of 3.00-4.00 indicated a high frequency of use. Perceptions and attitudes were also based on a 4-point Likert-type scale ranging from 1.00-4.00; scores of 1.00-2.00 indicated disagreement and scores of 3.00-4.00 indicated

agreement. Ten specific technology applications (computer basics, word processing,

database, spreadsheet, desktop publishing, presentation, Internet, E-mail,

technology integration, and smart board usage) were investigated for the technology

variable.
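To make the scoring rule concrete, the hypothetical Python sketch below averages a participant's completed Likert items into a mean scale score and interprets the technology competency score against the 3.00-4.00 (high) and 1.00-2.00 (low) bands described above; the function names and item values are illustrative only, since the actual scoring was carried out in PASW.

```python
def mean_scale_score(item_responses):
    """Average the completed Likert items, ignoring any skipped (None) items."""
    answered = [r for r in item_responses if r is not None]
    return sum(answered) / len(answered)

def competency_band(score):
    """Interpret a technology competency mean on the 1.00-4.00 scale."""
    if score >= 3.00:
        return "high competency"
    if score <= 2.00:
        return "low competency"
    return "between the defined bands"

# Hypothetical responses to the ten technology competency items (one left blank).
responses = [4, 3, 3, None, 4, 2, 3, 4, 3, 3]
score = mean_scale_score(responses)
print(round(score, 2), competency_band(score))
```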

The statistical test for the hypotheses that was identified for this study was the t-

test. The t-test was the most commonly used statistical data analysis procedure for

hypothesis testing (Creech, 2010a; Stallone, 2003; Trochim, 2006). Trochim (2006)

explained that the t-test assessed whether the means of two groups were statistically

different from each other. Whenever the researcher wanted to compare the means of two

independent groups, the independent samples t-test analysis was appropriate (Field, 2009;

Trochim, 2006). When testing hypotheses, researchers often used a t-distribution that

was clearly related to the normal distribution to test whether a sample came from a population with a

specified mean when the population standard deviation was unknown (Stallone, 2003).
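As a hedged illustration of the hypothesis test described here (not the actual PASW output or the study's data), the Python sketch below runs a two-tailed independent samples t-test on two hypothetical groups of mean scale scores and compares the resulting p value with the .05 significance level.

```python
from scipy import stats

# Hypothetical skill-importance mean scale scores for the two groups.
principals = [3.6, 3.4, 3.8, 3.5, 3.7, 3.3]
other_admins = [3.5, 3.2, 3.6, 3.4, 3.9, 3.3, 3.5, 3.1]

# Two-tailed independent samples t-test (equal variances assumed here).
t_stat, p_value = stats.ttest_ind(principals, other_admins)

alpha = 0.05
decision = "reject" if p_value <= alpha else "retain"
print(f"t = {t_stat:.2f}, p = {p_value:.3f}: {decision} the null hypothesis")
```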

Descriptive statistics, which included standard deviation, response range, and


means, were used to discuss trends and to analyze the variables that were under

investigation and to answer Hypotheses 1, 2, 3, and 4.

Hypotheses 1, 2, 3, and 4

H1 There is a statistically significant difference in the mean scale scores for skill

importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H2 There is a statistically significant difference in the mean scale scores for

technology competence between the secondary principal and the other administrators

(assistant principals, vice principals, or administrative assistants).

H3 There is a statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H4 There is a statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

Null Hypotheses 01, 02, 03, and 04

To test the Null Hypotheses 01, 02, 03, and 04, two-tailed t-tests were used to

determine whether there was a significant difference between the principals and the other

administrators. The significance of each test was determined at the .05 level.

H0 There is no statistically significant difference in the mean scale scores for

skill importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for


technology competence between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

Ethical Considerations

Every precaution was taken to ensure the confidentiality, anonymity, and privacy of the data and of the participants who were involved, as far as ethically possible. There

was no exchange of money. Therefore, no ethical issues arose.

An informed consent form was included with the purpose of the study, and the

instructions for responding to the survey instrument. The informed consent form

explained the ethical considerations and the assurance of confidentiality, privacy, and

anonymity to all participants. The survey instrument was online for each participant.

The instructions for the survey instrument were provided with clear definitions of

constructs at the beginning of the survey instrument.

Fowler (2009) contended that the basic principle behind ethical research was that the

participants should always know up front what they were signing up for and be given the choice

to participate or not to participate in the research study. Participation was voluntary, and

there was no force or coercion.


Summary

There were increasing numbers of educators, as well as national leaders, who

promoted the use of technology as being essential to improving education, and who

perceived the use of technology as being the essential element in any effort to prepare

students for the twenty-first century (Bennett & Gelernter, 2001; Dawson & Rakes, 2003;

Heinich, Molenda, Russell, & Smaldino, 2002). School administrators must be able to

(a) use technology to enrich curriculum and instruction, (b) assess the current use of

technology in the business operations of the school, (c) establish and monitor a long-term

technology plan for the school, (d) make extensive use of technology to assist adult

learners to stay or return to school, and (e) integrate the introduction of technology with

the school’s improvement plan (CAP, 2006).

According to Attaran and VanLaar (2001), technology played an important role in

the personal lives of citizens as well as in the workplace. Computers, software, digital information, and communications, the constituents of the information age, could be found everywhere (Attaran & VanLaar, 2001). As educators entered the new

millennium, instructional technology was considered to be a key to educational quality

(Attaran & VanLaar, 2001). Educators must recognize the potential value of technology

and must realize that technology helped to expand opportunities for American children to

improve their skills and to get ready for the next century.

This study was a quantitative research study. It used a descriptive design as a

means to investigate the level of technology competence for secondary principals and

other school-based administrators (assistant principals, vice principals, or administrative

assistants). A modified survey instrument, Technology Competence Survey for


School-Based Administrators, used with permission from Technology Competence for

School-Based Administrators (Blake, 2000) was used to gather data that described the

technology competence of school administrators.

School leaders today faced a different set of challenges than their predecessors did

in the past (Schmeltzer, 2001). School safety, information overload, and community

pressures were just some of the issues that administrators had to grapple with.

Technology played a positive role in helping the school administrators to face these

challenges (Schmeltzer, 2001). However, administrators must have the vision and the

know-how to harness technology and make it a part of the fabric that supports teaching

and learning in schools (Schmeltzer, 2001).

Until now, professional development for educators had focused on the needs of

the classroom teacher, driven by a technology coordinator or someone else who was once a classroom teacher. But with the increased presence of technology in schools (Internet, e-mail, technology integration, etc.), there was a need for an overarching vision and a cohesive plan, and school administrators could no longer avoid stepping up to the plate to provide leadership for technology (Schmeltzer, 2001). School administrators

must be able to develop strategies for helping teachers to use technology in their

classrooms as well as understand how technology improved instructional practices

(Schmeltzer, 2001). Schmeltzer (2001) suggested that administrators must be able to

understand how technology could be successfully implemented in the schools, and how

to set reasonable expectations for its use. In short, school administrators must have a

vision for education and a plan to make it happen.


CHAPTER 4. RESULTS

This study investigated the level of technology competence for secondary

principals and other school-based administrators (assistant principals, vice principals,

etc.), who were identified by the principal as proficient users of technology in the

schools. The study focused on the use of computer applications in administrative

functions of secondary principals. This study also examined the relationship between the

level of use of computer applications by the secondary principals and previous computer

use, computer training, perceptions, and attitudes that were held by the school

administrators toward computers.

The purpose of this chapter was to present the data analysis results that emerged

from the participants’ survey responses in an effort to address the following research

hypotheses and their corresponding null hypotheses:

H1 There is a statistically significant difference in the mean scale scores for skill

importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for

skill importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants).

H2 There is a statistically significant difference in the mean scale scores for

technology competence between the secondary principal and the other administrators

(assistant principals, vice principals, or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for

technology competence between the secondary principals and the other administrators


(assistant principals, vice principals or administrative assistants).

H3 There is a statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H0 There is no statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants).

H4 There is a statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

H0 There is no statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications.

The remainder of this chapter provides the descriptive data of the survey

participants, an overview of the statistical analyses that were conducted in order to test

the null hypotheses, the results for each research hypothesis and its complementary null

hypothesis, and finally a summary of the overall research findings.

Descriptive Data

This section of the chapter provides a descriptive summary of the survey

participants. All 18 of the participants were administrators in a 9th through 12th grade

educational organization. In addition, all 18 participants were from schools that were

connected by a computer network, and from school systems that were connected by a

computer network.


Table 1 provides a summary of the school sizes represented by the participants in

the study. The results indicate that the most common school size was between 100 and 200 students (44.4%), followed by more than 500 students (33.3%), and finally between 301 and 400 students (22.2%).

Table 1

Size of School Descriptive Summary

School size Frequency Percent

100 to 200 students 8 44.4

201 to 300 students 0 0.0

301 to 400 students 4 22.2

401 to 500 students 0 0.0

More than 500 students 6 33.3

The job positions of the survey participants are summarized in Table 2. The

results indicate that 33.3% of the survey participants were secondary principals and the

remaining 66.7% were other administrators, which included assistant principals (16.7%),

vice principals or other principals (38.9%), and administrative assistants (11.1%).


Table 2

Job Position Descriptive Summary

Job position Frequency Percent

Principal 6 33.3

Assistant principal 3 16.7

Vice principal or other principal 7 38.9

Administrative assistant 2 11.1

The number of years worked as a school administrator is summarized in Table 3.

The results indicate that the majority of the participants had between zero and five years

of experience as an administrator (66.7%). However, 22.2% had between 16 and 20

years of work experience as an administrator.

Table 3

Number of Years Worked as a School Administrator Descriptive Summary

Number of years worked as administrator Frequency Percent

0-5 years 12 66.7

6-10 years 1 5.6

11-15 years 0 0.0

16-20 years 4 22.2

Participants were also asked to describe their access to a computer. The results in

Table 4 indicate that the vast majority of the participants had a computer in their office

exclusively for their work (88.9%). Only two participants (11.1%) indicated that they

had access to a computer in a room other than their office.


Table 4

Computer Access Descriptive Summary

Computer access Frequency Percent

I have a computer in my office exclusively for my work 16 88.9

I have access to a computer in a room other than my office 2 11.1

Data Analysis Procedures

The statistical software package known as PASW (formerly known as SPSS) was

used to score, code, and analyze the survey research data. The independent variable in

this study was the type of administrator who responded to the survey, which contained

two levels (secondary principal or other administrator). There were four dependent

variables in this study. Each dependent variable was measured using multiple Likert scale items on the survey. Therefore, in order to create an overall scale score for each dependent variable, survey items linked to the same dependent variable were averaged, resulting in one continuous scale score. Since several Likert scale items were used to

measure underlying constructs, the internal reliability of the scales was assessed by

computing a Cronbach’s alpha. The psychometric results of the survey scales are

presented in Table 5. The results indicate that there was a restricted range in the obtained

scale scores, the distributions were relatively symmetrical according to the skew values

(Field, 2009), and the internal reliability ranged from fair to excellent (Ponterotto &

Ruckdeschel, 2007). Specifically, the skill importance scale and the technology

competence scale yielded excellent reliability, the frequency of use scale yielded fair

reliability, and the perceptions and attitudes scale yielded good reliability.


Table 5

Psychometric Results of the Research Survey

Range
Scale n M SD α Potential Actual Skew

Skill importance 17 3.55 0.36 0.77 1-4 2.9-4.0 -0.06

Technology competence 18 3.15 0.47 0.81 0-4 2.4-3.8 -0.12

Frequency of use 18 3.18 0.35 0.62 1-4 3.0-4.0 0.56

Perceptions and attitudes 16 3.66 0.30 0.69 1-4 3.0-4.0 -0.69

The dependent variables were analyzed by conducting descriptive statistics for the

two groups of participants. Specifically, measures of central tendency were computed

such as means, medians and modes, and measures of dispersion were computed including

response ranges and standard deviations (Field, 2009). In addition, box plots were

constructed for each group for each of the dependent variables in order to show the

distributional characteristics of the data such as skewness, extreme values and outliers

(Field, 2009).
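A hypothetical sketch of the grouped box plots described here is shown below; the scores are invented for illustration, and matplotlib is used in place of the PASW graphics.

```python
import matplotlib.pyplot as plt

# Hypothetical technology competence scale scores for the two groups.
principals = [3.0, 3.4, 2.8, 3.6, 3.1, 3.3]
other_admins = [3.2, 2.9, 3.7, 3.0, 3.8, 3.1, 3.4, 3.5]

# One box plot per group, making skewness, extreme values, and outliers visible.
plt.boxplot([principals, other_admins], labels=["Principals", "Other administrators"])
plt.ylabel("Technology competence scale score")
plt.title("Hypothetical distribution by administrator group")
plt.show()
```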

The research and null hypotheses were addressed by conducting independent

samples t-tests in which the two independent groups of participants were compared based

on a parametric dependent variable (Field, 2009). Since the independent samples t-test is

based on the statistical assumption of homogeneity of variance (equality of error

variances), Levene’s test of equality of error variance was conducted. For instances in

which the statistical assumption was violated, results for equal variances not assumed

were reported (Field, 2009). Statistical significance was set at an alpha of .05.
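A minimal sketch of this testing sequence, written in Python with scipy.stats rather than PASW, is shown below. The two groups of scores are hypothetical and serve only to illustrate checking Levene's test before choosing between the pooled t-test and the equal-variances-not-assumed version.

    import numpy as np
    from scipy import stats

    # Hypothetical scale scores for the two independent groups (not the study data).
    rng = np.random.default_rng(11)
    principals = rng.normal(3.4, 0.3, size=6)
    other_admins = rng.normal(3.6, 0.4, size=12)

    # Levene's test of equality of error variances (homogeneity of variance).
    levene = stats.levene(principals, other_admins)
    equal_var = levene.pvalue > .05  # assumption holds when Levene's test is not significant

    # Independent samples t-test; the Welch version when the assumption is violated.
    result = stats.ttest_ind(principals, other_admins, equal_var=equal_var)

    print(f"Levene's test: F = {levene.statistic:.2f}, p = {levene.pvalue:.2f}")
    print(f"t-test: t = {result.statistic:.2f}, p = {result.pvalue:.2f} "
          f"({'equal variances assumed' if equal_var else 'equal variances not assumed'})")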

Results

This section of the chapter contains the statistical findings addressing the research

and null hypotheses. The null hypothesis was retained when the obtained statistical

significance value (p value) was greater than .05, indicating that the probability of

committing a Type I error (rejecting the null when it is true) was greater than 5% (Field,

2009). The null hypothesis was rejected when the obtained statistical significance value

was no more than .05, indicating that the probability of committing a Type I error was no

more than 5% (Field, 2009).
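Expressed as a simple rule, and using the p values reported later in this chapter (Tables 9 and 17), the decision procedure looks like the short sketch below; the helper function is illustrative only.

    ALPHA = .05  # criterion for statistical significance

    def decide(p_value: float) -> str:
        # Retain the null when p exceeds .05; reject it when p is no more than .05.
        return "reject the null hypothesis" if p_value <= ALPHA else "retain the null hypothesis"

    print(decide(0.27))  # skill importance comparison (Table 9): retain
    print(decide(0.02))  # frequency of use comparison (Table 17): reject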

Research and null hypothesis one. The first research hypothesis predicted that

there is a statistically significant difference in the mean scale scores for skill importance

between the secondary principals and the other administrators (assistant principals, vice

principals or administrative assistants). The complementary null hypothesis predicted

that there is no statistically significant difference in the mean scale scores for skill importance between the

secondary principals and the other administrators (assistant principals, vice principals or

administrative assistants). Descriptive statistics and an independent samples t-test were

conducted in order to address these hypotheses.

Table 6 presents the descriptive statistics for each of the skill importance items on

the survey based on the principals’ responses. The results indicate that the principals

tended to believe that the skills listed on the survey were very important (value of three)

to essential (value of four) on average, and there was a restricted range for all of the

items. The results also indicate that the principals perceived the ability to search for

electronic information and the ability to explore the Internet for information as the most

important computer skills given that every principal rated those two skills as essential.

Finally, the principals perceived the ability to create multimedia presentations and the

ability to proficiently use the SMART board as the least important computer skills.

Table 6

Descriptive Statistics for Skill Importance Survey Items: Principals

Source Mean Median Mode SD Range

Search for electronic information. 4.00 4.0 4 0.00 0

Perform basic computer operations 3.83 4.0 4 0.41 1

Create documents using a word processer 3.83 4.0 4 0.41 1

Create multimedia presentations 2.83 3.0 3 0.75 2

Obtain information using a database 3.00 3.0 2* 0.89 2

Use and manage e-mail. 3.83 4.0 4 0.41 1

Proficiently use the SMART Board 2.83 3.0 3 0.75 2

Use/create spreadsheets to analyze data 3.00 3.0 2* 0.89 2

Incorporate graphics into word processing 3.00 3.0 2* 0.89 2

Explore the Internet for information 4.00 4.0 4 0.00 0

*Multiple modes exist; the smallest mode is presented in the table.

The descriptive statistics for the other administrators are provided in Table 7. The

results indicate that the other administrators also believed that all of the skills listed on

the survey were very important (value of three) to essential (value of four) on average,

and there was a restricted range for all of the items. The results also indicate that the

other administrators perceived the ability to create documents using a word processing

program as the most important computer skill, and the ability to perform basic computer

operations such as running programs and loading software as the least important

computer skill, on average. However, as previously noted, all of the computer skills

listed on the survey were rated as very important to essential on average.

Table 7

Descriptive Statistics for Skill Importance Survey Items: Other Administrators

Source Mean Median Mode SD Range

Search for electronic information. 3.75 4.0 4 0.45 1

Perform basic computer operations 3.25 3.0 3* 0.75 2

Create documents using a word processer 3.92 4.0 4 0.29 1

Create multimedia presentations 3.50 4.0 4 0.67 2

Obtain information using a database 3.36 4.0 4 0.81 2

Use and manage e-mail. 3.83 4.0 4 0.58 2

Proficiently use the SMART Board 3.58 4.0 4 0.51 1

Use/create spreadsheets to analyze data 3.50 4.0 4 0.67 2

Incorporate graphics into word processing 3.75 4.0 4 0.45 1

Explore the Internet for information 3.75 4.0 4 0.45 1

*Multiple modes exist; the smallest mode is presented in the table.

Figure 1 shows the distributional characteristics of the overall skill importance

scale by group. The results indicate that the principal distribution had less variability

than did the other administrator distribution; although both distributions had a small

amount of variability. In addition, there was an extreme value above the mean within the

principal distribution. However, there were no outliers in either of the two distributions.

Finally, there was a negative skew in the data given the longer bottom whisker

representing the bottom 25% of the distribution for both distributions.

Figure 1. Box plots for the skill importance scale from the Technology Competence for

School-Based Administrators survey (Blake, 2000). The grey boxes represent the inter-

quartile range, which is defined as the middle 50% of the distribution. The upper and

lower whiskers represent the upper and lower 25% of the distribution. The black

horizontal line in the grey box represents the median, which is defined as the middle

score in a distribution. Black dots represent extreme values, which are defined as values

that fall more than two standard deviations away from the mean, and black asterisks

represent outliers, which are defined as values that fall more than three standard

deviations away from the mean (Field, 2009).
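The definitions in this caption can be expressed directly in code. The short Python sketch below flags extreme values (more than two standard deviations from the mean) and outliers (more than three standard deviations from the mean); the function name and scores are illustrative, not the study's data.

    import numpy as np

    def flag_points(scores):
        # Extreme values fall more than 2 SD from the mean; outliers fall more than
        # 3 SD from the mean (following the definitions in these captions; Field, 2009).
        scores = np.asarray(scores, dtype=float)
        z = np.abs(scores - scores.mean()) / scores.std(ddof=1)
        return {
            "extreme values": scores[(z > 2) & (z <= 3)].tolist(),
            "outliers": scores[z > 3].tolist(),
        }

    # Hypothetical scale scores (not the study data).
    print(flag_points([3.3, 3.4, 3.4, 3.5, 3.5, 4.0]))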

The descriptive statistics for the overall skill importance scale are featured in

Table 8. The results indicate that on average, principals rated the computer skills listed

on the survey as less important than did the other administrators (3.42 and 3.62,

respectively). However, both groups rated the computer skills as very important to

essential on average.

Table 8

Descriptive Statistics for the Overall Skill Importance Scale

Skill importance Mean Median Mode SD Range

Principal 3.42 3.35 3.30 0.33 1.00

Other administrator 3.62 3.65 4.00 0.37 1.10

An independent samples t-test was conducted in order to determine if the

difference between the two means was statistically significant. The results in Table 9

indicate that there was no statistically significant difference between the two skill

importance mean scale scores, t(16) = -1.13, p = .27.

Table 9

Independent Samples t-Test Results for Skill Importance

Source             Levene's F   Levene's p   Mean difference   t       df   p      95% CI lower   95% CI upper
Skill importance   1.01         0.33         -0.20             -1.13   16   0.27   -0.58          0.18

The results for research and null hypothesis one indicate that there was no

statistically significant difference in the mean scale scores for skill importance between

the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants). Therefore the null hypothesis was retained.

Research and null hypothesis two. The second research hypothesis predicted

that there is a statistically significant difference in the mean scale scores for technology

competence between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants). The complementary null

hypothesis predicted that there is no statistically significant difference in the mean scale scores for technology

competence between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants). Descriptive statistics and an

independent samples t-test were conducted in order to address these hypotheses.

Table 10 provides the descriptive statistics for each of the technology competence

items on the survey according to the principals’ responses. All of the items were based

on a four-point scale with the exception of database, which was based on a two-point

scale. The results indicate that the principals had some variability within the various

competencies as well as across the various competencies. On average, principals rated

themselves as most competent relative to technology integration. All of the principals

indicated that they encourage and support teachers to use technology to enhance lessons.

However, the principals rated themselves as least competent with regard to using a

SMART board. Their most common response pertaining to the use of SMART boards

was that they do not use a SMART board.

Table 10

Descriptive Statistics for Technology Competence Survey Items: Principals

Source Mean Median Mode SD Range

Word processing 3.33 3.5 4 0.82 2

Computer basics 3.17 3.0 3 0.75 2

Database 1.83 2.0 2 0.41 1

Spreadsheet 3.00 3.0 2* 0.89 2

SMART board 2.33 2.0 1* 1.37 3

E-mail 3.50 3.5 3* 0.55 1

PowerPoint 3.50 3.5 3* 0.55 1

Technology integration 4.00 4.0 4 0.00 0

*Multiple modes exist; the smallest mode is presented in the table.

The descriptive statistics for the other administrator group are presented in Table

11. The results indicate that the other administrators also showed some variability within

each competency as well as across the competencies. On average, the other

administrators also rated themselves to be most competent in technology integration, with

their most common response being that they encourage and support teachers to use

technology to enhance lessons. Furthermore, the other administrators also rated

themselves as least competent in the use of SMART boards. However, even though they

rated themselves as least competent in this area, the most common response was that they

can use the Notebook software and they can create new lessons using the software.

Therefore their competency levels in this area were still relatively high, on average.

Table 11

Descriptive Statistics for Technology Competence Survey Items: Other Administrators

Source Mean Median Mode SD Range

Word processing 3.58 4.0 4 0.51 1

Computer basics 3.42 3.5 4 0.67 2

Database 1.83 2.0 2 0.39 1

Spreadsheet 3.08 3.0 4 0.90 2

SMART board 3.00 3.5 4 1.13 3

E-mail 3.33 3.0 3 0.49 1

PowerPoint 3.50 4.0 4 0.67 2

Technology integration 3.75 4.0 4 0.62 2

The distributional characteristics for the technology competence scale for the two

groups are featured in Figure 2. However, it is important to note that since the database

item was based on a two-point scale, the scores for that item were multiplied by two in

order to keep the overall scale on a four-point scale. The results indicate that the two

distributions were relatively similar, and both distributions had a relatively large inter-

quartile range. Furthermore, there were no extreme values or outliers in either of the two

distributions. Finally, both distributions were slightly negatively skewed given the longer

whiskers for the bottom 25% of the distribution as compared to the top 25% of the

distribution.
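The rescaling step described above is a simple multiplication applied before averaging. The sketch below shows it with hypothetical item responses; the column names are illustrative, not the survey's actual variables.

    import pandas as pd

    # Hypothetical technology competence items (not the study data).
    items = pd.DataFrame({
        "word_processing": [4, 3, 2],  # four-point items
        "spreadsheet":     [3, 4, 3],
        "database":        [2, 1, 2],  # two-point item
    })

    # Double the two-point database item so every item shares the four-point metric,
    # then average the items into one overall technology competence scale score.
    items["database"] = items["database"] * 2
    items["technology_competence"] = items[
        ["word_processing", "spreadsheet", "database"]].mean(axis=1)
    print(items)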

Figure 2. Box plots for the technology competence scale from the Technology

Competence for School-Based Administrators survey (Blake, 2000). The grey boxes

represent the inter-quartile range, which is defined as the middle 50% of the distribution.

The upper and lower whiskers represent the upper and lower 25% of the distribution.

The black horizontal line in the grey box represents the median, which is defined as the

middle score in a distribution. Black dots represent extreme values, which are defined as

values that fall more than two standard deviations away from the mean, and black

asterisks represent outliers, which are defined as values that fall more than three standard

deviations away from the mean (Field, 2009).

The descriptive statistics for the overall technology competence scale are featured

in Table 12. The results indicate that on average, principals rated their own technology

competency lower than the other administrators rated their own technology competency

(3.08 and 3.19, respectively). However, both groups rated their technology competency

to be good on average.

Table 12

Descriptive Statistics for the Overall Technology Competence Scale

Technology competency Mean Median Mode SD Range

Principal 3.08 3.00 2.75 0.55 1.38

Other administrator 3.19 3.19 2.75 0.45 1.25

An independent samples t-test was conducted in order to determine if the

difference between the two means was statistically significant. The results in Table 13

indicate that there was no statistically significant difference between the two technology

competence mean scale scores, t(16) = -0.41, p = .69.

Table 13

Independent Samples t-Test Results for Technology Competence

Source                  Levene's F   Levene's p   Mean difference   t       df   p      95% CI lower   95% CI upper
Technology competence   0.46         0.51         -0.10             -0.41   16   0.69   -0.65          0.44

The results for research and null hypothesis two indicate that there was no

statistically significant difference in the mean scale scores for technology competence

between the secondary principals and the other administrators (assistant principals, vice

principals or administrative assistants). Therefore the null hypothesis was retained.

Research and null hypothesis three. The third research hypothesis predicted that

there is a statistically significant difference in the mean scale scores for frequency of use

between the secondary principals and the other administrators (assistant principals, vice

principals or administrative assistants). The complementary null hypothesis predicted

that there is no statistically significant difference in the mean scale scores for frequency of use between the

secondary principals and the other administrators (assistant principals, vice principals or

administrative assistants). Descriptive statistics and an independent samples t-test were

conducted in order to address these hypotheses.

Table 14 provides the descriptive statistics for each of the frequency of use items

on the survey based on the principals’ responses. The results indicate that the principals

had the most variability in their use of graphics, and they had no variability in their use of

e-mail given that all of the principals indicated that they use e-mail on a daily basis.

Therefore the principals used e-mail most often. However, principals were not likely to

use graphics or a SMART board, and they used graphics and SMART boards least often

on average.

Table 14

Descriptive Statistics for Frequency of Use Survey Items: Principals

Source Mean Median Mode SD Range

Word processor 3.67 4.0 4 0.52 1

Database 3.00 3.0 3 0.63 2

Spreadsheet 3.17 3.5 4 0.98 2

Presentation software 2.33 2.0 2 0.52 1

Graphics 1.83 1.5 1 1.17 3

Internet 3.50 4.0 4 0.84 2

Electronic mail 4.00 4.0 4 0.00 0

Information search 3.00 3.0 3 0.63 2

SMART board 1.83 2.0 2 0.75 2

The descriptive statistics for other administrators are presented in Table 15. The

results indicate that again, e-mail was used most often with all of the other administrators

indicating that they use e-mail every day. In addition, the results indicate that the other

administrators were the most diverse relative to their use of SMART boards. Finally,

other administrators used a database, graphics, and a SMART board least often; although

their frequency of use was still fairly regular, on average.

Table 15

Descriptive Statistics for Frequency of Use Survey Items: Other Administrators

Source Mean Median Mode SD Range

Word processor 3.83 4.0 4 0.39 1

Database 2.75 3.0 3 0.62 2

Spreadsheet 3.33 3.0 3 0.65 2

Presentation software 3.00 3.0 3 0.60 2

Graphics 2.75 2.5 2 0.87 2

Internet 3.83 4.0 4 0.39 1

Electronic mail 4.00 4.0 4 0.00 0

Information search 3.50 4.0 4 0.67 2

SMART board 2.75 3.0 2* 0.97 3

*Multiple modes exist; the smallest mode is presented in the table.

Figure 3 displays the distributional characteristics for the overall frequency of use

scale. The results indicate that the overall variability in the principals’ frequency of use

scale scores was smaller than the overall variability in the other administrators’ frequency

of use scale scores. The results also indicate that there was an extreme score above the

mean for both distributions. Furthermore, the other administrator distribution was more

symmetrical than the principal distribution, given the equal length of the whiskers.

Finally, the difference between the two medians suggests that a moderate to large

difference exists between the two groups relative to their frequency of use.

Figure 3. Box plots for the frequency of use scale from the Technology Competence for

School-Based Administrators survey (Blake, 2000). The grey boxes represent the inter-

quartile range, which is defined as the middle 50% of the distribution. The upper and

lower whiskers represent the upper and lower 25% of the distribution. The black

horizontal line in the grey box represents the median, which is defined as the middle

score in a distribution. Black dots represent extreme values, which are defined as values

that fall more than two standard deviations away from the mean, and black asterisks

represent outliers, which are defined as values that fall more than three standard

deviations away from the mean (Field, 2009).

The descriptive statistics for the overall frequency of use scale are provided in

Table 16. The results indicate that on average, principals rated their own frequency of

use lower than the other administrators rated their own frequency of use (2.93 and 3.31,

respectively). However, both groups rated their frequency of use to be fairly regular

(principals’ use approximately once a week and other administrators’ use between once a

week and daily) on average.

Table 16

Descriptive Statistics for the Overall Frequency of Use Scale

Frequency of use Mean Median Mode SD Range

Principal 2.93 2.89 2.89 0.30 0.89

Other administrator 3.31 3.22 3.22 0.31 1.11

An independent samples t-test was conducted in order to determine if the

difference between the two means was statistically significant. The results in Table 17

indicate that there was a statistically significant difference between the two frequency of

use mean scale scores, t(16) = -2.50, p = .02. Therefore, principals used technology less

frequently than did other administrators.

Table 17

Independent Samples t-Test Results for Frequency of Use

Source             Levene's F   Levene's p   Mean difference   t       df   p      95% CI lower   95% CI upper
Frequency of use   0.09         0.77         -0.38             -2.50   16   0.02   -0.70          -0.06

The results for research and null hypothesis three indicate that there was a

statistically significant difference in the mean scale scores for frequency of use between

the secondary principals and the other administrators (assistant principals, vice principals

or administrative assistants). Specifically, principals used technology less frequently than

did other administrators. Therefore the null hypothesis was rejected.

Research and null hypothesis four. The fourth and final research hypothesis

predicted that there is a statistically significant difference in the mean scale scores for

perceptions and attitudes of school-based administrators toward computer use and the use

of computer applications. The complementary null hypothesis predicted that there is no

statistically significant difference in the mean scale scores for perceptions and attitudes of

school-based administrators toward computer use and the use of computer applications.

Descriptive statistics and an independent samples t-test were conducted in order to

address these hypotheses.

Table 18 provides the descriptive statistics for each of the perception and attitude

items on the survey based on the principals’ responses. The results indicate that the

principals tended to agree to strongly agree with all of the items and therefore they

responded favorably to all of the items, on average. In fact, all of the principals strongly

agreed that e-mail is an effective and essential tool for communication and sharing of

information, and therefore they agreed most with that particular item. Principals agreed

least that the use of technology in the classroom is among the greatest challenges and

responsibilities facing administrators today, and administrators, teachers, and students

should be able to proficiently use and deploy SMART boards. However, as previously

stated, they agreed or strongly agreed with all of the items on average.

Table 18

Descriptive Statistics for Perception & Attitude Survey Items: Principals

Source Mean Median Mode SD Range

Expect teacher and student proficiency 3.67 4.0 4 0.52 1

Administration basic knowledge 3.83 4.0 4 0.41 1

Computer is an essential tool 3.83 4.0 4 0.41 1

Confident in staff and their expertise 3.33 3.0 3 0.52 1

E-mail is effective and essential tool 4.00 4.0 4 0.00 0

Use of technology is major challenge 3.17 3.0 3 0.75 2

Technology is the future 3.67 4.0 4 0.52 1

All proficient in use of SMART board 3.17 3.0 3 0.75 2

Principals/teachers bring technology 3.50 3.5 3* 0.55 1

Technology training is needed daily 3.50 3.5 3* 0.55 1

*Multiple modes exist; the smallest mode is presented in the table.

The other administrator results provided in Table 19 indicate that there was not

much variability in their responses, and they agreed or strongly agreed with all of the

items, on average. As with the principals, all of the other administrators strongly agreed

that e-mail is an effective and essential tool for communication and sharing of

information. Therefore other administrators also agreed most with that particular item.

Other administrators agreed least that administrators, teachers, and students should be

able to proficiently use and deploy SMART boards, as did the principals. However, as

previously stated, other administrators agreed or strongly agreed with all of the items on

average.

Table 19

Descriptive Statistics for Perception & Attitude Survey Items: Other Administrators

Source Mean Median Mode SD Range

Expect teacher and student proficiency 3.58 4.0 4 0.51 1

Administration basic knowledge 3.92 4.0 4 0.29 1

Computer is an essential tool 3.75 4.0 4 0.45 1

Confident in staff and their expertise 3.67 4.0 4 0.49 1

E-mail is effective and essential tool 4.00 4.0 4 0.00 0

Use of technology is major challenge 3.50 4.0 4 0.67 2

Technology is the future 3.67 4.0 4 0.78 2

All proficient in use of SMART board 3.50 4.0 4 0.67 2

Principals/teachers bring technology 3.83 4.0 4 0.39 1

Technology training is needed daily 3.83 4.0 4 0.39 1

The distributional characteristics for the overall perceptions and attitudes scale are

presented in Figure 4. The box plots highlight the fact that the principals had more

variability in their scale scores and that both groups of participants had favorable

perceptions and attitudes given that all of the scale scores were three or above. The other

administrator distribution was relatively symmetrical given that the upper and lower

whiskers were approximately equal in length, while the principal distribution was slightly

negatively skewed. Finally, the results indicate that there were no extreme values or

outliers in either of the two distributions.

Figure 4. Box plots for the perceptions and attitudes scale from the Technology

Competence for School-Based Administrators survey (Blake, 2000). The gray boxes

represent the inter-quartile range, which is defined as the middle 50% of the distribution.

The upper and lower whiskers represent the upper and lower 25% of the distribution.

The black horizontal line in the gray box represents the median, which is defined as the

middle score in a distribution. Black dots represent extreme values, which are defined as

values that fall more than two standard deviations away from the mean, and black

asterisks represent outliers, which are defined as values that fall more than three standard

deviations away from the mean (Field, 2009).

The descriptive statistics for the overall perceptions and attitudes scale are

presented in Table 20. The results indicate that on average, principals’ perceptions and

attitudes were not as favorable as other administrators’ perceptions and attitudes (3.55

and 3.72, respectively); although both groups had very favorable perceptions and

attitudes about technology.

Table 20

Descriptive Statistics for the Overall Perceptions and Attitudes Scale

Perceptions and attitudes Mean Median Mode SD Range

Principal 3.55 3.55 3.30 0.40 1.00

Other administrator 3.72 3.80 3.80 0.24 0.67

An independent samples t-test was conducted in order to determine if the

difference between the two means was statistically significant. The results in Table 21

indicate that there was no statistically significant difference between the two perceptions

and attitudes mean scale scores, t(7) = -0.95, p = .38.

Table 21

Independent Samples t-Test Results for Perceptions and Attitudes

Source                      Levene's F   Levene's p   Mean difference   t       df   p      95% CI lower   95% CI upper
Perceptions and attitudes   6.09         0.03         -0.17             -0.95   7    0.38   -0.59          0.26

Note. Levene’s test of equality of error variance indicates that the assumption was violated and therefore

the results for equal variances not assumed were presented. The reduced degrees of freedom (df) are due to

a statistical adjustment that was made as a consequence of the statistical violation.
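The equal-variances-not-assumed results referred to in this note correspond to Welch's t-test, whose degrees of freedom are reduced by the Welch-Satterthwaite adjustment. A minimal sketch with hypothetical group scores (not the study data) follows.

    import numpy as np
    from scipy import stats

    # Hypothetical perceptions-and-attitudes scale scores for the two groups.
    rng = np.random.default_rng(21)
    principals = rng.normal(3.55, 0.40, size=6)
    other_admins = rng.normal(3.72, 0.24, size=12)

    # Welch's t-test does not assume equal error variances.
    result = stats.ttest_ind(principals, other_admins, equal_var=False)

    # Welch-Satterthwaite adjustment: the reduced degrees of freedom.
    v1 = principals.var(ddof=1) / principals.size
    v2 = other_admins.var(ddof=1) / other_admins.size
    df = (v1 + v2) ** 2 / (v1 ** 2 / (principals.size - 1) + v2 ** 2 / (other_admins.size - 1))

    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.2f}, adjusted df = {df:.1f}")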

The results for research and null hypothesis four indicate that there was no

statistically significant difference in the mean scale scores for perceptions and attitudes of

school-based administrators toward computer use and the use of computer applications.

Therefore the null hypothesis was retained.

Summary

This study investigated the level of technology competence for secondary

principals and other school-based administrators (assistant principals, vice principals,

etc.), who were identified by the principal as proficient users of technology in the

schools. The study focused on the use of computer applications in administrative

functions of secondary principals. This study also examined the relationship between the

level of use of computer applications by the secondary principals and previous computer

use, computer training, perceptions, and attitudes that were held by the school

administrators toward computers.

The results of this study indicate that principals and other administrators are not

statistically significantly different with regard to their appraisals of the importance of

technology skills, their appraisals of their technology competence, or in their perceptions

and attitudes regarding computer use and the use of computer applications. However,

principals were found to be statistically significantly less likely to use technology when

compared to other administrators. Finally, the results of this study indicate that both

principals and other administrators were found to place a high level of importance on

technology skills, rate themselves as fairly to highly technologically competent, use

technology frequently, and have positive perceptions and attitudes about technology.

CHAPTER 5. RESULTS, CONCLUSIONS, AND RECOMMENDATIONS

Chapter 5 presents a summary of the study; a summary of the findings from Chapter 4 as they relate to the research questions and hypotheses that guided the study; conclusions; recommendations for further research and practice; and implications of this research study.

Summary of the Study

The study focused on the use of computer applications in administrative functions

of secondary principals. The conclusions from this study indicated whether the

technology competence of school-based administrators was at an acceptable level, one that would ensure effective and efficient utilization of technologies in the educational environment. This study sought to identify specific technology skills and knowledge that school-based administrators should possess and to describe the appropriate level of competence for each

of the technology skill areas.

The methodology that was used in this study was quantitative. The study used a

descriptive design as a means to investigate the level of technology competence for

secondary principals and other school-based administrators (assistant principals, vice

principals or administrative assistants). This type of study examined the extent to which

differences in one or more variables were related to differences in another variable (Leedy &

Ormrod, 2005).

This research design used a Web-based survey to gather data that was relevant to

the study. The Web link to the survey was provided in an email, which described the

purpose of the research study. The instructions for the survey provided clear definitions

of constructs at the beginning of the survey instrument. A descriptive design allowed the

researcher to collect data from secondary principals and other school-based

administrators (assistant principals, vice principals or administrative assistants). All

secondary principals and other school-based administrators (assistant principals, vice

principals or administrative assistants) in the selected school systems were invited to

participate in the online survey.

The population was secondary principals and other school-based administrators

(assistant principals, vice principals, or administrative assistants). The principals and

other school-based administrators (assistant principals, vice principals, or administrative

assistants) were purposively selected. Purposive sampling was used to investigate the

technology competence for secondary principals and other school-based administrators

(assistant principals, vice principals, or administrative assistants). All eighteen of the

participants were administrators in Grades 9-12. All eighteen participants were from

schools that were connected by a computer network, and from school systems that were

connected by a computer network.

The survey instrument that was used was the Technology Competence for School-Based Administrators survey (Blake, 2000). Blake (2000) developed and used the survey for schools in the state of Florida. This survey instrument was modified for use with

school administrators in the Tri-County located in the southeastern part of South

Carolina. The modified survey instrument was the Technology Competence Survey for

School-Based Administrators.

The survey instrument was used to estimate the percentage of the population that had

specific attributes, when the researcher collected data from a small portion of the total

population (Dillman, 2000; Hardy, 2005; Wallen & Fraenkel, 2001). On-line surveys

were a very promising research tool to access and involve people (Buchanan, 2002;

Herrero & Meneses, 2006; Nesbary, 2000). This study sought to identify specific

technology skills and knowledge that school-based administrators possessed and to

describe the appropriate level of competence for each of the technology skill areas. The

ten technology applications included: word processing, computer basics, database,

spreadsheet, Internet, desktop publishing, presentation, email, technology integration, and

SMART board usage.

The researcher developed and administered a demographic questionnaire. The

demographic questionnaire was designed to determine specific demographic information

of the participants: (a) the participants’ school, (b) the participants’ school district, (c) age

of the participant, (d) ethnicity, (e) years of experience, (f) highest level of education

completed, and (g) primary position. The participants were asked to mark Yes, No, or

Don’t Know to indicate whether they had access, no access, or limited access to computers and a network.

The limitations of this study were: (a) this study was limited to public school

secondary principals and other administrators (assistant principals, vice principals or

administrative assistants). State requirements for certification for all public school

administrators were uniform, whereas, all non-public school administrators may not have

specified certification requirements; (b) this study was limited to the Tri-County school-

based administrators located in the southeastern part of South Carolina. Generalizing the

results to other states may be limited due to different certification requirements for school

administrators and variable fiscal priorities on the implementation of technologies; (c) the

results of the study were limited by the availability of technology and the application

software and hardware that is available to the participants. Due to technology acquisition

strategies of the participants’ schools or school districts, the participants had significantly

different opportunities to use and to learn technology in the performance of

their administrative job responsibilities; (d) data obtained from this study was dependent

on the truthfulness and accuracy of the participants; (e) the related competence level of

the participants and their actual skills may have been inflated due to the instrument’s self-reporting

nature; (f) data collection was restricted to the Technology Competence Survey for

School-Based Administrators; and (g) the number in this study was limited to the school-

based administrators who actually responded to the survey.

Every precaution was taken to protect the confidentiality, anonymity, and privacy of the data and of the participants who were involved. There was no exchange of money; therefore, no ethical issues arose. An informed consent form was included, along with the purpose of the study and the instructions for responding to

the survey instrument. The informed consent form explained the ethical considerations

and the assurance of confidentiality, privacy, and anonymity to all participants.

Summary of Findings and Conclusion

The following summary of the findings and conclusions was developed through

an analysis of the data that were gathered through the research study. This summary

provides a brief but concise analysis of the data to promote an understanding of the level

of technology competence for secondary principals and other school-based administrators

(assistant principal, vice principal, or administrative assistants).

A descriptive summary of the survey participants was provided. The conclusions

of the study indicated that the most common school size was between 100 and 200 students (44.4%), followed by more than 500 students (33.3%), and finally between 301 and 400

students (22.2%). The conclusions indicated that 33.3% of the survey participants were

secondary principals and the remaining 66.7% were other administrators, which included

assistant principals (16.7%), vice principals or other principals (38.9%), and

administrative assistants (11.1%). The conclusions indicated that the majority of the

participants had between zero and five years of experience as an administrator (66.7%).

However, 22.2% had between 16 and 20 years of work experience as an administrator.

The conclusions of the study indicated that the vast majority of the participants

had a computer in their office exclusively for their work (88.9%). Only two participants

(11.1%) indicated that they had access to a computer in a room other than their office.

A statistical software package known as PASW (formerly known as SPSS) was

used to score, code, and analyze the survey research data. The conclusions of this

research study indicated that principals and other administrators are not statistically

significantly different with regard to their appraisals of the importance of technology

skills, their appraisals of their technology competence, or in their perceptions and

attitudes regarding computer use and the use of computer applications. Principals were

found to be statistically significantly less likely to use technology when compared to

other administrators. The conclusions of this research study indicate that both principals

and other administrators were found to place a high level of importance on technology

skills, rate themselves as fairly to highly technologically competent, use technology

frequently, and have positive perceptions and attitudes about technology.

Research and Null Hypothesis One

Research Hypothesis One predicted that there is a statistically significant difference in the mean scale scores for skill importance between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants). The complementary null

hypothesis predicted that there is no statistically significant difference in the mean scale scores for skill importance between the secondary

principals and the other administrators (assistant principals, vice principals or

administrative assistants).

Findings

The findings indicated that the principals tended to believe that the skills listed on

the survey were very important (value of three) to essential (value of four) on average,

and there was a restricted range for all of the items. The findings also indicated that the

principals perceived the ability to search for electronic information and the ability to

explore the Internet for information as the most important computer skills given that

every principal rated those two skills as essential. Finally, the principals perceived the

ability to create multimedia presentations and the ability to proficiently use the SMART

board as the least important computer skills.

The findings indicated that the other administrators also believed that all of the

skills listed on the survey were very important (value of three) to essential (value of four)

on average, and there was a restricted range for all of the items. The findings also

indicate that the other administrators perceived the ability to create documents using a

word processing program as the most important computer skill, and the ability to perform

basic computer operations such as running programs and loading software as the least

important computer skill, on average. However, as previously noted, all of the computer

skills listed on the survey were rated as very important to essential on average.

Conclusion

The conclusion drawn from these findings indicated that there was no statistically

significant difference in the mean scale scores for skill importance between the secondary

principals and the other administrators (assistant principals, vice principals or

administrative assistants).

Research and Null Hypothesis Two

Research Hypothesis Two predicted that there is a statistically significant difference in the mean scale scores for

technology competence between the secondary principals and the other administrators

(assistant principals, vice principals or administrative assistants). The complementary

null hypothesis predicted that there is no statistically significant difference in the mean scale scores for technology competence between the

secondary principals and the other administrators (assistant principals, vice principals or

administrative assistants).

Findings

The findings indicated that the principals had some variability within the various

competencies as well as across the various competencies. On average, principals rated

themselves as most competent relative to technology integration. All of the principals

indicated that they encourage and support teachers to use technology to enhance lessons.

However, the principals rated themselves as least competent with regard to using a

SMART board. Their most common response pertaining to the use of SMART boards

was that they do not use a SMART board.

The findings indicated that the other administrators also showed some variability

within each competency as well as across the competencies. On average, the other

administrators also rated themselves to be most competent in technology integration, with

their most common response being that they encourage and support teachers to use

technology to enhance lessons. Furthermore, the other administrators also rated

themselves as least competent in the use of SMART boards. However, even though they

rated themselves as least competent in this area, the most common response was that they

can use the Notebook software and they can create new lessons using the software.

Therefore their competency levels in this area were still relatively high, on average.

Conclusion

The conclusion drawn from these findings indicated that there was no statistically significant difference

in the mean scale scores for technology competence between the secondary principals

and the other administrators (assistant principals, vice principals or administrative

assistants).

Research and Null Hypothesis Three

Research Hypothesis Three predicted that there is a statistically significant difference in the mean scale scores for

frequency of use between the secondary principals and the other administrators (assistant

principals, vice principals or administrative assistants). The complementary null

hypothesis predicted that there is no statistically significant difference in the mean scale scores for frequency of use between the secondary

principals and the other administrators (assistant principals, vice principals or

administrative assistants).

Findings

The findings indicated that the principals had the most variability in their use of

graphics, and they had no variability in their use of e-mail given that all of the principals

indicated that they use e-mail on a daily basis. Therefore the principals used e-mail most

often. However, principals were not likely to use graphics or a SMART board, and they

used graphics and SMART boards least often on average. The findings indicated that

again, e-mail was used most often with all of the other administrators indicating that they

use e-mail every day. In addition, the findings indicated that the other administrators

were the most diverse relative to their use of SMART boards. Finally, other

administrators used a database, graphics, and a SMART board least often; although their

frequency of use was still fairly regular, on average.

The findings indicated that on average, principals rated their own frequency of

use lower than the other administrators rated their own frequency of use (2.93 and 3.31,

respectively). However, both groups rated their frequency of use to be fairly regular

(principals’ use approximately once a week and other administrators’ use between once a

week and daily) on average.

Conclusion

The conclusion drawn from these findings indicated that there was a statistically

significant difference between the two frequency of use mean scale scores. Therefore,

principals used technology less frequently than did other administrators.

Research and Null Hypothesis Four

Research Hypothesis Four, the final hypothesis, predicted that there is a statistically significant difference in the mean scale

scores for perceptions and attitudes of school-based administrators toward computer use

and the use of computer applications. The complementary null hypothesis predicted that there is no statistically significant difference in the

mean scale scores for perceptions and attitudes of school-based administrators toward

computer use and the use of computer applications.

Findings

The findings indicated that the principals tended to agree to strongly agree with

all of the items, and therefore, they responded favorably to all of the items, on average.

In fact, all of the principals strongly agreed that e-mail was an effective and essential tool

for communication and sharing of information, and therefore they agreed most with that

particular item. Principals agreed least that the use of technology in the classroom was

among the greatest challenges and responsibilities facing administrators today, and

administrators, teachers, and students should be able to proficiently use and deploy

SMART boards. However, as previously stated, they agreed or strongly agreed with all

of the items on average.

The other administrator findings indicated that there was not much variability in

their responses, and they agreed or strongly agreed with all of the items, on average. As

with the principals, all of the other administrators strongly agreed that e-mail is an

effective and essential tool for communication and sharing of information. Therefore

other administrators also agreed most with that particular item. Other administrators

agreed least that administrators, teachers, and students should be able to proficiently use

and deploy SMART boards, as did the principals. However, as previously stated, other

administrators agreed or strongly agreed with all of the items on average.

Conclusion

The conclusion drawn from these findings indicated that there was no statistically

significant difference between the mean scale scores for perceptions and attitudes of

school-based administrators toward computer use and the use of computer applications.

Recommendations

The following recommendations were intended to assist in the accumulation of

knowledge regarding the investigation of the level of technology competence for

secondary principals and other school-based administrators (assistant principals, vice

principals, or administrative assistants), who were identified by the principal as proficient

users of technology in the schools. While this study successfully answered the research

questions and hypotheses, there were additional avenues of inquiry and areas of

interest that arose from the analysis of research.

Recommendations for Future Research

1. This research study was conducted on a small scale. All 18 of the participants completed the Web-based survey, so the response rate was excellent. However, this research study could be repeated on a larger scale to determine whether results by geographic region are consistent with the results found in this research study.

A larger population would allow for a sufficient sample for data analysis. The

sample size could be increased to support the validity of the conclusions found in this

study. School districts should perform other studies that are similar to this study to

determine the areas of need for administrators within their own districts. Data from these

studies would be beneficial in developing comprehensive staff development programs to

help in developing the necessary competencies in current school administrators.

2. The uses of technology in schools for educational purposes could

be areas for future research (Lay, 2007; Page-Jones, 2008). The use of technology for

educational, instructional purposes, and the role of leadership in technology were all

avenues for further research (Page-Jones, 2008; Veneszky, 2004). Training workshops

helped to raise principals’ awareness of the use of technology and helped to build their

confidence in using technology (Serhan, 2007).

3. Further research in the use of technology and more hands-on training for school

principals were needed. Future studies that were conducted simultaneously with training

workshops were recommended to assess the principals' abilities to use and evaluate the

different technologies. Further research can be done to explore how teachers in the high

school describe their use of technology to support teaching and learning.

4. This study can be replicated by using middle school and elementary principals.

This study consisted of principals and other administrators in secondary schools. The

sample size could be increased to ensure that all types of schools (elementary and middle)

are represented in the sample. This type of study would give educators and stakeholders

an opportunity to make data-driven decisions, which would be based on the findings at

each school level. Schools could reassess yearly to ensure that school administrators were

developing the necessary technology competencies.

Recommendations for Practice

1. School systems should provide all administrators access to computers,

software, and the latest technologies. Since technology played an important role in

enabling data-driven decision-making, school administrators must be able to have the

necessary tools to make these data-driven decisions. School administrators must stay

abreast of state technology plans, district technology plans, and related policies to ensure

that their schools are in compliance.

2. School systems should provide administrators training in school technology

management. In order to strengthen leadership at the school, school administrators must

be able to provide creative as well as transformative leadership for systematic change in

this era of rapidly evolving information and communication technology.

School administrators must have skills and processes that are used to improve instruction.

3. School systems should provide principals technology professional development

to increase the effectiveness of technology integration. School administrators must

support teachers in their planning and collaboration with technology. School

administrators must be able to provide principles of technology professional development

that increases the effectiveness of technology integration. School administrators should

receive adequate training and continuing education on how to best integrate technology

within their schools and should be evaluated for their proficiency in doing so.

4. School administrators should be knowledgeable about technology in order to

provide guidance concerning technology integration and use. School administrators must

support teachers in the role of providing adequate technology support. School

administrators should be actively involved in the development, the implementation and

the evaluation of technology integration goals.

5. School administrators should continue to follow the guidelines and standards

that were set forth by the International Society for Technology in Education (ISTE) and

the National Educational Technology Standards for Administrators (NETS-A). School

administrators must realize that the standards are the roadmap to teaching effectively and

growing professionally in this digital world. With technology changing in our society,

school administrators must demonstrate the behaviors and skills of digital professionals.

Implications

This research study focused on and investigated the level of technology

competence for secondary principals and other school-based administrators (assistant

principals, vice principals, or administrative assistants), who were identified by the

principal as proficient users of technology in the schools. Principals were expected to be

able to manage the explosive change through an increasing reliance on technological

information as well as to become key leaders in managing schools. Computer

technologies were entering school administration systems and were affecting the work

places and faces of administrators, teachers, and even changing the whole nature and

structure of the organization (Yu, Chang, & Tsai, 2009).

The administrators who responded to the survey were technologically proficient

according to the results of the survey. The findings of this research study were consistent

with previous research that principals, who modeled the use of technology, shared their

learning, and actively learned about technology, were more likely to have faculty and

students who used technology in their daily practice (Page-Jones, 2008).

School leadership played a very important role in the implementation of

technology in schools (Tooms, Acomb & McGlothlein, 2004; Golden, 2004; Serhan,

2007). Principals assumed an effective role in supporting and advocating the use of

technology in their schools, when they were introduced to the different available

technology resources and the role of technology in advancing their schools (Serhan,

2007). “When school principals feel comfortable using the technology and realize its

possible applications in education then they can help facilitate its incorporation into the

curriculum. A positive attitude starting from the school leadership can spread to the

teaching faculty in the school and hence to the classroom and the students” (Tooms et al.,

2004, p. 14).

School administrators must provide positive reinforcement through mentors,

incentives, and staff development for integration of technology in the schools to promote

successful technology integration into the curriculum (Webb, 2011). According to

Flanagan and Jacobsen (2003), principals and teachers are faced today with the huge task

of reinventing classrooms and schools in a society that has been transformed by digital

technologies. With this implementation of technologies in the school, many

administrators felt overwhelmed by the mandate to integrate computer technology into

every subject, grade, and phase in the school. School administrators were now required

to assume leadership responsibilities in areas with which they were unfamiliar and for

which they had not received training (Flanagan & Jacobsen, 2003).

Merely installing networks and computers in schools was insufficient for

educational reform. School administrators must know and understand pedagogical

issues and concerns about equity, have access to professional development, and have informed

leadership (Flanagan & Jacobsen, 2003). School administrators need to look at how

technology may help to solve problems, help to make decisions, and how to interact using

computers as tools (Flanagan & Jacobsen, 2003; Kearsley, 1998).

One of the many requirements of an effective school leader was providing strong

technology leadership (Redish & Chan, 2001). There had been very little attention given

to preparing school administrators for their role as technology leaders. Research indicated

that there were very few school administrators who used technology meaningfully to

improve the effectiveness and efficiency of their own work (Redish & Chan, 2001; Reidl,

Smith, Ware, Wark, & Yount, 1998).

School administrators must have basic technology competency. Without basic

technology competency, school administrators lack the ability to understand the various

policy and planning issues that were related to successful implementation of technology

(McLeod, Hughes, Richardson, Dikkers, Becker, Quinn, Logan, & Mayrose, 2005;

Redish & Chan, 2001).

As technology becomes increasingly important to the field of education in the

United States, the technology competence of secondary principals and other school-based

administrators needed to be investigated to identify what specific technology skills and

knowledge they possess, and what competencies were associated with a successful

educational leader. Examining the competency of school-based administrators and the

increased of technology use, had the potential to lead to a greater understanding of policy

differences, organizational practices, and how school-to-school technology was being

used as a teaching and learning tool (Blake, 2000).

REFERENCES

Afshari, M., Bakar, K. A., Luan, W. S., Samah, B. A., & Fooi, F. S. (2009). Technology

and school leadership. Technology, Pedagogy, and Education, 18(2). 235-248.

Retrieved from

http://ejournals.ebsco.com.library.capella.edu/direct.asp?ArticleID=4CB49137E1

1BF21B4B88

Albright, M. J., & Nworie, J. (2008). Rethinking academic technology leadership in the

era of change. Educause Quarterly, 31(1), 1-6. Retrieved from

http://www.educause.edu/EDUCAUSE+Quarterly/EQVolume312008/EDUCAU

SEQuarterlyMagazineVolum/162507

Aliaga, M., & Gunderson, B. (2002). Interactive statistics. (3rd ed.). Upper Saddle River,

NJ: Prentice Hall.

Alig-Mielcarek, J. M. (2003). A model of school success: Instructional leadership,

academic press, and student achievement. (Doctoral dissertation, Ohio State

University). Retrieved from http://etd.ohiolink.edu/send-

pdf.cgi/AligMielcarek%20Jana%20Michelle.pdf?acc_num=osu1054144000

Alonso, F., Manrique, D., & Viñes, J. (2009). A moderate constructivist e-learning

instructional model evaluated on computer specialists. Computers & Education,

53(1), 57-65. Retrieved from ScienceDirect Social & Behavioral Science

database.

Ambler, G. (2006). The practice of leadership: Howard Gardner defines leadership.

Retrieved from

http://www.thepracticeofleadership.net/2006/02/25/howard-gardner-defines-

leadership/

Ambler, G. (2008). The practice of leadership: Howard Gardner defines leadership.

Retrieved from

http://www.thepracticeofleadership.net/2008/11/24/howard-gardner-defines-

leadership-2/

American School of Professional Psychology (2009). Purposive sampling. Retrieved

from

http://changingminds.org/explanations/research/sampling/purposive_sampling.ht

m

Amrein, A. L., & Berliner, D. (2003, February). The effects of high stakes testing on

student motivation and learning. Educational Leadership, 60(5), 32-37.

Retrieved from http://search.proquest.com/docview/224842817?accountid=27965


Anderson, R. E., & Dexter, S. (2000). School technology leadership: Incidence and

impact. UC Irvine: Center for Research on Information Technology and

Organizations. Retrieved from http://escholarship.org/uc/item/76s142fc

Anderson, R. E., & Dexter, S. (2005). School technology leadership: An empirical

investigation of prevalence and impact. Retrieved from

http://sdexter.net/Vitae/Ander_Dex-FInalLdrshp_12-03.doc

Association for Educational Communications and Technology. (2004). What is the

knowledge base? Retrieved from

http://www.aect.org/standards/knowledgebase.html

Atieno, O. (2009). An analysis of the strengths and limitations of qualitative and

quantitative research paradigms. Problems of Education in the 21st Century,

13(13), 13-18. Retrieved from EBSCOhost.

Attaran, M., & VanLaar, I. (2001). Managing the use of school technology. An eight step

guide for administrators. The Journal of Management Development, 20(5/6), 393-

401. Retrieved from

http://search.proquest.com/docview/216352701?accountid=27965

Baek, Y., Jung, J. & Kim, B. (2008). What makes teachers use technology in the

classroom? Exploring the factors affecting facilitation of technology with a

Korean sample. Computers & Education, 50(1), 224-234. Retrieved from

ScienceDirect Social & Behavioral Science.

Baloğlu, M., & Çevik, V. (2009). A multivariate comparison of computer anxiety levels

between candidate and tenured school principals. Computers in Human Behavior,

25(5), 1102-1107. Retrieved from ScienceDirect Social &

Behavioral Science.

Barber, M. (2004). The virtue of accountability: System redesign, inspection, and

incentives in the era of informed professionalism. Journal of Education, 185(1),

7-38. Retrieved from EBSCOhost.

Barnes, R. (2005). Moving towards technology education: Factors that facilitated

teachers’ implementation of a technology curriculum. Journal of Technology

Education, 17, 6-18. Retrieved from

http://scholar.lib.vt.edu/ejournals/JTE/v17n1/pdf/barnes.pdf

Bass, B. M. (1985). Leadership and performance. New York, NY: Free Press.

Becker, H. J. (2001). How are teachers using computers in instruction? A paper

presented at the 2001 meeting of the American Educational Research Association.

Retrieved from


https://www.msu.edu/course/cep/807/zOld807.1998Gentry/snapshot.afs/*cep240s

tudyrefs/beckeraera2001howtchrsusing.pdf

Bennett, W. J., & Gelernter, D. (2001). Improving education with technology. Retrieved

from http://www.edweek.org/ew/articles/2001/03/14/26bennett.h20.htm

Benson, P., Peltier, G. L., & Matranga, M. (1999). Moving school administrators into the

computer age. Education, 120(2), 326.

Benton Foundation. (2002). Great expectations: Leveraging America's investment in

educational technology. Washington, D.C.: Benton Foundation.

Bertin, C. K. (2006). The impact of information technology on the school system.

National Principals Association Annual Conference. Retrieved from

http://unpan1.un.org/intradoc/groups/public/documents/tasf/unpan024797.pdf

Blake, R. L. (2000). An investigation of technology competence of school-based

administrators in Florida schools. (Doctoral dissertation, University of Central Florida). Retrieved from ProQuest Digital Dissertations database. (AAT

9977808).

Boster, F. J., Meyer, G. S., Roberto, A. J., & Inge, C. C. (2004). A report on the effect of

the unitedstreaming™ application on educational performance. Retrieved from

http://www.southwesterncc.edu/distlearn/tutorials/articles/Effect%20%of%20Unit

edstreaming.pdf

Brockmeier, L. L., Sermon, J. M., & Hope, W. C. (2005). Principals’ relationship with

computer technology. National Association of Secondary School Principals.

NASSP Bulletin, 89(643), 45-63. Retrieved from

http://search.proquest.com/docview/216029417?accountid=27965

Buchanan, T. (2002). Online assessment: Desirable or dangerous? Professional

Psychology: Research and Practice, 33(2), 148-154. doi:10.1037/0735-

7028.33.2.148

Burns, R. (1978). Leadership. New York, NY: Harper & Row.

Bybee, R. W. (2003, May/June). Improving technology education: Understanding

reform—Assuming responsibility. Technology and Engineering Teacher, 62(8),

22-25. Retrieved from

http://www.iteaconnect.org/TAA/Resources/TAA_Articles.html

Byrom, E., & Bingham, M. (2001). Factors influencing the effective use of technology

for teaching and learning: Lessons learned from the SEIR*TEC intensive site

schools. Retrieved from


http://www.serve.org/seir-tec/publications/lessons.pdf

Calgary Board of Education. (2000). Leadership development program. Author.

Carter, T. H. (2003). An analysis of public school administrator perceptions and

attitudes toward technology-based education. Ph.D. dissertation, Clemson

University, United States -- South Carolina. Retrieved from Dissertations &

Theses: Full Text. (Publication No. AAT 3098291).

Cavanaugh, C. (2001). School administrators as educational technology leaders. Florida

ASCD, 1(2), 1-4. Retrieved from

http://www.coe.ufl.edu/faculty/cathycavanaugh/docs/etleader.htm

Center for Applied Research in Educational Technology. (2005). Curriculum and

instruction. Retrieved from

http://caret.iste.org/index.cfm?fuseaction=answers&QuestionID=7

Cey, T. (2001). Moving towards constructivist classrooms. Retrieved from

http://www.usask.ca/education/coursework/802papers/ceyt/ceyt.htm

Chance, P. L., & Chance, E. (2002). Introduction to educational leadership &

organizational behavior: Theory into practice. Larchment, NY: Eye on

Education.

Chang, I. H., Chin, J. M., & Hsu, C. M. (2008). Teachers’ perceptions of the dimensions

and implementation of technology leadership of principals in Taiwanese

elementary schools. Educational Technology & Society, 11(4), 229-245.

Retrieved from http://www.ifets.info/journals/11_4/17.pdf

Chang, I. H., & Wu, Y. (2008). A study of the relationships between principals’

technology leadership and teachers’ teaching efficiency. Journal of Educational

Update, 42(3), 1-8.

Cherry, B. L. (2007). Transformational versus Transactional Leadership. Retrieved from

http://www.succeedtolead.com/pdfs/articles/leadership/Transformational-vs-

Transactional_Leadership.pdf

Coleman, H., & Dickerson, J. (2007). E-Portfolio assessment of school leaders’

evaluation and technology competencies. Retrieved from

http://createconference.org/documents/archive/2007/coleman_dickerson_eval.pdf

Clark, T. (2001). Virtual high schools: State of the states. Macomb, IL: Center for the

Application of Information Technologies, Western Illinois University. Retrieved

from http://www.wested.org/online_pubs/virtualschools.pdf

Colorado State Government. (2005). Technology competence. Retrieved from


http://highered.colorado.gov/Academics/Transfers/gtPathways/Criteria/Competen

cy/technology.pdf

Collaborative for Technology Standards for Administrators. (2001). Technology

standards for school administrators. Retrieved from

http://www.ncrtec.org/pd/tssa/tssa.pdf

Collison, G., Elbaum, B., Haavind, S., & Tinker, R. (2000). Facilitating online learning:

Effective strategies for moderators. Madison, WI: Atwood.

Connor, L. J. (2004). Moving from transactional to transformational leadership in

colleges of agriculture. Retrieved from

http://findarticles.com/p/articles/mi_qa4062/is_200406/ai_n9451836/pg_7/?tag=c

ontent;col1

Consortium for School Networking. (2004). Digital leadership divide: without visionary

leadership, disparities in school technology budgets increase. Washington, DC.

Retrieved from

http://www.cosn.org/Portals/7/docs/digital_leadership_divide.pdf

Cradler, J., McNabb, M., Freeman, M., & Burchett, R. (2002). How does technology

influence student learning? Retrieved from

http://caret.iste.org/caretadmin/news_documents/StudentLearning.pdf

Creech, S. (2010a). T-test. Retrieved from

http://www.statisticallysignificantconsulting.com/Ttest.htm

Creech, S. (2010b). Statistics overview: Descriptive statistics. Retrieved from

http://www.statisticallysignificantconsulting.com/Statistics101.htm

Creighton, T. (2003). The principal as technology leader. Thousand Oaks, CA: Corwin

Press.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods

approaches. Los Angles, CA: Sage.

Culp, K. M., Honey, M., & Mandinach, E. (2003). A retrospective on twenty years of

educational technology policy. Retrieved from

http://courses.ceit.metu.edu.tr/ceit626/week12/JECR.pdf

Dawson, C., & Rakes, G. C. (2003). The influence of principals' technology training on

the integration of technology into schools. Journal of Research on Technology in

Education, 36(1), 29-49. Retrieved from

http://search.proquest.com/docview/274700822?accountid=27965


DeMary, J. L. (2000). Educational objectives to include proficiency in the use of

computers and related technology. Commonwealth of Virginia Department of

Education. Author.

Deubel, P. (2003). An investigation of behaviorist and cognitive approaches to

instructional multimedia design. Journal of Educational Multimedia and

Hypermedia, 12(1), 63-90. Retrieved from

http://www.ct4me.net/multimedia_design.htm#top

Diamond, D. (2008). Leadership attributes bringing distance learning programs to scale.

Retrieved from

http://findarticles.com/p/articles/mi_hb5835/is_200803/ai_n32281698/

Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method. (2nd ed.).

New York, NY: John Wiley and Sons.

Dinham, S. (2005). Principal leadership for outstanding educational outcomes. Journal of

Educational Administration, 43(4/5), 338-356. Retrieved from

http://search.proquest.com/docview/220462050?accountid=27965

Ditzhazy, H. E. R., & Poolsup, S. (2002). Successful integration of technology into the

classroom. The Delta Kappa Gamma Bulletin, 68(3), 10-14. Retrieved from

EBSCOhost.

Dufour, R. (2001, Winter). In the right context. Journal of Staff Development, 22(1), 14-

17. Retrieved from

http://www.nsdc.org/news/getDocument.cfm?articleID=297

Dugger, W. E. (2007). The status of technology education in the United States.

Technology and Engineering Teacher, 67(1), 14-21. Retrieved from

http://www.iteaconnect.org/TAA/Resources/TAA_Articles.html

Earle, R. S. (2002). The integration of instructional technology into public education:

Promises and challenges. Educational Technology, 42(1), 5-13. Retrieved from

http://asianvu.com/bookstoread/etp/earle.pdf

Ed Tec Action Network. (2009). Why technology in schools. Retrieved from

http://www.edtechactionnetwork.org

Egan, J. (2007). Marketing communication. London: Cengage Learning EMEA.

Egbert, J., Paulus, T. M., & Nakmichi, Y. (2002). The impact of CALL instruction on

classroom computer use: A foundation for rethinking technology in teacher

education. Learning & Technology, 6(3), 108-126.

Ertmer, P.A., Bai, H., Dong, C., Khalil, M., Park, S. H., & Wang, L. (2002). Technology


leadership: Shaping administrators’ knowledge and skills through an online

professional development course. Retrieved from

http://www.edci.purdue.edu/ertmer/docs/SITE02_TIPDOC_paper.PDF

Eveleth, L. B., Eveleth, D. M., O’Neill, D. M., & Stone, R. W. (2006). Enabling laptop

exams using secure software: Applying the technology acceptance model. Journal

of Information Systems Education, 17(4), 413-420. Retrieved from EBSCOhost.

Field, A. (2009). Discovering statistics using SPSS (3rd ed.). Thousand Oaks, CA: Sage.

Finn, J. D. (1953). Television and education: A review of research. Educational

Technology Research and Development, 1(2), 106-126.

Flanagan, L., & Jacobsen, M. (2003). Technology leadership for the twenty-first century

principal. Journal of Educational Administration, 41(2), 124-142. Retrieved from

http://search.proquest.com/docview/220454714?accountid=27965

Flowers, C. P., & Algozzine, R. F. (2000). Development and validation of scores on the

basic technology competencies for educators inventory. Educational and

Psychological Measurement, 60(3), 411-418. doi: 10.1177/00131640021970628

Fowler, F. J. (2009). Survey research methods (4th ed.). Thousand Oaks, CA: SAGE.

Fraenkel, J. & Wallen, N. E. (2008). How to design and evaluate research in education

(7th ed.). San Francisco, CA: McGraw-Hill.

Friehs, B. (2009). Knowledge management in educational settings. Retrieved from

http://www.see-educoop.net/education_in/pdf/erasmus2009-oth-enl-t03.pdf

Fullan, M. (2001). Leading in a culture of change. San Francisco, CA: Jossey-Bass.

Fullan, M. (2003). The moral imperative of school leadership. Toronto, ON: Corwin

Press.

Gahala, J. (2001). Critical issue: Promoting technology use in schools. Retrieved from

http://www.ncrel.org/sdrs/areas/issues/methods/technlgy/te200.htm

Gallagher, C. W., & Ratzlaff, S. (2008). The road less traveled. Educational Leadership,

65(4), 48-53. Retrieved from EBSCOhost.

Gall, M. D., Gall, J. P., & Borg, W. R. (2003). Educational research: An introduction.

(7th ed.). Boston, MA: Allyn & Bacon.

Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction.

(8th ed.). Boston, MA: Pearson.


Gardner, J. (2000). The nature of literature. In Educational Leadership (pp.3-12). San

Francisco, CA: Jossey-Bass.

Garson, G. D. (2009). Survey research. Retrieved from

http://faculty.chass.ncsu.edu/garson/PA765/survey.htm

Gavin, D. (2002). How should administrators support teachers in the integration of

technology? Retrieved from

http://www.usask.ca/education/coursework/802papers/gavin/index.htm

Gay, L. R. & Airasian, P. (2000). Educational research: Competencies for analysis and

application. Upper Saddle River, NJ: Prentice Hall.

Geer, C. (2002). Technology training for school administrators: A real world approach.

TechTrends, 46(6), 56-59.

Gibson, I. W. (2002). Leadership, technology, and education: Achieving a balance in new

school leader thinking and behavior in preparation for twenty-first century global

learning environments. Technology, Pedagogy and Education, 11(3), 315-334.

Retrieved from EBSCOhost.

Goddard, M. (2002). What do we do with these computers? Reflections on technology

into the classroom. Journal of Research on Technology in Education, 35(1), 19-

26. Retrieved from

http://search.proquest.com/docview/274702669?accountid=27965

Golden, M. (2004). Technology’s potential promise for enhancing student learning. T. H.

E. Journal, 31(12), 42-44. Retrieved from

http://search.proquest.com/docview/214829456?accountid=27965

Goldring, E., & Greenfield, W. D. (2002). Understanding the evolving concept of

leadership in education: Roles, expectations, and dilemmas. In J. Murphy (Ed.).

The Educational Leadership Challenge: Redefining Leadership for the 21st

Century. (Vol. 1, pp. 1-19). National Society for the Study of Education.

Gosmire, D., & Grady, M. (2007). A bumpy road: Principal as technology leader.

Principal Leadership, 7(6), 17-21. Retrieved from

http://search.proquest.com/docview/233321553?accountid=27965

Grant, M. M. (2002). Getting a grip on project-based theory learning: Theory, cases, and

recommendations. Meridian: A Middle School Computer Technologies Journal,

5(1), 1-3. Retrieved from

http://www.ncsu.edu/meridian/win2002/514/index.html


Gurr, D. (2000). School principals and information and communication technology.

Retrieved from

http://staff.edfac.unimelb.edu.au/~davidmg/papers/Gurr_Conf_Paper.pdf

Gurr, D. (2001). Principals, technology, and change. Retrieved from

http://technologysource.org/article/principals_technology_and_change

Gurr, D., Drysdale, L. & Mulford, B. (2006). Models of successful principal leadership.

School Leadership and Management, 26(4), 371-395.

doi: 10.1080/13632430600886921

Hardy, C. (2005). A study of Midwest students’ technology skills. (Doctoral dissertation,

University of Nebraska). Retrieved from http://dwb4.unl.edu/Diss/Hardy/intro.pdf

Harris, A. (2002). School improvement: What’s in it for schools? London: Routledge.

Hayes, L. S. (2004). Methods used to determine technology competence for Virginia

teachers. (Doctoral dissertation, Virginia Polytechnic Institute and State

University). Retrieved from

http://scholar.lib.vt.edu/theses/available/etd-04262004-

163156/unrestricted/FinalDissertation.pdf

Heinich, R., Molenda, M., Russell, J. D., & Smaldino, S. (2002). Instructional media and

technologies for learning (7th ed.). Upper Saddle, NJ: Merrill.

Herrero, J., & Meneses, J. (2006). Short Web-based versions of the perceived stress

(PSS) and the center for epidemiological studies-depression CESD scales: A

comparison to pencil and paper responses among Internet users. Computers in

Human Behavior, 22(5), 830-846. Retrieved from ScienceDirect Social &

Behavioral Science.

Holland, L., & Moore-Steward, T. (2000). A different divide: Preparing tech-savvy

leaders. Leadership, 30(1), 37-38.

Ho, J. (2006). Technology leadership. Singapore: Educational Technology Division,

Ministry of Education. Retrieved from

http://iresearch.edumall.sg/iresearch/slot/u110/litreviews/techn_leadership%5B1

%5D.pdf

Hope, W. C., & Brockmeier, L. L. (2002). Principals’ self-report of their computer

technology expertise. In F. K. Kochan & C. J. Reed (Eds.). Accountability:

Education and Educational Leaders Under a Microscope (pp. 57-64). Auburn,

AL: Truman University, Pierce Institute.

Hopkins, W. G. (2000). Quantitative research design. Retrieved from


http://www.sportsci.org/jour/0001/wghdesign.html

Howell, J., Miller, P., Park, H. H., Sattler, D., Schack, T., Spery, E., Widhalm, S., &

Palmquist, M. (2005). Reliability and validity. Retrieved from

http://writing.colostate.edu/guides/research/relval

Hughes, M., & Zachariah, S. (2001). An investigation into the relationship between

effective administrative leadership styles and the use of technology. International

Electronic Journal for Leadership in Learning, 5(5), 1-8. Retrieved from

http://www.ucalgary.ca/iejll/hughes_zachariah

Hunnicutt, C. (2008). Successful school administrators. Retrieved from

http://pcourses.teacherswithoutborders.org/leadership-in-education/integrative-

thinking-and-school-leadership/successful-school-administrators

Hursh, D. (2005). The growth of high stakes testing in the USA: Accountability, markets

and the decline of educational quality. British Educational Researcher, 20, 2-7.

International Society for Technology in Education. (2000). National educational

technology standards for teachers. Retrieved from

http://cnets.iste.org/teachers/t_stands.html

International Society for Technology in Education. (2002). National educational

technology standards for administrators. Retrieved from

http://www.iste.org/AM/Template.cfm?Section=NETS

International Society for Technology in Education. (2009). Technology and student

Achievement—The indelible link. Retrieved from

http://www.iste.org/Content/NavigationMenu/Advocacy/Policy/59.08-

PolicyBrief-F-web.pdf

International Technology Education Association. (2006). Technological literacy for all:

A rationale and structure for the study of technology. Reston, VA: Author.

Januszewski, A. (2001). Educational technology: The development of a concept.

Englewood, CO: Libraries Unlimited.

Jenkins, L. (2009). Fundamentals of quantitative research: Considerations in research

methodology. Retrieved from

http://academicwriting.suite101.com/article.cfm/fundamentals_of_quantitative_re

search

Johnson, B., & Christiansen, L. (2008). Educational research: Quantitative, qualitative,

and mixed approaches (3rd ed.). Thousand Oaks, CA: Sage.


Johnson, D. (2004). Ban or boost student-owned technology? School Administrator,

61(10), 8. Retrieved from

http://search.proquest.com/docview/219261963?accountid=27965

Johnson, R. B. (2010). Sampling. Retrieved from

http://www.southalabama.edu/coe/bset/johnson/lectures/lec7.htm

Johnson, T. (2002). Research of learning: Theories as to distance education.

Retrieved from

http://ts010.k12.sd.us/Portfolio/712_Assign/Formal_Paper.doc

Jung, D. I., Chow, C., & Wu, A. (2003). The role of transformational leadership in

enhancing organizational innovation: Hypotheses and some preliminary findings.

The Leadership Quarterly, 14(4/6), 525-544.

doi:10.1016/S1048-9843(03)00050-X

Lowther, D. L., Strahl, J. D., Inan, F. A., & Bates, J. (2007). Freedom to Learn

program: Michigan 2005–2006 evaluation report. Memphis, TN: Center for

Research in Educational Policy.

Karagiorgi, Y., & Symeou, L. (2005). Translating constructivism into instructional design: Potential and limitations. Educational Technology & Society, 8(1), 17-27.

Retrieved from http://www.ifets.info/journals/8_1/5.pdf

Kearsley, G. (1998). Educational technology: A critique. Educational Technology, 38(2),

47-51.

Kearsley, G. (2000). Learning and teaching in cyberspace. Belmont, CA: Wadsworth.

Kearsley, G., & Blomeyer, R. (2004a). Preparing school administrators for online

learning. Retrieved from

http://home.sprynet.com/~gkearsley/MOLarticle_Oct04.htm

Kearsley, G., & Blomeyer, R. (2004b). Preparing K-12 teachers to teach online.

Educational Technology, 44(1), 49-52. Retrieved from

http://home.sprynet.com/~gkearsley/TeachingOnline.htm

Keengwe, J. (2007). Faculty integration of technology into instruction and students’

perceptions of computer technology to improve student learning. Journal of

Information Technology Education, 6, 1-12. Retrieved from

http://informingscience.org/jite/documents/Vol6/JITEv6p169-

180Keengwe218.pdf

Key, J. P. (1997). Research design in occupational education: Reliability and validity.


Retrieved from

http://www.okstate.edu/ag/agedcm4h/academic/aged5980a/5980/newpage18.htm

Khosrow-Pour, M. (2006). Emerging trends and challenges in information technology.

London: Idea Group.

Ko, S., & Rossen, S. (2001). Teaching online: A practical guide. Boston, MA: Houghton

Mifflin.

Kozloski, K. C. (2006). Principal leadership for technology integration: A study of

principal technology leadership. (Doctoral dissertation, Drexel University).

Retrieved from

http://www.iste.org/Content/NavigationMenu/Research/NECC_Research_Paper_

Archives/NECC_2007/Kozloski_Kristen_N07.pdf

Kozma, R. (2003). Technology innovation and educational change: A global perspective.

Eugene, OR: International Society for Technology in Education.

Lamb, A. (2001). Lumberjack leadership: School administrators and technology

integration. Retrieved from

http://eduscapes.com/sessions/lumber/index.htm

Lane, D. M. (2010). Descriptive statistics. Retrieved from

http://davidmlane.com/hyperstat/A28521.html

Lan, J. (2001). Web-based instruction for education faculty: A needs assessment. Journal

of Research on Computing in Education, 33(4), 385-399. Retrieved from

EBSCOhost.

Langlie, N. (2008). Educational technology leaders: Competencies for a conceptual age.

(Doctoral dissertation, Capella University). Retrieved from ProQuest Digital

Dissertations database. (AAT 3320348).

Law, J. P. (2002). What effect of West Virginia principals’ leadership styles, their levels

of computer anxiety, and selected personal attributes upon their levels of

computer use. Retrieved from

http://wvuscholar.wvu.edu:8881//exlibris/dtl/d3_/apache_media/6319.pdf

Lay, C. L. (2007). Smaller isn’t always better: School size and school participation

among young people. Social Science Quarterly (Blackwell Publishing Limited),

88(3), 790-815. doi:10.1111/j.1540-6237.2007.00483.x

Leedy, P. D., & Ormrod, J. E. (2001). Practical research: Planning and design. (7th ed.).

Upper Saddle, NJ: Merrill Prentice.


Leedy, P. D., & Ormrod, J. E. (2005). Practical research: Planning and design (8th ed.).

Upper Saddle River, N.J.: Pearson Education.

Leithwood, K., & Jantzi, D. (2006). Transformational school leadership and large-

scale reform: Effects on students, teachers, and their classroom practices. School

Effectiveness and School Improvement, 17(2), 201-227. Retrieved from

EBSCOhost.

Leithwood, K., & Riehl, C. (2003). What we know about successful school leadership.

United Kingdom: National College for School Leadership.

Lemke, C., Coughlin, E., & Reifsneider, D. (2009). Technology in schools. Retrieved

from

http://www.cisco.com/web/strategy/docs/education/TechnologyinSchoolsReport.p

df

Levitin, A. V., & Redman, T. C. (1998). Data as a resource: Properties, implications, and

prescriptions. Sloan Management Review, 40(1), 89-101. Retrieved from

EBSCOhost.

Lloyd, G. (2003). Mathematics teachers’ beliefs and experience with innovative

curriculum in teacher development. Mathematics Education Library, 31, Part 2,

149-159. DOI: 10.1007/0-306-47968.

Lukin, L. E., Bandalos, D. L., Eckhout, T. J., & Mickelson, K. (2004). Facilitating the

development of assessment literacy. Educational Measurement: Issues and

Practice, 23(2), 26-32. doi: 10.1111/j.1745-3992.2004.tb00156.x

Macaulay, L. (2008). Elementary principals as technology instructional leaders.

Retrieved from

http://www.iste.org/Content/NavigationMenu/Research/NECC_Research_Paper_

Archives/NECC2009/Macaulay_NECC09.pdf

MacVie, L. (2009). Paradigms in learning theory. Retrieved from

http://pcourses.teacherswithoutborders.org/educational-technology/educational-

technologies-in-learning-theories/paradigms-in-learning-theory/behaviorism-and-

instructional-technology/view

Martin, R. (2007). How successful leaders think. Harvard Business Review, 85(6), 60-67.

Retrieved from EBSCOhost.

Martin, W., Gersick, A., Nudell, H., & Culp, K. M. (2002). An evaluation of Intel Teach to the Future: Year two final report. New York, NY: Center for Children and

Technology.


Maurer, M. (1994). Computer anxiety correlates and what they tell us: A literature

review. Computers in Human Behavior, 10(3), 369-376. Retrieved from

ScienceDirect Social & Behavioral Science.

Mazzeo, C. (2004). Improving teaching and learning by improving school leadership.

Retrieved from http://www.nga.org/cda/files/091203LEADERSHIP.pdf

McCarthy, A. (2009). Technology integration in elementary, middle, and high school.

Retrieved from

http://www.dpi.state.nc.us/racg/briefs/techintegration

McIlroy, D. Bunting, B., Tierney, K., & Gordon, M. (2001). The relation of gender and

background experience to self-reported computing anxieties and cognition.

Computers in Human Behavior, 17(1), 21-33. Retrieved from

ScienceDirect Social & Behavioral Science.

McLeod, S., Hughes, J. E., Richardson, J., Dikkers, A.G., Becker, J., Quinn, D., Logan,

J., & Mayrose, J. (2005). Building capacity for technology leadership in

educational administration preparation programs. Retrieved from

http://www.schooltechleadership.org/uploaded/Documents/2005_AERA/2005_ST

LI_P3_Paper_Draft.pdf

McLester, S. (2003). Keeping staff focused and motivated. Technology and Learning,

23(11), 15.

McNeil, S. (2008). What is instructional design? Retrieved from

http://www.coe.uh.edu/courses/cuin6373/index.html

Meade, S. D., & Dugger, W. E. (2004). Reporting the status of technology education in

the United States. Technology and Engineering Teacher, 64(2), 29-35. Retrieved

from http://search.proquest.com/docview/235319051?accountid=27965

Mehlinger, H. D., & Powers, S. M. (2002). Technology and teacher education: A guide

for educators and policymakers. Boston, MA: Houghton Mifflin.

Mentz, E., & Mentz, K. (2003). Managing technology integration into schools: A South

African perspective. Journal of Educational Administration, 41(2), 186-200.

Retrieved from http://search.proquest.com/docview/220461036?accountid=27965

Mersdorf, S. (2009). Qualitative and quantitative research methods. Retrieved from

http://survey.cvent.com/blog/cvent-web-surveys-blog/0/0/qualitative-vs-

quantitative-research-methods

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. (2nd ed.). Thousand

Oaks, CA: Sage.


Milne, J. (1999). Questionnaires: Advantages and disadvantages. Retrieved from

http://www.icbl.hw.ac.uk/ltdi/cookbook/info_questionnaires/index.html

Mojkowski, C. (2000, February). The essential role of principals in monitoring

curriculum implementation. National Association of Secondary School Principals.

NASSP Bulletin, 84(613), 76-83. Retrieved from

http://search.proquest.com/docview/216044401?accountid=27965

Moskal, B. (2010). Useful concepts in qualitative and quantitative research. Retrieved

from http://www.succeed.ufl.edu/icee/workshops/MoskalWorkshopFl.pdf

Moskowitz, S., & Martabano, S. (2008). Administrators assessing the effectiveness of

technology. Retrieved from

http://www.schoolcio.com/showarticle/15678

Moursund, D. (2007). School administrators. Oregon Technology in Education Council.

Retrieved from

http://otec.uoregon.edu/school_administrators.htm#Leadership

Muijs, D. (2004). Doing quantitative research in education with SPSS. Thousand Oaks,

CA: Sage.

Nanjappa, A., & Grant, M. M. (2003). Constructing on constructivism: The role of

technology. Electronic Journal for the Integration of Technology Education, 1-11.

Retrieved from http://ejite.isu.edu/Volume2No1/nanjappa.htm

National Center for Education Statistics. (2002a). Technology in schools. Retrieved

from http://nces.ed.gov/pubs2003/tech_schools/chapter4.asp#top

National Center for Education Statistics. (2002b). Technology in schools. Retrieved from

http://nces.ed.gov/pubs2003/tech_schools/chapter7.asp#top

National Center for Education Statistics. (2004c). Technology in schools. Retrieved

from http://nces.ed.gov/pubs2003/2003313.pdf

National Conference of State Legislatures. (2010). School leadership. Retrieved from

http://www.ncsl.org/IssuesResearch/Education/SchoolLeadershipOverivew/tabid/

12893/D

National Policy Board for Educational Administration. (2002). Standards for advanced

programs in educational leadership for principals, superintendents, curriculum

directors, and supervisors. Washington, DC. Retrieved from

http://www.npbea.org/ELCC/ELCCStandards%20_5-02.pdf


National School Boards Foundation. (2002). Are we there yet? Research and guidelines

on schools' use of the Internet. Retrieved from

http://caret.iste.org/index.cfm?fuseaction=studySummary&StudyID=980

Neill, J. (2003). Quantitative research design: Sampling and measurement. Retrieved

from

http://wilderdom.com/OEcourses/PROFLIT/Class5QuantitativeResearchDesignS

amplingMeasurement.htm

Neill, J. (2007). Qualitative versus quantitative research: Key points in a classic debate.

Retrieved from

http://wilderdom.com/research/QualitativeVersusQuantitativeResearch.html

Nesbary, D. K. (2000). Survey research and the World Wide Web. Boston, MA: Allyn

and Bacon.

Nettles, S. M., & Herrington, C. (2007). Revisiting the importance of the direct effects of

school leadership on student achievement: The implications for school

improvement policy. Peabody Journal of Education, 82(4), 724-736. Retrieved

from EBSCOhost.

Noeth, R. J., & Volkov, B. B. (2004). Evaluating the effectiveness of technology in our

schools: ACT policy report. Retrieved from

http://www.act.org/research/policymakers/pdf/school_tech.pdf

Noonan, B., & Renihan, P. (2006). Demystifying assessment leadership. Canadian

Journal of Educational Administration and Policy, 56. Retrieved from

http://www.umanitoba.ca/publications/cjeap/articles/noonan.html

O’Donnell, R. J., & White, G. P. (2005). Within the accountability era: Principals’

instructional leadership behaviors and student achievement. National Association

of Secondary School Principals. NASSP Bulletin, 89(645), 56-71. Retrieved from

http://search.proquest.com/docview/216022467?accountid=27965

O’Dwyer, L. M., Russell, M., & Bebell, D. J. (2004, September). Identifying teacher,

school, and district characteristics associated with elementary teachers use of

technology: A multilevel perspective. Education Policy Analysis Archives,

12(48), 1-33. Retrieved from http://epaa.asu.edu/ojs/article/view/203


Orr, M. T., & Barber, M. E. (2006). Collaborative leadership preparation: A comparative

study of partnership and conventional programs and practices. Journal of School

Leadership, 16(6), 703-709.


Overview for Sampling Procedures. (2003). Fairfax County Department of Systems

Management for Human Services. Retrieved December 10, 2010, from

www.fairfaxcounty.gov/demogrph

Paben, S. (2002). What’s in it for the busy leader? Journal of Staff Development, 23(1),

24-27.

Padgett, S. (2009). Why technology in schools. Retrieved from

http://www.edtechactionnetwork.org/why-technology-in-schools

Page-Jones, A. B. (2008). Leadership behavior and technology activities: The

relationship between principals and technology use in schools. (University of

Central Florida). ProQuest Dissertations and Theses, Retrieved from

http://search.proquest.com/docview/304371319?accountid=27965

Palloff, R., & Pratt, K. (2001). Lessons from the cyberspace classroom: The realities of

online teaching. San Francisco, CA: Jossey-Bass.

Patten, M. L. (2007). Understanding research methods: An overview of the essentials.

(6th ed.). Glendale, CA: Pyrczak.

Pelgrum, W. J., & Law, N. (2003). Organizational change and leadership ICT in

education around the world: Trends, problems and prospects. Paris, France:

United Nations Educational, Scientific and Cultural Organization.

Perez, L. G., & Uline, C. L. (2003). Administrative problem solving in the information

age: Creating technological capacity. Journal of Educational Administration,

41(2), 143-157. Retrieved from

http://search.proquest.com/docview/220454885?accountid=27965

Perez, M. J., & Normore, A. H. (2004). Technology integration and school leadership. In

S. Nelson, T. Rocco, & M. S. Plakhotnik, (Eds.) COERC 2004: Proceedings of

the Third Annual College of Education Research Conference (pp.104-108).

Miami, FL: Florida International University.

Perez-Prado, A., & Thirunarayanan, M. (2002). A qualitative comparison of online and

classroom-based sections of a course: Exploring student perspectives. Education

Media International, 39(2), 195-202.

Peters, G. S., & Carey, K. M. (2010). Innovative curriculum in distance learning: An Ohio case study. Holim Son (Texas Southern University). Hershey, PA: IGI

Global.

Petrides, L. A., & Guiney, S. Z. (2002). Knowledge management for school leaders: An

ecological framework for thinking schools. Retrieved from

http://www.iskme.org/what-we-do/publications/thinking_schools.pdf


Picciano, A. G. (2006). Descriptive research. Retrieved from

http://www.hunter.cuny.edu/edu/apiccian/edstat06.html

Pierson, M. (2001). Technology integration practice as a function of pedagogical

expertise. Journal of Research on Computing in Education, 33(4), 413-430.

Retrieved from EBSCOhost.

Popham, W. (2005). Assessment for educational leaders. Boston, MA: Allyn & Bacon.

Pruitt, C. (2005). Technology and student achievement. Principal, 85(2), 46-48.

Quality Indicators for Assistive Technology Services. (2006). Administrators guide to

effective technology. Retrieved from

http://natri.uky.edu/assoc_project/qiat/resources.html

Quirk, M. (2009). Standards for effective teaching: Technology. Retrieved from

http://eportfolios.ithaca.edu/mquirk1/programstandards/technology/

Rakes, G., & Dawson, C. (2003). The influence of principals’ technology training on

integration of technology into schools. Journal of Research on Technology in

Education, 36(1), 29-49.

Redish, T., & Chan, T. C. (2001). Technology leadership: Aspiring administrators

perceptions of their leadership preparation program. Retrieved from

http://ejite.isu.edu/Volume6/Reddish.pdf

Reiser, R. A., & Dempsey, J. V. (2007). Trends and issues in instructional design. (2nd

ed.). Upper Saddle River, NJ: Pearson Education.

Richey, R. C. (2008). Reflections on the 2008 AECT definitions of the field. TechTrend,

52(1), 24-25.

Riedl, R., Smith, T., Ware, A., & Yount, P. (1998). Leadership for technology-rich

educational environment. Charlottesville, VA: Society for Information

Technology and Teacher Education. (ERIC Document Reproduction Service No.

ED 421128).

Riggio, R. E. (2009). Soaring with eagles: Leadership, collaboration, and vision. Leadership Quarterly, 13(5), 545.

Ringstaff, C., & Kelley, L. (2002). The learning return on our educational technology

investment. San Francisco: WestEd. Retrieved from

http://www.wested.org/cs/we/view/rs/619

Roblyer, M. D., & Doering, A. H. (2010). What is educational technology? Retrieved


from http://www.education.com/reference/article/what-educationatechnology

Roschelle, J. M., Pea, R. D., Hoadley, C. M., Gordin, D. N., and Means, B. M.

(2000). Changing how and what children learn in school with computer-based

technologies. The Future of Children, 10(2), 76-101.

Rose, L. C., & Dugger, W. C. (2002). ITEA/Gallup poll reveals what Americans

think about technology. The Technology and Engineering Teacher, 61(6), I1-I1

Retrieved from

http://search.proquest.com/docview/235291890?accountid=27965

Ross, J., McGraw, T., & Burdette, K. (2001). Toward an effective use of technology in

education: A summary of research. Charleston, WV: Institute for the

Advancement of Emerging Technologies in Education at AEL.

Ross, S. M., & Strahl, J. D. (2005). Evaluation of Michigan’s Freedom to Learn Program. Memphis, TN: Center for Research in Educational Policy. Retrieved from http://www.ftlwireless.org

Saettler, P. (Ed.). (2004). The evolution of American educational technology. Charlotte,

NC: Information Age.

Sanchez, A. (2006). The difference between qualitative and quantitative research.

Retrieved from http://e-articles.info/e/a/title/THE-DIFFERENCE- BETWEEN-

QUALITATIVE-AND-QUANTITATIVE-RESEARCH/

Scanga, D. (2004). Technologies competencies for school administrators: Development

and validation study of a self-assessment instrument. (Doctoral dissertation,

University of South Florida). Retrieved from ProQuest Digital Dissertation

database. (AAT 3121036).

Schepers, J., & Wetzels, M. (2005). Leadership styles in technology acceptance: Do

followers practice what leaders preach? Managing Service Quality, 15(6), 496-

508. Retrieved from http://jjlsite.onward.nl/pdf/JeroenSchepers.nl%20-

%20Schepers%20et%20al%202005%20-%20Leadership%20styles.pdf

Schiller, J. (2003). Working with ICT: Perceptions of Australian principals. Journal of

Educational Administration, 41(2), 171-185. Retrieved from

http://search.proquest.com/docview/220428612?accountid=27965

Schmeltzer, T. (2001). Training administrators to be technology leaders. Retrieved

from http://www.techlearning.com/article/18648

Schuman, L. (1996). Perspectives on instruction. Retrieved from

http://edweb.sdsu.edu/courses/edtec540/Perspectives/Perspectives.html


Schlögl, C. (2005). Information and knowledge management: Dimensions and

approaches. Information Research, 10(4), 1-14. Retrieved from

http://informationr.net/ir/10-4/paper235.html

Seels, B. B., & Richey, R. C. (1994). Instructional technology: The definition and

domains of the field. Washington, DC: Association for Educational

Communications and Technology.

Serhan, D. (2007). School principals’ attitudes towards the use of technology: United

Arab Emirates technology workshop. The Turkish Online Journal of Educational

Technology, 6(2), 42-46. Retrieved from http://www.tojet.net/articles/625.pdf

Sheehan, K., & Sheehan, K. B. (1999). Using e-mail to survey internet users in the

United States: Methodology and assessment. Retrieved from

http://sysurvey.com/tips/using_e-mail_to_survey.htm

Shepard, L. A. (2000). The role of assessment in a learning culture. Educational

Researcher, 29(7), 4-14. doi:10.3102/0013189X029007004

Shuttleworth, M. (2008). Descriptive research design. Retrieved from

http://www.experiment-resources.com/descriptive-research-design.html

Shuttleworth, M. (2009). Construct validity. Retrieved from

http://www.experiment-resources.com/construct-validity.html

Sivin-Kachala, J. & Bialo, E. (2000). 2000 research report on the effectiveness of

technology in schools. (7th ed.). Washington, DC: Software and Information

Industry Association.

Slowinski, J. (2000). Becoming a technologically savvy administrator. Retrieved from

http://cepm.uoregon.edu/pblications/digests/digest135.html

Slowinski, J. (2003). Becoming a technologically savvy administrator. Teacher

Librarian, 30(5), 25-29. Retrieved from

http://search.proquest.com/docview/224881125?accountid=27965

Smith, G., Ferguson, D., & Caris, M. (2001). Teaching college courses online vs. face-to-

face: Technology horizons in education. T. H. E. Journal, 28(9), 18-26. Retrieved

from http://search.proquest.com/docview/214800558?accountid=27965

Smylie, M. A., Conley, S., & Marks, H. M. (2002). Exploring new approaches to teacher

leadership for school improvement. The Laboratory for Student Success Review,

2(2), 18-19.


South East Initiatives Regional Technology in Education Consortium. (2000). Factors

influencing the effective use of technology for teaching and learning: Lessons

learned from the SEIR*TEC intensive site schools. Retrieved from

http://www.serve.org/seir-tec/publications/lessons.pdf

Stallone, M. (2003). Nutshell-Educational research and statistics. Retrieved from

http://education.tamuk.edu/kfmns00/COMPS/Educational%20Research%20and%

20Statistics.doc

State Educational Technology Directors Association. (2007). Maximizing the Impact:

The Pivotal Role of Technology in a 21st Century Education System. A report

from the International Society for Technology in Education, The Partnership for

21st Century Skills, and the State Educational Technology Directors Association.

Retrieved from http://www.setda.org/web/guest/maximizingimpactreport

Stewart, J. (2006). Transformational leadership: An evolving concept examined through

the works of Burns, Bass, Avolio, and Leithwood. Canadian Journal of

Educational Administration and Policy, 54, 1-12. Retrieved from

http://www.umanitoba.ca/publications/cjeap/articles/stewart.html

Stuart, L. H., Mills, A. M., & Remus, U. (2009). School leaders, ICT competence and championing innovations. Computers & Education, 53(3), 733-741. Retrieved

from ScienceDirect Social & Behavioral Science.

Tan, S. C. (2010). School technology leadership: Lessons from empirical research.

Retrieved from

http://www.ascilite.org.au/conferences/sydney10/procs/Seng_chee_tan-full.pdf

Technology Standards for School Administrators. (2001). Collaborative for technology

standards for school administrators. Retrieved from

http://www.ncrtec.org/pd/tssa/tssa.pdf

Testerman, J. C., Flowers, C. P., & Algozzine, B. (2001). Basic technology competencies

of educational administrators. Contemporary Education, 72(2), 58-63. Retrieved

from http://search.proquest.com/docview/233025735?accountid=27965

Testerman, J. K., & Hall, H. D. (2001). The electronic portfolio: A means of preparing

leaders for application of technology in education. Journal of Educational

Technology Systems, 29(3), 193-206.

Thompson, A. D., Schmidt, D. A., & Davis, N. E. (2003). Technology collaborative for

simultaneous renewal in teacher education. Educational Technology, Research,

and Development, 51(1), 73-89.

Thorburn, D. (2004). Technology integration and educational change: Is it possible?


Retrieved from

http://www.usask.ca/education/coursework/802papers/thorburn/index.htm

Tongco, D. C. (2007). Purposive sampling as a tool for informant selection. Retrieved

from http://www.erajournal.org/ojs/index.php/era/article/viewArticle/126

Tooms, A., Acomb, M., & McGlothlin, J. (2004). The paradox of integrating handheld technology in schools: Theory vs. practice. T.H.E. Journal, 32(4), 14, 18, 20, 24.

Retrieved from http://search.proquest.com/docview/214821482?accountid=27965

Trochim, W. M. K. (2006). Measurement validity types. Retrieved from

http://www.socialresearchmethods.net/kb/measval.php

Tuckman, B. W. (2002). Evaluating ADAPT: A hybrid instructional model combining

web-based and classroom components. Computers & Education, 39(3), 261-269.

Retrieved from ScienceDirect Social & Behavioral Science.

United Nations Educational Scientific and Cultural Organization. (2002). Information

and communication technology in education: A curriculum for schools and

program of teacher development. Retrieved from

http://unesdoc.unesco.org/images/0012/001295/129538e.pdf

United Nations Educational Scientific and Cultural Organization. (2005). Information

and communication technologies in schools: A handbook for teachers: How ICT

can create new, open learning environments. Retrieved from

http://unesdoc.unesco.org/images/0013/001390/139028e.pdf

United States Department of Education. (2000). E-learning: Putting a world-class

education at the fingertips of all children. Washington, DC. Retrieved from

http://www.eric.ed.gov/ERICWebPortal/custom/portlets/recordDetails/detailmini.

jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED444604&ERICExtSearc

h_SearchType_0=no&accno=ED444604

Vail, K. (2003). School technology grows up: Good-bye to the gee-whiz – the new

generation of ed tech is all about solutions. American School Board Journal,

190(9), 34-37. Retrieved from EBSCOhost.

Valdez, G. (2004). Critical Issue: Technology leadership: Enhancing positive

educational change. North Central Regional Educational Laboratory. Retrieved

from http://www.ncrel.org/sdrs/areas/issues/educatrs/leadrshp/le700.htm

Valdez, G. M., McNabb, M., Foertsch, M., Anderson, M., & Raack, L. (2000).

Computerbased technology and learning: Evolving uses and expectations.

Retrieved from http://www.ncrel.org/tplan/cbtl/toc.htm


Venezky, R. (2004). Technology in the classroom: Steps toward a new vision.

Education, Communication, and Information, 4(1), 3-21.

Vogt, P. W. (2007). Quantitative research methods for professionals in education and

other fields. Needham Heights, MA: Allyn & Bacon.

Volante, L., & Cherubini, L. (2007). Connecting educational leadership with multi-level

assessment reform. International Electronic Journal for Leadership in Learning,

11(12). Retrieved from http://www.ucalgary.ca/iejll/vol11/volante

Volante, L., Cherubini, L., & Drake, S. (2008). Examining factors that influence school

administrators’ responses to large-scale assessment. Canadian Journal of

Educational Administration and Policy, 84, 1-17. Retrieved from

http://www.umanitoba.ca/publications/cjeap/articles/volante_etal.html

Wallen, N. E., & Fraenkel, J. R. (2001). Educational research: A guide to the process.

(2nd ed.). Mahwah, NJ: Lawrence Erlbaum.

Wagner, K.V. (2009). Transformational leadership. Retrieved from

http://psychology.about.com/od/leadership/a/transformational.htm

Wang, C. (2010). Technology leadership among school principals: A technology-

coordinator’s perspective. Asian Social Science, 6(1), 1-4. Retrieved from

http://www.ccsenet.org/journal/index.php/ass/article/viewFile/4774/4018

Warwick, D. P. & Lininger, C. A. (1975). The sample survey: Theory and practice. New

York, NY: McGraw Hill.

Wasson, J. (2002). Descriptive research. Retrieved from

http://www.mnstate.edu/wasson/ed603/ed603lesson10.htm

Waters, T., Marzano, R. J., & McNulty, B. (2003). Balanced leadership: What 30 years of

research tells us about the effect of leadership on student achievement. Aurora,

CO: Mid-Continent Research for Education and Learning.

Webb, L. (2011). Supporting technology integration: The school administrators’ role.

National Forum of Educational Administration & Supervision Journal, 24(4).

Retrieved from

http://www.nationalforum.com/Electronic%20Journal%20Volumes/Webb,%20Lo

rie%20Supporting%20Technology%20Integration%20NFEASJ%20V28%20N4%

202011.pdf

Wenzel, G. (2009). Why school administrators need to know about distance learning: A

college professor’s perspective. Retrieved from

http://www.westga.edu/~distance/ojdla/spring21/wenzel21.pdf


West, M., Jackson, D., Harris, A., & Hopkins, D. (2002). Learning through leadership,

leadership through learning: Leadership for sustained school improvement. In

Leadership for Change and School Reform (pp. 30-35). London:

RoutledgeFalmer.

Whelan, R. (2005). Instructional technology & theory: A look at past, present, & future

trends. Connect: Information Technology at NYU. Retrieved from

http://www.nyu.edu/its/pubs/connect/spring05/pdfs/whelan_it_history.pdf

Whiteside, A. (2005, June). School technology leadership: Theory to practice. The Free Library. Retrieved from http://www.thefreelibrary.com/Schooltechnology

leadership: theory to practice-a0136071077

Williams, K. (2006). Beliefs about technology integration support factors held by school

leadership and school faculty: A mixed methods study. (Doctoral dissertation,

Georgia State University). Retrieved from

http://etd.gsu.edu/theses/available/etd-11222006-

143045/unrestricted/williams_katherine_j_200608_phd.pdf

Wilmore, D., & Betz, M. (2000). Technology and schools: the principal’s role.

Educational Technology and Society, 3(4), 12-19. Retrieved from

http://www.ifets.info/journals/3_4/discuss_october2000.pdf

Yildirim, S., & Kiraz, E. (1999). Obstacles to integration of on-line communication tools

in pre-service teacher education. Journal of Computing in Teacher Education,

15(3), 23-28.

Yu, K., Chang, M., & Tsai, F. (2009). The effects of computer network on education.

Retrieved from

http://web.ntnu.edu.tw/~minfei/personnel/Effects%20of%20Computer%20Netwo

rk%20on%20Education%20(TERC2002).pdf

Zuniga, J., Valdez, E., & Lu, I. (2010). Time line of people, events, theories, and

definitions of instructional technology. Retrieved from

http://www.jdzuniga.com/timeline.doc


APPENDIX A. LETTER TO THE SUPERINTENDENT

225 South 6th Street, 9th Floor ♦ Minneapolis, Minnesota 55402

Dear Superintendent:

I am a doctoral student in the School of Education (Education Administration) at

Capella University located in Minneapolis, Minnesota. Dr. McIntyre, my faculty mentor

and chair, is providing me guidance during the research process. I will be conducting

research for my dissertation on An Investigation of Technology Competence of School-

Based Administrators in the Tri-County Secondary Schools in the Southeastern Part of

South Carolina as part of my dissertation requirements.

The overall purpose of this study will be to investigate the level of technology

competence for secondary principals and other school-based administrators (assistant

principals, vice principals), who will be identified by the principals as proficient users of

technology in the schools. The findings of this study will have implications for setting

policies and practices regarding the use of technology in the schools and teachers’ integration of educational technology in the classroom.

This letter serves as a request for permission to conduct this research study with

secondary principals and other school-based administrators (assistant principals, vice

principals, administrative assistants) in the high schools in the school district. I am

asking permission to administer the survey instrument by email.

The survey instrument, Technology Competence Survey for School-Based

Administrators, and its data, will be strictly confidential. Names of the school district, the


schools, and the participants will not be used when the study is reported or published. All

participants will have agreed to take part in the research study. Participants are free to

withdraw consent and discontinue participation in the project.

I thank you for your consideration of this matter. Should you have any questions

regarding this study or permission request, you may contact Dr. McIntyre or me. Dr.

McIntyre can be reached at XXX-XXX-XXXX. There is no exchange of money or

compensation for participating in this research study.

Enclosed you will find a copy of the principal’s and the other school-based

administrators’ letters, an informed consent form, and the survey instrument.

Sincerely,

Donald D. Simpson

Doctoral Candidate


APPENDIX B. LETTER TO THE PRINCIPAL

225 South 6th Street, 9th Floor ♦ Minneapolis, Minnesota 55402

Dear Principal:

I am a doctoral student in the School of Education (Education Administration) at

Capella University located in Minneapolis, Minnesota. Dr. McIntyre, my faculty mentor

and chair, is providing me guidance during the research process. I will be conducting

research for my dissertation on An Investigation of Technology Competence of School-

Based Administrators in the Tri-County Secondary Schools in the Southeastern Part of

South Carolina as part of my dissertation requirements.

The overall purpose of this study will be to investigate the level of technology

competence for secondary principals and other school-based administrators (assistant

principals, vice principals), who will be identified by the principals as proficient users of

technology in the schools. The findings of this study will have implications for setting

policies and practices regarding the use of technology in the schools and teachers’ integration of educational technology in the classroom.

You are invited to participate in this research study by completing the survey

instrument, Technology Competence Survey for School-Based Administrators. The survey

instrument and its data will be strictly confidential. Names of the school district, the

school, and the participants will not be used when the study is reported or published. All

participants will have agreed to take part in the research study. Participants are free to

withdraw consent and discontinue participation in the project.

If you have any questions about the survey instrument that you will be completing

online, do not hesitate to contact me. If you have any questions about your rights as a

human subject, please contact me.

Enclosed you will find a copy of the superintendent’s permission letter giving me

approval to conduct my research and to collect data from the survey instrument in your

school, an informed consent form, and a copy of the survey instrument.

Sincerely,

Donald D. Simpson

Doctoral Student


APPENDIX C. LETTER TO THE

OTHER SCHOOL-BASED ADMINISTRATORS

ASSISTANT PRINCIPALS, VICE PRINCIPALS, OR

ADMINISTRATIVE ASSISTANTS

225 South 6th Street, 9th Floor ♦ Minneapolis, Minnesota 55402

Dear Assistant Principal, Vice Principal, or Administrative Assistant:

I am a doctoral student in the School of Education (Education Administration) at

Capella University located in Minneapolis, Minnesota. Dr. McIntyre, my faculty mentor

and chair, is providing me guidance during the research process. I will be conducting

research for my dissertation on An Investigation of Technology Competence of School-

Based Administrators in the Tri-County Secondary Schools in the Southeastern Part of

South Carolina as part of my dissertation requirements.

The overall purpose of this study will be to investigate the level of technology

competence for secondary principals and other school-based administrators (assistant

principals, vice principals), who will be identified by the principals as proficient users of

technology in the schools. The findings of this study will have implications for setting

policies and practices regarding the utilization of technology in the schools and

teachers’ integration of educational technology in the classroom.

You are invited to participate in this research study by completing the survey

instrument, Technology Competence Survey for School-Based Administrators. The survey

instrument and its data will be strictly confidential. Names of the school district, the

school, and the participants will not be used when the study is reported or published. All

participants will have agreed to take part in the research study. Participants are free to

withdraw consent and discontinue participation in the project.

If you have any questions about the survey instrument that you will be completing

online, do not hesitate to contact me. If you have any questions about your rights as a

human subject, please contact me.

Enclosed you will find a copy of the superintendent’s permission letter giving me

approval to conduct my research and to collect data from the survey instrument in your

school, an informed consent form, and a copy of the survey instrument.

Sincerely,

Donald D. Simpson

Doctoral Student


APPENDIX D. INFORMED CONSENT FORM

225 South 6th Street, 9th Floor ♦ Minneapolis, Minnesota 55402

The main purpose of this form is to provide information that may affect your

decision about whether or not you want to participate in this research project. If you

choose to participate, please sign in the space at the end of this form to record your

consent.

I am a doctoral student in the School of Education (Education Administration) at

Capella University located in Minneapolis, Minnesota. Dr. McIntyre, my faculty mentor

and chair, is providing me guidance during the research process. I will be conducting

research for my dissertation on An Investigation of Technology Competence of School-

Based Administrators in the Tri-County Secondary Schools in the Southeastern Part of

South Carolina as part of my dissertation requirements.

If you decide to participate in this study, you will be asked to complete the

Technology Competence Survey for School-Based Administrators and the demographic

questionnaire. The Technology Competence Survey for School-Based Administrators has

35 questions and the demographic questionnaire has 10 questions. Your participation will

take approximately 10 to 15 minutes to complete online.

You have been invited to participate in this study to investigate the level of

technology competence for secondary principals and other school-based administrators

(assistant principals, vice principals), who will be identified by the principals as

proficient users of technology in the schools.

Although no study is completely risk-free, we don’t anticipate any risks to you if

you decide to participate in this study. However, the risks of harm anticipated in the

proposed research are not greater, considering probability and magnitude, than those

ordinarily encountered in daily life or during the performance of routine physical or

psychological examinations or tests. Should you experience discomfort, you

may discontinue your voluntary participation at any time, thereby withdrawing your

consent.

We don’t expect any direct benefits to you from participation in this study.

However, you will be making an integral contribution to the learning environment of

children. As principals, you are the ones most directly involved in helping

teachers educate students. Therefore, it is vital that your views be represented in the

research findings.

The researcher will contact you if he/she learns new information that could

change your decision about participating in this study. The results of the research study

will be published, but your name or identity will not be revealed. In order to maintain

confidentiality of your records, the researcher will keep all records of this study private.

The researcher will provide you with a code identification number that will be used for all

correspondence. Returned surveys and code identification numbers will be stored under

lock and key. The information that you provide will be summarized and reported in

aggregated form to make sure that your individual responses will remain confidential.


Participation in this study is voluntary. If you choose not to participate or if you

choose to withdraw from the study, you may do so at any time. There will be no penalty.

There is no financial cost to participate in the study. There will be no financial

compensation for participating in the study. You are not waiving any of your legal rights

if you agree to participate in this study. However, no funds have been set aside to compensate

you in the event of injury. If you suffer harm because you participated in this research

project, you may contact the Capella Human Research Protections Office at 1-888-227-

3552, extension 4716.

By signing this form, you are saying (1) that you have read this form or have had

it read to you and (2) that you understand this form, the research study, and its risks and

benefits. The researcher will be happy to answer any questions you have about the

research. If you have any questions, please feel free to contact Donald Simpson, or

you may contact Dr. McIntyre at [email protected].

If you have any questions about your rights as a research participant or any

concerns about the research process, or if you'd like to discuss an unanticipated problem

related to the research, please contact the Capella Human Research Protections Office at

1-888-227-3552, extension 4716. Your identity, questions, and concerns will be kept

confidential.

Note: By signing below, you are telling the researcher “Yes,” you want to

participate in this study. Please keep one copy of this form for your records.

Your Name (please print):

____________________________________________________

Your Signature: ___________________________________________________

Date: ______________________________

I certify that this form includes all information concerning the study relevant to

the protection of the rights of the participants, including the nature and purpose of this

research, benefits, risks, costs, and any experimental procedures.

I have described the rights and protections afforded to human research

participants and have done nothing to pressure, coerce, or falsely entice this person to

participate. I am available to answer the participant’s questions and have encouraged him

or her to ask additional questions at any time during the course of the study.

Investigator’s Signature:

____________________________________________________

Investigator’s Name:

____________________________________________________

Date:

________________________________


APPENDIX E. TECHNOLOGY COMPETENCE SURVEY FOR SCHOOL-

BASED ADMINISTRATORS

From An investigation of technology competence of school-based administrators in

Florida schools, Blake, R. L. (2000) (Doctoral dissertation, University of Central

Florida). Retrieved from ProQuest Digital Dissertations database. (AAT

9977808). Adapted with permission.

Estimated Time of Completion is about 5-10 minutes

Part I – Technology Skill Rubric

Directions: Please place an X in the box next to the one statement that gives the best

description of your ability in each of the following technology skill areas.

Word Processing

1. I do not use word processing software.

2. I use word processing software occasionally for simple documents, which I

know I will modify and use again. I generally find it easier to hand-write

most written work.

3. I use word processing software for nearly all my professional work (memos,

tests, reports, worksheets, and home communication). I can edit, spell check,

and change character and paragraph formats. I feel my work looks

professional.

4. I can create different column types, tables of contents, indexes, and a variety

of templates and style sheets. I can open and save documents in various file

formats.

Computer Basics

5. I do not use a computer for any applications at work or at home.

6. I can use the computer to run a few specific programs. I can open programs

and navigate within the desktop without any assistance.

7. I can set up my computer and peripheral devices, load software, print, and use

many essential operating system tools.


8. I can troubleshoot many problems on my computer and I can customize the

screen and sounds on my computer. I can easily switch between programs and

I feel confident choosing appropriate applications for different tasks.

Database

9. I do not use database software.

10. I understand the use of a database and can move between records and fields in

a database, as well as locate field information.

Spreadsheet

11. I do not use spreadsheet software.

12. I understand the use of a spreadsheet and can navigate within cells, rows,

and columns. I can create simple spreadsheets, which are useful to me.

13. I use a spreadsheet for a variety of tasks. I can create spreadsheets

containing labels, formulas, and cell references. I can use the spreadsheet to

create a simple graph or chart.

14. In addition to the statement immediately above, I can use spreadsheets to

explore relationships and to analyze information for solving problems.

Internet

15. I do not use a web browser.

16. I can start up a browser to use the Internet and browse World Wide Web

pages, but spend little time doing so.

17. I am able to make use of a Web browser to explore educational and

professional resources. I can also create and manage bookmarks/hotlists.

18. I can configure my browser to maximize its ability to manage mail, graphics,

sounds, and attachments.

Desktop Publishing

19. I do not know how to use Desktop Publishing software.

20. I can only use the templates included with the Desktop Publishing software

to create documents.


21. I can use Desktop Publishing to create flyers and signs. This is the limit of

my skills with Desktop Publishing.

22. In addition to the prior statement, I can create professional-looking cards,

stationery, and properly formatted programs for all occasions.

SMART Board

23. I do not use a SMART Board.

24. I can use a SMART Board to show a PowerPoint presentation.

25. I don’t know how to use the Notebook software, but I can properly deploy

the document camera, the wireless slate, and response system.

26. I can use the Notebook software and can create new lessons using the

software.

E-mail

27. I do not use electronic mail.

28. I understand that electronic mail is an effective way to communicate with

my colleagues. I occasionally use e-mail.

29. I am an active e-mail user and efficiently manage and save my mail. I can

send and receive attached files.

30. I understand how to subscribe and unsubscribe to electronic newsletters

through listservs. I can manage newsgroups.

PowerPoint

31. I do not use PowerPoint.

32. I can start and navigate through an existing presentation.

33. I can create my own presentations, which include sounds and graphs.

34. In addition to the previous statement, I can add hyperlinks and embed videos

in my presentations.


Technology Integration

35. I do not see the need to integrate technology in the classroom.

36. I see the need to integrate technology in the classroom, but I have problems

knowing what to integrate or how to integrate.

37. I look for student performance indicators related to the NETS-S.

38. I encourage and support teachers to use technology to enhance lessons.

Part II: Skill Importance

Directions: In relation to your work as a school administrator, please rate the

following with a value of 0-4 where: 4=Essential; 3=Very Important;

2=Important; 1=Somewhat Important; and 0=Not Important

_____39. The ability to search for electronic information.

_____40. The ability to perform basic computer operations, such as running programs

and loading software.

_____41. The ability to create documents using a word processing program.

_____42. The ability to create multimedia presentations.

_____43. The ability to obtain information using a database.

_____44. The ability to use and manage e-mail communications.

_____45. The ability to proficiently use the SMART Board.

_____46. The ability to use/create spreadsheets to analyze data.

_____47. The ability to incorporate graphics into word processing and presentation

software.

_____48. The ability to explore the Internet for information.

Part III: Technology Use


Directions: Please circle the response that best represents the

frequency with which you have used the following tools over the past

year.

Scale: D=Daily (or almost daily); W=Weekly (or several times a week);

M=Monthly (or several times a month); N=Never (or rarely)

49. In the past year, I have used a word processor in my work.

D W M N

50. In the past year, I have used a database in my work.

D W M N

51. In the past year, I have used a spreadsheet in my work.

D W M N

52. In the past year, I have used presentation software in my work.

D W M N

53. In the past year, I have used graphics in my work.

D W M N

54. In the past year, I have used the Internet in my work.

D W M N

55. In the past year, I have used electronic mail (e-mail) in my work.

D W M N

56. In the past year, I have used an information search in my work.

D W M N

57. In the past year, I have used a SMART Board in my work.

D W M N

Part IV: Perceptions and Attitudes


Directions: Please select the appropriate choice for each of the following items.

Scale: Strongly Agree (SA); Agree (A); Disagree (DA); Strongly Disagree (SDA)

58. All teachers and students are expected to be proficient with technology.

SA A DA SDA

59. Principals/administrators are expected to have a basic knowledge of office systems.

SA A DA SDA

60. The computer has become an essential tool for school management, organization,

and administration.

SA A DA SDA

61. I feel confident in my staff and their expertise in technology.

SA A DA SDA

62. E-mail is an effective and essential tool for communication and sharing of

information.

SA A DA SDA

63. The use of technology in the classroom is among the greatest challenges and

responsibilities facing administrators today.

SA A DA SDA

64. Technology is the future. Students, teachers, and staff, as well as administrators, must be

proficient in technology usage.

SA A DA SDA

65. Administrators, teachers, and students should be able to proficiently use and deploy

SMART Boards.

SA A DA SDA

66. Principals and teachers alike have a big responsibility in bringing the best

technology available into the schools of our country.

SA A DA SDA


67. Technology training and practice is a daily necessity.

SA A DA SDA

Part V: Comments

Because this study is intended to benefit your fellow school administrators, please

share any thoughts that you may have regarding the role of technology in the

leadership and administration of schools. (Possible topics that you may want to

address include: training for administrators, integration of technology into the

curriculum, or how technology may benefit you in your leadership/administrative

roles.)

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

_______________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________

________________________________________________________________________


APPENDIX F. DEMOGRAPHIC QUESTIONNAIRE

1. My school contains grades: 9 10 11 12

(Circle all that apply)

2. My school has approximately_________ students.

(Number)

3. My position is: (Circle One) Principal Assistant Principal Other

4. I have worked in school administration for ____________years.

(Number)

5. Which of the following best describes your access to a computer?

(Put an X on the line)

_____ I have no access to a computer.

_____ I have limited access to a computer (I am not the primary user).

_____ I have access to a computer in a room other than my office.

_____ I have a computer in my office exclusively for my work.

6. My school is connected by a computer network.

(Put an X on the line)

_____ Yes ______ No _____ Don’t know

7. Our county school system is connected by a computer network.

______ Yes ______ No ______Don’t know

Check this box if you would like to receive the results of this study. The

results will be sent to you when the study is complete.

Thank you very much for your help with this study. Your input is greatly appreciated

and will be valuable to present and aspiring school administrators throughout the state and

school districts.
