California State University, Los Angeles
COLLEGE FACULTY TECHNOLOGICAL, PEDAGOGICAL, AND CONTENT KNOWLEDGE (TPACK), AND INSTRUCTIONAL ADAPTATIONS
FOR DIVERSE LEARNERS.

By
David Ejimole

A Dissertation Proposal
Presented to the Faculty of the Charter College of Education
Summer 2018
Dr. Anthony Hernandez (Chair)
Dr. Anne Hafner
Dr. Valerie Sun
CHAPTER 1
Introduction
Teaching is an art. In a seminal paper titled "Those Who Understand: Knowledge Growth in Teaching," Lee Shulman (1986) introduced the concept of pedagogical content knowledge (PCK). In it, he defined PCK as a special form of teacher knowledge that goes beyond content or subject matter knowledge to include knowledge about how to teach particular content. It is a complex process: a blend of the teacher's knowledge of the subject matter with knowledge of teaching methods, processes of learning, and an understanding of the learning environment (Koehler & Mishra, 2009). The graphic representation of PCK, as advanced by Shulman in 1986 (see Figure 1), is the overlapping portion of two distinct knowledge domains, content and pedagogy. It is a new knowledge, unique to a teacher, about how to approach the teaching of a particular subject matter (content).
The tacit nature of Shulman's (1986) concept of this special knowledge, called pedagogical content knowledge, has made the idea contentious in certain quarters. This is so because it does not involve behavior and is therefore difficult to observe or measure. In addition, the validity of the construct is another area of controversy, based on the notion that it cannot be differentiated from pedagogical knowledge or content knowledge using quantitative measures. Notwithstanding, Shulman's pedagogical content knowledge concept has been accepted as a valuable framework in the field of teacher education. For instance, technological pedagogical content knowledge (TPCK; Mishra & Koehler, 2006), which added technology as a third domain to the concept, was built on the assumption that Shulman's pedagogical content knowledge is valid.

Figure 1. Shulman (1986) concept of pedagogical content knowledge (PCK)
PCK therefore represents the new knowledge that combines best pedagogical practice with content expertise; it is demonstrated in the way a teacher delivers instruction through meaningful activities, demonstrations, illustrations, and examples that connect with and make real sense to the learner. Cochran (1997) further described it as the knowledge base that integrates subject matter, pedagogy, students, and learning environments: an interplay between the knowledge of subject matter (content) and the knowledge of the processes, practices, or methods of teaching and learning (pedagogy). This interplay between the two knowledge bases is complex and messy (Shulman, 1986), and the complexity becomes even more compounded when situated in a learning environment made up of diverse learners (Rosenberg & Koehler, 2015).
According to Chen (2017) and the NCES (2015), the ever-rising diversity across every aspect of life in American society today is a well-known phenomenon. Lending credence to this statement, Turner, Gonzalez, and Wood (2008) found that the diversity of the United States population has maintained an upward trend over the past decade. Admittedly, this increasing diversity has had some positive impacts within American society. In institutions of higher learning, however, it is not exactly so, as increasing diversity poses instructional challenges to faculty and instructional leaders (Chen, 2017). Among these challenges is the need for faculty to adapt their content delivery strategies to the learning needs and styles of the increasingly diverse student population in their classrooms. It therefore behooves instructional leaders (faculty) to identify and use appropriate instructional tools, together with best pedagogical strategies, in their delivery of content to ensure effective learning and support for all students across their learning differences.
The world today is changing, and so is the way students learn. This new wave of change invariably demands a change in instructional models to keep pace and remain relevant in the changing educational landscape. For example, the advent and explosion of information and communication technology (ICT) has rendered invalid and irrelevant the teachers' content and pedagogical knowledge that was hitherto considered an effective basis for instructional delivery (Shulman, 1986).

Evidently, technology is here and has been growing at a rate that is hardly imaginable. It is also finding its way into the very fabric of our global society (including the United States): into commerce, industry, and, more interestingly, education as a viable tool for teaching and learning. Roberts, Park, Brown, and Cook (2011), for example, observed how central technology could be in an all-inclusive learning environment and in creating adaptable models of instruction that make learning accessible to diverse learners.

To remain relevant in the scheme of affairs, therefore, teachers and instructional leaders in all tiers of the educational system have to recognize this new paradigm. Meanwhile, in the face of the widening application of technology in education, it is believed that research in educational technology has failed to meet expectations in certain critical areas of interest. For instance, it has failed to be at the forefront of studies in the context of diverse learners and learning environments, and in charting the trajectory for the education of this ever-growing student population (Rosenberg & Koehler, 2015).
The issue of context has been considered critical in the literature as a component of technological pedagogical content knowledge (TPACK) from its inception (Rosenberg & Koehler, 2015). TPACK as a framework emanates from the interaction of a teacher's knowledge of technology, pedagogy, and content (Koehler & Mishra, 2009; Koh & Chai, 2013). However, this special knowledge needed to infuse technology, content, and pedagogical principles places another layer of challenge on instructional practices, particularly in certain defined contexts such as teaching diverse learners. This issue of specific context and the contextualization of TPACK underpins the fundamental basis for this study, which is situated in the diverse learning processes and environment of the campus.

Figure 2. Technological Pedagogical Content Knowledge Model from www.tpack.org
Problem Statement
In a content analysis of 16 articles on technological, pedagogical, and content knowledge (TPACK) published in print journals between 2006 and 2009, Kelly (2010) advanced that context, which represents the fourth element of the TPACK concept, was hardly mentioned in the literature that describes, explains, or operationalizes TPACK as a framework. On the other hand, Porras-Hernandez and Salinas-Amescua (2013) contended that even when "context" does appear in descriptions of TPACK, its meaning is not explicit.
Consequent upon these assertions, Rosenberg and Koehler (2015) embarked on a study to verify the authenticity of those claims and to further affirm the central role of context in the TPACK framework. They carried out a study of 193 peer-reviewed journal articles published between 2005 and 2013 and found that context appeared in only 70, or 36%, of the reviewed publications' TPACK descriptions or definitions. This outcome means that context was mentioned in only about one-third of the publications, which confirms and extends the earlier work of Kelly (2010). The study by Rosenberg and Koehler (2015) also empirically supported Porras-Hernandez and Salinas-Amescua's (2013) contention regarding the lack of specificity in definitions of context in TPACK when it is mentioned in journals.
Considering the importance of context in TPACK as the special knowledge needed to teach with technology, the findings of Rosenberg and Koehler (2015) suggest the need to further establish how context interacts with, and influences, TPACK enactment in technology integration. It is these needs that underpin this study. The focus of the study is, in part, to understand and contextualize faculty TPACK by studying faculty utilization of the affordances of technologies for instructional adaptations in the context of a diverse learning environment. This study will not only further extend the inclusion of context in educational technology journals and publications, but will also explicate the definition of context in reference to instructional adaptation in a diverse learning environment.

Significance of the Problem
The process of teaching and learning does not occur in a void. Rather, it happens in an environment that depicts a complex intertwining of different roles, functions, and goals represented by those of the teachers (faculty), students, instructional leaders, and administrators (Gosling, 2009). These different roles are further compounded by the heterogeneity and multiple identities of the student population typical of our institutions of higher learning today, which bring about a multiplicity of students' learning needs. This changing context of diversity in the higher education setting necessitates a more systematic approach to instruction in order to support all students in their diverse learning styles and needs.
Technology plays a vital role in providing the much-desired flexible mode of instruction that not only makes for an all-inclusive classroom but also widens access to much-needed resources for effective learning outcomes for diverse learners (Roberts, Park, Brown, & Cook, 2011). TPACK as a framework positions itself to fill this gap since, according to Alqurashi, Gokbel, and Carbonara (2016), TPACK is all about understanding how to integrate technology into various instructional models in order to improve students' learning outcomes.

On this premise, it becomes important to understand how faculty adapt their instructional strategies using available tools, such as information and communication technologies (ICT), to address the acknowledged instructional challenges posed by today's prevalent diverse learning environments.
Results from this study can be used to understand faculty challenges in technology integration, particularly in addressing students' diverse learning needs. Secondly, the results will suggest areas of professional development that can keep faculty relevant to current classroom dynamics. Administrators and other policy makers within the institution, on the other hand, can utilize the findings of this study to identify faculty-specific technology adoption needs in order to provide appropriately tailored support.

Purpose of the Study
The purpose of this study is to assess faculty perceptions of technological, pedagogical, and content knowledge (TPACK) in the College of Education of a renowned 4-year public university on the west coast of the United States. The study will specifically examine faculty tenure status, age, gender, and teaching experience to determine their correlation with the seven TPACK domains, and then establish the variable(s) that most influence instructional adaptation.
Research Questions
The following research questions will guide this study.
1. What are the faculty's self-assessments of technological, pedagogical, and content knowledge (TPACK) using the Higher Education TPACK (HE-TPACK) instrument?
2. Is there a difference in the faculty TPACK based on tenure status, age range, gender and experience?
3. Which of the seven TPACK domains most influence faculty use of technology for instructional adaptation?
Definition of Terms
The following definitions will be used throughout this proposal.
Content knowledge (CK): Knowledge of a subject matter to be taught, without consideration of how to teach such subject(s) (Chai, Koh, & Tsai, 2013).

Context: In general terms, the conditions in the environment surrounding an object of study.

Pedagogical content knowledge (PCK): Knowledge that represents the complex blend of subject matter competence and pedagogical techniques, knowledge of students' prior knowledge, and theories of epistemology (Mishra & Koehler, 2006).

Pedagogical knowledge (PK): Pedagogical knowledge refers to the methods, processes, and practices of teaching (Mishra & Koehler, 2006); it includes knowledge of classroom management, assessment, lesson plan development, and student learning.

Technological content knowledge (TCK): Technological content knowledge refers to the knowledge of how appropriate technology can be used in teaching specific content (Cox, 2008). It suggests that teachers understand that, by using a specific technology, they can change the way learners practice and understand concepts in a specific content area.

Technological pedagogical content knowledge (TPACK): Technological pedagogical content knowledge refers to the knowledge required by teachers for integrating technology into their teaching in any content area (Mishra & Koehler, 2006). Such teachers have an intuitive understanding of the complex interplay of relationships among student learning processes, content, teaching practices, and appropriate technologies (Archambault & Crippen, 2009).

Technological pedagogical knowledge (TPK): Technological pedagogical knowledge refers to the knowledge of how various technologies can be used in teaching, and to the understanding that using technology may change the way teachers teach (Mishra and Koehler, 2006).

Technology knowledge (TK): Technology knowledge refers to the knowledge about various technologies, ranging from low-tech technologies such as pencil and paper to digital technologies such as the Internet, digital video, interactive whiteboards, and software programs (Mishra and Koehler, 2006).

Universal Design for Instruction (UDI): The design of instruction to be usable by all students, without the need for adaptation or specialized design (Burgstahler, 2009).

Universal Design for Learning (UDL): A scientifically valid framework for guiding educational practice that provides (a) flexibility in the ways information is presented, (b) flexibility in the ways students respond or demonstrate knowledge and skills, and (c) flexibility in the ways students are engaged (UDL and UD provisions in the Higher Education Opportunity Act, P.L. 110-315, 2010, Sec. 103).

Positionality Statement
My life circumstances, in essence, have indirectly defined and constructed my position, as well as the lens through which I see certain issues (Graham & Horejes, 2017), as I go through this research process. There is no denying the fact that along the way I have picked up and built some preconceptions, as well as developed certain personal attachments to issues that pertain to the fields of education and technology. These personal attachments and preconceptions have generated some of the passion and dedication necessary for conducting good research; on the other hand, they carry with them biases, opinions, beliefs, and values that might lead me to jump to premature conclusions as a researcher (Machi & McEvoy, 2016). I am mindful of these power dynamics that flow through the trajectory of the research process. I also understand my ethical duty to attend strictly and purposefully to my role in this research process, and I will therefore work out ways of ensuring that my research is approached objectively, in keeping with the quantitative method of the study.

Conclusions
This chapter provided a snapshot of the evolution of technology, its ubiquitous influence in education, and ultimately the challenges of integrating it into curriculum and instructional models. An overview of the growth of diversity and its spread into our institutions of higher learning was also presented, which elucidated the instructional challenges that go with it.
A diverse student population in higher institutions translates into students of diverse socio-economic status, with diverse learning styles, needs, and purposes, in the classroom. The evolution of technology, on the other hand, and its entry into our classrooms through students born into the digital age, suggest that faculty are confronted with the task of integrating technology into their teaching. These phenomena bring about the need to understand how faculty, in these circumstances, develop the technological knowledge with which to adapt their instruction using available technologies to support diverse learners. In Chapter 2, an overview of the literature on the TPACK framework, diversity, and how they relate to instructional adaptation is presented.
CHAPTER 3
Methodology
As the wave of diversity spreads across every sphere of life in the United States, the educational system, including institutions of higher learning, is also impacted. It is therefore important to understand how teachers, particularly the faculty in our higher institutions, use available tools to adjust their instructional strategies to ensure that all students derive maximum educational benefit irrespective of their learning styles and differences. Ubiquitous technology and its applications have proved to be among the most viable instructional tools of this time and age.

The purpose of this quantitative study will be to examine faculty self-assessments of their technological, pedagogical, and content knowledge using the Higher Education TPACK (HE-TPACK) as the assessment instrument. In addition, the study will investigate faculty tenure status (academic ranking), age, gender, and teaching experience for a correlation with the seven TPACK domains, and then establish the variable(s) that most influence instructional adaptation.
This study will address the following three research questions:
What are the faculty’s self-assessments of technological, pedagogical and content knowledge (TPACK) using the Higher Education TPACK (HE-TPACK) instrument?
Is there a relationship between faculty’s level of TPACK and their tenure status, age range, gender and experience?
Which of the seven TPACK domains most influence faculty use of technology for instructional adaptation?
Research Design
This study will adopt a non-experimental quantitative (cross-sectional) predictive research method (Muijs, 2004; Creswell, 2012; Johnson & Christensen, 2014). It will be a survey design in which a questionnaire will be used as the instrument. The different quantitative research types, as classified by Johnson and Christensen (2014), are presented in Table 1. Quantitative research, according to Muijs (2004), is the use of mathematically based methods to explain phenomena through the collection and analysis of quantitative data. He also noted that many data that do not appear naturally in quantitative form (such as attitudes and beliefs) can be converted into quantitative data and analyzed statistically. A survey research design, on the other hand, is a form of quantitative research in which the attitudes, opinions, behaviors, or characteristics of a sample or an entire population are described using numerical data collected by a researcher through questionnaires (Creswell, 2012). Elaborating on quantitative methods, Muijs (2004) further states that when looking for breadth, measuring something quantitative, or testing hypotheses, a quantitative method is the preferred approach among researchers. Cross-sectional research has been described as the most popular type in educational research (Muijs, 2004); it involves the collection of data from participants at one point in time, as opposed to a longitudinal survey, in which data are collected over a period of time. The data apply directly to each case at that point in time, and comparisons are made across the variables of interest (Johnson, 2001).

Some advantages of survey research, which include flexibility, ease of generalization, anonymity of respondents, and the low cost of collecting large amounts of data, have made it the most popular choice among social science researchers. A survey research design is therefore considered well suited for this study, since it is a research approach in which relationships between variables will be examined. In addition, this study involves the measurement of participants' attitudes and perceptions, and the participants will be surveyed once for the purposes of data collection and statistical analysis (Muijs, 2004; Creswell, 2014).
Table 1.
Types of Research Obtained by Crossing Research Objective and Time Dimension (from Johnson & Christensen, 2014)

Research objective   Retrospective                               Cross-sectional                               Longitudinal
Descriptive          (Type 1) Retrospective, descriptive study   (Type 2) Cross-sectional, descriptive study   (Type 3) Longitudinal, descriptive study
Predictive           (Type 4) Retrospective, predictive study    (Type 5) Cross-sectional, predictive study    (Type 6) Longitudinal, predictive study
Explanatory          (Type 7) Retrospective, explanatory study   (Type 8) Cross-sectional, explanatory study   (Type 9) Longitudinal, explanatory study
Setting
The setting for this study is the campus of a renowned 4-year university on the west coast of the United States of America. It is one of the foremost universities in that region of the country and a true representation of diversity in institutions of higher learning. In Fall 2017, the university enrolled a total of 28,253 students, of demographics that cut across racial, ethnic, and economic backgrounds, in its various schools and programs. This figure includes the researcher. The university is made up of six colleges: Arts & Letters; Business & Economics; Education; Engineering, Computer Science & Technology; Health and Human Services; and Natural & Social Sciences.
The faculty in the College of Education (tenured, tenure-track, adjunct, part-time, and graduate assistants) will serve as the participants in this study. Table 2 below shows the university's student enrollment by gender and ethnicity in Fall 2017.

Table 2
2017 Enrollment by Gender and Ethnicity (Institutional Research, IR)
Level Gender Asian Pacific Islander Hispanic Black White International Two Races Unknown American Indian Total
# % # % # % # % # % # % # % # % # % # %
1st Time Freshman F 167 0.59 1 0.00 1534 5.43 46 0.16 37 0.13 134 0.47 19 0.07 31 0.11 1 0.00 1970 6.97
1st Time Freshman M 206 0.73 0 0.00 1165 4.12 30 0.11 22 0.08 113 0.40 16 0.06 32 0.11 2 0.01 1586 5.61
Freshman F 111 0.39 1 0.00 990 3.50 50 0.18 28 0.10 84 0.30 17 0.06 20 0.07 0 0.00 1301 4.6
Freshman M 129 0.46 2 0.01 655 2.32 50 0.18 26 0.09 89 0.32 17 0.06 10 0.04 0 0.00 978 3.46
Sophomore F 222 0.79 4 0.01 1477 5.23 74 0.26 48 0.17 137 0.48 34 0.12 28 0.10 3 0.01 2027 7.17
Sophomore M 225 0.80 1 0.00 930 3.29 29 0.10 39 0.14 113 0.40 23 0.08 29 0.10 1 0.00 1390 4.92
1st Time Transfer F 181 0.64 1 0.00 1202 4.25 89 0.32 174 0.62 170 0.60 46 0.16 59 0.21 2 0.01 1924 6.81
1st Time Transfer M 221 0.78 1 0.00 739 2.62 66 0.23 161 0.57 122 0.43 34 0.12 48 0.17 3 0.01 1395 4.94
Junior F 218 0.77 0 0.00 1388 4.91 69 0.24 75 0.27 123 0.44 32 0.11 31 0.11 1 0.00 1937 6.86
Junior M 208 0.74 1 0.00 814 2.88 36 0.13 71 0.25 97 0.34 16 0.06 38 0.13 1 0.00 1282 4.54
Senior F 670 2.37 7 0.02 3071 10.87 210 0.74 393 1.39 298 1.05 96 0.34 154 0.55 4 0.01 4903 17.35
Senior M 766 2.71 4 0.01 2177 7.71 132 0.47 377 1.33 258 0.91 83 0.29 133 0.47 6 0.02 3936 13.93
Post-Bac F 69 0.24 0 0.00 294 1.04 16 0.06 77 0.27 28 0.10 10 0.04 24 0.08 0 0.00 518 1.83
Post-Bac M 23 0.08 1 0.00 103 0.36 8 0.03 42 0.15 6 0.02 5 0.02 9 0.03 1 0.00 198 0.7
Graduate F 234 0.83 0 0.00 864 3.06 102 0.36 261 0.92 218 0.77 34 0.12 78 0.28 4 0.01 1795 6.35
Graduate M 153 0.54 0 0.00 432 1.53 56 0.20 173 0.61 197 0.70 27 0.10 73 0.26 2 0.01 1113 3.94
Total 3803 13.46 24 0.08 17835 63.13 1063 3.76 2004 7.09 2187 7.74 509 1.8 797 2.82 31 0.11 28253 99.99
Sample
This non-experimental quantitative (cross-sectional) predictive study (Muijs, 2004; Creswell, 2012; Johnson & Christensen, 2014) will be conducted at one of the foremost 4-year state universities on the west coast of the United States. The university's overall faculty count in the fall of 2014, for instance, was 1,216 (Office of Institutional Research, IR); Table 3 below shows the breakdown by instructor group (Professors, Associate Professors, Assistant Professors, Full-time Lecturers, and Part-time Lecturers). Table 2 above shows the university's student enrollment in the fall of 2017.
Table 3
Overall Faculty Count by Instructor Group (Fall 2014)

Professors   Associate Prof.   Assistant Prof.   Full-Time Lecturers   Part-Time Lecturers   Total
267          99                85                115                   650                   1216
The faculty members of the university's College of Education will form the unit of analysis for this study. The College of Education has three main divisions: Applied & Advanced Studies in Education; Curriculum and Instruction; and Special Education & Counseling. According to the university's Office of Institutional Research, the College of Education enrolled a total of 1,578 students in Fall 2017, comprising students in the doctorate, graduate, undergraduate, and credential programs.

Faculty members of the college in the categories of Professor, Associate Professor, and Assistant Professor will form the potential participants in this study because they are vested in the college and in the university and are more likely to be committed to maintaining the high academic and professional integrity of the institution (Association of American Universities, 2010) in all circumstances. However, due to the typically low response rates in surveys, instructors in the categories of full-time and part-time lecturer who hold any form of teaching position at the time of this survey will also be included as potential participants.
This study will therefore depend on the available tenured, non-tenured, and graduate teaching assistant (GTA) participants within the college who are willing to complete the online survey within the allotted time frame. They will, in effect, constitute a convenience (nonprobability) sample (Fink, 2009; Creswell, 2014). On the other hand, since the entire faculty of the college will constitute the sample, the sample will not be stratified by the researcher (Creswell, 2014).
More importantly, any instructor who teaches the current diverse student population in our campus classrooms, be they brick-and-mortar or virtual, has an obligation to ensure positive student learning outcomes through technology and instructional adaptation. On this premise, full-time and part-time instructors in the college will be included as participants in the survey. Recruitment of the anticipated participants will be conducted through the office of the Dean of the college, from which an official, comprehensive, and updated list of all instructors will be obtained.
Data Collection Procedures
Start with an introductory paragraph that describes your overall anticipated data collection procedures, how long each one will take, and where each one will be conducted. Better to include more rather than less. If you are doing a mixed methods study, start with one type of data collection and then describe the other.
This section explains the step-by-step process to be used for data collection in this study. The first step will involve a formal invitation to participate in the study, in the form of an online consent form that will be extended to the entire instructional staff of the college via email. The instructional staff will include all categories of faculty, full-time and part-time instructors, and Graduate Teaching Assistants (GTAs). The purpose of this invitation is to give the instructors prior knowledge of the proposed study, acquaint them with what the study is about, solicit their support, and encourage them to participate by responding to a survey that will come later.

The invitation-to-participate email will contain a link to a Doodle poll where the proposed participants will be requested to acknowledge receipt of the invitation by simply selecting "yes," or "no" to decline. The email will also contain a statement clearly indicating that selecting "yes" in the Doodle poll window will indicate consent to participate in the survey. In addition, this letter will indicate to the recipients that participation in the survey is strictly voluntary.

These invitation-to-participate emails will be dispatched after all necessary approvals have been obtained from the IRB. The survey invitation will be sent to all faculty to give every faculty member an equal opportunity to participate in the study. The respondents to the survey will form the study sample (n = X) from which data will be collected for analysis. These respondents will represent a nonrandomized (convenience) sample drawn from the entire faculty population. Creswell (2014) described a convenience (or nonprobability) sample as one in which convenience and availability are the sole reasons for its selection.

Data Collection Type #1
Change the heading to the type of data you are collecting, e.g., observations or interviews, or surveys, etc. Here you will fully describe the type of data you will collect. Be sure to justify (with citations) why that type of data is appropriate to your research questions.

Data Collection Type #2
Add as many sections as the types of data you indicated you will collect in the introductory paragraph of this section.
Instrument
Self-assessment of TPACK fundamentally enables instructors to look back on and reflect on their instructional practices in order to improve upon them and to think of more innovative ways of adapting them for use in various learning environments (Harris & Hofer, 2011; Niess, 2011). In effect, since the primary role of a teacher is to promote effective learning irrespective of context, a Pre-Teacher TPACK (PT-TPACK) survey is believed to remain appropriate in any learning environment or setting. Meanwhile, over the years, TPACK surveys such as the one developed by Schmidt et al. (2009) have been replicated (e.g., by Howard, 2011) and modified (e.g., by Knolton, 2014), yet the original design intent and purpose have remained the same.
On this premise, Garrett (2014) modified the Pre-Teacher TPACK (PT-TPACK) survey originally developed by Lux, Bangert, and Whittier (2011) to create the Higher Education TPACK (HE-TPACK), a survey instrument that accommodates postsecondary education instructors.

Validity and Reliability of the HE-TPACK (Garrett, 2014)

The HE-TPACK (Garrett, 2014) was subjected to validity and reliability tests. The internal consistency of the survey instrument was established using Cronbach's alpha, to ensure that the instrument measured what it was intended to measure. The content validity of the instrument was also established by experts in either TPACK or technology training. In addition, a measurement expert evaluated each of the items in the instrument to verify the wording and grammar and to ensure that the items were devoid of ambiguity or other flaws (Crocker & Algina, 1986, p. 81). Finally, the experts conducted an overall instrument review to confirm that the items adequately measured both concepts of TPACK and technology training (Crocker & Algina, 1986, p. 218). The HE-TPACK went through this process of reviews, feedback, corrections, and verifications to establish its authenticity as an instrument of measurement.
Through the processes indicated above, the validity of the HE-TPACK (Garrett, 2014) was established, and the instrument was used in a research study on faculty self-assessment of TPACK and technology training. It proved to be a reliable instrument and so will be modified and adopted (with written permission) for use as the survey instrument in this study; it will be captioned the Diversity-TPACK in this study for ease of distinction.
Diversity-TPACK Survey Instrument

Pilot of the Instrument for the Proposed Study
Validity and reliability are important and interrelated concepts for measuring phenomena in quantitative educational research (Muijs, 2004). These elements determine the quality of a measurement instrument. The Diversity-TPACK will undergo verification tests by experts who will examine the instrument to ensure that the modifications did not alter either the validity or the reliability of the original survey by Garrett (2014). First, a validity test will be carried out, which will involve:
1. Content validity, to ensure that the questionnaire items measure the latent content of the concept.
2. Criterion validity, to examine how well the measures in the instrument relate to, or predict, one another.
3. Construct validity, to examine the extent to which the data fit the factors of the instrument.

Secondly, to establish the reliability of the Diversity-TPACK, Cronbach's alpha will be used to test the internal consistency of the instrument. An internal consistency test is one way of measuring the homogeneity of the items of a survey instrument in addressing the concept. Finally, the experts will examine each of the items in the instrument to ensure that the questions are direct, clear, and unambiguous.
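The sketch below (in Python, with hypothetical file and column names) illustrates one way the Cronbach's alpha internal-consistency check described above could be computed once the pilot responses are tabulated; it is an illustration of the formula, not the study's actual analysis code.

```python
# Illustrative sketch only: computes Cronbach's alpha for a set of Likert-type
# survey items, assuming responses are in a pandas DataFrame with one column
# per item. File and column names below are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Return Cronbach's alpha for the item columns in `items`."""
    items = items.dropna()                           # listwise deletion of incomplete responses
    k = items.shape[1]                               # number of items in the subscale
    item_variances = items.var(axis=0, ddof=1)       # variance of each individual item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: columns TK1-TK4 form the Technology Knowledge subscale.
# pilot = pd.read_csv("diversity_tpack_pilot.csv")
# print(round(cronbach_alpha(pilot[["TK1", "TK2", "TK3", "TK4"]]), 2))
```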
Data Analysis Procedures
Start with an introductory paragraph that describes your anticipated overall data analysis procedures and when they took place. Decide if you will use an a-priori coding frame or grounded theory coding (for qualitative) or what statistical analyses you will use for quantitative. You will have both if you are doing mixed methods. If appropriate you can describe how you analyzed each type of data and then you will need a section that describes your overall analysis across all data sets. Use the same order as in your data collection procedures.
Here, the researcher presents the procedures for analyzing the data to be generated from the survey in this proposed quantitative study. The survey will use a questionnaire as the instrument and will be hosted on the SurveyMonkey website; the link will be forwarded to the proposed participants via email.

The proposed survey will consist of —— questions.

Data Analysis Type #1
To measure faculty TPACK knowledge (competencies) and perceptions towards instructional adaptation, basic statistical analysis will be used.
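As an illustration only (the file and column names below are hypothetical, not taken from the study), a basic statistical summary of the domain scores might be computed along these lines:

```python
# Illustrative sketch: descriptive statistics (mean, standard deviation, range)
# for each TPACK domain score and the instructional-adaptation score.
# File and column names are hypothetical.
import pandas as pd

responses = pd.read_csv("diversity_tpack_responses.csv")
domains = ["TK", "CK", "PK", "PCK", "TCK", "TPK", "TPACK"]

summary = responses[domains + ["adaptation"]].agg(["mean", "std", "min", "max"]).round(2)
print(summary)
```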
Data Analysis Type #2
To investigate the relationship between faculty TPACK competencies and their perceptions towards instructional adaptation, a correlation analysis will be conducted.
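A minimal sketch of such a correlation analysis, again with hypothetical variable names, is shown below using Pearson's r:

```python
# Illustrative sketch: Pearson correlations between each TPACK domain score
# and the instructional-adaptation perception score. Names are hypothetical.
import pandas as pd
from scipy import stats

responses = pd.read_csv("diversity_tpack_responses.csv")
for domain in ["TK", "CK", "PK", "PCK", "TCK", "TPK", "TPACK"]:
    r, p = stats.pearsonr(responses[domain], responses["adaptation"])
    print(f"{domain}: r = {r:.2f}, p = {p:.3f}")
```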

Data Analysis Type #3
To measure differences in faculty perceptions towards instructional adaptation and TPACK competencies by tenure status (tenured, tenure-track, adjunct, part-time, etc.), a t-test will be run.
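Since a t-test compares two groups at a time, one illustrative comparison (hypothetical variable and category names) could contrast tenured with non-tenured respondents:

```python
# Illustrative sketch: independent-samples (Welch's) t-test comparing the mean
# adaptation scores of tenured versus non-tenured respondents.
# File, column, and category names are hypothetical.
import pandas as pd
from scipy import stats

responses = pd.read_csv("diversity_tpack_responses.csv")
tenured = responses.loc[responses["tenure_status"] == "tenured", "adaptation"]
non_tenured = responses.loc[responses["tenure_status"] != "tenured", "adaptation"]

t, p = stats.ttest_ind(tenured, non_tenured, equal_var=False)  # Welch's correction
print(f"t = {t:.2f}, p = {p:.3f}")
```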

Data Analysis Type #4
To determine which of the TPACK domain(s) will most strongly predict faculty instructional adaptation (and by how much), a hierarchical regression analysis, as described by Tabachnick and Fidell (2013, p. 138), will be conducted.
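A minimal sketch of such a hierarchical regression (hypothetical predictors entered in two blocks, using statsmodels) is shown below; the change in R-squared between steps indicates how much the TPACK domains add to the prediction.

```python
# Illustrative sketch: hierarchical (blockwise) OLS regression. Demographic
# variables enter in Step 1; the seven TPACK domain scores are added in Step 2.
# Variable and file names are hypothetical.
import pandas as pd
import statsmodels.api as sm

responses = pd.read_csv("diversity_tpack_responses.csv")
y = responses["adaptation"]

step1 = ["age", "years_experience"]                               # Step 1: demographics
step2 = step1 + ["TK", "CK", "PK", "PCK", "TCK", "TPK", "TPACK"]  # Step 2: add TPACK domains

model1 = sm.OLS(y, sm.add_constant(responses[step1])).fit()
model2 = sm.OLS(y, sm.add_constant(responses[step2])).fit()

print(f"Step 1 R^2 = {model1.rsquared:.3f}")
print(f"Step 2 R^2 = {model2.rsquared:.3f} (delta R^2 = {model2.rsquared - model1.rsquared:.3f})")
print(model2.params.round(3))   # coefficients show which domains contribute most
```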

Add as many sections as the types of data you indicated you will collect in the introductory paragraph of this section.

Overall Analysis Across All Data Sets
Here you will describe how you anticipate you will analyze all the data to find the central themes and answer your research questions.
Validity and Reliability
Here you will describe the validity and reliability (or credibility and trustworthiness for qualitative studies) of your study (see Creswell or the Golafshani article).
Timeline
Include a timeline for your study—this can be a bulleted list or a table linking data collection and analysis procedures to the months ahead.
Summary

REFERENCES
Here you will write your references. Use insert page break before this page to start your references on a new page. References should be double spaced like the rest of the paper and use a hanging indent. Here are some examples of different references (book, book chapter, article). Titles of books are in italics with heading caps; titles of articles and chapters are in regular font with sentence caps. Grad studies does not allow hyperlinks so remove any that are in your references.
Chamot, A. U., & O'Malley, J. M. (1994). The CALLA Handbook: Implementing the Cognitive Academic Language Learning Approach. Reading, MA: Addison-Wesley Publishing Company.

Clandinin, D. J. & Huber, J. (2010). Narrative inquiry. In P. Peterson, E. Baker, & B. McGaw, (Eds.). International Encyclopedia of Education (Third Edition), 436-441, Cambridge, MA: Elsevier.
Forbes, C. T. & Davis, E. A. (2009). Beginning elementary teachers’ beliefs about the use of anchoring questions in science: A longitudinal study. Science Education, 94(2), 365-387.
Rosenberg, J. M., & Koehler, M. J. (2015). Context and teaching with technology in the digital age. In Handbook of Research on Teacher Education in the Digital Age (Chap. 17, pp. 440-465). IGI Global. doi:10.4018/978-1-4666-8403-4.ch024
Johnson, B. (2001). Toward a new classification of nonexperimental quantitative research. Educational Researcher, 30(2), 3-13.

Attwell, G., & Hughes, J. (2010). Pedagogic approaches to using technology for learning: Literature review.

Creswell, J. W. (2014). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Thousand Oaks, CA: SAGE Publications.
Burgstahler, S. (2009). Universal design for instruction (UDI): Definition, principles, guidelines, and examples. Retrieved from http://www.washington.edu/doit/
Gosling, D. (2009). Supporting student learning. In A Handbook for Teaching and Learning in Higher Education (3rd ed., pp. 113-131). New York, NY: Routledge.
Kelly (2010)
Porras-Hernandez & Salinas-Amescua (2013)
Rosenberg, J. M., & Koehler, M. J. (2015). Context and technological pedagogical content knowledge (TPACK): A systematic review. Journal of Research on Technology in Education, 47(3).
Archambault, L., & Crippen, K. (2009). Examining TPACK among K–12 online distance educators in the United States. Contemporary Issues in Technology and Teacher Education, 9(1), 71–88.

Chai, C.-S., Koh, J. H.-L., & Tsai, C.-C. (2013). A review of technological pedagogical content knowledge. Educational Technology & Society, 16(2), 31-51.

Johnson, R. R. B., & Christensen, L. B. (2014). Educational research: Quantitative, qualitative, and mixed approaches. Thousand Oaks, CA: Sage.

Roberts, Park, Brown, & Cook (2011). Universal design for instruction in postsecondary education: A systematic review of empirically based articles. Journal of Postsecondary Education and Disability, 24(1), 5-15.
Alqurashi, E., Gokbel, E., & Carbonara (2016). Teachers' knowledge in content, pedagogy and technology integration: A comparative analysis between teachers in Saudi Arabia and United States. British Journal of Educational Technology. doi:10.1111/bjet.12515
Association of American Universities. (2018). Retrieved from https://www.aau.edu/lecturers-potential-security-employment
Harris, J. B., & Hofer, M. J. (2011). Technological pedagogical content knowledge (TPACK) in action: A descriptive study of secondary teachers' curriculum-based, technology-related instructional planning. Journal of Research on Technology in Education, 43(3), 211-229.

Niess, M. L. (2011). Investigating TPACK: Knowledge growth in teaching with technology. Journal of Educational Computing Research, 44(3), 299-317.

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123–149.

Howard, S. K. (2011). Affect and acceptability: Exploring teachers' technology-related risk perceptions. Educational Media International, 48(4), 261-272. doi:10.1080/09523987.2011.632275
Lux, N. J., Bangert, A. W., & Whittier, D. B. (2011). The development of an instrument to assess preservice teachers' technological pedagogical content knowledge. Journal of Educational Computing Research, 45(4), 415-431.

Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Philadelphia, PA: Harcourt Brace Jovanovich College Publishers.

APPENDICES
Here you put things like your interview and observational protocols. Number them in order, e.g., Appendix A: Interview Questions, Appendix B: Survey, etc. They should be in the same order you talk about them in your paper.
