
Linux In Education

The PDF version of a PowerPoint project that I put together for an online graduate-level education course I took with American InterContinental University.

Published in: Education, Technology

  1. EDU 602-0901B-02, American InterContinental University, Christopher Bradley
  2. - Does current technology enhance or degrade the learning abilities of students exposed to technology versus those who are not exposed to the same aids?
     - If technology is effective, how effective and at what cost?
     - What type of technology works most effectively, and why?
     - How does the student's socioeconomic background play a role?
  3. - Declining standardized scores nationwide
     - Increased technological jobs (even in the military)
     - Increase in high-tech toys, games, etc.
     - Aging teacher population
     - Increased teacher regulations and federal laws
     - Technology level of the teacher force?
     - Overall national pride?
     - Declining university enrollment
  4. - Understanding the abilities of the current education workforce.
     - Better determining the future needs of the education workforce.
     - Giving education administration a better picture of where the money needs to flow and the qualities to look for when hiring new teachers.
     - Equipping current educators with the knowledge they need to stay current with technology.
  5. - Examine a local school site using a mixed-methods study.
     - Interview students in a video-game format (somewhat like the Cash Cab game show) so as to capture both the body language and the emotions of the participant when answering the questions.
  6. - Conduct the same type and format of interview with teachers, asking questions similar to these:
       - Do you know what it means to podcast?
       - Do you know what it means to blog?
       - Do you have a Facebook, MySpace, or Twitter account?
       - How often are you online?
       - Do you have/use a computer in your classroom? A television, video player, gaming console, or iPod?
       - Do you discuss technology in your classroom when possible?
  7. - Compile the data from the teacher surveys.
     - Compare the results based on gender, time of service, education level, and age.
     - Determine the "tech level" of the teachers.
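The survey-compilation step above can be sketched in a few lines. The record layout, field names, and answer values below are all hypothetical, chosen only to illustrate one way of scoring yes/no responses into a "tech level" and averaging it by a grouping variable such as gender:

```python
from statistics import mean

# Hypothetical teacher survey records; field names and values are
# illustrative only, not taken from the actual study instrument.
teachers = [
    {"gender": "F", "age": 34, "years_service": 8,  "answers": [1, 1, 1, 0, 1, 0, 1]},
    {"gender": "M", "age": 58, "years_service": 30, "answers": [0, 0, 1, 0, 0, 0, 0]},
    {"gender": "F", "age": 45, "years_service": 17, "answers": [1, 0, 1, 1, 0, 1, 1]},
]

def tech_level(answers):
    """Score a respondent as the fraction of 'yes' (1) answers, 0.0 to 1.0."""
    return sum(answers) / len(answers)

def mean_by_group(records, key):
    """Average the tech-level scores across one grouping variable."""
    groups = {}
    for record in records:
        groups.setdefault(record[key], []).append(tech_level(record["answers"]))
    return {group: mean(scores) for group, scores in groups.items()}

print(mean_by_group(teachers, "gender"))
```

The same scoring could be reused unchanged for the student surveys by swapping in a grouping key such as grade level.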
  8. - Now that the tech level is established, the experiment has to be designed.
     - Pick a subject that is taught by multiple educators; this will likely be mathematics, science, or social studies.
     - Teach the exact same course content in both classes, allowing the same amount of instruction time in each class.
  9. - Compile the data from the student surveys.
     - Compare the results based on gender and grade level.
     - Also examine the tech level of girls vs. boys.
     - Determine the "tech level" of the students.
  10. - In one classroom, use technology that is appropriate to the course content.
        - Do this for the length of the unit or chapter.
        - Record and separate the results by grade and gender.
  11. - In one classroom, use no technology to deliver the course content.
        - Do this for the length of the unit or chapter.
        - Record and separate the results by grade and gender.
  12. - Interview teachers.
      - Interview students.
      - Determine the tech level of teachers.
      - Determine the tech level of students.
      - Determine the subject and unit to teach.
      - Determine the technology-enriched study group.
  13. - Define the non-tech-enriched student group.
      - Teach one unit of course work.
      - Separate and analyze the results of unit performance.
      - Publish the results.
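As a rough sketch of how the final analysis step might compare unit performance between the two classrooms, here is a minimal example. The scores are invented, and the effect-size measure (Cohen's d with a pooled standard deviation) is one common choice for comparing two group means, not a method prescribed by the slides:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical end-of-unit scores for the two classrooms; the numbers
# are invented purely to demonstrate the comparison.
tech_group    = [78, 85, 90, 72, 88, 81, 95, 79]
no_tech_group = [70, 82, 77, 68, 74, 80, 71, 75]

def cohens_d(a, b):
    """Effect size: difference of means over the pooled standard deviation."""
    pooled = sqrt(((len(a) - 1) * stdev(a) ** 2 + (len(b) - 1) * stdev(b) ** 2)
                  / (len(a) + len(b) - 2))
    return (mean(a) - mean(b)) / pooled

print(f"mean difference: {mean(tech_group) - mean(no_tech_group):.1f} points")
print(f"Cohen's d: {cohens_d(tech_group, no_tech_group):.2f}")
```

In a real write-up the scores would also be split by grade and gender, as the slides describe, and a significance test would accompany the effect size.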
  14. - Inclusion of students with special needs
      - Students who transferred from another school and may have already gone over the material
      - Gifted students
      - The teacher tells students about the study
      - Administration tells teachers about the study*
  15. - Parental influences on the study
      - Socioeconomic background
      - Students with undiagnosed learning disabilities
      - Teachers assisting students to keep from looking bad
      - Teacher aides
      - Students from the other group
  16. - Researcher biases
      - False answers given at the time of the interview
      - False answers given on unit assignments
      - Other unknown factors
  17. Barton, A., Sevcik, R., & Romski, M. (2006, March). Exploring visual-graphic symbol acquisition by pre-school age children with developmental and language delays. AAC: Augmentative & Alternative Communication, 22(1), 10-20. Retrieved April 11, 2009, from CINAHL with Full Text database.
      The process of language acquisition requires an individual to organize the world through a system of symbols and referents. For children with severe intellectual disabilities and language delays, the ability to link a symbol to its referent may be a difficult task. In addition to the intervention strategy, issues such as the visual complexity and iconicity of a symbol arise when deciding what to select as a medium to teach language. This study explored the ability of four pre-school-age children with developmental and language delays to acquire the meanings of Blissymbols and lexigrams using an observational experiential language intervention. In production, all four of the participants demonstrated symbol-referent relationships, while in comprehension, three of the four participants demonstrated at least emerging symbol-referent relationships. Although the number of symbols learned across participants varied, there were no differences between the learning of arbitrary and comparatively iconic symbols. The participants' comprehension skills appeared to influence their performance.
  18. Campuzano, L., Dynarski, M., Agodini, R., & Rall, K. (2009, February). Effectiveness of reading and mathematics software products: Findings from two student cohorts (NCEE 2009-4041). National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences. (ERIC Document Reproduction Service No. ED504657). Retrieved April 11, 2009, from ERIC database.
      In the No Child Left Behind Act (NCLB), Congress called for the U.S. Department of Education (ED) to conduct a rigorous study of the conditions and practices under which educational technology is effective in increasing student academic achievement. A 2007 report presenting study findings for the 2004-2005 school year indicated that, after one school year, differences in student test scores were not statistically significant between classrooms that were randomly assigned to use software products and those that were randomly assigned not to use products. School and teacher characteristics generally were not related to whether products were effective. The second year of the study examined whether an additional year of teaching experience using the software products increased the estimated effects of software products on student test scores. The evidence for this hypothesis is mixed. For reading, there were no statistically significant differences between the effects that products had on standardized student test scores in the first year and the second year. For sixth grade math, product effects on student test scores were statistically significantly lower (more negative) in the second year than in the first year, and for algebra I, effects on student test scores were statistically significantly higher in the second year than in the first year. The study also tested whether using any of the 10 software products increased student test scores. One product had a positive and statistically significant effect. Nine did not have statistically significant effects on test scores. Five of the insignificant effects were negative and four were positive. Study findings should be interpreted in the context of design and objectives. The study examined a range of reading and math software products in a range of diverse school districts and schools. But it did not study many forms of educational technology and it did not include many types of software products. How much information the findings provide about the effectiveness of products that are not in the study is an open question. Products in the study also were implemented in a specific set of districts and schools, and other districts and schools may have different experiences with the products. The findings should be viewed as one element within a larger set of research studies that have explored the effectiveness of software products. Three appendixes are included: (1) Second-Year Data Collection and Response Rates; (2) Description of Sample for the 10 Products; and (3) Details of Estimation Methods. (Contains 29 footnotes, 4 figures, and 24 tables.)
  19. McDonald, K. K., & Hannafin, R. D. (2003). Using Web-based computer games to meet the demands of today's high-stakes testing: A mixed-method inquiry. Journal of Research on Technology in Education, 35(4), 459-472.
      The State of Virginia's Standards of Learning (SOL) curriculum identifies specific objectives for each grade level in the subjects of reading, math, science, and social studies, and assesses student mastery of those objectives at targeted grade levels. The third-grade social studies curriculum and test represent a particular challenge for teachers because they cover information taught from kindergarten through third grade over a wide variety of topics, including Ancient Civilizations, Famous Americans, Civics, Famous Explorers, and U.S. Holidays. To assist one school in reviewing for the third-grade exam, the first author developed a Web-based review tool using the formats of the popular television game shows Who Wants to Be a Millionaire? and Jeopardy! that actively engaged students in reviewing social studies material. This mixed-method study used both a quasi-experimental and a qualitative approach. In the quasi-experimental design, scores of students in one third-grade class who used the game to review for the SOL test were compared to scores of students in another class who reviewed for the exam using more traditional methods. Students in the Web-based review treatment were extensively observed, recorded, and analyzed. Students in the Web-based review treatment did score higher on the SOL exam than students in the control group, but not significantly so. More importantly, however, the games promoted higher-order learning outcomes such as increased meaningful dialogue among students and the identification of student misconceptions, both of which contributed to deeper student understanding.
  20. Schlosser, R., & Blischak, D. (2004, August). Effects of speech and print feedback on spelling by children with autism. Journal of Speech, Language & Hearing Research, 47(4), 848-862. Retrieved April 11, 2009, from CINAHL with Full Text database.
      In this systematic replication of a previous study (R. W. Schlosser, D. M. Blischak, P. J. Belfiore, C. Bartley, and N. Barnett, 1998), the effects of speech and print feedback on spelling performance were evaluated. Four children with autism and no functional speech were taught to spell words with a speech-generating device under 3 feedback conditions. In the auditory-visual condition, children received both speech and print feedback, whereas in the auditory and visual conditions, only 1 type of feedback was provided. An adapted alternating treatments design was used. All 4 children reached criterion across conditions. Although 3 children reached criterion first with print or speech-print feedback, 1 child was most efficient with speech-print followed by speech feedback. Based on the findings of both studies, 2 distinct profiles of feedback efficiency are proposed. Children who exemplify the primarily visual profile spell words most efficiently when feedback involves print; children who fit the auditory profile spell words most efficiently when feedback involves speech. The implications for understanding the learning characteristics of children with autism, as well as implications for practice and further research, are derived.
  21. Valiquette, C., Gérin-Lajoie, A., & Sutton, A. (2006, June). A visual graphic tool to support the production of simple sentence structures in a student with learning disability. Child Language Teaching & Therapy, 22(2), 219-240. Retrieved April 11, 2009, from CINAHL with Full Text database.
      A tool was devised to improve spoken syntax through manipulation of graphic symbols. The participant, a French-speaking 11-year-old girl with general learning disability, learned to produce subject-verb-object (SVOn) sentences and transform them into a subject-object-verb (SOpV) structure in which the object becomes pronominal in a preverbal position. The production of personal pronouns in subject position was also targeted. Didactic methods were selected and administered for a total of 12 individual sessions. Pre- and post-measurements in an elicitation task (picture description) and language sample analysis showed significant improvement on target productions.