A Plug-in for Science-oriented MOOCs (PiSo-MOOC)
Enhancing 21st century education: The intelligent tutoring spirit
Science team proposal
16 December 2012
Science team - Designing New Learning Environments
Venture Lab Fall Course 2012
Summary
1. Our Rationale
2. Our System function and features
2.1. State of the art
2.2. Technological requirements
2.3. Learning environment based on scientific research
2.4. Higher order learning (implementation of Bloom’s taxonomy)
2.5. Learning sequence (curriculum sequencing)
3. Our Target Audience and Learning Conditions
3.1. Target audience
3.2. Students with disabilities
3.3. Student profiles
4. Our Business/Implementation Model
4.1. Required technological and human resources
4.2. Business model
4.3. Contingency planning
Appendix I - Prototype description: Bradford assay laboratory exercise
I.1 Automated tests: when and why?
I.2 Automated assessment
I.2.1 Testing a sequence of tasks
I.2.2 Learning the right gesture
I.2.3 Memorizing formulae
I.3 Evaluation as a teaching methodology
I.3.1 Memory target: Bradford assay, error bars formula
I.3.2 Test 1: Drag-and-drop
I.3.3 Test 2: MCQ of increasing difficulty
I.3.4 Test 3: Cloze tests
I.4 A need for critical thinking in experiments: the implementation of a virtual laboratory
I.4.1 Virtual laboratories should include a student model
I.4.2 Human vs. Automated evaluation of experimental choices
I.4.3 Collaboration in virtual experiments
I.4.4 Usual learning environment features
Appendix II - Additional resources
II.1 Links towards video tutorials
II.2 Google presentation link
II.3 Prezi link
II.4 Link to the mock-up
II.5 Bibliography
1. Our Rationale
Basic understanding of science and learning to learn have been recognized as the key
competencies for the 21st century (The key competences for European citizens in the knowledge
society).
As described in this document, “competence in science refers to the willingness to use the body of
knowledge and methodology employed to explain the natural world in order to identify questions and
to draw evidence-based conclusions”, while “learning to learn is pursuing and persisting in learning,
organising one’s own learning, including through effective management of time and information, both
individually and in groups”. Moreover, critical thinking and problem-solving skills rank among the
competencies most in demand in today’s society in both the USA and the EU (Emerging Skills and
Competences - A transatlantic study). These skills include effective reasoning, systems thinking,
making judgements and decisions based on the critical analysis of evidence, and identifying and
asking significant questions that lead to better solutions.
Training the new generation of experts according to these guidelines is therefore the global
aim of education systems throughout the world. Including scientific training in the curriculum could
provide a basis for the development of key competencies for 21st century professionals. Unfortunately,
many students switch away from STEM majors early in their educational career, and this attrition rate
has not changed much in the past few decades. Often, even students studying sciences lack a deep
understanding of the theory behind the techniques used in their respective fields, have
problems organizing their learning, and show inadequate critical thinking and problem-solving skills. A
possible cause of this problem could be insufficient or poorly implemented laboratory exercises.
Instead of deepening understanding, such exercises encourage rote repetition of procedures
and offer little challenge or opportunity for problem-based inquiry. In
addition, the high cost of laboratory equipment and consumables greatly inhibits innovation in this
type of learning (learning by doing and learning through research).
Our proposal describes a novel learning environment based on scientific evidence offering an
experimental training in a virtual laboratory as an added value to existing campus-based theoretical
courses or recently extremely popular MOOCs (Massive Open Online Courses). Our design would
offer a tool for teachers of laboratory sciences to engage students outside of the laboratory. This
could serve as a preview to a laboratory exercise or as a review of a laboratory exercise. In addition, in
environments that do not have access to a science lab, our design can serve as a way for students to
learn laboratory procedures without a large investment in resources. Moreover, our modular design
allows for high flexibility and evolvability to suit any content and learning objectives, with special
emphasis on the development of deep understanding of material, critical thinking and learning to learn
competencies. Finally, the flexibility of our design could allow for implementation of this learning
environment to any discipline and educational level, allowing the teaching of meta-competences in a
broader sense to a much wider audience.
2. Our System function and features
We believe that laboratory exercises are crucial for the development of scientific thought and
understanding. Our proposal enhances the experience for the students and the instructors despite
increasing demands on instructors due to class size and time limitations. Our design aims to offer
teachers a way to reach larger numbers of students, as well as to engage those students outside of
labs and classrooms.
We base our system on the needs we identified for modern science teaching:
• clear visualizations of laboratory procedures
• easy memorization of lab tasks, supported by audio and video material, and repetition
• critical thinking and understanding of the procedures prior to class, using assessment tools
and exercises
• large-scale and low-cost open and accessible tools
• intuitive, fun, and learner-centered learning environment that engages students in content and
develops crucial skills
2.1. State of the art
The recent success of Massive Open Online Courses has highlighted the importance of
developing online based learning tools. MOOCs enable teachers to efficiently deliver pedagogical
content and assignments. Nevertheless, the efficiency of this learning approach is often questioned,
since it is not clear what the benefits of learning a topic through a MOOC are. In MOOC science
classes, remembering and developing critical thinking are essential. However, to our knowledge, there
is no automated procedure to assess precisely what students remember from the course, and how
well they would perform in a situation where they have to apply their knowledge to real problems, by
having to design experiments by themselves, for instance. We will describe a tool that assesses
students automatically, with a focus on memory and creativity. In many ways, the tool belongs to the
large family of quiz makers like Hot Potatoes, Adobe Captivate, and many others. The difference
lies in the fact that we add a memory component; our philosophy is very close to the one that
underlies cognition-based learning platforms like Memrise or Iknow.jp. The main difference lies in the
focus: Memrise is designed for learning vocabulary in foreign languages, i.e. word lists. Memrise
permits little control over the nature of tests, and in most quiz makers there is no control over the
sequence and the frequency of testing. We would therefore combine the advantages of classic quiz
makers and Memrise-like tools in one platform to create a powerful online learning tool. Moreover, we
would like this tool to be used in the context of virtual experiments in order to help students acquire
critical thinking about what they do. Several virtual laboratories have been developed in order to
perform low-cost experiments but, as of yet, they have had very limited success, partly because the
value they add to the learning process is unknown. In our virtual laboratory, there will be a strong
focus on student evaluation in order to avoid this problem.
To enhance the learning experience for the learner, we plan to integrate knowledge maps and
game elements. Knowledge maps have already been used to guide self-directed learners in Khan
Academy math lessons, where the implementation was done using Google Maps overlays.
Gamification in education is gaining more and more visibility and impacts course design and
curriculum development (see infographic), both in online systems like the Khan Academy and in
offline systems, such as the school-based Quest to Learn. We hope to provide a learning
environment where learners have instant feedback on their progress towards mastery and where we
offer reminders and recommendations of suitable activities to enhance both memorization and
understanding of the content. We hope to create an environment where both instructors and students
can set specific learning goals and where every action taken towards a learning goal is recognized by
giving points and badges to motivate the learner. In addition, we plan on integrating existing social
networks to allow users to share their success stories, as well as to compete in challenges with other
users. These elements will hopefully provide an interesting and dynamic social learning environment
that would attract users to use the system more frequently.
2.2. Technological requirements
Our design would be optimized for online learning, requiring a stable Internet connection, but an offline
version would be made available for students with unreliable Internet access. The learning
environment is designed to be used on most common learning devices, including tablets, smartphones,
and personal computers. Social networking features will also be integrated.
2.3. Learning environment based on scientific research
To implement our design, we draw on research in both intelligent tutoring systems and
computer adaptive testing. Intelligent tutoring systems have been an intensive field of research since
the seventies (Corbett et al. 2008). Though research activity has since waned, findings in the field have
been applied to various topics, ranging from programming (Sykes 2007, Dadić 2010) to physics
(Schulze 2000) and medical studies (Suebnukarn 2006, Kazi 2007). The computer adaptive testing
approach we draw on (Gershon 2005, Georgiadou 2006) follows the philosophy of cognitive tutoring
systems, which highlight the importance of student cognition, including memory and misconceptions
(Anderson 1990, Shute et al. 1995, Colace 2006, Feng 2012, Brandstädter 2012). Regarding virtual
laboratories, we refer to the inquiry-based pedagogy that has been encouraged since the seventies
(Holt et al. 1969, OECD 2006, Kloser 2011, Brownell 2012). The actual implementation of this
pedagogy with digital technologies has been described in the virtual laboratories literature (Zumbach
2006, Cobb 2009, Girault 2012).
2.4. Higher order learning (implementation of Bloom’s taxonomy)
Our system will address the following learning objectives:
• remembering will be reinforced through cognition-based features, such as spaced memorization exercises
• understanding will be evaluated through quizzes
• applying will be practiced through problem-based exercises
• analyzing will be developed through comparing the results of virtual experiments
• evaluating will be trained by choosing and prioritizing the steps of procedures in a virtual
experiment
• creating will be taught through experimental design and planning of learning activities
• learning will be globally enhanced through mastering both content-specific knowledge and
acquiring meta-competencies (critical thinking, creativity, learning to learn)
2.5. Learning sequence (curriculum sequencing)
Our learning platform is aimed at fostering memorization and learning through trial and error
via virtual experiments. It will include an authoring tool to enable teachers to design and implement
their own pedagogy. Teachers and self-learners will be responsible for the learning sequence that will
be implemented, as some people prefer to experiment before memorizing whereas others prefer the
other way round. Ultimately, however, learners following the same module will be evaluated in the
same way.
3. Our Target Audience and Learning Conditions
3.1. Target audience
We are targeting students attending MOOCs or institutions where practical laboratory
experience is not available for financial or other reasons. Even in developed countries, the expense of
laboratory exercises is a limiting factor to having each student perform a laboratory exercise in a cost-
efficient manner. Initially, our design would be geared towards university undergraduates in scientific
fields via their instructors who would be able to design an optimal learning environment (by either
customizing their own module or using an “off-the-shelf” version). Our design could also be
implemented in schools or for a general audience, and can be customized for use in other fields that
might benefit from memorization, procedural planning, and automatic evaluation modules.
3.2. Students with disabilities
The complementary use of videos with narratives, subtitles, and supporting material will allow equal
access to students with disabilities, particularly hearing or vision disabilities. The customizable design,
which can be modified to suit individual learning pace, offers a nurturing environment for students with
learning disabilities, including autism spectrum disorders.
3.3. Student profiles
The problem of developmental-appropriateness of our design is solved by the fact that the
instructor will create the content, ensuring optimization for the students’ learning stage. Subtle
modifications can be easily introduced to teach the same content to different education levels or
developmental stages of students. More specifically, at the beginning of each virtual exercise the
student will be given an evaluation of prior knowledge and asked to self-assess their
motivation and learning skills. According to both content-related aspects and their level of self-
directed learning, the student can be offered different content depths and more or less complex
activities, and/or have a learning path tailored towards individual objectives and learning styles.
4. Our Business/Implementation Model
4.1. Required technological and human resources
Development of our design requires curriculum designers, educators and programmers to form a solid
interdisciplinary team. The team should also include an expert who would offer workshops to
educate instructors on how to use our design and integrate it into the existing framework.
4.2. Business model
Although we would be a non-profit entity, we would use a for-profit model to increase the
business’s self-sufficiency and reduce our need for outside donors. We would offer two tiers of
service: for instructors interested in experimenting with the platform, a limited version would be free;
for instructors interested in using a fully featured version for academic classes, we would offer
renewing yearly and quarterly subscriptions that would track the academic year. Outside
philanthropists could also purchase subscriptions for other groups of users. For example, a charitable
organization might sponsor a subscription to the platform for a school in need.
We hope to initially launch through a crowdfunding source like Kickstarter, which would allow
us to present a prototype (and demonstrate value to potential users) before making large
investments. This funding would allow us to develop the platform and create several stock modules
that would both demonstrate to potential users what the platform is capable of and be products
investors could immediately implement, if desired. As our module bank grew, we would open the
platform up for subscriptions, giving discounts and other rewards to initial investors. One of the goals
would be to foster instructor interaction through the platform to enable the exchange of knowledge
between instructors and promote the diffusion of innovation. Modules created by any instructor would
be public by default, and instructors would have the ability to discuss, adopt, and adapt modules
created by other instructors. Once subscribed, instructors can create as many modules as they need
and for as many different courses as they desire, which would incentivize subscribers to invest in
using the platform. We would also issue regular tutorials for both students and instructors on how to
make their experience using the platform as valuable as possible. We plan to gain additional revenue
from “swag” sales and events with other potential partners in the learning space.
4.3. Contingency planning
Although subscription models can work well once implemented, probably the most difficult part
would be demonstrating the value of the platform to initial subscribers. Many crowdfunded projects fail
because the product proves more expensive to develop than anticipated or because the
venture cannot transition to a sustainable business model.
To avoid this problem, we plan on building collaborative partnerships with private and public
organizations, including governments and ministries of education interested in enhancing the current
educational system, as well as private organizations that would be interested in increasing their
visibility by backing projects such as ours.
In the long run, attaining a critical mass of users for our product would be essential for success. To
build a growing and active community, we will invest a significant amount of time in communications
and public relations, primarily through social networks and targeted advertising, to ensure the influx of
interested instructors and students.
Appendix I - Prototype description: Bradford assay laboratory
exercise
For the purposes of our demonstration, our target audience is college freshmen or sophomores with
some basic chemistry background. We have focused on one specific laboratory procedure, the Bradford
protein assay, to illustrate how our proposed system works. The Bradford assay is used to measure the
total protein concentration in a solution. In this assay, the dye Coomassie blue shifts from its red form to
its blue form once bound to proteins. The change can be quantified by spectrophotometry: the
increase in absorbance at 595 nm is proportional to the concentration of protein in the sample. The
Bradford assay laboratory exercise involves several higher-order concepts that students should learn,
including experimental design, statistical analysis, standard error, the use of a reference curve, critical
thinking, and collaboration.
I.1 Automated tests: when and why?
Evaluating student skills is one of the main issues associated with online learning platforms. This
evaluation can be either automatic or done by humans. In the latter case, the main problem is
scalability. Human evaluation implies that either an instructor or a peer spends time evaluating
student outcomes. An instructor can evaluate only a limited number of students, and peer evaluation
also has its drawbacks. Although it is possible to assess whether homework has been submitted and
whether evaluation criteria have been followed through peer evaluation, most peers, particularly for
sciences courses, do not have the knowledge required to assess whether a skill has been acquired.
There is therefore a strong need to improve automated evaluation techniques, especially for scientific
skills.
Many skills cannot be assessed through automated evaluation. For instance, there is no automated
way to assess the quality of an argument or of a demonstration. Automated evaluation is therefore
restricted to a subset of skills, and it is mostly based on tests like multiple choice questions, cloze tests,
matching tests, among others. This approach is widely followed in MOOCs like Udacity, edX, and
Coursera.
Here, we will discuss the best uses of automated tests for assessing scientific skills. We will use the
Bradford assay as an example.
A test may have different objectives. Here are three different aims of a test:
• Evaluating the student before the beginning of a course. This approach allows an instructor to
assess whether a student has the prerequisites to attend the course and also provides the
instructor with an opportunity to adapt their teaching methods to the needs of the students.
• Evaluating the student after a course. This type of assessment may help determine whether
specific skills have been acquired.
• Evaluating during the course. This can provide an opportunity to check whether the concepts
or procedures, taught through tutorials for instance, have been correctly understood.
Evaluations can also be viewed as teaching methods as they require memory recollection and
therefore strengthen skill acquisition. This philosophy underlies sites such as Memrise. We will
consider two types of skills: cognitive and procedural. Both types of skills are needed to perform a
Bradford assay. Procedural skills include the correct gestures necessary to perform the assay, like
preparing dilutions and manipulating chemicals. Cognitive skills include mathematical computations,
such as carrying out calculations and applying formulae correctly.
Figure 1: Left, Bradford assay video tutorial on YouTube. Right, theoretical step-by-step
explanation (Adobe Captivate, thanks to Dr. Karen Champ)
The first step, before evaluation, is the passive assimilation of knowledge through lectures, video
tutorials or any kind of document. To illustrate this aspect of the work, we designed introductory
lectures on the Bradford assay explaining its context, lab procedures, and the correct use of formulae.
In addition to these explanations, students will go through a video tutorial showing the whole Bradford
assay. After having watched the videos, students can proceed to the evaluation itself.
I.2 Automated assessment
I.2.1 Testing a sequence of tasks
A Bradford assay is a sequence of tasks, beginning with the dilution of the original solution
and ending with the calculation of the protein concentration of an unknown sample.
Before assessing whether students know how to perform each individual task, an instructor may want
to evaluate if the students know the correct sequence of the tasks that are to be performed. One easy
way to assess this automatically is to use a drag-and-drop test. In this case, a name would be given
to each particular task. This would include names such as "dilution", "calculation of error bars", etc.
The tasks are then presented to the student in a random order, and the students are required to “drag
and drop” (i.e. rearrange) the tasks to put them in the correct order. This exercise makes sense in a
number of situations where the sequence is not obvious. However, in the case of the Bradford assay,
this type of assessment is not challenging enough due to the limited number of tasks. Common sense
is sufficient to know that one cannot create the calibration curve before having done the dilutions.
Figure 2: Learning a sequence of tasks: the drag-and-drop matching test
Moreover, the fact that a student can rearrange the tasks in the correct order does not mean
that they would be able to recollect all of these tasks. For example, a student may be able to place
tasks D, C, B, A in the correct order A, B, C, D if asked to do so. However, if asked to perform the
Bradford Assay, the student may forget task C and propose the sequence A, B, D. In this case, tasks
are in the proper order but one is missing. The student is therefore not able to perform the Bradford
assay by himself. This test is too easy to be valuable.
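As a minimal sketch of how such a drag-and-drop test could be graded automatically (the task names are illustrative, not the prototype's exact wording), consider:

    # Sketch of an automated drag-and-drop grader: the student reorders a
    # shuffled list of task names; grading is a simple equality check.
    import random

    CORRECT_ORDER = ["dilution", "calibration measurements", "calibration curve",
                     "sample measurement", "calculation of error bars"]

    def present_shuffled(tasks):
        shuffled = tasks[:]              # keep the reference order intact
        while shuffled == tasks:         # make sure the student has work to do
            random.shuffle(shuffled)
        return shuffled

    def grade(student_order):
        # The weakness discussed above: a student who can reorder the tasks
        # may still be unable to recall them unprompted.
        return student_order == CORRECT_ORDER

    print(grade(present_shuffled(CORRECT_ORDER)))   # almost surely False
    print(grade(list(CORRECT_ORDER)))               # True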
In comparison, it can be difficult to use a cloze test to assess whether a student knows a specific
sequence. In a cloze test, the student must type exactly (or nearly exactly) what the computer expects.
In some cases, a small range of answers are acceptable. One difficulty is that the designer of the
cloze test cannot easily predict the way in which students will refer to a given task, as well as what
they will consider to be a task to be included versus a subtask that should not be included. Even if the
student knows the total number of tasks he is expected to list, it is unlikely that the student’s answer
will match the computer’s expectations. This is a major drawback of current automated tests. However,
we can use the cloze test as a final test and assume that students can learn by heart our way of
referring to the different steps of the experiment.
One possible approach to avoid this cloze test issue is to rely on a sequence of multiple
choice questions. With simple MCQs, users must choose the correct answer to a question from among
four (or more) different answers. After submitting the answer, the user is usually provided feedback, at
least whether his answer was correct or not. With a sequence of MCQs, the approach is slightly
different. This is because the user needs to answer multiple, interconnected MCQs before submitting
their responses. First MCQ: What is step 1? Second MCQ: What is step 2? Third MCQ: What is step 3?
After selecting an answer for each question, the user submits these for grading. If there is only one
correct answer to a given test, there is a 1/64 chance (one out of 4×4×4 = 64) of randomly getting the
correct answer when there are four answer choices for each of the three questions. The higher the number of
MCQs included in the sequence, the lower the probability that the correct answers will be
selected randomly.
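The scaling of this guessing probability is easy to verify with a quick Python sketch (ours, for illustration):

    # Probability of passing a chained sequence of MCQs by pure guessing:
    # with c choices per question and q questions, it is (1/c) ** q.
    def random_pass_probability(choices_per_question: int, questions: int) -> float:
        return (1 / choices_per_question) ** questions

    print(random_pass_probability(4, 3))   # 1/64, about 0.016, as above
    print(random_pass_probability(4, 5))   # 1/1024: longer sequences are safer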
Figure 3: Learning a sequence of tasks: the MCQ
This approach also has drawbacks. Since this test is more difficult than the previously
described drag and drop test, it is more reliable for determining whether the student really knows the
sequence. However, since there are many potential incorrect answers, it is hard to infer the associated
misconception from a given incorrect answer, and correspondingly difficult to guide the student.
Intelligent tutoring systems research indicates that this is a common issue associated with constraint-
based models. With constraint-based models, the answer provided by the student must meet certain
constraints in order to be accepted as correct. An easy way to deal with this issue is to run through the
sequence proposed by the student and stop at the MCQ where the mistake occurred. This is an easy
way for a computer to infer what the corresponding misconception is. This type of issue has been
addressed in previous papers (Viswanathan 2005).
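A minimal sketch of this stop-at-first-error idea follows (the misconception labels are hypothetical examples, not a validated student model):

    # Walk the submitted sequence of MCQ answers and report the step where
    # the first mistake occurs, mapping it to a candidate misconception.
    CORRECT = ["dilution", "calibration curve", "sample measurement"]
    MISCONCEPTIONS = {
        0: "confuses the stock solution with the working dilutions",
        1: "does not see why a reference curve is needed",
        2: "mixes up the standards and the unknown sample",
    }

    def diagnose(answers):
        for step, (given, expected) in enumerate(zip(answers, CORRECT)):
            if given != expected:
                return step, MISCONCEPTIONS.get(step, "unclassified error")
        return None, "sequence correct"

    print(diagnose(["dilution", "sample measurement", "calibration curve"]))
    # -> (1, 'does not see why a reference curve is needed')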
One might argue that few differences exist between a sequence of MCQs and a list of MCQs,
but this is not entirely true. When considering a list of MCQs, the MCQs are independent from one
another. With a sequence of MCQs, all of the MCQs are interconnected. They all have to be
answered correctly for the student to demonstrate that the sequence is known. Moreover, with a
sequence of MCQs, a given question may depend on the answer of the previous MCQ, i.e., the
second MCQ will depend upon the answer to the first MCQ. This approach is referred to as a
Computer Adaptive Test or CAT, and it is widely used to assess the fluency of a student in a language.
Since the use of CAT in experimental sciences is still subject to debate, we will not discuss this further.
Figure 4: Learning a sequence of tasks: the cloze test
I.2.2 Learning the right gesture
The whole Bradford assay can be decomposed into a sequence of individual gestures, and
each gesture can be evaluated individually. Evaluating whether a student performs the right gestures
is easier than evaluating cognitive skills. Even practical tasks such as the dilution or the correct use of
a pipette can be assessed through automated tests. The most rigorous way of evaluating student
gestures would be to have an instructor analyze a video of the student practicing the gesture, but if we
want to stick to automated evaluation, there are a handful of approaches. First, a video tutorial
showing the whole Bradford assay with the different gestures would be shown, with oral explanations
or subtitles. Since this video can be pretty long, from five to ten minutes, there are two ways to test.
Either the test takes place during the tutorial, an approach that is followed by many MOOCs like
Udacity and Coursera, or it takes place after it.
The first kind of test is to show a video of a gesture without any commentary, and to ask through an MCQ
whether the gesture was performed correctly or not, and if not, what is wrong. Let us call this kind of
test the wrong gesture analysis. Another way of using the MCQ is to show four different ways of
performing a gesture, and ask the student to identify which video shows the correct one. The MCQ is perfectly
suited to this approach, since there are only a limited number of ways of failing a given task. The few most
common mistakes can be identified, and students can be trained to detect them through the approach
we described, and therefore avoid them.
Figure 5: MCQ and choice of the right gesture based on videos
We believe this approach can be more efficient for assessing student skills than real-world, in-
person courses. In in-person courses, there are usually around five to ten students per instructor, if not
more. It is impossible for an instructor to correct all of the students at the same time; as a result, many
students may perform an incorrect gesture without being noticed by the instructor. Moreover, for some
tasks, even close attention from the instructor is not enough to detect whether the task is done
correctly. A typical example in the case of the Bradford assay is the use of the pipette. When sampling
a liquid with a pipette, one has to avoid drawing up too much liquid by depressing the plunger too far.
This is a very common mistake among students, and it decreases measurement precision.
It is difficult to assess whether a student is making this mistake just by looking at him. A properly
designed MCQ would be more efficient in this particular case to assess whether the students know
what the right gesture is.
I.2.3 Memorizing formulae
Performing the Bradford assay implies more than just performing the right gesture. Students
have to calculate volumes in order to do the dilution properly. Then they have to create a calibration
curve by measuring the absorbance of samples of known concentrations of proteins. Once the
calibration curve is done, they have to measure the absorbance of the sample of interest and to
deduce its protein concentration. This last step involves calculation skills, since computing the
error bars requires a fairly complex formula.
Both MCQ and cloze tests are usually used in this situation. A student may be asked to give a number
through a cloze test. It can be a volume, a concentration, or something else. There are two options.
Either this value is exact, or it is within a range of values. The latter case is not very common, though
possible, as most cloze tests require a perfect match between a student's answer and the expected
answer. The advantage of using cloze tests for numerical values is that it is easy to assess
automatically whether the student got the right answer. This value can be complex enough to prevent
the student from finding it without applying the formula perfectly. Moreover, it can be a way to assess
whether a student can do complex calculations without making mistakes. This approach is widely used
in international tests of mathematics, computer science, and other similar disciplines.
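A sketch of how such numeric cloze grading could work (our illustration; the expected value is an invented number, not prototype data):

    # Accept an exact match or, optionally, any value within a relative
    # tolerance, as discussed above.
    def grade_numeric(answer: float, expected: float, rel_tol: float = 0.0) -> bool:
        if rel_tol == 0.0:
            return answer == expected        # strict cloze: perfect match only
        return abs(answer - expected) <= rel_tol * abs(expected)

    print(grade_numeric(250.0, 250.0))                # True: exact match
    print(grade_numeric(249.0, 250.0))                # False under strict grading
    print(grade_numeric(249.0, 250.0, rel_tol=0.01))  # True within 1%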
The drawback of this approach is that it is not easy to assess whether a student reasons correctly;
conversely, a student may know the formula but make mistakes in the calculation.
It is hard to assess through cloze tests whether a student knows a formula, since there are different
ways of writing it, from the choice of symbol used to represent a variable to the order of the
variables within the formula. MCQs and drag-and-drop tests can be an intermediate solution.
I.3 Evaluation as a teaching methodology
We have discussed some different methods of automatic assessment for the sequence of
tasks that are to be performed in a Bradford assay, the right gestures, and the correct use of formulae.
We would like to highlight that evaluation can also be a learning technique. Active recollection of
memories through these tests is a more efficient way of strengthening a memory than reading or
watching tutorials. This is one of the bases of the learning-by-doing philosophy. Websites like Memrise
and Iknow.jp are based on this principle. In this approach, memories are strengthened through tests of
increasing difficulty (drag-and-drop and MCQ being relatively easy tests and cloze tests being more
difficult).
For instance, here is the formula that describes the error bar associated with the measurements in the
Bradford assay.
I.3.1 Memory target: Bradford assay, error bars formula
(variables in http://en.wikipedia.org/wiki/Calibration_curve)
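The formula image itself does not survive in this version of the document. Based on the Wikipedia article linked above and the variable names used in the cloze tests below (m for the slope, n, k), the standard error of a concentration read from a linear calibration curve is usually written, in LaTeX notation, as:

    s_x = \frac{s_y}{|m|}
          \sqrt{\frac{1}{n} + \frac{1}{k}
                + \frac{(\bar{y}_{\mathrm{unknown}} - \bar{y})^2}
                       {m^2 \sum_i (x_i - \bar{x})^2}}

where s_y is the standard deviation of the calibration residuals, m the slope of the calibration line, n the number of calibration points, k the number of replicate measurements of the unknown, and x_i, y_i the calibration data. This is a reconstruction following the linked article, not a verbatim copy of the original figure.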
Say we want students to memorize this formula. The easiest test is a drag-and-drop, where students
have to place variables in the right place.
I.3.2 Test 1: Drag-and-drop
Here is a view of the drag and drop test.
The number of possible arrangements in this particular case is 6! = 720. The first issue is that some
arrangements are equivalent: certain variables can be swapped without changing the formula.
This kind of issue is manageable at this stage because only two options are affected, but the
bigger the formula, the trickier this issue becomes. The cloze test is almost impossible to apply to
literal formulae because of the many different ways of writing them. This is why validating a
formula through a cloze test is easier via its numeric application.
I.3.3 Test 2: MCQ of increasing difficulty
The second kind of test relies on MCQs. In this test, students have to find the right place for each
variable from among several proposed options. It is possible to increase the difficulty of the test by
increasing the number of distractors in the MCQ (distractors are the wrong answers in an MCQ), or by
making hard distractors, i.e. wrong answers very similar to the right answer.
I.3.4 Test 3: Cloze tests
Finally, the cloze test is the most reliable test to assess whether the student actually knows the
formula. As we discussed before, the numeric application is the easiest way to check whether the
formula is known. If we want to stick to the literal formula, its structure has to be
imposed to avoid the issue, raised previously, of multiple correct literal writings.
Another problem is the definition of variables through letters like a, m, n. If the student is
given both the names of the variables and the structure of the formula, then the cloze test is not much
more than a drag-and-drop test from a cognitive point of view. The cloze test should therefore proceed
as a sequence of steps (a sketch of this staged sequence follows the list below). In the first step,
students have to define the variables that are to be used.
• Cloze test, step 1: letters to represent the variables
The student has to give the different letters of the formula: m, n, k, etc.
• Cloze test, step 2: definition of the variables.
For a given letter, m, the student has to write its meaning: m = slope of the line
• Cloze test, step 3: structure of the formula
Once variables are defined, students have to write the formula. A part of the structure of the formula
can be given to help, like the square root. Students will have to follow an imposed structure of the
formula for the cloze test to work.
• Cloze test, step 4: Numeric application
Once the formula is accepted, they have to apply it, with given values for variables.
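A minimal sketch of this staged sequence (prompts and expected answers are illustrative; a real module would normalize notation more carefully):

    # Each stage of the cloze sequence must be passed before the next one
    # unlocks, since later steps presuppose the earlier ones.
    STAGES = [
        ("step 1: variable letters", {"m", "n", "k"}),
        ("step 2: meaning of m", "slope of the line"),
        ("step 3: formula (imposed structure)",
         "sy/m*sqrt(1/n+1/k+(y0-ybar)^2/(m^2*sxx))"),
        ("step 4: numeric application", 0.12),
    ]

    def run_cloze(answers):
        for (label, expected), given in zip(STAGES, answers):
            if isinstance(expected, set):
                ok = set(given) == expected
            elif isinstance(expected, float):
                ok = abs(given - expected) < 1e-9
            else:                # compare strings ignoring spaces and case
                ok = (given.replace(" ", "").lower()
                      == expected.replace(" ", "").lower())
            print(label, "->", "pass" if ok else "fail")
            if not ok:
                return False
        return True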
These tests (drag-and-drop, MCQ, and cloze) can be applied in various contexts. We took this
formula as an example because it is one of the most common objects to memorize, but there are
many other objects that can be memorized this way.
These tests are very common among memory-oriented learning platforms. The variations
among these platforms, like Memrise, Iknow.jp, Mnemosyne, and Quizlet, lie in the sequence and the
frequency of these tests. For a given memorization task, students start with easy tests, and the tests
become increasingly difficult as students provide correct answers. Regarding frequency, it is important
to keep in mind that the fact that a student successfully answered a test at time t does not mean that
they can do it correctly a couple of hours later. It is important to distinguish between short-
and long-term memory. For instance, in Memrise, there are time windows for a given object during
which it is not possible to test it. It is only after having successfully answered the most difficult test at
different times (t0 = 0, t = t0 + 3 h, t = t0 + 3 days) that the object is considered lodged in long-term memory.
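A sketch of this scheduling logic (ours, for illustration; the intervals mirror the example above):

    # Memrise-style spacing: after each success the next review is pushed
    # further out (0 h, 3 h, 3 days); before that time the item is locked.
    from datetime import datetime, timedelta

    INTERVALS = [timedelta(0), timedelta(hours=3), timedelta(days=3)]

    class Item:
        def __init__(self, name):
            self.name, self.stage, self.due = name, 0, datetime.now()

        def testable(self, now):
            return self.due <= now     # locked until the interval has passed

        def record_success(self, now):
            self.stage += 1
            if self.stage >= len(INTERVALS):
                print(self.name, "considered lodged in long-term memory")
            else:
                self.due = now + INTERVALS[self.stage]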
What Memrise does for language learning, we hope to do for science learning. Basically, Memrise
helps students make elementary associations between words, i.e., it helps students learn word lists.
Users can only choose the word lists; they have no control over the choice of tests, their
sequence, or their frequency, and some of the imposed tests are meaningless in science. Our memorizing
tool is therefore a mix between quiz makers, like Hot Potatoes, and memory-based tools like Memrise.
Figure 6: Associating cognition and competence to quizzes
An additional functionality that could be added to this kind of tool is the notion of skill or competence. It is
of course subject to debate, and we present here a very simple way of looking at it. In our case, all
the tests that we design will be associated with a skill. For instance, the formula cloze test is
associated with the skill “knowing how to compute error bars in calibration curves.” Skills can
themselves be nested within higher-order skills. In our case, “how to compute error bars in calibration
curves” is nested in the skill “how to perform a Bradford assay”. From a practical point of view, the user, a
teacher, should be able to add this notion of skill as a tag on a test, and create a hierarchy among tags
to account for skills being nested in one another. From the student’s point of view, a list of skills appears
on their dashboard, and the more tests they complete successfully, the more the associated skills are
considered acquired.
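A minimal sketch of such nested skill tags (the names follow the example above; the rollup rule, a simple pass fraction, is our own simplification):

    # Each test is tagged with a skill; skills roll up to parent skills, and
    # mastery is the fraction of associated tests passed so far.
    skills = {
        "perform a Bradford assay": None,     # root skill
        "compute error bars in calibration curves": "perform a Bradford assay",
    }
    results = {
        "formula cloze test": ("compute error bars in calibration curves", True),
        "dilution MCQ": ("perform a Bradford assay", False),
    }

    def mastery(skill):
        relevant = []
        for tag, passed in results.values():
            s = tag
            while s is not None:      # a test counts for all ancestor skills
                if s == skill:
                    relevant.append(passed)
                    break
                s = skills[s]
        return sum(relevant) / len(relevant) if relevant else 0.0

    print(mastery("perform a Bradford assay"))   # 0.5: one of two tests passed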
I.4 A need for critical thinking in experiments: the implementation of a
virtual laboratory
Although memorizing is an essential part of the learning process, it cannot alone develop creativity
and critical thinking. Tools must be developed to simulate simple experiments. In the context of
science, these tools are often virtual laboratories.
For instance, the Mastering suite from Pearson offers virtual textbooks in various scientific domains,
ranging from engineering to biology. Some virtual laboratories are associated with these textbooks:
microscopy, molecular biology, systematics, ecology, genetics. They allow students to explore the
world of molecular biology through costless virtual experiments. Such environments can be quite
realistic. Usually based on Flash, they allow students to manipulate virtual chemicals and biological
material through a drag-and-drop approach. Advanced virtual laboratories allow a certain degree of
freedom: experiments can fail if the experimental choices are inappropriate.
I.4.1 Virtual laboratories should include a student model
However, the main drawback of these virtual laboratories is that there is no associated student model.
Students play around with the buttons, but it is difficult to know what the student is thinking from this
“Push the button” approach. Playing randomly with virtual chemicals does not foster critical thinking
either. Moreover, another issue is the artificial intelligence behind the simulator, which is usually
simple even in the case of advanced virtual laboratories. Creating an authoring tool to enable teachers
to simulate virtual experiments is a challenging task.
I.4.2 Human vs. Automated evaluation of experimental choices
There are two approaches to a robust virtual laboratory: a fully automated virtual laboratory,
and a human-based virtual laboratory. In the human-based approach, students propose an experiment
virtually, a tutor reviews it, and then proposes an outcome that the students have to analyze. This
approach solves the issue of building a complex simulator, but raises the issue of scalability since it
implies finding competent tutors. Under this approach, the virtual laboratory is little more than a
platform for exchanging documents: PDFs, videos, etc.
The main difficulty with completely automated simulators is that replicating the complexity of
experimental environments is resource-intensive, due to the many combinations of chemicals and
experimental conditions (like temperature and time) that students might manipulate. We highlight the
fact that this simulator is not supposed to model biological or chemical processes; rather, it is about
learning and remembering. If the models underlying the simulator are too complex to be understood by
students, then we miss the point. We suggest giving students the freedom to modify only one or two
parameters at a time, such as temperature or choice of chemical. In some ways it would behave like an
MCQ, but instead of feedback in the form of “Correct answer” or “Wrong answer”, the
feedback would represent the outcome of the experimental choice in the form of an image, a graph, or
another result. This kind of approach already exists; Hot Potatoes is an example. Based on these
results, the students would have either to change the experimental conditions or to go further in the
experiment. In the latter case, the next question would depend upon the previous experimental choice.
Every aspect of an experiment could be divided into small parts and taught so that students get
mastery of the whole process. To avoid students answering randomly and not taking the output
seriously, there should be a penalty for making too many poor experimental choices. A
commonly followed approach is to associate a cost with every experimental choice, whether it is relevant
or not, as in the Mastering Biology virtual labs. This would give students a better idea of how a real
experiment works and would avoid the “randomly push the button” approach that we often see in
online learning. Instructors could also impose budgetary constraints on students to require them to think
about the costs of performing a given experiment.
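A sketch of this budget mechanism (action names and costs are invented for illustration):

    # Every experimental action, relevant or not, debits a budget, which
    # discourages the "randomly push the button" strategy.
    class VirtualLab:
        def __init__(self, budget=100):
            self.budget = budget
            self.log = []           # every choice is recorded (see below)

        def act(self, action, cost):
            if self.budget < cost:
                raise RuntimeError("budget exhausted: plan your experiment!")
            self.budget -= cost
            self.log.append(action)
            return f"{action} done, {self.budget} credits left"

    lab = VirtualLab(budget=20)
    print(lab.act("prepare dilution series", 5))
    print(lab.act("run spectrophotometer", 8))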
Finally, unlike in most existing virtual laboratories, student choices would be recorded,
regardless of their relevance. After students have completed the virtual experiment and understood all of its
aspects, we could test what they have learnt and memorized from it by following the same memory-
based approach we have described previously. This virtual experiment simulator is not much more
than a quiz maker with complex feedback and pre-designed paths to go from one experimental choice
to the next. Of course, an associated authoring tool will be provided so that teachers can design any
experiment they like. This authoring tool is what distinguishes our virtual lab from existing
stand-alone solutions.
I.4.3 Collaboration in virtual experiments
We have described here a very individualistic approach to learning through the virtual lab. We
could add a social learning aspect to it. Students would work in teams and design the experiment
together, within budgetary constraints, as in the individual approach. They could gather in online
study rooms, as on getstudyroom.com, to design the experiment together. Links towards existing
solutions like Study Room should be included in the virtual lab to avoid having to develop features that
already exist elsewhere.
Figure 7: Fostering collaboration: links towards Study Room or other existing online solutions
I.4.4 Usual learning environment features
Of course, student activity, whether on the memorization side or on the exploration
side of the virtual laboratory, should be recorded and displayed on a dashboard. Students could choose
topics to learn or experiments to do, and could also earn medals while learning. Ongoing topics and
experiments are stored in the student’s library. We could also consider memorization or experimental
design competitions, as in Memrise. Basic social networking tools such as a mailbox, friend finding, and
eventually tutor finding would also be available. Since these aspects of the platform are not specific to
our project, we only mention them as future possibilities. They appear here because some of them
are shown in the mock-up.
Figure 8: Dashboard
Appendix II - Additional resources
II.1 Links towards video tutorials
Serial dilution in Bradford Assay: http://cfscience.net/DNLE/Serial_Dilutions/Serial_Dilutions.htm
Providing the context: http://cfscience.net/DNLE/Protein_PreLab/Protein_PreLab.htm
Bradford assay procedure: http://cfscience.net/DNLE/Bradford_Assay/Bradford_Assay.htm
Video tutorial showing the real Bradford assay: http://www.youtube.com/watch?v=5It_AEOTYoM
II.2 Google presentation link
https://docs.google.com/presentation/d/1JAODI4rssuSR9VXw-1NUYHiCCsUMo-
Zw_kOeoHkFn8E/edit
II.3 Prezi link
http://prezi.com/ohf_brwdy54g/science-team-final-project-for-
dnle/?auth_key=77084dcb9f84fa7f6071f63939cca38ed5d06c8f&kw=view-ohf_brwdy54g&rc=ref-
17313360
II.4 Link to the mock-up
Our project mock-up:
https://www.dropbox.com/sh/mdbhboj0db16q6b/VVcAMcAncB
II.5 Bibliography
European Commission (2012). Commission Staff Working Document: Assessment of Key Competences in
initial education and training: Policy Guidance. Accompanying the document: Communication from the
Commission, Rethinking Education: Investing in skills for better socio-economic outcomes (The key
competences for European citizens in the knowledge society).

Google developers. (2012, September 27). Retrieved December 15, 2012, from Java Scripts Overlays:
https://docs.google.com/document/d/1r8cJ1ponVvuLdaJcfwbR5d3GlmlbHEnVpwYw0grpE5M/edit#

Shapiro, H., Lauritzen, J. R. K., & Irving, P. (2011). Emerging Skills and Competences: A transatlantic study.

Khan Academy. (n.d.). Retrieved December 15, 2012, from Khan Academy: http://www.khanacademy.org/

Anderson (1990). Cognitive modeling and intelligent tutoring. Artificial Intelligence, 42(1), 7-49.
http://act-r.psy.cmu.edu/papers/119/CogMod_IntTut.pdf

Brandstädter, K., Harms, U., & Großschedl, J. (2012). Assessing System Thinking Through Different
Concept-Mapping Practices. International Journal of Science Education, 34(14), 2147-2170.
http://dx.doi.org/10.1080/09500693.2012.716549

Brownell et al. (2012). Undergraduate Biology Lab Courses: Comparing the Impact of Traditionally Based
“Cookbook” and Authentic Research-Based Courses on Student Lab Experiences. Journal of College
Science Teaching, 41(4).
Cobb et al. (2009). The Learning Gains and Student Perceptions of a Second Life Virtual Lab. Bioscience
Education, 13(1).

Corbett et al. (2008). Intelligent Tutoring Systems. Springer Berlin Heidelberg, 5091(6), 31-40.
http://www.springerlink.com/content/b4197184w8667748/

Dadić (2010). Intelligent Tutoring System for Learning Programming. In Intelligent Tutoring Systems in
E-Learning Environments: Design, Implementation and Evaluation, p. 166.
http://www.igi-global.com/viewtitlesample.aspx?id=45547

Feng et al. (2010). Student modeling in an intelligent tutoring system. In Intelligent Tutoring Systems in
E-Learning Environments: Design, Implementation and Evaluation, p. 208.
http://nth.wpi.edu/pubs_and_grants/papers/2008/Feng_Stude...

Gershon (2005). Computer adaptive testing. Journal of Applied Measurement, 6(1), 109-127.
http://www.ncbi.nlm.nih.gov/pubmed/15701948

Georgiadou, E., Triantafillou, E., & Economides, A. A. (2006). Evaluation parameters for computer-adaptive
testing. British Journal of Educational Technology, 37(2), 261-278.
http://www.blackwell-synergy.com/doi/abs/10.1111/j.1467-8...

Girault (2012). Characterizing the Experimental Procedure in Science Laboratories: A preliminary step
towards students’ experimental design. International Journal of Science Education, 34, 825-854.

Holt, C. E., Abramoff, P., Wilcox, L. V., & Abell, D. L. (1969). Investigative laboratory programs in biology: A
position paper of the Commission on Undergraduate Education in the Biological Sciences. BioScience, 19,
1104-1107.

Kazi, H. (2007). A Diverse and Robust Tutoring System for Medical Problem-Based Learning. Supporting
Learning Flow through Integrative Technologies, 162, 659-660.

Kloser et al. (2011). Integrating Teaching and Research in Undergraduate Biology Laboratory Education.
PLoS Biology, 9(11).

Modell, H. I., & Michael, J. A. (1993). Promoting active learning in the life sciences classroom: Defining the
issues. Annals of the New York Academy of Sciences, 701, 1-7.

OECD (Organisation for Economic Co-operation and Development). (2006). Evolution of student interest in
science and technology studies (Policy Report). Paris: Author.

Schulze et al. (2000). Andes: An active learning intelligent tutoring system for Newtonian physics. THEMES
in Education, 1(2), 115-136. http://www.public.asu.edu/~kvanlehn/Stringent/PDF/00ANDES...

Shute et al. (1995). Intelligent tutoring systems: Past, present and future. In Handbook of Research on
Educational Communications and Technology. Scholastic Publications.

Suebnukarn (2006). A Bayesian approach to generating tutorial hints in a collaborative medical
problem-based learning system. Artificial Intelligence in Medicine, 38, 5-24.

Sykes (2004). A Prototype for an Intelligent Tutoring System for Students Learning to Program in Java.
Advanced Technology for Learning, 1(1). http://www.actapress.com/PaperInfo.aspx?paperId=20654
Viswanathan (2005). A Comparison of Model-Tracing and Constraint-Based Intelligent Tutoring Paradigms.
The International Journal of Artificial Intelligence in Education, 15(1).

Zumbach et al. (2006). Learning Life Sciences: Design and Development of a Virtual Molecular Biology
Learning Lab. Journal of Computers in Mathematics and Science Teaching, 25, 281-300.
Dnle final project

  • 1. Science Team Designing a New Learning Environment Course A Plug-in for Science Enhancing the 21st century education: The intelligent tutoring spirit Science team proposal Piso-MOOC Designing a New Learning Environment Course in for Science-oriented MOOCs (PiSo Enhancing the 21st century education: The intelligent tutoring spirit Science team proposal 16 December 2012 oriented MOOCs (PiSo-MOOC)
  • 2. Science Team Piso-MOOC Designing a New Learning Environment Course Science team - Designing New Learning Environments Venture Lab Fall Course 2012
  • 3. Science Team Piso-MOOC Designing a New Learning Environment Course Summary 1. Our Rationale....................................................................................................................................... 5 2. Our System function and features ...................................................................................................... 5 2.1. State of the art ............................................................................................................................. 6 2.2. Technological requirements......................................................................................................... 6 2.3. Learning environment based on scientific research .................................................................... 7 2.4. Higher order learning (implementation of Bloom’s taxonomy)................................................... 7 2.5. Learning sequence (curriculum sequencing)................................................................................ 7 3. Our Target Audience and Learning Conditions ................................................................................... 7 3.1. Target audience............................................................................................................................ 7 3.2. Student with disabilities............................................................................................................... 8 3.3. Student profiles............................................................................................................................ 8 4. Our Business Model/implementation.................................................................................................8 4.1. Required technological and human resources............................................................................. 8 4.2. Business model............................................................................................................................. 8 4.3. Contingency planning................................................................................................................... 9 Appendix I - Prototype description: Bradford assay laboratory exercise ............................................... 9 I.1 Automated tests: when and why? ................................................................................................. 9 I.2 Automated assessment.................................................................................................................... 10 I.2.1 Testing a sequence of tasks....................................................................................................... 10 I.2.2 Learning the right gesture......................................................................................................... 13 I.2.3 Memorizing formulae................................................................................................................ 14 I.3 Evaluation as a teaching methodology ........................................................................................ 15 I.3.1 Memory target: Bradford assay, error bars formula............................................................. 15 I.3.2 Test 1: Drag-and-drop ........................................................................................................... 15 I.3.3 Test 2: MCQ of increasing difficulty ...................................................................................... 
16 I.3.4 Test 3: Cloze tests.................................................................................................................. 16 I.4 A need for critical thinking in experiments: the implementation of a virtual laboratory............ 18 I.4.1 Virtual laboratories should include a student model............................................................ 18 I.4.2 Human vs. Automated evaluation of experimental choices ................................................. 18 I.4.3 Collaboration in virtual experiments..................................................................................... 19
  • 4. Science Team Piso-MOOC Designing a New Learning Environment Course I.4.4 Usual learning environment features.................................................................................... 20 Appendix II - Additional resources ........................................................................................................ 21 II.1 Links towards video tutorials ...................................................................................................... 21 II.2 Google presentation link............................................................................................................. 21 II.3 Prezi link ...................................................................................................................................... 21 II.4 Link to the mock up..................................................................................................................... 21 II.5 Bibliography................................................................................................................................. 21
  • 5. Science Team Piso-MOOC Designing a New Learning Environment Course 1. Our Rationale Basic understanding of science and learning to learn have been recognized as the key competencies for the 21st century (The key competences for European citizens in the knowledge society). As described in this document, “competence in science refers to the willingness to use the body of knowledge and methodology employed to explain the natural world in order to identify questions and to draw evidence-based conclusions”, while “learning to learn is pursuing and persisting in learning, organising one’s own learning, including through effective management of time and information, both individually and in groups”. Moreover, critical thinking and problem solving skills have been labeled with the highest ranking among the skills with increasing demands in today’s society in both USA and EU (Emerging Skills and Competences - A transatlantic study). Those skills include effective reasoning, systems thinking, making judgements and decisions based on the critical analysis of evidence and identifying and asking significant questions that lead to better solutions. Training the new generation of experts according to these guidelines is therefore the global aim of education systems throughout the world. Including scientific training in the curriculum could provide a basis for the development of key competencies for 21st century professionals. Unfortunately, many students switch from STEM majors early in their educational career, and this attrition rate hasn’t changed much in the past few decades. Often, even students studying sciences lack a deep understanding of the theory behind the techniques that are used in their respective fields, have problems organizing their learning and show inadequate critical thinking and problem solving skills. A possible cause for this problem could be insufficient or poorly implemented laboratory exercises. Instead of deepening the understanding, laboratory exercises encourage repetition, following the procedures without understanding and offer no challenge or possibility for problem-based inquiry. In addition, high costs of laboratory equipment and consumables greatly inhibits any innovation in this type of learning (learning by doing and learning through research). Our proposal describes a novel learning environment based on scientific evidence offering an experimental training in a virtual laboratory as an added value to existing campus-based theoretical courses or recently extremely popular MOOCs (Massive Open Online Courses). Our design would offer a tool for teachers of laboratory sciences to engage students outside of the laboratory. This could serve as a preview to a laboratory exercise or as a review of a laboratory exercise. In addition, in environments that do not have access to a science lab, our design can serve as a way for students to learn laboratory procedures without a large investment in resources. Moreover, our modular design allows for high flexibility and evolvability to suit any content and learning objectives, with special emphasis on the development of deep understanding of material, critical thinking and learning to learn competencies. Finally, the flexibility of our design could allow for implementation of this learning environment to any discipline and educational level, allowing the teaching of meta-competences in a broader sense to a much wider audience. 2. 
Our System function and features We believe that laboratory exercises are crucial for the development of scientific thought and understanding. Our proposal enhances the experience for the students and the instructors despite increasing demands on instructors due to class size and time limitations. Our design aims to offer teachers a way to reach larger numbers of students, as well as to engage those students outside of labs and classrooms.
  • 6. Science Team Piso-MOOC Designing a New Learning Environment Course We base our system on the needs we identified for modern science teaching: • clear visualizations of laboratory procedures • easy memorization of lab tasks, supported by audio and video material, and repetition • critical thinking and understanding of the procedures prior to class, using assessment tools and exercises • large-scale and low-cost open and accessible tools • intuitive, fun, and learner-centered learning environment that engages students in content and develops crucial skills 2.1. State of the art The recent success of Massive Open Online Courses has highlighted the importance of developing online based learning tools. MOOCs enable teachers to efficiently deliver pedagogical content and assignments. Nevertheless, the efficiency of this learning approach is often questioned, since it is not clear what the benefits of learning a topic through a MOOC are. In MOOC science classes, remembering and developing critical thinking are essential. However, to our knowledge, there is no automated procedure to assess precisely what students remember from the course, and how well they would perform in a situation where they have to apply their knowledge to real problems, by having to design experiments by themselves for instance. We will describe a tool that assesses students’ automatically, with a focus on memory and creativity. In many ways, the tool belongs to a large family of quiz makers like Hot potatoes, Adobe Captivate, and so many others. The difference lies in the fact that we add a memory component to it; our philosophy is very close to the one that underlies cognition-based learning platforms like Memrise or Iknow.jp. The main difference lies in the focus; Memrise is designed towards learning vocabulary in foreign languages, i.e. word lists. Memrise, permits little control over the nature of tests, and in most quiz makers, there is no control upon the sequence and the frequency of testing. We would therefore combine the advantages of classic quiz makers and Memrise-like tools in one platform to create a powerful online learning tool. Moreover, we would like this tool to be used in the context of virtual experiments in order to help students acquire critical thinking over what they do. Several virtual laboratories have been developed in order to perform low cost experiments, but as of yet, they have had very limited success, partly because the value they add to the learning process is unknown. In our virtual laboratory, there will be a strong focus on student evaluation in order to avoid this problem. To enhance the learning experience for the learner, we plan to integrate knowledge maps and game elements. Knowledge maps have already been used to guide self-directed learners in Khan Academy Math lessons , where the implementation was done using Google maps overlays. Gamification in education is gaining more and more visibility and impacts course design and curriculum development (see infographic), both in online systems like the Khan Academy, but also in off-line systems, such as a school-based A Quest to Learn. We hope to provide a learning environment where the learner has instant feedback of its progress towards mastery and where we offer reminders and recommendations of suitable activities to enhance both memorization and understanding of the content. 
We hope to create an environment where both instructors and students can set specific learning goals and where every action taken towards a learning goal is recognized by giving points and badges to motivate the learner. In addition, we plan on integrating existing social networks to allow users to share their success stories, as well as to compete in challenges with other users. These elements will hopefully provide an interesting and dynamic social learning environment that would attract users to use the system more frequently. 2.2. Technological requirements Our design would be optimized for online learning, requiring a stable Internet connection, but an offline version would be made available for students with unreliable Internet access. The learning
  • 7. Science Team Piso-MOOC Designing a New Learning Environment Course environment is designed to be used on most common learning devices, including tablets, smartphones and personal computers. social networking will be integrated to. 2.3. Learning environment based on scientific research To implement our design, we draw on research in both intelligent tutoring systems and computer adaptive testing. Intelligent tutoring systems has been a very intensive field of research in the seventies (Corbett et al. 2008). Though recently research has waned, findings in the field have been applied to various topics, ranging from programming (Sykes 2007, Dadic 2010) to physics (Schulze 2000) and medical studies (Suebnukarn 2006, Kazi 2007) . The computer adaptive testing approach we draw on (Gerson 2005, Georgiadou 2006) follows the philosophy of cognitive tutoring systems that highlight the importance of the student cognition, including memory and misconceptions (Aderson 1990, Shute et al.1995, Colace 2006, Feng 2012, Branstadter 2012). Regarding virtual laboratories, we refer to the inquiry based pedagogy that has been encouraged from the seventies (Holt et al. 1969, OCDE 2006 Kloser 2012, Brownell 2012). The actual implementation of this pedagogy with numeric technologies has been described in the virtual laboratories literature (Zumbach 2006, Cobb 2009, Girault 2012). 2.4. Higher order learning (implementation of Bloom’s taxonomy) Our system will address the following learning objectives: • remembering will be enforced through cognition-based features, such as memorization • understanding will be evaluated through quizzes • applying will be presented through problem-based exercises • analyzing will be developed through comparing the results of virtual experiments • evaluating will be trained by choosing and prioritizing the steps of procedures in a virtual experiment • creating will be taught through experimental design and planning of learning activities • learning will globally enhanced through mastering both content-specific knowledge and acquiring meta-competencies (critical thinking, creativity, learning to learn) 2.5. Learning sequence (curriculum sequencing) Our learning platform is aimed at fostering memorization and learning through trials and error via virtual experiments. It will include an authoring tool to enable teachers to design and implement their own pedagogy. Teachers and self-learners will be responsible for the learning sequence that will be implemented, as some people prefer to experiment before memorizing whereas others prefer the other way round. Ultimately, however, learners following the same module will be evaluated in the same way. 3. Our Target Audience and Learning Conditions 3.1. Target audience We are targeting students attending MOOCs or institutions where practical laboratory experience is not available for financial or other reasons. Even in developed countries, the expense of
  • 8. Science Team Piso-MOOC Designing a New Learning Environment Course laboratory exercises is a limiting factor to having each student perform a laboratory exercise in a cost- efficient manner. Initially, our design would be geared towards university undergraduates in scientific fields via their instructors who would be able to design an optimal learning environment (by either customizing their own module or using an “off-the-shelf” version). Our design could also be implemented in schools or for a general audience, and can be customized for use in other fields that might benefit from memorization, procedural planning, and automatic evaluation modules. 3.2. Student with disabilities The complementary use of videos with narratives, subtitles, and supporting material will allow equal access to students with disabilities, particularly hearing or vision disabilities. The customizable design, which can be modified to suit individual learning pace, offers a nurturing environment for students with learning disabilities, including autism spectrum disorders. 3.3. Student profiles The problem of developmental-appropriateness of our design is solved by the fact that the instructor will create the content, ensuring optimization for the students’ learning stage. Subtle modifications can be easily introduced to teach the same content to different education levels or developmental stages of students. More specifically, in the beginning of each virtual exercise the student will be presented with an evaluation of previous knowledge and they will be asked to auto- evaluate their motivation and learning skills. According to both content-related aspects and self- directed learning level, the student can be offered different content depths and more or less complex activities, and/or have a learning path tailored towards individual objectives and learning styles. 4. Our Business/Implementation Model 4.1. Required technological and human resources Development of our design requires curriculum designers, educators and programmers to form a solid interdisciplinary team. The team should also have an expert who would be offering workshops to educate instructors on how to use our design and integrate it in the existing framework. 4.2. Business model Although we would be a non-profit entity, we would use a for-profit model to increase the business’s self-sufficiency and reduce our need for outside donors. We would offer two tiers of service: for instructors interested in experimenting with the platform, a limited version would be free; for instructors interested in using a fully featured version for academic classes, we would offer renewing yearly and quarterly subscriptions that would track the academic year. Outside philanthropists could also purchase subscriptions for other groups of users. For example, a charitable organization might sponsor a subscription to the platform for a school in need. We hope to initially launch through a crowdfunding source like Kickstarter, which would allow us to present a prototype (and demonstrate value to potential users) before making large investments. This funding would allow us to develop the platform and create several stock modules that would both demonstrate to potential users what the platform is capable of and be products investors could immediately implement, if desired. As our module bank grew, we would open the platform up for subscriptions, giving discounts and other rewards to initial investors. 
One of the goals would be to foster instructor interaction through the platform to enable the exchange of knowledge between instructors and promote the diffusion of innovation. Modules created by any instructor would
  • 9. Science Team Piso-MOOC Designing a New Learning Environment Course be public by default, and instructors would have the ability to discuss, adopt, and adapt modules created by other instructors. Once subscribed, instructors can create as many modules as they need and for as many different courses as they desire, which would incentivize subscribers to invest in using the platform. We would also issue regular tutorials for both students and instructors on how to make their experience using the platform as valuable as possible. We plan to gain additional revenue from “swag” sales and events between other potential partners in the learning space. 4.3. Contingency planning Although subscription models can work well once implemented, probably the most difficult part would be demonstrating the value of the platform to initial subscribers. Many crowdfunded projects fail because the product was ultimately more expensive to develop than anticipated or because the business could not transition into a sustainable business model. To avoid this problem, we plan on building collaborative partnerships with private and public organizations, including governments and ministries of education interested in enhancing the current educational system, as well as private organizations that would be interested in increasing their visibility through backing up projects such as ours. In the long-run, attaining a critical mass of users to our product would be essential for success. To build a growing and active community, we will invest a significant amount of time in communications and public relations, primarily through social networks and targeted advertising, to ensure the influx of interested instructors and students. Appendix I - Prototype description: Bradford assay laboratory exercise For the purposes of our demonstration, our target audience is college freshmen or sophomores with some basic chemistry background.We have focused on one specific laboratory procedure, Bradford protein assay, to illustrate how our proposed system works. Bradford assay is used to measure the total protein concentration in the solution. In this assay, a dye Coomasie blue shifts from its red form to the blue form once bound to proteins. The change can be quantified by spectrophotometry - the increase of absorbance at 595 nm is proportional to the concentration of proteins in the sample. Bradford assay laboratory exercise offers several higher-order concepts the students should learn, including experimental design, statistical analysis, standard error, using a reference curve, critical thinking and collaboration. I.1 Automated tests: when and why? Evaluating student skills is one of the main issues associated with online learning platforms. This evaluation can be either automatic or done by humans. In the latter case, the main problem is scalability. Human evaluation implies that either an instructor or a peer spend some time evaluating students outcomes. An instructor can evaluate only a limited number of students and peer evaluation also has its drawbacks. Although it is possible to assess whether homework has been submitted and whether evaluation criteria have been followed through peer evaluation, most peers, particularly for sciences courses, do not have the knowledge required to assess whether a skill has been acquired. There is therefore a strong need to improve automated evaluation techniques, especially for scientific skills. Many skills cannot be assessed through automated evaluation. 
For instance, there is no automated way to assess the quality of an argument or of a demonstration. Automated evaluation is therefore restricted to a subset of skills, and it is mostly based on tests like multiple choice questions, cloze tests,
  • 10. Science Team Piso-MOOC Designing a New Learning Environment Course matching tests, among others. This approach is widely followed in MOOCs like Udacity, edX, and Coursera. Here, we will discuss the best uses of automated tests for assessing scientific skills. We will use the Bradford assay as an example. A test may have different objectives. Here are three different aims of a test: • Evaluating the student before the beginning of a course. This approach allows an instructor to assess whether a student has the prerequisites to attend the course and also provides the instructor with an opportunity to adapt their teaching methods to the needs of the students. • Evaluating the student after a course. This type of assessment may help determine whether specific skills have been acquired. • Evaluating during the course. This can provide an opportunity to check whether the concepts or procedures, taught through tutorials for instance, have been correctly understood. Evaluations can also be viewed as teaching methods as they require memory recollection and therefore strengthen skill acquisition. This philosophy underlies sites such as Memrise. We will consider two types of skills: cognitive and procedural. Both types of skills are needed to perform a Bradford assay. Procedural skills include the correct gestures necessary to perform the assay, like preparing dilutions and manipulating chemicals. Cognitive skills include mathematical computations, such as applying calculus and proper formula use. Figure 1: Left, Bradford Assay video tutorial on Youtube. Right, theoretical step-by-step explanation (Adobe Captivate, thanks to Dr. Karen Champ) The first step, before evaluation, is the passive assimilation of knowledge through lectures, video tutorials or any kind of document. To illustrate this aspect of the work we designed some introductory lectures to the Bradford Assay explaining its context, lab procedures and the correct use of formulae. In addition to these explanations, students will go through a video tutorial showing the whole Bradford assay. After having watched the videos, students can proceed to the evaluation itself. I.2 Automated assessment I.2.1 Testing a sequence of tasks A Bradford assay is a sequence of tasks, beginning with the dilution of the original solution and ending with the calculation of the protein concentration of an unknown sample. Before assessing whether students know how to perform each individual task, an instructor may want
  • 11. Science Team Piso-MOOC Designing a New Learning Environment Course to evaluate if the students know the correct sequence of the tasks that are to be performed. One easy way to assess this automatically is to use a drag-and-drop test. In this case, a name would be given to each particular task. This would include names such as "dilution", "calculation of error bars", etc. The tasks are then presented to the student in a random order, and the students are required to “drag and drop” (i.e. rearrange) the tasks to put them in the correct order. This exercise makes sense in a number of situations where the sequence is not obvious. However, in the case of the Bradford assay, this type of assessment is not challenging enough due to the limited number of tasks. Common sense is sufficient to know that one cannot create the calibration curve before having done the dilutions. Figure 2:Learning a sequence of tasks: the drag-and-drop matching test Moreover, the fact that a student can rearrange the tasks in the correct order does not mean that they would be able to recollect all of these tasks. For example, a student may be able to place tasks D, C, B, A in the correct order A, B, C, D if asked to do so. However, if asked to perform the Bradford Assay, the student may forget task C and propose the sequence A, B, D. In this case, tasks are in the proper order but one is missing. The student is, therefore, not able to perform the Bradford assay by himself. This test is too easy to be valuable In comparison, it can be difficult to use a cloze test to assess whether a student knows a specific sequence. In a cloze test, the student must type exactly (or nearly exactly) what the computer expects. In some cases, a small range of answers are acceptable. One difficulty is that the designer of the cloze test cannot easily predict the way in which students will refer to a given task, as well as what they will consider to be a task to be included versus a subtask that should not be included. Even if the student knows the total number of tasks he is expected to list, it is unlikely that the student’s answer will match the computer’s expectations. This is a major drawback of current automated tests. However, we can use the cloze test as a final test and assume that students can learn by heart our way of referring at the different steps of the experiment. One possible approach to avoid this cloze test issue is to rely on a sequence of multiple choice questions. With simple MCQs, users must choose the correct answer to a question from among four (or more) different answers. After submitting the answer, the user is usually provided feedback, at least whether his answer was correct or not. With a sequence of MCQs, the approach is slightly different. This is because the user needs to answer multiple, interconnected MCQs before submitting their responses. First MCQ: What is step 1? Second MCQ: What is step 2? Third MCQ: What is step 3? After selecting an answer for each question, the user submits these for grading. If there is only one correct answer to a given test, there is a 1/96 chance (one out of 4x4x4=96 ) of randomly getting the correct answer if there are four answer choices for each included question. The higher the number of MCQs that are included in the sequence, the lower the probability that the correct answer will be selected randomly.
  • 12. Science Team Piso-MOOC Designing a New Learning Environment Course Figure 3: Learning a sequence of tasks: the MCQ This approach also has drawbacks. Since this test is more difficult than the previously described drag and drop test, it is more reliable for determining whether the student really knows the sequence. However, since there are many potential incorrect answers, it is hard to infer the associated misconception from those incorrect answers. Correspondingly, it is difficult to guide accordingly. Intelligent tutoring systems research indicates that this is a common issue associated with constraint- based models. With constraint-based models, the answer provided by the student must meet certain constraints in order to be accepted as correct. An easy way to deal with this issue is to run through the sequence proposed by the student and stop at the MCQ where the mistake occurred. This is an easy way for a computer to infer what the corresponding misconception is. This type of issue has been addressed in previous papers (Viswanathan 2005). One might argue that few differences exist between a sequence of MCQs and a list of MCQs, but this is not entirely true. When considering a list of MCQs, the MCQs are independent from one another. With a sequence of MCQs, all of the MCQs are interconnected. They all have to be answered correctly for the student to demonstrate that the sequence is known. Moreover, with a sequence of MCQs, a given question may depend on the answer of the previous MCQ, i.e., the second MCQ will depend upon the answer to the first MCQ. This approach is referred to as a Computer Adaptive Test or CAT, and it is widely used to assess the fluency of a student in a language. Since the use of CAT in experimental sciences is still subject to debate, we will not discuss this further.
  • 13. Science Team Piso-MOOC Designing a New Learning Environment Course Figure 4: Learning a sequence of tasks: the cloze test I.2.2 Learning the right gesture The whole Bradford assay can be decomposed into a sequence of individual gestures, and each gesture can be evaluated individually. Evaluating whether a student performs the right gestures is easier than evaluating cognitive skills. Even practical tasks such as the dilution or the correct use of a pipette can be assessed through automated tests. The most rigorous way of evaluating student gestures would be to have an instructor analyze a video of the student practicing the gesture, but if we want to stick to automated evaluation, there are a handful of approaches. First, a video tutorial showing the whole Bradford assay with the different gestures would be shown, with oral explanations or subtitles. Since this video can be pretty long, from five to ten minutes, there are two ways to test. Either the test takes place during the tutorial, an approach that is followed by many MOOCs like Udacity and Coursera, or it takes place after it. The first kind of test is to show a video gesture without any comment, and to ask through a MCQ whether the gesture was performed correctly or not, and if not what is wrong. Let us call this kind of test the wrong gesture analysis. Another way of using the MCQ is to show four different ways of performing it, and ask the student to assess which video shows the correct gesture. MCQ is perfectly fit for this approach, since there are not so many different ways of failing a given task. The few most common mistakes can be identified and students can be trained to detect them through the approach we described, and therefore avoid them.
  • 14. Science Team Piso-MOOC Designing a New Learning Environment Course Figure 5: MCQ and choice of the right gesture based on videos We believe this approach can be more efficient to assess student skills than real-world, in- person courses. In in-person courses, there are usually around five to ten students per instructor, if not more. It is impossible for an instructor to correct all of the students at the same time; as a result, many students may perform an incorrect gesture without being noticed by the instructor. Moreover, for some tasks, even close attention from the instructor is not enough to detect whether the task is done correctly. A typical example in the case of the Bradford assay is the use of the pipette. When sampling a liquid with a pipette, one has to avoid sampling too much liquid by pressing the button too much. This is a very common mistake among students that decreases measurement precision. It is difficult to assess whether a student is making this mistake just by looking at him. A properly designed MCQ would be more efficient in this particular case to assess whether the students know what the right gesture is. I.2.3 Memorizing formulae Performing the Bradford assay implies more than just performing the right gesture. Students have to calculate volumes in order to do the dilution properly. Then they have to create a calibration curve by measuring the absorbance of samples of known concentrations of proteins. Once the calibration curve is done, they have to measure the absorbance of the sample of interest and to deduce its concentration in protein. This last step involves calculus skills since the computation of the error bars involves the use of a fairly complex formula. Both MCQ and cloze tests are usually used in this situation. A student may be asked to give a number through a cloze test. It can be a volume, a concentration, or something else. There are two options. Either this value is exact, or it is within a range of values. The latter case is not very common, though possible, as most cloze tests require a perfect match between a student's answer and the expected answer. The advantage of using cloze tests for numerical values is that it is easy to assess automatically whether the student got the right answer. This value can be complex enough to prevent the student from finding it without applying perfectly the formula. Moreover, it can be a way to assess whether a student can do complex calculations without making mistakes. It is widely used in international tests of mathematics, computer science, and other similar disciplines.
  • 15. Science Team Piso-MOOC Designing a New Learning Environment Course The drawback of this approach is that it is not easy to assess whether a student reasons correctly. Or on the other hand, a student may know the formula but make mistakes in the calculation. It is hard to assess through cloze tests whether a student knows the formula since there are different ways of writing a formula, from the choice of the symbol to represent a variable to the order of the variables within the formula. MCQ and drag-and-drop can be an intermediary solution. I.3 Evaluation as a teaching methodology We have discussed some different methods of automatic assessment for the sequence of tasks that are to be performed in a Bradford assay, the right gestures, and the correct use of formulae. We would like to highlight that evaluation can also be a learning technique. Active recollection of memories through these tests is a more efficient way of strengthening a memory than reading or watching tutorials. This is one of the basis of the learning by doing philosophy. Websites like Memrise and Iknow.jp are based on this principle. In this approach, memories are strengthened through tests of increasing difficulty (drag-and-drop and MCQ being relatively easy tests and cloze tests being more difficult). For instance, here is the formula that describes the error bar associated to the measurements in the Bradford Assay. I.3.1 Memory target: Bradford assay, error bars formula (variables in http://en.wikipedia.org/wiki/Calibration_curve) Say we want students to memorize this formula. The easiest test is a drag-and-drop, where students have to place variables in the right place. I.3.2 Test 1: Drag-and-drop Here is a view of the drag and drop test.
  • 16. Science Team Piso-MOOC Designing a New Learning Environment Course The number of possible combinations in this particular case is 6! The first issue is that there is no difference between these two : This kind of issue is manageable at this stage because there are only two different options, but the bigger the formula is, the trickier this issue becomes. The cloze test is almost impossible to apply to literal formulas due to the many different ways of writing it. This is the reason why the validation of a formula through a cloze test is easier through its numeric application. I.3.3 Test 2: MCQ of increasing difficulty The second kind of test relies on MCQs. In this test, students have to find the right place It is possible to increase the difficulty of the test by increasing the number of distractors in the MCQ (distractors are wrong answers in a MCQ), or making hard distractors, i.e. wrong answers pretty similar to the right answer. I.3.4 Test 3: Cloze tests Finally, the cloze test is the most reliable test to assess whether the student actually knows the formula. As we discussed before, the numeric application is the easiest way to check whether the formula is known. If we want to stick to the literal formula, the order of the formula has to be somehow imposed to avoid the issue we evoked previously of various correct literal writings. Another problem is the issue of the definition of variables through letters like a, m, n. If the student is given both the names of variables and the structure of the formula, then this cloze test is not much more than a drag-and-drop test from the cognitive point of view. There should be two sequential tests. In the first one, students have to define the variables that are to be used. • Cloze test, step 1: letters to represent the variables The student has to give the different letters of the formula: m, n, k, etc. • Cloze test, step 2: definition of the variables.
These tests (drag-and-drop, MCQ, and cloze) can be applied in various contexts. We took this formula as an example because it is one of the most common kinds of object to memorize, but many other objects can be memorized this way. Such tests are very common among memory-oriented learning platforms. The variations among these platforms, like Memrise, Iknow.jp, Mnemosyne, and Quizlet, lie in the sequence and the frequency of the tests. For a given memorization task, students start with easy tests, and the tests become increasingly difficult as students provide correct answers. Regarding frequency, it is important to keep in mind that a student who successfully answers a test at time t cannot necessarily do it correctly a couple of hours later: short-term memory must be distinguished from long-term memory. In Memrise, for instance, there are time frames during which a given object cannot be tested; it is only after the most difficult test has been answered successfully at different times (t0 = 0, t = t0 + 3 h, t = t0 + 3 days) that the object is considered lodged in long-term memory. What Memrise does for language learning, we hope to do for science learning. Basically, Memrise helps students make elementary associations between words, i.e., it helps students learn word lists. Users can choose the word lists, but they have no control over the choice, sequence, or frequency of the tests, and the imposed tests are meaningless in science. Our memorizing tool is therefore a mix between quiz makers, like Hot Potatoes, and memory-based tools like Memrise.
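The sequencing and frequency logic just described can be summarized in a short sketch. The difficulty ladder (drag-and-drop, then MCQ, then cloze) and the review delays (t0, t0 + 3 h, t0 + 3 days) come from the text above; the class and method names are our own illustrative assumptions, not Memrise's API.

```python
# A minimal sketch of the sequencing/frequency logic: tests get harder as
# the student succeeds, and an item only counts as being in long-term
# memory after the hardest test has been passed at spaced intervals.

from datetime import datetime, timedelta

DIFFICULTY_LADDER = ["drag_and_drop", "mcq", "cloze"]
REVIEW_DELAYS = [timedelta(0), timedelta(hours=3), timedelta(days=3)]

class MemoryItem:
    def __init__(self, name):
        self.name = name
        self.level = 0          # current index into DIFFICULTY_LADDER
        self.cloze_passes = []  # timestamps of successful cloze tests

    def next_test(self, now):
        """Return the test type due now, or None while the item is locked
        or already in long-term memory."""
        if self.level < len(DIFFICULTY_LADDER) - 1:
            return DIFFICULTY_LADDER[self.level]
        if self.in_long_term_memory():
            return None
        # Cloze retests are time-gated relative to the first success.
        due = (self.cloze_passes[0] + REVIEW_DELAYS[len(self.cloze_passes)]
               if self.cloze_passes else now)
        return "cloze" if now >= due else None

    def record(self, passed, now):
        if not passed:
            self.level = max(0, self.level - 1)  # step back down the ladder
            self.cloze_passes = []
        elif self.level < len(DIFFICULTY_LADDER) - 1:
            self.level += 1
        else:
            self.cloze_passes.append(now)

    def in_long_term_memory(self):
        return len(self.cloze_passes) >= len(REVIEW_DELAYS)

item = MemoryItem("error-bar formula")
t0 = datetime(2012, 12, 16, 9, 0)
# Three quick successes climb the ladder, then spaced cloze reviews.
for dt in [timedelta(0), timedelta(0), timedelta(0),
           timedelta(hours=3), timedelta(days=3)]:
    test = item.next_test(t0 + dt)
    if test is not None:
        item.record(passed=True, now=t0 + dt)
print(item.in_long_term_memory())  # True after passes at t0, t0+3h, t0+3d
```

A failed test steps the item back down the ladder and clears the spaced reviews, which is one simple way to force re-consolidation.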
Figure 6: Associating cognition and competence to quizzes

An additional functionality that could be added to this kind of tool is the notion of skill or competence. It is of course subject to debate, and we present here a very simple way of looking at it. In our case, every test we design will be associated with a skill. For instance, the formula cloze test is associated with the skill "knowing how to compute error bars in calibration curves." Skills can themselves be nested within higher-order skills: in our case, "how to compute error bars in calibration curves" is nested in the skill "how to perform a Bradford assay." From a practical point of view, the user (a teacher) should be able to attach a skill to a test as a tag, and to create a hierarchy among tags to account for skills being nested in one another. From the student's point of view, a list of skills appears on their dashboard, and the more tests they complete successfully, the more the associated skills are considered acquired.
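A minimal sketch of this skill-tagging idea follows. The nesting of "how to compute error bars in calibration curves" inside "how to perform a Bradford assay" comes from the text; the acquisition rule (a success rate over a minimum number of tests) is an illustrative assumption, open to debate like the notion of skill itself.

```python
# A minimal sketch of skill tags nested in a hierarchy. A test is tagged
# with a skill; each test result is propagated up to higher-order skills.

class Skill:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent  # higher-order skill this one is nested in
        self.passed = 0
        self.total = 0

    def record_test(self, passed):
        """Propagate a test result up the skill hierarchy."""
        skill = self
        while skill is not None:
            skill.total += 1
            skill.passed += int(passed)
            skill = skill.parent

    def acquired(self, threshold=0.8, min_tests=5):
        """A skill counts as acquired after enough successful tests."""
        return self.total >= min_tests and self.passed / self.total >= threshold

# The nesting from the text: error bars are part of the Bradford assay.
bradford = Skill("how to perform a Bradford assay")
error_bars = Skill("how to compute error bars in calibration curves",
                   parent=bradford)

for outcome in [True, True, True, False, True, True]:
    error_bars.record_test(outcome)

print(error_bars.acquired())  # True: 5/6 correct over at least 5 tests
print(bradford.acquired())    # the parent skill is credited as well
```

On a real platform the teacher would create these objects through the tagging interface; the hierarchy itself is just a parent pointer per tag.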
I.4 A need for critical thinking in experiments: the implementation of a virtual laboratory

Although memorizing is an essential part of the learning process, it cannot alone develop creativity and critical thinking. Tools must be developed to simulate simple experiments. In the context of science, these tools are often virtual laboratories. For instance, the Mastering suite from Pearson offers virtual textbooks in various scientific domains, ranging from engineering to biology. Virtual laboratories are associated with some of these textbooks (microscopy, molecular biology, systematics, ecology, genetics); they allow students to explore the world of molecular biology through cost-free virtual experiments. Such environments can be quite realistic. Usually based on Flash, they let students manipulate virtual chemicals and biological material through a drag-and-drop approach. Advanced virtual laboratories allow a certain degree of freedom: experiments can fail if the experimental choices are inappropriate.

I.4.1 Virtual laboratories should include a student model

The main drawback of these virtual laboratories, however, is that there is no associated student model. Students play around with the buttons, but it is difficult to know what a student is thinking from this "push the button" approach, and playing randomly with virtual chemicals does not foster critical thinking either. Another issue is the artificial intelligence behind the simulator, which is usually simple even in advanced virtual laboratories. Creating an authoring tool that enables teachers to simulate virtual experiments is a challenging task.

I.4.2 Human vs. automated evaluation of experimental choices

There are two approaches to a robust virtual laboratory: a fully automated one and a human-based one. In the human-based approach, students propose an experiment virtually, a tutor reviews it and proposes an outcome, and the students then have to analyze that outcome. This approach avoids building a complex simulator, but raises the issue of scalability, since it requires finding competent tutors. Under this approach, the virtual laboratory is not much more than a platform for exchanging documents (PDFs, videos, etc.).

The main difficulty with completely automated simulators is that replicating the complexity of experimental environments is resource-intensive, given the many combinations of chemicals and experimental conditions (such as temperature and time) that students might manipulate. We stress that this simulator is not supposed to model biological or chemical processes; rather, it is about learning and remembering.
If the models underlying the simulator are too complex for students to understand, then we miss the point. We suggest giving students the freedom to modify only one or two parameters at a time, such as the temperature or the choice of chemical. In some ways it would behave like an MCQ, but instead of feedback in the form of "Correct answer" or "Wrong answer," the feedback would represent the outcome of the experimental choice in the form of an image, a graph, or some other result. This kind of approach already exists; Hot Potatoes is an example. Based on these results, the students would have either to change the experimental conditions or to go further in the experiment. In the latter case, the next question would depend upon the previous experimental choice. Every aspect of an experiment could thus be divided into small parts and taught so that students gain mastery of the whole process.

To discourage students from answering randomly and not taking the output seriously, there should be a penalty for making too many bad experimental choices. A common approach is to associate a cost with any experimental choice, whether it is relevant or not, as in the Mastering Biology virtual labs. This would give students a better idea of how a real experiment works and would avoid the "randomly push the button" approach that we often see in online learning. Instructors could also impose budgetary constraints on students to make them think about the costs of performing a given experiment. Finally, unlike in most existing virtual laboratories, student choices would be recorded, regardless of their relevance. After students have completed the virtual experiment and understood all of its aspects, we could test what they have learnt and memorized from it, following the same memory-based approach described previously.

This virtual experiment simulator is not much more than a quiz maker with complex feedback and pre-designed paths from one experimental choice to the next. Of course, an authoring tool will be associated with it so that teachers can design any experiment they like; this authoring tool is what distinguishes our virtual lab from existing stand-alone solutions.
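As a sketch of this "quiz maker with complex feedback" idea, the following Python fragment walks a student through a branching experiment in which every choice costs virtual money and determines the next step. The steps, costs, and budget are invented for illustration and do not describe a real Bradford assay protocol.

```python
# A minimal sketch of a branching virtual experiment: each node holds an
# experimental choice, every attempt consumes budget (relevant or not),
# and the chosen answer decides which node comes next.

EXPERIMENT = {
    "start": {
        "prompt": "Choose the incubation temperature for the assay.",
        "choices": {
            "room temperature": ("Feedback: a clean standard curve (graph).", "dilution"),
            "80 C":             ("Feedback: the reagent degrades (photo).", "start"),
        },
        "cost": 10,
    },
    "dilution": {
        "prompt": "Pick the dilution series for the protein standards.",
        "choices": {
            "serial 1:2":    ("Feedback: points span the linear range.", None),
            "single 1:1000": ("Feedback: all readings saturate.", "dilution"),
        },
        "cost": 15,
    },
}

def run_experiment(answers, budget=60):
    """Walk the branching experiment, recording every choice."""
    log, node = [], "start"
    for answer in answers:
        step = EXPERIMENT[node]
        budget -= step["cost"]  # every attempt costs money, good or bad
        feedback, next_node = step["choices"][answer]
        log.append((node, answer, feedback, budget))
        if budget <= 0:
            log.append(("stopped", "budget exhausted", "", budget))
            break
        if next_node is None:   # experiment completed successfully
            break
        node = next_node
    return log

# A student who first overheats the sample pays for the failed attempt:
for entry in run_experiment(["80 C", "room temperature", "serial 1:2"]):
    print(entry)
```

The full log of choices, including the failed ones, is exactly what the student model needs; the authoring tool would amount to an editor for the node graph above.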
I.4.3 Collaboration in virtual experiments

We have so far described a very individualistic approach to learning through the virtual lab, but a social learning aspect could be added to it. Students would work in teams and have to design the experiment together, within budgetary constraints, as in the individualistic approach. They could gather in online study rooms, as on getstudyroom.com, to design the experiment together. Links to existing solutions like Study Room should be included in the virtual lab, to avoid developing features that already exist elsewhere.

Figure 7: Fostering collaboration: links towards Study Room or other existing online solutions

I.4.4 Usual learning environment features

Of course, student activity, whether on the memorizing side or on the exploration side of the virtual laboratory, should be recorded and displayed on a dashboard. Students could choose topics to learn or experiments to do, and could also earn medals when learning. Ongoing topics and experiments are stored in the student's library. We could also think about memorizing or experimental-design competitions, as in Memrise. Basic social network tools (a mailbox, friend finding, and possibly tutor finding) would also be available. Since these aspects of the platform are not specific to our project, we only mention them as future possibilities; they appear here because some of them are in the mock-up.

Figure 8: Dashboard
Appendix II - Additional resources

II.1 Links towards video tutorials

Serial dilution in the Bradford assay: http://cfscience.net/DNLE/Serial_Dilutions/Serial_Dilutions.htm
Providing the context: http://cfscience.net/DNLE/Protein_PreLab/Protein_PreLab.htm
Bradford assay procedure: http://cfscience.net/DNLE/Bradford_Assay/Bradford_Assay.htm
Video tutorial showing the real Bradford assay: http://www.youtube.com/watch?v=5It_AEOTYoM

II.2 Google presentation link

https://docs.google.com/presentation/d/1JAODI4rssuSR9VXw-1NUYHiCCsUMo-Zw_kOeoHkFn8E/edit

II.3 Prezi link

http://prezi.com/ohf_brwdy54g/science-team-final-project-for-dnle/?auth_key=77084dcb9f84fa7f6071f63939cca38ed5d06c8f&kw=view-ohf_brwdy54g&rc=ref-17313360

II.4 Link to the mock-up

https://www.dropbox.com/sh/mdbhboj0db16q6b/VVcAMcAncB

II.5 Bibliography

European Commission (2012). Commission staff working document: Assessment of key competences in initial education and training — policy guidance. Accompanying the Communication "Rethinking Education: Investing in skills for better socio-economic outcomes." The key competences for European citizens in the knowledge society.

Google Developers (2012, September 27). JavaScript overlays. Retrieved December 15, 2012, from https://docs.google.com/document/d/1r8cJ1ponVvuLdaJcfwbR5d3GlmlbHEnVpwYw0grpE5M/edit#

Shapiro, H., Lauritzen, J. R. K., & Irving, P. (2011). Emerging skills and competences: A transatlantic study.

Khan Academy (n.d.). Retrieved December 15, 2012, from http://www.khanacademy.org/

Anderson (1990). Cognitive modeling and intelligent tutoring. Artificial Intelligence, 42(1), 7-49. http://act-r.psy.cmu.edu/papers/119/CogMod_IntTut.pdf

Brandstädter, K., Harms, U., & Großschedl, J. (2012). Assessing system thinking through different concept-mapping practices. International Journal of Science Education, 34(14), 2147-2170. http://dx.doi.org/10.1080/09500693.2012.716549

Brownell et al. (2012). Undergraduate biology lab courses: Comparing the impact of traditionally based "cookbook" and authentic research-based courses on student lab experiences. Journal of College Science Teaching, 41(4).
Cobb et al. (2009). The learning gains and student perceptions of a Second Life virtual lab. Bioscience Education, 13(1).

Corbett et al. (2008). Intelligent tutoring systems. Springer Berlin Heidelberg, 5091(6), 31-40. http://www.springerlink.com/content/b4197184w8667748/

Dadić (2010). Intelligent tutoring system for learning programming. In Intelligent Tutoring Systems in E-Learning Environments: Design, Implementation and Evaluation, p. 166. http://www.igi-global.com/viewtitlesample.aspx?id=45547

Feng et al. (2010). Student modeling in an intelligent tutoring system. In Intelligent Tutoring Systems in E-Learning Environments: Design, Implementation and Evaluation, p. 208. http://nth.wpi.edu/pubs_and_grants/papers/2008/Feng_Stude...

Gershon (2005). Computer adaptive testing. Journal of Applied Measurement, 6(1), 109-127. http://www.ncbi.nlm.nih.gov/pubmed/15701948

Georgiadou, E., Triantafillou, E., & Economides, A. A. (2006). Evaluation parameters for computer-adaptive testing. British Journal of Educational Technology, 37(2), 261-278. http://www.blackwell-synergy.com/doi/abs/10.1111/j.1467-8...

Girault (2012). Characterizing the experimental procedure in science laboratories: A preliminary step towards students' experimental design. International Journal of Science Education, 34, 825-854.

Holt, C. E., Abramoff, P., Wilcox, L. V., & Abell, D. L. (1969). Investigative laboratory programs in biology: A position paper of the Commission on Undergraduate Education in the Biological Sciences. BioScience, 19, 1104-1107.

Kazi, H. (2007). A diverse and robust tutoring system for medical problem-based learning. Supporting Learning Flow through Integrative Technologies, 162, 659-660.

Kloser et al. (2011). Integrating teaching and research in undergraduate biology laboratory education. PLoS Biology, 9(11).

Modell, H. I., & Michael, J. A. (1993). Promoting active learning in the life sciences classroom: Defining the issues. Annals of the New York Academy of Sciences, 701, 1-7.

OECD (Organisation for Economic Co-operation and Development) (2006). Evolution of student interest in science and technology studies (policy report). Paris: Author.

Schulze (2000). Andes: An active learning intelligent tutoring system for Newtonian physics. THEMES in Education, 1(2), 115-136. http://www.public.asu.edu/~kvanlehn/Stringent/PDF/00ANDES...

Shute et al. (1995). Intelligent tutoring systems: Past, present and future. In Handbook of Research on Educational Communications and Technology. Scholastic Publications.

Suebnukarn (2006). A Bayesian approach to generating tutorial hints in a collaborative medical problem-based learning system. Artificial Intelligence in Medicine, 38, 5-2.

Sykes (2004). A prototype for an intelligent tutoring system for students learning to program in Java. Advanced Technology for Learning, 1(1). http://www.actapress.com/PaperInfo.aspx?paperId=20654

Viswanathan (2005). A comparison of model-tracing and constraint-based intelligent tutoring paradigms. The International Journal of Artificial Intelligence in Education, 15(1).

Zumbach et al. (2006). Learning life sciences: Design and development of a virtual molecular biology learning lab. Journal of Computers in Mathematics and Science Teaching, 25, 281-300.