

Modern Computer Architecture Teaching and Learning Support: An Experience in Evaluation

    Besim Mustafa Business School

    Edge Hill University Ormskirk, UK

    [email protected]

    Abstract: A set of integrated educational simulators supporting the teaching and learning of computer architecture concepts at degree level has been used to engage students and enhance their understanding of modern computer system architecture. Examples of practical exercises using the integrated set of simulations are described, and a methodology for evaluating the educational value of the simulations is explained. The results are presented, and the statistical analysis of the results provides some support for the positive educational value of the simulations.

    Keywords: computer education; simulation; visualization; computer architecture; operating systems

    I. INTRODUCTION

    The study of modern computer architectures forms an important and essential part of computer science students' education, and many degree-level courses today offer teaching modules on both the fundamentals and the advanced features of the technology, as identified in [1].

    The author has been responsible for designing and delivering teaching modules on computer architectures and operating systems at undergraduate degree level for the past seven years. In order to support the practical lab sessions he has designed and implemented an integrated set of simulators collectively identified as the system simulator. The simulators conform to modern principles of pedagogy with respect to facilitating student engagement [2] and enhancing student learning experiences [3] and are extensively backed up by practical lab assignments that aim to maximize the pedagogical benefits to the students.

    An educational resource such as the system simulator is useful only if it can provide true educational benefits to the students using it. This paper therefore concerns itself with the process of evaluating the educational value of both the simulations and the practical tutorial exercises supporting the simulations.

    II. INTEGRATED SYSTEM SIMULATOR

    The system simulator captures three key aspects of modern computer architecture in one educational software package [4]: generation of CPU instructions using assemblers and high-level language compilers; the CPU as the processor of the instructions; and the operating system as the facilitator of multiprogramming and multi-threading of the CPU instructions. The simulators are designed to demonstrate and explore the interfaces and the interplay between these three areas. The CPU simulator, the inbuilt assembler, the compiler and the operating system simulator are therefore optimized to work together and support each other, as in real systems.

    A. The CPU Simulator

    The CPU simulator simulates the hardware functionality of a fictitious but highly realistic CPU based on a RISC-type architecture. It incorporates a five-stage pipeline simulator and hardware data and instruction cache simulators, which are used to support both the introductory and the advanced modules in computer architecture. The CPU simulator executes instructions that are either generated automatically by the integrated compiler from high-level source code or created manually by the students using a special inbuilt tool that makes the process user friendly. Both the cache and the instruction pipeline simulators cooperate with the CPU simulator while the instructions are being executed.

    B. The OS Simulator

    The OS simulator is designed to support two main aspects of a computer system's resource management: process management and memory management. All CPU code is available to the OS simulator, which is able to create multiple instances of the code as separate processes. The process scheduler supports scheduling policies including priority-based, pre-emptive and round-robin scheduling with selectable time slots. Virtual resources can be allocated to and de-allocated from processes, allowing demonstration of deadlocks associated with resources and investigation of deadlock prevention, detection and resolution techniques. Threads are supported via special teaching-language constructs which allow parts of program code to be executed as threads and process synchronization concepts to be explored.
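    The round-robin policy with selectable time slots can be illustrated with a short sketch. This is a simplified model, not the simulator's implementation; the process names and burst times are invented for illustration:

```python
from collections import deque

def round_robin(bursts, quantum):
    """Simulate round-robin scheduling; bursts maps pid -> remaining CPU time.
    Returns the order in which processes complete."""
    ready = deque(sorted(bursts))          # FIFO ready queue
    remaining = dict(bursts)
    finished = []
    while ready:
        pid = ready.popleft()              # dispatch head of queue
        run = min(quantum, remaining[pid]) # run for at most one time slot
        remaining[pid] -= run
        if remaining[pid] == 0:
            finished.append(pid)           # process terminates
        else:
            ready.append(pid)              # pre-empted: back of the queue
    return finished

# Three processes with different burst times, time slot of 2 units
print(round_robin({"P1": 5, "P2": 2, "P3": 3}, 2))  # ['P2', 'P3', 'P1']
```

    Shrinking the quantum increases the number of pre-emptions; a quantum larger than every burst degenerates into first-come-first-served, which is exactly the kind of trade-off the OS simulator lets students observe interactively.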

    C. The Teaching Compiler

    A basic but complete high-level teaching language was developed to support the CPU and OS simulations. This language incorporates standard control structures, constructs and system calls which are used to demonstrate a modern computer system's key architectural features. A teaching compiler was developed for this language that can generate assembly-level code as well as its equivalent binary byte-code as output. The students can observe and analyze the assembly and the binary code generated against the high-level language statements. The compiler incorporates object-oriented constructs mainly used to demonstrate the generation of code resulting from class variables, methods and object instantiation.

    Copyright © i-Society 2011. Technical Co-Sponsored by IEEE UK/RI Computer Chapter.

    III. RELATED WORK

    Simulators have been popular for educational purposes for many years. They have been educating medical students, engineering students and, increasingly, students of computer science [5]. In computing there have been many reports on well-established methods for the evaluation of algorithm simulations [6]. However, the evaluation of system simulators for educational purposes is different and not as widely reported. That is partly because the designs of such simulators have been ad hoc and, as a result, it has not been easy to develop reliable methods of evaluation. There are some notable exceptions [7, 8, 9], and it is hoped that this paper will make a valuable contribution in this area.

    IV. THE TEACHING AND LEARNING STRATEGY

    The simulator has been successfully integrated into modules on computer architecture and operating systems and has been in use for the past four years. During each two-hour practical tutorial session the students work in small groups. The simulator software is installed on all lab computers and runs under the Windows operating system. The practical exercise questions are designed to encourage critical thinking and a deeper understanding of the concepts under investigation. The year one students study the introductory computer architecture module and use the simulator to explore CPU instruction sets and experiment with assembly language programming. They also get to know the role of the OS in managing hardware resources and processes. The year two students study advanced computer architecture and concentrate on performance issues. Here they work with the data and instruction cache simulators, the CPU pipeline simulator and the teaching compiler optimizations, all of which are part of the integrated system simulator described above. The year three students study advanced features of the OS and explore advanced scheduling mechanisms and memory management techniques, threads, deadlocks, synchronization and critical regions, and inter-process communications.

    V. SIMULATION TUTORIAL EXERCISES

    In this section we look at three examples of practical tutorials carried out by year one students and four examples carried out by year two students using the integrated simulators as part of their coursework portfolios. These tutorials are used to collect the evaluation data presented in this paper. The students work in groups of two or three and are guided by the tutorial sheets handed to them at the start of the tutorial sessions. All the exercises are designed to be completed well within the two-hour sessions. A tutor is available at all times to provide support whenever required.

    A. Programming Model 1 (year one)

    The students use the CPU simulator to investigate aspects of instruction set architecture. The simulator facilitates the entry of CPU instructions through a user-friendly interface. The students are presented with a list of available op codes; they identify and select the desired op code. They are then presented with the valid operand types (if any) and the related valid addressing modes supported by the operands. In this way they construct the CPU instructions needed to perform the desired task. Once an instruction is complete it is entered into the simulator's memory. The students can then execute instructions individually, or run a set of instructions starting from a selected instruction. In this tutorial students explore basic low-level programming features such as moving data to and from registers, comparing registers, pushing and popping data to and from the stack, jumping to address locations and performing arithmetic operations. The students can observe, and if necessary manually alter, the results as data is moved into registers and onto the stack by the executing instructions.
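    As a rough analogue of these exercises, the following sketch interprets a few instructions of a made-up register/stack instruction set. The mnemonics are illustrative only and are not the simulator's actual op codes:

```python
def run(program):
    """Interpret a tiny made-up instruction set: registers, a stack,
    and a program counter. Returns the final register file."""
    regs, stack, pc = {}, [], 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "MOV":    # move immediate or register into register
            src = args[1]
            regs[args[0]] = src if isinstance(src, int) else regs[src]
        elif op == "ADD":  # add source register into destination
            regs[args[0]] += regs[args[1]]
        elif op == "PSH":  # push register onto the stack
            stack.append(regs[args[0]])
        elif op == "POP":  # pop top of stack into register
            regs[args[0]] = stack.pop()
        elif op == "JMP":  # jump to an absolute instruction address
            pc = args[0]
        elif op == "HLT":
            break
    return regs

prog = [("MOV", "R0", 5), ("MOV", "R1", 7), ("ADD", "R0", "R1"),
        ("PSH", "R0"), ("POP", "R2"), ("HLT",)]
print(run(prog))  # R0 holds 12, copied to R2 via the stack
```

    Students performing the tutorial do essentially this by hand: build each instruction, run it, and watch the registers and stack change.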

    B. Programming Model 2 (year one)

    Using the same simulator facilities as in (A) above, the students explore more advanced low-level programming activities such as constructing simple loops, using indirect addressing modes, calling and returning from subroutines, and experimenting with passing parameters to subroutines using registers and the stack.

    C. Investigating Process Scheduling (year one)

    The students use the integrated OS simulator to explore aspects of process scheduling. They use the integrated compiler to enter and compile a simple loop using high-level statements. They then switch to the OS simulator and use it to manually create one or more instances of the program as processes. Once the processes are created they can be run. The OS simulator shows the queued, running and waiting processes, enabling the students to observe the various process states and the state transitions as processes change state. This is further facilitated by the ability to alter the speed of the simulations. The students explore first-come-first-served, priority-based and round-robin scheduling mechanisms. They then experiment further with pre-emptive scheduling and with different time slots when using round-robin scheduling.

    D. Investigating CPU Cache (year two)

    The CPU simulator incorporates data and instruction cache simulators. These simulators are highly configurable: different cache types, cache sizes, block and set sizes, as well as placement and replacement policies, are all selectable. The caches display the data and instructions and maintain counts of hits and misses. The students can program the CPU simulator at assembly level, or at high level using the inbuilt compiler, in order to drive the practical experiments. In this practical tutorial the students first investigate the direct-mapped data cache organization and program the CPU to demonstrate a disadvantage whereby the same block is repeatedly replaced, causing a high miss rate. Next, the students investigate the set-associative cache organization by configuring the cache first as a 2-way and then as a 4-way cache. They repeat the same test as in the direct-mapped cache and demonstrate that the previously noted disadvantage is not as severe and becomes less of a problem as the number of blocks per set increases. Next, the students use the inbuilt compiler to enter and compile a high-level program that writes and reads back an array of bytes in a loop. They are asked to run the program for different sizes of the cache for each of the following mappings: direct, 2-way, 4-way and 8-way set associative. At the end of each run they record the miss rate against the cache size and are then asked to comment on their observations. They note that the direct and 2-way set-associative mappings yield similar results, with 2-way slightly better as its miss rate shrinks more steeply (shown visually by the displayed graph), but the 4-way mapping turns out to be demonstrably more efficient than both. Increasing this to 8-way, however, appears to improve only slightly over the 4-way mapping. Finally the students are asked to demonstrate that programming style can have a significant influence on cache efficiency.
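    The conflict-miss behaviour the students observe can be reproduced in a few lines. This is a deliberately simplified model (single-word blocks, LRU replacement), not the simulator's actual cache:

```python
def miss_rate(addresses, num_sets, ways):
    """Count misses for a set-associative cache with LRU replacement.
    A block size of one word keeps the index arithmetic obvious."""
    sets = [[] for _ in range(num_sets)]   # each set holds up to `ways` tags
    misses = 0
    for addr in addresses:
        s, tag = addr % num_sets, addr // num_sets
        if tag in sets[s]:
            sets[s].remove(tag)            # hit: refresh LRU position
        else:
            misses += 1
            if len(sets[s]) == ways:
                sets[s].pop(0)             # evict least recently used tag
        sets[s].append(tag)
    return misses / len(addresses)

# Two addresses that map to the same set, accessed alternately
trace = [0, 8, 0, 8, 0, 8, 0, 8]
print(miss_rate(trace, 8, 1))  # direct-mapped: every access misses (1.0)
print(miss_rate(trace, 4, 2))  # 2-way: only the two cold misses (0.25)
```

    The alternating trace defeats the direct-mapped cache completely because both addresses contend for the same block, while a 2-way set holds both tags at once, which is the effect the tutorial asks students to demonstrate.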

    E. Investigating CPU Pipeline (year two)

    The CPU simulator incorporates a 5-stage pipeline. The pipeline simulator is highly interactive and configurable. It can be switched off, i.e. instruction stages are processed sequentially (this would not be possible on real hardware), and different pipeline optimizations such as operand forwarding and jump prediction can be switched on or off. The simulator displays the stages of instructions as they progress through the pipeline while the instructions are processed. It also calculates and displays the clocks per instruction (CPI) and the speed-up factor (SF). The students enter the source for a small program loop and compile it. They run the generated code first with the pipeline switched off and then again with it switched on. They note down the CPI and the SF in both cases and comment. Next the students investigate data hazards. They configure the pipeline so that it does not insert any hazard bubbles (again, not possible on real hardware). The students then enter a small assembly code sequence which moves data to registers and adds their contents, making sure there is a data hazard condition. They run this code and observe that the arithmetic result is not what they expect. They are then asked to insert a NOP instruction in the appropriate place to eliminate the hazard and verify the result. They are then asked to switch the hazard bubbles back on and run the program one more time, noting that the result is now correct. Next they investigate operand forwarding to remove data hazards, noting that the resulting CPI is reduced, demonstrating a performance improvement. They then investigate the jump prediction mechanism, which uses the jump predict table to remove control hazards. This reduces the CPI further, and the simulator displays a high percentage of recorded predictions. They then use the compiler's loop-unrolling optimization to show the effect of this optimization on the pipeline operation, with a significant performance gain as the CPI is demonstrably reduced and the SF is increased. Finally they investigate the out-of-order execution of instructions to minimize data hazards. The inbuilt compiler is directed to re-arrange instructions such that data dependencies are minimized. The students study the new code generated and then run it to verify the performance improvement.
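    The CPI and SF figures the simulator reports follow from a simple pipeline timing model. The sketch below uses the idealised formula (one instruction completing per clock once the pipeline is full, plus any stall cycles); the instruction count and stall count are invented for illustration:

```python
def pipeline_metrics(n_instructions, stages=5, stalls=0):
    """Idealised 5-stage pipeline timing: after the fill latency, one
    instruction completes per clock; `stalls` adds hazard bubble cycles."""
    cycles = stages + (n_instructions - 1) + stalls
    cpi = cycles / n_instructions
    # Speed-up factor vs. unpipelined execution (stages cycles per instruction)
    speedup = (n_instructions * stages) / cycles
    return round(cpi, 2), round(speedup, 2)

print(pipeline_metrics(20))            # no hazards: CPI near 1, SF near 5
print(pipeline_metrics(20, stalls=6))  # data-hazard bubbles raise the CPI
```

    Operand forwarding and jump prediction act by shrinking the `stalls` term, which is why the simulator shows the CPI falling toward 1 as each optimization is switched on.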

    F. Investigating Compilers (year two)

    The system simulator incorporates a compiler capable of accepting and compiling typical high-level language statements and generating low-level assembly code that can run on the CPU simulator. The compiler also generates the corresponding binary code, which can be studied by the students. A list of instruction types and their frequencies is also maintained for statistical analysis. The compiler features a number of optimizations that students can experiment with. The results of the various compiler stages, the contents of the symbol table and the subroutine information are also displayed. The students are asked to enter a small source code and compile it. They are asked to observe the displayed information and identify the different stages of compiling, and to observe the contents of the symbol table and note down the kind of information it contains. They then study the binary byte-code generated and use an inbuilt tool to isolate individual instructions and study their contents. They are then asked to disassemble the binary code as a paper exercise and to verify this using the simulator's inbuilt disassembler. Finally students investigate several typical compiler optimizations such as redundant code elimination, constant folding, strength reduction and loop unrolling. They study the unoptimized and the optimized code generated in each case and are asked to comment on the differences they observe. As an extended exercise the students are invited to see if they can better the compiler by further improving on the optimized assembly code it generates.
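    Constant folding, one of the optimizations investigated, can be sketched over Python's own syntax tree. This is illustrative only; the teaching compiler performs the equivalent pass over its own language:

```python
import ast
import operator

# Map AST operator node types to the arithmetic they denote
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.FloorDiv: operator.floordiv}

class Folder(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold children first (bottom-up)
        if (isinstance(node.left, ast.Constant)
                and isinstance(node.right, ast.Constant)
                and type(node.op) in OPS):
            value = OPS[type(node.op)](node.left.value, node.right.value)
            return ast.copy_location(ast.Constant(value), node)
        return node

def fold_constants(expr):
    """Collapse constant sub-expressions at compile time."""
    return ast.unparse(Folder().visit(ast.parse(expr, mode="eval")))

print(fold_constants("x + 2 * 3 + (4 - 1)"))  # -> x + 6 + 3
```

    Note that `2 * 3` and `4 - 1` fold to literals, while `x + 6 + 3` cannot fold further because the variable interrupts the constant chain; comparing unoptimized and optimized code in the tutorial surfaces exactly this kind of boundary.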

    G. Investigating IO Interrupts (year two)

    The CPU simulator supports both vectored and polled interrupts. A vector table is maintained holding the vectored addresses for console input, the hardware timer, exceptions and software interrupts. These addresses can be specified manually or generated by a special compiler construct used to define the interrupt routines. The students are asked to enter and compile a source code implementing separate interrupt handler routines that simply display the interrupt names. The students note the start addresses of these routines. The code is then loaded into the CPU simulator's instruction memory, and the students are asked to note down the addresses that are automatically planted in the interrupt vector table and compare them to the interrupt routine addresses previously noted. They can then manually trigger each of the interrupts and observe the text displayed in the simulated console's window. Students next enter a source code that demonstrates a polled interrupt using the non-blocking console read statement. This code continuously checks in a tight loop to see if any input has been supplied. Next they modify this code by adding an interrupt handler for the console input; this routine displays the key pressed. Now whenever console input is provided, via the system simulator's virtual keyboard, the interrupt routine is entered and the value of the key code is displayed, demonstrating that vectoring is taking place. The students can either slow down the simulation or use a breakpoint to observe this event.
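    The contrast between the two dispatch styles can be sketched as follows. The handler names, interrupt numbers and device flags are invented for illustration:

```python
# Vectored dispatch: the interrupt number indexes a table of handlers,
# so control jumps straight to the right routine with no searching.
def on_timer():
    return "timer tick"

def on_console():
    return "key pressed"

vector_table = {0: on_timer, 1: on_console}  # number -> handler "address"

def interrupt(number):
    return vector_table[number]()            # direct jump via the table

# Polled dispatch: the CPU repeatedly asks each device whether it
# needs service, paying the cost of the loop on every check.
def poll(devices):
    for name, pending in devices.items():
        if pending:
            return f"servicing {name}"
    return "no device ready"

print(interrupt(1))                              # vectored: key pressed
print(poll({"timer": False, "console": True}))   # polled: servicing console
```

    The vector-table lookup is why vectored handling is generally the more efficient of the two, which is the comparison the pre and post test questions for this tutorial ask students to explain.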

    VI. EVALUATION METHODOLOGY

    The following sections describe the methodology followed for the evaluation of the simulations and the accompanying tutorial exercises. The main tenet of this exercise is the following null hypothesis: the use of the integrated simulators is not particularly helpful in consolidating the theory covered during the lectures on computer systems architecture. The alternative hypothesis is the opposite. For the simulations to be deemed educationally valuable, the evaluation results will need to reject the null hypothesis in favour of the alternative.

    A. The Evaluation Process

    The research described in this paper was conducted in the normal course of the scheduled tutorial sessions; no special arrangements were made. Seven practical sessions that required the use of the simulators were selected. The only additions to the tutorial sessions were the pre and post tests: students were asked to complete a pre test just before the start of the exercises and a post test soon after they completed the practical work. The practical exercises are documented on a tutorial sheet and are normally composed of several interactive activities, visual observations and student comments, including reflections on their observations. The activities are chosen such that they are self-contained and short enough to complete easily within one practical session. The practical exercises are particularly designed to consolidate the main topics covered during the lecture on the same day. The learning outcomes are clearly indicated on the practical exercises, and each exercise is aligned with a single learning outcome. It is worth noting that the year two students had been using the system simulator in the previous year. The year one students complete a short tutorial at the start of the semester to familiarize themselves with the features of the integrated simulators. A unique feature of the system simulator is that it is designed to support the students throughout their studies.

    B. The Measurements: Pre and Post Tests

    The evaluation of the effectiveness of an educational resource such as a software simulator requires effective and meaningful measurements. One common method is testing student knowledge before and after the chosen learning intervention. It is the author's experience that little attention appears to be paid to the nature of these tests; yet it is safe to assume that tests that are not well thought out and not aligned with the learning outcomes are more likely to yield unreliable results. This section therefore examines this aspect of the research in some detail.

    The evaluation of an educational resource such as the system simulator described in this paper requires careful and well-focused attention to both the nature of the measurements and the processes by which the measurements are taken if it is to yield reliable and meaningful results. Table I presents the kind of questions the researcher needs to ask and offers some guidelines.

    It is important to note that the tests were conducted anonymously and that the results were not included in the final student assessment. Matching of the pre and post test returns was achieved by means of unique numbers assigned to the students.

    The pre and post tests took the form of a 5-point Likert scale (strongly agree, agree, neutral, disagree, strongly disagree) with five items in each test. The items take the form of confidence-based opinions rather than multiple-choice questions. Table II shows sample pre and post test questions taken from each of the four year-two practical exercises and demonstrates how these are aligned with the corresponding learning outcomes (LOs).

    TABLE I. EVALUATION QUESTIONS AND GUIDELINES

    Q: How do we make sure we are measuring what is being learned?
    G: Define clear and achievable learning outcomes; target a small number of learning outcomes, say 4; design practical exercises closely aligned with the learning outcomes.

    Q: How do we make sure we establish a meaningful baseline prior to measurement of any changes?
    G: Use pre test questions that reflect the general goals of the learning outcomes; seek evidence for levels of confidence; ask one question directly corresponding to each learning outcome.

    Q: How do we make sure we measure only against the established baseline?
    G: Use post test questions that correspond to pre test questions in modified form, one question against each learning outcome; seek evidence of confidence attributable to the intervention.

    Q: How do we make sure what is measured is only due to the intervention?
    G: Do post tests immediately after the intervention; link questions to the method of intervention; seek level of confidence in the use of the intervention alone.

    The individual items of both the pre and the post tests directly correspond to and reflect the expected learning outcomes of the tutorials. This way we can be confident that what is being evaluated is the actual work carried out during the tutorial sessions. The post test items are designed to closely correspond to the pre test items.

    TABLE II. SAMPLE PRE AND POST TEST ELEMENTS AND LOS

    LO: Describe different compiler optimization methods
    Pre: I can describe at least four examples of compiler optimization methods and state their impact on CPU performance
    Post: The simulator aided my understanding of compiler optimization methods and I can explain how they impact CPU performance

    LO: Explain the effect of cache size and mapping scheme on cache performance
    Pre: I understand and can explain the effects of the cache type and the cache size on cache performance
    Post: The simulator aided me in demonstrating and understanding the effects of cache type and cache size on cache performance

    LO: Describe a pipeline technique to eliminate data hazards
    Pre: I can describe a method used by the CPU pipeline in order to eliminate data hazards
    Post: The tutorial exercises helped me understand and describe a method used by the CPU pipeline to eliminate data hazards

    LO: Explain the difference between polled and vectored interrupts
    Pre: I understand and can explain which of the two interrupt handling methods is generally more efficient
    Post: The simulations clearly demonstrated to me which of the two interrupt handling methods is more efficient

    The rationale behind the tests requires further explanation. It was decided that the pre tests should be administered immediately after the lecture and prior to the practical exercises. As both the lecture and tutorial schedules had been made available online in advance, and the students were regularly reminded of what they should read for the next week's lecture, this was not considered to be a drawback. The post tests were carried out immediately after the completion of the practical exercises. It is therefore not unreasonable to assume that this method afforded increased confidence that the differences between the test results could safely be attributed solely to the use of the simulators as the learning intervention.

    The usual concerns with opinion-type measurements using the Likert scale apply here too; however, care was taken to minimize their effects. The possibility of bias is minimized first by keeping data gathering anonymous and excluding the pre and post test results from summative assessments, and further by the fact that both the pre and the post tests are confidence-opinion based and use very similar, matching constructs: since the differences in the scores are used in the calculations (see next section), any bias is likely to be cancelled out to a certain extent. Reproducibility of the evaluation process is addressed by making the methodology as thorough as possible and designing reliable pre and post tests closely in line with the learning outcomes of the practical exercises using the simulations. Validity of the methodology is addressed to a great extent by paying close attention to the internal consistency and reliability of the test items while designing the tests.

    VII. RESULTS AND ANALYSIS

    In each practical session the pre test completions were collected just before the students started using the simulations for the practical exercises, and the post tests were administered soon after they completed all the exercises. The pre and post test results were then matched for each student using the previously assigned unique numbers. Each of the five points on the Likert scale was then assigned a number (5: strongly agree, 4: agree, 3: neutral, 2: disagree, 1: strongly disagree). These were then used to calculate the quantitative results using statistical methods. At the end of the evaluation period, the same students were asked to complete an opinion questionnaire to complement the quantitative results.

TABLE III. RESULTS OF PRE AND POST TESTS (YEAR ONE)

  Topic                 Sample size    Q1     Q2     Q3     Q4     Q5
  Programming Model 1        14       0.002  0.001  0.001  0.002  0.001
  Programming Model 2        13       0.002  0.001  0.002  0.006  0.002
  Process Scheduling         41       0.001  0.000  0.000  0.000  0.000

TABLE IV. RESULTS OF PRE AND POST TESTS (YEAR TWO)

  Topic                     Sample size    Q1     Q2     Q3     Q4     Q5
  Cache technology               12        0.015  0.011  0.003  0.003  0.002
  Compiler technology            13        0.014  0.013  1.000  0.005  0.008
  CPU pipeline technology        12        0.180  0.002  0.020  0.008  0.008
  IO Interrupts                  14        0.002  0.005  0.002  0.007  N/A

Due to the nature of the data collected it was most appropriate to use non-parametric statistical analyses. For this we used the Wilcoxon signed-rank test [10], which is suitable for comparing two dependent conditions since the repeat samples came from the same set of students. Table III shows the analysis of the evaluation data from year one students and Table IV shows the analysis of the data from year two students. These results were obtained using the SPSS Statistics 18 software package. The tables show the number of pre/post-test returns, i.e. the sample sizes, and the p values for each item of the tests; the items are labeled Q1 to Q5. For this evaluation, values of p < 0.05 are regarded as statistically significant. Apart from Q3 in the Compiler Technology tutorial and Q1 in the CPU Pipeline Technology tutorial, the results indicate that the differences between the pre and the post tests are very unlikely to have arisen by chance and can therefore be attributed to the intervention by the simulations. This allows us to reject the null hypothesis and lends significant support to the assumption that the simulations were instrumental in helping the students to feel more confident in their understanding and in their ability to explain and demonstrate the theory covered during the lectures.
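The analysis above was carried out in SPSS; for readers wishing to reproduce the approach, the same test is available in open-source tools. The following sketch applies the Wilcoxon signed-rank test to a set of invented matched pre/post scores (not the study's data), analogous to the per-item analysis reported in Tables III and IV.

```python
# Illustrative sketch: Wilcoxon signed-rank test on matched pre/post
# Likert scores (hypothetical data, 12 respondents, one test item).
from scipy.stats import wilcoxon

pre  = [2, 3, 2, 3, 2, 3, 2, 2, 3, 2, 3, 2]  # 5 = strongly agree ... 1 = strongly disagree
post = [4, 5, 4, 4, 4, 5, 3, 4, 4, 4, 5, 4]

# The test compares the two dependent conditions via the signed ranks
# of the paired differences; p < 0.05 is treated as significant here,
# as in the evaluation described in the text.
stat, p = wilcoxon(pre, post)
print(f"W = {stat}, p = {p:.4f}")
```

Because every hypothetical respondent scores higher on the post test, the sum of negative ranks is zero and the test reports a significant difference.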

As the evaluation was based on a Likert scale with pre and post versions for gathering data, we wanted some confidence in this approach and therefore tested the internal consistency of the scales. We did this by running a reliability analysis on all pre and post tests for the year two questions, taken as a sample. This analysis yielded Cronbach's alpha as the measure of reliability [10]. Table V shows these values for both the pre and the post tests in each of the four practical tutorial sessions. The values obtained indicate a high degree of reliability and internal consistency within the tests. These results were obtained despite the fact that the number of items in the pre and post tests was relatively small; Cronbach's alpha tends to be artificially high for a large number of items.
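The reliability measure behind Table V can be computed directly from its definition. This is a hypothetical sketch with invented responses, assuming the standard formula for Cronbach's alpha: the ratio of summed item variances to the variance of the total score, scaled by the number of items.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha; items: rows = respondents, columns = test items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical post-test responses: 6 students x 5 Likert items (1-5).
scores = np.array([
    [4, 4, 5, 4, 4],
    [3, 3, 4, 3, 3],
    [5, 4, 5, 5, 4],
    [2, 3, 2, 2, 3],
    [4, 5, 4, 4, 5],
    [3, 3, 3, 4, 3],
])
alpha = cronbach_alpha(scores)
print(f"alpha = {alpha:.3f}")
```

Values above about 0.7, as in Table V, are conventionally read as acceptable internal consistency.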

TABLE V. PRE AND POST TEST RELIABILITY RESULTS

  Topic                     Pre Test   Post Test
  Cache technology            0.898      0.770
  Compiler technology         0.714      0.843
  CPU pipeline technology     0.849      0.883
  IO Interrupts               0.869      0.739

Table VI shows the results of one of the final opinion surveys (SA: Strongly Agree; A: Agree) conducted with year two students. There were 14 returns and the figures indicate the number of students selecting each response. None of the students disagreed with the statements. The survey results show that, overall, the students regarded the simulations as engaging and enjoyable learning experiences.


TABLE VI. THE SIMULATOR USAGE OPINION SURVEY RESULTS

  Statement                                                               SA   A
  I believe the CPU-OS simulator has greatly aided my understanding
    of the topics covered in this module                                   8   6
  I found the practical tutorials using the CPU-OS simulator engaging
    and stimulating                                                        7   7
  Using the CPU-OS simulator encouraged me to explore CPU functionality
    beyond the set tasks                                                   7   7
  The tasks using the CPU-OS simulator nicely complemented the theory
    covered during the lectures                                            8   6
  Overall I enjoyed using the CPU-OS simulator as an interactive and
    visual learning resource                                              10   4

In order to gauge the difference the simulations made in enhancing student learning, as opposed to some other type of intervention, we also conducted two evaluation exercises involving control groups. In two of the tutorial sessions all students were given the same exercises; one group used the system simulator while the other group completed the exercises using just pen and paper. The results are shown in Table VII and Table VIII.

TABLE VII. PROGRAMMING MODEL 1 RESULTS

  Using integrated simulator       Q1     Q2     Q3     Q4     Q5
  Mean pre-test results           2.79   2.50   2.43   2.57   2.43
  Mean post-test results          4.36   4.43   4.21   4.36   4.29
  Mean differences                1.57   1.93   1.78   1.79   1.86
  p                               0.002  0.001  0.001  0.002  0.001

  Not using integrated simulator
  Mean pre-test results           2.80   3.27   2.80   2.60   2.40
  Mean post-test results          4.13   4.13   4.07   3.87   4.00
  Mean differences                1.33   0.86   1.27   1.27   1.60
  p                               0.001  0.004  0.001  0.003  0.001

TABLE VIII. PROGRAMMING MODEL 2 RESULTS

  Using integrated simulator       Q1     Q2     Q3     Q4     Q5
  Mean pre-test results           2.54   2.85   2.54   2.77   2.38
  Mean post-test results          4.00   4.15   4.08   3.77   3.92
  Mean differences                1.46   1.30   1.54   1.00   1.54
  p                               0.002  0.001  0.002  0.006  0.002

  Not using integrated simulator
  Mean pre-test results           2.64   2.79   2.43   2.64   2.57
  Mean post-test results          3.71   3.86   4.07   3.50   3.79
  Mean differences                1.07   1.07   1.64   0.86   1.22
  p                               0.006  1.000  0.001  0.015  0.002

It can be seen from both tables that the control groups using just the pen-and-paper method still managed to significantly consolidate what they learned during the lectures, as expected and as evidenced by the very low p values (with the single exception of Q2 in Table VIII). However, it is also clear that the use of the simulator provided an enhanced learning experience compared to the pen-and-paper method, as evidenced by the higher mean post-test results. The next step is to include comparisons with other similar simulators; however, due to the lack of similar simulators able to support as wide a range of features, this may be difficult or impossible to achieve. Also, what these results do not capture is the way in which the students attempt the tutorial exercises.

Observations show that there was more engagement with the use of the simulator, more teamwork and more eagerness to explore beyond the strict requirements of the tutorial exercises, all of which are normally regarded as desirable in promoting deep learning and enriching students' learning experiences. Some of this is supported by the results of the qualitative survey shown in Table VI.

VIII. CONCLUSIONS

The evaluation exercise described in this paper has been greatly facilitated by the unique integrated set of simulations and by the use of clearly identified learning objectives driving both the tutorial exercises and the pre/post-test questionnaires as the instruments of quantitative analysis. The results of the evaluations will be used to improve the usability and the pedagogical value of the simulations. In the future we will explore other complementary methods of evaluating the simulations, such as student focus groups, more extensive use of control groups (in modules with high student populations) and analysis of final module assessment results where this is practical.

REFERENCES

[1] Computing Curricula 2001: Computer Science Final Report, December 15, 2001. ACM and IEEE Computer Society joint report, USA.

[2] Bloom, B. S., Krathwohl, D. R. 1956. Taxonomy of Educational Objectives: the Classification of Educational Goals, Handbook I: Cognitive Domain. Addison-Wesley.

[3] Naps, T. L., et al. 2003. Exploring the Role of Visualization and Engagement in Computer Science Education. ACM SIGCSE Bulletin 35(2), June 2003.

[4] Mustafa, B. 2009. YASS: A System Simulator for Operating System and Computer Architecture Teaching and Learning. FISER09 Conference, Famagusta, North Cyprus, Mar 22-24.

[5] Yehezkel, C., Yurcik, W., Pearson, M., Armstrong, M. 2002. Three simulator tools for teaching computer architecture: EasyCPU, Little Man Computer, and RTLSim. ACM Journal of Educational Resources in Computing, Vol. 1, No. 4, December 2002, pp. 60-80.

[6] Urquiza-Fuentes, J., Velázquez-Iturbide, J. A. 2009. A Survey of Successful Evaluations of Program Visualization and Algorithm Animation Systems. ACM Transactions on Computing Education, Vol. 9, No. 2, Article 9, June 2009.

[7] Chalk, B. 2002. Evaluation of a Simulator to Support the Teaching of Computer Architecture. 3rd Annual LTSN-ICS Conference, Loughborough University.

[8] Navarro, E. O., van der Hoek, A. 2005. Design and Evaluation of an Educational Software Process Simulation Environment and Associated Model. Conference on Software Engineering Education and Training (CSEE&T), Ottawa, Canada, April 18-20, 2005.

[9] Navarro, E. O., van der Hoek, A. 2007. Comprehensive Evaluation of an Educational Software Engineering Simulation. Conference on Software Engineering Education and Training (CSEE&T), pp. 195-202.

[10] Field, A. 2009. Discovering Statistics Using SPSS. 3rd edition. SAGE Publications Ltd.
