St. Mary's University Institutional Repository

Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/2088
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Ashagre, Endale
dc.date.accessioned: 2016-07-02T07:07:45Z
dc.date.available: 2016-07-02T07:07:45Z
dc.date.issued: 2009-08
dc.identifier.uri: http://hdl.handle.net/123456789/2088
dc.description.abstract: It is widely believed that “assessment drives curriculum”. Hence, it can be argued that if the quality of teaching, training, and learning is to be upgraded, assessment is the obvious starting point. Instructors employ a variety of assessment tools to get an overview of students’ performance. Among these, multiple choice quizzes and tests are the most reliable and commonly used. A dependable multiple choice item does not simply come about; it requires thorough assessment and continuous refinement. This implies the need to examine item quality, within the context in which the item is employed, through different mechanisms. One way to do this is item analysis: a statistical procedure that combines methods for evaluating the important characteristics of test items, such as their difficulty, discrimination, and distractor performance. Accordingly, the purpose of this study is to examine sample exam papers administered by the different departments of St. Mary’s University College and subsequently forward appropriate feedback on how to improve multiple choice items. The study employed both qualitative and quantitative analysis. Quantitatively, items were examined using basic item analysis statistics, including the item difficulty index, the discrimination index, and the point-biserial correlation, as well as frequency counts and percentages. A total of 761 exam papers, comprising 234 items from nine courses, were considered. To supplement the results obtained from these quantitative data, items were qualitatively reviewed against the basic guidelines of multiple choice item writing.
Results of the study indicated that the majority (83%) of the items examined have moderate difficulty (difficulty index .20 < p < .80) and that more than half of the items discriminate well (discrimination index ≥ .20 for 72% of items; point-biserial correlation ≥ .20 for 52%). On the other hand, the poorly performing items identified by the quantitative analysis were found to violate the basic principles of multiple choice test item writing. (en_US)
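The basic item analysis statistics named in the abstract (item difficulty index, discrimination index, and point-biserial correlation) can be sketched as follows. This is an illustrative example with made-up response data, not the study's actual procedure or data; the 27% upper/lower grouping for the discrimination index is one common convention, assumed here.

```python
import numpy as np

def item_analysis(total_scores, item_responses):
    """Basic item analysis for one dichotomously scored (0/1) item.

    total_scores:   total test score per examinee
    item_responses: 0/1 response to the item per examinee (1 = correct)
    """
    scores = np.asarray(total_scores, dtype=float)
    item = np.asarray(item_responses, dtype=float)

    # Difficulty index p: proportion of examinees answering correctly.
    p = item.mean()

    # Discrimination index D: difficulty among the top 27% of examinees
    # (by total score) minus difficulty among the bottom 27%.
    n = len(scores)
    k = max(1, int(round(0.27 * n)))
    order = np.argsort(scores)          # ascending by total score
    d = item[order[-k:]].mean() - item[order[:k]].mean()

    # Point-biserial correlation: Pearson r between the 0/1 item
    # response and the total score.
    r_pb = np.corrcoef(item, scores)[0, 1]

    return p, d, r_pb
```

Under the cut-offs reported in the study, an item would be flagged for review when p falls outside .20–.80 or when the discrimination index or point-biserial correlation falls below .20.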
dc.description.sponsorship: St. Mary's University (en_US)
dc.language.iso: en (en_US)
dc.publisher: St. Mary's University (en_US)
dc.subject: Test Construction Skills, St. Mary’s University College (en_US)
dc.title: Improving Test Construction Skills through Item Analysis: The Case of St. Mary’s University College (en_US)
dc.type: Article (en_US)
Appears in Collections:Proceedings of the 7th National Conference on Private Higher Education Institutions (PHEIs) in Ethiopia

Files in This Item:
Endale Ashagre.pdf (223.83 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.