
Frequently Asked Questions

To have an exam copied: Get a Testing Center Work Order from this web site or pick one up in the Testing Center. Fill out the top portion and right-hand side. Drop off your exam and work order at the Testing Center or e-mail them to hpdtc@nova.edu. The Testing Center will call you when the exam is ready to be picked up. Your exam package will have the exams and enough Scantron sheets for your class.

To have an exam created: Save your questions as a Word document and e-mail the file and a work order to hpdtc@nova.edu or drop by the Testing Center with a work order and the file on a flash drive. The Testing Center will format your questions and call or e-mail you to proof your exam. After you have approved the exam, the Testing Center will make copies for your class. They will call or e-mail you when the exam is ready to be picked up. Your exam package will have the exams and enough Scantron sheets for your class. NOTE: If you have the time to do so, you can greatly reduce the time it takes to produce your exam by formatting your questions yourself. See How to Format Your Questions in the Testing Center Manual or stop by the Testing Center for a lesson.

To have an exam scored: If the Testing Center created your exam, bring the students' Scantron sheets to the Testing Center and fill in the second page of your original Work Order. You will be called when your exam is scored. If the Testing Center did not create your exam, separate the students' Scantron sheets by the version they took. Create a test key for each version of the exam on a Scantron sheet. Then bring the Scantrons and keys to the Testing Center. Fill in the second page of your original Work Order. You will be called when your exam is scored.

To have an evaluation created: Meet with the Testing Center Coordinator (Jacquelyn Moore, extension 21733, e-mail jmoore@nova.edu, or Terry Building, Room 1522) to discuss your idea for an evaluation. Once your evaluation content and delivery system have been agreed upon, the evaluation will be produced. You will be contacted to proof it before multiple copies are made or it is posted online. After you have approved the evaluation, you will be informed when your copies are ready or the evaluation is ready to go live.

It depends on what you want the Testing Center to do and how busy they are.

Guidelines for peak times (midterms and finals)

  • To have an exam copied: 2 days.
  • To have an exam scored: usually no more than 4 days. During midterms and finals, scoring takes a back seat to test creation. The Testing Center does scoring on a first-come, first-served basis.
  • To have an exam created: 5 working days.
  • To have evaluations created or scored: not until after midterms and finals.

Guidelines for non-peak times

  • To have an exam copied: 1 day.
  • To have an exam scored: usually a few hours or, at most, 1 day.
  • To have an exam created: 3 working days.
  • To have evaluations created or scored: evaluations can be created within 2 days. Scoring may take longer depending on the number of evaluations submitted.

If you are in an emergency situation, please call before deciding that there isn't enough time. The Testing Center is flexible and willing to help out if at all possible.

Yes, with 2 provisions.

  1. The test is composed of true/false, multiple-choice, or limited matching questions (limited meaning no more than 5 possible choices).
  2. The students' responses are recorded on Scantron sheets.

It depends. If the Testing Center created your test, you do not need to separate the different versions. If they did not, you will need to separate your students' Scantrons by version.

For a complete discussion of the statistics provided by the Testing Center, please see the Analyzing Your Tests section in the Testing Center Manual.

When the Testing Center has scored your test, you will receive the following items: a roster for your files, a roster for posting, a roster showing raw scores, an item statistics report, a test statistics report, and the students' Scantrons. The reports and rosters all contain useful information to help you analyze your students' performance and the effectiveness of the exam.

File Roster
This roster lists scores for all the students in the class and calculates a course grade. It can be arranged alphabetically by student last name or by course grade rank. At the bottom of the report, for each exam and for the course grade, are the class average, the highest score, the lowest score, the median, and the standard deviation, along with the number of points each exam is worth.
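The bottom-line figures on the roster are ordinary descriptive statistics. As a rough illustration (the scores below are made up, not real roster data), they can be computed like this:

```python
from statistics import mean, median, pstdev

# Hypothetical exam scores for a small class (not real roster data).
scores = [72, 85, 90, 66, 88, 79, 94, 81]

summary = {
    "average": mean(scores),
    "highest": max(scores),
    "lowest": min(scores),
    "median": median(scores),
    "std_dev": pstdev(scores),  # population standard deviation
}

for name, value in summary.items():
    print(f"{name}: {value:.2f}")
```

The roster reports these same five figures for each exam and for the overall course grade.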

Posting Roster
This roster lists scores for all students in the class and calculates a course grade. It lists students' scores by their token ID numbers to protect their anonymity. At the bottom of the report, for each exam and for the course grade, are the class average, the highest score, the lowest score, the median, and the standard deviation, along with the number of points each exam is worth.

Raw Scores Roster
This roster lists raw scores for all the students in the class who took a particular exam. It also shows which version of the test each student was scored against.

Item Statistics Report
This report analyzes each question on an exam. For each question it shows:

  1. the question number,
  2. the type of question (TF = true/false; MCS = multiple-choice, single response; MCM = multiple-choice, multiple response),
  3. the number of points the question is worth,
  4. the number of people who took the exam,
  5. the p value of the question (an indication of the difficulty of the question: a p value of 1 indicates that all students answered the question correctly),
  6. the test score average of the students who answered the question correctly,
  7. the test score average of the students who chose each response,
  8. the point biserial for the correct answer, and
  9. the point biserial for each response (the point biserial is a measure of how well a question discriminates between better and poorer students: 0.30 and higher indicates a good discriminator).
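The two key figures here are simple to compute. The sketch below uses one standard formulation of the point biserial and invented data (the report's exact computation may differ):

```python
from statistics import mean, pstdev

# Hypothetical data: 1 = correct, 0 = incorrect on one question,
# paired with each student's total test score.
item_correct = [1, 1, 0, 1, 0, 1, 1, 0]
total_scores = [88, 92, 61, 79, 58, 85, 90, 65]

def p_value(correct):
    """Proportion of students answering correctly (difficulty index)."""
    return sum(correct) / len(correct)

def point_biserial(correct, totals):
    """Correlation between success on the item and total test score."""
    p = p_value(correct)
    q = 1 - p
    m1 = mean(t for c, t in zip(correct, totals) if c == 1)  # mean of correct group
    m0 = mean(t for c, t in zip(correct, totals) if c == 0)  # mean of incorrect group
    s = pstdev(totals)
    return (m1 - m0) / s * (p * q) ** 0.5

p = p_value(item_correct)
rpb = point_biserial(item_correct, total_scores)
print(f"p value: {p:.2f}")           # difficulty
print(f"point biserial: {rpb:.2f}")  # discrimination; >= 0.30 is good
```

Here the question is of moderate difficulty (p = 0.625) and discriminates well, since the students who got it right have much higher total scores than those who missed it.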

Test Statistics Report
This report provides a histogram of students' scores (a graph showing how many students scored within a given range), along with the highest score, lowest score, median, mean, standard deviation, and test reliability.
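A histogram simply counts how many scores fall into each range. The sketch below buckets invented scores into ranges of 10 and prints a text histogram plus the same summary figures; the report's reliability statistic is not reproduced here, since its exact formula isn't stated:

```python
from statistics import mean, median, pstdev
from collections import Counter

# Hypothetical final-exam scores out of 100.
scores = [55, 62, 68, 71, 74, 78, 81, 84, 85, 88, 91, 95]

# Bucket scores into ranges of 10 for a text histogram.
buckets = Counter((s // 10) * 10 for s in scores)
for low in sorted(buckets):
    print(f"{low:3d}-{low + 9:<3d} {'*' * buckets[low]}")

print(f"highest: {max(scores)}, lowest: {min(scores)}")
print(f"mean: {mean(scores):.1f}, median: {median(scores):.1f}, "
      f"std dev: {pstdev(scores):.1f}")
```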

Check the item statistics that are provided with your students' scores. The two statistics that will help you the most are the p value and the point biserial. The p value is an indication of the difficulty of the question. As the p value approaches 1, more students answered correctly, with a score of 1.0 meaning that every student got that question right. The point biserial indicates if a question was a good discriminator between the better students and the poorer students. A point biserial of 0.3 or higher indicates a good discriminator.

First, don't panic. Ask yourself a few questions.

  1. Did the students say they found the exam exceptionally difficult?
    If so, you may have just caught them off guard with a really hard exam. Our gradebook program allows scores to be curved or supports dropping the lowest score for a course. Or you could check the item statistics to see whether all the students missed certain questions. For those questions, check that the key is correct. If it is, but the question is ambiguous, you could either credit everyone or drop the question.
  2. Did the students fail by several points or just a few?
    If the students failed by several points, check that they were scored on the correct version. Scoring a student on Version A when he/she actually took a different version will normally result in scores in the teens. (There are some exceptions to this general rule. For example, if an exam is composed mostly of true/false questions, scoring on a wrong version won't make that much difference because true/false questions can't be scrambled.) The LXR scoring program automatically assigns Version A to those Scantrons that have a blank version bubble.

    If the students failed by just a few points, check the item statistics to see whether all the students missed certain questions. For those questions, check that the key is correct. If it is, but the question is ambiguous, you could either credit everyone or drop the question.
  3. Did every single student fail or did just a few fail spectacularly?
    If every student failed, check your keys and item statistics. Make sure that the keys are correct and that every question has only one correct answer. If the exam was particularly difficult, you may want to curve scores or eliminate questions that everyone missed.

    If just a few students failed spectacularly, check that they were scored under the right version. Also, check that they answered all the questions on the exam. The Testing Center can let you know if students left several questions blank.
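The two remedies above (dropping an ambiguous question, or crediting everyone for it) amount to simple rescoring rules. A minimal sketch, with an invented key format where 'X' marks a dropped question and '*' marks one credited to everyone:

```python
# Hypothetical answer key and student responses. In the key,
# 'X' marks a question to drop, '*' one credited to everyone.
key = ["A", "C", "X", "B", "*"]
responses = [
    ["A", "C", "D", "B", "A"],
    ["B", "C", "A", "B", "C"],
]

def rescore(key, answers):
    """Score a sheet, ignoring dropped questions and crediting '*' ones."""
    points = 0
    total = 0
    for k, a in zip(key, answers):
        if k == "X":            # dropped: excluded from the total entirely
            continue
        total += 1
        if k == "*" or a == k:  # credited to everyone, or answered correctly
            points += 1
    return points, total

for ans in responses:
    points, total = rescore(key, ans)
    print(f"{points}/{total}")
```

Note the difference: dropping a question shrinks the total, while crediting everyone keeps the total and raises every score.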

Yes, it can. The Testing Center's gradebook program, GradeMachine, has some very nice features. For example, it can curve scores, drop lowest scores, give bonus points, excuse selected students from exams, and give different weights to different categories of exams (for example, quizzes weighted differently from exams, which are weighted differently from finals). It also supports exporting students' scores to Excel for those professors who prefer to maintain their own gradebooks.
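To see how two of those features combine, here is a rough sketch of category weighting with a dropped lowest quiz. This is only an illustration with made-up weights and scores, not GradeMachine itself:

```python
# Hypothetical category weights; GradeMachine's actual configuration differs.
WEIGHTS = {"quizzes": 0.2, "exams": 0.5, "final": 0.3}

def category_average(scores, drop_lowest=False):
    """Average a category, optionally dropping the single lowest score."""
    scores = sorted(scores)
    if drop_lowest and len(scores) > 1:
        scores = scores[1:]
    return sum(scores) / len(scores)

def course_grade(quizzes, exams, final):
    """Weighted course grade across the three categories."""
    return (WEIGHTS["quizzes"] * category_average(quizzes, drop_lowest=True)
            + WEIGHTS["exams"] * category_average(exams)
            + WEIGHTS["final"] * final)

grade = course_grade(quizzes=[60, 90, 95], exams=[80, 84], final=88)
print(f"course grade: {grade:.1f}")
```

Dropping the 60 raises the quiz average from 81.7 to 92.5 before the 20% quiz weight is applied.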

Yes. Just provide the Testing Center a roster with the scores to be added, either as a hard copy or as an Excel file (preferably with students' N numbers in a column). If you don't have a roster, ask the Testing Center to print a master roster for you. It may need to be adjusted to reflect actual student enrollment.
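An N-number column matters because it is the key used to line your scores up with the roster. A minimal sketch of that matching step, with invented names and N numbers:

```python
# Hypothetical roster and instructor-supplied scores, keyed by N number.
roster = {"N0001": "Lee", "N0002": "Ortiz", "N0003": "Patel"}
added_scores = {"N0001": 87, "N0003": 92}  # e.g. scores from a paper assignment

merged = {}
for n_number, name in roster.items():
    # Students with no submitted score stay None, so enrollment
    # mismatches and missing entries are easy to spot and fix.
    merged[n_number] = (name, added_scores.get(n_number))

for n_number, (name, score) in merged.items():
    print(n_number, name, score)
```

Any `None` entries flag exactly the roster adjustments mentioned above: students who dropped, added late, or simply have no score yet.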
