Williams (1975) recommended that instructors spend time presenting the rationale underlying the multiple regression computational algorithm. Students could then apply what they had learned to solving a problem with a real data set and a computer software application. Williams argued that this approach would lead to greater retention and greater flexibility in applying knowledge to different domains than the cookbook approach found in many statistics texts.
Dunn (1989) proposed that researchers and instructors should consider graphics to assess the appropriateness of a data set for multiple regression (i.e., to test for violations of multiple regression assumptions). In addition, Dunn asserted that graphics could be an effective alternative to more automated processes (e.g., stepwise procedures) for building multiple regression models. Ip (2001) encouraged instructors to use Venn diagrams to enhance students' understanding of multiple regression concepts such as the coefficient of determination. He reported that students taught with Venn diagrams scored significantly higher on a sample assessment measure (e.g., a question on multicollinearity) than did students taught multiple regression using traditional methods. Kennedy (2002) proposed that instructors could also use Venn diagrams to teach students about the bias and variance of coefficient estimates in multiple regression analyses.
Hershberger (2003) examined the number of articles in PsycINFO and in the journal Structural Equation Modeling to determine the prevalence of SEM in both sources. His analysis demonstrated that the total number of SEM articles nearly doubled, from 264 articles in 1994 to 527 published studies in 2001. Moreover, Hershberger found that in 2001 the number of articles in PsycINFO that employed SEM (381) was greater than the total number of articles published using four other multivariate techniques combined (MANOVA, cluster analysis, discriminant analysis, and multidimensional scaling).
Instructors often give a great deal of attention to the beginning of a class but may simply watch students disappear at the end of the term. Eggleston and Smith (2004) asserted that when instructors attend to the end of a course, students not only experience closure but also feel increased rapport and connection with the instructor and fellow students. Additionally, last-day activities may help students develop a sense of accomplishment for what they learned and may even encourage them to take additional coursework on the topic. Eggleston and Smith presented a range of methods for effectively ending a course. Although not all of their suggestions would be appropriate for a statistics course, some would work quite well. At the end of a course, instructors can have students write letters to future students discussing what they learned and how they benefited from the course. Students finishing the course may find these letters a useful way to document their understanding of statistics, and future students might benefit from selected letters that include words of encouragement and advice. If instructors gave pre-tests at the beginning of the term, they could administer post-tests to demonstrate student learning and increased reasoning and thinking skills. As additional parting techniques, Eggleston and Smith suggested providing students with certificates of accomplishment that instructors can easily print on high-quality paper, and sending a parting e-mail to the class to demonstrate warmth and appreciation.
Tyrrell (2003) presented an interesting last-day activity involving packages of "crisps," better known in the United States as potato chips. He passed out individual packages of chips and had students brainstorm about all of the statistical tools they had learned that term and how to apply them to their supply of chips. Suggestions ranged from descriptive statistics (e.g., the mean number of chips per bag) to inferential techniques (e.g., comparing the weights of various chip samples against the population norm specified on the bag). Of course, most students suggested taste tests for the various types of chips.
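Tyrrell's chip activity lends itself to a quick computational illustration. The sketch below uses hypothetical bag weights (the values, the 28 g label claim, and the variable names are assumptions for illustration, not Tyrrell's data) to compute the descriptive statistics and the one-sample t statistic students might produce when comparing their sample to the norm printed on the bag:

```python
import math
import statistics

# Hypothetical net weights (grams) of six bags of chips; the 28.0 g label
# claim plays the role of the population norm printed on the bag.
weights = [28.1, 27.6, 28.4, 27.9, 28.0, 27.5]
label_claim = 28.0

mean = statistics.mean(weights)   # descriptive: average bag weight
sd = statistics.stdev(weights)    # descriptive: sample standard deviation

# Inferential: one-sample t statistic comparing the sample mean
# to the label claim (df = len(weights) - 1)
t = (mean - label_claim) / (sd / math.sqrt(len(weights)))

print(f"mean = {mean:.2f} g, sd = {sd:.2f} g, t = {t:.2f}")
```

Students could then look up the t statistic against a critical value, turning a bag of chips into a complete hypothesis-testing exercise.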
Dunn, R. (1989). Building regression models: The importance of graphics. Journal of Geography in Higher Education, 13, 15-30.
Eggleston, T. J., & Smith, G. E. (2004). Parting ways: Ending your course. In B. Perlman, L. I. McCann, & S. H. McFadden (Eds.), Lessons learned: Practical advice for the teaching of psychology (Vol. 2, pp. 71-79). Washington, DC: American Psychological Society.
Ip, E. H. S. (2001). Visualizing multiple regression. Journal of Statistics Education, 9(1). Retrieved July 31, 2007, from http://www.amstat.org/publications/jse/v9n1/ip.html
Kennedy, P. E. (2002). More on Venn diagrams for regression. Journal of Statistics Education, 10(1). Retrieved July 31, 2007, from http://www.amstat.org/publications/jse/v10n1/kennedy.html
Martens, M. P., & Haase, R. F. (2006). Advanced applications of structural equation modeling in counseling psychology research. The Counseling Psychologist, 34, 878-911.
Tyrrell, S. (2003). A walker runs through the syllabus. Teaching Statistics, 25, 26-27.
Williams, R. H. (1975). A new method for teaching multiple regression to behavioral science students. Teaching of Psychology, 2, 76-78.