*Proceedings of the Canadian Engineering Education Association*.

Miller-Young, J.E. (2013). Calculations and expectations: How engineering students describe three-dimensional forces. *The Canadian Journal for the Scholarship of Teaching and Learning, 4*(1), Article 4, 1-11.

I have grouped these two papers together since they are almost the same. The first is a 2010 conference paper and the second is a 2013 journal paper which includes all the 2010 work as well as a bit more data. The study was interested in digging into the details of how students visualise three dimensional statics problems when what they are presented with is a 2-d diagram. The data collected was students’ think-aloud processes of answering two questions, one without context and the other in a real-world context. The 2013 paper also included data on a quiz question which was part of a standard course assignment. All three problems required that the students see the page as the given vertical coordinate plane (*xy* in the three problems) and the third axis (*z*) extending out of the page in the positive direction. Points with a negative *z*-coordinate, in other words, are behind the plane of the page.

The students seemed to find the problems relatively difficult. The author found three main themes in student errors. (1) The students struggled to visualise points behind the plane of the page, or vectors which extended behind the plane of the page. The two-dimensional drawing on the flat page had to be visualised as a three-dimensional collection of vectors, and the students found that particularly tricky for vectors extending backwards relative to their gaze. (2) The students did not always use the provided context to help them visualise the problems. One of the problems involved a pylon with guy ropes attaching to the ground, which was idealised as the flat *xz*-plane. All the ends of the guy ropes in this problem were on the *xz*-plane and had a *y*-coordinate of zero, yet some students struggled to see that. (3) The students reached too quickly for equations to try and answer questions, even when there was not enough information to answer the question that way. The tendency to calculate something using a formula is ubiquitous across all maths and physics teaching and is no surprise. This final data point serves only to add to the depressing mountain of similar results.
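The coordinate convention at issue can be made concrete in a few lines of code. This is a minimal sketch with hypothetical coordinates (not taken from the paper's problems): the page is the vertical *xy*-plane, *z* points out of the page, ground anchors lie on the *xz*-plane with *y* = 0, and a negative *z*-component means the rope extends behind the page.

```python
# Sketch of the coordinate convention described above: the page is the
# vertical xy-plane and z points out of the page toward the reader.
# All coordinates here are hypothetical illustrations.
import math

pylon_top = (0.0, 10.0, 0.0)   # top of the pylon, in the plane of the page
anchor    = (4.0, 0.0, -3.0)   # rope anchor on the ground, behind the page

# Every ground anchor lies on the xz-plane, so its y-coordinate is zero.
assert anchor[1] == 0.0

# Vector along the guy rope, from pylon top to anchor.
rope = tuple(a - p for a, p in zip(anchor, pylon_top))
length = math.sqrt(sum(c * c for c in rope))

# A negative z-component means the rope extends behind the plane of the page.
print(rope)              # (4.0, -10.0, -3.0)
print(round(length, 3))  # 11.18
```

Seeing that the rope vector's *z*-component is negative, while its anchor's *y*-coordinate is exactly zero, is precisely the visualisation step the students found difficult.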

*Do not treat this blog entry as a replacement for reading the paper. This blog post represents the understandings and opinions of Torquetum only and could contain errors, misunderstandings or subjective views.*

*Proceedings of the 2005 American Society for Engineering Education Annual Conference & Exposition*.

The authors report on their progress in developing the Dynamics Concept Inventory (DCI), an MCQ-format assessment of 30 questions on concepts used or needed by students in a mechanical engineering dynamics course. The process followed to achieve the final product was thorough, involving polling multiple lecturers of dynamics across several institutions, developing questions, piloting the instrument and going through various phases of refining the instrument. The DCI is available online by contacting the developers through the DCI website.

In this paper, the authors describe the process towards the development of the final version of the instrument and give a list of the concepts involved. They also provide much statistical evidence for the reliability and validity of the instrument. A few items on the test are pulled out for special scrutiny to illustrate clear evidence of misconceptions. The authors are clearly in favour of the test being used in pre-test/post-test format. Their website encourages this format and the DCI developers request that anyone using the test send them the raw data so that they can use the data to further verify the discriminatory power of the instrument.

It would be interesting to run the DCI on one of our cohorts of dynamics students and see if any of the results correlate with our vector assessment results.

*Eurasia Journal of Mathematics, Science & Technology Education*, 12(9), 2387-2398.

Following their earlier work (2013, 2014) on determining frequent errors in vector problems, the authors developed a tutorial carefully designed to address the conceptual difficulties students experience with vector projections. The tutorial is presented in an Appendix to the paper. It consists of six sections, requiring the students to determine projections geometrically as well as by using the |A||B|cos(θ) definition, across a range of θ values. The final section of the tutorial explicitly addresses the observed confusion students experience between the scalar product and vector addition. The paper closes with an open invitation to teachers to use their tutorial. I am tempted to do just that.
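The two routes to the scalar product that the tutorial exercises can be checked against each other numerically. This is a small sketch with illustrative vectors of my own choosing (not the tutorial's values), comparing the component form A·B = ΣAᵢBᵢ with the geometric form |A||B|cos(θ):

```python
# Compare the component and geometric definitions of the scalar product.
# The vectors and angle are illustrative choices, not the tutorial's own.
import math

A = (3.0, 4.0)
B = (5.0, 0.0)

def mag(v):
    return math.sqrt(sum(c * c for c in v))

# Component form: A·B = sum of products of components.
dot_components = sum(a * b for a, b in zip(A, B))   # 3*5 + 4*0 = 15

# Geometric form: |A||B|cos(theta). Since B lies along the x-axis,
# the angle between A and B is just the polar angle of A.
theta = math.atan2(A[1], A[0])
dot_geometric = mag(A) * mag(B) * math.cos(theta)   # 5 * 5 * 0.6 = 15

print(dot_components, round(dot_geometric, 9))      # 15.0 15.0
```

Working both routes on the same pair of vectors, as the tutorial's sections do, is exactly the kind of exercise that forces the geometric and algebraic pictures to be reconciled.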

*Proceedings of the 41st SEFI Conference* (pp. 1-8).

De Laet and De Schutter are robotics researchers and lecturers of 3D kinematics and statics. They observed that students struggle with the concepts and the notation of the subject and that their struggles were related to challenges roboticists experience with non-standardised coordinate representations and related software. They developed a semantics underlying the geometric relationships in 3D kinematics and a notation designed to make relationships clearer and eliminate errors experienced while working across different coordinate representations.

Neither kinematics nor robotics is a speciality of mine, so I might be phrasing my summary badly. I hope I’m correctly representing the work discussed here. The authors claim that their students have benefited from the new notation, making fewer errors than before, and that roboticists have also welcomed the new notation. I particularly liked two bits of this paper. The first bit is the explicit admission that engineers and engineering students need to be aware of the different terminology and notation which can exist across even closely related disciplines – “it is important that students are aware of the lack of standardisation and the implications this might have when reading textbooks or consulting literature” (p. 2) – which relates to my concern about vector notation. The second bit is the attention the authors pay to threshold concepts, which has long been a theory I have tried to apply to vectors, with little luck so far. Reading this paper has given me some new ideas, not least that I would probably enjoy a SEFI conference!

*Physical Review Special Topics-Physics Education Research*, 10(1), 010121-1-010121-14.

Barniol and Zavala describe a really nicely designed investigative project. In the first phase they conducted several studies over a period of four years, using open-ended problems in order to develop a taxonomy of frequent errors students make when solving vector problems. At the same time, they sought references in the literature to frequent vector errors. In the second phase, they developed a test in multiple-choice format, named the “Test of Understanding of Vectors” (TUV). They administered this test to over 400 physics students and thereafter observed the categories of errors and the frequencies of errors in different classes of problems.

I really admire the TUV, the preliminary work that went into designing it and the detailed analysis of the errors made. I feel the authors left the “so what?” question up to the reader, making a few minor suggestions about other people using the test in similar ways, but not making any broad assertions about teaching or learning or cognitive concept formation. I hope Barniol and Zavala have written further on this topic, as the work laid out in this paper is admirable and provides much food for thought.

*AIP Conference Proceedings* (Vol. 1513, No. 1). American Institute of Physics, Melville, NY, United States.

Zavala and Barniol ran a project investigating students’ understanding of the projection role of the dot product. First, they gave three isomorphic problems to a class of physics students, with approximately 140 students seeing each of the three problems. The following semester they interviewed 14 of these students, who solved the three problems (and two more) while thinking aloud. The three problems all involved the same arrangement of vectors requiring the same projection; however, one was no-context and the others were in context, namely work and electric flux. The investigation found that the students had a weakly constructed concept of the projection role of the dot product. The students were more likely to answer the question correctly in the contextualised problems than in the no-context problem; however, even at best only 39% of the students answered correctly.

The authors observe that a majority of the students chose one of the two responses which described scalar quantities, rather than the four other MCQ options which described vector quantities. However, in the no-context problem that majority is a disappointingly low 57%. Problematically, I would use the term “projection” differently to Zavala and Barniol: for me, the projection of a vector onto another vector is still a vector, not a scalar quantity. The projection of A onto B in my lexicon is the vector component of A in the direction of B. By projection, Zavala and Barniol mean what we elsewhere (Craig and Cleote, 2015) have referred to as “the amount” of A in the direction of B (p. 22). So, given my definition, there is only one available MCQ option which describes a scalar quantity (option a, a popular incorrect option). I have to assume that the students participating in the study were familiar with the authors’ definition, however, and would have seen that MCQ option as describing a scalar quantity.
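The terminological distinction at issue is easy to state in code. This is a minimal sketch with arbitrary illustrative vectors (not taken from the TUV or the interview problems): the “amount” of A in the direction of B is the scalar A·B/|B|, whereas the vector projection of A onto B is that scalar times the unit vector along B.

```python
# Scalar "amount" of A along B versus the vector projection of A onto B.
# The vectors are arbitrary illustrative values.
import math

A = (2.0, 3.0, 1.0)
B = (4.0, 0.0, 3.0)

dot = sum(a * b for a, b in zip(A, B))      # A·B = 8 + 0 + 3 = 11
mag_B = math.sqrt(sum(c * c for c in B))    # |B| = 5

# "Amount" of A in the direction of B (Zavala and Barniol's "projection").
scalar_projection = dot / mag_B             # 11/5 = 2.2, a scalar

# Vector component of A in the direction of B (projection in my lexicon).
vector_projection = tuple((dot / mag_B**2) * b for b in B)   # (A·B/|B|²) B

print(scalar_projection)                                  # 2.2
print(tuple(round(c, 2) for c in vector_projection))      # (1.76, 0.0, 1.32)
```

The two answers are different kinds of object entirely, a scalar and a vector, which is precisely why the choice of definition matters when grading MCQ options as “scalar” or “vector” responses.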

The authors cite other work reporting students’ difficulties in connecting concepts and formal representations. They see this dot product projection difficulty as part of that more general situation. “In this article we demonstrate that this failure to make connections is very serious with regard to dot product projection’s formal representation” (p. 4).

Not much has been written on students’ difficulties with the dot product. It is likely that the computational simplicity of the product masks the conceptual challenge of the geometric interpretation.

van Dyke, F., Malloy, E.J. and Stallings, V. (2014). An activity to encourage writing in mathematics.

The authors ran a very interesting study in three stages. The first stage was a short assessment of three MC questions involving the relationships between equations and their representative graphs, the first two only linear, the third linear and quadratic. The questions were quickly answered and easily graded. The second stage was giving summaries of student responses back to the class for discussion (not led by the lecturer). Directly after discussion the students were asked to write about the questions. One group was asked to write about the underlying concepts necessary to answer the questions correctly. The second group was asked to write about why students might have given incorrect answers. These written responses were also evaluated. The third stage of the study was to ask the students to answer a survey (strongly agree/disagree Likert style) testing hypotheses developed during the first two stages. The findings are interesting and point, yet again, to the student tendency to want to do calculations even if the question might not require them - “blind attraction to processes” (p. 379) - and also to the expectation that similar problems should have been encountered before. Interestingly, the students in the second writing group wrote more than those in the first, but did not make many references to actual underlying concepts. The authors stress that if you want students to write or talk about underlying concepts you need to make that explicit.

*Canadian Journal of Science, Mathematics and Technology Education*, 14(4), 371-387.

The authors present the design of this study as a way of using writing to encourage reflection without it taking a lot of time or being difficult to grade. I agree and would like to try this myself. Running effective writing assignments in a maths class can be very hard to get right. The authors make reference to cognitive conflict and how resolving a cognitive conflict can lead to cognitive growth. “It is not the intent of this article to explore the efficacy of using writing or conflict resolution in the mathematics classroom but to take that as given …” (p. 373).

*European Journal of Engineering Education* (ahead of print), 1-11. DOI: 10.1080/03043797.2015.1059408.

The authors define living learning communities (LLCs) as follows: “Most LLCs are communities in which students pursue their academic curriculum with a blended co-curriculum involving a theme, concept, or common subject matter while living together in a reserved part of a residence hall” (p. 2), and “LLCs can be characterised by close working relationships among students and faculty; specialised course assignments; study groups; close relationships among student members; and specialised events, activities and workshops” (pp. 2-3). They report on a survey carried out in an engineering living learning community (ELC). The authors argue that LLCs might be particularly beneficial to engineering students, given their having to adjust not only to a new environment as entering students, but also to a heavy course load. In this ELC, the students lived together in the same residence hall, took two common courses per semester, and their classes encouraged cooperative learning. Students applied to join the ELC, which allowed the authors to compare two cohorts – the ELC students and the non-ELC students enrolled for the same courses. The students were surveyed for their perceptions of transition, student-student relationships, student-faculty relationships and levels of satisfaction with the institution. The findings show that the ELC students perceived their transition to college to be easier than the non-ELC students did. They also reported better student-student relationships and greater satisfaction with and connectedness to their institution. The two groups were about the same in their perceptions of student-faculty relationships. The authors conclude: “It is recommended that LLCs be used to foster positive student perceptions of transition to college, connectedness to the institution, peer relationships, and their overall satisfaction with the institution” (p. 9).

This paper has many references to other studies on LLCs, reporting on many and varied benefits. For example “Literature suggests that peer interactions of this sort [friendships, networking, study groups] increase student involvement and participation, which in turn are positively linked to institutional retention” (p. 7) and “First-year students who are easily able to transition from high school to college are more likely to stay at the institution and graduate, positively impacting retention rates” (p. 5). I would really like to look up all of those references and see which ones are based on actual hard data. Altogether I enjoyed this paper, found the data interesting and plan to follow up on several of the references.

*Journal of Engineering Education*, *89*(4), 443-459.

The author lists 150 published problem-solving strategies, although he complains that few are based in research. In an appendix after the (huge number of) references, he briefly gives each of those strategies. The author finds many similarities between the strategies, such as “understand the problem” and “verify your answer”. The strategies vary in the number of stages, but it is usually between two and seven. Some have mnemonics, some draw analogies. The author describes careful criteria for an effective problem-solving strategy, such as “If possible, none of the stages should include titles describing a skill or attitude … since that skill or attitude could be used in many different stages” (p. 444). He warns against encouraging a linear mindset towards the strategy by, for instance, numbering the steps or giving them in an obvious sequence. The author presents a strategy represented on a disc, rather than linearly, with six stages, each carefully described. The strategy was based on the literature, trialled on expert practitioners and refined over years of student use.

*Journal of Engineering Education*, *88*(4), 477-483.

The authors describe an assessment regime carried out in an engineering science course, designed to reward students for skills valued in engineers. The assessment regime was run three times in consecutive semesters and the authors feel that it worked well although there were a few challenges. Four assessment types were carried out.

**Readiness assessment tests**: These encourage being prepared for class by setting one or two questions based on a reading assignment to be done before class. They seem to have been carried out twice a week, but I could see them being almost daily. It was tricky to set questions which did reward reading and understanding. After the course, a significant correlation was found between preparation for class and success in the course.

**Basic understanding tests**: These were conceptual in nature, testing understanding of physical phenomena. They involved no mathematics. These were held once per week.

**Major evening examinations**: These were also conceptual in nature and bore the closest resemblance of all the assessments to traditional exams. The questions were tough and required engineering-type skills such as simplification. The problems were also frequently ill-defined, and there could be a variety of solutions. There were three or four of these per semester.

**Minimum skills tests**: Again, these were once a week and were made up of simpler versions of the prior week’s homework. They were multiple choice, which meant no partial credit.

The authors argue for these assessments as criterion-referenced, and argue against the use of norm-referenced assessment. I found their insistence on no partial credit interesting. The paper presents analysis of the results as well as a comparison with what the grades would have looked like if only the major evening examinations (as the closest to traditional exams) had been used. Various challenges were discussed, such as the trickiness of determining validity and reliability, and also the students’ struggles with this new type of assessment and the expectations of them.