**RESEARCH LETTER**

**Competence of Science Foundation students in basic intellectual skills**

**Mailoo Selvaratnam; Nkosana Mavuso**

Science Foundation, North-West University (Mafikeng Campus), South Africa


**ABSTRACT**

The competence of Science Foundation students at the Mafikeng Campus of North-West University in some basic intellectual skills was studied, over a period of three years, utilising carefully designed questions. The skills tested included language, mathematical, graphical, three-dimensional visualisation, information processing and reasoning skills. The results showed that their competence in the basic intellectual skills needed to study science effectively was far below standard. This lack of competence could be expected to be detrimental to self-confidence and may also be an important reason for the high failure rate of students in their science courses. We conclude by suggesting that much greater emphasis should be placed on the systematic and sustained training of students in intellectual skills and strategies of various types, and that such training should be integrated, throughout the courses, with the teaching of subject content.

**Keywords:** intellectual skills; thinking skills; problem solving; students' difficulties; effective learning

**INTRODUCTION**

The various types of intellectual skills and strategies that are important for learning and problem solving have been organised and discussed comprehensively by Marzano et al.^{1} It is generally believed that they are the 'tools' for all our mental activities. They are particularly important for successful problem solving, an important part of most science courses. Even though extensive research has been done on problem solving,^{1,2,3,4,5,6,7,8} not much work has been done on the systematic testing of individual students' competence levels in the basic intellectual skills and strategies that are essential for effective problem solving. Such testing would help in the identification of some fundamental reasons for failures in problem solving and can hence assist in remedial action through appropriate instruction.

Problem solving generally involves multiple steps and often requires a combination of various basic intellectual skills and strategies. We tested some of these basic intellectual skills on Science Foundation students. These students did not qualify for admission into BSc degree programmes, and the main objective of the Science Foundation programme is to prepare them for tertiary education. As part of her PhD thesis, Drummond^{9} tested the competence of first-year BSc science students at the Mafikeng campus of North-West University in some basic skills and strategies. Our study differs from Drummond's in the type of students tested as well as the type of questions used in the testing; the questions we used were also more basic. Emphasis was placed on designing the simplest possible questions for testing each type of basic intellectual skill, namely: language, mathematical, graphical, three-dimensional visualisation, information processing and reasoning skills. Many of the questions we designed did not involve scientific concepts.

**OBJECTIVES AND METHOD OF STUDY**

The objective of this study was *diagnostic* testing, not performance testing: the aim was to identify students' thinking difficulties. To help ameliorate the obstacles students were likely to encounter, workshops on basic scientific reasoning were also conducted by the authors of this paper. A detailed report on students' difficulties was also circulated to the various lecturers so that they could attempt to remedy students' shortcomings in their courses.

We did not attempt to compare and statistically analyse students' performances in the different questions because the study involved diagnostic testing and the different questions tested different types of skills at different levels of difficulty. To test competence in intellectual abilities, a question had to satisfy the following two criteria: (1) content knowledge should not cause any difficulties, and (2) the solutions to the questions should not already be known to the students (to prevent them from merely recalling the solution). To help satisfy these criteria, questions were designed that were expected to be unfamiliar to students. Additionally, the questions were such that their solutions needed only simple principles (i.e. rules, laws and procedures). The principles that were needed to answer a question were provided in the question itself.

Two types of questions (multiple-choice and structured) were used for testing; multiple-choice questions were used wherever possible. The distracters in these questions were selected to correspond with common errors made by students.

This study was done over a period of three years (2006, 2007, 2008). There were 110 students in 2006, 99 students in 2007 and 104 students in 2008. All the students participated in the tests and the total number of students tested was 313. Different question papers were given each year, although some questions were repeated. In 2006, 28 questions were used (in one question paper), 69 questions in 2007 (in five question papers, one for each subject discipline of the Science Foundation course: chemistry, physics, biology, mathematics and language) and 15 questions in 2008 (in one question paper). These question papers may be obtained from the authors. Although no time limit was set, more than 75% of the students submitted their answer scripts within one and a half hours. Students were tested within one month of their entry into the course. They wrote the answers on the question paper itself, where ample space was provided below each question. Because the main objective of the study was to investigate how students think and how they set about solving problems, they were given the following written instructions (along with other instructions on the title page of the question paper):

Show the steps in your reasoning. Also do all the "rough work" in the space provided adjoining each question because the main objective of this test is to determine your thinking processes.

A representative sample of the questions is given in Table 1. These questions were classified into six types based on the type of skill tested: language, mathematical, graphical, three-dimensional visualisation, information processing and reasoning.

**RESULTS AND DISCUSSION**

The percentages of students who correctly answered each question are shown in the last column of Table 1 (under heading: % correct). These results showed that competence of students in most of the intellectual skills tested was poor. Drummond^{9} reached a similar conclusion in her study of first-year students.

**Language skills** were tested by the first three questions in Table 1. Question 1 tested the students' understanding of four central concepts in the study of science: 'description', 'explanation', 'qualitative' and 'quantitative'. The students' performances, averaged across all the parts, showed that about 40% of the students were unable to distinguish between descriptive and explanatory statements and between qualitative and quantitative statements. (The distinction between description and explanation is crucial in science, and many students' failure in some examination questions is due to their giving a descriptive answer when an explanation was required. Giving an explanation is often associated with answering the question 'why', and it generally involves cause-effect reasoning.)

Students' performances in Question 2, which tested only qualitative understanding of some words or phrases important in science (for example: 'directly proportional', 'inversely proportional', 'variable', 'constant', 'percentage', 'prediction', 'estimation'), were better: more than 65% of their answers were correct. Their quantitative understanding of these words/phrases, however (which was tested by other questions not given in Table 1), was poor. For example, about 45% of the students were unable to convert a statement of a familiar inverse relationship (between the time *t* needed to perform some task and the number of men *N* doing the task) into an equation.

Question 3 tested the ability to identify the variable quantities, and also the type of relationship (whether directly proportional or inversely proportional) between these quantities, in a non-scientific problem. Even though the statements in the question were familiar and simple, about 20% of the students (averaged across their performance in parts a, b and c) had difficulty. Since scientific statements are generally more abstract, students' difficulties with them could be expected to be greater. The ability to analyse verbal statements carefully and critically so as to identify the nature of the information they provide (e.g. whether they are experimental facts or theoretical statements), and also the various types of information in them (e.g. whether the quantities involved are variables or constants, whether the relationship between the variables is directly proportional or inversely proportional), is fundamentally important in all types of learning. Many students experienced difficulty with this type of reasoning, which seems to indicate a need for specialised training in this important aspect throughout their study courses.

**Mathematical skills** were tested by Questions 4 to 8 in Table 1. Questions 4 and 5, respectively, deal with numbers in scientific notation and with the basic mathematical operations (addition, subtraction, multiplication, division) involved in them. The results for Question 4 (see last column in Table 1) showed that about 45% of the students tested were unable to arrange correctly, in decreasing order, the four numbers given in the question. About 30% of them also had difficulty in carrying out basic mathematical operations with numbers (averaged across their performance in Questions 5a to 5d). About 40% did not recognise the principle that to add, subtract, multiply or divide two quantities (see Questions 5e and 5f), they must first be converted into the same units (e.g. 1.2 x 10^{-4} km - 1.0 x 10^{-2} m = 1.2 x 10^{-4} km - 0.1 x 10^{-4} km = 1.1 x 10^{-4} km). Mathematical operations involving logarithmic numbers (Question 5g) caused even greater difficulty: about 50% of the students incorrectly worked out that log 18 + log 7 = log 25.
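The two arithmetic points above can be verified directly. The following sketch (the values are those quoted in the text; the code itself is only an illustration, not part of the study) shows the unit-conversion step and why log 18 + log 7 equals log 126, not log 25.

```python
import math

# Unit conversion before subtraction: convert 1.0 x 10^-2 m to km first.
a_km = 1.2e-4                 # 1.2 x 10^-4 km
b_km = 1.0e-2 / 1000.0        # 1.0 x 10^-2 m = 0.1 x 10^-4 km
result_km = a_km - b_km       # 1.1 x 10^-4 km
print(f"{result_km:.1e} km")  # 1.1e-04 km

# Logarithms add when their arguments multiply: log a + log b = log(a*b).
s = math.log10(18) + math.log10(7)
print(math.isclose(s, math.log10(126)))  # True
print(math.isclose(s, math.log10(25)))   # False (the common student error)
```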

Questions 6 to 8 tested many types of intellectual skills associated with equations. Question 6 tested qualitative understanding, particularly how a quantity in an equation will change (whether it will decrease or increase) when the other quantities are changed. The results showed that about 20% of the students were unable to interpret qualitatively how the various quantities in the equation *F = k m_{1} m_{2} / r^{2}* depend on one another. Question 7(a) tested their ability to apply the equation *R = k c^{2}* to do a calculation. This equation, however, had to be applied twice: first to calculate *k* using the given data, and then to use the value of *k* to do the needed calculation. Student performance was not good, with about 70% being unable to do the calculation. The understanding of equations and the ability to use equations for calculations are very important in all quantitative sciences, and a lack of competence in this skill will seriously handicap students' performances. Answers to the second part of Question 7 also indicated that about 35% of the students did not recognise that a constant term in an equation does not change when the variables are changed.
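The two-step use of an equation of the form *R = kc^{2}* can be sketched as follows. The data values here are hypothetical stand-ins (the actual values are in Question 7 of Table 1); the point is only the order of the steps: determine the constant first, then reuse it.

```python
# Step 1: determine k from a given (R, c) data pair (hypothetical values).
R1, c1 = 8.0, 2.0
k = R1 / c1**2        # k = R / c^2 = 2.0

# Step 2: use that k to calculate R at a new concentration.
c2 = 3.0
R2 = k * c2**2        # R = k * c^2 = 18.0
print(k, R2)
```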

Question 8 tested the important ability to combine two equations (*c = n/V* and *pV = nRT*) so as to obtain an equation that relates *c* and *p* (which is *c = p/RT*). This combination merely required the elimination of a constant quantity (*n* or *V*) found in both equations. The table shows that over 75% of the students were incapable of doing this. This inability would seriously limit problem solving because combinations of equations are needed for the solution of most quantitative problems.
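The elimination step can be written out explicitly: dividing the ideal-gas relation through by *V* expresses the ratio *n/V* directly, and that ratio is *c*:

```latex
pV = nRT \;\Rightarrow\; \frac{n}{V} = \frac{p}{RT},
\qquad c = \frac{n}{V} = \frac{p}{RT}
```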

**Graphical skills** were tested in Questions 9 and 10 (see Table 1). They tested, respectively, the ability to deduce the information provided by a simple graph and to correlate linear graphs with an equation (*p^{1/2}V = k*). The results showed that students' level of competence in graphical skills was unsatisfactory. More than 55% were unable to find, from a simple graph involving familiar quantities (distance travelled vs time), the distance travelled between two time values (Question 9a). The skills needed to do this are very simple. Over 75% of them also reasoned, incorrectly, that because *d* increases linearly with *t*, the speed also increases linearly (Question 9b). Question 10 tested students' understanding of the relationship between linear graphs and equations. Over 90% could not correctly decide which plots would give a linear graph for the simple equation *p^{1/2}V = k*.

**Three-dimensional visualisation skills** were tested in Questions 11 and 12. Students' answers to Question 11 showed that over 80% did not recognise, from a drawing of the *x, y* and *z* axes, that each of the three angles between these axes is 90°. They did not, therefore, visualise the drawing of the *x, y* and *z* axes three-dimensionally. Only about 25% (performance averaged across Question 12) could visualise a cube drawn on paper three-dimensionally. Three-dimensional visualisation of two-dimensional drawings (e.g. the structures of molecules and their arrangement in solids) is important for understanding many aspects of science, and students' lack of competence in this aspect will be an important limiting factor in their learning and understanding of science.

**Information processing skills** were tested in Questions 13 and 14 (see Table 1). Identification of the information given in statements, and its processing, is an important skill because it is often crucial to successful problem solving. The results for Question 13 showed that 18% of the students could not accomplish the simple task of identifying the information explicitly stated in the problem statement. These results are indicative of the students' difficulties with the linguistic interpretation of problem statements, an inability to focus, and a lack of self-confidence, all of which inhibit their overall mental ability and performance.

Question 14 tested the ability to convert the information provided by statements into equations. The results showed that about 60% of the students tested could not do this. The conversion of the information provided by statements into equations is a very useful skill. Many students struggle to use verbal reasoning (reasoning using statements) for calculations, and we suggest that a method for bypassing verbal reasoning is to first convert the information provided by statements into equations and then use the equations for calculations.^{10}

**Reasoning skills** and others (e.g. focusing skills, organising skills, ability to proceed systematically) were tested in Questions 15 to 20. Question 15 tested the ability to apply a basic law (the law of conservation of mass), which was given in the problem as a statement, to solve two simple problems. The results showed that about 75% of students were unable to perform this task. Under-performance in this simple question, as well as in other questions that involve the use of statements for reasoning, suggests that most students have difficulty with verbal reasoning. A useful strategy for overcoming this difficulty would be first to convert, using the appropriate principles, the information given in statements into equations and then use these equations for calculations. For Question 15, the equation relating the relevant quantities, by the law of conservation of mass, is:

*m_{A}(at 25 °C) = m_{A}(at 80 °C) + m_{B}(at 80 °C) + m_{C}(at 80 °C)*

Questions 16 and 17 tested the same steps of reasoning but used different terminology (the terms 'mole', '6 x 10^{23}', 'molecule' and 'atom' in the 2007 question were replaced by the more familiar terms 'dozen', '12', 'box' and 'marble' in the 2008 question). Student performance improved markedly: about 65% answered Question 17 correctly, in contrast to only 12% for Question 16. Because the steps in reasoning needed to answer both questions are the same, it appears that unfamiliar terminology caused difficulty for many students in Question 16. Unfamiliar terminology should not lead to a lack of self-confidence that inhibits problem solving if students are trained to first identify the principles needed and then apply them in a step-by-step manner.

Questions 18 and 19 in Table 1 are examples of problems encountered in our daily lives. Though only direct proportion reasoning was needed to solve Question 18, about 35% of the students (final column in Table 1) were unable to solve it. Direct proportion reasoning and inverse proportion reasoning are very important, not only for the study of science, but also in our daily activities. It is absolutely essential to train students in these two types of reasoning and to ensure that they are competent in them. Question 19 mainly tested the ability to correlate the information given in the problem statement with each of the symbols (quantities) in the defining equation for the mass percentage [this equation was given in the problem statement and is: m_{A} % = (mass of A / total mass) x 100], and then do a simple calculation. The results showed that about 70% of students were incapable of doing this. The difficulties of most students were due to their inability to recognise that the terms 'mass of A' and 'total mass' in the defining equation correspond respectively to 'mass of water in the person' and 'total mass of the person'. While the correlation of the information given in the problem statement with the quantities in an equation is not a particularly difficult task, students still seem to have difficulty and therefore need guidance.
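The reasoning needed for Questions 18 and 19 amounts to the two short computations below. The numerical values are hypothetical stand-ins (the actual data are in Table 1); the sketch only illustrates direct proportion reasoning and the use of the defining equation for mass percentage.

```python
# Direct proportion (Question 18 pattern, hypothetical values):
# if 10 L of petrol cost R120, cost scales directly with volume.
cost_per_litre = 120.0 / 10.0
cost_for_25_litres = cost_per_litre * 25.0    # 300.0

# Mass percentage (Question 19 pattern, hypothetical values):
# m_A % = (mass of A / total mass) x 100, with A = water in a person.
mass_water_kg = 42.0     # hypothetical mass of water in the person
mass_total_kg = 60.0     # hypothetical total mass of the person
water_percent = mass_water_kg / mass_total_kg * 100.0   # 70.0
print(cost_for_25_litres, water_percent)
```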

Question 20 tested students' abilities to identify the different ways in which objects can be combined. This ability is required for learning many aspects of science and mathematics. The results showed that about 45% of the students lacked this ability.

**CONCLUSION**

This study's main objective was to use carefully designed questions to test Science Foundation students' level of competence in simple intellectual abilities. The results showed that a large percentage of the students were not competent in most of the intellectual skills that are considered essential for learning science effectively. The skills we tested were very basic, yet essential for success in our daily activities. Of the students tested, the study found that:

• 40% were unable to distinguish between descriptive and explanatory statements and between quantitative and qualitative statements.

• 45% had difficulties in basic mathematical skills.

• 70% were unable to use a simple equation (*R = kc^{2}*), which had to be applied twice, to do a simple calculation.

• 80% were unable to combine two simple equations (*c = n/V* and *pV = nRT*) so as to obtain the equation that relates *c* to *p*.

• 65% could not correctly deduce information from a simple graph of distance travelled versus time.

• 75% did not visualise three-dimensionally a drawing of a cube.

• 75% were unable to apply a simple law (the law of conservation of mass), which was given as a statement, to solve problems.

• 35% were unable to use direct proportion reasoning to do a simple calculation.

Intellectual skills and strategies are the 'tools' for all our mental activities. Competence in them leads to more effective learning, whereas a lack of competence seriously handicaps students' learning throughout their courses and results in a lack of self-confidence. The greatest emphasis in most educational courses in South Africa, as in many other countries, is on training students in content knowledge and not in the required intellectual skills and strategies. This is, in our opinion, a major shortcoming because the development of students' intellectual abilities should be an important objective of educational courses. Memorised subject content is often forgotten, but improved intellectual skills and abilities endure and help us to solve the problems we encounter in our daily lives. Much greater emphasis should therefore be placed, in all our science courses, on the development of students' intellectual abilities. When starting any topic, teachers should first check (for example, by testing students with short test items) whether students are competent in the intellectual skills and strategies needed to learn that topic. Training in intellectual skills and strategies should be integrated with the teaching of subject content throughout the entire course. How such integration could be done is discussed, along with many other aspects, in an excellent resource book on thinking edited by Costa^{11}, which has 85 chapters written by different authors.

**REFERENCES**

1. Marzano RJ, Brandt RS, Hughes CS, et al. Dimensions of thinking: a framework for curriculum and instruction. Alexandria: Association for Supervision and Curriculum Development; 1988.

2. Selvaratnam M, Frazer MJ. Problem solving in chemistry. London: Heinemann Educational Publishers; 1982.

3. Cook E, Cook RL. Cross proportions: a conceptual method for developing quantitative problem-solving skills. J Chem Educ. 2005;82:1187-1189.

4. White RT. Learning science. Oxford: Basil Blackwell; 1988.

5. Drummond HP, Selvaratnam M. Students' competence in intellectual strategies needed for solving chemistry problems. S Afr J Chem. 2008;61:56-62.

6. Selvaratnam M, Canagaratna SG. Using problem-solution maps to improve students' problem solving skills. J Chem Educ. 2008;85:381-385.

7. Selvaratnam M, Mazibuko B. Importance of focusing sharply on the goal for successful problem solving. S Afr J Chem. 1998;51:42-46.

8. Gagne RM. The conditions of learning. New York: Holt, Rinehart and Winston; 1977.

9. Drummond HP. Students' competence in the intellectual skills and strategies needed for learning South African matriculation chemistry effectively. PhD thesis, North-West University, Mafikeng, South Africa; 2003.

10. Selvaratnam M. A guided approach to learning chemistry. Cape Town: Juta; 1998. p. 17.

11. Costa AL. Developing minds: a resource book for teaching thinking. Alexandria: Association for Supervision and Curriculum Development; 2001.


**Correspondence to:**

Nkosana Mavuso

Science Foundation

North-West University, Private Bag X2046

Mmabatho 2735, South Africa

email: Nkosana.Mavuso@nwu.ac.za

Received: 11 Nov. 2008

Accepted: 05 Nov. 2009
Published: 11 Mar. 2010

**This article is available at:** http://www.sajs.co.za