To start, think about yourself, who you are, and how much you can change. Especially think about yourself as a student. Think about the relationship of your being a student to the other things you want to do and be in life. What is the point of what you're doing now, going to school, taking classes? Is there any point to it? What will it help or allow you to do? Are you getting better at the things that it will be most important for you to do after you leave college?
These are hard questions, in part because you don't know much for sure about the future. But I would guess that a lot of your choices and decisions about what you will do as a student are not based on any idea of what you think you will need to know or do in the future. If you are like most of us, you probably make many of your choices based on what you have done in the past. This is inevitable, of course. There's no way around it. But there are different ways of thinking about the past that can prepare us better for the future.
If you are like most students, you probably think, based on what you have done in the past, that you are good at some things and poor, or less good, at others. Have you ever said, at the very beginning of a class, something like "Math is my worst subject" or "I always do well in history" or "Science is hard for me"? I can't tell you how often students volunteer that kind of statement to me. If you're like most people, you make choices based on what you think you're good at. If you think you aren't very good at math, chances are you will not consider becoming a math major. If English is your "worst subject" you probably won't take very many English classes. If you think of yourself as a lousy athlete, you probably won't take any physical education classes that aren't required. If you choose to go with your strengths, and avoid areas of weakness, you are acting just the way I did when I was a student. Now these decisions may be ones that work well for you, and there is nothing necessarily wrong with them. But this way of thinking about your abilities suggests some other ideas that seem to lie behind it.
Think about this: How do you know what you're good at? How do you know you lack ability in some areas? Usually, in a school context, the answer to those questions probably lies somewhere in your previous schooling. You probably had a hard time learning something, or failed to learn it successfully, in your past schooling. And that tells you that you're no good at math or foreign languages or writing. Likewise, you may have found it easy to learn something in the past and performed well in that subject, and that tells you that you're good at it.
If you take this kind of thinking too far, it becomes harmful. If you take your past performance as the limit of your future performance, then you make an assumption that will prevent you from learning and growing. If you believe what you have done in the past is all you can do, then you won't try very hard to do more or better. It would be a waste of time. And this is as much true if you think you are good at something as it is if you think you're bad at it.
What does this have to do with the relationship between college and your future? Just this: if you approach being a student locked into ideas about your potential that were shaped by your past, you limit how much you can learn and how much you can grow. And if you don't learn and grow in college, you're mostly wasting your time. If you keep yourself from completing a degree because of these attitudes, it's obvious that you are wasting your time. But even if you "do well" in terms of producing a decent transcript and get a degree, you're mostly wasting your time. A degree will certainly help you to get a job. But once you have a job, the degree will become largely irrelevant. You won't succeed in your chosen work any better or enjoy it any more because you have a degree. You won't progress to more responsible positions or be able to take on more important work because you have a degree. The degree will help to get you in the door, but you won't live your life with an A.A. or a B.A.; you'll live your life and do your work with your knowledge and abilities. What you have learned in college is what can really help you in the world, open doors, create opportunities. So if you limit your learning, you limit your life, degree or no degree.
On the other hand, if you learn to do new things and carry that knowledge away with you, it expands your horizons vastly. And you can do that. Because the limitations that you have placed on yourself based on your past experiences are not necessary. If you think you cannot become much better at a given task just because you have failed at it before, you are wrong. If you believe your past successes define the only way you can succeed, you are wrong. You are definitely, clearly, provably wrong. But you are also normal, because many of us do tend to believe those things. If we change our minds about it, we will change our lives.
The general term we use to describe the potential for performance in academic skills (the sorts of things you learn in school) is "intelligence." When we think someone is very good at thinking in a particular field or knows a lot about it we may say that she is intelligent. Most of us probably had to take some sort of test at some point in our schooling intended to measure our intelligence. We were probably labeled with a number that was supposed to reflect our "IQ" or "intelligence quotient." The idea of IQ is very powerful in our society. To summarize it briefly, it assumes that each person is born with a certain amount of mental ability, that the genetic endowment we inherit from our parents determines this ability, and that this ability is basically stable throughout our lives. In other words, some people are smart, and some people aren't, and that's the way it is. Furthermore, this way of thinking tends to assume that IQ or "smartness" is a general quality that applies in the same way to everything we do: if you're smart in one thing, you're smart in another. If you're smart, you're smart; if you're not, you're not.
Hardly any of us believe this idea completely, but probably none of us is completely free of it. We all, for example, tend to believe that we are smarter at some things than at others. We don't really believe that intelligence is a general quality that applies in the same way to all of our abilities. I'm smarter with words than I am with numbers; I'm pretty sure of that by now. I know people who would say just the opposite. I know people who are very smart when it comes to music, but downright slow when it comes to statistics. So we don't completely believe in the classic idea of IQ. But you probably believe in it to some extent.
The part that many people do believe in is that our abilities, at least in a particular field like math or music or art, are inborn. And if we believe that, it has consequences. If my abilities are inherited, then I will tend to perform consistently. So if I do poorly in learning my multiplication tables in second grade (which I did, as a matter of fact) it means that I'm not smart in math. If I get a poor grade on an essay in junior high, it means I'm not smart in English. And if reading is difficult for me, it means I'm not smart in reading, or that I'm not smart at all. If I believe that intelligence is fixed and genetic, then I will see difficulties or failures as evidence that there is no point to trying any more, that I simply lack intelligence.
The idea of intelligence as a single, genetically coded ability is almost certainly false. Research on the human mind and learning in the last fifty years or so has provided strong and mounting evidence that this idea of intelligence is over-simplified, misleading, and in some respects simply wrong. David Perkins, a professor at Harvard University's Graduate School of Education, tells us that thinking about intelligence is undergoing a "Copernican revolution" (4-11). In the 1500s, the Polish astronomer Nicolaus Copernicus made the case that the earth revolved around the sun rather than the sun and other heavenly bodies around the earth, and this revolution in thinking changed the way people thought about the universe in a fundamental way. Likewise, Perkins says in his 1995 book Outsmarting IQ: The Emerging Science of Learnable Intelligence, new insights into the way the human mind works are changing the way we all think about our own abilities.
Perkins uses a comparison with computers to make the point. The old view of intelligence and ability was based on the idea that everything was determined by hardware. Many scientists thought of the mind as a machine like a steam engine or an electric motor. The physical parts of the machine determine its potential, and that potential can't change much without taking the machine apart and putting it together again in a different way. With the advent of digital computers, however, we can see a very different kind of machine at work: more complex, more flexible, more like the human brain. In the case of computers, certainly the basic hardware is important. The central processing unit, memory units, and other devices that make up the computer determine the processing speed of the computer, its memory capacity, and other limitations that it can't exceed. But this is only a part of the story. The hardware is the foundation, but what builds the functions of the computer on that foundation is the software. Software is a set of instructions that allows the basic processor to perform a variety of specific functions like word processing, spreadsheets, e-mail, and accessing the Internet. One of the most familiar examples of the dramatic difference that software can make in the potential of a computer is the Internet browser that you are probably reading these words from, or that you printed them from. The Internet browser gave people access to a world of information that would have been inaccessible without it. The advances in software have made computers more effective, as you have found out yourself if you have ever upgraded to a new program or a new operating system. Software makes computer hardware "smarter."
So it is with our minds, says Perkins. We have the basic hardware of our brains, most of which we are born with. But that is only a small portion of what determines our "intelligence." On top of the brain we have what Perkins calls mindware, the human equivalent of software. And our mindware is not inborn or hard-wired. It is by improving and developing our mindware that all of us can develop more intelligence in almost any field. Mindware, Perkins says, "is whatever people can learn that helps them to solve problems, make decisions, understand difficult concepts, and perform other intellectually demanding tasks better. . . . [M]indware is software for the mind-the programs you run in your mind that enable you to do useful things with data stored in your memory" (13).
Neural intelligence is your basic brain capacity. It corresponds to the hardware of your mind. This does vary between people, and it is probably largely determined by genetics. But it also develops as a child grows up and depends to some extent on the child's and parents' behavior. For example, we know that good nutrition contributes to the proper development of the brain and that being deprived of certain nutrients while growing up can impair neural development. Scientists haven't fully determined just how large a role genetics and upbringing play, but most believe that genetics has an important influence on neural intelligence.
Experiential intelligence reflects the role of knowledge in helping us to decide and act more effectively. In general, to behave intelligently requires that you know a certain amount about the context in which you are acting. When you first start a new job, you probably move slowly because you are unsure of what to expect or what is expected of you. But as you get more experience, you quite literally get smarter about doing your job because you know the lay of the land better. Experience makes us smarter.
Reflective intelligence consists of the planning, strategy, and attitude that help us to think and act more effectively. The way we think about what we are doing makes us more or less intelligent in doing it. We will usually think more intelligently if we have a sound plan for working through a problem than if we just guess or act by trial and error. In playing games, for example, whether chess or bridge or football, players who have thought through a strategy for advancing their side will generally do better than those who simply play defensively and react to the other side's moves. Likewise, problem solvers who have a strategy for defining the problem, discovering its causes, and testing possible solutions are more likely to actually solve problems. (102-3)
Why is it worth breaking down intelligence into these three dimensions? Because it makes a great difference in the way we think about intelligence. Only the first dimension of intelligence, neural intelligence, corresponds at all to the traditional idea of hereditary, inborn mental ability. The other two dimensions are clearly things we are not born with but develop through practice. In other words, we probably do inherit some elements of our intelligence from our parents. But a much larger portion of our intelligence is learned rather than inborn. Furthermore, research shows that the kind of intelligence measured by IQ tests becomes less important the more skillful the people involved. As Perkins points out, "As a generalization, IQ has the most predictive power when people begin to develop competence in an area; its predictive power lessens as learners approach mastery" (106). In other words, higher neural intelligence may be most helpful to people when they are beginning to learn new material. After they become skilled in a particular activity, experiential and reflective intelligence become more important.
To see how important experiential and reflective intelligence are we only have to think in terms of difficult kinds of tasks. If we need a doctor or a lawyer-or an airline pilot or investment advisor or therapist-we don't want someone just generally "smart." We hope that our doctor will have a lot of experience as well. Indeed, much of what medical students do in medical school is designed to give them experience doing the actual kinds of work that doctors do, working with the terms and concepts of medicine until they are second nature. In other words, we want a doctor who has experiential intelligence. But we also know that a very experienced doctor can make mistakes; a doctor can misdiagnose your unfamiliar illness as a more familiar one. Experience alone doesn't make us smart. A lot depends on the way we use our experience. We want a doctor who can ask the right questions and do the right tests, who understands the process of diagnosis well enough to tell when a familiar diagnosis should be trusted and when to look more carefully. In other words, we want a doctor who has reflective intelligence, who not only has a lot of knowledge but uses strategies that apply that knowledge effectively.
Reflective intelligence is perhaps the most important kind of intelligence, not only because it can be learned, but because it helps us to learn how to constantly become smarter. One of the great dividing lines in life, I am afraid, is between those people who accept their abilities as set and predetermined and those who constantly get better at whatever they are doing. And the difference between these two kinds of people has nothing to do with natural gifts; it has everything to do with reflective intelligence. One of the components of reflective intelligence, and perhaps the most important one, is our attitudes about ourselves and our own abilities.
One of the most exciting and surprising results of research in the past twenty years is that our performance depends crucially on what we believe about our own abilities. Carol S. Dweck, a professor of psychology at Columbia University, has studied students from grammar school through college in an attempt to find out why some students do better than others. She summarizes the results of twenty-five years of research in her book Self-theories: Their Role in Motivation, Personality, and Development, published in 2000. She has found that intelligence as measured on IQ tests is not the most important factor.
For example, some students who are apparently very intelligent and do very well in elementary school fall behind when they get to junior high school, and some high-performing high school students do poorly in college. On the other hand, some students who do poorly in elementary school do much better in junior high and some students who seem mediocre in high school excel in college (29-36). What is the difference? It's not their intelligence as measured by IQ tests. It is their attitude, based on their beliefs about their own intelligence. To put it simply, students who believe that their intelligence is a fixed entity, that you are either smart or dumb, are much more likely to give up when faced with difficulties or failures. On the other hand, students who believe that trying harder will make you better do try harder and do get better.
What this very careful research shows is not that people who think they are smarter do better. Indeed, how smart you think you are doesn't seem to have much effect on performance. What makes a difference is whether you believe that your abilities are changeable or set. In the study of students making the transition to junior high school, many students who had high confidence in their ability, but believed that their ability was fixed, showed a dramatic drop in their performance when they had to adapt to a new environment and new challenges. "But many of those who had been among the lower achievers in sixth grade were now doing much better, often entering the ranks of the high achievers" (Dweck 31). The key was that they believed that effort was more important than native ability. Students who believe this, according to Dweck,
don't have to feel they're already high in ability in order to take on challenging learning tasks in a vigorous way. Our findings thus show that students' level of confidence was not nearly as important as their theory of intelligence in helping them meet and conquer this difficult transition. (31)
Our beliefs about intelligence itself are thus one of the major components of our reflective intelligence. People who believe that intelligence can be learned, in effect, learn to be more intelligent. I'm sure one reason for this is that people who believe that effort can make a difference spend more time and effort developing their reflective intelligence, learning strategies that can help them to be more effective. Attitude by itself probably doesn't directly affect your ability. But your beliefs and attitude do determine how hard you work and how long you persist in seeking to improve.
Given the belief that you can learn, a lot still depends on the techniques of reflective intelligence you employ. You have probably heard dozens of teachers tell you that the most important learning is learning how to learn. Let me add myself to the list by saying it again. In fact, at its most basic level, reflective intelligence is thinking about and mastering learning strategies, learning how to learn better.
We will spend a great deal of time in this class trying to develop strategies for learning and doing. Indeed, much of what you will read in your handbook is aimed at developing strategies that will allow you to think more effectively about what you have to do as a researcher and a writer, so that you can do it more effectively.
Your theory of intelligence, your idea about how ability works, is just as important if you are "smart" as it is if you aren't. People who assume that intelligence is fixed, and that you basically can't get smarter, tend to see effort as a sign of weakness: if you're really smart, everything should be easy for you. People who have easy success at a task sometimes come to believe that easiness is a sign of competence, that if something is within your range, so to speak, it should be effortless. And those same people, often very talented, learn the lesson that if you have to try hard, you might as well give up, because the task is just too hard for you. That is a terrible lesson; it makes smart people dumb.
Whatever your native "intelligence," you can improve it. Whatever your ability, you can develop it. There are two keys to doing so. The first is to believe that you can, that your own abilities are not fixed but flexible and that the more effort you apply the more you will grow and develop. The second key is to recognize that thinking about how you think, paying attention to the strategies you use to achieve a goal, is the way to get better at achieving goals. We will be exploring and talking about strategies for better thinking all semester. Take this exploration seriously, spend the time, because it is the way to grow. It's a cliché that the people who succeed and set the standard in any kind of work not only work harder but work smarter. And like many clichés, it's true.
Copyright © 2000 John Tagg