Our department chair brought up this video at a recent instructional faculty meeting, warning us that “it claims everything we do is wrong.”

The argument is, in essence, that requiring the “calculation” step of math problems is out of date, and that technology allows the focus to shift to setting up problems and applying knowledge.

Other than Maria Andersen bringing it up, there hasn’t been much discussion. I’m not categorically against the thesis, but I don’t believe the consequences have been well explored. Let me pose a thought-experiment example.

Suppose you are a scientist of the future schooled in a no-calculation-instead-computation curriculum and you must deal with the following function:

Suppose in this context only real-valued answers make sense. Dutifully you type the formula into Mathematica 24, attempt some small tests, and get an error for the x value you’d really like to know about, 3.6.

How would you diagnose the error?

For a computation-fluent reader the second term is clearly the issue, suggesting either that the second term of the formula is not genuinely what’s wanted, or that a different approximation formula is required instead.

Without computational fluency it’s still possible to discuss domain holistically, but every problem diagnosis I’ve been able to think of is a step removed; perhaps a feature that isolates each term and reports its effect on the domain. In any case, one would be unaware that the domain of the second term is findable by requiring its radicand to be nonnegative.
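Since the original formula was an image that hasn’t survived, here is a sketch of the term-isolation idea with a stand-in formula assumed purely for illustration (sqrt(x + 1) + sqrt(3 - x), chosen so that, like the post’s example, it fails over the reals at x = 3.6):

```python
import math

# Stand-in for the post's formula (the original was an image and is not
# recoverable); sqrt(x + 1) + sqrt(3 - x) is assumed purely for
# illustration. Over the reals it fails exactly where a radicand is negative.
def f(x):
    return math.sqrt(x + 1) + math.sqrt(3 - x)

# The "isolate each term" feature, done by hand: evaluate each radicand
# separately to see which one breaks at the troublesome input.
def diagnose(x):
    report = {}
    for name, radicand in [("first term", x + 1), ("second term", 3 - x)]:
        report[name] = "ok" if radicand >= 0 else "negative radicand"
    return report

print(diagnose(3.6))  # the second radicand is 3 - 3.6 = -0.6
```

The point of the sketch is exactly the worry in the post: nothing in the error message from `f(3.6)` alone tells you which term is at fault; you have to know to take the formula apart.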

Is that a calculation? Remember the step he said was important: the check step. It seems to me that being able to check that the computer has in fact found the answer you want would involve some computation.

Are you arguing against his entire thesis? Or do you think there is some scope for recognizing that being a backup to a device which could just as easily tell you the domain of the formula (if you asked for it) and produce the answer in a fraction of the time…

I think that a certain amount of computational thinking is important. You have to recognize how algorithms are constructed, etc. But there needs to be much more recognition that much mathematics today is done using computers, and that these devices are under-utilized in schools and almost completely absent from most school curricula.

How can the most powerful computational devices the world has ever seen NOT have a prominent use in our school systems? Are we worried the computers are going to disappear?

In a world where our computational devices vanish or no longer work, we will have much bigger problems and it won’t matter much what we are teaching our students, none of it will be relevant.

No, I’m picking at a particular thing that worries me: the difficulty of recognizing “bugs” in one’s equations given insufficient computational experience. I’m not ready to make a general statement, but I wanted to throw out a particular example.

I am thinking micro rather than macro. Conrad Wolfram wants to pitch everything by arguing at a macro level, but the micro level issues are not irrelevant.

I agree with you, Jason. I believe Wolfram grossly underestimates the depth of intimacy and emotional involvement a person develops — and NEEDS to develop — with the processes of problem-solving and equation-solving as s/he grows as a mathematical thinker.

To me, this argument is the equivalent of saying, Well, you could learn MUCH more efficiently and quickly how to be in an intimate relationship with another person by STUDYING about how to have a solid relationship with another person rather than by experimenting with the random potential partners you encounter in your daily life.

After all, there are plenty of bestselling books and web sites by real, bona fide experts (just ask them!) about how to have a good relationship. Or you could hire a relationship coach. You could practice with your coach until you are “ready” to have a high-quality relationship. I mean, why waste your time with other FLAWED human beings? They’re so imperfect and complicated. And messy. Actual “practice” relationships are messy. Why learn how to read and respond to other human beings when you could simply rely on computer face-recognition or voice-reading technology? I mean, don’t you have better things to do with your precious life energy? Wouldn’t you rather delegate all of this stuff to Microsoft or Texas Instruments?

I hope your actual answers to those (rhetorical) questions were horror. If they weren’t, please seek professional help.

I simply don’t believe it is possible to delegate or externalize the intimate connections that happen within one’s own mind to an external computational device. What is more, a person has to CARE — and care deeply — to do mathematics. It’s not a casual thing!

So I think you are right about this approach leading to an inability to recognize “bugs” in one’s thinking, but I think there is an even deeper problem with it — a failure to recognize the depth of human involvement (aka engagement) that is required to do mathematics.

There’s an optimal balance point between calculation by hand and by computer. What this guy seems to be advocating is too far on the “computer side”, but what we currently have is way, way, WAY too far on the “hand” side. To see this, consider freshman integral calculus. Three quarters of the course is about calculating antiderivatives by increasingly exotic methods. In real life, maybe .0001% of the population will ever need to integrate those exotic functions, and that fraction of a percent all live in STEM departments and, in practice, will do it on a computer anyway. Meanwhile, gorgeous applications of the integral (such as estimating probabilities in a continuous setting, like the probability that one random line segment intersects another under various constraints) are completely ignored.
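For what it’s worth, the segment-intersection application can at least be played with cheaply. The exact answer comes from a multiple integral over the endpoint coordinates; the sketch below (function names are mine, not from the comment) just estimates it by Monte Carlo for two segments whose endpoints are uniform in the unit square:

```python
import random

def orient(a, b, c):
    # Sign of the cross product (b - a) x (c - a):
    # positive for a left turn, negative for a right turn.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, p3, p4):
    # Proper-crossing test; degenerate collinear configurations occur
    # with probability zero for continuous random endpoints.
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    return d1 * d2 < 0 and d3 * d4 < 0

def estimate_intersection_probability(trials=100_000, seed=0):
    rng = random.Random(seed)
    point = lambda: (rng.random(), rng.random())
    hits = sum(segments_intersect(point(), point(), point(), point())
               for _ in range(trials))
    return hits / trials
```

With enough trials the estimate stabilizes to a couple of decimal places, which is itself a nice lead-in to asking which integral it is converging to.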

The amount of obsession you get with anti-derivatives is of course site- and country-dependent (I recall India is crazy about them). My school’s ratio is more like … 1/4? I’ll have to double-check.

re: lack of nice applications, I think the historical ones that are wildly common (emptying water tanks, for example) have preempted other applications for so long that the alternatives are missing out of sheer inertia.

Another issue is that some of the nice applications (including the probability ones) require background setup that has nothing to do with calculus, and the time-to-learning ratio might be too far off to be worth the effort.

I can’t have an opinion until I test whatever this is, but my first impression is “the sooner the better”. I don’t agree with the whole leap metaphor, though. I think we can change gradually and test more conceptual understanding yet still use calculations to illustrate the concepts.

I am a little worried, however, that he’s not really paying much attention to the deductive-reasoning part of math. In fact, is he actually talking about natural science and economics?

For me, the beauty of math is intimately connected to how it continuously grows through logical deduction. So until I’m convinced that the removal of calculations still allows for proof, I’m not committing to anything.

As an example: the proof of Euler’s formula involves solving a differential equation. What would Wolfram have us do? Set it up, solve it through Mathematica, and then add a clause that the proof holds provided Mathematica doesn’t have a bug?
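For concreteness, here is roughly what “set it up and solve through the machine” looks like for that proof, sketched in Python/SymPy rather than Mathematica. The assertions at the end are the check step: cos x + i sin x satisfies the same initial-value problem y′ = iy, y(0) = 1 as e^(ix), so by uniqueness the two are equal.

```python
import sympy as sp

x = sp.symbols('x', real=True)
y = sp.Function('y')

# Machine-solve the initial value problem y' = i*y, y(0) = 1;
# the unique solution is exp(i*x).
sol = sp.dsolve(sp.Eq(y(x).diff(x), sp.I * y(x)), y(x), ics={y(0): 1})

# The check step: cos(x) + i*sin(x) satisfies the same problem,
# so by uniqueness it must equal exp(i*x) -- Euler's formula.
candidate = sp.cos(x) + sp.I * sp.sin(x)
assert sp.simplify(sol.rhs - sp.exp(sp.I * x)) == 0
assert sp.simplify(candidate.diff(x) - sp.I * candidate) == 0
assert candidate.subs(x, 0) == 1
```

Whether the symbolic check counts as “trusting the solver” or as a genuine verification is, of course, exactly Julia’s question.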

Well, I don’t mean that computer generated proofs are and should be forever forbidden. The main obstacle right now is surely that not enough people know enough programming to understand and trust such proofs.

What I did mean however is that just like we want students to understand, and not just apply, natural sciences, economics, and a whole literature, it makes sense to want them to understand mathematics. Somehow teaching them to understand the programming needed to prove things seems more difficult than teaching them to prove things.

The counter-question is, does teaching the mechanics of calculation inherently lead to understanding? Being able to multiply two numbers by hand is significantly more common than being able to *explain* why the technique works.

(I’m not sure how well that example stretches to algebra and calculus, but I don’t think I can dismiss the parallels entirely.)
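To make the multiplication example concrete: the hand algorithm is opaque as taught, but its justification is one application of the distributive law. A small sketch (the helper is mine, for illustration; nonnegative integers only):

```python
def long_multiply(a, b):
    # The paper-and-pencil algorithm is exactly this: one partial
    # product per digit of b, each shifted by that digit's place value.
    # That is the distributive law applied to b's base-10 expansion.
    total = 0
    for place, digit in enumerate(reversed(str(b))):
        total += a * int(digit) * 10 ** place
    return total

# 37 * 46 = 37*6 + 37*40, the two rows of the hand calculation
assert long_multiply(37, 46) == 37 * 46
```

A student can execute the rows for years without ever noticing they are summing shifted partial products, which is josh’s point in a nutshell.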

josh, I think we’re approaching the issue from different directions. I’m not saying calculations build understanding – it’s obvious that without reflection calculation does nothing of the sort.

But at the same time, how would we understand functions without spending at least a little bit of time using functions in calculations?

I think it would be nice to teach students multiplication, or whatever concept we’re talking about, and then use calculations to illustrate the concept. That means far fewer calculations than are done today, but probably more than Wolfram wants.

“The counter-question is, does teaching the mechanics of calculation inherently lead to understanding? Being able to multiply two numbers by hand is significantly more common than being able to *explain* why the technique works.”

Great question. It might have been rhetorical, but I can tell you that no amount of calculating ever helped me to see what multiplication IS. I had to figure it out for myself, by going back to the beginning and understanding how arithmetic works, and what it is useful for. I had to ask questions that no teachers knew the answer to.

In the end I found out that there is a logical way to connect basic arithmetic elements as in:
* Numbers are for counting things
* Counting implies addition and subtraction, because there is a group you are counting from (subtraction) and a group you are counting to (addition) (Take a pile of coins and count them and you will see you create two piles in the process). So addition and subtraction are opposite operations.
* Counting and addition tell you how many, but not how MUCH. Solving that issue requires the idea of measurement.
* Measurement introduces the idea of units. When you are measuring, the things you are counting are units (imaginary) instead of real things like coins.
* Using units introduces the idea of scale, that larger units are composed of the smaller ones.
* The idea of scale requires a way to convert between units of different sizes. Solving this requires multiplication and division. They are specifically for converting units. When you multiply and divide, the quantity does not change (unless you use multiplication as a shortcut for repeated addition, but then you are not really performing multiplication).

So here is the key point: None of these concepts are contained in the algorithms I learned for hand-calculating.

So I agree that we should be less calculation centric, and more concept centric in teaching. But I also remind myself that someone who has created a big business out of computing software is going to be biased when they tell me that we should all use computers to calculate.

It is also true that our “intuitive” notions about what works for teaching and learning are often wrong. What works is often dependent on what you are measuring and how you are measuring it.

Finally, I am always skeptical when a representative of a corporation tells me what is good for me…because corporations do not exist for the public benefit or public good. They exist for maximizing shareholder value at ALL costs, and this is a legal restriction that is placed on corporations. Not to descend into a whole other topic, but it is relevant to evaluating Wolfram’s proposal.

I worry that we’re missing the point here. It’s not that the kids won’t learn anything about calculation, just that they spend much more time thinking about the big picture trends. In his curriculum, kids would spend time with algorithms that included taking roots, during which kids would see that roots of negative numbers don’t work in a standard algorithm. Why not? That’s where the teaching comes in, but let the kid make the discovery and ask the question first. Give the kid a situation/problem in which taking the root of a term makes sense; the computer will immediately let him/her know that something is wrong when the radicand becomes negative. If the curiosity is there, which it would be with a good problem, the necessity of either a limited domain or imaginary numbers becomes a consequence of the model and not because it was the next chapter in the text. I would say it’s much better to have them ready to test and accept the results of computational logic than to have them fail to memorize seemingly unrelated mathematical processes in an attempt to get to the ‘next level’ of mathematics.

In my hypothetical I am assuming the scientist knows something conceptually about domain and so forth. What I’m puzzling over is the micro-scenario of diagnosing this particular domain problem. Now, it would hopefully occur to them to computationally check the internals of each square root, but that is a step removed in a way that is awkward.

If it was just this example I wouldn’t be worried, but in programming anything (mathematical or otherwise) legions of bugs occur, and depending on the computer to help decipher its own problem is like understanding the mathematics through a layer of mud.

I’m not closed off to the possibility of a way out of this, but it’s certainly something that needs to be addressed in any potential computational-based curriculum.

I think computation is necessary, but all those pencil-and-paper algorithms are a little silly. We should be able to do very simple calculations mentally, and thereby estimate and check answers that machines give us.

If you want fast and accurate computation – which most elementary school teachers still do! – then ask a computer. If you want understanding, ask a person.

My problem with Wolfram’s talk is that he thinks he has found the silver bullet to fix math education. I go into more detail on my blog with this post.

There is perhaps a middle ground. Just as computer programmers learn to debug programs by breaking them into pieces, each of which can be individually tested, surely students using computation engines can be trained to do the same. The insights that traditionally trained mathematicians would have from looking at the function can also be arrived at by “debugging” techniques using a computational engine.

This may not be as fast or efficient, but in time, with enough practical experience chasing these sorts of problems down, I wonder whether future students who rely heavily on computational engines would acquire a very similar level of insight into the behavior of functions as many traditionally trained folks.

Part of the issue is current programming languages are relatively mysterious with their bugs when the bugs are related to the mathematics being implemented rather than the programming.

I am imagining a special programming language designed just for teaching students mathematics (much as LOGO, BASIC, etc. were designed for teaching students programming). It could be GUI-based and also friendly in terms of error messages.

Suppose, as a small example, something I’ve done before: teaching how to program the quadratic formula. The custom language would:
* show the current look of the formula at any given moment, so students know if they accidentally divided by 2 and multiplied by a rather than dividing by (2a);
* offer a template letting teachers insist that, for example, there be two (not necessarily distinct) answers, so a friendly message appears if students forget there is a plus-or-minus involved;
* be relatively intelligent in responding to typos (did you mean subtraction or a negative number here?);
* be verbose and helpful in regard to non-real answers.
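The first two checks can be sketched in ordinary Python (the function names are mine, not part of any real teaching language): the “divide by 2, then multiply by a” slip and the correct formula disagree whenever a² ≠ 1, which is exactly what such a template could flag.

```python
import cmath

def quadratic_roots(a, b, c):
    # Correct: the whole numerator -b +/- sqrt(b^2 - 4ac) is divided
    # by (2a), and the plus-or-minus yields two roots.
    d = cmath.sqrt(b * b - 4 * a * c)
    return (-b + d) / (2 * a), (-b - d) / (2 * a)

def buggy_roots(a, b, c):
    # The classic slip: "/ 2 * a" divides by 2 and then multiplies by a,
    # which only matches "/ (2 * a)" when a * a == 1.
    d = cmath.sqrt(b * b - 4 * a * c)
    return (-b + d) / 2 * a, (-b - d) / 2 * a

print(quadratic_roots(2, -3, 1))  # roots of 2x^2 - 3x + 1: 1 and 1/2
print(buggy_roots(2, -3, 1))      # the slip gives 4 and 2 instead
```

Using cmath rather than math also means a negative discriminant comes back as a pair of complex roots instead of a crash, which a teaching language could then explain verbosely.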

Incidentally, the main thing that frustrated me about the Google computational project with the teachers is that any students capable of using Python without freaking out are not the ones struggling in mathematics. I had been hoping for a program like the one I just described above; it’s the sort of thing Google’s resources would be perfect for.

While your assessment about students who wouldn’t freak out when facing down a Python prompt may be correct for current 10th/11th graders, it doesn’t track for younger kids. If you expose 7th and 8th graders to programming, they *all* freak out (or, in the case of Scratch, occupy themselves with the draw/animation tools and ignore the programming concepts). But a well-structured introduction in the late elementary or middle grades can push through that initial discomfort.

My concern is that we’re swapping one set of ancillary tools (hand-calculation) for another (elementary programming). Now, I like to code, and I think that some basic fluency in how software is made is worthwhile for almost any current student. Asking a 6th grader to write a function that calculates the LCM of any two integers does require that they have clear mastery of the LCM concept, but it wraps the math in a bunch of non-math skills. If your goal is only fluency with fractions, this is wasted time and unnecessary overhead. If you’re building towards Wolfram’s vision of revised maths, then it’s essential to the endeavor — the same way hand-calculation is essential to our current curricular environment.
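A minimal version of that LCM exercise, for reference (a sketch only; the wrapper syntax around the math is precisely the non-math overhead being weighed):

```python
def lcm(a, b):
    # A student writing this has to articulate what the LCM *is*: the
    # smallest positive integer divisible by both a and b. Here it is
    # computed via Euclid's algorithm, using lcm(a, b) = a*b / gcd(a, b).
    x, y = a, b
    while y:
        x, y = y, x % y  # gcd step: gcd(x, y) == gcd(y, x % y)
    return a * b // x

print(lcm(4, 6))   # 12
print(lcm(21, 6))  # 42
```

Even this tiny function demands loops, variables, and integer division on top of the number theory, which cuts both ways in the argument above.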

Read that as (group of students who can use Python without freaking out) is a subset of (group of students who are doing well in mathematics) but not the converse.

“Asking a 6th grader to write a function that calculates the LCM of any two integers does require that they have clear mastery of the LCM concept, but it wraps the math in a bunch of non-math skills. If your goal is only fluency with fractions, this is wasted time and unnecessary overhead.”

I’d agree that it’s a very advantageous societal shift in the near-term. If my math department decided to abandon our current math texts and invest themselves in building a system like this from the middle grades up, I think we’d see a significant positive result.

There are a number of open questions about how to build such a program, whether it lends itself to independent assessment, and how to sustain one for a decade or two. But in my observation, much of the current system is tripping over all of those hurdles right now, so that doesn’t dissuade me from wanting to try this as a new approach.

Definitely computers allow us to be more ambitious in our calculations, but if you use them as a “problem solving black box”, good luck!

Speaking as someone who has extensive experience with CAS: they are **not** a black-box system. You’d better be prepared when you fire them up, because you never know what they’re going to spit back at you. You’d better know a *lot* of 19th and 20th century math to make sense of many answers (or to make sense of *why* the software doesn’t — or maybe cannot — understand what you mean, or read your mind).

Fun example: ask students to factor x^8-1 into irreducibles (trick question: over Q, R or C?).
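That trick question plays out nicely in a CAS too, if you know what to ask for. A SymPy sketch (over Q the answer has four irreducible factors, and changing the field changes the answer):

```python
import sympy as sp

x = sp.symbols('x')

# Over the rationals (the default), x^4 + 1 stays irreducible:
# x^8 - 1 = (x - 1)(x + 1)(x^2 + 1)(x^4 + 1).
print(sp.factor(x**8 - 1))

# Over the Gaussian rationals Q(i) it splits further, since
# x^2 + 1 = (x - i)(x + i) and x^4 + 1 = (x^2 - i)(x^2 + i).
print(sp.factor(x**8 - 1, extension=sp.I))
```

Over C it splits into eight linear factors at the eighth roots of unity, which is the third possible answer to the trick question; the software silently picking one field for you is exactly the “know what it’s going to spit back” problem.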

The link is dead. I would love to see this paper. It might be that teaching problem solving skills does not relate to better math understanding…but it does equate to better performance in math class. Part of the reason is the HORRIBLE textbooks. The word problems my kids are faced with are intentionally made difficult to understand, are constructed with poor grammar, and contain clumsy wording. I understand that our intuitive notions of what works are sometimes wrong. But I need to see if this paper is differentiating between pure math understanding, and success in math learning. If you can repost a working link that would be great.

I watched the Conrad Wolfram video yesterday and had a few thoughts about it.

First – at the beginning of his talk, he shows people using “math” in the “real world.” In other words, he shows scientists and other professionals using mathematical software and assumes that this proves that people don’t need to understand math in order to use it. OK, but who WRITES the software being used? (Oh, right: Conrad Wolfram and his brother!)

Also, what if there isn’t a piece of software for the math you need to do?

Another point that caught my eye was the car analogy, where he says that if we learn how to drive, we don’t need to learn how the engine and transmission and suspension work (although I think that learning about internal combustion IS important).

I use a similar analogy in my classes, but with one important difference – I say

“We learn how to drive a car, but that doesn’t mean we don’t learn how to walk!”

Cars need roads; you can’t drive your car in the forest, but you might want to take a walk in the woods.

Also, what if there isn’t a piece of software for the math you need to do?

That’s why he discusses programming, I’d imagine. Programming in a math language lets a lot of the computational internals be shuffled under the rug. (If you protest that it makes debugging hard, well, that was my original post’s point.)

Now, writing the internals of the software would require deeper understanding, but Conrad’s argument seems to be that only a small number of people need to do this, just as most programmers cannot, and need not, code in assembly language.

[…] posted before about Conrad Wolfram’s efforts to remove calculation from the curriculum and make everything computer based. There is now a website devoted to the initative (http://computerbasedmath.org) and Conrad […]

dwees, on November 22, 2010 at 11:58 am said:
That is a great example. We teach students this complex object called an integral and provide a couple of shallow examples of how it can be used.

Jason Dyer, on November 22, 2010 at 12:30 pm said:
Dunno about Wolfram, but Doron Zeilberger thinks so, more or less:

Opinion 112: On Human Hypocrisy (and Human-Centric Bigotry): A Typical Computer-Assisted Proof is Far More Rigorous (and Certain!, and Deeper!) than a typical Human-Generated Proof, (and some suggestions on how to improve the reliability of published mathematics in “peer”-reviewed journals)

Jason Dyer, on November 22, 2010 at 12:47 pm said:
Point taken (although if you read DZ’s opinions in general, you do get the sense he would be OK with the Euler example).


Jason Dyer, on November 23, 2010 at 7:23 am said:In my hypothetical I am assuming the scientist knows something conceptually about domain and so forth. What I’m puzzling about is the micro-scenario of diagnosing this particular domain problem. Now, it would hopefully occur to them to computationally check the internals of each square root, but that is a step removed in a way that is awkward.

If it was just this example I wouldn’t be worried, but in programming anything (mathematical or otherwise) legions of bugs occur, and depending on the computer to help decipher its own problem is like understanding the mathematics through a layer of mud.

I’m not closed off to the possibility of a way out of this, but it’s certainly something that needs to be addressed in any potential computational-based curriculum.

Joshua Fisher, on November 23, 2010 at 6:09 pm said:Oh, look! A brief clip of someone using a CAT scan or MRI machine!

THEY must be using math, right?

Joshua Zucker, on November 24, 2010 at 9:19 am said:I think computation is necessary, but all those pencil-and-paper algorithms are a little silly. We should be able to do very simple calculations mentally, and thereby estimate and check answers that machines give us.

If you want fast and accurate computation – which most elementary school teachers still do! – then ask a computer. If you want understanding, ask a person.

Chris Sears, on November 24, 2010 at 12:57 pm said:My problem with Wolfram’s talk is that he thinks he has found the silver bullet to fix math education. I go into more detail on my blog with this post.

Whit Ford, on November 28, 2010 at 9:10 pm said:There is perhaps a middle ground. Just as computer programmers learn to debug programs by breaking them into pieces, each of which can be individually tested, surely students using computation engines can be trained to do the same. The insights that traditionally trained mathematicians would have from looking at the function can also be arrived at by “debugging” techniques using a computational engine.
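As a rough sketch of this “break it into pieces and test each one” workflow: the function below is a made-up stand-in (the original post’s formula isn’t reproduced in this thread), chosen so that one square-root term leaves the real domain near x = 3.6.

```python
import math

# Hypothetical stand-in for the post's formula: two square-root terms,
# one of which has a restricted real domain.
def term1(x):
    return math.sqrt(x + 2)   # real only for x >= -2

def term2(x):
    return math.sqrt(3 - x)   # real only for x <= 3

def f(x):
    return term1(x) + term2(x)

# Debug by evaluating each piece in isolation at the troublesome input.
x = 3.6
for name, term in [("term1", term1), ("term2", term2)]:
    try:
        print(name, "=", term(x))
    except ValueError as err:
        print(name, "fails at x =", x, "->", err)
```

Running this immediately isolates the offending term: `term1` evaluates fine at 3.6 while `term2` raises a domain error, which is exactly the diagnosis a traditionally trained reader would make by inspection.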

This may not be as fast and efficient, but in time, with enough practical experience chasing these sorts of problems down, I wonder whether future students who rely heavily on computational engines would acquire a level of insight into the behavior of functions similar to that of many traditionally trained folks.

Whit

http://mathmaine.wordpress.com

Jason Dyer, on November 29, 2010 at 8:13 am said:Part of the issue is that current programming languages are relatively opaque about their bugs when the bugs are related to the mathematics being implemented rather than to the programming.

I am imagining some special programming language designed just for teaching students mathematics (the way LOGO, BASIC, etc. are designed for teaching students programming). This could be GUI-based and also friendly in terms of error messages.

Suppose, as a small example, something I’ve done before: teaching how to program the quadratic formula. The custom language would:

* a) give the current look of the formula at any given moment, so students know if they accidentally divided by 2 and multiplied by a rather than dividing by (2a)

* b) have a template letting teachers insist that, for example, there should be two (not necessarily distinct) answers, so there can be a friendly message if students forget there is a plus-or-minus involved

* c) be relatively intelligent in responding to typos (did you mean subtraction or a negative number here?)

* d) be verbose and helpful in regard to non-real answers
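A minimal sketch of what such a solver’s checks might feel like, written in Python rather than the hypothetical custom language (the function name and the friendly messages are my own illustration, not an existing tool):

```python
import math

def quadratic_roots(a, b, c):
    """Solve ax^2 + bx + c = 0 with teaching-friendly messages."""
    if a == 0:
        raise ValueError("a = 0: this is not a quadratic. Did you mean a linear equation?")
    disc = b * b - 4 * a * c
    if disc < 0:
        raise ValueError(
            "The discriminant b^2 - 4ac is negative, so there are no real roots. "
            "(Hint: this is where imaginary numbers come in.)")
    root = math.sqrt(disc)
    # Note the plus-or-minus (two roots, possibly equal), and the division
    # by (2a) -- not dividing by 2 and then multiplying by a.
    return ((-b + root) / (2 * a), (-b - root) / (2 * a))

print(quadratic_roots(1, -5, 6))  # x^2 - 5x + 6 = 0 -> (3.0, 2.0)
```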

Jason Dyer, on November 29, 2010 at 8:17 am said:Incidentally, the main thing that frustrated me about the Google computational project with the teachers is that any students capable of using Python without freaking out are not the ones struggling in mathematics. I had been hoping for a program like the one I just described above; it’s the sort of thing Google’s resources would be perfect for.

tieandjeans, on November 29, 2010 at 10:40 am said:Jason –

While your assessment about students who wouldn’t freak out when facing down a Python prompt may be correct for current 10th/11th graders, it doesn’t track for younger kids. If you expose 7th and 8th graders to programming, they *all* freak out (or, in the case of Scratch, occupy themselves with the draw/animation tools and ignore the programming concepts). But a well-structured introduction in late elementary or middle grades can push through that initial discomfort.

My concern is that we’re trading one set of ancillary tools (hand-calculation) for another (elementary programming). Now, I like to code, and I think some basic fluency in how software is made is worthwhile for almost any current student. Asking a 6th grader to write a function that calculates the LCM of any two integers does require that they have clear mastery of the LCM concept, but it wraps the math in a bunch of non-math skills. If your goal is only fluency with fractions, this is wasted time and unnecessary overhead. If you’re building toward Wolfram’s vision of revised maths, then it’s essential to the endeavor, the same way hand-calculation is essential to our current curricular environment.
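For concreteness, the 6th-grade exercise described above might look something like this sketch (building LCM on Euclid’s GCD algorithm is my choice of approach, not part of the comment):

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace the pair with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

def lcm(a, b):
    """Least common multiple: the product divided by the shared factor."""
    return a * b // gcd(a, b)

print(lcm(4, 6))  # 12 -- the common denominator you'd use to add 1/4 + 1/6
```

Note the overhead being described: the student needs loops, tuple assignment, and integer division on top of the actual LCM concept, which is the “non-math skills” cost weighed against the payoff (the result connects straight back to fraction fluency).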

Jason Dyer, on November 29, 2010 at 1:33 pm said:Read that as (group of students who can use Python without freaking out) is a subset of (group of students who are doing well in mathematics) but not the converse.

“Asking a 6th grader to write a function that calculates the LCM of any two integers does require that they have clear mastery of the LCM concept, but it wraps the math in a bunch of non-math skills. If your goal is only fluency with fractions, this is wasted time and unnecessary overhead.”

I get the impression Conrad thinks the extra skills are a necessary societal shift; see Stephen Wolfram’s recent discussion of natural language programming in Mathematica 8 for other musings in that direction.

tieandjeans, on November 29, 2010 at 6:25 pm said:I’d agree that it’s a very advantageous societal shift in the near-term. If my math department decided to abandon our current math texts and invest themselves in building a system like this from the middle grades up, I think we’d see a significant positive result.

There are a number of open questions about how to build such a program, whether it lends itself to independent assessment, and how to sustain one for a decade or two. But in my observation, much of the current system is tripping over all of those hurdles right now, so that doesn’t dissuade me from wanting to try this as a new approach.

Random, on November 30, 2010 at 8:28 pm said:Somewhat related, I think:

“Teaching General Problem-Solving Skills Is Not a Substitute for, or a Viable Addition to, Teaching Mathematics,” Doceamus column, Notices of the AMS, Nov 2010, http://www.ams.org/notices/201010/rtx101001303p.pdf

Definitely computers allow us to be more ambitious in our calculations, but if you use them as a “problem solving black box”, good luck!

As someone who has extensive experience with CAS, they are **not** a black box system. You’d better be prepared when you fire them up, because you never know what they’re going to spit back at you. You’d better know a *lot* of 19th and 20th century math to make sense of many answers (or to make sense of *why* the software doesn’t — or maybe cannot — understand what you mean, or read your mind).

Fun example: ask students to factor x^8-1 into irreducibles (trick question: over Q, R or C?).
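That exercise can itself be run through a CAS; here is a sketch using SymPy (assuming SymPy is available), which shows how the “trick question” plays out: the factorization depends on which field you work over, and the software only answers the question you actually asked.

```python
from sympy import symbols, factor

x = symbols('x')

# Over the rationals Q (SymPy's default): four irreducible factors.
print(factor(x**8 - 1))
# -> (x - 1)*(x + 1)*(x**2 + 1)*(x**4 + 1)

# Over the Gaussian rationals Q(i), the quadratic and quartic
# factors split further.
print(factor(x**8 - 1, gaussian=True))
```

Making sense of why the two answers differ, which is exactly the point of the comment above, requires knowing some field theory that the software will not supply.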

Keith Sherwood, on December 8, 2010 at 6:33 am said:The link is dead. I would love to see this paper. It might be that teaching problem solving skills does not relate to better math understanding…but it does equate to better performance in math class. Part of the reason is the HORRIBLE textbooks. The word problems my kids are faced with are intentionally made difficult to understand, are constructed with poor grammar, and contain clumsy wording. I understand that our intuitive notions of what works are sometimes wrong. But I need to see if this paper is differentiating between pure math understanding, and success in math learning. If you can repost a working link that would be great.

Jason Dyer, on December 8, 2010 at 8:22 am said:The link is ok, the auto-linking in WordPress included the parenthesis at the end. I have fixed the comment so you can just click.

Rich Beveridge, on December 1, 2010 at 3:44 pm said:I watched the Conrad Wolfram video yesterday and had a few thoughts about it.

First – at the beginning of his talk, he shows people using “math” in the “real world.” In other words, he shows scientists and other professionals using mathematical software and assumes this proves that people don’t need to understand math in order to use it. OK, but who WRITES the software being used? (Oh, right, Conrad Wolfram and his brother!)

Also, what if there isn’t a piece of software for the math you need to do?

Another point that caught my eye was the car analogy, where he says that if we learn how to drive, we don’t need to learn how the engine and transmission and suspension work (although I think that learning about internal combustion IS important).

I use a similar analogy in my classes, but with one important difference – I say

“We learn how to drive a car, but that doesn’t mean we don’t learn how to walk!”

Cars need roads, you can’t drive your car in the forest, but you might want to take a walk in the woods.

Jason Dyer, on December 1, 2010 at 4:26 pm said:I like your analogy.

“Also, what if there isn’t a piece of software for the math you need to do?”

That’s why he discusses programming, I’d imagine. Programming in a math language lets a lot of the computational internals be shuffled under the rug. (If you protest that this makes debugging hard, well, that was my original post’s point.)

Now, writing the internals of the software would require deeper understanding, but Conrad’s argument seems to be that only a small number of people need to do this, just as most programmers cannot, and need not, code in assembly language.

TI-Inspire « Andrew's Math Site, on January 3, 2011 at 1:52 am said:[…] On removing calculation from the curriculum […]

Computer Based Math Redux « The Number Warrior, on February 15, 2013 at 9:24 am said:[…] posted before about Conrad Wolfram’s efforts to remove calculation from the curriculum and make everything computer based. There is now a website devoted to the initiative (http://computerbasedmath.org) and Conrad […]

#WeHateMath Math without Calculations? | We Hate Math, on May 5, 2014 at 8:04 pm said:[…] ran across a couple of articles that discuss something that I’ve been pondering (and talking about on this blog) for a while now. […]