
I think it depends on what type of programming you want to do. As far as being a programmer in the business world goes, I would say that the answer is no. You can become a great programmer without knowing advanced mathematics. When you do end up having to deal with math, the formulas are usually defined in the business requirements so it only becomes a matter of implementing them in code.

On the flip side, if you want to become a low-level programmer or, say, create 3D graphics engines, mathematics will play a huge role.

I'm going against the grain and saying **yes, you need a math mindset**. Most people think of math as doing arithmetic or memorizing arcane formulas. This is like asking if you need perfect spelling or an extraordinary vocabulary to be a good writer.

Writing is about communication, and math/programming is about the process of clear, logical thinking (in a way that you can't make mistakes; the equation doesn't balance, or the program doesn't compile). Specifically, that logical thinking manifests in:

- Ability to estimate / understand differences between numbers: O(n^2) vs O(lg(n)), intuitive sense of KB vs MB vs GB, how slow disk is compared to RAM. If you don't realize how tiny a KB is compared to a GB you'll be wasting time optimizing things that don't matter.
- Functions / functional programming (is it any coincidence that the equation f(x) = x^2 is so similar to how you'd write that method? The words "algorithm" and "function" were around in the math world far before the first computer was born :-))
- Basic algebra to create & reorder your own equations, take averages, basic stats
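The mindset in the list above can be made concrete in a few lines of Python (the numbers are purely illustrative):

```python
import math

# The equation f(x) = x^2 reads almost the same as the code for it.
def f(x):
    return x ** 2

# A rough feel for orders of magnitude: at n = 1,000,000,
# an O(n^2) algorithm needs ~10^12 steps; an O(n log n) one needs ~2*10^7.
n = 1_000_000
quadratic_steps = n ** 2        # 1,000,000,000,000
nlogn_steps = n * math.log2(n)  # roughly 20 million
print(quadratic_steps / nlogn_steps)  # the quadratic one does ~50,000x more work
```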

So, I'll say you need a math mindset, being able to construct & manipulate mental models of what your program is doing, rather than a collection of facts & theorems. Certain fields like graphics or databases will have certain facts you need also, but to me that's not the essence of being "good at math".

There are many different fields of programming, and many of them don't require a particularly high standard of mathematical knowledge. Without strong maths you may never be able to write a 3D engine, but you will certainly be able to develop business and web applications. Let's face it: the most common mathematical operation in most computer programs is incrementing a number by one.

I'll quite happily admit I've never particularly liked maths or been good at it (I actually graduated with a degree in English Literature!) and have worked as a professional developer for over 12 years now. I develop mostly web applications, which rarely require that much maths. More important is the ability to think logically, be able to break problems down into chunks and have a wide understanding of the various technologies and frameworks involved.

As a programmer you are much more likely to have to implement an *existing* algorithm than devise an entirely *new* one. Need to work out, say, compound interest? You don't need to figure it out yourself; just look up the formula and apply it. Most of the problems have already been solved, you just need to know how to implement the solutions in your language of choice. That's not to say that being good at maths wouldn't be an *advantage*; it's just that it isn't totally *essential*.
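For instance, the looked-up compound interest formula, A = P(1 + r/n)^(nt), takes only a minute to turn into code (the sample figures below are invented):

```python
# Compound interest: A = P * (1 + r/n) ** (n * t)
# P = principal, r = annual rate, n = compounding periods per year, t = years.
def compound_interest(principal, annual_rate, periods_per_year, years):
    return principal * (1 + annual_rate / periods_per_year) ** (periods_per_year * years)

# $1000 at 5% compounded monthly for 10 years:
print(round(compound_interest(1000, 0.05, 12, 10), 2))  # about 1647.01
```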

When I was at school in the mid-80s, when home computers were not very common, I often wrote programs to solve my maths homework. I often couldn't do it in my head, but I could apply whatever formula was required as a software routine. You don't need to be another Pythagoras to work out the longest side of a right-angled triangle; you simply need to be able to code up `a² + b² = h²` in your language of choice.
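That formula really is a one-liner in, say, Python:

```python
import math

# Longest side of a right-angled triangle: h = sqrt(a^2 + b^2)
def hypotenuse(a, b):
    return math.sqrt(a**2 + b**2)

print(hypotenuse(3, 4))  # 5.0
```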

You don't have to be good at math, but you do have to be good at logic and problem solving. That said, people who are good at logic and problem solving are usually good at math as well. I would say it really depends on the type of math: you can be terrible at calculus (like me) and still be a good programmer (like me), but if you have trouble with discrete math and set theory, you would probably find a lot of aspects of programming quite hard.

I think it's important to look closely at why you don't like maths.

A dislike of an academic discipline is usually something that happens at school, and may be down to a conflict of some sort or another with a teacher, lack of confidence in your own ability within a subject, or peer group pressure.

Programming != maths. It doesn't even "feel" like maths, for me (and I enjoyed maths, despite not doing so well towards the end of my formal studies in it). Many skills that you might use in maths are useful, necessary even in programming, but many programmers teach themselves for the most part. Not liking maths in school has pretty much zero bearing on your ability or enjoyment of programming.

Math and programming are very closely related as math is really the universal language between humans and computers. You do not need to know a lot of math for high level programming as a lot of that is behind the scenes, but it will aid in comprehension for a lot of more advanced programming concepts. If you plan to do more low level programming (systems or device programming), then you will need to know a lot more math.

A *good* one? Very unlikely. Most design patterns have at least some basis in mathematical concepts. Things that are essential to programming, like variables, loops, procedures, and objects, are analogues to concepts in mathematical fields like algebra, calculus, and set theory.

Consider also that computer science is a subset of mathematics: algorithms and formal logic, upon which all programming is based, are fundamentally mathematics.

If you hate math, you're going to *hate* programming.

Almost everybody has answered: "do you need to know math to be a good programmer?" The correct answer to this is: "No, not really, but it helps," as many have already said.

But my interpretation of the question is "is there a strong correlation between mathematical aptitude and programming aptitude?" The correct answer to this is: "Yes, there is." If you struggle through algebra, geometry, and calculus, then you probably aren't very good at dealing with abstractions and/or thinking logically. If you're bad at math, you probably won't ever be a great programmer. (Not that you shouldn't try.)

It depends on what you're programming. A 3D game engine, for example, would be extremely difficult (if not impossible) to pull off with any degree of coherency without knowledge of the appropriate mathematical concepts.

"Like" and "be competent at" are entirely different things, so as long as you are properly numerate, I can't see a reason why you would *have* to like maths.

But let's be absolutely clear here: programming has a strong basis in maths, and sooner or later almost any non-trivial development is going to involve calculations. You can't hide from this.

Any programming involves logic (a basis in maths); most modern programming probably involves things (like SQL) that involve set theory (even if it's not obvious); and if it doesn't, you may well be off in realms (like games programming) that are even more explicitly maths-based (rendering: maths; AI: probability and randomness, i.e. maths...), and so it goes on.

The upshot of the above is that you have to be comfortable with numbers - you certainly have to get why "There are 10 types of people in the world, those who understand binary and those who don't" is funny. But you're probably excused "2 + 2 = 5... for very large values of 2".

The fundamental concepts of maths are these: devising, understanding, implementing, and using algorithms. If you cannot do maths, it is because you cannot do these things, and if you cannot do these things, then you cannot be an effective programmer.

Common programming tasks might not need any specific mathematical knowledge (you probably won't need vector algebra and calculus unless you're doing 3D graphics or physics simulations, for example), but the underlying skill sets are identical, and a lack of ability in one domain will be matched by a corresponding lack of ability in the other.

To be honest, I was a horrible math student in school. Algebra was completely beyond me at the time, and I don't think I ever got higher than a D in it.

However, a few years later, after having worked as a professional software developer, I went back to college and took a course in algebra. To my amazement, it was the easiest class I had, and I got an A in it.

Truth was, *programming taught me algebra, because virtually everything is just an algebraic expression.*

So no, you don't need it to start. It helps, but it isn't required. The beautiful thing about software development as a means to teach math is that the compiler, debugger, and executing program are wonderful ways to verify that you've got the answer correct. In this regard, debugging particularly is a huge boon to learning, because you can step through the code and watch each step of your algorithm's evaluation.
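The point about code being algebra can be illustrated directly: rearranging a formula and rearranging a function are the same move (the invoice formula here is made up):

```python
# total = price * qty + shipping ... solved for price, exactly as in algebra class.
def total(price, qty, shipping):
    return price * qty + shipping

def price_from_total(t, qty, shipping):
    return (t - shipping) / qty

t = total(12.5, 4, 5.0)             # 55.0
print(price_from_total(t, 4, 5.0))  # recovers 12.5
```

Stepping through either function in a debugger is, in effect, checking your algebra line by line.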

It somewhat depends on what exactly you are doing, though it definitely can't hurt.

For example, someone who majored in Computer Science has to go through a lot of math to get their degree. CS generally focuses heavily on algorithms and their correctness, proven through high-level math-style proofs. Many universities' CS programs are so close to their math programs that a double major is only a few courses away. Even as a Software Engineering major myself, I was two courses away from a math minor.

However, that being said, a lot of the proofs, data structures, search methods, and algorithm correctness stuff that I learned hasn't really been put to direct use since I finished school. But it would be hard for me to say that it didn't at least give me a good foundation and better understanding of what I do at a low level.

Because no matter how you look at it, at the lowest level, everything you are doing boils down to math.

Can you become a good software developer without maths? Yes, I think so. Can you become the sort of heroic programmer that people talk about all the time? I think not.

The problem is, most, if not all, heroic programmers (think Dennis Ritchie), have computer science or maths backgrounds. To become a truly great programmer, you need to understand algorithms at a level that's more than just superficial, which means you're forced to delve into formal computer science. And computer science is just applied maths.

Similarly, an understanding of lambda calculus would be invaluable to an OS architect or a language designer.

I have seen this topic argued back and forth. I have worked with people who had degrees in mathematics who thought they could program and, within a year or two, changed careers. One of the best programmers I ever had the pleasure to work with had a Ph.D. in Biochemistry and never took a formal programming/CS class in school, but taught himself and started a successful software company!

Ultimately, what makes a good-to-great programmer is someone who is capable of understanding logic and workflows, can learn by example, and is willing to research for a solution. Also, you HAVE TO LEARN THE BUSINESS your applications are for. I hate programmers who are proud that they don't understand accounting, yet write accounting applications. They always make incorrect assumptions and really slow down development.

You will find, no matter what school you go to, that you will learn more in one year out of school than you did in four years in it. School teaches you how to learn, with a basic skill set, but real-world experience is much more valuable over time.

Experience is the best teacher, and when you have to apply mathematics to software development, as long as you learn the business, you will be fine. Also remember, as an earlier post said: unless you are trying to work on a 3D graphics engine or geographic coordinate systems like a GIS application, the math you learned through high school is all you really need.

I've worked on accounting and billing systems, and I never had to figure out log(x), sin, cos, etc. for handling a general ledger or allowing data entry. An aging journal is not "high mathematics", but it is critical to evaluating AP issues.

Come to think of it, I've never met an accountant with a scientific calculator on their desk!

The short answer is no. I think it's a bit of a myth, but it's propagated because maths problems are usually well suited to being solved by computers.

So in uni/college, people get maths problems that they need to solve in compsci subjects, but what you will usually find is that the maths is actually harder than the code needed to implement the solution.

Once you get into the real world, you'll increasingly find that the problems are largely solved for you, your job will just be to implement them in code.

You have to either learn math, or create your own. Either way it is important to be good at it in some form or another.

As long as you can work with values and understand what they are doing, why and what you can **make** them do, then traditional mathematics may not always be necessary. Occasionally it even gets in the way.

There are alternative ways to visualize a byte's value other than numbers, but numbers are most definitely the most sought-after method. It would be feasible to write a program thinking of all values as colours, for instance.

Today's programming derives much of its value from being able to represent 1s and 0s as different types of data. Even though those 1s and 0s aren't really numbers at all but changes in electrical signals, so physics is as much at play as math, *however*... math is still very important for understanding a great deal of what other programmers say and code.

Still it would be *possible* to be a good programmer without math, however difficult.

I just finished an intro course to discrete math, and I found that I *already* knew almost everything about predicate logic thanks to programming; all that was new was the syntax--it was basically just working with booleans.
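The "it was basically just working with booleans" feeling is easy to demonstrate: a propositional identity like De Morgan's law can be checked exhaustively in a few lines of Python:

```python
from itertools import product

# De Morgan: not (p and q)  ==  (not p) or (not q), for every truth assignment.
for p, q in product([True, False], repeat=2):
    assert (not (p and q)) == ((not p) or (not q))

print("De Morgan holds for all four truth assignments")
```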

In short: perhaps you do not have to learn math *explicitly*, but just by being a programmer you have probably learned some math without realizing. That is, by being a "good programmer", you are also really being a mathematician (to some extent).

The Curry-Howard Correspondence illustrates what I mean: basically, it states that mathematical proofs and certain computer programs are "isomorphic", that is, they are different ways of writing the same thing. Of course, it is actually more complex than this, but I'm not a mathematician, so this is the best explanation I can give. Hopefully it isn't too far off the mark.

In summary, not only do many fields in CS and programming involve a lot of math, but even basic programming ideas (e.g. booleans) are basically math in disguise.

This is a very hard question to answer and will likely stir up a lot of debate.

One of the reasons why this question is so hard is that it partly depends upon what type of work you are doing. There is not a lot of math involved with most business applications, so you can get by with a solid understanding of algebra and business math. However, more advanced applications call for more advanced math and you start needing a solid understanding of calculus, linear algebra, and the like.

However, that is just one part of the equation, in that you still need a certain degree of mathematics for the practice of programming itself. It goes without saying that you need to be comfortable with logic, as well as basic algebra, just to write a basic program. Looking beyond getting a basic program working, you need an understanding of certain aspects of discrete mathematics to determine what makes a good algorithm for a given problem.

To get back to the heart of the question, though: I personally don't think you need to be a mathematician to be a good programmer; however, I do think you need to be comfortable with math to be a good generalist programmer.

Yes, definitely.

Even run-of-the-mill business programming requires some skill at math.

Run-of-the-mill business programming requires database skills. Being a good database programmer requires an understanding of how databases work and of the algorithms the query processor uses when it translates your queries. Without an understanding of limits and derivatives (or even the basic understanding that the line y = x intersects the curve y = x^2 twice), it is not feasible to accurately compare a hash-match inner join query plan with a nested-loop join query plan.
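A hypothetical back-of-the-envelope cost model (constants invented purely for illustration, not a real optimizer's formulas) shows why the curves matter:

```python
# Toy cost model: nested-loop join does ~n*m comparisons; a hash join does
# ~n + m work (build + probe) but with a larger constant factor.
def nested_loop_cost(n, m):
    return n * m

def hash_join_cost(n, m, constant=10):
    return constant * (n + m)

# For tiny tables the nested loop wins; for large ones the curves cross,
# just like y = x vs y = x^2.
print(nested_loop_cost(5, 5), hash_join_cost(5, 5))                      # 25 vs 100
print(nested_loop_cost(10_000, 10_000), hash_join_cost(10_000, 10_000))  # 100000000 vs 200000
```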

Also, a good programmer can work in just about any domain, provided that they study up a bit: games, simulation, embedded development, compilers, operating systems, web stuff, databases, etc. Being able to do all those things (or more accurately being able to quickly learn how to do all those things) requires a decent amount of mathematics background.

I'd say that one should have had some experience at some point in time with the following:

- 3 Semesters of Calc
- Diff Eq
- Linear Algebra
- Modern Algebra
- Basic probability, counting, and statistics

Math is more than just formulas. Understanding some mathematical principles from set theory is very useful for grasping complex concepts in type systems, just as understanding complexity is paramount to using data structures efficiently.

Graph theory is also extremely useful, as many programming problems can be modelled by a graph. I was astonished, while developing a business application, to find that the shortest-path theorems provided an elegant solution to a thorny problem I had!
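For readers who haven't met it, a minimal Dijkstra shortest-path sketch (the graph below is invented) shows how little code such an "elegant solution" can be:

```python
import heapq

def shortest_path_cost(graph, start, goal):
    """Dijkstra's algorithm; graph maps node -> list of (neighbor, weight)."""
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry, a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(pq, (nd, neighbor))
    return None  # goal unreachable

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 1)]}
print(shortest_path_cost(g, "A", "C"))  # 2: A -> B -> C beats the direct A -> C edge
```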

I have always considered programming to be nothing *but* mathematics.

It just doesn't look like high school algebra.

Maths is the ante-room of programming.

Maths is the perfect training ground for all of it: working with layers upon layers of abstraction, models, "objectification" of functions, transformations, and temporal concepts.

It is possible to develop the right mindset to programming without maths but it's a lot harder.

However, specialist areas aside, only *understanding* maths is important; knowing the name of everything and how a given theorem is proved isn't. So if you got good marks in maths by learning it all without really understanding it, you will still struggle with programming.

Math knowledge is good for some applications (like gaming, artificial intelligence, computer graphics, etc.), but math teaches you something beyond just formulas and complex equations.

Learning math is like learning a new programming language. In fact, programming is applied math. When you learn a new language, you learn a lot of things that make you a better programmer. It is no different with math, and if you really master math, you will be a better programmer forever, even if you don't use advanced math in your job.

The reason is simple: math teaches you to see the world with other eyes. It teaches you to solve problems with different approaches, without necessarily programming. This new way of thinking certainly leads you to a better way of doing your job.

Programming is an art. Math is an art. If you combine both of them you will be a better artist.

No.

Much like most science disciplines, having a good understanding of maths concepts is going to be helpful, particularly when evaluating things like efficiency. But for most programming tasks your maths ability is only relevant if the problem you're solving is related to maths.

Computers are excellent at doing mathematics, so it makes sense that early computers were used extensively to do the 'grunt work' associated with a lot of complex maths work. A lot of software still **does** solve complex maths problems, in that case being good at maths will help you write a better program, but it's not what makes you a good programmer.

Usually not, in the sense that you don't need to know, say, calculus or trig equations to do most work. If you're doing heavy graphics/game programming, then yes; the famous fast inverse square root hack from the Quake III source is a good example of this. However, the kind of thinking you develop while dealing with higher-level math certainly is applicable to programming: with programming, you are developing your own logic structure, your own functions, your own "proofs."
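The hack in question is (presumably) the famous fast inverse square root from the Quake III source, ported here to Python purely for illustration:

```python
import struct

# Quake III's "fast inverse square root": approximate 1/sqrt(x) by reinterpreting
# the float's bits as an integer, shifting, and subtracting a magic constant.
def fast_inv_sqrt(x):
    i = struct.unpack("<I", struct.pack("<f", x))[0]  # float bits as uint32
    i = 0x5F3759DF - (i >> 1)                         # the famous magic number
    y = struct.unpack("<f", struct.pack("<I", i))[0]  # bits back to float
    return y * (1.5 - 0.5 * x * y * y)                # one Newton-Raphson step

print(fast_inv_sqrt(4.0))  # roughly 0.499, vs the exact 0.5
```

Inventing that trick took real mathematical insight; applying it, as the answer says, takes none.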

The only time I've run across math in my job (internal enterprise workflow and apps) is when I do some reporting apps that require knowledge of statistics, but that's only because it was directly applicable to requirements.

I would say not necessarily. Certain programming disciplines (crypto, graphics, physics engines etc) would definitely have a clear advantage for the mathematically inclined, but I don't think a good understanding of differential equations would be particularly useful for web programming for instance.

Boolean logic is probably a requirement to be a good programmer, but it wouldn't surprise me if a lot of people who didn't achieve good marks in high school maths turn out to be good at programming.

I would say you definitely don't need to be good at math to be a good programmer.

My first job as a programmer was doing 3D graphics for B-52 and Cruise Missile mission planning. It was a math-intensive application, but I really only needed access to people who were good or great at math. I didn't need to know the formula for computing great-circle distances between two points; I did need to know how to convert the formula so that it worked in a programming language. The same with flight simulation: Boeing did all the math, we just had to apply it.

That experience also helped me get a sense of who would be a good programmer and who wouldn't. The job had pilots and navigators taking a tour of duty as programmers, partly to help the programmers understand the needs of the mission. You could usually tell within a few weeks which pilots and navigators would be good at it. Math majors usually took to programming right away.

So I would say being good at math makes it more likely you will be good at programming, but I know a lot of good programmers that aren't so good at math.

I don't like math and I always got low math grades. I won't claim to be a good programmer, but I've been in the software industry for 10 years with great success.

Is it possible for people who don't like math to become a good programmer?

No, no-no, no, yes and no!

No, because often you need it.

```
(! (a | (! (b && c) || d) && (! e)))
```

Why doesn't it work?

```
foo ('a', 'b', 19, g(h))
bar ('c', 'd', 44)
```

Can it be rewritten in a more abstract way?

Is 968 ms more or less than 0.7 s? How many MB do you need? How many GHz does the machine have? Will a byte be enough? Math is an everyday part of the job. Sometimes it's explicit, even higher math.

And it is always, implicitly, at least lower math.
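Even the millisecond question above is "lower math" with units:

```python
# Keep units consistent before comparing: convert both values to seconds.
elapsed_ms = 968
budget_s = 0.7
print(elapsed_ms / 1000 > budget_s)  # True: 968 ms exceeds the 0.7 s budget
```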

Math is a wide field, from arithmetic to matrices, geometry, logic, statistics, category theory, and graph theory. So if you believe you're programming without using math, maybe you're wrong.

If you look at the problems on the Project Euler page, you will find puzzles where I have no idea how math is used to solve them. (Not that I could solve them without math.) Note that the problem sizes are normally so big that you can't solve them by brute force.

However, since I can't solve a lot of them (about 2/3 by now), does that mean I don't like math?

If you didn't study math, you will probably not know where you can find math in your daily life, including programming.

Even if you just specialise in moving GUI components around the screen to make them look good, you're doing math in some way.