Sooooo, I'm not even close to being a "coder" or "developer", though I have some experience programming 8-bit AVR microcontrollers (quite successfully, but nothing awesome) in Bascom/C++ (even with a tiny bit of Assembly), and Mongoose's post gave me something to think about.
To get a supercomputer to do something, you don't need a brain made of one - you just need a basic understanding of what coding is and a handful of ready-to-use procedures with their documentation. That's basically what modern coding is: snapping building blocks together so that they work as needed... right?
Think about it: a procedure in a high-level language is built from many others written in a lower-level language (for anyone who knows these: Arduino vs. AVR C++), and a low-level procedure is in turn built from others in an even lower-level language (AVR C++ vs. Assembly). Below that there is machine code. *Someone* made the machine code; *someone* made C, Java, and Python, not to mention the stuff needed for convenient 3D rendering. Someone designed all of those so they don't collide (too much) with each other. Someone made the compilers.
Let's look deeper. Somebody designed the architecture - the entire system inside a computer - and coordinated all the devices present into a single, coherent, functional system.
You don't need to know everything about computers to code - or to be a good coder. Just feel this: even with the simplest code possible, you're wielding decades of brilliant work by tens of thousands of people. The little #include tag, the compiler, everything you use to get your program going is the result of the incredible brilliance of very many people.
That's why you don't need to know everything: history has already done it for you.
<Maybe if I knew English a bit better, this thing up there wouldn't be so silly>
By the way, it's amazing how many cool things one can do with a single 8 MHz ALU. For some time now, I've been thinking of chips like the Z80 with enormous respect, instead of being sure they're relics and pieces of junk.