Classes are about learning what to learn. Or at least good ones are. They say "Here are some things you should learn about; we won't waste your time by forcing you to wait for the slowest person in the class to understand everything, and instead will tell you to actually learn the material on your own time." Any intern-level codemonkey can read online documentation if they know what they're looking for. Becoming a good programmer isn't about learning new things, it's about learning how to learn new things. To become an observer from all sides, seeing things in every light and every perspective, regardless of whether they are new or old.
Programming is all about taking a complex task apart in your head. You start with "What do I want this to do?" and then figure out what that actually means. Computers are the lamp-inhabiting genies; they will do exactly what you tell them to, whether or not it is what you intended. The job of a programmer is thus not only to figure out a solution, but to reduce that solution to a form which is airtight and leaves no room for interpretation.
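A tiny illustration of the genie problem (my own example, not from any particular project): ask for "the average of 1 and 2" in the most obvious way, and the machine faithfully does what you said rather than what you meant.

```cpp
#include <cassert>

// "Average of 1 and 2" — the genie does exactly what you said:
// integer division truncates, so the intent is silently lost.
int naive_average(int a, int b) {
    return (a + b) / 2;          // 1 and 2 -> 1, not 1.5
}

// The airtight version states the intent explicitly.
double precise_average(int a, int b) {
    return (a + b) / 2.0;        // force floating-point division
}
```

Both functions are "correct" as written; only one matches the intent. That gap between specification and intent is the whole job.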
For example, one project I did several years back was 2D shadow calculation. I wanted to be able to feed in shapes, then figure out what shadows they would make based on light emanating from a point in the 2D space. By the time I finished planning out the systems and sub-systems I would use to code this, I had a 2000-word Google Doc written up, complete with a dozen or so diagrams. The code itself was even longer; something like 1800 lines. Get a single variable wrong, take a single mis-step in your logic, and you get bugs ranging from mundane to catastrophic. So then there's the second part of programming: debugging. For debugging, you must understand the code in question: how it flows, and how it sends information around and processes it. Then you figure out what is going wrong, and look for the origin of that error.
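To give a flavor of what one sub-system in such a project looks like (this is a hypothetical sketch, not code from my actual project): a standard step in 2D shadow casting is to extrude each occluder vertex directly away from the light, so the edge plus its two extruded endpoints form the shadow quad behind it.

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

// Push point p directly away from the light along the light->p ray
// by `reach` units. Doing this for both endpoints of an occluding
// edge gives the far corners of the shadow quad cast by that edge.
Vec2 extrude(Vec2 light, Vec2 p, double reach) {
    double dx = p.x - light.x, dy = p.y - light.y;
    double len = std::sqrt(dx * dx + dy * dy);
    return { p.x + dx / len * reach, p.y + dy / len * reach };
}
```

This is maybe five of those 1800 lines; the remaining complexity is in handling every shape, every edge case, and every degenerate configuration, which is exactly where the single mis-steps hide.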
So to be a good programmer, you need both an incredible memory, since you're keeping vast structures in your head, and critical thinking capable of ripping a problem into its component pieces or working through the logic of a complex structure someone else wrote.
Languages to learn: C++ is the biggest one. If you can code in C++, you know and can code with any other language in existence. That goes doubly if you know some semblance of an assembly language as well. Muz is absolutely wrong there. The reason is that C++ straddles multiple paradigms while also being relatively low-level. If programming is going to be your actual profession, you really need to learn this, as it will pretty adequately prepare you for just about anything that anyone can throw at you. Which is also why LB is wrong; while the specific languages may not exist today, nor the hardware for them, a programmer well versed in the fundamentals exposed by low-level programming can adapt to any of that without missing a beat. Words and languages are worthless; it's the paradigms and ideas behind them that are important. And there is only one thing which will change the hardware-level basics, and that's quantum computing, which affects only a small subset of problems, won't be useful for a decade or so, and will make perfect sense if you know the basics anyway.
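Here is what "straddles multiple paradigms" means concretely, in a few lines of my own (illustrative, not from any curriculum): C++ lets you write procedural code, object-oriented code with runtime polymorphism, and generic code with compile-time polymorphism, all in one file.

```cpp
#include <numeric>
#include <vector>

// Procedural: a plain function operating on plain data.
int sum(const std::vector<int>& v) {
    return std::accumulate(v.begin(), v.end(), 0);
}

// Object-oriented: runtime polymorphism via virtual dispatch.
struct Shape {
    virtual double area() const = 0;
    virtual ~Shape() = default;
};
struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

// Generic: compile-time polymorphism via templates.
template <typename T>
T twice(T x) { return x + x; }
```

Learn these three styles once in C++ and you'll recognize them everywhere: the first is C, the second is Java/C#, the third is the backbone of every generics system.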
But C++ is just the beginning. After that, there is the specialist knowledge. A generalist programmer won't really get hired; or if they do, they won't be as in demand or as well paid as a specialist.
A really hot topic these days is statistics and modern AI (as they are really one and the same). So read up on some AI rooted in Bayesian reasoning if you want to go that direction.
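The core of Bayesian reasoning fits in one function (this is the textbook formula, not anything specific to a library): the posterior P(H|E) = P(E|H)·P(H) / P(E), where the evidence P(E) = P(E|H)·P(H) + P(E|¬H)·P(¬H).

```cpp
#include <cmath>

// Bayes' rule for a binary hypothesis H given evidence E.
//   prior          = P(H)
//   likelihood     = P(E | H)
//   false_positive = P(E | not H)
double posterior(double prior, double likelihood, double false_positive) {
    double evidence = likelihood * prior + false_positive * (1.0 - prior);
    return likelihood * prior / evidence;
}
```

The classic surprise this exposes: a 99%-accurate test for a 1%-rare condition, with a 5% false-positive rate, still yields only about a 1-in-6 chance the condition is actually present. That counterintuitive arithmetic is why the statistics background matters.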
If you want to go into games or scientific computing, there's GPU programming (which is what I do). This involves using APIs like DirectX, OpenGL, OpenCL, and CUDA, and shading languages like HLSL and GLSL, to write code for the GPU; this is very important in games, since it's the core of graphics programming, and important for both games and scientific computing since a GPU has about 100 times the computational power of the CPU.
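The "hello world" of GPU computing is SAXPY (y = a·x + y). Sketched here as a plain CPU loop so it's self-contained; the point is that every iteration is independent, which is exactly the property a GPU exploits. In CUDA, the loop body would become a `__global__` kernel and the loop index would come from `blockIdx`/`threadIdx` arithmetic instead.

```cpp
#include <cstddef>
#include <vector>

// SAXPY: y[i] = a * x[i] + y[i] for all i.
// Each iteration touches only its own element, so on a GPU each
// iteration can be one thread, with thousands running at once.
void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
    for (std::size_t i = 0; i < x.size(); ++i)
        y[i] = a * x[i] + y[i];
}
```

Spotting which loops have this independent-per-element shape (and restructuring the ones that don't) is most of the day-to-day craft of GPU programming.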
Another in-demand specialty is network programming; network programmers are the people who write the low-level backend for specialty server applications and games. It's a really difficult skill, as it involves highly asynchronous operation, and accounting for everything that can go wrong over a network.
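"Accounting for everything that can go wrong" starts with never assuming the first attempt succeeds. A minimal sketch of the retry pattern, with a hypothetical `attempt` callable standing in for a real socket send (the C++ standard library has no networking, so the transport is left abstract here):

```cpp
#include <functional>

// Networks fail transiently and routinely, so robust code retries a
// bounded number of times instead of trusting the first attempt.
// `attempt` stands in for a real send/connect call that may fail.
bool send_with_retry(const std::function<bool()>& attempt, int max_tries) {
    for (int i = 0; i < max_tries; ++i) {
        if (attempt())
            return true;
        // Real code would also sleep with exponential backoff here,
        // e.g. base_delay << i, to avoid hammering a struggling server.
    }
    return false;
}

// Hypothetical flaky operation for demonstration: fails twice, then works.
struct Flaky {
    int calls = 0;
    bool operator()() { return ++calls >= 3; }
};
```

Retries are the easy part; the hard part is that real protocols must also handle duplicated, reordered, and half-delivered messages, which is where the asynchronous design work lives.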
Then there's database programmers. They write the code that manages massive databases, queries to those systems, etc.
There are plenty of other specialties, but those are some big ones.
I will say that at some point it does indeed *click*, like someone else said, but before that you'll just feel dumb. After that you'll feel a little smarter...
Also, this is false. It's a highly cyclic operation in which you learn more, open up your horizons, and see the vast unexplored wilderness before you. Again, it's all about learning what you need to learn.
For getting started, I suggest online tutorials. I've also heard good things about the (free) site Codecademy, though I've never used it. To put it quite bluntly, the internet was created by programmers, and so they utilize it better than any other profession: documentation and tutorials on any language, and in general any code you'll have access to, exist online. Google and similar search engines were also created by programmers, and so will very readily find useful information on the topics you want. IMO, the first part of becoming a good programmer, going back to where I started, is to learn how to learn: to enhance your Google-fu and your ability to phrase queries such that you can locate the information you seek in as timely a manner as possible. It's flippant to just say "google it," but googling is itself a process and a skill. Working it into your brain so that it's your default modus operandi upon pondering a subject will prove extremely valuable as a programmer.
Now for specific, highly relevant tutorials:
C++
http://www.cplusplus.com/doc/tutorial/

C#
http://msdn.microsoft.com/en-us/library/a72418yk.aspx