I never said you should start with C++ (at least, not in this thread). C# is a much nicer first language, since the MSDN documentation & tutorials are excellent, and you can apply it directly in the Unity game engine and jump right into games/visualization if that's your thing. C++ is what you would learn afterwards.
I'd think professors would know better than most which languages are employable. (haha. no. no they don't)
Still, it's quite telling if they won't even teach it in a core computer science degree. I think C and C++ are great for learning how computers work and why object oriented is so awesome, so there's no loss if you learn them properly. But if you want to jump straight into coding as fast as possible, just do Python or something.
Multithreading and parallel work kinda fall into the low-level field these days.
Everyone has different opinions on what's good, though. Back when I was in uni, they taught Haskell as the introductory language (C for advanced students), and this was supposedly the best comp sci university in Australia. So feel free to follow or throw away anyone's advice here, because I'm sure any advice will conflict with some professional academic's. What matters is that you learn something.
First, your default assumption should be that CS professors know nothing about the industry until they prove otherwise. If they were ever in the industry at all, it was a long time ago, and they are no longer in it, because they are professors now. Which means everything they say should be assumed to be inaccurate, out of date, or blatantly wrong. For example, when I was visiting universities to figure out where I wanted to go, I talked to the head of a medium-sized in-state CS program. I asked what they did with video games, and the response I got was that "There really aren't jobs in video games; that work is all outsourced." Very close to the exact opposite of the actual state of things.
Second, I strongly suspect that Java and Python are so frequently the most in-depth programming taught not because they are actually better, but because they are easier for professors to teach and for students to learn when that is all that gets taught. Can you get a job with them? Sure. But it won't be an industry-leading job.
Third, object oriented is not awesome. In fact, it can be extraordinarily terrible.
If you want good performance out of your code, you will toss most of OOP out the window at the first chance you get. And with a highly complex system like a game engine, dropping it can also help readability, by keeping important operations in one place instead of scattering them to the winds as member functions across dozens of classes and subclasses in a hierarchy. OOP is great when you're new and find the abstractions of code hard to think about, but it certainly isn't the end-all paradigm.
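To make that concrete, here's a minimal sketch of the contrast in C++. The particle system and every name in it are mine, invented purely for illustration, not taken from any real engine. The OOP version drags every field of every object through the cache just to move it; the data-oriented version keeps the hot fields packed together and the whole operation in one function.

```cpp
#include <cstddef>
#include <vector>

// OOP style: each Particle owns all its data plus its own update method.
struct ParticleOOP {
    float x, y;                    // position (hot data)
    float vx, vy;                  // velocity (hot data)
    float mass, charge, lifetime;  // cold data, loaded into cache anyway
    void update(float dt) { x += vx * dt; y += vy * dt; }
};

// Data-oriented style: fields the update loop actually touches live in
// contiguous arrays the CPU can stream through; cold data sits elsewhere.
struct Particles {
    std::vector<float> x, y, vx, vy;            // hot, tightly packed
    std::vector<float> mass, charge, lifetime;  // cold, untouched here
};

// One free function owns the whole operation instead of scattering it
// across a class hierarchy.
void update(Particles& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
    }
}
```

The second layout is also trivially friendly to SIMD and to the parallel code discussed next, which is a big part of why engine programmers reach for it.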
Fourth, multithreading and highly parallel programming are precisely what people should be learning about. A modern CPU has half a dozen cores; without knowing the basics of load balancing and how to write parallel-friendly code, you are stuck using a tiny fraction of the chip. Then there's the GPU, which has upwards of two orders of magnitude more raw throughput than the CPU, making it immensely important for any big data, video game, or supercomputing work. If you can't write parallel code, you're throwing away literally more than 99% of your available computing power. In my book, that makes it as important as data structures and Big O notation.
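Even the CPU-side basics go a long way. Here's a minimal sketch in standard C++ (my own toy example, nothing more): it splits a big summation into equal contiguous chunks, one per hardware thread, which is the simplest form of the load balancing mentioned above.

```cpp
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<std::uint64_t> data(50'000'000, 1);

    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 4;  // fallback if the runtime can't report a count

    // Each thread writes only its own slot: no locks, no shared writes.
    // (A real implementation would pad these slots to avoid false sharing.)
    std::vector<std::uint64_t> partial(n, 0);
    std::vector<std::thread> workers;

    // Static load balancing: equal contiguous chunk per thread,
    // with the last thread picking up the remainder.
    std::size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t == n - 1) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end,
                                         std::uint64_t{0});
        });
    }
    for (auto& w : workers) w.join();

    std::uint64_t total = std::accumulate(partial.begin(), partial.end(),
                                          std::uint64_t{0});
    std::printf("sum = %llu across %u threads\n",
                (unsigned long long)total, n);
}
```

Equal static chunks are the crudest scheme; workloads with uneven per-item cost call for work queues or work stealing instead, but this shows the shape of the idea.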
The fact of the matter is, basic programming is diffusing through the culture of an entire generation; it's becoming a high-school-level topic. If you actually want a career in it, you need to be immersed in "the low-level field", or the children of today and tomorrow will put you out of a job by growing up already knowing everything you know. You need to outcompete them, and having seen some of the freshmen coming into the uni I went to, I can safely say there is a massive wave of programming literacy on the horizon that makes mere basic knowledge no longer competitive.