Aren't pointers just a way to pass by reference?
I think so, essentially. Pointer syntax confuses the heck out of me, though (the inconsistent use of asterisks depending on context).
More like "references are just safe pointers with abstraction overhead".
A pointer is an integer value which stores a memory address. That's it.
C/C++ let you have typed pointers, which conveniently know what data type they are pointing to.
C's grandparent language, BCPL, did not have typed pointers; in fact, it had no real pointers at all, only integers, which it got creative with.
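For the concrete-minded, here's a minimal C++ sketch of both points: the pointer is just an address, and the type tells the compiler how big each element is.

    #include <cstdio>

    int main() {
        int arr[3] = {10, 20, 30};
        int *p = arr;                 // a pointer is just an address
        printf("%p holds %d\n", (void *)p, *p);  // dereference: read the int at that address
        ++p;                          // typed: advances by sizeof(int) bytes, not by 1
        printf("%p holds %d\n", (void *)p, *p);  // now points at arr[1]
    }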
A reference is a unique identifier into a map of objects that the runtime library is tracking and managing for you: it keeps track of how often and where each object is used, and reuses its memory when nothing is using it any more. When you get the value of a string object, for example, you ask the runtime to look up the ID. It goes to the correct place in memory, and it might increment a counter recording how many places hold the reference, might copy the value to a new location, might do a lot of things before returning the value (or another reference). And then some.
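Here's a toy model of that idea in C++ (names invented for illustration; this is not how any real runtime is implemented): a "reference" is just an ID into a table the runtime owns, and every access goes through a lookup plus some bookkeeping.

    #include <cstdio>
    #include <string>
    #include <unordered_map>

    // Toy runtime: a "reference" is an ID into this table, with a use count.
    struct Entry { std::string value; int refcount; };
    std::unordered_map<int, Entry> heap;   // the runtime's object table
    int next_id = 0;

    int alloc(std::string v) { heap[next_id] = {std::move(v), 1}; return next_id++; }
    void retain(int id)  { ++heap[id].refcount; }                   // another place uses it
    void release(int id) { if (--heap[id].refcount == 0) heap.erase(id); }  // reuse memory

    const std::string &get(int id) { return heap.at(id).value; }    // every access is a lookup

    int main() {
        int ref = alloc("hello");      // the program never sees a raw address
        printf("%s\n", get(ref).c_str());
        release(ref);                  // count hits zero -> memory reclaimed
    }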
Core Difference: Pointers are much faster, more efficient, and more dangerous than OO references, because...
1) References are accessed through a pile of invisible runtime machinery, and their allocation/deallocation are handled by the garbage collector.
2) Pointers are dereferenced directly by the CPU, and their allocation/deallocation are handled by YOU.
Pointers are also the primary source of angst for C programmers hunting a difficult-to-isolate bug, not to mention buffer overrun/underrun security problems.
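A deliberately buggy C++ sketch of the two classic failure modes; neither mistake is guaranteed to crash on the line where it happens, which is exactly why these bugs are so hard to isolate:

    #include <cstdlib>
    #include <cstring>

    int main() {
        // Allocation and deallocation are handled by YOU:
        char *buf = (char *)malloc(8);
        strcpy(buf, "12345678");   // BUG: 8 chars + '\0' = 9 bytes -> one-byte overrun
        free(buf);
        buf[0] = 'x';              // BUG: use after free; may "work", may corrupt the heap
    }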
There is a good reason why automatic memory management and garbage collection have become so popular in modern applications, especially in the business community.
There is also a good reason that most graphics intensive games are still written in C/C++.
The problem I have with people starting with languages like Java and C# is how far removed those languages are from the actual computing, because their huge standard libraries get students fixated on just one way of addressing problems. In those languages you don't solve problems; you merely implement other people's solutions. That's fine once you deeply understand the concepts, but it undermines learning if you don't.
I usually suggest Python to start, because it keeps its concepts fairly base-level (even though it's a scripting language), and the libraries are largely third-party with full source code to analyze. It teaches core concepts and best practices, but leaves you to climb the mountain yourself. Yet it still handles you with kid gloves.
But I only suggest it for introductory programming.
You can never achieve your potential if you don't grok how the CPU functions, or don't understand all the invisible OO magic that happens when you say:
foo.property = bar.method( parameter );
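Spelled out in C++ terms (a rough sketch; the class and method names are invented, and real managed runtimes do even more, such as GC write barriers), that one-liner hides something like this:

    #include <cstdio>
    #include <stdexcept>

    // Toy model of what a managed runtime does behind
    //   foo.property = bar.method( parameter );
    struct Bar {
        virtual int method(int parameter) { return parameter * 2; }  // dispatched via vtable
        virtual ~Bar() = default;
    };
    struct Foo { int property = 0; };

    int main() {
        Foo *foo = new Foo;
        Bar *bar = new Bar;

        // The "invisible magic", made explicit:
        if (bar == nullptr) throw std::runtime_error("NullReferenceException");
        int tmp = bar->method(21);   // indirect call through the vtable
        if (foo == nullptr) throw std::runtime_error("NullReferenceException");
        foo->property = tmp;         // a real runtime may also notify the GC here

        printf("%d\n", foo->property);   // 42
        delete bar;
        delete foo;
    }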