I've been reading Linux From Scratch (great book if you're specifically looking to waste time), and I've noticed something that strikes me as strange.
From what I've gathered, it basically goes:
1) Start with source code, and use your existing toolchain (note: I barely understand what a "toolchain" is) to compile the bare minimum needed to compile everything else on this new "system" (despite it living on the same physical machine as your host system).
2) Use this new toolchain on your new LFS system (via chroot; you can't boot off of it yet) to build everything else you need for a functional Linux system.
3) Use the toolchain... to recompile itself, this time on the LFS system after booting off of it.
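If it helps to see the shape of it, here's a toy, filesystem-level mock of those three passes. To be clear, nothing below is a real build: the paths and the `fake-gcc` file are made up for illustration, and the actual book uses a dedicated `$LFS` mount point (conventionally `/mnt/lfs`) plus real binutils/gcc/glibc builds.

```shell
# Toy mock of the three LFS passes; nothing here is a real build.
LFS=$(mktemp -d)            # stand-in for the real $LFS mount point
mkdir -p "$LFS/tools" "$LFS/usr/bin"

# Pass 1: the HOST toolchain builds a temporary toolchain into $LFS/tools
echo "built by the host compiler" > "$LFS/tools/fake-gcc"

# Pass 2: chroot into $LFS and use only /tools to build the final system
# (real LFS does roughly: chroot "$LFS" /tools/bin/bash; mocked as a copy here)
cp "$LFS/tools/fake-gcc" "$LFS/usr/bin/fake-gcc"

# Pass 3: after booting the new system, its own toolchain rebuilds itself
echo "rebuilt by itself" > "$LFS/usr/bin/fake-gcc"

cat "$LFS/usr/bin/fake-gcc"   # → rebuilt by itself
```

The point of the mock: each pass only reads from the layer the previous pass produced, which is the whole isolation trick.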
Look, on an intellectual level, I get it. You're trying to isolate the LFS system from all the host system's libraries so that the LFS system can actually run on its own without relying on the host at all. Step 3's there to verify that everything's correct; it's not strictly necessary, and skipping it is clearly the YOLO strat if you wanted to speedrun an LFS install. (Speedruns of Linux installs have been done before, it turns out, though not of LFS itself.)
But to my peanut brain, that's like, loopy as hell. If you did this on an x86_64 machine, that's x86_64 to x86_64 to x86_64. The book says you're supposed to do the whole thing on one machine (there's CLFS for cross-compiling to a different one, but that's a different story), so, denoting the one physical machine as A, that's also A to A to A.
Then again, what do I know? All the programming languages I've used (which isn't saying much) are interpreted, not compiled. I've never had to compile Python code, for instance. Toolchains? Compiling? Linking? The hell are those?