I think it's fair to have a card somewhere in the 75-150 watt range (the exact number shifts a little depending on what time period we're talking about), just in case, even if you don't play many graphically demanding games. It's a shame it became so difficult to achieve within a reasonable budget for something that's "just in case, might never use".
75W is the class that doesn't even have a separate power connector; they get all their juice from the PCIe slot, iirc. If you had something like a 1050 or a 1650, you can still be somewhat comfortable even in eye-candy games (it wouldn't be eye candy for you, but probably very playable, except for utterly poorly optimized stuff).
I applaud you for running that 760 until today: good card, good value, good patience... If you didn't play 3D games it might stay comfortable for another decade... If you want to slap a new card into an old computer, be careful with the very low end of the latest generations: some only have x8 PCIe lanes because they're banking on you having PCIe 4.0, so on an older PCIe 3.0 board that link runs at half the bandwidth (rough numbers below).
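To put rough numbers on that, here's a minimal back-of-the-envelope sketch; the per-lane rates are the nominal figures after encoding overhead, and the function name is just something I made up for illustration:

```python
# Rough PCIe bandwidth arithmetic (nominal figures, per direction).
# Per-lane throughput after 128b/130b encoding overhead:
#   PCIe 3.0:  8 GT/s -> ~0.985 GB/s per lane
#   PCIe 4.0: 16 GT/s -> ~1.969 GB/s per lane
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card in a PCIe 4.0 slot still gets roughly x16-gen3 bandwidth...
print(f"x8  @ 4.0: {link_bandwidth('4.0', 8):.1f} GB/s")   # ~15.8
# ...but drop that same card into an older PCIe 3.0 board and it halves.
print(f"x8  @ 3.0: {link_bandwidth('3.0', 8):.1f} GB/s")   # ~7.9
print(f"x16 @ 3.0: {link_bandwidth('3.0', 16):.1f} GB/s")  # ~15.8
```

Whether that halved bandwidth actually hurts depends on the game and how much the card spills over its VRAM, but it's worth checking before pairing a brand-new budget card with an old board.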
Since we're on the subject of graphics: I saw a discussion earlier today where someone criticised game development for focusing too much on graphics, and another person replied "buddy, that is not a controversial take"... and oh, if only they knew how wrong they are. There are people who overvalue the graphics of a game incredibly much. Take this for example: I've got a friend who played Ark survival around 10 years back, when it was still new and fresh. At the time a good gaming laptop struggled to run it, but managed... The guy is always making poor decisions based on a game's looks, and then get this: he recently expressed interest in the Ark re-release, which is expected to run poorly again on his newer laptop... I say, how about we play the same damn game that is finished and will actually run smoothly for once? "Meh"... It's unfathomable to me how he can make such poorly adjusted choices, but all the guys from my home village are ultra-dense when it comes to picking good games, and it's pretty depressing. I could tell you how to tailor your marketing material to sell a shit game to just about any one of them.