As long as you're running a discrete graphics card from AMD or NVIDIA, you're on a system powerful enough to run the game with most settings cranked up.
Intel graphics is VERY hit or miss, especially with compatibility.
Running a chip with 2 or more physical cores is optimal.
Hyperthreading will help free up processor cycles; just don't count on virtual cores the way you would physical cores.
If you're building new, an 8th, 9th or 10th Gen i3 (low end) chip will kick the game in the balls and scream for more.
An 8th, 9th or 10th Gen i5 is just more of the same. It just uses steel-toed boots.
A similar 8th, 9th or 10th Gen i7 adds spikes and a "taser to the crotch" function.
The i9 looks attractive, but it isn't really intended as a gaming chip. Can they DO it? Sure. But no game is actually USING that many cores. So it's VERY difficult to justify the price.
With i3/i5/i7, you have chips that run between 3 and 4 gigahertz base clock. And, once you've gone multi-core, clock speed is the single biggest determinant of how well a game plays as far as the CPU is concerned.
On the AMD side, the Ryzen 3000 and 5000 series (if you can get them) are basically more of the same: Ryzen 3, Ryzen 5, Ryzen 7 and Ryzen 9 map roughly onto i3, i5, i7 and i9.
For NVIDIA graphics cards, basically anything in the last 6 generations works pretty much like this:
##40 and below are barebones entry level graphics cards. They can handle simple games, even CoH. But they're usually going to be somewhat resource limited for things like Full Screen Anti-Aliasing, any resolution over 1920x1080, etc.
##50 cards are generally almost identical to the previous cards. They're just given better memory, more memory and possibly a wider memory bus. Think of these as bare-minimum "gaming" cards.
##60 cards are what NVIDIA sees as "mainstream" dedicated "gaming" cards. Core counts on the GPU tend to shoot up, and memory becomes more plentiful, even if it's not always the highest speed or widest bandwidth. At this point, you've hit absolute overkill for CoH at 1920x1080.
##70 cards are "enthusiast" cards. More cores, more and higher-bandwidth memory, and more hardware options available.
Basically, as the song goes, "Anything a ##60 series can do, I can do better!" They can game at 1440 easily and produce respectable numbers at 4K resolution.
##80 series cards are generally apex-level gaming cards. If you wanna do it, these cards will generally "git er done".
##90 and "Titan" cards. These are "aspirational" cards.
TIME FOR A CAR ANALOGY!
Ever known of someone who buys what is ALREADY a hideously, dangerously powerful (in stock configuration) ride?
Then proceeds to supercharge it, throw on turbos bigger than a dinner plate, convert over to E85 and a blown nitrous system, delete the cats, while stripping out anything not absolutely critical to a driver inside, then replace every bit of sheet metal and fiberglass with carbon fiber replacements?
(Note: And HOPEFULLY putting on much wider tires for traction, brakes strong enough to stop THE PLANET from rotating, and a roll cage.)
Yeah. That's ##90 and Titan series cards. Hand-picked silicon, the absolute bestest, mostest card that takes whatever you feed them, belches loud enough to be heard ON THE MOON, then screams for MOAR!
The real beeyotch of it is, these cards don't really add all that much to the experience over a ##70 or ##80 card. Because these cards are ALREADY so powerful that the CPU and other things in the system are, again, the bottlenecks.
As for multi-GPU, SLI? It adds almost nothing to most games these days. And, even way back, I had a friend at NVIDIA test a quad-SLI setup against CoH. Framerates just didn't go up; again, the cards weren't the bottleneck. Now, that said, AVERAGE and MINIMUM framerates went up a bit, as the SLI setup could cover for severe framerate swings a lot better than a single card could (at the time).
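To make the average-vs-minimum framerate distinction concrete, here's a quick illustrative sketch. The frame times are invented numbers, not benchmark data; the two big spikes stand in for the severe framerate swings described above:

```python
# Toy per-frame render times in milliseconds (made-up numbers for illustration).
frame_times_ms = [16.7, 16.9, 17.1, 50.0, 16.8, 16.6, 17.0, 45.0]

# Instantaneous FPS for each frame: 1000 ms divided by that frame's time.
fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average FPS is total frames over total elapsed time,
# NOT the mean of the per-frame FPS values.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
min_fps = min(fps_per_frame)  # the worst single frame

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```

Two setups can post near-identical averages while one's minimum is noticeably higher, and it's the minimum you feel as stutter. That's the effect the SLI rig was smoothing out.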
I'd try to dig my original boards.cityofheroes.com article out of the Wayback machine. I'm just not sure HOW.
AMD's a bit different. They actually have whole different series of cards, rather than a flat numbering convention.
Their RX 500 and Vega series are their older-generation cards. They're still quite powerful.
The RX 540 and 550 are entry-level graphics cards. They work. Nuff said.
The 560 through 590 scale similarly to the NVIDIA lineup and were entry-to-high-end gaming cards in their day.
The Vega cards, Vega 56 and 64 are steps up from there. More cores. Better specs.
The RX 5000 series is the previous generation. The 5500 is the entry-level card, the 5600 is the barebones gamer card, and the 5700 cards are the enthusiast cards.
The current RX 6000 series: the 6700 through 6900 are ALL enthusiast-grade cards.
The main problem here is that the current generation of NVIDIA 3000 series cards, AMD's Ryzen 5000 chips and AMD's RX 6000 cards...
JUST ARE NOT AVAILABLE.
AMD soft-launched Ryzen 5000 and the RX 6000 with absolutely TINY initial runs, and they promptly sold out. Some of these CAN be had at markups of anywhere from 50% to 200% over MSRP (and NO consumer CPU or graphics card is worth $1,800 to $2,500). But you're buying from scalpers. And as a third-party purchase, you're not eligible for warranty coverage. And even if you WERE, AMD simply doesn't have any stock to handle RMAs with!
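For a sense of what those markups mean in dollars, here's a quick back-of-the-envelope calculation. The $999 MSRP is a hypothetical round number for illustration, not any specific card's price:

```python
def scalped_price(msrp, markup_pct):
    """Street price after adding markup_pct percent on top of MSRP."""
    return msrp * (1 + markup_pct / 100.0)

msrp = 999.0  # hypothetical flagship-card MSRP, for illustration only
low = scalped_price(msrp, 50)    # 50% markup
high = scalped_price(msrp, 200)  # 200% markup

print(f"${low:,.2f} to ${high:,.2f}")
```

That's how a card that launched under a grand ends up in the $1,500-$3,000 range on the scalper market.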
NVIDIA has all of these problems as well, plus an absolutely HUGE problem with crypto-mining and its vendors' relationships with crypto miners.
Before RTX 2000 and 3000 series cards were generally available, certain vendors (who shall remain unnamed) were shipping FULL CRATES of the cards off to crypto-miners because they were getting premium pricing for them (WELL beyond MSRP).
And with the current silicon crisis, we have no way of knowing when new stock will actually arrive (I'm betting late July at the earliest). What little they're getting out of the third-party foundries is being watched like a hawk. There are actual YouTube channels set up to watch availability, and everything disappears in seconds.
NVIDIA has made noises about crippling at least the entry-level 3060 cards' ability to mine via drivers.
But this presupposes they're running WINDOWS.
Most miners aren't. They're using Linux and alternate drivers that're tuned for maximum mining output.
It's actually so bad that PREVIOUS GENERATION cards and AMD chips are skyrocketing in price and becoming just as unavailable.
All I know is that, if I were Intel, I'd be salivating right now. They have HUGE fab space for this stuff. And if they wanted to be REALLY disruptive in the market, they could buckle down and release a graphics card that could compete in the entry level graphics and entry/midrange level gaming card space.
And they could churn the damn things out all day, every day.
And you'd watch the heads at AMD and NVIDIA positively *EXPLODE* (Both of 'em! In as painful a manner as you could wish.)
Okay, enough diatribe.
Pretty much any PC storage will do for CoH.
If you decide to stick with hard drives, I recommend opting for 7200 RPM drives. CoH can be slow to load locally.
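The spindle-speed recommendation comes down to simple arithmetic: on average, the drive head has to wait half a revolution for the data to swing around under it, so faster rotation means lower latency on every read. A quick sketch:

```python
def avg_rotational_latency_ms(rpm):
    """Average rotational latency: the time for half a revolution, in ms."""
    ms_per_revolution = 60_000 / rpm  # 60,000 ms per minute
    return ms_per_revolution / 2

lat_5400 = avg_rotational_latency_ms(5400)    # typical "green"/laptop drives
lat_7200 = avg_rotational_latency_ms(7200)    # the recommendation above
lat_10k = avg_rotational_latency_ms(10_000)   # enterprise-class drives

print(f"5400: {lat_5400:.2f} ms, 7200: {lat_7200:.2f} ms, 10k: {lat_10k:.2f} ms")
```

That per-access penalty is paid thousands of times over when a game loads lots of small files, which is why the slower drives feel so sluggish.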
Western Digital is my go-to. Unless you're talking enterprise-grade (10,000-15,000 RPM Serial SCSI) drives, I simply do not recommend Seagate. Their consumer-grade drives simply have an outrageous failure rate. The only place they make sense is in massively redundant bulk hosting applications, because you're simply going for VOLUME and building in redundancy and data integrity at a level above the drive hardware itself.
Toshiba makes attractive-sounding drives. But they're crap. I'd rather hand-assemble my data with a hand-held magnet than use Toshiba.
By preference, I recommend solid state drives these days. They're now large enough that a minimal investment will get you a decent primary AND secondary drive. And compared to a hard drive, they SCREAM! Especially the NVMe drives.
Crucial (Micron) drives are inexpensive, solid and dependable, though they're not a raw performance leader. If you just want to get the hell away from spinning rust platters, these are great drives.
Samsung: These are my preferred drives. They have two classes. The major differences are intended use and a tiny bit of bandwidth/performance.
The Evo series is their consumer grade drives. Great drives.
The Pro series is, as the name implies, their pro grade drives. They're provisioned a bit differently.
Most SSDs have more than the advertised amount of flash memory on them. Flash cells only survive a certain number of write cycles before they're no longer usable. So drive controllers spread their writes over the entire disk ("wear leveling") to stave off early cell death. And most drives set aside a small amount above the advertised capacity ("over-provisioning") to absorb cells that die early under extreme use.
Most of the ones aimed at the pro/enterprise market have MORE set aside, making the drives more "durable".
Think of it like an endless baseball game. Where one team has 5 reserve pitchers and the other team has 50.
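The reserve-pitcher analogy can be turned into a toy simulation. This is a deliberately crude model with invented round numbers (100 writes per cell, roughly 7% vs. 28% spare area), not real drive firmware behavior, but it shows why a bigger spare pool buys durability:

```python
def writes_until_failure(visible_cells, spare_cells, endurance=100):
    """Crude wear-leveling model: every write goes to the least-worn live
    cell; the drive 'fails' once more cells die than the spare pool covers."""
    total = visible_cells + spare_cells
    wear = [0] * total
    dead = 0
    writes = 0
    while dead <= spare_cells:
        live = [c for c in range(total) if wear[c] < endurance]
        if not live:
            break
        i = min(live, key=lambda c: wear[c])  # wear leveling: least-worn cell
        wear[i] += 1
        writes += 1
        if wear[i] == endurance:
            dead += 1
    return writes

consumer = writes_until_failure(visible_cells=100, spare_cells=7)   # ~7% spare
pro = writes_until_failure(visible_cells=100, spare_cells=28)       # ~28% spare
print(f"consumer: {consumer} writes, pro: {pro} writes")
```

The "pro" drive absorbs more dead cells before its visible capacity is compromised, which is essentially all that higher endurance ratings boil down to.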
And, believe it or not, another brand I trust here is Seagate, with their FireCuda drives. They're a direct competitor for the Samsung Pro series.
Samsung has the performance edge. But the Seagate drives have the edge in durability.