AMD Unveils New Radeon RX 6700 XT Midrange GPU To Take On GeForce RTX 3060 Ti (hothardware.com) 76
MojoKid writes: AMD announced a new member of its Radeon RX 6000 series graphics card line-up today, dubbed Radeon RX 6700 XT. Based on AMD's RDNA 2 GPU architecture, the Radeon RX 6700 XT targets high frame rate 1440p gaming at max image quality with an MSRP of $479. The new GPU has 40 Compute Units (CUs) with 40 Ray Tracing Accelerators, 96MB of on-chip Infinity Cache, and 12Gb of GDDR6 memory. Game Clocks of up to 2424MHz will be possible and board power is rated for 230 watts. Versus NVIDIA's current competitive offerings, AMD is claiming wins for the Radeon RX 6700 XT across many titles at 1440p/max settings versus the GeForce RTX 3060 Ti and RTX 3070, but with the added benefit of a larger 12GB frame buffer, which should add a measure of future-proofing as games get more graphically complex. Finally, AMD also revealed that it will be doing something a bit different with the launch of the Radeon RX 6700 XT. AMD-built reference cards will be available directly from AMD.com and numerous partner boards will be available from retailers and system builders, all on March 18th.
and the f* miners will jack the price up (Score:2)
and the f* miners will jack the price up
Re: (Score:2)
Is this still going on? I had figured the whole ASIC thing a few years back would have put an end to that nonsense. Apparently not?
Re: (Score:1)
> Is this still going on? I had figured the whole ASIC thing a few years back would have put an end to that nonsense. Apparently not?
ASIC-resistant proof-of-work is desirable for startup cryptos that don't want to co-mine and don't want to be 51%-reorg'ed.
Some projects are changing their PoW consensus rules with every hardfork now.
Did you know DARPA funded the development of first-person-shooter games because new enlistees always "aimed high" - that is they flinched when ordered to murder? Nowadays well-
Re: (Score:1)
Nowadays well-trained recruits arrive asking when they get to go door-to-door and shoot people.
And then when they're done with their enlistment, they become cops.
Stop quoting me. (Score:1)
Stop quoting me, you asshole.
I don't give two shits about creimer and your obsession with him.
At the time, I merely thought my comment was funny.
But apparently, morons took it seriously.
And now I feel a bit bad, because truly, I don't even freaking know the person behind creimer.
Who knows who beat him as a child?
To me, you are just another creimer too.
A beaten child acting like shit online.
So excuse me for not just being funny again.
At least I normally don't act like you creimer too, but sometimes, I'm a former
Re: (Score:2)
I used a FPS in the military before we were mobilized for Desert Storm, so it was 1990 or earlier.
It was based on Laserdiscs. The video would freeze for a very short but perceptible time when it changed tracks. This usually happened when walking into an ambush or when about to make contact. That definitely detracted from the realism.
Re: (Score:3)
I'd say citation needed, but I might as well get straight to the point and just call the claims bullshit. DARPA has funded games, but FPS was a well-established genre well before that; it's also pretty comical to see someone equate being an FPS player with being a well-trained recruit. I've seen no evidence that exposure to FPS games impacts willingness or desire to kill in any context, or to
Re: (Score:3)
Did you know DARPA funded the development of first-person-shooter games because new enlistees always "aimed high" - that is they flinched when ordered to murder? Nowadays well-trained recruits arrive asking when they get to go door-to-door and shoot people. DARPA's program succeeded.
I don't see how an FPS will help with that. People new to firing a gun aim high because they pull the trigger rather than squeeze it, or because they don't manage the kick properly. There is no way to fix either problem other than firing an actual gun repeatedly with an instructor next to you. I've done both, and I don't see playing an FPS as being in any way similar in experience to firing an actual assault rifle.
Re: (Score:1)
A violence simulator would solve this by acclimating people to violence.
Re: (Score:2)
Re: (Score:2)
I bought a pre-made low/mid-range gaming machine in late 2019 with an RTX 2060. The entire machine cost $800 from Newegg (no monitor). Now, almost 18 months later, RTX 2060s are selling for almost the price of the entire PC I bought.
Re: (Score:3)
My GTX 1070 is apparently worth as much now as it was when I bought it new at the height of the last mining boom.
That said, coin prices came down a bit in the last week or so, so maybe the market will cool too. Otherwise I'd sell the card, but there's nothing to replace it with lol.
Re: (Score:2)
Re: and the f* miners will jack the price up (Score:1)
Funny, because they will use more than $70 in electricity, I can tell you that.
Re: (Score:2)
Funny, because they will use more than $70 in electricity, I can tell you that.
I don't know why people believe mining GPUs use a lot of power. Miners usually underclock the GPU core because it's the memory speed that really matters, so these GPUs use less power mining than they do when they're gaming. For example, a properly configured 3070 will use about 120 watts while mining. If you're paying 10 cents a kWh, that's about 10 cents every 8 hours, or about 30 cents a day to mine. A 3070 is making between $5 and $8 a day depending on market fluctuations and what
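The arithmetic above can be sketched quickly. All figures here (120 W draw, $0.10/kWh, $5-$8/day revenue) are the commenter's claims, not measurements:

```python
# Rough mining-cost arithmetic from the comment above.
# Figures (120 W, $0.10/kWh, $5-$8/day) are the commenter's claims.

def daily_power_cost(watts: float, usd_per_kwh: float) -> float:
    """Electricity cost of running a card 24 h at a constant draw."""
    kwh_per_day = watts * 24 / 1000
    return kwh_per_day * usd_per_kwh

cost = daily_power_cost(120, 0.10)
print(f"electricity: ${cost:.2f}/day")          # roughly $0.29/day
for revenue in (5.0, 8.0):
    print(f"net at ${revenue}/day: ${revenue - cost:.2f}")
```

Which backs up the "about 30 cents a day" figure: 120 W for 24 h is 2.88 kWh, or about $0.29 at $0.10/kWh.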
Re: (Score:2)
Maybe for Bitcoin, but not all cryptocurrencies have been around long enough or have enough of a market to make ASICs viable.
Re: (Score:1)
For Bitcoin? Yes.
For other, lesser-traded crypto, no.
Re: and the f* miners will jack the price up (Score:1)
putting the gay into gaming (Score:1)
Paper Launch #6! (Score:1)
I thought they'd hit peak stupid with 5 paper launches.
Radeon 6000
The Ryzen 5000 series processors
The Playstation 5
The Xbox Series X
The Xbox Series S
Now this.
Re: (Score:2)
Re: (Score:2)
I searched from "launch" until just this week before finding a 5600X at MSRP. I think it was just bad timing to launch new CPUs, GPUs, and two consoles all at the same time, but it didn't help that they were using something like 70+% of their capacity to make console parts. Perhaps they could have kept up if last year had been "normal".
Re: (Score:2)
Consider that somewhere between Zen1 and Zen3 AMD went from "launching" things right before mass production to "launching" things right before initial production.
You can't blame them for taking every opportunity to punch Intel when i
Boy who cried wolf... (Score:1)
If it starts to look like it is punching itself, maybe it's time to try a better PR strategy...
Re: (Score:1)
Sorry, but tell me how "no product" and "nearly no product, with no availability for 4+ months after the first DAY" differ in the end...
As for punching Intel, I approve.
But launching a product people just can't get isn't a punch. It's blowing kisses. Because it highlights the fact that Intel can churn out their product IN QUANTITY and AT WILL. And AMD simply CANNOT.
Sure, their product may not be THE best.
But people can get their hands on it.
And they can get their hands on it
Re: (Score:3)
Re: (Score:1)
I'm not saying Ryzen isn't a better platform.
(Though we'll see once the 11000 series hits.)
I'm saying that when you NEED the parts, Intel is simply more dependably available.
Period.
If you disagree, find me an ACTUAL "new in box" Ryzen 9 chip within $100 of MSRP.
You have one week.
Re: (Score:2)
Re: (Score:1)
There's an obvious and fundamental difference between trying to acquire a 4 year old, obsoleted part.
And building new with a current-version chip.
So I'll make this even easier to understand.
If I want to build a new system with a current version of the latest Intel chip, I can go ANYWHERE and just pick it up.
If I want to build a new system with a current version of the latest AMD chip, I can only go to SCALPERS and pay markup of anywhere from 50 to 200%.
Which voids any warranty.
And since there ARE problems w
Midrange? (Score:4, Insightful)
$500 for a frickin' graphics card is MIDRANGE????
Re: (Score:3)
$500 for a frickin' graphics card is MIDRANGE????
For a gaming system, yeah.
Re: Midrange? (Score:1)
Re: (Score:2)
Re: (Score:2)
A new RTX 3090 seems to top out at $3000. You could get a 20-year-old car with 200,000 miles on it for the same price. I suspect it's partly because so many people are working remotely. They don't need so many cars any more, but they do want to play games at home.
Re: (Score:2)
Re:Midrange? (Score:5, Informative)
As if running the latest games at max settings, 1080p, 60+ FPS doesn't qualify as at least mid-range. Every "gaming" video card on the market today is compute-heavy. In the case of nVidia, 80% of the board's processing capability will go entirely unused in almost all cases (not using any tensor/RT cores, for example), and you don't even want to know how ridiculous the raw disparity is between bandwidth and compute on the latest AMD APUs (even the 3400G).
So ok they are targeting 1440P, max settings,
If you want what a rational person would call a mid-range gaming system, one that will easily handle 1080p max with high frame rates and 1440p medium with high frame rates, then spend no more than $250 on a recent video card, and strongly consider $200 as a more reasonable maximum. Your ReShade stuff will also barely impact framerate, because there is compute to spare everywhere now, even on AMD APUs. Just make sure it's a recent GPU architecture of some kind and it will have compute to spare.
Re: (Score:2)
Yeah. I've got an RX550 that I spent about $125-$150 on. Using an AORUS Pro B550 and Ryzen 3 3600. Works just fine for me.
Re: (Score:2)
Are you me? Used RX580, Aorus B550, Ryzen 7 3800.
I got the RX580 off eBay (former miner of course) a few years ago when you could still get one for $250. Kicking myself now for not getting an Aorus for the extra overclocking potential. I've been nursing it along; it started getting thermal issues, and instead of trying to fiddle with re-gluing the heat sink I had a little bit of fun putting a cheap AIO water block on it. Sadly, even maxing out the voltage I can still only squeeze 1400MHz out of it. The Ryzen 7 was a
Re: (Score:2)
Forgot to mention - I can play most recent games with maxed-out settings at 1080p, and some I can play with medium-high settings at 1440p, 144Hz with a FreeSync monitor.
Re: (Score:2)
Bought my parts from Newegg.
Re: (Score:3)
They have been slowly shifting the expectation of mid-range over time. Nowadays 1440p with a minimum frame rate of 60 FPS, peaking much higher on a monitor capable of 144Hz, is "mid-range".
It's a lot like how even relatively cheap cars have a lot more acceleration than they did 30 years ago. Sub-10-second 0-60 times are pretty standard for mid-range cars.
The most popular GPU, according to Steam, is the GeForce GTX 1060, followed by the GTX 1050 Ti, which together account for getting on for 20% of the installed user base.
Re: (Score:2)
This is not what is considered the mid-range price point.
Then you should start by listing which games require this card to achieve great framerates. If your answer is one or two, and even then only with RTX on, then you have your answer. These are mid-range cards for the coming generation. They aren't the best, but the entire generation is far more than what most people need. You're talking about a card that is faster than an RTX 2070 Super, a lineup which would have been considered high end only a few years ago.
Re: (Score:2)
Re: (Score:2)
Yeah, making my point for me, are you? Or did you not understand my post? Let me rephrase so you can play along. It's a next-gen card, the midrange of it. It is very much midrange; that's typically how the bottom tiers of gaming work. Low-end GPUs aren't for gaming. Mid-range GPUs are for base games. High-end GPUs are for esoteric edge cases. When new cards get released, they are rarely taxed by existing games.
Mind you don't expect to be running Cyberpunk 2077 at decent frame rates on this with th
Re: Midrange? (Score:1)
Re: Midrange? (Score:1)
Re: (Score:2)
$500 for a frickin' graphics card is MIDRANGE????
I guess it depends on how much you expect a midrange PC to cost? If a mid-range gaming PC is $1000, is half on the GPU, half on the rest reasonable? I honestly don't know. As an avid non-gamer, that sounds on the cheap side to me for a gaming PC.
Re: (Score:3)
What used to be low-range is no more:
iGPUs have become so good that those who needed a low-end GPU in the past don't need a dGPU at all anymore.
Thus, within the graphics-card market now, low-end is at the level of former mid-range, and former high-end is now split into current mid-range and the new high-end.
Re: (Score:2)
Re: (Score:3)
$500 for a frickin' graphics card is MIDRANGE????
Yeah. Just like a Porsche 911 is a midrange sports car but expensive compared to your daily commuter, a $500 GPU is a midrange gaming GPU while being expensive for a typical web-browsing PC.
Falls flat on raytracing (Score:2)
It amazes me that they claim this is out to take on the 3060 Ti without mentioning that the 6800 XT can only keep up with the 3060 Ti if the 3060 Ti turns off DLSS. The 3070 and 3080 both stomp it. Then they give you less power and claim this is to take on the 3060 Ti. Sure, few games utilize ray tracing now, but it IS the future. None of AMD's cards can do DLSS (it's proprietary), AMD has no alternative of its own yet, and raytracing is a major performance hit for them. If I'm spending that much money, I'm not buy
Re: (Score:1)
Where have you gotten your performance figure [youtube.com] that the 6800xt can barely keep up with the 3060ti? Because the only scenarios I can think of that potentially occurring are in heavily Nvidia optimized games, like Minecraft RTX or Quake II RTX, and even then, that's with DLSS on. I made the assumption that you're talking about games, since you mentioned DLSS, in which case, the 6800xt tends to trade blows with the 3080 (DLSS off) with performance typically leaning more towards AMD, likely due to console archit
Re: (Score:2)
Only 1.5 GB of RAM?? In 2021? (Score:1)
That doesn't sound right ...
I thought cards nowadays had somewhere around 96 Gb...
Re: (Score:2)
"AMD did disclose that the GPU has 40 Compute Units (CUs) with 40 Ray Tracing Accelerators, with 96MB of Infinity Cache, and 12Gb of GDDR6 memory"
Not sure where you're seeing 1.5GB of RAM?
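The confusion above is bits versus bytes: the summary wrote "12Gb" (gigabits), which taken literally is 1.5GB, while the card actually ships with 12GB (gigabytes) of GDDR6. A quick sanity check:

```python
# Bits vs bytes: "12Gb" (gigabits) read literally is where the
# "1.5 GB" joke above comes from; the card really has 12GB.
BITS_PER_BYTE = 8

gigabits = 12
gigabytes = gigabits / BITS_PER_BYTE
print(gigabytes)  # 1.5
```

Lowercase "b" conventionally means bits (as in network speeds like Gb/s), uppercase "B" means bytes, which is why the typo invites the joke.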
Sweet! But can you buy one? (Score:2)
Has that changed?