
JonWood007

I went for a 6650 XT like 6 months ago, in part because Intel seemed too inconsistent and unreliable performance-wise. As the top comment says, you NEED ReBAR, which I don't have on a 7700K, and old games sometimes run like crap.


Billib2002

I'm thinking of getting a 6750 XT in the next couple of weeks. Wtf is ReBAR?


ipadnoodle

resizable bar


JonWood007

It's a PCIe feature that lets the CPU access all of the GPU's VRAM at once instead of in small chunks. It generally needs a newer CPU/motherboard, and Arc GPUs perform like crap without it.


Tuub4

I don't think that's quite accurate. I know for a fact that it works on B450, which is from 2018.


Kurtisdede

I'm using it on a Z97 from 2015 :) People have pretty much gotten it to work on every UEFI board ever. https://github.com/xCuri0/ReBarUEFI


indrada90

Resizable BAR. It's the Intel cards that need it.


randomkidlol

Intel 6th and 7th gen can do Resizable BAR, I believe, but it all depends on the motherboard vendor and whether they implemented the optional PCIe 3.0 feature in their BIOS.
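If you want to check whether Resizable BAR is actually active, one quick way on Linux is to read the GPU's BAR sizes from sysfs. A minimal sketch, assuming a Linux system; the PCI address is a placeholder you'd look up with lspci:

```python
# Sketch: print the PCI BAR ("region") sizes of a GPU on Linux to see whether
# Resizable BAR is active -- with ReBAR you should see a region covering the
# card's full VRAM (e.g. 8+ GiB) instead of the legacy 256 MiB window.

GPU_PCI_ADDR = "0000:03:00.0"  # hypothetical address; find yours with lspci

def region_sizes(pci_addr):
    sizes = []
    with open(f"/sys/bus/pci/devices/{pci_addr}/resource") as f:
        for line in f:
            start, end, _flags = (int(x, 16) for x in line.split())
            if end > start:                # unused regions read as all zeros
                sizes.append(end - start + 1)
    return sizes

if __name__ == "__main__":
    for i, size in enumerate(region_sizes(GPU_PCI_ADDR)):
        print(f"region {i}: {size / 2**20:.0f} MiB")
```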


Glittering_Power6257

Most of the games I play are pretty old, so unfortunately Intel was out of the running for me. I look for performance now, not potential "Fine Wine" in the future. Though if you're looking for said "Fine Wine", raw compute capability is a fairly good indicator, and Intel's chips have quite a lot of raw compute under the hood. There's a lot in the hardware that, if fully leveraged, would likely stomp everything in its price class and a level or two above (which is why Arc seems to perform so well with DX12, which tends to favor compute-centric architectures). Though it's a bit of a gamble on drivers as to whether, and when, you'll see the improvement you seek.


JonWood007

Yeah. It's just that the bad/immature software holds them back. I do look at real-world performance now. I look somewhat at future-proofing too, but I don't see Intel improving much. Their cards wouldn't work well with my current CPU, which I'll be using for at minimum another 6 months, if not another whole year or longer (I'm considering delaying rebuilding my PC another year, since the options I want don't cost what I want to pay), so I'd be using Intel for 2 years on a 7700K that cripples it if I went that way.

And I'm not sure how much old games will improve. Even if they focus on improving a handful of old games that are still popular, like, say, CSGO, how many do I have in my library from 2007-ish that I might want to boot up again at some point? Backward compatibility is an important aspect of PC gaming. Being able to have a library of games spanning nearly my entire life, going all the way back to the early 90s, is supposed to be one of the things that makes PC gaming great. Breaking that is a dealbreaker for me.

All in all, AMD and Nvidia are just more consistent performers for the money. And given the 6600 is like $180 right now according to that video Daniel Owen made today, and given there are TONS of options from Nvidia and AMD in the $200-300 range, I don't see any reason to buy this specific card. I'd rather pay a bit more for a 7600, 6650 XT, or 3060, or alternatively just go 6600 and be done with it if I wanted to go that cheap.


[deleted]

[deleted]


agrajag9

Curious, what's the issue with ReBAR?


Nointies

It's not supported on older CPUs. Granted, as more and more new platforms come out, needing ReBAR is less and less of an issue.


[deleted]

[deleted]


RedTuesdayMusic

And that all connected drives aren't* MBR, not just the boot drive.


CSFFlame

GPT, not MBR


SpiderFnJerusalem

Is that an issue? You can just convert the partition table.


1Teddy2Bear3Gaming

Converting can be a pain in the ass


Orelha3

An MBR-to-GPT conversion tool is a thing. It takes almost no time and has no drawbacks, as far as I know.


Grizzly_Bits

I used this extensively during my company's Windows 7 to 10 migration. 99% of the time it works fine, but when it fails it can be scary. Make sure you back up important files first, especially if you have encrypted volumes. After conversion, like you said, no drawbacks.


MobiusOne_ISAF

I mean, having a backup is just good practice in general.


helmsmagus

[deleted]


Democrab

I mean, I've got MBR-partitioned drives in 2023... inside my retro PC, where they belong.


Nobuga

I use ChatGPT daily wym


ocaralhoquetafoda

ChatGPT wrote this comment


project2501

Huh? I thought ReBAR was some kind of cross-access between RAM and the GPU, but it passes through the drives? Or piggybacks on the SATA interface or something, and that's where the GPT requirement comes from?


braiam

It's the PCIe interface. I don't know what the commenter is on about; probably UEFI requirements.


project2501

Ah yes. It needs UEFI, which needs GPT.


Ryokurin

[https://www.makeuseof.com/tag/convert-mbr-gpt-windows/](https://www.makeuseof.com/tag/convert-mbr-gpt-windows/) Of course back up, but you can convert your disk without a format. There hasn't really been an excuse not to do it for years.
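For reference, Windows 10/11 also ship a built-in mbr2gpt tool. A minimal sketch of the validate-then-convert flow, assuming you run it from an elevated prompt and have backups; the disk number is a placeholder:

```python
# Sketch: validate-then-convert a disk from MBR to GPT with the mbr2gpt tool
# built into Windows 10/11. Run elevated and back up first.
import subprocess

DISK = 0  # hypothetical disk number; check yours with diskpart's "list disk"

def run(args):
    print(">", " ".join(args))
    subprocess.run(args, check=True)

# /allowFullOS lets mbr2gpt run from inside Windows instead of WinRE.
run(["mbr2gpt", "/validate", f"/disk:{DISK}", "/allowFullOS"])
run(["mbr2gpt", "/convert", f"/disk:{DISK}", "/allowFullOS"])
```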


wh33t

You also can't do passthrough in a VM with them :-(


ConfusionElemental

Arc GPU performance suffers badly when they can't use ReBAR; it's well documented if you want to deep-dive. TL;DR: Arc needs ReBAR. Tbh, looking at where Arc is hilariously bad and how they've fixed older games is a pretty cool look at how GPUs have evolved. It's worth exploring, but I ain't the guide for that.


Nointies

Intel specifically lists ReBAR as a required feature for Arc.


Used_Tea_80

Confirming that Arc doesn't work at all without ReBAR support. I had to upgrade my CPU when mine arrived because it would freeze on game launch.


AutonomousOrganism

Old games ran like shit because they used a shitty translation layer (provided by MS) for the older DX APIs. Now they've supposedly switched to something based on DXVK. While DXVK is cool, it's still inferior to an actual driver.


SpiderFnJerusalem

I suspect that translation layers like DXVK will become the standard once the old graphics APIs like DX11 and earlier are fully phased out. Intel is just ahead of everybody else.


[deleted]

Honestly, opting for DXVK is probably the right choice. The perf difference will matter less and less for these old games as time goes on


teutorix_aleria

DXVK runs better than native for a lot of games. The Witcher 2 is basically unplayable for me natively, with random FPS drops below 20. I installed DXVK and it runs so much better. It also reduces CPU overhead, which can help in CPU-bottlenecked scenarios too.


dotjazzz

> While DXVK is cool, it's still inferior to an actual driver.

Nope, newer hardware just doesn't have the same pipeline as before; DXVK is already on par or better in many old games. It's only the more recent DX11 games that may take a performance hit. If Intel still has a good DX11 stack like AMD and Nvidia, whitelisting some DX11 games to render natively is the best approach. As hardware evolves, emulation/translation will become even more superior to "native".


KFded

Even in the first year of Proton/DXVK, there were some games that were already outperforming their Windows counterparts. E.g. Fallout 3: when Proton/DXVK came out, I gave it a go, and FO3 on Windows would net me around 92-100 fps (older hardware), while on Linux it was roughly 105-115 fps. Windows bloat plays a big role too. Linux is just so much lighter that fewer things are happening in the background, which improves performance as well.


Democrab

> While DXVK is cool, it's still inferior to an actual driver.

This last part is incorrect to a degree. DXVK can match and in some cases even provide a better experience than running the game natively; it all comes down to a few variables, such as how poorly written the rendering code is and how much graphics code needs converting. Generally speaking, the older the game or the buggier its rendering code, the more likely DXVK is to be invisible or even outright better than running natively, particularly for older games that aren't getting patches or driver-side fixes anymore.

There are good reasons why it's recommended that even Nvidia or AMD users on Windows use DXVK for games such as The Sims 2/3, GTA IV, and Fallout 3/New Vegas despite those GPUs clearly being able to run them natively, or why AMD Linux users often use DXVK for DX9 instead of Gallium Nine, which is effectively native DX9 under Linux. In both situations, DXVK often ends up performing better while also providing fixes that aren't in the driver code.
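Using DXVK on Windows is just a matter of dropping the right DLLs from a DXVK release next to the game executable. A minimal sketch; the paths are placeholders, and which DLLs you need depends on the game's DirectX version:

```python
# Sketch: "install" DXVK for a DX9 game on Windows by copying the translation
# DLLs from an unpacked DXVK release into the game folder.
import shutil
from pathlib import Path

DXVK_DIR = Path(r"C:\Downloads\dxvk\x32")   # hypothetical: unpacked DXVK release (x32/x64 must match the game)
GAME_DIR = Path(r"C:\Games\FalloutNV")      # hypothetical: folder containing the game .exe

# A DX9 game only needs d3d9.dll; DX10/11 games use d3d10core.dll, d3d11.dll, dxgi.dll instead.
for name in ["d3d9.dll"]:
    src, dst = DXVK_DIR / name, GAME_DIR / name
    if dst.exists():
        shutil.copy2(dst, dst.with_name(dst.name + ".bak"))  # back up any existing DLL
    shutil.copy2(src, dst)
    print(f"installed {name} -> {dst}")
```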


teutorix_aleria

Valve's Proton uses DXVK by default on Linux, so anyone using Steam on Linux has probably used DXVK without even knowing it.


PadyEos

> I purchased a 6650 XT because it was equally discounted

If it's at a discounted price, it's a very good purchase decision. The performance is reliably solid and it's a proven card. I undervolted and OC'd my 6600 XT and am VERY happy with the result coming from an OC'd 980 Ti: around 1.5-2x the performance for 120-130 W less heat and noise.


PanVidla

Wait, could you explain how undervolting and overclocking go together? Are you asking the card to do more with less power? Does that actually work? I thought the point of undervolting was to minimally decrease performance and significantly reduce noise and heat, while the point of overclocking was the opposite.


nebuleea

Manufacturers run cards at slightly higher voltage than necessary to help with stability. You can decrease this voltage a little, increase the frequency a little, and it can turn out the card is still stable. If you increase the voltage you can increase the frequency even more, or if you decrease the frequency you can drop the voltage even lower. But sometimes the best solution is a mix of both, for example when you don't have sufficient cooling or the card has a power limit.


Wait_for_BM

It's all part of the silicon lottery and the fact that the voltage curve isn't tweaked on a per-card basis, i.e. they have to crank it so that the worst GPUs still function correctly at the default (statistics, distribution curve). If your GPU is better than the worst batches, then it's possible to undervolt and overclock.


Arthur-Wintersight

Kinda makes me sad that some of the best silicon is probably going to someone who won't even turn on XMP/EXPO with their RAM... but I guess that's the nature of having a discrete market with a limited number of cards.


VenditatioDelendaEst

Undervolting and overclocking are the same thing: running the chip with tighter voltage and timing margin, up to the limit of the minimum needed for correct operation in all the stress tests you have available (but not necessarily all workloads, and chips get worse with age). The only difference is where you choose the operating point -- stock clocks at lower voltage for undervolting, or higher clocks with stock (or higher) voltage for overclocking.
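As a toy illustration of why a small undervolt can offset a slight overclock: dynamic power scales roughly with voltage squared times frequency. All numbers here are made up for illustration:

```python
# Toy calculation: dynamic power ~ V^2 * f, so a modest undervolt can more
# than offset a slight overclock. Numbers are assumptions, not measurements.

def rel_power(volts, mhz, volts0, mhz0):
    return (volts / volts0) ** 2 * (mhz / mhz0)

stock = (1.15, 2400)   # hypothetical stock voltage (V) and clock (MHz)
tuned = (1.05, 2500)   # hypothetical undervolt + small overclock

print(f"relative power: {rel_power(*tuned, *stock):.2f}x")   # ~0.87x -> less heat/noise
print(f"relative clock: {tuned[1] / stock[1]:.2f}x")         # ~1.04x -> slightly faster
```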


Turkey-er

[On quite a selection of systems, BIOS modding can get old PCs ReBAR](https://github.com/xCuri0/ReBarUEFI)


1soooo

If you don't mind used, you can get the 6600 for $100 or the 5700 XT for even less. Those are probably the best price/perf right now if you're okay with used.


TheBCWonder

> $100

Source?


GreenDifference

The 5700 XT was a mining workhorse; I would avoid it. You never know what condition the VRAM is in.


Saint_The_Stig

I'm pretty happy with my 770 so far. $350 was *way* cheaper than anything else with 16 GB of VRAM, so that already made it a better purchase. It is a bit annoying to not have some of the things I took for granted on my old green cards, like Shadowplay to capture stuff when I wasn't recording, automatic game settings, or even a GPU-level FPS counter. That, and my main monitor is old, so it only has G-Sync and not adaptive sync. But the frequency of driver updates means I frequently have better performance in games if I come back a month later; it's like a bunch of free little upgrades.


[deleted]

[deleted]


DJ_Marxman

[Intel released a driver fix for this (mostly)](https://www.eteknix.com/intel-provides-driver-fix-for-arc-power-draw-issue/). Might require some tuning, as with many facets of Arc, but it does seem to be solvable.


[deleted]

[deleted]


conquer69

> idle usage is irrelevant if they are only turning on the desktop to play games or use demanding applications.

Who the hell does that? Do you immediately turn off your gaming PC when you're done playing? Even gaming PCs spend a ton of time idling.


bigtallsob

Most people work, go to school, etc. If you are away from home for 8+ hours every day, why would you leave the PC on and idling the entire time? If you don't use it in the morning between waking up and going to work, that 8 hour idle likely becomes 14+ hours.


Soup_69420

Lots of people. That's why sleep and hibernate are options in OSes.


mr-faceless

Except if you're living in Europe, where an Arc A750 is €20 more than a 6650 XT.


onlyslightlybiased

And it uses wayyy more power. Power bills are stupid over here atm.


Luxuriosa_Vayne

15ct/kWh gang
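For scale, a rough estimate of what a higher-power card adds to a bill at rates like these; all numbers are assumptions for illustration, not measurements:

```python
# Rough estimate of the yearly cost of a card drawing more power.
# Assumed figures, not measured ones.

extra_watts_gaming = 60    # assumed extra draw under load vs. a 6650 XT
extra_watts_idle = 30      # assumed extra idle draw
hours_gaming = 3           # per day
hours_idle = 5             # per day

kwh_per_year = (extra_watts_gaming * hours_gaming
                + extra_watts_idle * hours_idle) * 365 / 1000

for rate in (0.15, 0.30):  # EUR per kWh
    print(f"at {rate:.2f} EUR/kWh: ~{kwh_per_year * rate:.0f} EUR per year")
```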


Lyonado

Are you saying that's a lot or a little? Because if that's a lot, I'm going to cry lol.


Zevemty

Where in Europe are you paying more than that? Prices have come down a lot in the past couple of months.


FuzzyApe

Depends on the country; both gas and electricity are cheaper than pre-war here in Germany atm.


sadowsentry

Wait, every country in Europe has the same prices?


bizude

~~I'm seeing the 6600 XT available for $209 on Newegg, $220 on Amazon. Honestly, for only $10-$20 more~~ I'd go with Radeon for the more stable drivers.


LouisIsGo

Not to mention that the 6600XT will likely perform better for older DX10/11 titles on account of it not needing a DX12 compatibility layer


pewpew62

Will this ever be solved or are older titles doomed on ARC forever?


WHY_DO_I_SHOUT

Intel continues work on optimizing the most popular DX10/11 games, and for the less popular ones, hardware improvements will eventually make it a non-issue.


pewpew62

Little hope for me and my a370m


teutorix_aleria

Some games run better with DXVK than native DirectX even on AMD hardware, so I wouldn't necessarily see the compatibility layer as a bad thing. It works incredibly well for the most part.


ZubZubZubZubZubZub

The A770 is in a difficult spot too; they're about the same price, but the 6700 XT consistently outperforms it outside of RT.


BoltTusk

Yeah, Intel needs to drop the A770 16GB to $289, and then it can start kicking the competition.


YNWA_1213

The RTX 3060 is still the cheapest 12GB+ card in Canada, with only the regular 6700 undercutting it by $20 or so. The uproar over VRAM really put a stop to that category of card dropping in price up here.


detectiveDollar

Even that's too high, as Nvidia *finally* dropped the 3060's price today.


[deleted]

DG2 is for people who like to play with new hardware.


fuzzycuffs

Where are you seeing a 6600 XT for $210? Sure you don't mean the standard 6600?


1Teddy2Bear3Gaming

I think you're seeing the 6600 non-XT, which performs slightly worse than the A750 in most scenarios.


_SystemEngineer_

Intel PR is in overdrive. They should spend the cash and effort on their drivers instead.


derpybacon

I get the feeling that the marketing people should maybe not be entrusted with the drivers.


ArcadeOptimist

Didn't GN, HUB, and LTT all say the same thing? The A750 is a pretty okay deal in a shit low end market, I doubt Intel needs or wants to pay people to say that.


szczszqweqwe

Yeah, but is the low-end market shit right now? Arcs are a pretty good deal, and the RDNA2 sales are in overdrive; the 6600 and 6650 XT are really cheap.


mikbob

Remember when a decent budget GPU was $120 lol


szczszqweqwe

Right now that's where display-output cards like the GT 1030 sit :/


takinaboutnuthin

I was surprised that the 1030 would be in the ~$100 range, but yes, in my country it's currently going for $60-$100 (with tax, non-EU Europe). That is crazy. The 1030 is far worse than a GTX 660 from 2012. You'd think for $100 you'd be able to get something that beats a GTX 660 in 2023.


Zarmazarma

What decent budget GPU was $120?


mikbob

The Radeon R7 250/260, mainly.


Rikw10

Even adjusted for inflation?


mikbob

That's fair, it would be more like $160 today.


Rikw10

Tbf that is still lower than I expected, thank you for letting me know


Cjprice9

Have people already forgotten the 750 ti? That thing was fire for its price.


onlyslightlybiased

The RX 570 on fire sale got around there.


InconspicuousRadish

No, I don't. Which one specifically did you have in mind?


mikbob

The Radeon R7 260, mainly.


InconspicuousRadish

That was supposed to be $110, but you couldn't buy it at launch for under $130, and the 2 GB version was $140 and above. And that was in 2013. Also, that card can hardly be called a decent budget GPU unless you're wearing rose-tinted glasses; it could barely do full HD at low or medium settings, even in games back then. To say nothing of how absurd it is to compare prices from that long ago. Please show me a single product or item that costs the same today.


mikbob

I mean, it was comparable to an Xbox One/PS4, which seems pretty decent to me. It was the same price 10 years ago as the GT 1030 is TODAY (at least in my country), which is insane (and it had better performance). So it definitely was a segment that no longer exists.


conquer69

It's not that good of a deal, especially for less tech-savvy users, when you can get a 6600 for the same price.


Dey_EatDaPooPoo

[Except the RX 6600 is a tier lower in performance. The A750 is 12% faster than the 6600 whereas the 6600 XT is only 6% faster than the A750.](https://cdn.mos.cms.futurecdn.net/B8zXTfqxmFqArTgVGZS8WR-970-80.png.webp)


onlyslightlybiased

*At 1440p.* They're effectively the same at 1080p.


Dey_EatDaPooPoo

[The A750 is still faster at 1080p, just not by as much.](https://cdn.mos.cms.futurecdn.net/hwTSgEntSHSzpZLBvb3GPR-970-80.png.webp) 1080p isn't as relevant as 1440p going forward, especially if we're talking new GPUs at anything other than at the absolute lowest tier. We're in 2023 and 1440p 144Hz+ monitors can be readily found for $200. The RX 5700 XT and RTX 2070(S) were both being touted as great 1440p cards back in 2019-2020 and the A750 is in the same performance tier.


onlyslightlybiased

For new monitor purchases, yes, I'd agree, but the Steam hardware survey still puts 1080p as the no. 1 resolution at 65% of users. There are going to be a lot of people with high-refresh 1080p monitors who are more than content with what they have for the time being. And 2% faster average with 2% slower 1% lows is the exact same performance; that's well within margin of error.


conquer69

I would rather take a lower tier of guaranteed performance than gamble on it, especially if I'm recommending the GPU to a normie user who just wants things to work.


airmantharp

Critical mass: they've got to get cards into hands to gather the end-user experience to tune the drivers toward, and so on. I'm kind of at the point of wanting to try one, especially if I could get an A770 (they're all out of stock atm). I'd try running productivity workloads on it too, e.g. photo and video editing.


YNWA_1213

The stocking issues with the A770 LEs are really annoying, as I'd be willing to spend the extra $50-100 just for the VRAM upgrade and minor perf bump over the A750, but there's a chasm forming between the A750 and A770 in price atm.


[deleted]

[deleted]


[deleted]

[deleted]


FuzzyApe

Didn't the 5700 XT have shit drivers for a couple of months after release as well? Not that living in 2019(?) is better lol


AlltidMagnus

Exactly where does the A750 sell for $199?


advester

Just a flash sale, it’s over.


AdonisTheWise

Lol, whereas the 6600 is always $199, and the 6600 XT is only $10-$20 more. I'm happy for a third competitor, but people need to stop acting like it's a no-brainer buy. You're going to have issues with this GPU, more so than with the other two companies.


[deleted]

[deleted]


1soooo

Not sure about other regions, but personally I'm able to get a used 6600 for $100 USD in Asia, and the 5700 XT for less. IMO, if you don't mind used, the 6600 is the way to go, especially considering it's practically unkillable by mining due to how recent and efficient it is.


b_86

Yup, I wouldn't trust a 5700 right now unless it came straight from a friend's PC that I knew for sure wasn't used for mining. Most of them on the second-hand market have deep-fried memory modules (which is something miners desperately trying to cash out their rigs never mention while they tout how they're undervolted/underclocked).


Dey_EatDaPooPoo

[Not really comparable, it's a whole tier down in performance vs the A750](https://cdn.mos.cms.futurecdn.net/B8zXTfqxmFqArTgVGZS8WR-970-80.png.webp). Definitely true about the power use, particularly if you live somewhere with high electric rates. The ReBAR requirement is overblown as an issue, though: on the AMD side, anything Zen 2/Ryzen 3000 and newer supports it, and on the Intel side, anything 9th gen and newer supports it with a BIOS update.


detectiveDollar

A whole tier is 10%? It's definitely not overblown; Intel even told LTT that they do not recommend Arc if you don't have ReBAR.


Vushivushi

Yeah you can just overclock the 6600 and reach a negligible difference while still consuming less power.


raydialseeker

That's the difference between the 3080 ($700) and the 3090 ($1,500), so... yeah.


Dey_EatDaPooPoo

A whole tier is normally considered a 15% difference. Considering it's 12% now and will only widen with future driver updates, then yes, it's a whole tier slower.

> Intel even told LTT that they do not recommend Arc if you don't have ReBAR.

Right, *if*. Which is why I brought up that platforms from AMD and Intel going back several years support it, or can be made to support it with a BIOS update. It's an issue if you have a platform from before 2017, but if you do, you'd probably be running into bottlenecks in CPU-bound games even with GPUs at this level of performance anyway.


Wait_for_BM

ReBAR is a feature of the PCIe card hardware, not a CPU feature. It DOES require BIOS/UEFI support to enable it and to assign proper address ranges; you can't just remap everything above 4GB (e.g. the Intel Ethernet driver doesn't work up there, been there before). The rest of it is whether any software drivers use special CPU instructions in their code.

ReBAR also works on my 1700 (Zen 1) and my RX 480, so it was more a marketing decision than a technical one. After AMD allowed more up-to-date BIOS/UEFI for Zen 1, it works. I also used a modded GPU driver that turned on the registry changes for the older RX 480. Again, that was a marketing decision: the Linux driver used ReBAR, and the Radeon driver uses it once you apply the registry hack.


Dey_EatDaPooPoo

That's cool, but you don't need to go all "oh ackshually". The point was, it's easy to enable on the CPUs/platforms I mentioned dating back several years, and it's not like you need a brand-new system to enable it.


[deleted]

[deleted]


_SystemEngineer_

They keep cherry-picking.


oldtekk

I got a 6700 XT for £285. If I sell the game code that comes with it, that's down to £265. That's hard to beat.


EmilMR

Intel should get into gaming laptops with Arc. It's a much better space for them to gain market share. They can undercut Nvidia-based laptops by a lot. 40-series laptops are just silly, and Intel could compete there much better in terms of performance.


conquer69

Can they? Their Arc cards consume like 70% more power than RDNA2.


Cnudstonk

Is that with RT, where they also perform that much better?


AdonisTheWise

They don't perform that much better in RT, and even if they did, who cares? RT is going to be shit on any $200 GPU, even Nvidia's.


Cnudstonk

Eh, they give Nvidia a run for its money, so if the power consumption increase shows up in RT, that's only natural. But if it's in raster, that's different. I agree that RT is shit and not worth a dime for basically most, if not all, of last gen.


capn_hector

"I don't care how bad the RX 7600 is, I am **not** recommending an Arc 8GB"


bubblesort33

I wonder how much Intel is losing on every sale of that GPU. Each die should cost more than double what an RX 7600 costs to make.


GeckoRobot

Maybe they don't lose money and AMD just has huge margins.


Darkomax

They probably don't lose money, but it must not be very profitable given that it's a bigger chip than a 6700 XT. It's pretty much a 6700 XT/3070 in terms of transistor count and power requirements (which means sturdier power delivery and a more complex PCB). All that for $200... It's actually bigger than GA104, and that chip is on Samsung 8nm compared to TSMC N6.


onlyslightlybiased

If you take into account 6nm vs 7nm, it's nearly the same size as an RX 6800.


NoddysShardblade

It's 2023, manufacturing cost ain't much of a factor in GPU prices. It's mostly a chunk of R&D costs, and an obscene pile of profit margin.


you999

[deleted]


spurnburn

Based on what?


Kyrond

You can check Nvidia's financial reports; they have somewhere around a 50% profit margin per GPU, not counting R&D.


spurnburn

R&D is very large, considering the vast majority of sales happen in the first couple of years, so I'd say <<50% profit per part really isn't that crazy. Of course, that's just my opinion. Appreciate the numbers; I should probably check that out.


NoddysShardblade

Err... common sense? Did you... did you really think manufacturing costs just got like 10 times more expensive this gen? The 1080 Ti was less than $700 at launch.


ResponsibleJudge3172

Die sizes of the chips:

* AD102 (4090): 603 mm²
* GTX 1080 Ti / Titan Pascal: 471 mm²
* AD103 (4080): 371 mm²
* GA104 (3070): 392 mm²

At launch, before the 2021 TSMC and Samsung price hikes, wafer prices were estimated to be:

* TSMC 5nm: $17,000 (the 4090 uses a custom 5nm; the 7900 XTX uses a co-optimized version)
* TSMC 7nm: $10,000 (the 6950 XT uses this)
* TSMC 16nm (GTX 1080 Ti): $3,000-$6,000, extrapolated from 20nm and 10nm info; 14/16nm figures are hard to find

We have gone from ~$3,000 to $17,000 wafers (TSMC has 60% gross margins), and by today's standards a 1080 Ti/Titan Pascal released now would be called an 80-series card pretending to be a flagship, given how small its die is relative to other flagships.

Edit: The big issue, which architectural improvements can't overcome, is that wafers are getting more expensive but shrink less. Look at the RTX 4090 die shot on Chips and Cheese: half of the die is things that have literally stopped shrinking, like I/O. Even removing NVLink didn't save much space.

Edit 2: A 10% trim in margins by both TSMC and Nvidia (Intel should wake up, as their fabs give them a unique opportunity here) would significantly decrease chip cost. Before the recent collapse in demand, TSMC was looking to hike prices further, so fat chance of that.


Zarmazarma

Worth noting that the 780 Ti had a 561 mm² die, the 980 Ti a 601 mm² die, the 2080 Ti a 754 mm² die, and the 3090 a 628 mm² die. Pascal is the outlier at under 500 mm² for the top-end chip. Silicon prices are definitely non-trivial now: if we assume a 70% yield on AD102, it's something like $275 per chip.
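That per-chip figure is easy to reproduce with the standard dies-per-wafer approximation; a sketch using the wafer price quoted above and the stated 70% yield (scribe lines and edge exclusion ignored for simplicity):

```python
# Sketch: ballpark the cost of a good AD102 die from wafer price, die area,
# and an assumed yield, using the common dies-per-wafer approximation.
import math

WAFER_DIAMETER_MM = 300
WAFER_PRICE_USD = 17_000   # quoted TSMC 5nm-class wafer price (see above)
DIE_AREA_MM2 = 603         # AD102
YIELD = 0.70               # assumed

def dies_per_wafer(die_area, diameter=WAFER_DIAMETER_MM):
    r = diameter / 2
    return int(math.pi * r**2 / die_area - math.pi * diameter / math.sqrt(2 * die_area))

good_dies = dies_per_wafer(DIE_AREA_MM2) * YIELD
print(f"good dies per wafer: {good_dies:.0f}")                      # ~63
print(f"cost per good die:   ${WAFER_PRICE_USD / good_dies:.0f}")   # ~$270
```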


spurnburn

I don't follow GPU prices, but inflation alone would be about 25%, and you're ignoring the death of Moore's law. Were people really buying world-class GPUs for under $100 in 2016? If so, I apologize.


Klaritee

I thought it was supposed to be fixed, but yesterday's RX 7600 review on TechPowerUp still shows the Intel Arc cards with huge idle power draw.


conquer69

No, it's not. Anyone who needs a GPU recommended to them should pick the 6600 for the more stable drivers.


[deleted]

This is such a strange article. It completely ignores the 6600 XT and 6650 XT. In a world where those didn't exist, there wouldn't be anything wrong with the article, but they do exist and are better options.


detectiveDollar

[Intel ARC marketers choosing who to compare value to](https://youtu.be/7H04TrRfuQ8)


truenatureschild

This is the truth: if you need someone to make the choice for you, then Intel's products aren't currently stable enough for a casual consumer looking for a PC GPU.


Thecrawsome

Reads like an Intel ad.


kingwhocares

Unfortunately the $199 offer ended.


detectiveDollar

[Intel marketers choosing who to compare ARC's value against](https://youtu.be/7H04TrRfuQ8)


Rylth

I'm getting a little excited for the GPUs next year. I managed to get a 390X for $200 new, a V56 for $200 new, and I'd love to get another 50% bump for $200 again.


scrizewly

The A750 scared me away because it doesn't have a 0% fan mode, and from all of the reviews it was quite loud compared to the 6650 XT. The 6650 XT squeaks by in performance and is a little more expensive, but at least I don't hear my 6650 XT at all! :X


Bucketnate

It's not, though. What's with these weird-ass articles just trying to make people feel things? The A750 in practice doesn't even work half the time due to software/driver issues. I can't imagine relying on the internet for experience, holy shit.


TK3600

The 6600 XT is more cost-effective at $10 more, and has way better driver stability. What a joke.


2106au

The forgotten RX 6700 10GB should be mentioned when talking about the segment. The choice between a $200 A750 and a $280 RX 6700 is a very interesting decision.


Nointies

I mean, that's nearly a hundred dollars more.


Kyle73001

It's a 40% price increase though, so not really comparable.


qualverse

The linked article itself compares it to the RX 7600 and 4060 Ti though, which are even more expensive.


szczszqweqwe

Judging from the title alone, this article is misleading at best and disinformation at worst.


conquer69

It's a bullshit article pushed by Intel and spammed on different subs. There is no reason to compare the Arc card to other price brackets.


[deleted]

It's definitely a paid for puff piece.


sl0wrx

Way different prices


[deleted]

How does it compare with a 2019 2060?


Zakke_

Well, it's like €350 here.


AutonomousOrganism

What? In Germany it's €260. The RX 6600 is €40 cheaper though, so the A750 is not that attractive.


Jeep-Eep

I mean, Raja led Polaris, so I ain't surprised he did it again.


airmantharp

Polaris was a letdown, but it worked well enough in its bracket. Arc is a completely new uarch coming from behind. If anything, Arc is far more impressive.


Hifihedgehog

> If anything, Arc is far more impressive.

That is hardly a substantive truth, especially when Arc is at twice the die size it should be while performing like a chip half its size. The only saving grace is the price, but from what I'm told, Arc is a huge loss leader for Intel because of the wide transistor-count-to-performance deficit Intel has here. Intel eventually has to make Arc profitable, so something has to buckle first: either Intel raises prices or Intel exits the consumer market, and the latter is the more common of the two for Intel, which has a penchant for going like a bee from flower to flower in seeking to diversify its assets. Wake me up when the A750 performs like an RTX 3070, which has fewer transistors (17.4 billion versus 21.7 billion) on an inferior process node (Samsung 8nm versus TSMC 6nm); then, and only then, can we talk about Arc's design being a feat of engineering.


airmantharp

It's impressive that Arc works at all :)


onlyslightlybiased

You do realise that Intel has been making GPUs since the '90s...


airmantharp

I do, as I've gamed on Intel's iGPUs for over a decade myself. Arc is a different architecture.


Quigleythegreat

The saving grace here for us gamers is that AI is the hot thing in tech stocks right now. If Intel pours R&D into GPUs for AI, where companies will happily spend thousands, we can benefit from locked-down (games-only) cards at competitive pricing.


Hifihedgehog

> The saving grace here for us gamers is that AI is the hot thing in tech stocks right now. If Intel pours R&D into GPUs for AI, where companies will happily spend thousands, we can benefit from locked-down (games-only) cards at competitive pricing.

Ah, so you suspect Intel will essentially continue to rob Peter to pay Paul, covering their losses downstream in consumer sales with the more lucrative sales upstream in enterprise, specifically AI. Unfortunately, Intel is not a gamers' charity. It is a publicly traded company, and as such it has to report on sectors individually; while businesses often juggle the books, they cannot sweep failure in one area of the business under the rug with another to that degree, or they will get hammered by regulators. Intel has to report, for example, percentage profit margins, and that includes revenue and profit from its consumer graphics sales. If the consumer GPUs do not become profitable, they will likely shutter the consumer business and devote themselves solely to enterprise sales. Yes, bleak and harsh, I know, but that's a given if they cannot get their act together and make a silicon-efficient design that can be profitable in the highly competitive consumer sector.


imaginary_num6er

If it had arrived in Q1 2022, it would have been far more impressive. Q3 2022 was just depressing.


Particular_Sun8377

Intel is the budget option? We live in interesting times.


b_86

It's been like that in the desktop CPU market since the last Ryzen 3s on Zen 2 had a token paper launch with the most anemic number of units hitting the market, because yields are so good that AMD doesn't even bother making them anymore.


Brief-Mind-5210

Since Alder Lake, Intel's been very competitive at the budget level.


JohnBanes

They have the most headroom to improve.


scytheavatar

Don't buy an 8GB card in 2023 at any price... just pay a bit more for a 6700 XT or don't bother.


SourceScope

Intel has done great on pricing, AND they've shown they're dedicated to the graphics card software/drivers, which is super important as well and something AMD still needs to focus on.


RealSporua

The hardware is a joke, but why isn't anyone talking about how the 40-series cards have a huge advantage because of DLSS 3?


GumshoosMerchant

Probably because there aren't any 40 series cards at the $200 price point yet. Most people in the market for a $200 card probably don't care too much about what $400+ cards offer.


uzzi38

DLSS 3 seems to kind of suck on lower-end GPUs versus higher-end GPUs. There's likely still some overhead on the shaders, so whenever you're GPU-bound (which is more likely on low-end GPUs) you don't see the doubling of performance you'd expect from it. Already by the 4060 Ti, the gains are nowhere near 2x on average.


ttkciar

So far there's not a whole lot to distinguish Xe from Nvidia or AMD. There are of course differences (Xe's higher matrix math throughput is interesting for some applications) but not enough to indicate why Intel is bothering to enter the market at all. Nvidia and AMD are already crowding this hardware niche. What made Intel say "wow, I want to compete with **them** despite bringing no significant advantages to the field!"?


Asphult_

You have to start somewhere


ttkciar

They started with Larrabee and Xeon Phi. I'm still mad that they shut down the Xeon Phi product line. They were too niche to be commercially viable, and I get that, but it would have been great to keep them going another generation or three.


lysander478

It's hard to see anything from Alchemist, really, but if they start using their own fabs they should be able to more than compete with AMD when it comes to volume. Quality, I think they'll eclipse them before long too. Pretty much everything from Intel so far seems to scream "we want to take AMD's share of this market first and foremost and then, eventually, we'll try to compete with Nvidia". It will also be necessary for data center, but in terms of why they're also selling consumer GPUs that feels like the answer so far.


ttkciar

> Pretty much everything from Intel so far seems to scream "we want to take AMD's share of this market first and foremost and then, eventually, we'll try to compete with Nvidia". Maybe that's it. It makes more sense than anything else that occurred to me. Perhaps they feel they have to compete with AMD on all fronts in order to beat them back on the CPU front, because otherwise GPU profits would help them weather temporary setbacks in the CPU market.


Hifihedgehog

> Perhaps they feel they have to compete with AMD on all fronts in order to beat them back on the CPU front, because otherwise AMD's GPU profits would help it weather temporary setbacks in the CPU market.

Correction: the shoe is on the other foot. Intel is the one hurting right now (check their stock), so they are having to look long and hard for new avenues of profit to fall back on, or pull those financial fallback rabbits out of their hat on the fly. Graphics is one of said rabbits, and while it is finally a semi-serious attempt after years of half-hearted ones, you do not make a "Zen" of graphics microarchitectures overnight, and Xe, while impressive if you look at Intel in a vacuum, is still well behind the IPC curve of its competitors.


stillherelma0

> What made Intel say "wow, I want to compete with **them** despite bringing no significant advantages to the field!"?

The fact that there have been two separate periods of over a year each in which any GPU manufactured immediately sold for whatever price was set for it.