Intel’s highest-end graphics card lineup is approaching its retail launch, and that means we’re getting more answers to crucial market questions of prices, launch dates, performance, and availability. Today, Intel answered more of those A700-series GPU questions, and they’re paired with claims that every card in the Arc A700 series punches back at Nvidia’s 18-month-old RTX 3060.
After announcing a $329 price for its A770 GPU earlier this week, Intel clarified it would launch three A700 series products on October 12: The aforementioned Arc A770 for $329, which sports 8GB of GDDR6 memory; an additional Arc A770 Limited Edition for $349, which jumps up to 16GB of GDDR6 at slightly higher memory bandwidth and sports otherwise identical specs; and the slightly weaker A750 Limited Edition for $289.
If you missed the memo on that sub-$300 GPU when it was announced, the A750 LE is essentially a binned version of the A770’s chipset, with 87.5 percent of the shading units and ray tracing (RT) units turned on, along with an ever-so-slightly downclocked boost clock (2.05 GHz, compared to 2.1 GHz on both A770 models).
Intel previously confirmed that new purchases of Arc A700 series GPUs made by January 2023 would come with a bundle of downloadable games and software, including this year’s remake of Call of Duty: Modern Warfare II, Gotham Knights, and more.
Ahead of independent benchmarks, GPUs have a confusing “performance-per-dollar” metric
In a conference call with the press, Intel representatives declined to clarify initial shipment counts for its first three A700-series GPUs other than to suggest low stock for the larger-memory A770 LE: “I suspect we’re going to sell out of that one very quickly,” Intel Graphics Fellow Tom Petersen told Ars. He was reluctant to say whether he expected early sellouts across Intel’s A700 GPUs: “We don’t know if we’re going to have a supply problem or a demand problem. I hope we have a demand problem.” He then confirmed that Intel plans to keep producing its own in-house GPU models over time instead of cutting off “LE” production while demand might still exist.
Unfortunately, Intel compounded the GPUs’ availability question by not confirming which add-in board (AIB) partners would be part of the A700 series’ October rollout. Petersen kicked that can down the road by suggesting those third-party GPU manufacturers will make their own announcements, then mentioned Intel’s interest in expanding its list of Arc-powered AIBs.
Intel’s latest presentation includes game benchmark measurements that directly compare the 8GB A750 with an EVGA model of the RTX 3060, which sports 12GB of GDDR6 RAM. Intel’s testing results have not yet been independently verified by Ars Technica. The above chart and a few others rely on a confusing “performance-per-dollar” metric that obfuscates raw comparisons, listing neither frame rates nor clear percentage differences.
But Intel seems determined to make that performance-per-dollar metric quite loud in the A700 series’ promotional effort, as it has advertised that the higher-end A770 Limited Edition, priced at $349, nets “42 percent” more performance-per-dollar, on average, than an RTX 3060 that sells at retailers for an average of $418. The same fuzzy-math sales pitch suggests that the $289 A750 will net “53 percent” more average performance-per-dollar than the same RTX 3060 model.
We look forward to someone in the Ars comments section breaking down that incomplete algebra formula to determine the actual performance gap between each product, at least according to Intel’s own internal testing methodology. Perhaps it will align with previous Intel comments that peg the A750 at roughly 3 to 5 percent faster than the RTX 3060.
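For the impatient, the algebra isn’t hard to run in reverse. Here’s a rough, back-of-the-envelope sketch (ours, not Intel’s) that converts the company’s claimed perf-per-dollar advantages and its cited prices into implied raw performance ratios; the function name and the resulting figures are our own inference, not Intel-published numbers:

```python
# Back-of-the-envelope check of Intel's perf-per-dollar claims.
# Prices and percentage gains come from Intel's slides; the implied
# raw performance ratios below are our inference, not Intel's figures.

def implied_perf_ratio(card_price, rival_price, perf_per_dollar_gain):
    """If a card claims (1 + gain) times a rival's performance-per-dollar,
    its implied raw performance ratio is (1 + gain) * card_price / rival_price."""
    return (1 + perf_per_dollar_gain) * card_price / rival_price

RTX_3060_STREET = 418  # Intel's cited average RTX 3060 retail price

a770 = implied_perf_ratio(349, RTX_3060_STREET, 0.42)
a750 = implied_perf_ratio(289, RTX_3060_STREET, 0.53)

print(f"A770 LE vs. RTX 3060: {a770:.2f}x implied raw performance")
print(f"A750 vs. RTX 3060:    {a750:.2f}x implied raw performance")
```

Worked through, the $349 A770 LE comes out to roughly 1.19x the RTX 3060, and the $289 A750 to roughly 1.06x, which lands in the neighborhood of Intel’s earlier 3-to-5-percent A750 claim.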
Intel continued to concede the Arc series’ biggest first-generation teething issue: the A700 series’ drivers and hardware have so far struggled to best the RTX 3060 in DirectX 11 performance. Although Intel claims that a few DX11 games run nearly as well, or even better, on the Arc A770 as on the RTX 3060, its reps admit that Nvidia holds a generally noticeable lead in those older games.
When pressed about how each GPU compares to the better-reviewed RTX 3060 Ti, Petersen pushed back, again apparently stuck on the pricing gap between GPUs: “Pricing on the 3060 Ti is just crazy, so we didn’t want to include that in our analysis,” he said. As I’ve previously covered, the RTX 3060 emerged with a severe performance drop compared to the 3060 Ti—though, if Intel manages to push meaningful gains in general rasterization, specific ray tracing workloads, and XeSS-powered image reconstruction, its price-to-performance pitch may pan out for anyone eager to buy an Nvidia alternative (so long as it’s in stock at your favorite retailer, anyway).
We’ll have more on the A700 series of GPUs soon at Ars Technica.