<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Radeon GPUs, Inside the Stack | Drivers &amp; Behavior - [ Geeknify ]</title>
	<atom:link href="https://geeknify.com/tag/radeon/feed/" rel="self" type="application/rss+xml" />
	<link>https://geeknify.com/tag/radeon/</link>
	<description>Tech news, Gadget reviews &#38; Geek insights</description>
	<lastBuildDate>Fri, 13 Feb 2026 08:06:43 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Radeon RX 9070 XT, RX 9070, and RX 9060 XT tested: 10 Games at 1440p reveal a clear VRAM problem</title>
		<link>https://geeknify.com/radeon-rx-9070-xt-rx-9070-and-rx-9060-xt-tested-10-games-at-1440p-reveal-a-clear-vram-problem/</link>
		
		<dc:creator><![CDATA[Vlad Phigod]]></dc:creator>
		<pubDate>Fri, 13 Feb 2026 07:45:03 +0000</pubDate>
				<category><![CDATA[Games]]></category>
		<category><![CDATA[View All]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Radeon]]></category>
		<guid isPermaLink="false">https://geeknify.com/?p=830</guid>

					<description><![CDATA[<p>AMD's RDNA 4 lineup spans $300 to $600, but the benchmark gaps tell a more complicated story than the price tags suggest. One card struggles badly.</p>
<p>The post <a href="https://geeknify.com/radeon-rx-9070-xt-rx-9070-and-rx-9060-xt-tested-10-games-at-1440p-reveal-a-clear-vram-problem/">Radeon RX 9070 XT, RX 9070, and RX 9060 XT tested: 10 Games at 1440p reveal a clear VRAM problem</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>AMD&#8217;s RDNA 4 generation finally landed with four distinct SKUs covering the $300–$600 range, and on paper the lineup makes sense. The Radeon RX 9060 XT comes in 8GB ($300) and 16GB ($350) variants for budget-conscious 1440p gaming, while the RX 9070 ($550) and RX 9070 XT ($600) target higher framerates and heavier workloads. Clean segmentation, reasonable price gaps, predictable performance tiers — except the benchmarks tell a messier story.</p>



<p>After running all four cards through ten demanding titles at 2560×1440, one thing became painfully obvious: the 8GB RX 9060 XT has no business being sold as a 1440p graphics card in early 2026. The VRAM bottleneck isn&#8217;t subtle. It&#8217;s catastrophic in specific titles, and merely bad in others. Meanwhile, the $50 premium for the 16GB variant transforms the card into something genuinely competitive, which raises uncomfortable questions about why AMD shipped the 8GB model at all.</p>



<h2 class="wp-block-heading">The test suite and methodology</h2>



<p>All benchmarks were captured at 1440p with maxed settings (or the highest preset available) without upscaling. The ten games span different engines, VRAM requirements, and optimization profiles:</p>



<p>Cyberpunk 2077, Mafia: The Old Country, Ghost of Tsushima, Forza Horizon 5, S.T.A.L.K.E.R. 2, Microsoft Flight Simulator 2024, Red Dead Redemption 2, God of War: Ragnarök, Horizon Forbidden West, and The Last of Us Part II. This isn&#8217;t a cherry-picked list — it represents what people actually play, including several notoriously VRAM-hungry titles that expose memory limitations quickly. We tested with the latest AMD drivers as of February 2026; performance may shift with future updates, particularly in newer titles still receiving optimization patches.</p>



<h2 class="wp-block-heading">The numbers: Where 8GB falls apart</h2>



<ul class="wp-block-list">
<li><strong>Cyberpunk 2077: <br></strong>&#8211; 56 fps (RX 9060 XT 8 GB)<br>&#8211; 61 fps (RX 9060 XT 16 GB)<br>&#8211; 98 fps (RX 9070)<br>&#8211; 109 fps (RX 9070 XT)</li>



<li><strong>Mafia: The Old Country: <br></strong>&#8211; 28 fps (RX 9060 XT 8 GB)<br>&#8211; 48 fps (RX 9060 XT 16 GB)<br>&#8211; 67 fps (RX 9070)<br>&#8211; 73 fps (RX 9070 XT)</li>



<li><strong>Ghost of Tsushima: <br></strong>&#8211; 57 fps (RX 9060 XT 8 GB)<br>&#8211; 63 fps (RX 9060 XT 16 GB)<br>&#8211; 102 fps (RX 9070)<br>&#8211; 117 fps (RX 9070 XT)</li>



<li><strong>Forza Horizon 5: <br></strong>&#8211; 79 fps (RX 9060 XT 8 GB)<br>&#8211; 126 fps (RX 9060 XT 16 GB)<br>&#8211; 190 fps (RX 9070)<br>&#8211; 204 fps (RX 9070 XT)</li>



<li><strong>S.T.A.L.K.E.R. 2: <br></strong>&#8211; 29 fps (RX 9060 XT 8 GB)<br>&#8211; 44 fps (RX 9060 XT 16 GB)<br>&#8211; 68 fps (RX 9070)<br>&#8211; 74 fps (RX 9070 XT)</li>



<li><strong>Microsoft Flight Simulator 2024:<br></strong>&#8211; 17 fps (RX 9060 XT 8 GB)<br>&#8211; 44 fps (RX 9060 XT 16 GB)<br>&#8211; 72 fps (RX 9070)<br>&#8211; 78 fps (RX 9070 XT)</li>



<li><strong>Red Dead Redemption 2: <br></strong>&#8211; 87 fps (RX 9060 XT 8 GB)<br>&#8211; 92 fps (RX 9060 XT 16 GB)<br>&#8211; 144 fps (RX 9070)<br>&#8211; 160 fps (RX 9070 XT)</li>



<li><strong>God of War: Ragnarök:<br></strong>&#8211; 83 fps (RX 9060 XT 8 GB)<br>&#8211; 85 fps (RX 9060 XT 16 GB)<br>&#8211; 155 fps (RX 9070)<br>&#8211; 160 fps (RX 9070 XT)</li>



<li><strong>Horizon Forbidden West:<br></strong>&#8211; 60 fps (RX 9060 XT 8 GB)<br>&#8211; 71 fps (RX 9060 XT 16 GB)<br>&#8211; 112 fps (RX 9070)<br>&#8211; 122 fps (RX 9070 XT)</li>



<li><strong>The Last of Us Part II:<br></strong>&#8211; 50 fps (RX 9060 XT 8 GB)<br>&#8211; 53 fps (RX 9060 XT 16 GB)<br>&#8211; 92 fps (RX 9070)<br>&#8211; 99 fps (RX 9070 XT)</li>
</ul>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="RX 9070 XT vs RX 9070 vs RX 9060 XT 16GB vs RX 9060 XT 8GB - Test in 10 Games" width="500" height="281" src="https://www.youtube.com/embed/ihsvnPcj3YU?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p>The pattern jumps out immediately. In games with moderate VRAM demands — God of War: Ragnarök, The Last of Us Part II, Red Dead Redemption 2 — the 8GB and 16GB RX 9060 XT variants perform within 5–10% of each other. The extra memory sits mostly unused, and both cards deliver playable framerates. Nothing alarming there.</p>



<p>But look at Microsoft Flight Simulator 2024: 17 fps versus 44 fps. That&#8217;s not a performance gap — that&#8217;s the difference between a functional graphics card and an expensive paperweight. Forza Horizon 5 shows a similar collapse: 79 fps on the 8GB model, 126 fps on the 16GB. Same GPU, same clocks, same architecture. The only variable is memory capacity, and it&#8217;s costing the 8GB card nearly 40% of its potential performance.</p>



<p>S.T.A.L.K.E.R. 2 and Mafia: The Old Country paint an equally grim picture. Both titles hammer VRAM at max settings, and both expose the 8GB limitation brutally. The RX 9060 XT 8GB drops into the high-20s — territory where even aggressive FSR upscaling can&#8217;t fully compensate without visible quality loss.</p>



<h2 class="wp-block-heading">The $50 question: Is 16GB worth it?</h2>



<p>Unequivocally, yes. The Radeon RX 9060 XT 16GB at $350 delivers what AMD should have shipped as the baseline configuration. In VRAM-limited scenarios, you&#8217;re looking at 50–160% performance gains over the 8GB variant for a 17% price increase. That math doesn&#8217;t require a calculator to understand.</p>
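<p>Those figures can be checked directly against the benchmark list above. A throwaway Python sketch (not part of the test rig — the game selection here is simply the four most VRAM-limited titles from this suite):</p>

```python
# Sanity-check of the 16GB-vs-8GB uplift claims, using the fps
# figures from the benchmark list above (RX 9060 XT 8GB vs 16GB).
vram_limited = {
    "Forza Horizon 5": (79, 126),
    "S.T.A.L.K.E.R. 2": (29, 44),
    "Mafia: The Old Country": (28, 48),
    "Microsoft Flight Simulator 2024": (17, 44),
}

# Percentage gain of the 16GB card over the 8GB card, per title
gains = {
    game: round((fps16 - fps8) / fps8 * 100)
    for game, (fps8, fps16) in vram_limited.items()
}
for game, gain in gains.items():
    print(f"{game}: +{gain}%")  # ranges from +52% to +159%

# The $50 step-up from the $300 8GB card to the $350 16GB card
price_increase = round((350 - 300) / 300 * 100)
print(f"Price increase: +{price_increase}%")  # prints "Price increase: +17%"
```

<p>The worst-case titles hand the 16GB variant a 52–159% lead for a 17% higher sticker price, which is where the "50–160% gains" framing comes from.</p>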



<p>Here&#8217;s the uncomfortable truth: AMD knows 8GB isn&#8217;t enough for modern 1440p gaming at max settings. NVIDIA figured this out and killed the 8GB RTX 4070 rumors before launch, shipping 12GB instead. AMD went the other direction, offering an 8GB option that technically exists but performs so poorly in demanding titles that recommending it feels irresponsible. The $300 price tag looks attractive until you realize you&#8217;re buying a 1080p card with 1440p aspirations.</p>



<p>If your budget caps at $300, drop settings or resolution — or wait for a sale on the 16GB model. The 8GB RX 9060 XT works fine for esports titles and older games, but anything released in 2023 or later with high-res textures will expose its limitations fast.</p>



<h2 class="wp-block-heading">RX 9070 vs RX 9070 XT: Diminishing returns at the top</h2>



<p>Shifting focus to the higher tier, the RX 9070 and RX 9070 XT tell a different story — one about diminishing returns rather than catastrophic bottlenecks. The $50 gap between these cards buys you roughly 8–15% more performance depending on the title. That&#8217;s measurable, but it&#8217;s not transformative.</p>



<p>In Cyberpunk 2077, the jump from 98 fps to 109 fps matters if you&#8217;re chasing a locked 120Hz experience. In God of War: Ragnarök, the difference between 155 fps and 160 fps is essentially imperceptible. Across the entire test suite, the RX 9070 XT never pulls dramatically ahead — it&#8217;s consistently faster, but never by a margin that justifies the upgrade for most buyers.</p>



<p>The RX 9070 at $550 emerges as the smarter purchase for anyone who can afford it. You get north of 100 fps in most titles at 1440p max settings, comfortable headroom for future games, and enough VRAM (16GB on both models) to avoid the memory starvation plaguing the budget tier. Honestly, the RX 9070 XT only makes sense if you&#8217;re obsessively chasing 120Hz locks — and even then, the $50 buys you maybe five to fifteen extra frames depending on the title.</p>



<h2 class="wp-block-heading">What each AMD Radeon card actually delivers</h2>



<p><strong>AMD Radeon</strong> <strong>RX 9060 XT 8GB: </strong>Skip it for 1440p. The VRAM limitation creates unacceptable performance collapses in modern titles. Buy this only if you&#8217;re gaming at 1080p or playing exclusively older/esports titles with modest memory requirements.</p>



<p><strong>AMD Radeon</strong> <strong>RX 9060 XT 16GB:</strong> The actual entry point for 1440p gaming in 2026. Solid 60+ fps in most demanding games, enough memory to handle max textures, and a price that doesn&#8217;t require financial justification. This is the card AMD should have launched as the default RX 9060 XT.</p>



<p><strong>AMD Radeon</strong> <strong>RX 9070: </strong>The sweet spot of the RDNA 4 lineup. Near-100 fps or better across the board at 1440p, 16GB of VRAM, and enough GPU horsepower to handle whatever the next few years throw at it. If you can stretch your budget this far, do it.</p>



<p><strong>AMD Radeon</strong> <strong>RX 9070 XT:</strong> For enthusiasts who want the fastest AMD option available and don&#8217;t mind paying a premium for 8–15% gains.</p>



<h2 class="wp-block-heading">What AMD got right and wrong</h2>



<p>RDNA 4 delivers genuine generational improvement in performance-per-watt and raw throughput. The RX 9070 series competes effectively against NVIDIA&#8217;s offerings in this price range, and the 16GB RX 9060 XT provides legitimate budget 1440p gaming at a price point that hasn&#8217;t existed for years. AMD deserves credit for hitting competitive performance targets without the power draw penalties that plagued RDNA 3.</p>



<p>The problem is market segmentation overreach. The 8GB RX 9060 XT exists to hit a $300 price point, not because it makes sense as a product. Launching a 1440p-marketed GPU with 8GB of VRAM in early 2025 — when even last-generation console ports regularly exceed 8GB usage at high settings — feels like a decision made in a spreadsheet rather than a testing lab. The benchmarks prove it: that card simply cannot deliver consistent performance in the games people actually want to play at the resolution AMD suggests.</p>



<p>For buyers navigating this lineup, the takeaway is straightforward. Ignore the 8GB model unless you have very specific, modest use cases. The $50 step-up to 16GB transforms the RX 9060 XT from a compromised product into a properly competent graphics card. And if budget allows, the RX 9070 remains the RDNA 4 card to buy — fast enough for everything today, equipped with enough memory for everything tomorrow, and priced at the point where AMD&#8217;s value proposition actually lands.</p>



<h2 class="wp-block-heading">Radeon RX 9000 Series FAQ: 1440p performance, VRAM and value explained</h2>



<div class="schema-faq wp-block-yoast-faq-block"><div class="schema-faq-section" id="faq-question-1770912666687"><strong class="schema-faq-question">Is 8GB VRAM enough for 1440p gaming in 2026?</strong> <p class="schema-faq-answer">Not reliably. While older titles and esports games run fine on 8GB, modern AAA releases regularly exceed this limit at 1440p max settings. In games like Microsoft Flight Simulator 2024, S.T.A.L.K.E.R. 2, and Forza Horizon 5, the 16GB variant runs 50–160% faster than the otherwise identical 8GB card. The $50 premium for additional VRAM is now essentially mandatory for consistent 1440p performance.</p> </div> <div class="schema-faq-section" id="faq-question-1770912674831"><strong class="schema-faq-question">Which Radeon RX 9000 card is the best value?</strong> <p class="schema-faq-answer">The RX 9060 XT 16GB at $350 offers the best value for budget 1440p gaming, delivering playable framerates in demanding titles without VRAM limitations. For higher-end builds, the RX 9070 at $550 provides the best performance-per-dollar in the lineup — the $50 jump to the RX 9070 XT yields only 8–15% more performance, making it harder to justify.</p> </div> <div class="schema-faq-section" id="faq-question-1770912686082"><strong class="schema-faq-question">How does the RX 9070 compare to the RX 9070 XT?</strong> <p class="schema-faq-answer">The RX 9070 XT averages 8–15% higher framerates than the RX 9070 across demanding titles at 1440p. Both cards feature 16GB VRAM and identical memory configurations. 
The performance gap rarely exceeds 15 fps in real-world gaming, making the RX 9070 the smarter purchase unless you specifically need maximum framerates for high-refresh displays.</p> </div> <div class="schema-faq-section" id="faq-question-1770912691208"><strong class="schema-faq-question">Can the RX 9060 XT 8GB run Cyberpunk 2077 at 1440p?</strong> <p class="schema-faq-answer">Technically yes — our testing showed 56 fps at 1440p max settings without upscaling. However, this represents a best-case scenario; VRAM-heavier titles like Microsoft Flight Simulator 2024 dropped to 17 fps on the same card. The 16GB variant hit 61 fps in Cyberpunk and maintained playable performance across all tested titles, making it the safer choice for demanding games.</p> </div> <div class="schema-faq-section" id="faq-question-1770912742451"><strong class="schema-faq-question">Should I buy the RX 9060 XT or save for the RX 9070?</strong> <p class="schema-faq-answer">If your budget is fixed at $350, the RX 9060 XT 16GB delivers solid 1440p performance and shouldn&#8217;t require an upgrade for several years. However, the $200 jump to the RX 9070 buys roughly 50–80% more performance in most titles — a significant leap that extends the card&#8217;s relevance considerably. For buyers who can stretch their budget, the RX 9070 represents better long-term value despite the higher upfront cost.</p> </div> </div>



<p>Source: <a href="https://www.youtube.com/watch?v=ihsvnPcj3YU" target="_blank" rel="noreferrer noopener nofollow">Testing Games (YouTube)</a>, <a href="https://www.amd.com/en/products/graphics/desktops/radeon.html">AMD official</a></p>
<p>The post <a href="https://geeknify.com/radeon-rx-9070-xt-rx-9070-and-rx-9060-xt-tested-10-games-at-1440p-reveal-a-clear-vram-problem/">Radeon RX 9070 XT, RX 9070, and RX 9060 XT tested: 10 Games at 1440p reveal a clear VRAM problem</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>AMD Adrenalin 26.1.1 reportedly breaks undervolting on new GPUs</title>
		<link>https://geeknify.com/amd-adrenalin-26-breaks-undervolting-on-new-gpus/</link>
		
		<dc:creator><![CDATA[Vlad Phigod]]></dc:creator>
		<pubDate>Wed, 04 Feb 2026 16:48:47 +0000</pubDate>
				<category><![CDATA[Software]]></category>
		<category><![CDATA[View All]]></category>
		<category><![CDATA[AMD]]></category>
		<category><![CDATA[Radeon]]></category>
		<guid isPermaLink="false">https://geeknify.com/?p=524</guid>

					<description><![CDATA[<p>AMD’s latest Adrenalin 26.1.1 driver appears to disrupt undervolting on new Radeon graphics cards, making previously stable voltage and power profiles crash games and forcing users to retune from scratch.</p>
<p>The post <a href="https://geeknify.com/amd-adrenalin-26-breaks-undervolting-on-new-gpus/">AMD Adrenalin 26.1.1 reportedly breaks undervolting on new GPUs</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Users of AMD graphics cards are reporting unexpected issues after installing the latest <strong>Adrenalin Edition 26.1.1</strong> driver. According to multiple accounts, the update appears to disrupt previously stable <strong>custom power and voltage profiles</strong>, particularly on newer Radeon GPUs.</p>



<p>One of the most visible cases involves the <strong>Radeon RX 9070 XT</strong>, where undervolting settings that worked flawlessly on earlier drivers now cause immediate instability.</p>



<h3 class="wp-block-heading">Stable undervolts no longer work</h3>



<p>A Reddit user going by <strong>SnooJokes5264</strong> shared their experience, noting that an undervolt of <strong>–50 mV</strong> combined with a <strong>–9% power limit</strong> was completely stable on <strong>Adrenalin 25.12.1</strong>. After updating to <strong>26.1.1</strong>, however, those same settings began causing instant game crashes.</p>



<p>Launching Cyberpunk 2077 with the old profile results in a crash at startup. Stability only returns after <strong>resetting all tuning options to factory defaults</strong>, suggesting the issue is directly tied to manual voltage and power adjustments rather than the game itself.</p>



<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="1024" height="576" src="https://geeknify.com/wp-content/uploads/2026/02/adrenalin-software-so-slow-1024x576.webp" alt="Adrenalin Software" class="wp-image-526" srcset="https://geeknify.com/wp-content/uploads/2026/02/adrenalin-software-so-slow-1024x576.webp 1024w, https://geeknify.com/wp-content/uploads/2026/02/adrenalin-software-so-slow-300x169.webp 300w, https://geeknify.com/wp-content/uploads/2026/02/adrenalin-software-so-slow-768x432.webp 768w, https://geeknify.com/wp-content/uploads/2026/02/adrenalin-software-so-slow.webp 1280w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<h3 class="wp-block-heading">Possible changes under the hood</h3>



<p>While AMD has not officially acknowledged the problem, the behavior points to <strong>silent changes in voltage curves or frequency control algorithms</strong> for newer GPU architectures in driver version 26.1.1.</p>



<p>If that’s the case, it would explain why undervolting values that were once safe now push the GPU beyond its new stability thresholds. Even small offsets that previously reduced temperatures and power draw without side effects may now trigger crashes.</p>



<p>Adrenalin 26.1.1 appears to alter how newer Radeon GPUs handle voltage and power, effectively invalidating previously stable undervolt profiles. For now, users focused on efficiency rather than stock performance may want to <strong>hold off on updating</strong> or be prepared to retune from scratch.</p>



<p>As always with GPU drivers, stability beats small gains, especially when the changes happen without warning.</p>



<p>Source: <a href="https://www.reddit.com/r/radeon/comments/1qswov0/my_undervolt_and_power_settings_from_25121_now/">Reddit</a></p>
<p>The post <a href="https://geeknify.com/amd-adrenalin-26-breaks-undervolting-on-new-gpus/">AMD Adrenalin 26.1.1 reportedly breaks undervolting on new GPUs</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Intel Arc B390 iGPU crushes Radeon 890M, nearly matches RTX 4050 Laptop</title>
		<link>https://geeknify.com/intel-arc-b390-igpu-crushes-radeon-890m-nearly-matches-rtx-4050-laptop/</link>
		
		<dc:creator><![CDATA[Vlad Phigod]]></dc:creator>
		<pubDate>Mon, 02 Feb 2026 15:06:24 +0000</pubDate>
				<category><![CDATA[Hardware]]></category>
		<category><![CDATA[View All]]></category>
		<category><![CDATA[Intel]]></category>
		<category><![CDATA[Nvidia]]></category>
		<category><![CDATA[Radeon]]></category>
		<guid isPermaLink="false">https://geeknify.com/?p=481</guid>

					<description><![CDATA[<p>Intel’s Arc B390 integrated graphics mark a turning point for laptop gaming. New benchmarks show it decisively beating AMD’s Radeon 890M and delivering performance close to entry-level discrete GPUs without the power penalty.</p>
<p>The post <a href="https://geeknify.com/intel-arc-b390-igpu-crushes-radeon-890m-nearly-matches-rtx-4050-laptop/">Intel Arc B390 iGPU crushes Radeon 890M, nearly matches RTX 4050 Laptop</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>For the first time in a long while, Intel’s integrated graphics aren’t just “good enough” &#8211; they’re genuinely competitive.</p>



<p>Early independent benchmarks of <strong>Intel Arc B390</strong>, the new iGPU found in <strong>Core Ultra X9 388H</strong> processors, show a dramatic leap forward. The Arc B390 doesn’t just outperform Intel’s previous iGPUs &#8211; it <strong>comfortably beats AMD’s Radeon 890M</strong> and, in some scenarios, starts encroaching on <strong>low-power GeForce RTX 4050 Laptop GPUs</strong>.</p>



<p>That’s a sentence that simply wasn’t possible to write about Intel iGPUs a few years ago.</p>



<h2 class="wp-block-heading">Arc B390 vs Radeon 890M: A clear win for Intel</h2>



<p>According to ComputerBase, which tested the Arc B390 using LPDDR5X-9600 memory, Intel’s new integrated graphics show a decisive advantage over AMD’s Radeon 890M in 1080p gaming.</p>



<figure class="wp-block-image size-large"><img decoding="async" width="1024" height="526" src="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-comptuerbase-1024x526.webp" alt="Ryzen AI MAX+ Performance rating" class="wp-image-483" srcset="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-comptuerbase-1024x526.webp 1024w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-comptuerbase-300x154.webp 300w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-comptuerbase-768x394.webp 768w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-comptuerbase.webp 1143w" sizes="(max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">Source: ComputerBase</figcaption></figure>



<p>The most striking result comes from low-power operation. At 24–25 watts, Arc B390 was measured to be up to 63% faster than Radeon 890M. That’s not a marginal win &#8211; it’s a landslide, especially in a power envelope that matters most for thin-and-light laptops.</p>



<p>Just as important, Arc B390 maintains nearly the same performance <strong>on battery power</strong>, a long-standing weakness for mobile GPUs. Minimal performance drop unplugged makes a real difference for portable gaming and content creation.</p>



<h2 class="wp-block-heading">Closing the gap to discrete GPUs</h2>



<p>Notebookcheck places Arc B390 in the same general performance bracket as GeForce RTX 4050 Laptop GPUs configured for lower power limits.</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="734" src="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-NOTEBOOKCHECK-1024x734.webp" alt="UX8407AA Performance" class="wp-image-484" srcset="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-NOTEBOOKCHECK-1024x734.webp 1024w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-NOTEBOOKCHECK-300x215.webp 300w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-NOTEBOOKCHECK-768x550.webp 768w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-NOTEBOOKCHECK.webp 1238w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">Source: NotebookCheck</figcaption></figure>



<p>That’s a significant milestone. Intel’s iGPU now:</p>



<ul class="wp-block-list">
<li>clearly outperforms Arc 140T / 140V</li>



<li>decisively beats Radeon 890M</li>



<li>approaches entry-level discrete GPUs without requiring a dedicated GPU</li>
</ul>



<p>AMD’s Strix Halo (Ryzen AI Max+) iGPUs still lead the integrated graphics charts, but they do so at much higher power consumption, making the comparison less favorable for thin and efficient laptops.</p>



<h2 class="wp-block-heading">Independent tests back it up</h2>



<p>Additional testing from well-known hardware analyst The Phawx reinforces these findings. At matched power levels &#8211; particularly above 20 W &#8211; Arc B390 often outperforms the iGPU found in Ryzen AI 9 HX 370 across multiple games.</p>



<p>This consistency across independent testers suggests Arc B390’s gains aren’t benchmark anomalies. They’re architectural.</p>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="512" src="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-1024x512.webp" alt="ARC B390 Testing" class="wp-image-485" srcset="https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-1024x512.webp 1024w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-300x150.webp 300w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-768x384.webp 768w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-1536x768.webp 1536w, https://geeknify.com/wp-content/uploads/2026/02/ARC-B390-HARDWARE-CANUCKS-2048x1024.webp 2048w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /><figcaption class="wp-element-caption">Source: Hardware Canucks</figcaption></figure>



<p>Arc B390 changes the value equation for laptops.</p>



<p>If an integrated GPU can deliver playable 1080p performance while staying efficient and stable on battery, the need for budget discrete GPUs becomes far less obvious. That has implications not just for gamers, but for OEM laptop designs, thermals, cost, and battery life.</p>



<p>Intel isn’t just catching up here &#8211; it’s redefining what an iGPU is expected to do.</p>



<p>It’s worth noting that these results come from the top-tier Arc B390 configuration. Intel’s upcoming Panther Lake lineup will also include the Arc B370 and cut-down Xe3 configurations.</p>



<p>How far the performance scales down remains to be seen, but if even midrange variants retain a meaningful portion of B390’s gains, Intel’s integrated graphics lineup could become genuinely disruptive.</p>



<p>Arc B390 is the strongest integrated GPU Intel has ever shipped &#8211; by a wide margin. It beats AMD’s mainstream iGPU, challenges low-power discrete GPUs, and does so at laptop-friendly power levels.</p>



<p>For years, Intel graphics were a compromise. With Arc B390, they’re finally a selling point.</p>



<p>Sources: ComputerBase, Notebookcheck, The Phawx, Videocardz</p>
<p>The post <a href="https://geeknify.com/intel-arc-b390-igpu-crushes-radeon-890m-nearly-matches-rtx-4050-laptop/">Intel Arc B390 iGPU crushes Radeon 890M, nearly matches RTX 4050 Laptop</a> appeared first on <a href="https://geeknify.com">Geeknify</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
