<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	xmlns:media="http://search.yahoo.com/mrss/"
>

<channel>
	<title>HDR &#8211; Wade Tregaskis</title>
	<atom:link href="https://wadetregaskis.com/tags/hdr/feed/" rel="self" type="application/rss+xml" />
	<link>https://wadetregaskis.com</link>
	<description></description>
	<lastBuildDate>Sat, 02 May 2026 18:33:33 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://wadetregaskis.com/wp-content/uploads/2016/03/Stitch-512x512-1-256x256.png</url>
	<title>HDR &#8211; Wade Tregaskis</title>
	<link>https://wadetregaskis.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">226351702</site>	<item>
		<title>Asus ProArt PA32QCV</title>
		<link>https://wadetregaskis.com/asus-proart-pa32qcv/</link>
					<comments>https://wadetregaskis.com/asus-proart-pa32qcv/#respond</comments>
		
		<dc:creator><![CDATA[]]></dc:creator>
		<pubDate>Fri, 01 May 2026 20:38:54 +0000</pubDate>
				<category><![CDATA[Reviews]]></category>
		<category><![CDATA[5k display]]></category>
		<category><![CDATA[6k display]]></category>
		<category><![CDATA[Apple]]></category>
		<category><![CDATA[Asus ProArt 6K]]></category>
		<category><![CDATA[HDR]]></category>
		<category><![CDATA[LG UltraFine 5K]]></category>
		<category><![CDATA[PA32QCV]]></category>
		<category><![CDATA[SDR]]></category>
		<category><![CDATA[VESA mount]]></category>
		<guid isPermaLink="false">https://wadetregaskis.com/?p=8959</guid>

					<description><![CDATA[I&#8217;ll cut to the chase: it&#8217;s a nice resolution upgrade from an Apple or LG 5k display. But that&#8217;s about it &#8211; in every other visual respect (brightness, contrast, etc) it&#8217;s basically the same or marginally worse (see Matte-ugly). Though the built-in KVM is a nice addition. Resolution The extra resolution is significant &#8211; it&#8230; <a class="read-more-link" href="https://wadetregaskis.com/asus-proart-pa32qcv/" data-wpel-link="internal">Read more</a>]]></description>
										<content:encoded><![CDATA[
<p>I&#8217;ll cut to the chase: it&#8217;s a nice resolution upgrade from an Apple or LG 5k display. But that&#8217;s about it &#8211; in every other visual respect (brightness, contrast, etc) it&#8217;s basically the same or marginally worse (see <a href="#Matte-ugly">Matte-ugly</a>). Though the built-in KVM is a nice addition.</p>



<h3 class="wp-block-heading">Resolution</h3>



<p>The extra resolution is significant &#8211; it <em>is</em> 38% more pixels &#8211; and welcome, but it&#8217;s not revolutionary.  It feels like what <em>should</em> have just been the natural progression and not a big deal &#8211; in the same way we started with 9&#8243; sub-VGA displays and have over decades worked our way up to bigger and higher-resolution ones.</p>
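

<p>For the curious, the 38% figure falls straight out of the pixel counts &#8211; a quick back-of-envelope sketch (in Swift, and assuming the usual 6016⨉3384 resolution for this panel and 5120⨉2880 for the 5k displays):</p>



<pre class="wp-block-code"><code>// Rough arithmetic behind the "38% more pixels" figure.
// Assumes the usual panel resolutions: 6016 x 3384 (6k) vs 5120 x 2880 (5k).
let pixels6k = 6016 * 3384   // 20,358,144
let pixels5k = 5120 * 2880   // 14,745,600
let increase = Double(pixels6k) / Double(pixels5k) - 1.0
print(String(format: "%.0f%% more pixels", increase * 100))   // "38% more pixels"
</code></pre>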



<p>And in that vein, the prospect of downgrading back to a 5k display is immediately deeply unappealing (pay attention, Apple 😠). The notion of going down to a mere 4k display is absurd (tempting as bright[er] OLEDs are<sup data-fn="916ab8be-1381-43f2-8c81-0e5100a8df6a" class="fn"><a href="#916ab8be-1381-43f2-8c81-0e5100a8df6a" id="916ab8be-1381-43f2-8c81-0e5100a8df6a-link">1</a></sup>).</p>



<h3 class="wp-block-heading">Dimness (née Brightness)</h3>



<p>Even though it&#8217;s supposedly brighter than the LG UltraFine 5K it&#8217;s replacing for me &#8211; at least at peak, given its DisplayHDR 600 rating &#8211; it really isn&#8217;t. For everyday work (coding etc) I had the LG set at 50% brightness (so nominally 250 nits, given its 500 peak rating) but the equivalent brightness requires ~75% on the Asus. Which actually fits if you take <a href="https://www.bhphotovideo.com/lit_files/1239784.pdf" data-wpel-link="external" target="_blank" rel="external noopener">the Asus&#8217;s manual</a> (<em>not</em> <a href="https://www.asus.com/us/displays-desktops/monitors/proart/proart-display-6k-pa32qcv/techspec/" data-wpel-link="external" target="_blank" rel="external noopener">its tech specs</a>) at its word of a 350 nominal max brightness, since 250 nits is roughly 75% of 350.</p>
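

<p>The arithmetic, roughly &#8211; assuming, as a simplification, that both displays&#8217; brightness controls scale approximately linearly in nits:</p>



<pre class="wp-block-code"><code>// Rough sanity check on the brightness settings.  Assumes both displays' brightness
// controls are approximately linear in nits, which may well not be exactly true.
let lgNits          = 0.50 * 500.0   // 250 nits at 50% on the LG UltraFine 5K
let asusPerManual   = 0.75 * 350.0   // ~262 nits at 75%, if the manual's 350-nit maximum is right
let asusPerTechSpec = 0.75 * 400.0   // 300 nits if the tech specs' 400 were right - noticeably more than the LG's 250
</code></pre>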



<p>So that&#8217;s disappointing.  The LG UltraFine 5K was <em>sort of</em> bright for its day, but that day was nearly a decade ago.  We now have cheap laptops with 1000-nit displays, so 500 is unequivocally dim now.</p>



<p>Not that I expected much else &#8211; while <a href="https://wadetregaskis.com/6k-display-comparison/#f5a790c8-9793-46bf-800d-a9c1dbe707d2" data-wpel-link="internal">Asus can&#8217;t seem to agree with themselves over the actual peak brightness</a>, they never claim more than 400, so going in I suspected it would be dim.  What I didn&#8217;t expect was that the claims of <a href="https://www.pcworld.com/article/2873124/asus-proart-pa32qcv-review.html" data-wpel-link="external" target="_blank" rel="external noopener">multiple</a> <a href="https://www.tomshardware.com/monitors/asus-proart-pa32qcv-32-inch-6k-professional-monitor-review#:~:text=I%20measured%20almost%20650%20nits%20in%20my%20tests%2C%20and%20that%20was%20from%20both%20full%20field%20and%20window%20patterns" data-wpel-link="external" target="_blank" rel="external noopener">reviewers</a>, that it&#8217;s actually closer to 700, would turn out to be completely false.</p>



<p>The <em>one</em> thing that makes a difference in practice, in the Asus&#8217;s favour over some older displays like the LG, is that it supports HDR mode.  So you <em>can</em> in practice get noticeably higher brightness in image &amp; video editing without having to blind yourself<sup data-fn="3a434a32-fb6d-4f48-a6b2-6a79e0efbd71" class="fn"><a href="#3a434a32-fb6d-4f48-a6b2-6a79e0efbd71" id="3a434a32-fb6d-4f48-a6b2-6a79e0efbd71-link">2</a></sup>.</p>



<h3 class="wp-block-heading">Matte-ugly</h3>



<p>I do not like the matte finish.  Compared to the LG UltraFine 5K that I was previously using, the PA32QCV has lower overall contrast (it&#8217;s more washed-out looking), and visible &#8216;shimmer&#8217; or &#8216;sparkle&#8217; &#8211; basically fine luminance noise that changes as your viewing position changes (even just slightly). It makes the screen look a bit dirty, too.</p>



<p>In a nutshell, contrast is lower than it should be, and text just doesn&#8217;t have <em>quite</em> the same clarity it does on other displays (like Apple&#8217;s, or the LG UltraFine 5k).</p>



<p>I&#8217;ve not yet decided if it&#8217;s a deal-breaker… there are no good glossy <a href="https://wadetregaskis.com/6k-display-comparison/" data-type="post" data-id="8747" data-wpel-link="internal">6k display options</a> currently. I&#8217;m <em>hoping</em> that I&#8217;ll just get used to it. But there&#8217;s no mistaking that the screen&#8217;s finish is notably worse than that of its predecessors and its Apple contemporaries. <em>Especially</em> for bright content (folks using Dark Mode might not be affected so much).</p>



<div class="wp-block-group"><div class="wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained">
<p>A little digression:  I still [vaguely] remember when Apple introduced the first glossy screens (as an option, not the default) in 2006, and then the storm in a teacup when they made glossy the <em>default</em> (but matte still an option) in 2008 MacBooks (and glossy the <em>only</em> option for the 2008 Cinema Displays).  Back then, I was against the glossy displays &#8211; not zealously, but I didn&#8217;t understand why you&#8217;d want a display that was basically the same except for the notable addition of annoying reflections.</p>



<p>In retrospect, I think the key difference was Retina. When your pixels are the size of boulders (pre-Retina), a bit of blurring or fine-resolution grain is largely irrelevant because it&#8217;s so small compared to the pixels themselves. Indeed, I can&#8217;t find <em>any</em> mention of sparkling or blur in contemporary writings of that time &#8211; all the discussion centres foremost on reflections and (sometimes) the possibility of higher macro-contrast (deeper blacks, primarily).</p>



<p>But when your pixels are <em>also</em> small &#8211; approaching the size of the speckling &#8211; suddenly it matters, because an entire pixel can be obscured or corrupted by the matte finish&#8217;s &#8220;sparkles&#8221; or blur.  Your eyes (or brain) can no longer apply an optical &#8220;low pass filter&#8221; to ignore the matte&#8217;s effects.</p>



<p>On the &#8216;upside&#8217;, as I age my eyes continue to degrade, so eventually I&#8217;ll no longer be able to see the sparkles. 😆😐😞</p>
</div></div>



<h3 class="wp-block-heading">VESA mounting</h3>



<p>I was perplexed when I first opened the box, and found what looked like a proprietary mounting system, with only a stand included, not a VESA mount adapter.  The box contains basically no instructions or explanation of anything, and even the manual &#8211; dug up through <a href="https://www.manualslib.com/manual/3962648/Asus-Proart-Pa32qcv.html" data-wpel-link="external" target="_blank" rel="external noopener">manualslib</a> online because Asus&#8217;s website contains only broken links to it (though I later noticed that <a href="https://www.bhphotovideo.com/lit_files/1239784.pdf" data-wpel-link="external" target="_blank" rel="external noopener">B&amp;H host it too</a>) &#8211; has no real information on how to VESA mount the display, other than cryptically stating that you need a &#8220;VESA Wall Mount Adapter (sold separately)&#8221;.</p>



<p>Thankfully, <a href="https://www.reddit.com/user/Itchy_Pin9813/" data-wpel-link="external" target="_blank" rel="external noopener">Itchy_Pin9813</a> on <a href="https://www.reddit.com/r/ASUS/comments/1kyxnpu/how_do_i_use_a_third_party_vesa_stand_with_my/" data-wpel-link="external" target="_blank" rel="external noopener">Reddit</a> had already figured it out &#8211; the four screws that <em>look</em> like they might be structural are actually just weird placeholders.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img fetchpriority="high" decoding="async" width="2048" height="1536" src="https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws.avif.avif" alt="" class="wp-image-8961" srcset="https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws-1024x768@2x.avif.avif 2048w, https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws-256x192.avif 256w, https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws-1024x768.avif 1024w, https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws-768x576.avif 768w, https://wadetregaskis.com/wp-content/uploads/2026/05/before-removing-the-built-in-screws-256x192@2x.avif.avif 512w" sizes="(max-width: 2048px) 100vw, 2048px" /></figure>
</div>


<p>You just remove them and then screw in any standard 100⨉100 VESA mount.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img decoding="async" width="2048" height="1536" src="https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate.avif.avif" alt="" class="wp-image-8962" srcset="https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate-1024x768@2x.avif.avif 2048w, https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate-256x192.avif 256w, https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate-1024x768.avif 1024w, https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate-768x576.avif 768w, https://wadetregaskis.com/wp-content/uploads/2026/05/after-attaching-a-vesa-mount-plate-256x192@2x.avif.avif 512w" sizes="(max-width: 2048px) 100vw, 2048px" /></figure>
</div>


<p>However, be aware that the mounting point is recessed significantly. Though the plate itself fit in the recessed area just fine (as shown above), I was only <em>barely</em> able to get my mount plate attached to the arm itself, without the arm hitting the surrounding casing. I did notice that <a href="https://www.amazon.com/VG27AQ3A-Compatible-VG27AQM5A-VG277Q1A-VG27WQ3B/dp/B0FM81CJ6F" data-wpel-link="external" target="_blank" rel="external noopener">you can buy adapters specifically designed to address this design flaw</a>.</p>



<h3 class="wp-block-heading">BSOD</h3>



<p>When the display detects no input video signal, it displays a <em>hideously</em>-coloured bright blue screen, reminiscent of Windows&#8217; Blue Screen of Death.</p>



<p>Which would be fine &#8211; generally you won&#8217;t see that, unless something&#8217;s genuinely gone wrong with your cables or computer &#8211; except that it sometimes flashes into this mode when your Mac goes to sleep or wakes the display back up. It&#8217;s jarring and ugly.</p>



<p>This isn&#8217;t the first display to have this design flaw, but I just cannot fathom how, after decades of experience in displays all around us in the real world, someone somewhere inside Asus <em>still</em> thought it&#8217;d be a good idea to do this instead of just displaying a black screen (optionally with calm grey &#8220;No Input Signal&#8221; text on it).</p>



<h3 class="wp-block-heading">KVM</h3>



<p>Having a built-in KVM <em>should be nice</em> &#8211; I have my personal Mac and [sometimes] my work laptop connected, and previously I was manually moving the Thunderbolt cable between them (like a cave man! 😜).  Which wasn&#8217;t a big deal per se, but I do worry about frequently plugging and unplugging a Thunderbolt cable &#8211; those USB-C-style connectors aren&#8217;t infinitely durable.  And Thunderbolt is a pretty demanding protocol that I suspect doesn&#8217;t tolerate electrical flaws well.  And quality Thunderbolt replacement cables aren&#8217;t cheap.</p>



<p>So, KVM, great!</p>



<p>Except… the implementation in the Asus is pretty clunky. There&#8217;s a dedicated &#8220;Input Source Switch&#8221; button right on the front panel, which <em>should</em> be perfect &#8211; except it often doesn&#8217;t work. If your Mac(s) are set to turn the display off after some period of idleness, they&#8217;ll stop sending a video signal to the display, and the display then considers them non-existent. So the &#8220;Input Source Switch&#8221; button only ever works in the brief period after a previous switch, when the prior Mac is still sending a video signal to the display.</p>



<p>Disabling display auto-off on all your computers is one option, but not a great one &#8211; sometimes I&#8217;m called away abruptly, potentially for many hours, and I don&#8217;t want the display sitting there wasting power and burning itself in. A screen saver might help with the burn-in aspect, at least, but not the power (and keep in mind this display is rated to about 50W (not counting USB &amp; Thunderbolt power), which is too much to waste).</p>



<p>If you have your keyboard &amp; mouse connected through the display &#8211; in order to make intended use of the KVM functionality &#8211; then you also run into the problem that if the Mac has gone into screen off mode, the Asus display turns off all the USB devices too.  So you can&#8217;t wake your Mac from the keyboard or mouse.  Nor switch input sources.</p>



<p>So, I&#8217;m having to resort to hitting the power button on my Mac Studio, and <em>opening the lid</em> on the MacBook Pro, in order to wake them up.  Or, I discovered that you can &#8216;force&#8217; switch input sources &#8211; which will wake the connected Mac &#8211; through the display&#8217;s on-screen menus.  But that&#8217;s quite a few clicks and nudges of its joystick &#8211; and only makes it more baffling and frustrating that the dedicated &#8220;Input Source Switch&#8221; button doesn&#8217;t just work.</p>



<p>I wish there were a configuration option to (a) never power down the USB devices and (b) just tell the display that some inputs (Thunderbolt, DisplayPort, and/or HDMI) are <em>always</em> connected, whether it&#8217;s receiving a video signal currently or not.  Clearly the display <em>can</em> wake up connected Macs &#8211; it does so when you select their input source deep in its settings.</p>



<h3 class="wp-block-heading">Picture-in-Picture (PIP) / Picture-beside-Picture (PBP)</h3>



<p>I&#8217;m not sure if I&#8217;d ultimately use these &#8211; I hate the PIP &#8220;feature&#8221; of YouTube and some Apple apps, for example, and don&#8217;t want to mess with display resolution and aspect ratios for any of my connected computers &#8211; but it turns out I can&#8217;t, anyway.</p>



<p>The caveats are buried deep in the manual:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>To active this function [PIP / PBP], you need to do the following: turn off <strong>MediaSync</strong> / <strong>Dynamic Dimming</strong> and disable HDR on your device.</p>
</blockquote>



<p>I don&#8217;t want MediaSync or Dynamic Dimming anyway (see the <a href="#Configuration_recommendations" data-type="internal" data-id="#Configuration-recommendations">Configuration recommendations</a> section below for why), but I <em>do</em> want HDR mode enabled.  Having to disable HDR mode is far too great a sacrifice, for a feature that&#8217;s arguably just a gimmick anyway.</p>



<h3 class="wp-block-heading">10-bit colour lies</h3>



<p>The Asus (like most displays these days) is marketed as having 10-bit colour.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img decoding="async" width="1013" height="521" src="https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support.avif.avif" alt="Screenshot of part of the product page on Asus's website for the PA32QCV display, showing their claim that it supports 10-bit colour" class="wp-image-8984" srcset="https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support.avif 1013w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support-256x132.avif 256w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support-768x395.avif 768w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support@2x.avif 2026w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-pa32qcv-product-page-claiming-10-bit-colour-support-256x132@2x.avif.avif 512w" sizes="(max-width: 1013px) 100vw, 1013px" /></figure>
</div>


<p><strong>But it doesn&#8217;t.</strong></p>



<p>The lie is revealed only in the back pages of the user manual (the one that doesn&#8217;t come with it in the box, nor is accessible from Asus&#8217;s website).  This display, like so many others, is actually an 8-bit panel that uses FRC (Frame Rate Control) i.e. temporal dithering: the monitor accepts a 10-bit signal but can only set the actual pixels to 8-bit precision, so it oscillates back and forth between adjacent 8-bit values in order to approximate the desired 10-bit value, over time.</p>
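

<p>For anyone unfamiliar with FRC, here&#8217;s a toy sketch of the principle &#8211; real panels use much smarter spatio-temporal patterns, and I have no insight into what this particular panel&#8217;s controller actually does:</p>



<pre class="wp-block-code"><code>// Toy illustration of FRC (temporal dithering): an 8-bit panel approximating a
// 10-bit level by alternating between the two nearest 8-bit codes across frames.
func frcFrames(tenBitLevel: Int, frameCount: Int) -> [Int] {
    let ideal = Double(tenBitLevel) / 4.0   // e.g. 10-bit 601 wants the impossible 8-bit "150.25"
    var shown = 0                           // running total of the 8-bit codes emitted so far
    return (1 ... frameCount).map { frame in
        let code = Int((Double(frame) * ideal).rounded(.down)) - shown
        shown += code
        return code
    }
}

// frcFrames(tenBitLevel: 601, frameCount: 4) gives [150, 150, 150, 151], which
// averages to 150.25 - i.e. the requested 10-bit value, but only over time.
</code></pre>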



<p>I don&#8217;t actually know how much that matters &#8211; I can&#8217;t say I&#8217;ve had complaints about the LG UltraFine 5K w.r.t. banding or other 8-bit artefacts, and it was also an 8-bit display with FRC.</p>



<p>Still, I&#8217;m not happy with Asus basically lying in their marketing material &#8211; and tech specs, which are <em>usually</em> one place you can get to the truth.</p>



<h3 class="wp-block-heading">Subtler effects</h3>



<p>It might take more time to appreciate some of the more subtle differences, vs the LG UltraFine 5K at least.</p>



<h4 class="wp-block-heading">HDR Mode</h4>



<p>The ability to use HDR mode &#8211; meaning I can set my &#8216;regular&#8217; GUI brightness to what&#8217;s comfortable <em>without</em> [artificially] limiting the brightness of imagery &#8211; might reveal itself as kind of a big deal, with further use, but since the additional brightness is pretty minor (in a daylight-lit room, at least) I don&#8217;t know yet.</p>



<h4 class="wp-block-heading">Colour accuracy</h4>



<p>I also haven&#8217;t actually checked the colour accuracy yet.  Asus do include a basic printed calibration report in the box, from their factory calibration, which is nice and hopefully not just performative.  I do have a colour meter (an old ColorMunki), but going through the process is frankly frustrating and tedious, and I&#8217;m rarely actually happy with the results (more accurate is not the same as <em>better looking</em>).  I can at least say that the colour looks fine (once I dialled in better settings than the defaults &#8211; see below) and similar enough to the other displays in my life that I have no complaints.</p>


<div class="wp-block-image">
<figure class="aligncenter size-full"><img loading="lazy" decoding="async" width="4604" height="5666" src="https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report.avif" alt="" class="wp-image-8970" srcset="https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report.avif 4604w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-208x256.avif 208w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-832x1024.avif 832w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-768x945.avif 768w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-1664x2048.avif 1664w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-208x256@2x.avif.avif 416w, https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-display-pa32qcv-example-factory-calibration-report-1664x2048@2x.avif.avif 3328w" sizes="auto, (max-width: 4604px) 100vw, 4604px" /></figure>
</div>


<h3 class="wp-block-heading">Configuration recommendations</h3>



<ul class="wp-block-list">
<li>In macOS Settings:
<ul class="wp-block-list">
<li>Enable &#8220;High Dynamic Range&#8221; for the display (in the &#8220;Displays&#8221; pane).  This does two things:
<ul class="wp-block-list">
<li>It enables macOS brightness control &#8211; the brightness slider appears in System Settings, and the keyboard shortcuts to control brightness will then work.<br><br>Note that, conversely, it basically prevents you adjusting the display&#8217;s brightness from the display&#8217;s own controls (you become limited to &#8220;MAX&#8221; and 250).</li>



<li>It allows HDR content to use the full luminance range of the display irrespective of the brightness setting. i.e. you can adjust the &#8220;regular&#8221; brightness of the GUI independent of the HDR image &amp; video brightness. Note, however, that there doesn&#8217;t seem to be a way to control the brightness of HDR content.<br><br>Keep in mind, though, that the PA32QCV is not a bright display. It&#8217;s rated to <em>up to</em> 600 nits, and even that&#8217;s probably only for small patches of highlights or for brief time periods, which is really quite low &#8211; it&#8217;s nothing like a &#8220;true&#8221; HDR display such as Apple&#8217;s MacBook Pro displays, Apple&#8217;s Pro Display XDR, or the <a href="https://www.asus.com/us/displays-desktops/monitors/proart/proart-display-8k-pa32kcx/" data-wpel-link="external" target="_blank" rel="external noopener">Asus ProArt 8K</a>, that have <em>sustained</em> maximum brightness of at least 1,000 nits (2⨉ brighter), and generally peak much higher.  Even iPhones are brighter<sup data-fn="4f676ac3-d1e3-4bd9-a8fd-ea450633f0b0" class="fn"><a href="#4f676ac3-d1e3-4bd9-a8fd-ea450633f0b0" id="4f676ac3-d1e3-4bd9-a8fd-ea450633f0b0-link">3</a></sup> (and have <em>much</em> better dynamic range, being OLEDs).</li>
</ul>
</li>
</ul>
</li>



<li>In the display&#8217;s settings:
<ul class="wp-block-list">
<li><strong>Settings > Dynamic Dimming</strong> should be <em>OFF</em>. When it&#8217;s on the display adjusts the brightness over time in response to the image shown, but <em>very</em> slowly such that you can see it ramping up or down lazily after an average brightness change (no matter what you tweak its sub-settings to). It&#8217;s just <em>horrible</em> for video creation since it&#8217;s seriously messing with your luminance and animation. And I&#8217;m not even sure it&#8217;s a good idea when merely watching video, since the brightness transitions are noticeable and distracting.<br><br>If you never view animated content, then I suppose leaving it on <em>could</em> be beneficial since its purpose is ostensibly to give you greater static range &#8211; when the screen overall is dim, such as editing a dark photo, it will reduce the backlight in order to darken the blacks, while conversely in a bright image it&#8217;ll boost the backlight to give you maximum brightness (but at the expense of washing out the blacks completely).<br><br>But, it really doesn&#8217;t do much.  Yes, I can see the effect &#8211; it does make the blacks a little bit darker when the image is overall quite dark &#8211; but it&#8217;s very subtle and not remotely worth the visual artefacts it introduces.</li>



<li><strong>Palette &gt; Brightness</strong> must be set to &#8220;MAX&#8221; when macOS is using HDR mode, otherwise you&#8217;ll be limited to 250 nits even for HDR content! If macOS is not set to HDR mode then it sets the brightness in nits, from 0 to 400.<br><br>It&#8217;s pleasing to see a scale that&#8217;s in real units, not just some arbitrary scale like 0 to 100%. Or at least, I&#8217;m assuming that&#8217;s the case &#8211; the scale going up to 400 (an otherwise odd, arbitrary number) while the display&#8217;s nominal peak [SDR] brightness is 400<sup data-fn="9ff621ae-0e0c-4594-845f-390ba6ebf43d" class="fn"><a href="#9ff621ae-0e0c-4594-845f-390ba6ebf43d" id="9ff621ae-0e0c-4594-845f-390ba6ebf43d-link">4</a></sup> seems like an unlikely coincidence.</li>



<li><strong>Palette &gt; Black Level &gt; Signal</strong> should be left at its default, 50.  Changing this basically changes the luminance curve of the display &#8211; lowering it pulls down the brightness, crushing the shadows in particular, while raising it increases the brightness, washing out the shadows.  It has no apparent effect on actual black levels (nor, surprisingly, does its peer &#8220;Backlight&#8221; setting, which seems strange because surely that&#8217;s the point of it?).</li>



<li>If you&#8217;re <em>creating</em> HDR content, in the display&#8217;s settings, set <strong>Preset > HDR</strong> to &#8220;PQ Clip&#8221;.<br><br>If you&#8217;re <em>viewing</em> HDR content, use &#8220;PQ Optimized&#8221;.<br><br>Yes, this might be something you have to change frequently, because there&#8217;s no happy medium. 😔<br><br>Under the default setting, &#8220;PQ Optimized&#8221;, the display futzes with the image to make the display&#8217;s brightness limit less noticeable &#8211; it &#8220;smoothes&#8221; out the approach to the maximum brightness by making the too-bright parts darker (to prevent clipping) <em>and</em> the nearly-too-bright parts <em>brighter</em>. This provides a <em>pleasing</em> but highly inaccurate effect &#8211; you don&#8217;t see stark clipping as easily, and the image overall looks bright, but you&#8217;re seriously changing the localised luminance of the image. If you edit a photo or video this way and then put it on another display, you may be dismayed to find it looks <em>completely</em> different, luminance- and contrast-wise.<br><br>This is a consequence, I infer, of how macOS renders HDR content. In SDR mode, macOS just directly maps the input image&#8217;s dynamic range to the display&#8217;s &#8211; 100% in the image goes to the display as 100%. But in HDR mode it seems like it&#8217;s basically ignoring your display&#8217;s capabilities and working in <em>absolute</em> brightness. So if the input image says it is 2,000 nits, macOS emits pixels with that nominal brightness. Which may be <em>way</em> beyond what the display can handle, so they just get clipped to its max brightness (or artificially adjusted by the display &#8211; as in &#8220;PQ Optimized&#8221; and &#8220;PQ Basic&#8221; modes).<br><br>And (for completeness) the &#8220;PQ Basic&#8221; setting is, I think, doing just the darkening part of &#8220;PQ Optimized&#8221;, which in a nutshell means it looks like &#8220;PQ Optimized&#8221; but dimmer overall. I&#8217;m not sure what use that is.</li>



<li><strong>Settings &gt; PowerSaving</strong> can be set to &#8220;Deep Level&#8221; (the default, for &#8220;Energy Saver&#8221; mode) <em>iff</em> you have macOS set to HDR mode.  Otherwise, it limits the maximum brightness severely.<br><br>Of course, I&#8217;m not yet sure what &#8220;Deep Level&#8221; <em>does</em> in HDR mode &#8211; possibly nothing.  But I&#8217;m hoping it just means the display uses less power when not actually active.</li>
</ul>
</li>
</ul>
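

<p>Regarding the &#8220;PQ Clip&#8221; vs &#8220;PQ Optimized&#8221; point above, here&#8217;s a very rough sketch of the distinction as I understand it &#8211; I don&#8217;t know the actual curves involved, so the roll-off below is purely illustrative, not what the display really does:</p>



<pre class="wp-block-code"><code>// Illustrative only: hard clipping vs a generic highlight roll-off, for a panel
// that tops out around 600 nits while macOS sends absolute (PQ) luminance.
let displayMaxNits = 600.0

// "PQ Clip", roughly: reproduce the requested luminance faithfully until the
// panel runs out, then clip.  Accurate below the limit, abrupt above it.
func clipped(requestedNits: Double) -> Double {
    min(requestedNits, displayMaxNits)
}

// A generic soft roll-off in the spirit of "PQ Optimized": start compressing below
// the panel's limit (the knee), so over-bright content is darkened and squeezed
// into the remaining headroom rather than clipping abruptly.  The display's real
// behaviour (which also brightens nearly-too-bright regions) is undocumented.
func rolledOff(requestedNits: Double, kneeNits: Double = 400.0) -> Double {
    if requestedNits.isLess(than: kneeNits) {
        return requestedNits                     // accurate in the low and mid range
    }
    let excess = requestedNits - kneeNits        // how far past the knee the request is
    let headroom = displayMaxNits - kneeNits     // what the panel has left to give
    return kneeNits + headroom * (1.0 - 1.0 / (1.0 + excess / headroom))   // asymptotic to 600
}

// clipped(requestedNits: 2000)     // 600.0 - everything above 600 flattens to the same white
// rolledOff(requestedNits: 2000)   // ~578  - dimmer, but 1,000-nit and 2,000-nit content still differ
</code></pre>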


<ol class="wp-block-footnotes"><li id="916ab8be-1381-43f2-8c81-0e5100a8df6a">This ties into my choice to get this 6k display, here and now.  I agonised over the decision for a long time.  I <em>was</em> pretty much resigned to just paying some obscene amount of money for the Apple Pro Display XDR 2 &#8211; on the assumption that it&#8217;d be a nice modest upgrade with more backlight zones and a brightness boost, which turned out to be correct but alas not the whole story &#8211; but once Apple publicised that they were killing the XDR entirely, <a href="https://wadetregaskis.com/studio-display-xdr-vs-pro-display-xdr/" data-type="post" data-id="8738" data-wpel-link="internal">it left me in the doldrums</a>.<br><br>I considered many options &#8211; including buying no-name-brand ones from China &#8211; but ultimately whittled it down to two possibilities:  the Asus ProArt 8K, or the 6K.  The Asus ProArt 8K is Apple-level expensive, and at 32&#8243; its pixel density is far too high to actually reap noticeable benefit from the extra pixels over the 6K, but it otherwise checks the boxes.  I spent a long time trying to convince myself it wasn&#8217;t insane to spend $9,000 on a display &#8211; keeping in mind that the original Apple Cinema Display was $4k at time of release in 1999, which is nearly $8k in 2026 dollars, and the Sony Trinitron displays were <em>thousands</em> of dollars too, in the 1990s… I also tried to reason that a display <em>should</em> last basically forever, and remain useful for decades, so what&#8217;s $9k over the rest of my life? 😅<br><br>But in the end I thought… why?  Even that $9k 8K display isn&#8217;t the best at everything.  It&#8217;s not the brightest, it doesn&#8217;t have the biggest colour gamut, it doesn&#8217;t have the best contrast ratio &#8211; it&#8217;s not even the prettiest… if I&#8217;m going to spend eye-watering amounts of money, I want to at least feel like it&#8217;s <em>worth it</em>.<br><br>So, I went with the cheapest option instead, on the assumption that I&#8217;ll revisit my display situation in a few more years. <a href="#916ab8be-1381-43f2-8c81-0e5100a8df6a-link" aria-label="Jump to footnote reference 1">↩︎</a></li><li id="3a434a32-fb6d-4f48-a6b2-6a79e0efbd71">On &#8220;non-HDR&#8221; displays &#8211; specifically, where macOS doesn&#8217;t let you use the &#8220;HDR&#8221; option in the display settings &#8211; your images &amp; video can only be displayed as bright as the current brightness setting &#8211; which is typically <em>not</em> the maximum brightness the display can manage, because if you raise the display to maximum brightness in order to get the full dynamic range available, then all your regular windows &#8211; white-backed webpages, TextEdit &amp; Xcode documents, etc &#8211; become blindingly bright. <a href="#3a434a32-fb6d-4f48-a6b2-6a79e0efbd71-link" aria-label="Jump to footnote reference 2">↩︎</a></li><li id="4f676ac3-d1e3-4bd9-a8fd-ea450633f0b0">Well, maybe… <a href="https://www.apple.com/iphone-17-pro/specs/#:~:text=3000%C2%A0nits%20peak%20brightness%20(outdoor)" data-wpel-link="external" target="_blank" rel="external noopener">the iPhone 17 Pro might be <em>rated</em> at [up to] 3,000 nits</a>, but in reality it can&#8217;t even handle the display being on <em>at all</em> sometimes, such as if exposed to sunlight.  
I have a torch (flashlight, Americans), a <a href="https://www.youtube.com/watch?v=KHgstunB5fE" data-wpel-link="external" target="_blank" rel="external noopener">Google Firesword</a>, that&#8217;s spec&#8217;d as 3,000 lumens &#8211; just 875 nits &#8211; and it&#8217;s <em>way</em> brighter than the iPhone screen ever is.  <em>Way</em> brighter.<br><br>Yes, there&#8217;s a significant difference in emitter area between a torch and an iPhone&#8217;s display, but even just considering how much it lightens the room it&#8217;s in, the Firesword <em>easily</em> wins against any iPhone.  And any display I&#8217;ve owned. <a href="#4f676ac3-d1e3-4bd9-a8fd-ea450633f0b0-link" aria-label="Jump to footnote reference 3">↩︎</a></li><li id="9ff621ae-0e0c-4594-845f-390ba6ebf43d">Well, maybe.  As noted, while <a href="https://www.asus.com/us/displays-desktops/monitors/proart/proart-display-6k-pa32qcv/techspec/" data-wpel-link="external" target="_blank" rel="external noopener">the tech specs</a> claim 400, <a href="https://www.bhphotovideo.com/lit_files/1239784.pdf" data-wpel-link="external" target="_blank" rel="external noopener">the manual</a> says 350, and it&#8217;s my guess &#8211; based on comparison with other displays and referencing their rated peak brightnesses &#8211; that the truth is much closer to 350 than 400. <a href="#9ff621ae-0e0c-4594-845f-390ba6ebf43d-link" aria-label="Jump to footnote reference 4">↩︎</a></li></ol>]]></content:encoded>
					
					<wfw:commentRss>https://wadetregaskis.com/asus-proart-pa32qcv/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			<media:content url="https://wadetregaskis.com/wp-content/uploads/2026/05/asus-proart-6k-pa32qcv-2048x2048.avif.avif" medium="image" />
<post-id xmlns="com-wordpress:feed-additions:1">8959</post-id>	</item>
		<item>
		<title>Lightroom &#8220;Classic&#8221; doesn&#8217;t play well with others</title>
		<link>https://wadetregaskis.com/lightroom-classic-doesnt-play-well-with-others/</link>
					<comments>https://wadetregaskis.com/lightroom-classic-doesnt-play-well-with-others/#respond</comments>
		
		<dc:creator><![CDATA[]]></dc:creator>
		<pubDate>Sat, 21 Oct 2017 16:16:57 +0000</pubDate>
				<category><![CDATA[Photography]]></category>
		<category><![CDATA[Broken by design]]></category>
		<category><![CDATA[Bugs!]]></category>
		<category><![CDATA[HDR]]></category>
		<category><![CDATA[Lightroom]]></category>
		<category><![CDATA[performance]]></category>
		<category><![CDATA[Time Machine]]></category>
		<guid isPermaLink="false">https://blog.wadetregaskis.com/?p=3972</guid>

					<description><![CDATA[So far the new &#8220;Classic&#8221; Lightroom looks &#38; feels mostly identical to the prior version(s), which isn&#8217;t really a compliment, but could be worse. &#160;There&#8217;s no apparent performance improvements, that&#8217;s for sure, so as expected Adobe&#8217;s promises to suddenly learn how to write efficient &#38; performant software, well… at least their marketing department gave it&#8230; <a class="read-more-link" href="https://wadetregaskis.com/lightroom-classic-doesnt-play-well-with-others/" data-wpel-link="internal">Read more</a>]]></description>
										<content:encoded><![CDATA[
<p>So far the new &#8220;Classic&#8221; Lightroom looks &amp; feels mostly identical to the prior version(s), which isn&#8217;t really a compliment, but could be worse. &nbsp;There are no apparent performance improvements, that&#8217;s for sure, so, as expected, Adobe&#8217;s promises to suddenly learn how to write efficient &amp; performant software came to nothing &#8211; well… at least their marketing department gave it the college try.</p>



<p>One thing I have very quickly discovered, however, is that Lightroom &#8220;Classic&#8221;&nbsp;<em>deliberately</em> chooses not to perform some functions if it is le tired. &nbsp;Or it thinks your computer is le tired. &nbsp;By which I mean, if there is pretty much&nbsp;<em>anything</em> else running and consuming CPU time (and/or RAM?), it refuses to even attempt some operations. &nbsp;HDR merging is the first one I hit. &nbsp;I was a bit flummoxed by it just happily queuing up a number of HDR merge operations, and them just sitting there in its queue, with no indication of error &#8211; just never executing.</p>



<p>Only after I quit or disabled a bunch of other processes &#8211; any and all that were using any measurable CPU time &#8211; did it finally, about ten seconds later, decide that it was now willing to consider my &#8216;requests&#8217;.</p>



<p>#%@!ing fussy little turd.</p>



<p>It&#8217;s worth noting that it&#8217;s not the only popular app on macOS that does this same bullshit. &nbsp;Time Machine is another big one. &nbsp;At least in Time Machine&#8217;s case I can see a more plausible line of reasoning behind it, even if it is misguided &#8211; the user&#8217;s&nbsp;<em>probably</em> not explicitly waiting for a Time Machine backup to complete. &nbsp;As in, not all the time. &nbsp;Sometimes they are. And they certainly expect backups to&nbsp;<em>happen at all</em>, which on a consistently busy machine they simply&nbsp;<em>don&#8217;t</em>. &nbsp;So Time Machine&#8217;s reluctance to function on a working machine is still stupid overall. &nbsp;But Lightroom refusing to complete a&nbsp;<em>user-initiated, user-interactive, and user-blocking</em> operation is just patently stupid by its very nature.</p>



<p><strong>Update</strong>:  Worse, now it doesn&#8217;t work <em>at all</em>.  And a quick web search shows <a href="https://web.archive.org/web/20200805043215/https://feedback.photoshop.com/photoshop_family/topics/lightroom-classic-cc-photo-merge-not-working-on-mac" data-wpel-link="external" target="_blank" rel="external noopener">many</a> <a href="https://web.archive.org/web/20190604155342/https://feedback.photoshop.com/photoshop_family/topics/merge-to-hdr-simply-doesnt-work" data-wpel-link="external" target="_blank" rel="external noopener">other people</a> having the same problem, and Adobe as usual doing nothing about it.</p>



<p>Incidentally, I tried to log in to Adobe&#8217;s forums in order to &#8216;Me too&#8217; those issues, only it won&#8217;t let me log in anymore, falsely claiming my password is invalid. &nbsp;Good job, Adobe, good job.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://wadetregaskis.com/lightroom-classic-doesnt-play-well-with-others/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3972</post-id>	</item>
	</channel>
</rss>
