
Search results for 'Technology' - Page: 9
ITBrief - 24 Sep (ITBrief) Dragos launches Platform 3.0, featuring AI tools and Insights Hub to boost rapid cyber defence for industrial operational technology environments. Newslink ©2025
PC World - 24 Sep (PC World) Ever since version 24H2 of Windows 11 was rolled out to the public last autumn, there have been persistent problems with some webcams, which, among other things, made it difficult to use facial recognition to log in with Windows Hello.
To protect users, Microsoft chose back in October 2024 to stop the upgrade to 24H2 on affected devices.
However, Bleeping Computer now reports that Microsoft has finally managed to find a solution to the problem, which means that the blocking of Windows 11 24H2 is now lifted.
And just a week ago, users of certain headphones and speakers were also given access to 24H2, after Microsoft’s developers managed to fix a Bluetooth bug.
With the new fixes, “only” three serious bugs remain with Windows 11 24H2, namely a problem with third-party wallpaper software, as well as incompatible drivers for Intel Smart Sound Technology (SST) and Senseshield Technology.
ITBrief - 23 Sep (ITBrief) Lancom Technology launches AI services in Australia and New Zealand to help firms adopt AI safely, managing risks and boosting business growth.
PC World - 23 Sep (PC World) Neither Intel nor Nvidia has said exactly when the first fruits of their co-designed integrated CPUs will ship. But the thinking right now seems to be that it might take a few years.
Nvidia announced a $5 billion investment in Intel last week, under which Intel will supply CPU cores to Nvidia for potential use in the data center. On the PC side, Intel and Nvidia will collaborate on what are presumably mobile processors, with Nvidia supplying RTX chiplets for Intel to integrate, potentially upending the direction of GPUs.
Nvidia chief executive Jensen Huang told journalists last week that the partnership dates back a year, as per PCWorld’s reporting. Intel also told PCWorld that the partnership wouldn’t change its own roadmap, essentially adding on premium options to a number of product categories. But even with a head start the development work may take some time, the thinking goes, and the two sides won’t be prepared to talk about their efforts for some time.
Sources at competitors to Intel and Nvidia said they expect the first products from the collaboration to be more than two years away, and that they too are leaving their roadmaps unchanged as a result. One said that their company doubts Intel could work together with Nvidia to deliver the sort of complex, highly integrated products both sides described.
“We’re creating an SOC [system-on-chip] that fuses two processors,” Huang said on a conference call with reporters last week. “It fuses the CPU and Nvidia GPU, RTX GPU, using NVLink and it fuses these two dies into one essentially virtual giant SOC, and that would become essentially a new class of integrated graphics laptops that the world’s never seen before.”
One issue is NVLink, Nvidia’s high-speed interface that can be used to combine the power of two Nvidia graphics cards at speeds higher than the PC’s backbone, PCI Express, achieves. A source at one competitor said the company doubts that Intel has the engineering capability to make an integrated CPU-GPU system-on-chip with NVLink actually work, given Intel’s history of engineering missteps, from Arrow Lake’s poor desktop performance to the recent bugs that caused some processors to crash. They also wondered whether Nvidia really cares to enable such a chip when its discrete GPUs already serve as a viable alternative.
Another source referenced a note from BofA Global Research, which worried about what role Softbank’s $2 billion investment into Intel might have on the development, as well as input from the U.S. government which has secured its own investment.
Such thinking is typically referred to as FUD, or “fear, uncertainty, and doubt,” a now fairly traditional means of criticizing one’s competitors in the technology industry. Still, this is a time when Intel’s dominance is seen as especially vulnerable, and AMD’s ongoing resurgence in desktop market share is evidence of that. Intel’s competitors would be especially eager to cut into Intel’s share in laptops, where Intel stubbornly holds on to about 80 percent of the market.
This week, rival Qualcomm is expected to unveil new Snapdragon mobile processors for laptops, hoping to cut into Intel. Still, shipments of Copilot+ PCs (which include Qualcomm’s Snapdragon processors) were just 2.3 percent of all Windows PCs sold during the first quarter of 2025, market researcher IDC reported earlier this year. Mercury Research reported that “growth in ARM in Copilot+ PCs also appeared to be at a standstill,” based on the firm’s estimates of the PC CPU market for the second quarter of 2025.
Dean McCarron, principal of Mercury Research, pointed out that Intel had worked together with AMD to develop the “Kaby Lake G” chip, announced in November 2017. In January 2018, Intel announced the “8th-gen Intel Core with Radeon RX Vega M graphics,” shipping the Core i7-8705G chip based on the partnership in June 2018. In PCWorld’s review of the processor, we noted that the chip wasn’t tremendously exceptional compared to the existing CPU + discrete GPU landscape, but that future iterations could have more impact. But only a handful of PC vendors built systems around the chip, and those future iterations never happened, possibly because it wasn’t quite clear which company would support the Kaby Lake G chip and its successors.
“I don’t think the challenge of adapting a GPU to a chiplet is a significant one, particularly for lower power graphics (which is what would likely be used for mobile designs),” McCarron said in an email.
Adapting Nvidia’s “Blackwell” architecture, the basis of its GeForce 5000 series of GPUs, wouldn’t take too long if Nvidia used a standard bus structure like PCI Express — two years maximum, with most of the time spent on CPU integration, packaging, and test. McCarron projected spring 2027 as a guess for when that could happen, though that assumed using a standard PCIe bus, not NVLink.
Intel is thinking of the new collaborative CPUs as a premium offering, and McCarron said he agreed with that.
“I would agree it’s probably a premium play, but probably not at the very top end,” McCarron wrote. “I could see this fitting well with the upper end of Core Ultra 5 and lower end of Core Ultra 7 [using Intel’s new naming scheme] especially in the thin and light segment of notebook.”
“Higher-end [Core Ultra 7] and [Core Ultra 9] would still go with separate GPUs for performance reasons,” McCarron added. “It would be reasonable to assume it’s going to be a normal Intel core with an extra chiplet rather than some new custom core just to support graphics integration, which points to re-using Panther Lake or Nova Lake,” the two Intel CPUs due in late 2025 and late 2026, respectively.
PC World - 23 Sep (PC World) Long ago, I had an Android phone with an early facial recognition sign-in feature… and someone could unlock my phone just by holding up a photo of me. Yeah, it was bad.
Fast forward to 2025 and we have Windows Hello facial recognition sign-ins for PCs. Microsoft talks a big game about how secure it is, that Windows Hello can’t be easily tricked, that it’s better than a traditional PIN or password, and that it’s as secure as Apple’s Face ID.
But is it really? I ran an experiment and tried to fool it. Here’s what happened when I put facial recognition to the test on my PC.
How I tried to fool Windows Hello
If someone wanted to fool facial recognition biometrics, they’d probably do it using a photo of your face. So that’s just what I did—I took a photo of myself (available online), put it on an iPad, and held it up in front of my face. My Windows Hello webcam wasn’t fooled for a second.
In fact, Windows Hello doesn’t even see flat pictures as faces! While the Camera app on Windows does register it as a face, Windows Hello knows better. Despite holding up a high-resolution image of my face, Windows Hello kept insisting it couldn’t see me.
Chris Hoffman / Foundry
There are other ways to potentially fool Windows Hello, like printing out a photo of someone on paper and even cutting out eye holes so you can visibly blink while holding it up in front of your face. But none of these methods work. A flat image just won’t cut it.
Why Windows Hello can’t be easily tricked
No technology is perfect, but Windows Hello’s facial recognition support is a lot more secure than you may think. To use facial recognition with Windows Hello, a laptop needs more than just a webcam—it also needs a near-infrared (IR) camera and an IR emitter. This combo is what allows the laptop to create a depth map of your face (and that’s why I’ll never buy a laptop that doesn’t have this hardware).
In other words: it isn’t just looking at your face, but also checking that the physical 3D shape of your face matches what it expects to see. This prevents a flat photo from unlocking your laptop, and it’s similar to what Apple does with Face ID on iPhones.
Mark Hachman / Foundry
Under the hood, Windows isn’t storing an image of your face, but rather data on the shape of your face. Microsoft has some technical documentation on Windows Hello that explains it, but the gist is that Windows Hello’s facial recognition focuses on “facial landmark points” like your eyes, nose, and mouth, then takes samples around them.
Windows Hello captures all this data when you set up facial recognition, and that biometric data is stored entirely on your computer. That’s why you have to set up Windows Hello and re-scan your face every time you set up a new PC. None of it is stored online.
Older facial recognition systems often looked for “proof of liveness,” such as blinking, because they only captured flat images. But it didn’t work very well: people printed out photos, then cut eyeholes and blinked through them. Windows Hello’s depth mapping is worlds better.
But watch out if you’re James Bond
Windows Hello is complex enough that your average Joe won’t be able to fool it. But if you were in a James Bond movie—or you’re being targeted by international intelligence agencies with lots of resources—then Windows Hello could potentially be fooled for real.
To do this, the attacker would need to measure your face and build a near-perfect representation of it. I’m not just talking about a papier-mâché head that sort of looks like you, but a life-like replica that perfectly matches the precise contours of your face. With that, someone could indeed sign in as you.
Fooling modern facial recognition’s biometric security is way more difficult than just cloning your fingerprint for a fingerprint reader, and also much more difficult than “shoulder surfing” in public to steal your PIN or password as you type it in plain view.
Realistically speaking, Windows Hello’s facial recognition is the most secure way to protect your Windows laptop.
Facial recognition is the most secure
If your PC supports it, you should be using facial recognition to sign in. It’s one of the best ways to secure your laptop and the drawbacks are minimal. If your PC doesn’t support it, that’s okay—you can always grab a Windows Hello webcam and plug it into your PC or laptop. It’s one of the best PC accessories that are actually worth it.
When using Windows Hello, you should also activate the “only allow Windows Hello sign-in for Microsoft accounts on this device” option, which you can find under Settings > Accounts > Sign-in options. With this enabled, no one can sneak onto your PC without your face.
Chris Hoffman / Foundry
Oh, there’s one more risk: if you happen to have an identical twin with an identical face shape, they may be able to sign in as you. But if your twin’s face is even a little different—which is likely—you may be surprised to find that Windows Hello can tell the difference.
Subscribe to Chris Hoffman’s newsletter, The Windows Readme, for more PC advice from a real human.
ITBrief - 22 Sep (ITBrief) Retailers must embrace technology and end-to-end planning to meet soaring consumer demands and excel in the 2025 peak shopping season.
ITBrief - 22 Sep (ITBrief) Check Point named DXC Technology Top Partner of the Year and S5 Technology Workspace Security Partner in its 2025 Asia Pacific regional awards.
RadioNZ - 20 Sep (RadioNZ) Auror’s number-plate recognition technology is currently under legal scrutiny in the Wellington Court of Appeal.
PC World - 20 Sep (PC World) Welcome to The Full Nerd newsletter—your weekly dose of hardware talk from the enthusiasts at PCWorld. Missed the hot topics on our YouTube show or latest news from across the web? You’re in the right place.
Want this newsletter to come directly to your inbox? Sign up on our website!
I like my laptops weird. Or at least, that’s what I learned looking at the news out of IFA this year.
Specifically, Lenovo’s latest concept design: the ThinkBook VertiFlex. This thin-and-light notebook lets users rotate the screen between portrait and landscape mode.
Do I like it better than Lenovo’s rolling screen ThinkBook? Not in terms of sheer coolness, no. Do I think I could just buy a 2-in-1 laptop with a 360-degree hinge, prop it on a portable stand in full tablet mode, and pair it with a keyboard? Yeah, I need an ergo keyboard anyway. But would I still consider buying this anyway? Oh yeah.
I’m not a coder, but I do a little writing for my job—and having a taller screen helps me see how the overall piece flows together, without having to zoom out or scrunch down the font size. And as accustomed as I am to cobbling together my own solutions, it is nice to have a more elegant, purpose-built version. That’s especially so since I assume this would be more affordable than that $3,000+ ThinkBook Plus Gen 6 Rollable model. No automation here. You just grab the screen and rotate it on its hinge.
Adam Patrick Murray / Foundry
What else would I love to see? Currently on my mind would be laptops that bridge DIY and off-the-shelf designs—like letting you convert a handheld gaming PC into the littlest of laptops. (Conveniently, the lone mainstream handheld with detachable controllers is…Lenovo’s.) I already waxed poetic this week on the show about using a handheld gaming PC without its controllers as a Windows tablet. Dropping a handheld gaming PC into a frame with a keyboard and trackpad, built just for it, would be even more badass.
I’d also be into the return of the truly wild, like Razer’s Project Valerie (a triple display laptop). And I’m now hoping for more variations in sizes, shapes, and weight for creator and possibly gaming laptops, now that dogs and cats have united, with the surprise announcement of Intel consumer CPUs featuring Nvidia RTX integrated graphics. (More on that in the news recap below!)
We’ve been in a hardware drought on the desktop side—I honestly can’t remember a year quite this slow in a long while, including during the pandemic. I don’t think it’ll end any time soon, even given the startling news about Intel and Nvidia’s partnership on x86 chips. But laptops have been giving a solid boost to my enthusiasm for PCs. It’s been a nice way to shake off the summer doldrums and head into fall feeling more optimistic.
In this episode of The Full Nerd
In this episode of The Full Nerd, Adam Patrick Murray, Alaina Yee, and Will Smith dive into the return of five-year-old Intel architecture, Lenovo’s pricing for its fancy-schmancy Legion Go 2 handheld, and AMD Radeon market share. We also get into a discussion around game launchers on Windows, inspired by the news of Microsoft’s tweaks to its Xbox app. (You won’t be surprised to learn that nothing is as good as we’d want.)
It also turns out that Adam likes to flout domestic laws (and flaunt it by showing us his haul). I may have also delayed the start of the show by singing the praises of German bread (and cooking in general). Definitely try homemade rouladen, if you can!
Honestly, I kind of miss my beard.Alex Esteves / Foundry
Missed our live show? Subscribe now to The Full Nerd Network YouTube channel, and activate notifications. We also answer viewer questions in real-time!
Don’t miss out on our NEW shows too—you can catch episodes of Dual Boot Diaries and The Full Nerd: Extra Edition now! (Plus, we did a live build recently, if you’d like to watch us put together our latest PC for the TFN studio.)
And if you need more hardware talk during the rest of the week, come join our Discord community—it’s full of cool, laid-back nerds.
This week’s dramatic nerd news
Intel and Nvidia (Intvidia?)’s partnership completely took all of us by surprise, of course. But it’s just the peak of the emotional rollercoaster I was on this week, between ominous rumblings about product availability, another full blast of nostalgia, and yet more amazing used PC finds.
Oh, and a scandal that I found truly delightful.
Alternative-Run363 / Reddit
The Intel-Nvidia deal could utterly rewrite the future of laptops: I’m not the only one on the PCWorld staff thinking about laptops and the impact of x86 chips with RTX graphics. My colleague and TFN regular Mark Hachman takes a different angle, diving into the history of previous SoC architectures and Intel partnerships, as well as the questions sparked by this new partnership.
What’s next for Intel Arc? Brad examines the potential implications this new Intel-Nvidia deal could have on Intel’s Arc graphics division, even with Intel assuring us that it will “continue to have GPU product offerings.”
So, Doom can run for 2.5 years without crashing: It even went beyond the original estimate by the person running this experiment. Nice.
Would you eat TSMC honey? No, that’s not a euphemism. Just a sweet byproduct of TSMC’s efforts to restore the ecosystems around its plants. I think I’d try it.
Don’t worry, Nvidia will still take your thousands of dollars: For a moment this past week, people thought the RTX 5090 Founders Edition cards might be discontinued. Nvidia quickly reassured everyone that no, it does want your money.
The WinRing0 driver issue hurts: Choosing between security and PC fan performance is making me unhappy. (Obviously I’m choosing security, hence the sorrow.) A ton of third-party apps are affected—MSI Afterburner, OpenRGB, and Razer Synapse among them.
What a trash haul: Redditor Alternative-Run363’s dad snagged quite a find off the street—a 9th-gen Intel PC with a GTX 1660. Sure, it’s old and dusty. But as the redditor says in response to a commenter, “I can use the PC.” (Also hilarious: this comment.)
Tech apocalypse incoming? Ars Technica had a chat with the Consumer Technology Association, and the impact of tariffs sounds potentially bad for the availability of tech products in the U.S. after the holiday season. Combined with other expected shortages, affordable DIY PC building and upgrades might get tough soon.
Meow.Ubisoft
I love Assassin’s Creed. I love cats more: This DLC was made for me. Rooftop Cat, you’re my number one objective now.
I have demands, Ubisoft: Speaking of Assassin’s Creed, a Black Flag remake seems imminent. But I’m not playing it unless they bring back the companion app and its ship minigame.
This sleeper build gives me all the nostalgia: But it has none of the nonsense of 1990s-era Windows. (Windows 98 is the reason I still stan Windows NT.) Win-win.
There’s a God mode in Windows?! Wait, the thing I wanted most in Windows actually exists? And I’ve just been oblivious this whole time? ajakfldjsa;kfewizs
It’s not free, but dang close to it: Honestly, $23 for a 24-core Threadripper and RTX 3080 Ti system seems somehow more of a steal than free. What an auction win.
I guess I didn’t have to build a Pi-Hole? Instead of blocking Windows 10 telemetry data, I could have just stripped it out at the roots.
AMD X3Ds can hit 1,000fps: At least, they can in some esports games, according to AMD China. And here I am, happy when Overwatch 2 just gives me a consistent 40 to 50fps. (Sometimes the new patches cause awful framerate drops.) I need to demand more from life.
Scandal rocks international stone skipping contest: I get that people broke the rules. But everything about this situation is just so wholesome. (Also the science of rock skipping is neat.)
Catch you all next week—I’ll likely still be reeling from the realization that fall is starting. Where did this year go?
Alaina
This newsletter is dedicated to the memory of Gordon Mah Ung, founder and host of The Full Nerd, and executive editor of hardware at PCWorld.
PC World - 19 Sep (PC World) At a glance: Expert’s Rating
Pros
Attractive design with compact stand
Good range of video, USB-C, USB-A connectivity
High SDR and HDR brightness
Outstanding motion clarity at 1080p/330Hz
Cons
USB-C only supports 15 watts of power delivery
Extremely glossy display finish
Only 165Hz refresh rate at 4K
Our Verdict
The Asus ROG Strix OLED XG32UCWG provides great motion clarity with solid brightness for an OLED panel, and the price is right.
Want a monitor with great motion clarity, OLED image quality, and a contrast-rich finish that can also double as a mirror when the monitor is turned off? The Asus ROG Strix OLED XG32UCWG might be for you.
It goes all-in on gaming with a dual-mode display that can refresh at up to 330Hz and a TrueBlack Glossy finish that enhances immersion. Those benefits come with downsides but, for many, the pros and cons will net out to be positive.
Read on to learn more, then see our roundup of the best gaming monitors for comparison.
Asus ROG Strix OLED XG32UCWG specs and features
At its core, the Asus ROG Strix OLED XG32UCWG is another 32-inch 4K monitor, but there are a few interesting details on the spec sheet. It uses LG’s WOLED panel, which is a bit less common than Samsung’s QD-OLED. On top of that, it’s a dual-mode display, meaning it offers both 4K and 1080p native resolution modes. In 4K, the refresh rate goes up to 165Hz, but in 1080p it can reach 330Hz.
The monitor also has what Asus calls a TrueBlack Glossy display coat, which allegedly improves perceived contrast. More on that later in the review.
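One detail worth spelling out about the dual-mode spec: dropping to 1080p quarters the pixel count while only doubling the refresh rate. As a quick sanity check (my own back-of-the-envelope arithmetic, using only the numbers from the spec sheet), the 330Hz mode actually pushes exactly half the pixel data per second of the 4K/165Hz mode:

```python
# Pixel throughput of the XG32UCWG's two native modes. Resolutions and
# refresh rates come from the spec sheet; the comparison is illustrative.
def pixel_rate(width: int, height: int, hz: int) -> int:
    """Pixels drawn per second at a given resolution and refresh rate."""
    return width * height * hz

rate_4k = pixel_rate(3840, 2160, 165)     # 4K mode
rate_1080p = pixel_rate(1920, 1080, 330)  # 1080p dual mode

# 1080p is a quarter of the pixels at double the refresh rate,
# so the ratio works out to exactly 2.0.
print(rate_4k / rate_1080p)  # → 2.0
```

In other words, the 330Hz mode is comfortably within what the panel's electronics already handle at 4K/165Hz, which is why dual-mode monitors can offer it without a bandwidth penalty.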
Display size: 31.5-inch 16:9 aspect ratio
Native resolution: 3840×2160 / 1920×1080
Panel type: WOLED
Refresh rate: 165Hz / 330Hz (in 1080p mode)
Adaptive sync: Yes, AMD FreeSync Premium Pro & G-Sync Compatible
HDR: Yes, HDR10, VESA DisplayHDR 400 True Black
Ports: 2x HDMI 2.1, 1x DisplayPort 1.4, 1x USB-C upstream with DisplayPort Alternate Mode and 15 watts of power delivery, 1x 3.5mm headphone jack, 1x USB-B 3.2 Gen 1 upstream, 3x USB-A 3.2 Gen 1 downstream
Audio: None
Additional features: Proximity sensor, dual-mode display
Price: $999 MSRP / $899 initial retail
The monitor also provides a fair bit of connectivity, including USB-C with DisplayPort and three downstream USB-A ports. That means it works well as a USB hub. There’s also a proximity sensor—a new feature starting to appear in some OLED monitors—meant to reduce image retention by automatically turning off the display when you move away.
Asus ROG Strix OLED XG32UCWG design
The design of the Asus ROG Strix OLED XG32UCWG is quite reserved from the front, with slim black bezels on all sides. The only notable distinction is the glowing red ROG logo on the bottom bezel, which also houses the proximity sensor.
Flip it around and the monitor looks a bit more distinctive, with a large RGB-lit Asus ROG logo and the visually interesting two-tone black look common to many ROG monitors. It’s clearly a gaming monitor, but it leans toward the more subtle end of typical gaming monitor design.
Matthew Smith / Foundry
Looks aside, the design is practical. The monitor ships with an ergonomic stand that has an extremely small base. Asus highlights this as a feature, and for good reason, as the small base makes it easier to position the monitor on your desk and minimize its footprint.
The stand supports tilt, swivel, and height adjustment, though its range is a bit limited in some areas. For example, it adjusts only 80mm for height, while some competitors offer 110mm or, in the best case, 130mm. Still, 80mm is fine for most setups. The stand doesn’t support pivoting into portrait orientation and instead can pivot just a few degrees for minor adjustments, though that’s not too unusual for a 32-inch OLED monitor.
Of course, the monitor also provides a 100x100mm VESA mount, so you can attach it to third-party monitor arms or stands to increase its range of adjustment.
Like most OLED monitors, the XG32UCWG uses an external power brick, so you’ll need to place that under your desk. It’s a small brick as these things go, though, and rated at 240 watts.
Asus ROG Strix OLED XG32UCWG connectivity
Past Asus ROG monitors haven’t always stood out for connectivity, but the ROG Strix OLED XG32UCWG offers a good range of options. The video inputs include two HDMI 2.1 ports, one DisplayPort 1.4, and a USB-C port with DisplayPort Alternate Mode, for a total of four inputs. That’s a bit more than the typical three video inputs.
The USB-C port isn’t a complete win, as it only provides power delivery up to 15 watts, which won’t be enough to handle a connected laptop (unless it’s a MacBook Air, maybe, if you’re not running at full load). However, the USB-C port does provide upstream access to three downstream USB-A ports, which is useful. Those USB-A ports can also be accessed through an upstream USB-B connection if you’re using a desktop, in which case you likely won’t be using USB-C.
A few competitors provide better overall connectivity, such as the HP Omen Transcend 32. On the other hand, some rivals like Alienware have recently offered fewer ports, and the Asus is less expensive than the HP. The XG32UCWG’s connectivity is a middle ground for people who want decent connectivity without paying too much for it.
Asus ROG Strix OLED XG32UCWG menus, features, and audio
The Asus ROG Strix OLED XG32UCWG’s menu system is controlled by a joystick hidden behind the ROG logo on the bottom bezel. The menu is easy to navigate thanks to clearly labeled options and decently sized text. Alternatively, you can use Asus’ DisplayWidget Center to control monitor settings directly in Windows or macOS. It’s a great option for making quick adjustments and mostly makes the joystick unnecessary—unless you simply prefer to use it.
Asus also provides a few interesting features that might sway some shoppers. The monitor offers significant aspect ratio controls, letting the 32-inch panel behave like a 24.5-inch or 27-inch display. Most people will stick with the default settings—a 32-inch display area, 4K resolution, and 165Hz refresh rate—but you could also run it as a 24-inch, 330Hz display for certain esports titles. There’s an OLED anti-flicker mode that can reduce flickering, which OLEDs sometimes exhibit, especially when displaying certain grayscale tones.
Matthew Smith / Foundry
Of course, you also get the usual gaming extras like FPS counters and crosshairs. Asus has added some AI branding here, calling it an AI assistant, which means certain features are dynamic. For example, Dynamic Shadow Boost can automatically brighten dark areas of a scene to make enemies easier to spot, without affecting brighter areas. Personally, I rarely use these features, but I can see how they might be helpful if you often rely on crosshairs or shadow boosting for a competitive edge.
Asus is also all-in on screen protection features to prevent OLED burn-in. The monitor has a proximity sensor to automatically turn off the screen when you move away from the monitor, then turn it back on when you return. There’s also a wide range of features that automatically detect scenarios that might cause burn-in, like a bright logo on a dark image (or vice versa), and attempt to compensate. I can’t comment on how effective these features will be long-term, since I only had the monitor for a couple of weeks, but I expect the proximity sensor, at the least, will be helpful.
On top of all this, the monitor provides a good range of image quality adjustment. It includes gamma and color temperature modes that target precise values, not vague presets, plus color calibration. Though not sold as a monitor for creative professionals, it could work in a creative capacity for many people.
Speakers, on the other hand, are absent. That’s a tad disappointing, but it’s common among OLED gaming monitors, as monitor makers typically assume gamers will want to use a headset.
Asus ROG Strix OLED XG32UCWG SDR image quality
The Asus ROG Strix OLED XG32UCWG has an LG WOLED panel, in contrast to the more common Samsung QD-OLED panel. WOLED panels tend to have slightly inferior color performance to QD-OLED, and the XG32UCWG is no exception. However, its overall SDR performance is extremely good, and WOLED’s color gamut coverage is getting closer to QD-OLED’s.
Matthew Smith / Foundry
First up is brightness. The Asus ROG Strix OLED XG32UCWG does well here with a maximum sustained SDR brightness of 286 nits. That’s the second-best result from an OLED, behind only the much more expensive Asus ProArt PA32UCDM.
You may need that brightness, however, due to the TrueBlack Glossy panel finish. This finish is meant to enhance perceived contrast, but it’s also extremely reflective. Indeed, it’s virtually a mirror, as highly distinct full-color reflections are easy to make out even in moderately lit rooms. Because of that, I can only recommend the XG32UCWG for a room with very good light control.
Matthew Smith / Foundry
The TrueBlack Glossy finish enhances perceived contrast but not necessarily measured contrast. That’s because all OLED panels already hit an effectively infinite contrast ratio, thanks to a perfect black level of zero nits.
So, what does better perceived contrast mean in practice? It means that dark areas of the image have an incredibly inky, deep look. This is not because the pixels themselves are dimmer but, rather, because of how light scatters across the display.
I happened to review the Samsung Smart Monitor M9, an OLED panel with a matte finish, just before the XG32UCWG. The difference is stark. The XG32UCWG looks dramatically more contrast-rich and vivid, particularly when viewing high-contrast content. A dark alley in Cyberpunk 2077 is a good example.
However, as just mentioned, the XG32UCWG is highly reflective. The Smart Monitor M9 is not. Personally, I would rather have the Smart Monitor M9’s matte coat than the XG32UCWG’s glossy coat. This, however, is a matter of personal preference. Glossy OLED fans will love the XG32UCWG.
Matthew Smith / Foundry
The XG32UCWG’s color gamut results are interesting. It covered 100 percent of sRGB, 96 percent of DCI-P3, and 88 percent of AdobeRGB.
As the graph shows, this puts the XG32UCWG slightly behind the curve for an OLED monitor—all of the recent 32-inch displays PCWorld has tested were QD-OLED panels. On the other hand, this color gamut is objectively solid, beating most monitors that lack quantum dots.
I think the XG32UCWG’s color gamut is more than adequate for most situations, but if you really want the best color gamut possible, QD-OLED still has the edge.
Matthew Smith / Foundry
It’s a similar story in color accuracy. The measured average color error of 0.97 is technically towards the bottom of this group of OLED monitors. However, a color error this low is excellent by any standard, and certainly more than good enough for gaming. It’s also worth mentioning, again, that the XG32UCWG has an unusually wide range of image quality adjustments for a gaming monitor, which means you can do more to calibrate and tune the monitor to your needs than with some competitors.
The monitor’s color temperature and gamma performance was average. I measured a default color temperature of 6600K which, though slightly above the target of 6500K, isn’t going to be noticeable in most cases. The gamma curve was a bit high too, at 2.3 when set to 2.2 (other gamma presets were also high). That means the image looks a bit darker than it should. I do find this to be slightly noticeable compared to a spot-on IPS-LCD display, but most OLED monitors have the same quirk.
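To see why a gamma of 2.3 renders the image slightly darker than the 2.2 target, here’s a quick back-of-the-envelope sketch using a simple power-law display model (an illustration, not part of PCWorld’s test methodology):

```python
# Relative luminance of a mid-gray (50%) signal under a simple
# power-law gamma model: L = signal ** gamma.
target = 0.5 ** 2.2    # intended luminance at the gamma 2.2 target
measured = 0.5 ** 2.3  # luminance at the measured gamma of 2.3

print(f"gamma 2.2 mid-gray: {target:.3f}")    # ~0.218
print(f"gamma 2.3 mid-gray: {measured:.3f}")  # ~0.203
print(f"mid-tones darker by {(1 - measured / target) * 100:.1f}%")  # ~6.7%
```

A higher gamma pushes mid-tones down by several percent, which matches the subtle darkening described above; it’s small enough that most viewers won’t notice it outside a side-by-side comparison.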
Sharpness is a perk, of course, as the monitor’s maximum resolution of 3840×2160 works out to about 140 pixels per inch across the 31.5-inch display. That’s identical to other 32-inch 4K OLED monitors, so there’s no major advantage here. The high resolution, along with improvements to OLED panel technology, largely banishes the sharpness issues of earlier panels. It looks tack-sharp, though of course no more so than the competition.
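The 140 pixels-per-inch figure follows directly from the resolution and diagonal size. A minimal sketch of that arithmetic, assuming a standard 16:9 panel:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel-count diagonal divided by physical diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_in

print(round(ppi(3840, 2160, 31.5)))  # -> 140
```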
The Asus ROG Strix OLED XG32UCWG’s overall SDR image quality is solid, though not exceptional for an OLED monitor. It scores better than most in brightness, though also gives up some ground in color gamut. Contrast is exceptional and perceived contrast is enhanced by the highly glossy display coat, though at the cost of annoying reflections in even moderately lit rooms. Gamers who can get over the highly glossy finish, or prefer it, will find the monitor’s SDR image quality is top-notch.
Asus ROG Strix OLED XG32UCWG HDR image quality
Asus backs up the ROG Strix OLED XG32UCWG’s healthy SDR performance with HDR performance that, while not class-leading, is certainly strong and among the better reasons to buy the monitor.
Matthew Smith / Foundry
As the graph shows, the XG32UCWG delivered great HDR brightness across the board. Its HDR brightness maximum was about 807 nits in a 3 percent window, meaning 3 percent of the screen was displaying a bright white HDR image. That’s a solid result.
Subjectively, HDR content looked excellent. The monitor has the brightness, contrast, and color performance required to deliver superb results. Highlights, like explosions in games, were remarkably bright and vivid.
The XG32UCWG also benefits from a good range of HDR adjustments. You can adjust the brightness or turn on the dynamic brightness mode to boost maximum brightness (this mode was used for testing). While these will technically reduce the accuracy of the image, I find they’re almost essential for PC monitors. Most HDR content is mastered on the assumption it will be viewed on a large display with the viewer many feet away, which is not the typical use case for a monitor.
Asus ROG Strix OLED XG32UCWG motion performance
The Asus ROG Strix OLED XG32UCWG is a dual-mode monitor. That means it’s designed to display 4K resolution at up to 165Hz, or 1080p at up to 330Hz. Whether that matters will depend on your priorities.
Personally, I would not play at 1080p to enjoy 330Hz, even though I can notice the improved motion clarity at 330Hz. The reduced sharpness of 1080p on a 32-inch display is just too much. However, highly competitive gamers will likely appreciate the added smoothness and motion clarity of the 1080p/330Hz mode.
The real key is the versatility this can provide. Traditionally, competitive gamers had to opt for lower resolutions to gain high refresh rates. That’s fine in Counter-Strike 2 but less so if a competitive gamer wants to boot up Cyberpunk 2077 in their down-time. The XG32UCWG offers the best of both worlds.
It’s not without sacrifice, however. At this price, you could buy a 4K/240Hz monitor instead. So, you must decide: 4K at 240Hz all the time, or the option to flip between 4K/165Hz and 1080p/330Hz? I would always opt for the first option, but I can see why some would prefer the latter.
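One way to frame the trade-off between the modes discussed above is raw pixel throughput, which roughly tracks how hard the GPU must work to saturate each mode (an illustrative comparison, not a benchmark):

```python
# Raw pixel throughput (pixels per second) for each display mode.
# This is a rough proxy for GPU load, ignoring rendering complexity.
modes = {
    "4K @ 165Hz (XG32UCWG)":    3840 * 2160 * 165,
    "4K @ 240Hz (competitors)": 3840 * 2160 * 240,
    "1080p @ 330Hz (XG32UCWG)": 1920 * 1080 * 330,
}
for name, px_per_s in modes.items():
    print(f"{name}: {px_per_s / 1e9:.2f} gigapixels/s")
```

The 1080p/330Hz mode pushes only about half the pixels per second of 4K/165Hz, which is why it’s the practical choice for hitting very high frame rates in competitive titles.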
Whichever you’d prefer, the XG32UCWG’s motion clarity is excellent. OLED monitors have low pixel response times, which reduces blur and makes the most of their high refresh rates. The XG32UCWG also provides official support for AMD FreeSync Premium Pro and Nvidia G-Sync for smooth frame pacing alongside AMD and Nvidia video cards.
The XG32UCWG also supports Extreme Low Motion Blur (ELMB), Asus’ name for a backlight strobing feature that inserts black frames between standard frames. Due to quirks of human persistence of vision, this has the effect of reducing perceived motion blur. ELMB reduces brightness, is only available in certain image modes (and at refresh rates up to 165Hz), and can cause a “double image” effect. On the plus side, it noticeably increases motion clarity. The XG32UCWG also mitigates some of ELMB’s downsides: its panel is bright for an OLED, so the reduced brightness with ELMB on is less of a concern, and ELMB’s double-image effect is less apparent than in some other backlight strobing schemes I’ve witnessed.
Overall, the XG32UCWG represents the leading edge of motion clarity and responsiveness in a 32-inch gaming display. The 1080p/330Hz mode is extremely crisp, and the 4K/165Hz isn’t bad, either. I think that, for many, the buying decision will come down to motion clarity. If 1080p/330Hz and ELMB sound rad, the XG32UCWG is a solid choice. If not, a 4K/240Hz QD-OLED is probably the way to go.
Should you buy the Asus ROG Strix OLED XG32UCWG?
The Asus ROG Strix OLED XG32UCWG is a strong contender in the highly competitive battle between 32-inch 4K OLED monitors. Its perks include solid connectivity, a contrast-rich panel, good SDR and HDR performance, and support for dual-mode functionality at 4K/165Hz or 1080p/330Hz.
On the downside, the panel’s extremely glossy surface will prove divisive, its color performance doesn’t quite match QD-OLED, and the monitor is priced to compete with monitors that can provide 4K/240Hz. That last point stings most, in my opinion. If it were my money, I’d opt for the MSI MPG 32URXW. Or, at least, I would if it were in stock at MSRP (it currently isn’t).
Speaking of MSRP, it’s worth mentioning that the XG32UCWG is not too expensive. It carries an MSRP of $999, but Asus says it will be $899 for an “initial period” at launch. That’s very competitive, and the monitor is worth a spot on any 32-inch 4K OLED short list for as long as it stays at or near that price.
The Asus ROG Strix OLED XG32UCWG will appeal most to hardcore gamers who really care about motion clarity, as they’ll see the benefit of the 1080p/330Hz mode. At the same time, competitive gamers can still choose 4K resolution when playing more graphically demanding and immersive titles.