
Search results for 'Technology' - Page: 5
RadioNZ - 27 Jan (RadioNZ) Experts say quantum technology and photonics are on the verge of allowing us to diagnose cancer earlier or even predict earthquakes sooner. Read...Newslink ©2026 to RadioNZ
BBCWorld - 27 Jan (BBCWorld) The home secretary told Parliament she wants to make better use of technology, such as live facial recognition and AI. Read...Newslink ©2026 to BBCWorld
PC World - 27 Jan (PC World) Intel’s Core Ultra Series 3 platform, Panther Lake, and its champion, the Core Ultra X9 388H microprocessor, offer something unique: powerful, gaming-class 3D performance with battery life that’s almost unheard of in the laptop space.
Intel positioned the Core Ultra Series 3 (Panther Lake) as a chip with the computational power of its Arrow Lake platform and the low power consumption of the Core Ultra Series 2 (Lunar Lake). The chip maker also claimed that Panther Lake’s gaming performance is roughly equivalent to that of a laptop with an Nvidia GeForce 4050 laptop chip inside. As I’ll show you, those are relatively fair claims.
Instead of just a battery of tests, we’ll try to pull out the “story” of Panther Lake, demonstrating its strengths and weaknesses as we go. Let’s just hope you can buy one.
Intel supplied this Lenovo laptop as an additional Panther Lake platform for testing the Core Ultra X9 388H. This photo has been edited to obscure personally identifiable information.Mark Hachman / Foundry
Intel’s Core Ultra Series 3 chip stands alone, for now
In October, Intel originally positioned the Core Ultra Series 3 (Panther Lake) in one of three basic configurations, combining the new “Cougar Cove” P-cores with “Darkmont” E-cores and low-power E-cores. At the high end was what Intel referred to then as the “16 core 12Xe” configuration: 4 P-cores, 8 E-cores, and 4 LP E-cores, plus 12 Xe3 GPU cores and 12 ray-tracing units. When it came time for Intel to announce the Panther Lake chip lineup, that configuration received its formal name, the Core Ultra X9 388H, with the “X9” prefix added to highlight the largest Xe3 configuration.
Intel then let reviewers benchmark the Core Ultra Series 3 chip during CES, but only using games. It was our first indication that Panther Lake could be something special.
Intel prevented reviewers from running CPU-specific benchmarks, however, probably because the number of cores inside the highest-end Panther Lake chip (16) is lower than the count inside the rival Qualcomm Snapdragon X2 Elite Extreme chip (18), meaning that Intel would likely lose to the Snapdragon on paper in multi-threaded CPU benchmarks.
The third contender will be the AMD Ryzen AI 400, an upgrade to the excellent Ryzen AI 300, which AMD debuted last year. The Ryzen AI 400 includes just 12 cores, but runs them at a maximum clock speed of 5.2GHz — the fastest speed of all three chips. But laptops with either the Ryzen AI 400 or Snapdragon X2 Elite aren’t yet available.
Mark Hachman / Foundry
CPUs don’t suck any more
Both Intel’s Core Ultra Series 1 (Meteor Lake) and Core Ultra Series 2 (Lunar Lake) were surprisingly average in CPU performance, in both single-core and multi-core tasks. (CPU-bound applications include web browsing, apps like Excel, compiling software, some games, and decompressing files.) AMD’s Ryzen and Qualcomm’s Snapdragon used to hold the advantage. No longer — well, at least compared against the rival chips you can buy today.
With Panther Lake, Intel has regained its leadership in CPU computations.
Intel provided us an Asus ZenBook Duo (UX8407A) with an Intel Core Ultra X9 388H chip inside as a launch laptop for the Panther Lake platform. It was a slightly odd choice; the ZenBook Duo is a dual-screen laptop, with a gaming-class 99 watt-hour battery, which drastically inflated the battery life.
Intel also offered a prototype Lenovo laptop, which we used as a reality check for the estimated battery life and for additional benchmarks. I left the ZenBook Duo in “clamshell” mode, using only one of its 2K screens, to produce results I felt confident comparing to other platforms. I started with the Cinebench 2024 and Geekbench synthetic CPU tests.
Mark Hachman / Foundry
To address Intel’s claim that Panther Lake offers the CPU performance of the mobile Arrow Lake chip: yes, that’s true. Intel launched the Core 285H chip last year, and in our review of the Core 285H, I found that the Cinebench 2024 score was 1,012 (multithreaded) and 128 (single-threaded), just a hair under Panther Lake’s performance. In Geekbench (measured below), the older 285H produced a score of 16,755, again slightly less than Panther Lake’s Core Ultra X9 388H.
But if you’re a Windows fanatic like we are, you might be disappointed by the Core Ultra X9 388H’s showing. Referring to the review of the Apple M5 MacBook Pro, our colleagues at Macworld note that the M4 Pro MacBook Pro posted a score of 1,010 in Cinebench 2024 and 14,763 in Geekbench 6. But the M5 Pro MacBook scored 1,126 in Cinebench 2024 and 18,013 in Geekbench 6, besting Intel’s current mainstream laptop chip.
Mark Hachman / Foundry
Perhaps an upcoming HX version can do better?
Battery life is massive! But so is the battery
Intel has claimed that Core Ultra 3 laptops will have up to 27 hours of battery life. That’s true — but, as is often the case, it depends. The two screens of the ZenBook Duo suck more power than a single display. However, Asus installed a 99Wh battery inside. That’s a gaming-class battery, and the largest capacity allowed on a plane by FAA rules. In this case, it’s like bolting a self-powered fuel truck to a sedan.
So yes, the battery life was insane: about 22 hours on the ZenBook Duo running on a single screen and 25 to 28 hours (1,704 minutes) on the Lenovo prototype laptop that we used in early tests at CES. Those tests were performed by looping a 4K video until the battery expired. When asked to do a bit more work (simulating office work via the Procyon Office benchmark) battery life dropped to “just” under 14 hours on the ZenBook Duo — still basically the best results we’ve ever seen. We’ll break down the battery life a bit more on the Asus ZenBook Duo in our upcoming, dedicated review.
The Asus ZenBook Duo includes two screens, but one can be covered up with the keyboard (and switched off) to emulate a clamshell laptop.Foundry / Mark Hachman
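Those run times square with simple arithmetic: battery life is just capacity divided by average system draw. Here is a rough sketch in Python, where the wattage figure is an illustrative assumption rather than a measured value:

```python
# Rough battery-life estimator: hours = capacity (Wh) / average draw (W).
# The draw figure below is an illustrative assumption, not a measurement.

def battery_life_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Estimate runtime in hours for a given battery and average system draw."""
    return capacity_wh / avg_draw_w

# A 99Wh ZenBook Duo-class battery at an assumed ~4.5W average system draw
# during video playback lands near the 22-hour result reported above.
print(round(battery_life_hours(99, 4.5), 1))  # → 22.0
```

The same math explains why heavier workloads cut the numbers so sharply: doubling the average draw halves the runtime, no matter how efficient the chip is.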
Again, Intel wants us to believe that the Core Ultra Series 3 has the performance of its “Arrow Lake” chips with the power draw of its Core Ultra Series 2 (Lunar Lake) chips. We can check that, sort of, by tracking the power consumption of a Lunar Lake and a Panther Lake notebook as they undergo a benchmark. At idle, the Core Ultra 3 chip draws about five watts, but can drop to under a watt. Lunar Lake averages about three watts or less at idle.
It’s not apples to apples, though. Intel used TSMC’s N3 process technology to manufacture the CPU tile in the Series 2 Lunar Lake chip, while Panther Lake uses Intel 18A, with some tiles split between the two companies. In this case, Intel’s older Lunar Lake is a 17W TDP chip, while Panther Lake is 25W — more power to the chip typically means better performance and worse battery life, but the larger battery and Intel’s architecture seem to offset this.
Here’s a power graph showing the two chips at idle, then running a benchmark, then dropping back into idle. This graph measures only the power going into the CPU package, not the entire laptop; total system power can vary significantly, and is best left to the battery-life comparisons you’ll find in our individual laptop reviews. Still, Panther Lake is throwing a lot more power and performance at the benchmark, and the graph suggests that if a Lunar Lake and a Panther Lake laptop contained the same battery capacity, the older Lunar Lake laptop could win on battery life.
This power graph tracks how the laptop moves from an idle state to opening Cinebench 2023, running the multithreaded benchmark, and then giving it some time to return to an idle state.Mark Hachman / Foundry
Still, if Intel convinces laptop makers to add larger batteries to Panther Lake laptops, look out. Laptop battery life numbers could explode upwards!
Performance still drops while on battery
One of the interesting things about Qualcomm’s Snapdragon chips is that they run at full power all the time. Intel’s Core Ultra chips do not, clocking down to reduce power consumption and extend battery life.
I run all of our benchmarks on wall power, on battery power, and at Windows’ maximum allowable settings, just to see how performance varies in different user scenarios. As you’ll notice in our Cinebench 2024 benchmarks, the single-threaded performance usually associated with OS tasks remains unchanged between wall power and battery power, keeping Windows just as responsive in both scenarios.
But look here: Intel’s Core Ultra Series 3 chips seem to maintain their performance on battery much better than Intel’s Core Ultra Series 2 or Series 1. We’re using three real-world benchmarks to test this. First, here is Procyon Office, which performs various tasks in Microsoft Office / 365. Performance drops by about 20 percent on battery.
Mark Hachman / Foundry
However, on our custom real-world Handbrake test, where the laptop is asked to transcode the open-source Tears of Steel movie, performance dropped by just three percent between wall power and battery.
Here, you can see how our test Panther Lake laptop fared compared to the competition. This is a custom test, different from the one we run as part of our laptop reviews. I also made sure to download an Arm-specific version of the app, but Qualcomm’s chip fared exceptionally badly here. It usually performs quite well.
Foundry / Mark Hachman
Since we’re looking at real-world benchmarks, we can see that Intel’s Core Ultra 300 / Panther Lake fares well in PugetBench’s Photoshop test. The test uses the shipping version of Photoshop. Here, performance dropped just about three percent on my tests when I unplugged the laptop.
Keep in mind that CPU-specific tests are one of Snapdragon’s strengths. And with the Snapdragon X2 Elite generating exceptional CPU performance — preliminary numbers crush Panther Lake, and the Elite X1 still ranks highly — this might be an area where Qualcomm catches up. This race ain’t over.
Mark Hachman / Foundry
Unfortunately, Puget Systems’ PugetBench benchmark hadn’t caught up to the version of Adobe Premiere Pro (26.0) that Adobe makes available for download, so I was unable to test that application.
Panther Lake’s GPU performance is incredible
Remember, Intel’s flagship Panther Lake chip is the Core Ultra X9 388H, and the name matters: the Core 9 is now the Core X9, which means the GPU has 12 Xe3 cores. Essentially, the “X” means that you’re getting the best Intel has to offer in terms of graphics.
What does this mean? For years, integrated graphics has been able to play games: older, 2D sprite-based games, and some older 3D games at lower settings. They ran. And that was fine. With Panther Lake, we’re in a transition toward integrated graphics performing almost as well as gaming-class discrete graphics — and when you add AI upscaling and frame generation to the mix, recent top-tier titles are within your grasp.
Some gamers refer to those as “fake” frames, which is why it’s helpful to look first at traditional, non-accelerated tests. Here, we use UL’s 3DMark, specifically the Time Spy and Steel Nomad Lite benchmarks.
Mark Hachman / Foundry
A terrific increase in gaming performance
This was one of the big stories of CES 2026: Intel’s claims that Panther Lake offered the power of a gaming laptop with a discrete Nvidia GeForce 4050 GPU, but inside an integrated package.
This, for me, was the eye-opening moment. A year or so ago, I was testing Intel’s Core Ultra Series 1 (Meteor Lake) and Core Ultra Series 2 (Lunar Lake) with custom runs of games like Cyberpunk 2077 at Low settings, which we show below.
Mark Hachman / Foundry
But those tests prompted me to “graduate” Panther Lake into our gaming benchmarks, too, using the settings we traditionally reserve for more powerful laptops. Even with our aggressive gaming settings, a game like Shadow of the Tomb Raider reaches playable frame rates. (Skip down to find these results.) Yes, it absolutely is an older game, dating from 2018. Yet Shadow was a top-tier AAA title, and integrated graphics has caught up. And that’s just pure, unadulterated, farm-to-table frames, too.
Don’t get too excited, though. Metro: Exodus was released in 2019, but its 4A Engine remains out of reach for Panther Lake. On our test laptop, the game averages 24 frames per second when run at 1080p on the Highest setting — 35 fps if Windows’ performance settings are cranked to their maximum.
AI frames make an enormous difference, if they’re supported
It feels very strange to test Intel’s Core Ultra Series 2 chip using dialed-down benchmarks centered around 1080p gaming at Low settings — often a hint for a PC gamer that it’s time for a new machine or card. But the Core Ultra Series 3 hit the 60 fps threshold that signals a “playable” game even with just natively rendered frames. Panther Lake’s GPU also includes two different methods of artificially increasing the frame rate: 2X upscaling, which renders a frame at a lower resolution and then scales it up to the desired level, and XeSS 3 frame generation, which can interpolate up to three additional frames using AI. Naysayers call these “fake frames,” but Panther Lake allows purists and more aggressive gamers alike to find what they want.
Our test laptop shipped with Intel Graphics Software, a custom Intel app that allows you to control various aspects of your display and graphics — including forcing on XeSS frame generation, which can inject up to three AI-interpolated frames for every frame the GPU renders. That’s big — or is it?
What I discovered is that, yes, turning on frame generation can make an enormous difference. Simply turning on upscaling and XeSS 3 increased the framerate to a whopping 140 frames per second! Dialing up the Windows power slider tacked on a few additional frames. Both are included in the “Max” result at the top of the chart.
Mark Hachman / Foundry
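The arithmetic behind frame generation is straightforward: if the GPU interpolates N extra frames for every rendered frame, the displayed frame rate approaches N+1 times the rendered rate. A sketch, where the overhead parameter is an illustrative assumption (real interpolation isn't free):

```python
# Effective frame rate with AI frame interpolation: each rendered frame
# yields (1 + interpolated) displayed frames, minus real-world overhead.

def effective_fps(rendered_fps: float, interpolated_per_frame: int,
                  overhead: float = 0.0) -> float:
    """Upper bound on displayed fps. 'overhead' (0..1) models the cost of
    running the interpolation itself; its value here is an assumption."""
    return rendered_fps * (1 + interpolated_per_frame) * (1 - overhead)

# With three interpolated frames per rendered frame, a game rendering
# natively at 35 fps could, in theory, display up to 140 fps.
print(effective_fps(35, 3))  # → 140.0
```

In practice the measured gains land below this ceiling, since the GPU spends some of its budget generating the extra frames.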
The effects seem to differ depending upon the game’s image-quality settings, though. When running Cyberpunk 2077 on our traditional 1080p Ultra settings, frame rates jumped from 52 to 92 fps. Pushing the Windows slider to maximum performance gave me frame rates of 143 fps.
The difference, though, is that Cyberpunk specifically supports XeSS modes. Metro: Exodus does not — and “forcing” XeSS on using the Intel Graphics Software app didn’t work. Modern games seem more forgiving of older hardware, and support for AI frame generation certainly makes those games playable by modern laptops. Still, I wonder if there will be a tier of AAA games like Metro: powerful enough that Panther Lake laptops won’t be able to run them, but old enough that they won’t be able to support the frame generation that would otherwise bridge the gap.
I tried a handful of other games. Total War: Warhammer 3 crashed when running the “battle” benchmark, but its campaign map benchmark played back at 44 frames per second at 1080p High settings. The 2014 Thief remake produced an even 60 fps when played at 1080p at the Highest settings. Neither supported XeSS or any frame generation. Forza Horizon 6 generated 62 fps on 1080p Ultra settings with frame generation forced on, but without explicit support for it.
Can Panther Lake compete with a 4050 laptop?
This was the most provocative claim that Intel made about Panther Lake at CES, right before we had a chance to test out the chip on a prototype Lenovo laptop. Using purely rendered frames, it falls a bit short. When frame generation is included, it keeps up.
Would a gamer with a desktop PC running a GeForce RTX 5090 turn off AI frame generation? Possibly. I think that most enthusiasts, already feeling the pinch of skyrocketing RAM, SSD, and GPU prices, will turn on frame generation without much thought. Again, here’s the Core Ultra X9 388H running Shadow of the Tomb Raider, without frame generation, facing off against a number of existing, but older gaming laptops.
Foundry / Mark Hachman
And here is the Core Ultra X9 388H running Cyberpunk 2077 with frame generation enabled. This feels like a scene from a 1980s TV show, where Voltron finally pulls out his blazing sword or K.I.T.T. goes into turbo mode. The episode would be a lot simpler if both had happened from the get-go.
Basically, setting aside the scorn some have for AI and “fake frames,” AI frame generation is the “win” button here.
Foundry / Mark Hachman
AMD’s Ryzen AI Max is another option
AMD tried to work the refs (us) harder than Indiana football coach Curt Cignetti complaining about personal fouls during a halftime interview. The company claims that we should be comparing Intel’s Panther Lake to AMD’s Ryzen AI 400 chips as well as its Ryzen AI Max processor instead.
To that, we say, ship us one! We’re happy to review the Ryzen AI 400 when laptops finally are available. As for the Ryzen AI Max, well — we’ve reviewed it inside a (Framework) Desktop, and we’ve seen it in an HP ZBook Ultra G1a laptop, too. As our review benchmarks show, the Ryzen AI Max outperforms the Core Ultra X9 388H handily, though we’d probably put it in a tier that Intel’s eventual HX gaming processors will compete against, rather than a power-sipping laptop chip.
AI is less important than before
But the Ryzen AI Max does have a point, so to speak. If people do want to run private LLMs locally, the Ryzen AI Max (Strix Halo) provides the gobs of VRAM necessary for such LLMs to run. An AMD driver allowed the Framework Desktop to assign 96GB for running LLMs. Our Asus ZenBook Duo review unit, which has a Core Ultra X9 388H and 32GB of RAM, supplied 18GB of VRAM for games and AI applications. The chip also includes an NPU that can provide 50 TOPS, or 122 total TOPS with the GPU roped in.
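The VRAM point is easy to quantify with back-of-the-envelope math: a model’s weights alone need roughly its parameter count times the bytes per parameter. A sketch, with the precision figures as illustrative assumptions:

```python
# Back-of-the-envelope VRAM needed just to hold a model's weights:
# parameters x bytes per parameter. This ignores the KV cache,
# activations, and runtime overhead, so real requirements run higher.

def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """VRAM in GB for the weights of a model of the given size."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# An 8B-parameter model at FP16 (2 bytes/param) fits in the ZenBook
# Duo's 18GB of shared VRAM, but a 70B-class model needs Strix
# Halo-sized memory even when 4-bit quantized (0.5 bytes/param).
print(weight_vram_gb(8, 2))    # → 16.0
print(weight_vram_gb(70, 0.5)) # → 35.0
```

That gap is why 96GB of assignable memory matters for local LLM enthusiasts, while 18GB is comfortable only for small and mid-sized models.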
And let’s face it — AI has struggled on the PC, leaving us wondering a bit if the early emphasis on the NPU was worthwhile. What we do know is that the GPU is the most powerful AI processor in these laptops. UL provides several benchmarks; I’ve ditched the abstract “Vision” benchmark in favor of the Procyon image-generation (AI art) benchmark. (The test is a work in progress: it excludes Arm and provides an odd implementation for AMD’s Ryzen processors.) But UL’s test can generally run on either the NPU or the GPU, with some exceptions.
Basically, this chart reflects the score UL assigns to the process. In real-world terms, it shows that the ZenBook Duo with a Core Ultra 3 chip inside creates a 512×512 image once every 4.5 seconds using the GPU, while our test laptop with Intel’s Core Ultra 2 chip inside creates the same image once every nine seconds. But the Ultra X9 388H’s NPU performance suffers, and AMD’s Ryzen AI NPU outperforms it, too.
Mark Hachman / Foundry
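Converting the seconds-per-image figures above into throughput makes the generational gap plain; a quick sketch:

```python
# Turn seconds-per-image into images-per-minute, and compute the
# generational speedup, from the figures reported above.

def images_per_minute(seconds_per_image: float) -> float:
    """Throughput in images per minute for a given per-image latency."""
    return 60 / seconds_per_image

panther_lake = 4.5  # Core Ultra Series 3, GPU, seconds per 512x512 image
lunar_lake = 9.0    # Core Ultra Series 2, GPU, same test

print(round(images_per_minute(panther_lake), 1))  # → 13.3
print(round(lunar_lake / panther_lake, 1))        # → 2.0
```

In other words, a straight doubling of image-generation throughput in one generation, at least on the GPU path.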
The same goes for running UL’s LLM benchmark. Originally, this test was one of the few that evaluated the NPU, and that was useful. But as Procyon begins adding support for the GPU, it does make you wonder why we’re using an NPU when a more powerful alternative is right there.
Procyon’s test loads and runs several models, then provides a series of prompts and generates a score. Some tests simply don’t run on some processors (Arm, again), and some will only run on the NPU. This benchmark is really best for comparing the three generations of Intel Core Ultra processors.
Again, the test doesn’t do a great job of describing real-world results. In this case, the plugged-in Core Ultra 3 system running Llama 3.1 (8 billion parameters) on the NPU generated about 20 tokens per second; at roughly four characters per token, that text appears on your screen at a comfortable reading speed for me. Using the GPU under Windows’ balanced settings, the output was about 25 tokens per second for the same model. Running Llama 2 (13 billion parameters), the output was between 13 and 15 tokens per second, which might feel a little slow.
Foundry / Mark Hachman
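You can convert those token rates into on-screen text speed using the roughly-four-characters-per-token figure above; the six-characters-per-word divisor below is my own assumption for an average English word plus a space:

```python
# Convert LLM token throughput into on-screen text speed. The
# chars-per-token and chars-per-word figures are assumptions.

def chars_per_second(tokens_per_second: float, chars_per_token: float = 4) -> float:
    """Characters of output text produced per second."""
    return tokens_per_second * chars_per_token

def words_per_minute(tokens_per_second: float, chars_per_token: float = 4,
                     chars_per_word: float = 6) -> float:
    """Approximate words per minute, assuming ~5-letter words plus a space."""
    return chars_per_second(tokens_per_second, chars_per_token) * 60 / chars_per_word

# The NPU's ~20 tokens/second works out to roughly:
print(chars_per_second(20.0))        # → 80.0
print(round(words_per_minute(20)))   # → 800
```

That’s far faster than most people read, which is why even the slower 13-to-15-token-per-second runs remain usable for chat-style output.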
I considered noodling around with Intel’s AI Playground, but the app stalled out when preparing the llamaCPP-GGUF backend, so I abandoned the project.
2026 will be an interesting year
I honestly thought that 2025’s crop of laptop processors was the best ever — you could buy a laptop whose processor was made by AMD, Intel, or Qualcomm and go away happy. But 2026 looks like it could be even better.
Remember, though, that Intel is first out of the gate with this new generation of chips. AMD will eventually answer with the Ryzen AI 400, and Qualcomm’s Snapdragon X2 Elite Extreme is waiting in the wings. Given that Intel traditionally commands about 80 percent of the notebook PC processor market, an early jump could be a powerful advantage, especially with only older chips to compare it to. But we’re not done yet! Read...Newslink ©2026 to PC World
PC World - 24 Jan (PC World) The Philips Hue app has grown in complexity over the past several years, with more and more new features and settings being tucked into the user interface. That makes it all too easy to miss the latest and greatest in Hue functionality.
Besides such newer features as light alarms, “mimic presence,” and—flashiest of all—Hue’s motion-sensing MotionAware technology, there are also old favorites like light timers and fade durations. Taken together, these (often well hidden) features can add polish to your Hue routines, keep your daily schedules on track, and even protect your home from would-be intruders.
Read on for seven hidden Hue features you need to try, starting with…
Mimic presence mode
Philips Hue recently made the jump into the home security market, adding a series of Hue Secure-branded cameras along with door and window sensors and floodlights. But even if you’re not ready to invest in a Hue camera, you can still boost your security with this built-in Hue feature.
Just activate the “Mimic Presence” mode, and the Hue app will automatically turn your lights on and off to “mimic the activities that would be expected in those types of rooms,” perfect for scaring off burglars or anyone else thinking of sneaking into your empty home.
The “Mimic presence” feature is tucked into a long list of options on the Automations screen. Tap it, select one or more rooms, then select some or all of the lights associated with each room. You can also set the mode to run all day, or only when it’s dark outside.
To trigger the “Mimic presence” mode, just locate it on the Automations tab and tap the “Play” button, or “Stop” to deactivate the mode. You can also set a Hue button to turn “Mimic presence” on or off, or map the mode to a Hue smartphone widget.
Light alarms
Here’s another Hue feature that can help keep your home safe. Rather than using audible alarms to ward off intruders, the Hue app can trigger light-based alarms that rapidly flash some or all of your Hue lights, ideal for spooking crooks as well as alerting your neighbors that something’s amiss.
To use light alarms, you’ll need a supporting device such as a Hue Secure camera, a Hue motion sensor (there are indoor and outdoor models), or a Hue contact sensor. You’ll also need to enable the Hue Security center (Settings > Security).
As you’re configuring Hue Security, it will guide you through the process of setting up a light alarm. For example, you’ll be able to decide which lights in your home should flash when the alarm is tripped, as well as whether your lights should flash white or red.
You can set your Philips Hue light alarms to flash either white or red.Ben Patterson/Foundry
Besides a flashing light alarm, you can also create automations that trigger light scenes when a Hue sensor detects activity. For example, I created an automation that activates a bright light scene in my downstairs home office whenever the upstairs kitchen door is opened, handy for letting me know when my daughter comes home from school.
In my case, I drilled down to the settings for my Hue door and window sensors (Settings > Devices > Sensors > Contact Sensors), selected the lights that I wanted triggered (Office), then adjusted the Behavior settings (“Bright” when the door is opened, “Return to previous state” when the door is closed).
MotionAware motion-sensing technology
One of the coolest new Philips Hue features is also one that’s restricted to the new Hue Bridge Pro. Dubbed “MotionAware,” the feature lets you turn your Hue lights into motion sensors, perfect for triggering light scenes when people enter or leave an area.
MotionAware works its magic by detecting disturbances in the wireless Zigbee signals used to connect Hue lights to the bridge, and you’ll need at least three MotionAware-capable Hue lights in an area to enable a motion zone.
MotionAware does have its foibles, and as I just mentioned, it only works with the Hue Bridge Pro, not the standard Hue Bridge. Still, it makes for a great way to quickly deploy motion zones throughout your home, and while the $98.99 Bridge Pro is roughly twice the price of the regular bridge, it’s still cheaper than buying individual Hue motion sensors for each room.
The Hue Bridge Pro can turn your Hue lights into motion sensors.Ben Patterson/Foundry
Fade duration
The Hue app offers both wake up and sleep automations that will gradually boost or dim the brightness in the morning or at night, but you can also set any light scenes to gradually fade in or out.
I find the “fade duration” setting particularly handy for smoothly changing daytime light scenes from, say, a warm morning glow to a cooler daytime look, all without making anyone in the room do a double-take. (The Hue app also has a “Natural Light” scene that changes the light temperature throughout the day, with an optional “transition time” between each time window. I just happen to prefer my own scene settings.)
Whenever you create a custom automation (tap the Automations tab, then tap the blue “+” button in the top-right corner of the screen), the Fade Duration option will appear as you’re setting the start time for the routine. The duration itself can last anywhere from five minutes to an hour.
If you’re adjusting an existing scene, you’ll need to tap “Start at” or “End at” to access the Fade Duration setting.
Light effects
Hue lights can do more than just shine in solid colors or white color temperatures. They can also flicker, shimmer, and pulse in a variety of shades, perfect for simulating a candle, a fireplace, or a lava lamp.
Just select any of your Hue lights in a room, then tap the Effects button (the one with a sparkly icon) next to the color and white color temperature controls. You’ll then see a variety of light effects, depending on the type of Hue light you’ve selected. Dimmable-only Hue White lights offer Candle effects, while dimmable and tunable White Ambiance lights add Glisten and Sparkle effects. White and Color Ambiance lights get even more effects, including Fireplace, Underwater, Cosmos, and Opal, all with different color options.
Among the Hue light effects you can choose are Candle, Fireplace (left), Underwater, and Cosmos (middle). You can also adjust the color, brightness, and speed for any of the effects (right).Ben Patterson/Foundry
Even better, you can create your own custom effect for each category; tap Create Fireplace, for example, and you can pick your own color, as well as adjust the brightness and speed.
Once you’ve added effects for different lights in a room, you can create a scene with those effects by tapping the Save button.
Animated light scenes
Speaking of scenes, you can animate any Hue scene in a room (that is, any scenes aside from basic scenes like Bright, Dimmed, and Nightlight) with a single tap.
If you see a “Play” icon on a light scene, try tapping it; when you do, the scene will animate itself, pulsing gently according to its own rhythm. Tap the Edit button to change the brightness or speed of the animation, or to make the scene animate itself by default.
Light timers
Need to set a timer but hate the blare of an audible alarm? Just as the Hue app offers light alarms, it also has light timers that can activate lighting scenes after a set duration of time.
Tap the “+” button on the Automations screen, then select the Timer option. You’ll need to set how long the timer should last (anywhere from a minute to 24 hours), which room or rooms will be included (or your entire home, if you like), and which scene you’d like activated when the timer ends.
Once that’s all set, you’ll see your new timer with a “Play” button at the top of your list of automations. Press the Play button, and when the countdown ends, the timer will trigger the lighting scene you selected.
This feature is part of TechHive’s in-depth coverage of the best smart lighting products. Updated with details about Hue’s MotionAware motion-detection technology. Read...Newslink ©2026 to PC World
PC World - 24 Jan (PC World) Do you know what your gaming setup needs? A monitor upgrade! Luckily for you, LG’s 32-inch 1440p gaming monitor is on sale for just $200 on Amazon, a 33% discount off its $300 MSRP and aaaaalmost its best price of all time (it was a tad bit lower on Black Friday, but that’s it). In short, this is absolutely worth jumping on, and you’ll see why below.
View this Amazon deal
The LG UltraGear 32GS60QC-B was built to make your gaming life more beautiful and more immersive. With its massive 32-inch panel and a gorgeous 2560×1440 resolution, this monitor promises great visuals with vibrant colors and tons of crisp details. The 180Hz refresh rate is great for all but the most hardcore of gamers, and the 1ms response time is quick enough for all but the most competitive of gamers.
The 1000R curvature on this screen is neither too dramatic nor too gentle, sitting in a sweet spot that wraps around your vision to make things more immersive and ease eye strain without being disorienting. The 32GS60QC-B also supports AMD FreeSync technology, which minimizes screen tearing and stuttering for a fluid visual experience by syncing the display to your GPU’s output.
As far as connectivity goes, this monitor features one DisplayPort and two HDMI ports, making it easy to connect multiple devices at once should you have different machines for gaming and work, for instance.
All things considered, this is an excellent monitor whether you’re gaming, working, browsing the web, or just watching Netflix and YouTube, and all of these features are darn good for this discounted price. Get the LG UltraGear 32GS60QC-B for only $200 while you can! Or if you want to keep looking, check out our roundup of the best gaming monitors.
Get $100 off LG’s 32-inch 1440p 180Hz gaming monitor. Buy now at Amazon. Read...Newslink ©2026 to PC World
PC World - 24 Jan (PC World) Just when Intel seemed to be on the cusp of success, the company reported supply issues that will limit the number of available PC chips that PC makers will be able to buy.
Moreover, Intel said that it’s prioritizing the higher-margin data center chips with what it can manufacture, leaving supplies of upcoming chips like the Core Ultra Series 3 (Panther Lake) apparently constrained. Intel also said that its next-generation processor, Nova Lake, would arrive at the end of 2026.
“So it’s just literally hand to mouth what we can get out of the fab and what we can get the customers, is how we’re managing it,” David Zinsner, Intel’s chief financial officer, said during a call with analysts reporting Intel’s Q4 earnings for 2025. A transcript was recorded by Investing.com.
“Obviously, we’re shifting as much as we can over to data center to meet the high demand, but we can’t completely vacate the client market,” Zinsner added. “So we’re trying to support both as best we can and obviously work our way out of this supply issue. I do believe that the first quarter is the trough. We will improve supply in the second quarter.”
The problem right now is two-fold, Intel chief executive Lip-Bu Tan explained: though Intel is now shipping Panther Lake chips using its 18A technology, the company's yield (the fraction of dies on each wafer that turn into working chips) is meeting internal expectations but isn't high enough to meet demand. The company ate up most of its in-house supply of chips during the fourth quarter and is down to about 40 percent of "peak levels," executives said. Intel said that its supply would continue to increase during the course of the year.
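As a back-of-the-envelope illustration of why yield caps supply, usable output scales with wafer starts, dies per wafer, and yield. All figures below are hypothetical; Intel does not disclose 18A yield numbers.

```python
def good_dies(wafer_starts, dies_per_wafer, yield_rate):
    """Usable chips = wafer starts x dies per wafer x fraction that work."""
    return int(wafer_starts * dies_per_wafer * yield_rate)

# Hypothetical fab run: same wafer starts, two different yield rates.
chips_at_55_pct = good_dies(10_000, 300, 0.55)
chips_at_70_pct = good_dies(10_000, 300, 0.70)
print(chips_at_55_pct, chips_at_70_pct)  # → 1650000 2100000
```

The gap between those two numbers is why a yield that "meets internal expectations" can still leave laptop makers short of chips.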
Intel’s processor supply crunch comes at a time when the PC industry is facing acute shortages of memory and flash storage, all of which are having a negative impact on PC sales and prices. First benchmark impressions of Intel’s Panther Lake were terrific, but if customers can’t get them, then no one wins. Tan implied that Intel’s own allocation strategy could be brutally practical, favoring larger customers over small.
“Some of the bigger players and the OEMs and the bigger player in the hyperscale [business], they have more access into the memory allocations,” Tan said. “And then secondly, I think some of the smaller ones, they are really challenging to scramble to get the memory. So I think that will be very important for us, Dave and I, how to allocate and also our sales grid and how to allocate to the right customer. We don’t want to have a CPU we send to them but they are missing the memory. They cannot complete the products. So we try to do it correctly.”
Intel executives said that the company has “very active” engagement with customers on its foundry business, specifically on the Intel 14A manufacturing process.
Finally, Tan said that Intel would consolidate its data center and AI programs under a single leader, and has "simplified" the company's enterprise roadmap around the 16-channel Diamond Rapids part. Intel also said that it continues to work closely with Nvidia to build a custom Xeon fully integrated with its NVLink technology, as per Nvidia's $5 billion investment last year, but there was no news of any RTX GPU chiplets for Intel PC processors.
Intel reported a loss of $600 million on revenue of $13.7 billion, down 4 percent from a year ago. Intel’s Client and Computing Group reported a 7 percent drop in revenue to $8.2 billion. Intel projected lower sequential revenue, between $11.7 billion and $12.7 billion. Read...Newslink ©2026 to PC World |  |
|  | | | PC World - 24 Jan (PC World)A VPN can actually offer advantages when gaming—despite some assumptions to the contrary. Here are three reasons why (and when) it can be worthwhile to use one.
If you're looking for an even more in-depth look at this topic, check out our article on all the pros and cons of using a VPN while gaming.
1. Protection against DDoS attacks
A DDoS attack is a targeted overload of an internet connection or server. It involves sending a large number of requests in a short period of time until the connection breaks down. For gamers, this means that online games are interrupted or cannot be started at all.
For such attacks to be possible, the internet connection must be specifically targeted. In online gaming, this is possible in some cases because, depending on the game and connection technology, your own IP address is visible to other players or servers. This IP address then serves as a target for attack, directly overloading your private internet connection.
A VPN prevents this by hiding your IP address. Instead of your private address, only the IP of the VPN server is visible to the outside world. Attacks no longer reach your own internet connection; they end up hitting a dead end.
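The mechanism can be sketched with a toy model (the numbers are made up, and real DDoS mitigation is far more involved): a link stays usable only while total traffic fits within its capacity, so redirecting the flood to a VPN server keeps the home link clear.

```python
def link_survives(capacity_rps, legit_rps, attack_rps=0):
    """A link stays usable only if total traffic fits within its capacity."""
    return legit_rps + attack_rps <= capacity_rps

HOME_CAPACITY = 1_000   # hypothetical requests/sec a home connection can absorb
GAME_TRAFFIC = 60       # legitimate game traffic
DDOS_FLOOD = 50_000     # attacker's request flood

# Flood aimed at your real IP: the home link is overwhelmed.
print(link_survives(HOME_CAPACITY, GAME_TRAFFIC, DDOS_FLOOD))  # → False
# With a VPN, the flood hits the VPN server instead; your link sees only game traffic.
print(link_survives(HOME_CAPACITY, GAME_TRAFFIC))              # → True
```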
Not every VPN provider is equally suitable when it comes to security and data protection. Check out our list of the best VPNs to see which services are best for gaming, privacy, security, and more. Our overall winner is NordVPN, and Surfshark is a great option if you're on a budget.
2. Positive effect on ping and connection quality
You read that right: there are certain situations in which a VPN can stabilize or even improve your connection. This depends primarily on the route the data takes between your connection and the game server.
Internet service providers do not always route data packets along the most efficient path. Instead, they use detours or heavily congested hubs. A VPN can change this route and direct traffic along more stable or better-connected paths. In such cases, the ping does not necessarily decrease, but the connection becomes more consistent and reliable.
This is particularly noticeable with international game servers or during peak times. However, whether this effect occurs depends on several factors and can only be determined by direct comparison.
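One way to run that direct comparison is to collect round-trip times with and without the VPN and compare not just the average but the jitter (the variation between samples). A minimal sketch, with made-up sample data standing in for real ping measurements:

```python
import statistics

def summarize_latency(samples_ms):
    """Mean round-trip time and jitter (population std. dev.) in milliseconds."""
    return {
        "mean_ms": round(statistics.fmean(samples_ms), 1),
        "jitter_ms": round(statistics.pstdev(samples_ms), 1),
    }

def more_consistent(direct_ms, vpn_ms):
    """Label whichever path shows less jitter, i.e. the steadier connection."""
    direct = summarize_latency(direct_ms)
    vpn = summarize_latency(vpn_ms)
    return "vpn" if vpn["jitter_ms"] < direct["jitter_ms"] else "direct"

# Hypothetical samples: the direct route spikes at peak times; the VPN route is steadier.
direct_route = [28, 30, 95, 29, 31, 102, 30, 28]
vpn_route = [41, 43, 42, 44, 41, 43, 42, 44]
print(more_consistent(direct_route, vpn_route))  # → vpn
```

With real data you would feed in the times reported by `ping` for each setup; for gaming, the steadier series, not necessarily the one with the lower average, is often the better connection.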
Under certain conditions, a VPN can improve the connection. The quickest way to find out is to run a test.
earthphotostock/Shutterstock.com
3. Access to additional servers and game regions
Many online games use regional servers to keep connections stable and ensure fair competition. Players are automatically assigned to a specific region.
A VPN changes the virtual location and thus enables access to game servers in other countries. This also allows you to use game regions that would otherwise be inaccessible.
This is particularly relevant if you want to play with friends abroad or specifically use other server pools. A VPN also offers a direct alternative if you encounter problems with regional servers.
Please note that not all games allow the use of a VPN. Some providers expressly exclude this use in their terms and conditions. Before using a VPN, you should therefore check whether its use is permitted for the game in question.
When a VPN is not useful for gaming
A VPN is not a “hack” for every gaming situation. In some cases, its use adds no value or even has a negative effect on the gaming experience.
This is especially true for fast-paced, competitive online games where every millisecond counts. If your internet connection is already stable and the direct route to the game server is working optimally, a VPN will usually increase latency. In such cases, the additional detour via the VPN server outweighs the potential benefits.
A VPN is also usually unnecessary for local or regional lobbies. If you are playing with players from your region and have no connection problems, you’ll hardly benefit from additional security at the network level.
In addition, some games restrict or prohibit the use of VPNs. If you dial into other regions using a VPN, you risk restrictions or sanctions depending on the title. Especially in a competitive environment, the use of a VPN should therefore be carefully considered.
In short: a VPN is not a standard tool for every game, but a situational addition. Read...Newslink ©2026 to PC World |  |
|  | | | PC World - 23 Jan (PC World)After giving us such incredible innovations like pens that write upside down and ice cream that tastes bad (and a pretty huge portion of modern technology), NASA is looking earthside for its next breakthrough. Okay, “breakthrough” might be vainglorious for a computer benchmarking tool. I’m burying the lede here: NASA wants to use CapFrameX for its giant flight simulators.
CapFrameX, if you’re not aware, is a popular benchmarking tool. Users like its ability to capture and analyze system and performance info with a dizzying number of readouts and tons of customization. It’s based on PresentMon, an open-source project from Intel.
According to the official CapFrameX Twitter account, the US National Aeronautics and Space Administration (they went to the moon a few times) has “expressed interest in using CapFrameX to assess FPS performance for cockpit simulator video systems and has started the US government software approval process.” Subsequent comments from the account say that “they started the approval process.”
It makes sense that NASA is heavily invested in making its flight simulators work effectively. Even regular pilots need to rack up thousands of hours in simulated flights before they get into the cockpit of a real machine, so training to control machines that reach the upper atmosphere and Earth orbit has, quite literally, higher stakes.
Today, NASA uses a lot of commercial software and hardware in its flight simulator setups (as Tom’s Hardware notes) but its elaborate training sims are still some of the most advanced in the world. These include full-motion, fully enclosed systems that move on their own axes.
If I were a software dev working on an open-source benchmark tool, I’d be stoked that NASA thought I had the right stuff. Cheers, CapFrameX. Read...Newslink ©2026 to PC World |  |
|  | | | PC World - 23 Jan (PC World)Guys, I’ve done it. I’ve drunk the kool aid and joined the pantheon of super gamers that will tell you without doubt, that if you aren’t gaming on an OLED monitor, you’re missing out. That’s what I wish I’d been able to tell myself earlier, at least. This upgrade was a long time coming, but finally, after eight years with my previous main monitor, I bought an Alienware AW3225QF and there’s no looking back.
I believed the hype for a long time, and have had a monitor upgrade on my to-do list for a number of years, but the timing was just never quite right. There was something else more important to upgrade next, or the pricing wasn’t right, or I was waiting for the right monitor to come along.
But this recent Black Friday I finally pulled the trigger. I got in at a price that worked for me, and now I work and game on a 32-inch, QD-OLED, 4K, 240Hz monitor that is every bit as good as I hoped, and more. It’s not perfect, but I do wish I’d bought it sooner.
Neglecting monitor upgrades is silly
I now realize that I’ve been rather foolish with my upgrade focus. I switched up my processor and graphics card in 2023, and updated the memory and storage in 2024. A new case too, because my old one looked trash and one of the fan covers was dented. And the CPU cooler needed upgrading too, for something quiet.
All the while I was gaming on a monitor from 2016. The Asus MG279Q was a great gaming monitor when it first released: 1440p resolution, 144Hz refresh rate, IPS panel, 4ms response time, and FreeSync support. It was almost as good in 2018 when I bought it, but by the mid-2020s, it was really starting to show its age.
It’s still a decent gaming display. Still pretty fast, and 1440p still looks great. But it’s not OLED. The 4K resolution and 240Hz refresh rate of the new monitor are nice, but OLED is the real game changer here.
Upgrading my monitor has been more noticeable than any of the performance or cosmetic upgrades I’ve made in recent years. I should have prioritized this sooner.
It looks gorgeous… but not always
Obviously OLED is the best and it looks the best and anyone who says different is not the best and they're wrong. Obviously. But my first impressions of the Alienware AW3225QF weren't as groundbreaking as I was expecting. In games at least. A few HDR videos on YouTube looked like I could have grabbed the dripping honey right off the screen.
Jon Martindale / Foundry
But when I jumped into Warhammer 40K: Space Marine 2, expecting this color-popping epic of gorgeous proportions, it all looked washed out. Super bright on the highlights and some decent contrast, but not the life-changing experience the Kool-Aid had promised me. Once I realized I didn't need (or want) HDR turned on in non-supporting games (or Windows 11's desktop, for that matter), it all looked and felt far better.
The inky blacks were there, the rich and vibrant colors, and in games and with movies and videos that support HDR, I could switch it on with a quick shortcut (Windows key + Alt + B) to get those eye-popping highlights I was hoping for.
One area where it is 100 percent, undeniably better than my old monitor, though, is reflection handling. Even with a glossy panel like this Alienware model has, it's a million miles beyond what was possible on my 2016 display. Where before a bright light behind me would illuminate my silhouette no matter what I was watching, now I can't see a thing. Sure, the curve introduces the odd weird reflection that I have to counter, and it's not a patch on the matte displays out there. But compared to what I had? Night and day.
I can finally play all the games I’ve been waiting for
I didn't realize my list of games I'd saved to "play when I get an OLED" had grown so long. Space Marine 2 was a relatively recent addition, but since it was on sale the day I brought the monitor home, it was an easy first play.
Who needs triple-A titles when you have pixel graphics?Jon Martindale / Foundry
Other games I've been holding off on playing just so I can enjoy them for the first time on a monitor that doesn't wash out the blacks include God of War Ragnarok, Clair Obscur: Expedition 33, Kingdom Come: Deliverance 2, Senua's Saga: Hellblade II, and Final Fantasy VII Remake, among others.
My Steam wishlist is currently 155 titles long, which is utterly ridiculous, and more a reflection of my dad-of-young-kids phase of life than my previous lack of an OLED monitor. But now I will make some progress through it. Probably. When I’m not using this monitor to write about buying it.
The price barely changed
I did manage to get quite a good deal on the monitor this Black Friday just gone. “Just” £640 ($857 after taxes) and it’s definitely a 100 percent work expense, so I can write off some of the taxes on it. That’s around £200 ($268) off its historic average, and almost half the price it originally launched at. But that’s a complete outlier.
A look at this monitor's pricing history shows that it typically bounces between £850 ($1,140) and £990 ($1,325), and that's held since the monitor was released. Only in this most recent sale did it dip south of that range. Whether I'd bought it two weeks after launch or just before Black Friday, I'd have paid roughly the same; the only real discount window was the one I happened to catch.
Sure, in terms of pure savings I waited for the right moment, but if I hadn’t gotten lucky here, I wouldn’t have saved much at all. I neglected this upgrade for almost two years and it was almost for nothing.
Valheim never looked so good.Jon Martindale / Foundry
And next year? It could get even worse. While OLED technology might be getting cheaper, just about all electronics look poised to get more expensive in 2026 as the memory pricing crunch radiates out through the industry. Although monitors may not be directly affected, manufacturers everywhere might be forced to raise prices to offset the lost margins on memory-adjacent products.
I still haven’t upgraded my TV though
I enjoy big movies and TV shows as much as anyone else, and I do plan to upgrade the big living room TV to an OLED at some point too. But that's another expense that keeps getting pushed down the list, with my 7-year-old, non-HDR Samsung TV being perfectly adequate for now. But I could have been watching HDR movies and TV shows with inky blacks on my PC for a much more affordable upgrade. Where my TV plans stretch into near-$2,000 territory, I got this monitor for less than half of that.
In the absence of a TV overhaul, an HDR monitor is a very capable alternative. I don’t plan to watch too many movies by myself in my office, but I do have the option now. Not to mention non-HDR movies look utterly gorgeous with QD-OLED-boosted coloring. I’m going to have to rewatch Redline for sure.
It’s awesome and I should have done it sooner
I'm still merrily skipping through the honeymoon phase with this monitor, so I'm sure I'll bump up against some issues or eccentricities in the months to come, but for now, it's just gorgeous.
I didn’t need to go quite this fancy, though. I can take or leave the curve, and the 240Hz refresh rate, while nice and smooth, is complete overkill for a non-competitive gamer like me. All my lightweight indie games can now run at a buttery smooth infinite FPS, though, so that’s nice.
Jokes aside, this is a gorgeous monitor and the Kool-Aid drinkers aren't kidding. OLED really does look like nothing else when those high-contrast scenes hit. Mini-LED isn't far off, though, so don't pigeonhole yourself on a specific technology, especially if you're working and gaming in a brighter room, or if you still don't want to risk burn-in.
For me, though, this one was worth the wait… even if I wish I hadn’t. Read...Newslink ©2026 to PC World |  |
|  | | | PC World - 22 Jan (PC World)Ever since I watched Wall-E (what a great movie!), I’ve had a soft spot for cute robots. Is that why I adore this lovable Ugreen Uno charger block? Maybe! But it’s not the only reason. There’s a lot to like about it, starting with the fact that it’s now on sale for just $17 on Amazon. That’s a huge 43% off its original $30 price tag!
View this Amazon deal
This cute little power adapter plugs into any regular AC outlet and provides a single USB-C port for charging. Not only is it adorable, but it’s also made with GaN II technology. What does that mean for you? It’s more compact than older USB chargers, plus faster and more efficient with less heat production. The best of all worlds, really.
That USB-C port on top reaches up to 30 watts of power delivery, so you can fast-charge your phone, tablet, wireless earbuds, and other accessories and peripherals. The charger’s prongs can be “hidden” by its little boots, protecting them during travel. And those boots are magnetic, by the way! When it’s plugged in, you can magnetize those boots somewhere safe and not worry about losing them.
But the cutest part? It has a digital LED face with expressions that change based on the current charging status. Grab the Ugreen Uno 30W USB-C power adapter for $17 before this budget-friendly deal vanishes!
The Ugreen Uno delivers 30W of USB-C power, now 43% offBuy now at Amazon Read...Newslink ©2026 to PC World |  |