
Search results for 'Technology' - Page: 10
| ITBrief - 7 Mar (ITBrief) This International Women's Day, Inde Technology celebrates the rise of women in tech, achieving 40% revenue growth and increasing female representation to 24%. Read...Newslink ©2025 to ITBrief |  |
|  | | PC World - 7 Mar (PC World) We’re in the middle of an AI revolution, and while the new technology’s benefits are clear, figuring out where to get started can be confusing. You’re faced with buzzwords and lingo, and a nonstop stream of AI-related news makes it difficult to work out how AI applies to you.
But here’s the good news: GeForce’s RTX 50 Series GPUs serve as a great hardware platform to explore generative AI right on your own PC. From running large language models to playing AI-enhanced games, RTX 50 GPUs make generative AI more accessible than ever before.
Follow along as we explain Generative AI, show you how it’s being used to transform science, and then help you get started with AI on your GeForce RTX 50 Series powered PC.
Shop AI on RTX at Nvidia
What is Generative AI?
Generative AI has become an umbrella term for various AI technologies. When you type a prompt into your AI engine of choice, a bunch of highly specialized AI programs called “models” spring into action.
These models individually excel at specific tasks, such as recognizing patterns in language, images, audio, computer code, and more. But they can also work together to solve more complicated problems. The models typically run on enormous computers in data centers, though that’s changing with PC hardware running GeForce RTX 50 Series graphics cards.
The models excel at breaking problems up into individual components and then working together to generate helpful results. In effect, each model handles the portion of the problem it’s trained for, then passes its output along to other models.
Our brains handle these kinds of complicated cognitive tasks without any conscious thought at all. And now, with generative AI running locally on PC hardware, these scenarios, once limited to human brain power, enter the realm of what’s possible on a computer.
Most generative AI starts with a large language model, or LLM, which understands normal written language. When you type a prompt for an LLM, the model parses it for meaning and then can pass that output to any number of other models—and there are models that generate images, audio, spoken voice, software code, and more.
What does adding an NVIDIA GPU to your PC get you?
Generative AI represents an enormous technological shift, and it’s already having an impact on a variety of industries.
Programmers are using generative AI to write better code, faster than they were able to before. Drug companies are using AI to develop new treatments for diseases—treatments that are more effective or have fewer side effects.
And getting value out of generative AI isn’t limited to scientific breakthroughs. It can also summarize meetings, parse complicated datasets, generate your grocery list using your family’s favorite recipes, or just improve your PC gaming frame rates.
Bottom line: AI can help you get more done in less time than has been possible with traditional software. And running generative AI models locally on your PC offers a few big advantages: costs are relatively fixed, processing is generally faster compared to cloud-based AI, and if you’re concerned about security, running AI locally means you don’t have to upload proprietary data to someone else’s servers.
Shop AI on RTX at Nvidia
Unlimited access to cloud AI models can cost $200 or more each month. That quickly gets expensive. But once you’ve bought (or built) a machine with a GeForce 50 Series GPU that’s capable of running AI locally, you’re really only paying for the electricity your machine consumes.
But GeForce RTX 50 Series GPUs offer another benefit: raw speed. When you use one of the massive cloud-based AI services, you’re sharing compute time with millions of other users. On the other hand, when you’re running locally on a GPU you own, you don’t have to wait in line for your next prompt.
And the security benefits can’t be over-emphasized. Many of the free—and even some paid—cloud-based AI services train their models on prompts that users input. That means if you upload proprietary data as part of an online prompt, it can show up in later versions of the model.
You just bought an RTX 50 series GPU. What’s next?
Downloading and running generative AI models locally used to be difficult, but now there are tools that make it much more accessible. One of our favorites is Ollama, which makes it easy to run popular open models locally on your own PC, and take full advantage of your GeForce RTX 50 Series GPU.
To get started, grab the latest version of Ollama from the download page and install it on your PC. Once it’s installed, open a Terminal window and type ollama run llama3.2. This will download one of Meta’s smaller Llama LLMs and let you start chatting with it and experimenting with prompts. When you’re done, or want to try another model, just type /bye at the prompt.
The Ollama site maintains a list of all the different models that are available. We recommend checking out one of the new reasoning models, like deepseek-r1, which walk you through the thought processes behind their responses to your prompts. The models in Ollama are all free to download, but some of them are quite large.
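If you’d rather call the model from a script than type into the terminal, Ollama also exposes a local REST API, by default on port 11434. Here’s a minimal Python sketch, assuming the Ollama service is running and llama3.2 has already been downloaded:

```python
import requests

# Minimal sketch: send one prompt to a locally running Ollama instance.
# Assumes `ollama run llama3.2` (or `ollama pull llama3.2`) has already
# downloaded the model and the Ollama service is listening on its
# default port, 11434.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Explain DLSS in two sentences.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

Because everything runs on your own GPU, the prompt and the response never leave your PC.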
If you want to try out AI image generation running locally on your GeForce RTX 50 Series cards, you’ll want to try Stable Diffusion. Make sure to watch this 5-minute tutorial video, which will help guide you through the process.
Once you’re done experimenting with text and image generation, you can fire up one of the 75-plus games that support DLSS 4. DLSS is NVIDIA’s suite of AI tools, which enable high-refresh rate 4K gaming, even in the most advanced ray-tracing games. Using DLSS Multi Frame Generation and the new transformer AI upscaling model lets your GeForce RTX 50 Series card get up to 8X the performance of traditional brute-force rendering. PCWorld has published an explainer on DLSS 4, and all the benefits it provides to PC gamers.
Between running local LLMs, creating AI art, and playing DLSS 4 games, you’ll have plenty to explore in leveraging the power of your GeForce RTX 50 Series GPU. For even more inspiration, check out NVIDIA’s content hub on how NVIDIA powers the AI world, along with this list of more than 100 RTX-accelerated creative apps that run local AI. NVIDIA hardware can power your AI projects today, and is ready to handle whatever comes throughout 2025 and beyond.
Shop AI on RTX at Nvidia Read...Newslink ©2025 to PC World |  |
|  | | PC World - 7 Mar (PC World)Bluetooth 5.0 was officially announced almost nine years ago. The wireless technology has undergone several updates up to the current version (5.4), but we had to wait a long time for the next big thing. Now the time has finally come: Bluetooth 6.0 is just around the corner. Read on to find out what’s in it for you and what capabilities the developers have given the new wireless standard.
Ingenious new feature: Channel sounding
We are particularly looking forward to one exciting new feature of Bluetooth 6.0: “Channel Sounding.” This technology lets Bluetooth devices locate each other with impressive precision. In the future, a device will be able to determine both the direction of and a fairly exact distance to another Bluetooth device.
The distance measurement is accurate to within about 10cm. This means that lost Bluetooth headphones, for example, can be found quickly, even small in-ear models that you might otherwise hunt for at length (or in vain) by eye.
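To get a feel for why centimeter-level ranging is demanding, consider a simplified time-of-flight calculation. This is only an illustration of radio ranging in general, not the actual channel-sounding procedure defined in the Bluetooth specification:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(rtt_seconds: float, processing_delay_seconds: float) -> float:
    """Estimate distance from a radio round trip: the signal travels out and back."""
    time_of_flight = (rtt_seconds - processing_delay_seconds) / 2
    return SPEED_OF_LIGHT * time_of_flight

# 10cm of distance corresponds to roughly 0.33 nanoseconds of one-way flight time,
# which shows how precise the timing (or phase) measurement has to be.
print(0.10 / SPEED_OF_LIGHT)  # ~3.3e-10 seconds
```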
Channel Sounding also brings improvements in terms of security. For access authorizations such as contactless unlocking of locks (e.g. on cars), the technology promises more protection — for example, against man-in-the-middle attacks.
More reliability, less latency
With the help of the Isochronous Adaptation Layer (ISOAL), Bluetooth 6.0 can split longer data packets (such as audio) into smaller data packets for transmission and then reassemble them. The waiting time between sending such data packets is now also variable and can be flexibly defined by the communicating devices themselves.
Burst transfers (several contiguous data fragments) can be processed more quickly if a fast transfer is required. At the same time, the waiting time between fragmented data packets can also be extended if this makes sense for narrow bandwidth or small amounts of data. This should make transmission more reliable for sensitive or interference-prone applications.
The optimized ISOAL also enables noticeable improvements in the latency of wireless transmissions. This is not particularly important for applications such as simply listening to music. However, when gaming with Bluetooth headphones or watching films, even small latencies between sound and image can be perceived as very annoying. With Bluetooth 6.0, such problems could soon be a thing of the past.
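The splitting and reassembly idea itself is easy to illustrate. The following Python sketch is purely conceptual and does not reproduce the actual ISOAL framing defined in the Bluetooth specification:

```python
def fragment(payload: bytes, max_fragment_size: int) -> list[bytes]:
    """Split a long payload (e.g. an audio frame) into smaller transmission units."""
    return [payload[i:i + max_fragment_size]
            for i in range(0, len(payload), max_fragment_size)]

def reassemble(fragments: list[bytes]) -> bytes:
    """Recombine received fragments into the original payload."""
    return b"".join(fragments)

audio_frame = bytes(range(256)) * 4                    # a pretend 1KB audio frame
parts = fragment(audio_frame, max_fragment_size=251)   # roughly the maximum BLE data payload
assert reassemble(parts) == audio_frame
```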
Bluetooth Extended Advertising
With the new “Bluetooth Extended Advertising,” related data packets are transmitted on both a primary and a secondary radio channel. The so-called “decision-based advertising” allows a scanning device to use the content of a data packet on the primary channel to decide whether it should also search for associated data packets on the secondary channel. This can improve the efficiency of scanning and reduce the time spent on such operations, as scans on the secondary channel can be carried out on a more demand-orientated basis.
More efficiency and improved security
In addition to striking innovations such as channel sounding or the ISOAL optimizations, Bluetooth 6.0 also brings a few more mundane but no less important upgrades. These include a further improvement in energy efficiency, which allows even demanding functions to run in a more battery-friendly way and extends device runtime.
In standby mode, the energy requirement is reduced further compared to Bluetooth 5.0. At the same time, the reliability and security of transmissions are improved.
The transmission quality in densely populated regions or in areas where many devices communicate simultaneously should also be less susceptible to interference with Bluetooth 6.0.
Switch more easily between devices
In addition to technical improvements, the developers have also thought about the user experience and improved usability. Bluetooth 6.0 should make it easier to switch between connected devices. If you use the same Bluetooth headphones for devices such as your smartphone, hi-fi system, or laptop, you will no longer have to go through the hassle of switching manually.
Further reading: 3 ways to quickly connect Bluetooth devices in Windows 11
When is Bluetooth 6.0 coming?
The specifications have been finalized and manufacturers can start equipping smartphones with Bluetooth 6.0. However, only a few devices that already support Bluetooth 6.0 are currently available. But a whole series of new smartphones with the improved wireless standard are expected to be launched in 2025. These include the flagship series from top manufacturers such as OnePlus, Apple, and Google.
Which smartphones will get Bluetooth 6.0?
In the long term, all mobile phones will certainly be compatible with the new standard. However, only a few models are currently known to already support Bluetooth 6.0. Samsung and OnePlus also still rely on Bluetooth 5.4 for their current flagships (Galaxy S25, OnePlus 13), even though the Snapdragon 8 Elite installed in these models is generally compatible with Bluetooth 6.0.
These smartphones already use Bluetooth 6.0:
Xiaomi Redmi K80 Pro (availability in the EU has not yet been confirmed)
Xiaomi Redmi Turbo 4 (will probably be launched as Poco X7 Pro in Germany)
We assume that these models will have Bluetooth 6.0 on board when they are released in 2025:
Apple iPhone 17
Google Pixel 10
Xiaomi 16
Honor Magic 8
Motorola Edge 60
Vivo X300
Galaxy S26 series (release expected in early 2026)
Related: Oh, that’s why it’s called ‘Bluetooth’! Read...Newslink ©2025 to PC World |  |
|  | | ITBrief - 6 Mar (ITBrief) Accenture's Technology Vision 2025 report unveils AI's transformative potential, highlighting trust as key to leveraging its capabilities for businesses. Read...Newslink ©2025 to ITBrief |  |
|  | | ITBrief - 6 Mar (ITBrief) IBC has unveiled eight projects for its 2025 Media Innovation Programme, collaborating with firms like Verizon and BT to tackle key media technology challenges. Read...Newslink ©2025 to ITBrief |  |
|  | | ITBrief - 6 Mar (ITBrief) D2L has been honoured with top accolades from G2 and Tech & Learning for its innovative educational technology, including the award-winning AI tool, D2L Lumi. Read...Newslink ©2025 to ITBrief |  |
|  | | ITBrief - 6 Mar (ITBrief) As International Women's Day 2025 approaches, the tech industry faces an uphill battle with women comprising only 29% of the workforce amid a stubborn 11.9% pay gap. Read...Newslink ©2025 to ITBrief |  |
|  | | RadioNZ - 6 Mar (RadioNZ) Analysis: The accelerating and dangerous world of advanced military technology is sparking more talks between governments, and more drones on the battlefield, writes Phil Pennington. Read...Newslink ©2025 to RadioNZ |  |
|  | | PC World - 6 Mar (PC World) At a glance: Expert’s Rating
Pros
Fantastic 1440p gaming performance
Radeon 9070 beats RTX 5070 performance; Radeon 9070 XT goes toe-to-toe with RTX 5070 Ti for $150 less
16GB of memory and 256-bit bus are built for 4K gaming and strenuous modern games
Ray tracing performance is vastly improved
AI accelerators enable FSR 4 upscaling
Hypr-RX can turbocharge performance in thousands of games, with some visual compromises
Cons
FSR 4 was frustrating to use and inconsistently applied
Much slower than RTX 5070 in AI text generation and Premiere Pro
Occasional driver crashes; bad minimum frame times in Returnal
No answer to Nvidia’s fantastic DLSS 4 Multi Frame Generation feature
Our Verdict
The Radeon RX 9070 and 9070 XT offer much faster performance and more memory than Nvidia’s lackluster RTX 5070. Some software bugs mar the experience but overall, AMD’s 9070 graphics cards offer such a compelling mix of performance, value, and memory capacity that it’s worth accepting those quibbles.
Nvidia fumbled the ball with its $549 GeForce RTX 5070, and AMD’s new Radeon RX 9070 and 9070 XT are primed to seize advantage.
The RTX 5070, hitting store shelves today, is a good 1440p graphics card but a stagnant generational sidegrade at best. Enter the $549 Radeon RX 9070 and $599 Radeon RX 9070 XT, launching tomorrow. Both cards are faster than the RTX 5070, with the 9070 XT going toe-to-toe with the $750 RTX 5070 Ti in many games, and each includes an ample 16GB of VRAM. The RTX 5070 is stuck with a disappointing 12GB. Even ray tracing, long an AMD weakness, improved dramatically!
Bottom line? AMD’s Radeon RX 9070 series is the new 1440p gaming champion. If you opt for an RTX 5070 instead to get in on Nvidia’s DLSS 4 greatness, you’re making some major sacrifices in other areas.
We’ve spent the past week testing the XFX Swift Triple Fan Gaming Edition and Asus TUF Gaming OC models of both of these cards. The Asus card is a heavily juiced custom model that sports an extra third power connector compared to other models – all the better to overclock with. Here are the key things you need to know before buying an AMD Radeon RX 9070 or 9070 XT.
AMD Radeon RX 9070 and 9070 XT performance benchmarks
Our benchmarks above include results from both the aforementioned XFX and Asus 9070 cards, albeit only at 1440p resolution. We skipped 4K testing to be able to include multiple Radeon 9070 models in these graphs.
The Radeon RX 9070 cards kill it.
Even though AMD’s new GPUs remain well behind Nvidia’s in complex ray tracing scenarios – performance is great in lighter RT loads, however – the $549 Radeon RX 9070 is flat-out faster than the $549 RTX 5070 when you average out the combined results from all games in our suite. All told, the Radeon 9070 runs about 8 percent faster than the RTX 5070 at 1440p. If you omit Black Myth Wukong – a very strenuous game with full, complex ray tracing, and an outlier where all Radeon GPUs noticeably falter – the Radeon RX 9070 is 11 percent faster than the RTX 5070. Wukong is the only game where the 9070 falls behind the 5070’s performance.
But wait! While that’s impressive, the $599 Radeon RX 9070 XT blows both the RTX 5070 as well as the vanilla 9070 out of the water for just $50 more.
Across our suite, the Radeon 9070 XT runs 15 percent faster than the RTX 5070 on average, and 7 percent faster than the vanilla Radeon 9070. Excluding Black Myth Wukong, the 9070 XT runs 19 percent faster than the RTX 5070 – and it comes with 16GB of onboard memory, compared to the RTX 5070’s paltry 12GB. This is a major, major win for AMD.
So major, in fact, that the Radeon RX 9070 XT punches above its weight class to challenge the $750 RTX 5070 Ti. The 5070 Ti is only 6 percent faster than the Radeon 9070 XT, and that plummets to 3 percent if you omit Black Myth.
Sweet holy moley. Did I mention that the RTX 5070 Ti costs $150 more than the Radeon RX 9070 XT? That means you get 3 to 6 percent more performance for a 25 percent jump in price – making the RTX 5070 Ti a terrible value proposition unless you really want DLSS 4 or better AI and creation chops.
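For the curious, here’s that value math as a quick calculation, using the list prices and performance deltas quoted above:

```python
# Back-of-the-envelope value comparison using the US list prices quoted above.
price_9070_xt = 599
price_5070_ti = 750

price_premium = (price_5070_ti / price_9070_xt - 1) * 100
print(f"RTX 5070 Ti price premium: {price_premium:.0f}%")  # ~25%

# With the 5070 Ti only 3 to 6 percent faster, performance per dollar
# clearly favors the Radeon RX 9070 XT.
for perf_gain in (3, 6):
    perf_per_dollar_ratio = (1 + perf_gain / 100) / (price_5070_ti / price_9070_xt)
    print(f"{perf_gain}% faster -> {perf_per_dollar_ratio:.2f}x the performance per dollar")
```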
One tiny note: The 1 percent lows in Returnal are terribly low, and only on the 9070 GPUs. We’ve made AMD aware of the problem.
AMD’s ray tracing doesn’t suck anymore
Let’s bring back some of our earlier performance graphs, zeroing in on performance in ray traced games specifically.
AMD focused heavily on improving ray tracing performance in RDNA 4, the next-generation graphics architecture powering the Radeon 9000-series. It shows in our benchmarks.
Ray tracing performance was a major Achilles’ heel for prior Radeon generations. No more – mostly. The Radeon 9070 series performs neck-and-neck with the RTX 5070 in games with moderate to heavy levels of ray tracing. In F1 24 and Returnal, both AMD GPUs actually run faster than the RTX 5070. That’s something I never thought I’d be saying about new Radeon GPUs.
Adam Patrick Murray spent several days evaluating the 9070’s ray tracing performance in his small form-factor rig, playing RT games first on last generation’s Radeon RX 7900 XTX flagship, then the 9070. He’s still wrapping up final observations for a video – more on that soon – but in general, he reports large leaps forward in ray tracing performance on the 9070 XT.
It’s not all sunshine and rainbows though. Once games start layering on multiple ray tracing effects and more strenuous RT features, such as path tracing, AMD’s GPUs fall behind Nvidia’s. The RTX 5070 is markedly faster than the Radeon 9070s in Black Myth Wukong as well as Cyberpunk 2077’s grueling RT Overdrive mode. If complex ray tracing matters to you, Nvidia’s cards may be a better option.
16GB of memory FTW
So the Radeon RX 9070 series stomps the RTX 5070’s performance in all but the most strenuous ray traced games at 1440p. But here’s another consideration: The memory configuration of AMD’s offerings is more future-proof and built to run 4K as well. In fact, AMD marketed the Radeon 9070 series as “4K gaming at a 1440p price,” twisting a knife into Nvidia’s ribs.
That’s because Nvidia outfitted the RTX 5070 with just 12GB of memory, paired with a puny 192-bit bus. (Think of a memory bus like a road; the bigger the bus, the more lanes in the road, letting more traffic move more swiftly.) A configuration like that limits the 5070’s potential to 1440p gaming alone; while many games can run at 4K on the RTX 5070, you shouldn’t buy that GPU with 4K gaming in mind.
The Radeon RX 9070 and 9070 XT, meanwhile, both include an ample 16GB and a wider 256-bit bus. That means two critical things.
One, while our testing focused on 1440p resolution, these cards – especially the 9070 XT – truly are built to handle 4K gaming, even if they’re tuned for prime 1440p performance.
And two, the 16GB of memory makes these cards much more future-proof in an era where games gobble up ever-increasing amounts of memory, especially with ray tracing and frame generation active. The 12GB GeForce RTX 5070 already runs into memory capacity problems at maximum settings in a handful of games, like the new Indiana Jones game.
FSR 4 and Hypr-RX amplify performance
Nvidia placed the fate of the RTX 50-series in DLSS 4’s hands, and more specifically, its excellent new Multi Frame Generation feature. MFG uses AI to insert up to three generated frames between every traditional rendered frame. It doesn’t really improve performance as much as the raw frame rates may lead you to believe, but MFG delivers such a shocking improvement in visual smoothness and raw frame pacing that it’s truly transformative.
AMD has no feature to match that directly – but it does have some performance-boosting software tricks up its sleeve.
First is FSR 4. Prior FSR generations leveraged traditional GPU hardware to upscale images; FSR 4 instead leans on new, vastly improved AI accelerators built into the RDNA 4 architecture to handle upscaling instead, in DLSS-like fashion. We haven’t had much time to play with FSR 4 yet, but the image quality boost over FSR 3.1 is tangible. AMD says FSR 4 will be available in over 30 games at launch, with 75+ games expected to integrate the technology by the end of the year.
The problem? We’ve had a terrible time reliably activating FSR 4 in games, needing to jump through hoops both in-game and in-driver, only for it to fail half the time. It’s an inauspicious start for FSR 4. My bud Adam Patrick Murray details his FSR 4 trials in the video above.
Then there’s Hypr-RX, a great feature with a cringe name.
Hypr-RX is AMD’s name for a one-click feature that activates a bunch of separate Radeon features to supercharge performance in virtually all modern games. It combines driver-level FSR upscaling, frame generation (AMD Fluid Motion Frames), and anti-lag technologies, among others. That means developers don’t need to actively code in support for the features, like they do with DLSS and FSR – it just works. Flipping on Hypr-RX can send frame rates absolutely skyrocketing in almost any game you throw at it.
It’s not as seamless as DLSS 4’s Multi-Frame Gen. Since these are driver-level tools, AMD’s FSR equivalent lacks developer integration, and image quality can sometimes take a hit – blurry interface elements and a general softness in image quality being the biggest offenses. You’ll also want to make sure the game is running at a solid frame rate before activating AMD’s frame gen (Hypr-RX’s upscaling usually takes care of that, especially on the powerful 9070 GPUs). But if you can tolerate some image softness, Hypr-RX is a killer solution that puts the performance pedal to the metal universally. It’s a fantastic, versatile tool.
Nvidia reigns supreme in content and AI workloads
We only ran a couple of non-gaming benchmarks – one focused on Adobe Premiere Pro performance via the fantastic PugetBench benchmark, and Procyon’s AI text generation benchmark, which assaults GPUs with a variety of large language model tests.
The RTX 5070 absolutely stomped the Radeon RX 9070 series in both of these. If you use your graphics card for work as well as play, Nvidia remains the superior option despite AMD’s gaming and memory capacity dominance.
There are no Radeon RX 9070 reference cards
Nvidia’s Founders Edition models are usually among the best GeForce options around. You won’t find an AMD equivalent this generation. All Radeon RX 9070 and 9070 XT models come from AMD partners like XFX, Sapphire, and Asus. AMD will not be releasing a reference “Made by AMD” version of these GPUs.
That said, the Radeon RX 9070 series utilizes a pair of 8-pin power connectors as standard. Some custom models may opt for an Nvidia-esque 12-pin connector instead, while overclocked models like the Asus TUF sometimes add an additional 8-pin connector to aid in power delivery and overclocking. The vast majority of Radeon 9070s will stick to a pair of 8-pins, but check to make sure your chosen GPU meets your power supply specs before you buy.
Should you buy AMD’s Radeon RX 9070 and 9070 XT?
The Radeon RX 9070 and 9070 XT are the new 1440p gaming champions. I’d definitely opt for those over the $549 GeForce RTX 5070, which is just a stagnant sidegrade over its predecessor.
Don’t get me wrong: I adore DLSS 4’s Multi-Frame Generation and consider it truly transformative. The visual smoothness it provides must be seen to be believed. But the $549 Radeon RX 9070 slings frames an average of 11 percent faster than the 5070 if you remove outlier Black Myth Wukong. Paired with a full 16GB of memory and a wide bus that actually allows for 4K gaming, the Radeon RX 9070 feels like an all-around more compelling option for the price, especially now that ray tracing isn’t the Achilles’ Heel it once was for AMD.
But really, the $599 Radeon RX 9070 XT is the graphics card you want if you can snag one. It features the same beefed-up 16GB memory configuration, but spits out frames a whopping 19 percent faster than the RTX 5070 for just $50 more. In fact, it punches closer to the $750 GeForce RTX 5070 Ti. Nvidia’s card is only 3 to 6 percent faster than the 9070 XT despite costing 25 percent more.
So yeah: AMD has a pair of winners on its hands with the Radeon 9070 series.
It’s not quite a perfect landing though. We found AMD’s much-hyped new FSR 4 feature frustrating to (try to) use in real life; bad 1 percent low times in Returnal are a bit of a bummer; Nvidia maintains the lead in content and AI creation; we suffered some driver crashes on all tested 9070 GPUs; and while Hypr-RX is very cool, Radeon still has no answer for DLSS 4 Multi Frame Gen. But the Radeon RX 9070 and 9070 XT offer such a compelling mix of performance, value, and memory capacity that it’s worth accepting those true, valid, concerning quibbles and hoping AMD gets its software act together.
Unlike the RTX 5070, AMD’s Radeon RX 9070 series pushes gaming performance forward. I hope AMD made a bunch of them. Read...Newslink ©2025 to PC World |  |
|  | | PC World - 6 Mar (PC World)Everyone knows them, everyone has them. USB sticks, or thumb drives, are ideal for quickly exchanging data between computers — but also for transporting digital content between mobile phones and PCs when the cloud is not an alternative.
Because they are so small, USB sticks are also very popular as promotional gifts or are quickly added to the shopping basket just before the checkout in the electronics store.
Trouble arises later when the stick turns out to be a bad buy. That’s why it’s worth knowing a little more about USB sticks. After all, we entrust our private and business data to these small data transporters.
Cheap USB sticks in particular (here, a model from Unionsine) often hide the fact that they only support outdated USB 2.0 speeds. IDG
The USB interface makes them universally applicable, and they now come in two connector formats: Type A and Type C. The differences in the flash modules themselves are also worth knowing.
It is also interesting to know which capacities and transfer speeds are now possible — and which are not.
Only those who know why one USB stick is more expensive, and what distinguishes it from the supposed bargains, will ultimately make the right choice for their own purposes.
Tip: You’re using your USB flash drive wrong. Do this instead
Differences in flash drives
Both USB SSDs and USB sticks use flash memory as their basis. The fundamental difference lies in the quality of the flash modules. They are manufactured in the same production facilities and may even come from the same machines.
However, they are not identical in quality. The best flash chips are generally used for the production of SSDs. The next quality level involves flash components for the production of memory cards — such as SD or MicroSD cards. This is followed by flash memory for USB sticks.
This does not automatically mean that every USB stick is of poor quality. However, it does show that USB sticks are not designed for long-term archiving. It also makes it clear why USB 2.0 speed is still widely used, especially for inexpensive USB sticks.
Last year, a report by the company CBL Datenrettung also caused a stir: More and more USB sticks with inferior memory chips were ending up in the company’s laboratories.
The manufacturer logos on the NAND chips were missing or had been made unrecognizable. The capacity information was also incorrect. As a rule, the actual storage space was less than the label would have us believe.
Further reading: The best external drives
A USB stick on which the manufacturer’s name on the NAND chip has been made unrecognizable indicates that this is actually discarded flash memory. (Image: CBL Datenrettung)
The data sticks examined were both promotional gifts and branded goods. The company concluded from this that more and more memory chips are ending up on the market that should actually have been decommissioned.
This observation emphasizes once again that USB sticks are not the right choice for sensitive storage tasks. You can find out how to treat the drive so that it provides good service for as long as possible in the “USB stick: Tips on handling” section below.
Further reading: How long does data last on a USB flash drive? It’s complicated
Capacities of USB flash drives
If you look for USB flash drives on price comparison websites, you can usually specify the desired storage capacity. The selection here usually ranges between 64GB and 1TB, and that’s a good thing: If you are offered significantly higher capacities for a USB stick, the products are almost certainly counterfeits.
The most obvious clue is the price. If the price is exceptionally low for a 2TB stick — for example less than $10 — then all your alarm bells should be ringing.
If the online provider also comes from the Far East, the scam is obvious.
A healthy dose of skepticism is worthwhile when buying a USB stick to avoid trouble later on. In most such cases, you will actually receive a drive with only 32GB or 64GB of flash memory.
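If you want to verify a suspicious stick yourself, the basic test is simple: fill the drive with known data and read it back, because a fake stick silently loses data once its real capacity is exceeded. The following Python sketch only illustrates the idea; the mount path is a placeholder, dedicated test tools do the same job more thoroughly, and writing this much data wears the flash, so only run it on a drive you suspect:

```python
import os

def check_capacity(mount_path: str, total_mb: int, chunk_mb: int = 64) -> bool:
    """Write numbered chunks across the stick, then read them back.

    A counterfeit stick silently drops or overwrites earlier data once its real
    capacity is exceeded, so the read-back comparison fails.
    """
    chunk_bytes = chunk_mb * 1024 * 1024
    written = []
    for i in range(total_mb // chunk_mb):
        name = os.path.join(mount_path, f"testfile_{i:04d}.bin")
        with open(name, "wb") as f:
            f.write(i.to_bytes(8, "little") * (chunk_bytes // 8))  # unique pattern per chunk
        written.append((name, i))
    for name, i in written:
        with open(name, "rb") as f:
            if f.read(8) != i.to_bytes(8, "little"):
                return False  # pattern corrupted: real capacity is smaller than labeled
    return True

# Example with a hypothetical mount point: write well past the suspected real size
# of a stick sold as 2TB, e.g. the first 200GB.
# print(check_capacity("E:/", total_mb=200 * 1024))
```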
At the same time, the boundaries between USB stick and USB SSD can become blurred with high capacities.
One example: While price search engines refer to the Adata SC610 2TB model as a USB stick, the data carrier is labelled by the manufacturer as an “external solid state drive” — in other words, an external SSD. The price of around $145 emphasizes the latter, but the form factor with USB-A interface directly on the housing gives the impression of a stick.
With the Adata SC610 2TB model, it is not entirely clear whether it is a USB stick or a USB SSD. The manufacturer lists the model under solid state drives; in stores you will find it under sticks. IDG
Nevertheless, it can be said that the majority of branded USB sticks are currently available in capacities of 64, 128, 256, and 512GB. An entry-level stick with 64GB currently costs around $10. For a 512GB model, you can expect to pay between $40 and $50.
Reading tip: Best USB-C cables 2025: Get quality charging and data transfers
Boot stick for Windows
Problems with Windows boot sticks crop up time and again. They are either not recognized or cannot be created in the first place. The problem is usually not the capacity. According to Microsoft, the official minimum requirement is 8GB. You are definitely on the safe side with a 64GB stick.
There are also a few points that you should bear in mind: In principle, the USB stick should ideally be new and unused, as this provides the freshest flash memory. It is also advisable not to use a USB stick that is too slow. It is better to avoid USB 2.0 models for the installation medium.
It is advisable to use a stick from a brand manufacturer such as Sandisk (Western Digital, WD), Samsung, Kioxia, Lexar, Kingston, or Crucial (Micron). Or from providers such as PNY, Adata, Hama, Intenso or Verbatim.
For a Windows boot medium, it is best to use a stick from a brand manufacturer. For 64GB capacity, invest around $10, as here for the Sandisk Ultra Slider with USB 3.2 Gen1 interface. IDG
With cheap and no-name sticks, you cannot judge the quality of the components used.
A complaint in the event of a defect is also very likely to come to nothing. Basically, you can only hope that the retailer will refund the purchase price — but not that the retailer will stop selling the product.
Speed boost through USB standard
Not every new USB stick automatically provides a speed boost. Instead, you will see the biggest jump in data transfer when you switch to a faster USB interface — provided your host device supports the standard.
USB 2.0: The slow USB 2.0 standard is still widespread, especially with very inexpensive models. Here you can only achieve a transfer rate of around 45MB/s even when reading. Write rates can even be as low as 10MB/s.
USB 3.2 Gen1: In many cases, the outdated designation USB 3.0 can still be found on USB stick packaging. The read transfer rates are at best 450MB/s, while the write rates can easily drop to less than half that. However, many sticks only achieve around 200MB/s.
USB 3.2 Gen2: This interface, formerly known as USB 3.1, has now become very established and can be present on the computer as both a Type-A and Type-C connector. This port is also extremely popular for USB accessories. Most external USB SSDs in which an NVMe data carrier is installed already rely on USB 3.2 Gen2.
USB 3.2 Gen2 is currently the fastest standard for USB sticks. It has two types: Type A and C. You can choose the appropriate connection — like here with the Kingston Data Traveler Max. IDG
USB sticks with this interface are becoming increasingly common, but they are still relatively new and therefore expensive. Based on the interface standard, a bandwidth of 10Gb/s is available here.
In the best case scenario, data rates of around 1000MB/s can be achieved — again, the ideal case only applies to read tasks.
However, the presence of a Type C connector alone does not automatically signal that the promised maximum speed is supported.
For example, we tested the Verbatim Dual Quickstick model with 256GB capacity for this guide. It has both a Type A and a Type C connector and, according to the packaging, is supposed to support USB 3.2 Gen1.
The Verbatim Dual Quickstick has two connections — Type A and Type C. However, it only achieves the promised speed of USB 3.2 Gen1 via USB-A. It falls back to USB 2.0 speed via Type C. IDG
The benchmark runs with CrystalDiskMark show that this is only the case with one connection: The drive actually achieves over 450MB/s read and write speeds via Type A. Via Type C, the transfers drop to just over 40MB/s. This makes it clear that only the much slower USB 2.0 speed is possible via Type C.
More space, perhaps more speed
The transfer rates stated by the manufacturers represent ideal values for sequential tasks. They can be verified with benchmark tools, but rarely occur in everyday life — most likely when copying a large file, such as a video.
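You can get a rough sequential-write figure yourself without a dedicated benchmark. This Python sketch is a simplified illustration; the target path is a placeholder, and unlike proper benchmark tools it does not bypass the operating system’s write cache beyond a final flush:

```python
import os
import time

def sequential_write_mb_s(path: str, size_mb: int = 1024, block_mb: int = 16) -> float:
    """Time a simple sequential write and return throughput in MB/s."""
    block = os.urandom(block_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb // block_mb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # make sure the data actually reaches the stick
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

# Example with a hypothetical drive letter for the USB stick:
# print(f"{sequential_write_mb_s('E:/speedtest.bin'):.0f} MB/s")
```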
The controllers used play an important role in achieving fast data transfers. If they can distribute the data to several memory modules at the same time, the stick can complete the tasks faster. The greater the capacity, the more flash modules are available. Sticks with a higher capacity can therefore also work faster.
Such controllers are not always built into the sticks. Very cheap versions, in particular, use chips that are not capable of simultaneous distribution. The work is then done one after the other: Only when one flash module is completely filled is it the next one’s turn.
This procedure slows things down, and even sticks with a large capacity gain no speed advantage.
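An idealized timing model makes the difference clear; the per-channel throughput below is an assumed figure purely for illustration:

```python
def copy_time_seconds(total_mb: float, channels: int, per_channel_mb_s: float = 50.0) -> float:
    """Idealized model: a multi-channel controller writes to all flash dies in
    parallel, while a cheap controller fills one die after another (channels=1)."""
    return total_mb / (channels * per_channel_mb_s)

print(copy_time_seconds(1024, channels=1))  # ~20.5 s for 1GB on a single channel
print(copy_time_seconds(1024, channels=4))  # ~5.1 s when four channels work in parallel
```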
File systems for USB sticks
USB sticks can be formatted ex works in different file systems. FAT32 (File Allocation Table) used to be very common. However, it has the disadvantage that it cannot handle file sizes over 4GB. Anyone who likes to drag videos onto the stick should bear this in mind.
An alternative is exFAT (Extensible File Allocation Table), which was specially developed for flash memory. It is the most popular file system for USB sticks. Its strength lies in its flexibility. For example, exFAT is compatible with both Windows and Mac OS and works with Android and iOS devices. This makes it easier to exchange data between PCs and mobile devices.
However, you need to be careful if you want to connect the USB stick to a smart TV. Many TVs can’t do anything with exFAT, and some only support FAT32.
Sometimes you will also encounter the NTFS (New Technology File System) file system on USB sticks. It is the Windows standard and therefore the first choice if you only want to use the stick with Windows systems. NTFS is also compatible with Chrome OS.
USB stick: Tips on handling
Everyone uses memory sticks — usually for several years. To avoid data loss or problems recognizing the USB port on your computer, the following tips will help.
1. Use USB sticks in rotation: If you regularly write and delete data on USB sticks, you should not just use one stick. It is best to have several drives that you use alternately. The reason: Flash memory only has a limited number of erasure cycles and ages accordingly.
2. Beware of very small sticks: The size of a USB stick can have an impact on data security. Very small USB sticks usually have poorer heat dissipation than larger versions. In addition, mini housings are usually not as robust. They can be damaged when plugged in and unplugged and do not cope so well with transport.
3. Cool storage: If you want your USB stick to last as long as possible, you should ensure that it is stored in a cool place. High temperatures can cause the flash quality to deteriorate, which favors gradual data loss.
4. Regular use: USB flash drives (like other flash storage media) should not be left unused in a drawer for years. It is best to plug important USB sticks into the computer regularly — every six months or so — to read data on a trial basis. This triggers internal error correction mechanisms. Any unstable data is copied internally.
5. Do not fill the USB stick completely: The write cycles of flash cells are limited. Repeated write/erase cycles of the same memory area lead to wear and tear. Internal control mechanisms ensure that the data is distributed across the available memory space. To keep as many fresh or lightly used cells available for as long as possible, it is recommended not to use the stick’s capacity in full. Read...Newslink ©2025 to PC World |  |