Motherboard software bug can accidentally kill AMD Ryzen X3D CPUs

In brief: An apparent bug has been discovered in the motherboard management software from several leading board vendors that makes it incredibly easy to fry your X3D-based AMD CPU. We do not yet have a timetable for when safeguards will be installed, so until then, be very careful when tinkering around in motherboard management software with these chips. One mistake and poof, your CPU could become a shiny desk ornament.

Igor’s Lab recently stumbled upon the issue while tinkering around in MSI Center. Using an MSI B550 Unify board with an AMD Ryzen 7 5800X3D processor, Igor noticed the software was seemingly detecting the chip as a standard Ryzen 5xxx, which does not have 3D vertically stacked L3 cache.

The extra cache is known to boost gaming performance, but its physical presence also hampers heat dissipation, which is one reason the 5800X3D ships with slower clock speeds than the standard 5800X.

With the software unable to detect an X3D chip, it treats it like a standard Ryzen 5xxx and allows largely unfettered manipulation of the core multiplier and core voltage. Intrigued, Igor fiddled with the voltage settings and managed to kill the CPU in short order. He did not mention how much juice was fed to the chip to take its life but Tom’s Hardware notes that core voltage can be set all the way up to 1.55v.

Worse yet, the apparent bug has since been discovered in similar management software from ASRock, Asus and Gigabyte.

Just last week, renowned overclocker der8auer tried his hand at overclocking and overvolting AMD's Ryzen 9 7950X3D. Even with liquid nitrogen, the chip instantly died when the core voltage was set to 1.55v. der8auer barely made it out of the BIOS (and without any significant load) before things went south. Needless to say, he was surprised it died so quickly.

At the very least, it sounds like these extreme core voltage options should carry a firm warning before users are allowed to apply risky settings.

Highlights:

  • Nvidia Corporation is expanding its product line with new software capabilities and a high-performance computing platform that scientists can use to expedite their research.
  • During SC22, Nvidia will also detail its collaboration with Lockheed Martin Corporation to develop a system for visualizing geophysics data such as sea temperature measurements.

Nvidia Corporation is adding new software features and a high-performance computing platform that scientists can use to speed up their research.

The updates were scheduled to debut at the Supercomputing 2022 event. During SC22, Nvidia will also detail its collaboration with Lockheed Martin Corporation to develop a system for displaying geophysical information, such as sea temperature readings.

Digital twin collaboration

The National Oceanic and Atmospheric Administration of the United States has chosen Nvidia and Lockheed Martin to construct a new computing system known as the Earth Observation Digital Twin or EODT. The system will be able to process multiple types of geophysical data, including sea temperature measurements and solar wind data. Climate and weather visualizations will be created using these data to support research initiatives.

EODT will operate on GPU-equipped cloud instances from Amazon Web Services Inc. In addition, it will perform some computing tasks using systems from Nvidia’s DGX and OVX data center appliance product lines. The appliances include GPUs optimized for artificial intelligence applications and other workloads.

According to Nvidia, the system’s software architecture consists of multiple components.

The OpenRosetta3D application from Lockheed Martin will collect the geophysical data that EODT will process. Once collected, the data will be loaded into the Omniverse Nucleus database. Another system component is Lockheed Martin’s Agatha software tool, which makes it easier for researchers to interact with geophysical data gathered from multiple sources.

The first demonstration of EODT’s capabilities is scheduled for September next year. According to Nvidia, the initial prototype of the system will focus on visualizing sea surface temperature data.

New edge platform 

Nvidia also intends to unveil several new product enhancements during SC22. The first feature is a platform that will facilitate the transfer of scientific data across geographically distant computers and other systems.

There are several circumstances in which researchers must transmit data over long distances. For instance, a university may want to share measurements from research equipment in one facility with a supercomputer hosted at another. Similarly, researchers may want to share simulation results among multiple supercomputers operating in separate locations.

Senior Product Manager Geetika Gupta stated, “To overcome this problem, Nvidia has introduced a high-performance computing platform that combines edge computing and AI to capture and consolidate streaming data from scientific edge instruments and then allow devices to talk to each other over long distances.”

The platform is based on three Nvidia chip technologies: MetroX-3, Holoscan, and BlueField-3.

MetroX-3 is a soon-to-be-introduced technology that significantly expands the reach of a data center network. With it, a data center can connect to IT infrastructure up to 25 miles (40 kilometers) away. Researchers can use MetroX-3 network links to transfer scientific data between servers located in different facilities.

The new platform from Nvidia also includes Holoscan and BlueField-3 chips. Researchers may utilize Holoscan to process data from medical devices. BlueField-3 chips from Nvidia are specialized processors optimized for network traffic coordination between servers.

Software upgrades

Omniverse is a software development platform offered by Nvidia for creating digital twins and simulations. At SC22, Nvidia was scheduled to unveil an update to Omniverse that makes it easier for scientists to use the platform for research projects.

Omniverse can now execute batch workloads on systems powered by Nvidia’s H100 and A100 graphics cards in data centers. Batch workloads are applications, such as physics simulators, that do not require user input to perform calculations.

As part of the update, Nvidia is also integrating popular scientific applications into Omniverse. The applications include the scientific data visualization tools ParaView, IndeX, and NeuralVDB. Modulus, a software tool for building neural networks that automatically perform physics calculations, is now compatible with Omniverse.

One of the areas in which the company’s GPUs are used to support research initiatives is quantum computing. To facilitate the work of scientists, Nvidia is releasing two new features dedicated to quantum computing.

The first is being rolled out for the company’s CUDA Toolkit, a collection of software components for developing applications that run on Nvidia graphics cards. The new functionality is designed to boost the performance of scientific applications that perform quantum mechanical computations.

Nvidia is concurrently improving its cuQuantum framework, which researchers use to simulate quantum computers on conventional computing hardware. With the upgrade, cuQuantum can simulate quantum computers with up to tens of thousands of qubits.
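
For a sense of why such simulations are hard: a brute-force statevector simulator has to store 2^n complex amplitudes for n qubits, so memory grows exponentially, and tensor-network techniques of the kind cuQuantum accelerates avoid materializing that full vector for suitable circuits. The sketch below is a toy brute-force simulator in numpy, not cuQuantum itself; it applies a Hadamard gate to every qubit of a 3-qubit register and checks that all eight basis states become equally likely.

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a single-qubit gate to qubit `target` of an n-qubit statevector."""
    state = state.reshape([2] * n)
    # Contract the gate's input index with the target qubit's axis,
    # then move the resulting axis back into place.
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.ravel()

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                  # start in |000>

for q in range(n):                              # H on every qubit
    state = apply_gate(state, H, q, n)

# A uniform superposition: all 8 basis states equally likely.
probs = np.abs(state) ** 2
print(np.allclose(probs, 1 / 2 ** n))           # True
```

Even at 40 qubits this representation needs 2^40 amplitudes (roughly 16 TB at double precision), which is the wall that tensor-network methods are built to get around.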

Meta reminds people that the metaverse will probably be awesome, one day

In context: With all the excitement, headlines, and talk about generative AI, Meta is reminding people that these systems aren’t as important as the area it has plowed over $24 billion into over the years: the metaverse. Nick Clegg, the company’s head of global affairs, just held a press conference in the virtual world to hail the metaverse as the future of computing.

Bloomberg reports that Clegg held a small press conference within Meta’s Horizon Workrooms. He was in London and spoke to Washington-based reporters who were wearing borrowed Meta Quest headsets. The publication notes that the group appeared as torsos seated around a large wooden table, and only Clegg’s avatar resembled the person wearing the headset.

“We’re going to stick with it, because we really believe, all the early evidence suggests, that something like this will be the heart of the new computing platform,” Clegg said. “But it’s going to take a while.”

It wasn’t that long ago when Facebook went all-in on the metaverse, going so far as to change its corporate name to Meta and pour billions into its Reality Labs division. But getting consumers to feel the same level of excitement has always been an uphill battle. Even teens aren’t interested in the concept.

Things have gotten even worse for Meta in recent times. The global economic slump means expensive VR headsets are pretty low on people’s shopping lists – shipments slumped more than 12% year-over-year in 2022.

But the biggest blow to Meta’s ambitions has been the rise of generative AI over the last few months. Tech firms are rushing to implement the likes of ChatGPT into their services; even Meta said it would be introducing AI-powered chat in WhatsApp and Messenger. It’s taken what little focus was on the metaverse away from that area and placed it squarely onto AI – Zuckerberg rarely mentions the virtual world anymore.

Big companies seem to have realized that spending a lot of money on the metaverse is becoming a pointless endeavor. Microsoft’s round of 10,000 layoffs saw its industrial metaverse project killed off, and Disney laid off its entire metaverse team as part of cost-cutting plans. There are also the cuts Meta has made that impacted the Reality Labs division, and senators demanding the metaverse be a place for adults only.

One supporter in Meta’s corner is Tim Sweeney. The Epic Games boss recently explained why he thinks the concept still has promise.

Clegg believes advertising and commerce will help Meta recoup the billions it has invested in the metaverse – Zuckerberg famously said it could be earning billions or even trillions of dollars in ten years – but then people actually have to be using these virtual worlds to buy things or be served ads.

There were plenty of obvious bugs during Clegg’s event, including all the avatars’ mouths moving when one person spoke. He believes the hardware is another area that will improve in time. “I just really want to stress that we’re going to look back on the headwear we’re wearing now and think, ‘Gosh, do you remember the days when you would wear a Quest Pro?'” Clegg said. “We’ve always been very clear that we’re in this for the long haul. This is not going to happen overnight.”

US national lab is using machine learning to detect rogue nuclear threats

In context: While the entire technology world is focused on generative AI and its alleged capabilities to destroy the economy and the job market, researchers are employing neural networks to tackle challenges in science, energy, health and security, such as detection of rogue nuclear weapons.

The Pacific Northwest National Laboratory (PNNL) is trying to hunt for unknown nuclear threats by using machine learning (ML) algorithms. PNNL, which is one of the United States Department of Energy national laboratories, said that ML is everywhere now, and that it can be used to create “secure, trustworthy, science-based systems” designed to give people and nations answers to different kinds of difficult scientific challenges.

The official public debut of an ML algorithm dates back to 1962, PNNL said, when an IBM 7094 computer beat a human opponent at checkers. Thanks to the algorithm, the system was able to learn on its own, without being explicitly programmed, changing its strategy as it played checkers player Robert Nealey.

Today, PNNL said, machine learning is everywhere as it powers personalized shopping recommendations and voice-driven assistants like Siri and Alexa. Generative AI tools like ChatGPT are just the latest public face of a technology that has had many decades to mature and evolve.

PNNL researchers are employing machine learning for national security, too: the laboratory’s experts are combining their expertise in nuclear nonproliferation and “artificial reasoning” to detect and (possibly) mitigate nuclear threats. The main goal of their research is to use data analytics and machine learning algorithms to monitor nuclear materials that could be used to produce nuclear weapons.

The AI employed by PNNL can be useful for the International Atomic Energy Agency (IAEA), which monitors nuclear reprocessing facilities in non-nuclear weapon nations to see if the plutonium separated from spent nuclear fuel is later employed for nuclear weapons production. The IAEA relies on sample analysis and process monitoring in addition to in-person inspections, all of which can be time-consuming and labor-intensive.

PNNL’s algorithms can create a virtual model of the facility inspected by the IAEA, tracking “important temporal patterns” to train the model and predict the pattern belonging to normal use of the various areas in the facility. If data collected on-site doesn’t match the virtual prediction, the inspectors could be called to check the facility once more.
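
As a minimal sketch of that prediction-versus-observation idea (hypothetical data and thresholds, not PNNL’s actual model), the following learns a facility’s “normal” daily sensor pattern from simulated readings and flags a day whose measurements deviate from the prediction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for "normal" facility data: a daily activity cycle
# (e.g. hourly sensor readings) observed over 100 days of routine operation.
hours = np.arange(24)
normal_days = np.array([np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.05, 24)
                        for _ in range(100)])

# "Train" on normal operation: the expected pattern and its variability.
expected = normal_days.mean(axis=0)
spread = normal_days.std(axis=0)

def is_anomalous(day, n_sigma=4.0):
    """Flag a day whose readings deviate strongly from the learned pattern."""
    z = np.abs(day - expected) / spread
    return bool((z > n_sigma).any())

# A routine day passes; a day with an unexplained activity spike is flagged.
ok_day = np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.05, 24)
odd_day = ok_day.copy()
odd_day[3] += 2.0   # unexpected activity at 3 a.m.

print(is_anomalous(ok_day), is_anomalous(odd_day))
```

In the real system the expected pattern would come from a trained model of the facility rather than a per-hour average, but the flagging logic is the same: compare observation to prediction and escalate on large deviations.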

Another ML-powered solution designed in PNNL’s labs processes images of radioactive material through an “autoencoder” model, which can be trained to “compress and decompress images” into compact descriptions that are useful for computational analysis. The model examines images of microscopic radioactive particles, searching for the unique microstructure the material develops depending on the environmental conditions and the purity of the source materials at its production facility.
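
The compress-and-compare workflow can be sketched in a few lines. The example below is a toy stand-in, not PNNL’s model: it substitutes a linear projection (PCA via SVD, the optimal linear autoencoder) for a trained neural network, and uses synthetic textures in place of electron micrographs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for micrographs: 16x16 "images" from two material
# families, each with a characteristic texture plus measurement noise.
def make_image(family):
    base = np.outer(np.sin(np.arange(16) * (family + 1)), np.cos(np.arange(16)))
    return (base + rng.normal(0, 0.1, (16, 16))).ravel()

library = np.array([make_image(f) for f in (0, 1) for _ in range(50)])
labels = np.array([f for f in (0, 1) for _ in range(50)])

# "Encoder": project onto the top principal components, compressing
# 256 pixels down to an 8-number description of each image.
mean = library.mean(axis=0)
_, _, vt = np.linalg.svd(library - mean, full_matrices=False)

def encode(x):
    return (x - mean) @ vt[:8].T

codes = encode(library)

def identify(sample):
    """Match a field sample to the library entry with the nearest code."""
    distances = np.linalg.norm(codes - encode(sample), axis=1)
    return labels[distances.argmin()]

print(identify(make_image(0)), identify(make_image(1)))
```

A real autoencoder would learn a nonlinear encoding, but the matching step is the same: compare the field sample's compressed description against a library of known codes and take the nearest match.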

Law enforcement agencies (e.g., the FBI) can then compare the microstructures of field samples against a library of electron microscope images developed by universities and national laboratories, PNNL said, speeding up the identification process. Machine learning algorithms and computers “will not replace humans in detecting nuclear threats any time soon,” PNNL researchers warn, but they can help detect and avert a potential nuclear disaster on US soil.

Sony may be forced to lower PSVR 2’s price after slow launch

In brief: While Sony’s PSVR 2 is said to offer an excellent virtual reality experience, it seems asking $550 for the device wasn’t the best idea, especially at a time of economic uncertainty when people are tightening their belts. According to a new report, sales of the headset are off to a slow start, and a price cut might be necessary to avoid a total disaster.

Research firm IDC (via Bloomberg) estimates that Sony will likely sell about 270,000 units of the PSVR 2, which launched on February 22, by the end of March.

Francisco Jeronimo, IDC’s vice president of data and analytics, pointed to the shaky global economy as one of the main reasons behind the PSVR 2’s disappointing start. At a time when utility prices are rising, interest rates are high, and many companies are cutting jobs, paying $550 for a virtual reality headset isn’t a big priority for most people.

“I suspect a price cut on the PSVR2 will be needed to avoid a complete disaster of their new product,” Jeronimo said.

Sony initially hoped to sell 2 million PSVR 2 headsets by March next year, but it revised that figure to 1.5 million. The PSVR 2 likely saw its biggest surge in sales during its first month of release, with purchases slowing down going forward. The Japanese gaming giant believes that the latest version of its console’s headset will be popular enough to outsell the original PSVR’s 5 million lifetime sales, which seems optimistic.

The PSVR 2 has won rave reviews for its impressive set of features, excellent controllers, and dual 2,000 x 2,040 OLED displays. But pricing the headset higher than the PlayStation 5 was always going to put some people off.

The news is yet another blow to the VR industry and, by association, the metaverse. Headset shipments were down 12% year-over-year in 2022, while Microsoft and Disney have laid off their entire metaverse teams as part of cost-cutting plans.

Meta, of course, still insists the metaverse will be the future of computing, but until VR becomes more accessible to the masses, that will probably remain Mark Zuckerberg’s unfulfilled dream.

Surge in Cyberpunk 2077 sales gives CD Projekt Red its second-best year ever

In brief: CD Projekt Red hasn’t released a non-mobile game since 2020, yet last year was the second most-successful in the company’s history. It recorded an increase in revenue and net profit, much of which was due to an uptick in Cyberpunk 2077 sales, while operating costs fell.

CD Projekt Red’s latest financial results show that it brought in $222 million in revenue last year, up from the $207 million it made the previous year. Net profit was also up year-on-year, jumping 66% from $49 million in 2021 to $81 million in 2022. An 18% fall in operating costs also helped the company’s bottom line.

Much of the company’s good fortune last year resulted from increased Cyberpunk 2077 sales. The next-gen update in February 2022 helped its popularity, but the game’s big resurgence followed the debut of Netflix’s Cyberpunk: Edgerunners series. As it did with The Witcher, Netflix’s brilliant anime resulted in a surge in sales for CDPR’s game.

Last year’s boost still didn’t match the initial sales Cyberpunk 2077 enjoyed at release. Despite the many problems it experienced at launch, the game sold 13 million copies in ten days. As of September 2022, the total number of units sold was over 20 million worldwide. It faced plenty of criticism for the many bugs and performance issues, but updates have made the current Cyberpunk 2077 a lot closer to the game we were expecting in 2020.

“The popularity of the series (Edgerunners) and the positive reception of the update, released a week before the premiere, had a notable effect on Cyberpunk sales and general sentiment around the game, as evidenced by gamers’ reviews. This is a clear sign that deeper involvement in our franchises and expanding their reach is the right way to go,” said CD Projekt Red CEO Adam Kicinski.

CD Projekt Red added that it would reveal more information about the first (and only) massive story DLC for Cyberpunk 2077, Phantom Liberty, in a dedicated showcase this June ahead of its release sometime later this year.

CD Projekt Red has also restarted development of Project Sirius, a Witcher spin-off being developed by the company’s The Molasses Flood studio. There are plans for several more Witcher games as well as a sequel to Cyberpunk 2077. Check out all the details of those titles here.