May 2, 2024

Why Paramount’s problems should worry the rest of the media giants

The challenges Paramount is facing are the same challenges faced by media in general. It’s hard to compete in a new streaming and digital world.

Paramount is in trouble: The one-time media giant’s ad sales are plummeting, and so is its stock price. Would-be buyers are kicking the tires, but no one seems in a hurry to make a deal — the price will surely keep going down. This week, the day after the company broadcast the Super Bowl to a record-setting number of viewers, it announced companywide layoffs.

But why should you, a person who doesn’t work at Paramount, care about the future of the company?

Because, as Lucas Shaw explains in a new Bloomberg Businessweek story, it’s a proxy for traditional media in general:

The company’s troubles are also a warning sign for Hollywood, which looked to avoid the fate of newspapers, magazines and music—industries ravaged by the internet. But as media companies struggle to transition from cable to streaming, they’re surrendering the next generation of TV viewers to short-form video apps and services that tech giants in Silicon Valley and China own. So far, Hollywood has relied on restructuring and layoffs rather than innovation and growth, leading to questions about whether we’re in the last great age of TV.

As Shaw notes in his piece, Paramount’s problems are both particularly acute and self-inflicted: Compared to the likes of Disney and Warner Bros Discovery, it has less room for error because it is less diversified. And under the leadership of longtime owner Sumner Redstone, the company stubbornly refused to accept the fact that its young audience was particularly likely to leave for digital alternatives; more recently, under the leadership of Redstone’s daughter, Shari, it has missed opportunities to sell all or parts of the company at prices it has no hope of getting again.

But even under the best-case scenario, it would be hard for Paramount or any other traditional media company to survive the transition to streaming and digital. Which is why two of the biggest traditional giants — Time Warner and Rupert Murdoch’s Fox — took the opportunity to sell most of themselves in 2016 and 2017.

More at:



The AI Boom Could Use a Shocking Amount of Electricity

Powering artificial intelligence models takes a lot of energy. A new analysis demonstrates just how big the problem could become

Every online interaction relies on a scaffolding of information stored in remote servers—and those machines, stacked together in data centers worldwide, require a lot of energy. Around the globe, data centers currently account for about 1 to 1.5 percent of global electricity use, according to the International Energy Agency. And the world’s still-exploding boom in artificial intelligence could drive that number up a lot—and fast.

Researchers have been raising general alarms about AI’s hefty energy requirements over the past few months. But a peer-reviewed analysis published this week in Joule is one of the first to quantify the demand that is quickly materializing. A continuation of current trends in AI capacity and adoption is set to lead to NVIDIA shipping 1.5 million AI server units per year by 2027. These 1.5 million servers, running at full capacity, would consume at least 85.4 terawatt-hours of electricity annually—more than many small countries use in a year, according to the new assessment.
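The headline figure can be reproduced with simple arithmetic. Below is a minimal back-of-envelope sketch; the roughly 6.5-kilowatt draw per server is an assumption (in line with the rated power of an NVIDIA DGX A100 system), not a number stated in the excerpt above.

```python
# Back-of-envelope check of the 85.4 TWh/year figure.
# Assumption (not stated above): each AI server draws about 6.5 kW at full load,
# roughly the rated power of an NVIDIA DGX A100 system.

SERVERS = 1_500_000          # projected annual NVIDIA AI server shipments by 2027
KW_PER_SERVER = 6.5          # assumed continuous draw per server, in kilowatts
HOURS_PER_YEAR = 24 * 365

total_kwh = SERVERS * KW_PER_SERVER * HOURS_PER_YEAR
total_twh = total_kwh / 1e9  # 1 TWh = 1e9 kWh

print(f"Estimated annual consumption: {total_twh:.1f} TWh")
# Prints roughly 85.4 TWh, matching the assessment quoted above.
```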

The analysis was conducted by Alex de Vries, a data scientist at the central bank of the Netherlands and a Ph.D. candidate at Vrije Universiteit Amsterdam, where he studies the energy costs of emerging technologies. Earlier, de Vries gained prominence for sounding the alarm on the enormous energy costs of cryptocurrency mining and transactions. Now he has turned his attention to the latest tech fad. Scientific American spoke with him about AI’s shocking appetite for electricity.

[An edited and condensed transcript of the interview follows.]

Why do you think it’s important to examine the energy consumption of artificial intelligence?

Because AI is energy-intensive. I put one example of this in my research article: I highlighted that if you were to fully turn Google’s search engine into something like ChatGPT, and everyone used it that way—so you would have nine billion chatbot interactions instead of nine billion regular searches per day—then the energy use of Google would spike. Google would need as much power as Ireland just to run its search engine.

Now, it’s not going to happen like that because Google would also have to invest $100 billion in hardware to make that possible. And even if [the company] had the money to invest, the supply chain couldn’t deliver all those servers right away. But I still think it’s useful to illustrate that if you’re going to be using generative AI in applications [such as a search engine], that has the potential to make every online interaction much more resource-heavy.
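For readers who want the arithmetic behind the Ireland comparison, here is a minimal sketch. It assumes a fleet of roughly 512,000 NVIDIA A100 HGX servers drawing about 6.5 kilowatts each and Irish electricity consumption of about 29.3 terawatt-hours per year; those inputs are assumptions pulled in for illustration, not figures from the interview itself.

```python
# Rough sketch of the "as much power as Ireland" scenario described above.
# Assumed inputs (not stated in the interview): a fleet of ~512,000 NVIDIA
# A100 HGX servers at ~6.5 kW each, and Irish electricity use of ~29.3 TWh/year.

SERVERS = 512_000
KW_PER_SERVER = 6.5
IRELAND_TWH_PER_YEAR = 29.3

fleet_twh = SERVERS * KW_PER_SERVER * 24 * 365 / 1e9  # kWh -> TWh
share_of_ireland = fleet_twh / IRELAND_TWH_PER_YEAR

print(f"LLM-powered search fleet: {fleet_twh:.1f} TWh/year")
print(f"That is about {share_of_ireland:.0%} of Ireland's annual consumption")
```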

I think it’s healthy to at least include sustainability when we talk about the risk of AI. When we talk about the potential risk of errors, the unknowns of the black box, or AI discrimination bias, we should be including sustainability as a risk factor as well. I hope that my article will at least encourage the thought process in that direction. If we’re going to be using AI, is it going to help? Can we do it in a responsible way? Do we really need to be using this technology in the first place? What is it that an end user wants and needs, and how do we best help them? If AI is part of that solution, okay, go ahead. But if it’s not, then don’t put it in.

What parts of AI’s processes are using all that energy?

You generally have two big phases when it comes to AI. One is a training phase, which is where you’re setting up and getting the model to teach itself how to behave. And then you have an inference phase, where you just put the model into a live operation and start feeding it prompts so it can produce original responses. Both phases are very energy-intensive, and we don’t really know what the energy ratio there is. Historically, with Google, the balance was 60 percent inference, 40 percent training. But then with ChatGPT that kind of broke down—because training ChatGPT took comparatively very little energy consumption, compared with applying the model.

It’s dependent on a lot of factors, such as how much data are included in these models. I mean, these large language models that ChatGPT is powered by are notorious for using huge data sets and having billions of parameters. And of course, making these models larger is a factor that contributes to them just needing more power—but it is also how companies make their models more robust.
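To make the training-versus-inference split concrete, here is a small illustrative calculation. The inputs are assumptions chosen for the example, broadly in line with public estimates of GPT-3-class training cost and per-request inference energy, not figures from the interview itself.

```python
# Illustrative comparison of one-time training energy vs. ongoing inference energy.
# Assumed inputs (not from the interview): ~1,300 MWh to train a GPT-3-class model,
# ~3 Wh per request at inference, and ~200 million requests per day.

TRAINING_MWH = 1_300            # assumed one-time training energy
WH_PER_REQUEST = 3.0            # assumed energy per inference request
REQUESTS_PER_DAY = 200_000_000  # assumed daily request volume

inference_mwh_per_day = WH_PER_REQUEST * REQUESTS_PER_DAY / 1e9  # Wh -> MWh
days_to_match_training = TRAINING_MWH / inference_mwh_per_day

print(f"Inference energy: {inference_mwh_per_day:.0f} MWh per day")
print(f"Inference passes the one-time training cost after ~{days_to_match_training:.1f} days")
```

On these assumed numbers, ongoing use overtakes the one-time training cost within a few days, which is the imbalance de Vries describes for ChatGPT.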

What are some of the other variables to consider when thinking about AI energy usage?

Cooling is not included in my article, but if there were any data to go on, it would have been. A big unknown is where those servers are going to end up. That matters a whole lot, because if they’re at Google, then the additional cooling energy use is going to be somewhere in the range of a 10 percent increase. But global data centers, on average, will add 50 percent to the energy cost just to keep the machines cool. There are data centers that perform even worse than that.
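The cooling overhead de Vries describes is usually expressed as power usage effectiveness (PUE), the ratio of a facility's total energy use to the energy used by the IT equipment alone. A minimal sketch, restating the interview's 10 percent and 50 percent overheads as assumed PUE values of roughly 1.1 and 1.5:

```python
# Cooling and other facility overhead expressed as power usage effectiveness (PUE):
# total facility energy = IT equipment energy * PUE.
# The PUE values below restate the interview's ~10% and ~50% overheads.

IT_LOAD_GWH = 100.0  # hypothetical annual IT equipment energy for a server fleet

for label, pue in [("best-in-class operator (e.g. Google)", 1.1),
                   ("global data center average", 1.5)]:
    total_gwh = IT_LOAD_GWH * pue
    overhead_gwh = total_gwh - IT_LOAD_GWH
    print(f"{label}: {total_gwh:.0f} GWh total, {overhead_gwh:.0f} GWh of overhead")
```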

What type of hardware you’re using also matters. The latest servers are more efficient than older ones. What you’re going to be using the AI technology for matters, too. The more complicated a request, and the longer the servers are working to fulfill it, the more power is consumed.

More at:


America’s electricity grid is stressed, and new data centers may put even bigger demands on it

The estimated global energy consumption in 2022 for data centers and crypto mining, as well as the power to transmit that data, was between 600 and 850 terawatt hours. It was enough to power 8.5 trillion 100-watt light bulbs for 1 hour.

For the past couple of years, assessments of the national electric grid’s ability to deliver power during peak demand periods, such as heat waves and cold snaps, have shown an increasing risk of blackouts.

The North American Electric Reliability Corporation, the nation’s grid watchdog, finds that the main cause is the retirement of coal plants without enough natural gas plants coming online to replace them.

Beyond whether generation sources can meet demand during peak periods, overall demand on the grid is also increasing. Environmental groups are pushing to transition home heating from natural gas to electricity, and electric vehicles are adding to the grid’s thirst for power.

Among this mix of increasing electricity needs are data centers. Data centers manage and store the data for streaming services, email applications, e-commerce transactions, online gaming, and machine learning and artificial intelligence (AI). Along with crypto mining and data storage, AI is expected to double the electricity demand from data centers by 2026, according to the International Energy Agency (IEA).

The estimated global energy consumption in 2022 for data centers and crypto mining, as well as the power to transmit that data, was between 600 and 850 terawatt-hours, according to the IEA. Keeping a 100-watt light bulb running for one hour requires 100 watt-hours. So the energy consumed in 2022 by data centers, crypto mining and data transmission networks was enough to power 8.5 trillion 100-watt light bulbs for one hour.
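The light-bulb comparison is a straightforward unit conversion, sketched here using the upper end of the IEA's range:

```python
# Converting the IEA's 2022 estimate into 100-watt-bulb-hours.
# Uses the upper end of the 600-850 TWh range quoted above.

CONSUMPTION_TWH = 850   # data centers, crypto mining and transmission, 2022
BULB_WATT_HOURS = 100   # one 100 W bulb running for one hour

bulb_hours = CONSUMPTION_TWH * 1e12 / BULB_WATT_HOURS  # 1 TWh = 1e12 Wh

print(f"Equivalent to {bulb_hours:.1e} bulb-hours")  # 8.5e+12, i.e. 8.5 trillion
```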

According to the Energy Policy Research Foundation, a nonpartisan, not-for-profit think tank founded in 1944, this single sector’s electricity demand is on par with the total national electricity consumption of countries such as Brazil, Canada or South Korea.

Max Pyziur, research director for the Energy Policy Research Foundation, told Just The News there are some wild cards that make estimates of future electricity consumption by data centers difficult to pin down with any precision.

For one thing, data centers have their own electricity backup. Often this is in the form of diesel generators, but some use batteries. Pyziur said that up to 40% of the square footage of a data center is allocated to backup generation. While this backup capacity is used only intermittently, it adds to the resources available to satisfy demand.

The other factor that creates uncertainty is efficiency gains, Pyziur explained. These include the replacement of legacy copper lines with fiber-optic cables, which increases data throughput. Legacy mechanical spinning hard drives are also being replaced with solid-state drives, which use less energy.

The last piece, Pyziur said, is central processing units (CPUs). These have gotten considerably faster, which means a smaller footprint is needed to process the same amount of data. “The bulk of electricity that’s used by a data center is by the CPUs. So we know that will be the next physical frontier that we have to master,” Pyziur said.

Between 2010 and 2023, as the number of data centers grew, data center power requirements rose to 168% of their 2010 level, an increase of roughly 4% annually, according to Pyziur’s and his colleagues’ research. In that time, however, the power usage effectiveness, an industry measure of efficiency, grew by 1,430%.

This has constrained some of the growth in electricity demand. Between 2010 and 2023, data center energy requirements globally grew from 0.43% of the electricity generated to 0.52%. During that period, the number of workload units, a measure of computational tasks, processes or data transactions, grew from 58 million to 821 million, an annual increase of 22.7%.
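The annualized figure follows from a standard compound-growth calculation; a quick sketch over the 2010 to 2023 period:

```python
# Compound annual growth rate (CAGR) of data center workload units, 2010-2023.

WORKLOADS_2010 = 58_000_000    # workload units in 2010
WORKLOADS_2023 = 821_000_000   # workload units in 2023
YEARS = 2023 - 2010            # 13 years

cagr = (WORKLOADS_2023 / WORKLOADS_2010) ** (1 / YEARS) - 1
print(f"Annual growth: {cagr:.1%}")  # about 22.6%, in line with the ~22.7% cited
```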

Whether efficiency gains will reduce or flatten energy requirements from data centers — or just slow the increase — could depend on what economists call the “rebound effect.” According to the Energy Policy Research Foundation, rather than promoting conservation and reduced use of a technology such as the internet, efficiencies tend to lead to increased use of that technology.

More at:

