Mike Kramer: Navigating power deals in the new data economy

Mike Kramer has a background in finance, not engineering, but a combined 20 years at Exelon and Constellation, along with a key role in the deals that have Meta and Microsoft buying power from Constellation’s Clinton and Crane sites, have made him something of a nuclear expert.
Kramer spoke with Nuclear News staff writer Susan Gallier in late August, just after a visit to Clinton in central Illinois to celebrate a power purchase agreement (PPA) with Meta that closed in June. As Constellation’s vice president for data economy strategy, Kramer was part of the deal-making—not just the celebration.

Mike Kramer
Meta gets power, and Clinton gets an end to more than a decade of economic challenges for the 1,092-MWe (net) boiling water reactor, which was slated to shut down in 2017, a closure that would have put more than 500 plant staff out of work. Clinton is a single-unit plant, but it also holds an early site permit for new construction—Constellation’s only ESP.
“Now we have revenue certainty for 20 years,” Kramer said. “And the excitement at the plant for a sustained future, the community impact, and the jobs impact was fun to be a part of.”
Jobs are back in Pennsylvania, too, where about 500 people are working at Constellation’s Crane Clean Energy Center, preparing the former Three Mile Island-1 for restart in 2027, backed by a PPA with Microsoft.
A number of other deals are “in the pipeline,” Kramer said, as he shared insights into Constellation’s strategy for meeting data-driven demand growth, getting maximum power from existing nuclear reactors, and preparing to invest in new nuclear power.
What do you do as Constellation’s VP of Data Economy Strategy?

Constellation’s Bryan Hanson (right) and Meta representatives tour Clinton during the August 26 celebration with Meta. (Photo: Constellation)
Data Economy Strategy is a group we created in the fall of 2024 in response to the demand that’s coming from the data economy. We saw changes in the size of data centers and where they’ll be located and wanted to make sure that we were proactive in meeting that demand.
My group is responsible for a couple of things. The first is tracking deal flow for the opportunities we have to provide power solutions to data companies. A lot of what I do is making sure we understand how that’s evolving and what types of transactions are possible. You’ve seen that come to fruition in the deal we did for the Crane Clean Energy Center and also the one we celebrated at Clinton.
The second thing that I’m responsible for is market intelligence on that industry: what’s changing, how it’s changing, and what that means for energy needs, load forecasting, and other key considerations that impact not just our ability to serve those customers but also our actions related to the energy industry and Constellation’s broad business.
What gives you confidence that the demand you’re seeing now has staying power?

Representatives of Constellation (at left) and Meta (at right) at the Clinton site celebration in August. (Photo: Constellation)
It’s challenging right now to get a good feel for where and when data centers are coming and how much energy they’re going to use on the system. What we argue is that existing data show there’s very clear demand. In our earnings call in early August we put out a slide showing that existing data centers use quite a bit more power than they did before. And if you look at what’s getting built, a typical data center two years ago was in the 10- to 20-MW range. Those getting built today are in the 50- to 100-MW range, with some up to 1 GW.
You can tell there’s a real demand for the AI economy in the amount of spend that’s coming from the large tech companies. The most recent estimates for this year for the top four—Amazon, Google, Meta, and Microsoft—are over $350 billion. Clearly a significant amount of demand is out there.
I’ve heard both you and Bryan Hanson—executive VP and chief generation officer of Constellation Energy—warn that double counting may be at play, exaggerating data center demand. Can you explain?
Because there is a lot of activity in the industry, you have a lot of people trying to meet those needs with speculative data centers. They buy property, and typically before they even have a customer, before they have the financing to build a $10 billion data center, they have to get a utility service interconnection. These developers are potentially putting in multiple utility service requests to see which ones move fastest, and when they get that answer, they try to turn that site into a data center.
There’s also some regulatory uncertainty. States have different policies that often change as these companies go through the building process. Companies trying to figure out where they can get power fastest and most reliably end up weighing some different options.
Utilities are having a hard time parsing these speculative requests. They’re doing their best and trying to add requirements—whether that’s additional financial requirements or details on customer arrangements—but you can’t get too far down that path if you don’t know when you’re going to be able to get those customers power.
In our business, the location matters quite a bit because if it all gets built in one place, that changes the way you plan from an infrastructure perspective. I appreciate all the work that’s happening on the utility front and with the RTOs [regional transmission organizations] trying to figure out better ways to do this. But it’s difficult to figure out how much demand is coming to an individual location. To the extent we know where the end-use customers are, that’s helpful. Our deals have been with big tech companies because we know there’s an end-use customer.
How has it been working with hyperscalers, specifically on the Microsoft and the Meta deals?
We’ve always had good relationships with those customers. Constellation serves three-quarters of the Fortune 100, so working with hyperscalers is not new from our perspective.
I’d say there are two things that are interesting about working with those customers. First, they have strong sustainability goals. They’ve been vocal, and we’ve heard repeatedly that nuclear is a good fit with what they’re trying to accomplish from a sustainability perspective.
The second thing is their long-term perspective. Typically in competitive markets a customer might be looking for a short-term deal for two or three years of power. Getting long-term revenue certainty from hyperscalers has been super helpful, because we are making long-term investments at these sites.
Constellation is competing with other power producers to land the best deals and with new reactor developers offering to site smaller reactors at the point of need. How is Constellation marketing its fleet of larger reactors that were sited decades ago?
I think there are a couple of things that are important in this question. The first is it does take a long time to get new things built. With these needs coming in the near term, you have to figure out what you can do in the existing system.
The first priority is making sure that the existing fleet has some revenue certainty and we can extend the license lives of the existing assets. We have a number of plants that are coming up on their license renewal dates. We’re trying to figure out how to renew those licenses and make the long-term investments required to operate those plants at the safe and reliable levels that we’re used to. That was a big part of the deal we did at Clinton. Clinton’s license expires in 2027, and this deal allows us to renew it for another 20 years.
Renewing those licenses and having revenue certainty for a period of time allows you to consider how you add new capacity to units. Uprating is one way to get additional capacity, but that is not a one-size-fits-all solution. Some units are candidates for considerable uprates, while others have less potential and some have already done all possible uprates. What we’ve said is that we have about a gigawatt’s worth of additional uprates we can do across our fleet, pending some customer commitments. We’re working toward getting those on the system, which is faster than building new capacity.

Constellation’s Crane site on Pennsylvania’s Three Mile Island. (Photo: Constellation)
Where—and when—does new capacity come into the picture?
One option is restarting existing reactors that have shut down. There’s only a handful of those across the country, of course, so that’s not a sustainable solution, but it does get us—in the case of the Crane Clean Energy Center—an 835-MW pickup by 2027. Between the gigawatt’s worth of uprates and the 835 MW at Crane, we’re adding the equivalent of about two large reactors to the existing fleet.
By having existing assets run for an additional 20 years, we also preserve great locations for new builds. At those locations, you have existing infrastructure, community support, and good technical people who live in those areas and are capable of supporting new builds. When it comes to getting new generation built in the U.S., one of the best places to do that is at an existing reactor site.
Constellation has said the deal with Meta opens the possibility of building a new reactor at Clinton. Is new generation incentivized? Could it be folded into the PPA with Meta if and when it’s built?
We haven’t disclosed any details on how that would work, but it is something that is of interest, and we’re exploring that opportunity.
Returning to Constellation’s planned uprates, licensing timeline changes at the Nuclear Regulatory Commission mean you can likely get NRC approval for those uprates more quickly than you could last year. Will this allow Constellation to turn planned uprates around faster?
The short answer to your question is that for the uprates that are regulatory driven, I agree that you can probably pull some of the regulatory work to the left. The real question is the physical work that needs to happen. The bigger uprates generally require large, long-lead-time equipment, and those projects are probably constrained by supply chain challenges for that equipment rather than by anything else.
We haven’t disclosed specifics on many of the uprates. The only ones that we’ve talked about publicly are the turbines that we’re doing at Braidwood and Byron—those projects are underway—and we expect to get 30 MWe from uprates at Clinton. We announced as part of the Pennsylvania Energy and Innovation Summit in July that we have a potential uprate at Limerick. That is an extended power uprate that includes some pretty long-lead-time equipment. We’ve already started thinking about how to make sure that we can meet the timelines that we have planned, for example, by either having those components on order or by working through the engineering and design on those projects.
What are some other ways Constellation is trying to meet growing demand with its existing assets, such as demand response tools or battery storage?
Demand response is definitely something that we are working on. We are partnering with a company called Grid Beyond to get more demand response on the system and to develop solutions that let our customers aggregate load and potentially use it to help offset some of the concerns that exist today around peak demand and reliability.
From a battery storage perspective, we have evaluated that and continue to evaluate where we can potentially play in that space. We are in the process of acquiring Calpine, which does have some battery storage assets. For legal reasons, we can’t really talk much about what we’re doing with Calpine assets until that deal closes, so I’ll just leave it at that.
Nuclear plants do best with a stable, predictable load, but data center loads can fluctuate, especially when they’re serving generative AI. What are the challenges to integrating the power supply and the load and protecting your nuclear assets?
Yeah, it’s a really good question. I’m not an engineer, so I’ll give you my nontechnical answer. I would frame it by saying a lot of the data centers getting built today are on the grid. All our deals so far have been front-of-the-meter PPAs. They’re not directly connected to the plant, so the load variability that you see coming from potential data centers gets managed like every other load variability. There have been some studies about how to do that best, knowing that we could be seeing some significant variability in size relative to what exists on the grid today. A lot of people are working on the solutions to that problem.
The second thing that’s interesting, at least from my perspective, is that those problems are also being considered in the design of racks in the data centers. They’re designing racks to balance some of the compute variability by looking at workloads both in terms of when they happen and in terms of their power draw. NVIDIA in particular just came out with their latest rack design, which includes some balancing components to smooth out some of the load.
Through the Electric Power Research Institute’s DCFlex project, we’re working with the hyperscalers, the generators, and the utilities, trying to figure out ways in which we can have more flexible data centers to support peak load and manage the compute in a different manner to protect the grid or protect the generators.
I think the approach to integrating the grid and the load is a combination of those two things: the macro scale of using the grid and the balancing effect that comes from more resources, and the micro scale of how the data center itself operates.
We at Constellation are not the ones managing the transmission and distribution system, and we’re not the ones doing the data center build-out and design work itself. But I’m involved in understanding how the other stakeholders are trying to work through that. And I think they are all coming up with different solutions that will hopefully avoid any concerns there.
Understanding that Constellation first wants to get maximum value out of its existing assets, what is the plan for new nuclear builds?
There’s definitely a need for new nuclear, and there is a role for us to play. The short answer to your question is that there is a lot of activity in the space and people are working on different solutions.
Trying to figure out which solutions can scale quickly from first-of-a-kind to nth-of-a-kind has been a priority. We’re figuring out how we can help facilitate that, potentially with projects at our existing sites or through support on the technology development side, and we’re looking at ways we’ve done that in the past. Getting a better understanding of how we can move those technologies quickly down the cost curve is really the focus of our new-build effort.
Does that include looking at new ways of collaborating with end users on siting, offtakes, or operation, for example?
The answer is yes. I can’t go into detail, but yes, we talk to customers on a regular basis about the potential for new nuclear, where facilities might be located, how we think about the offtake agreements, how we think about risk sharing—all of those are part of the discussions. Individual discussions are all confidential, so I can’t really get into those details.
Similarly, on the construction side, thinking through potential folks we can partner with is something that we’ve had a lot of conversations on. There’s been a lot of activity there, but not a lot that we can talk about in specifics at this time.

Constellation’s executive vice president and chief strategy and growth officer, Kathleen Barrón, speaks to attendees at the Clinton site celebration on August 26. (Photo: Constellation)
What does Constellation like about what it sees in current policy and legislation and what are you looking for more certainty on?
I think the thing that we like, and that has been very clear, is the bipartisan support for nuclear. You saw that in the One Big Beautiful Bill Act, which preserved the 45U production tax credit for the existing nuclear fleet and also kept new nuclear eligible for the 45Y production and 48E investment tax credits.
We are getting a lot of support from the administration and from the Democrats. Seeing both of those groups come together behind a type of technology, and nuclear in particular, has been something that we’re super proud of.
That said, it’s hard to get new reactors built. There are a lot of people working in the administration on ways in which we can put all that together. Making sure that there’s the right level of risk sharing and protection for the companies that want to take that on and figuring out market design elements that will help are some of the biggest issues from a policy support standpoint.
It’s very exciting to be part of this industry today, and advocacy is still important despite the bipartisan support. I think it doesn’t hurt to continue to educate folks on the benefits of nuclear and the attributes that it provides to the system.
Does Constellation see itself in a leadership position, modeling how nuclear power can support the data economy?
Yes, I would say we definitely see ourselves as a leader in the industry, and we want to stay in that position. We have a role to play in making sure the existing fleet runs for as long as possible and at excellent levels. Where new nuclear can be added, we have a role as well.
So, I would say yes, we certainly want to be a big player as it relates to how we participate and also in terms of thought leadership, which requires a lot of effort from the industry. We work closely with a lot of industry partners trying to figure out the best ways to do that and how Constellation can contribute.