Forward-looking: The Chips Act was born out of a desire to boost the US economy after the pandemic and to improve the country's competitiveness on the global stage. Little did its advocates know how successful it would prove to be. Thanks to the investments it has spurred, the US is on track to produce around 20% of the world’s most advanced chips by 2030, up from zero today.

Last week President Biden visited Syracuse, NY, to do something government officials typically do: tout a massive investment in the local economy. But this was not just any investment – it was $6.1 billion awarded under the CHIPS and Science Act to Micron Technology, which plans to spend $100 billion building a manufacturing campus in Syracuse’s northern suburbs, along with billions more on a factory in Boise, Idaho.

The investment will have a significant impact on Syracuse, where officials hope it will revive the local economy. It has a larger significance as well: it is the latest in a series of federal grants doled out under the Chips Act that have spurred an unexpected investment boom across the US.

Multi-billion-dollar grants have gone to Intel for projects in Arizona, Ohio, New Mexico and Oregon; to TSMC for projects in Arizona; and, most recently, to Samsung for projects in central Texas.

The US government has now allocated over half of its $39 billion in Chips Act incentives, and chip companies and their supply chain partners have announced investments totaling $327 billion over the next 10 years. Construction spending on manufacturing facilities for computing and electronics devices has also increased 15-fold.

Consider the impact of the Micron investment. Its Idaho facility is expected to be production-ready by 2026, followed by two facilities in New York in 2028 and 2029. The White House predicts the projects will create 20,000 construction and manufacturing jobs, as well as tens of thousands of indirect jobs in the surrounding regions.

It is doubtful the Act’s proponents envisioned such wild success when they were advocating for its passage. Their focus, instead, was on the dwindling competitiveness of the US semiconductor industry on the global stage.

As the Semiconductor Industry Association noted at the time, the share of modern semiconductor manufacturing capacity located in the US has eroded from 37% in 1990 to 12% today, mostly because other countries’ governments have invested ambitiously in chip manufacturing incentives and the US government has not. Meanwhile, federal investments in chip research have held flat as a share of GDP, while other countries have significantly ramped up research investments.

Fast forward a few years, and Commerce Secretary Gina Raimondo is claiming that by 2030 the US will probably produce around 20% of the world’s most advanced chips, up from zero percent today.

This will go far in reducing the US’s dependence on global supply lines – a dependence whose risks were painfully brought home during the pandemic. It probably won’t mean complete self-sufficiency, given that the US consumes over a quarter of the world’s chips, writes Chris Miller, author of Chip War, in the Financial Times.

“Production of smartphones and consumer electronics would be disrupted in the event of a crisis in east Asia, an ever looming fear,” he says. “But this production would be roughly enough for the needs of critical infrastructure like data centers and telecoms.”