True. It takes more energy to get the laser up and running than the reaction produces, so overall it's still a net energy loss.
However, fusion has always been happening in the sun and stars, and now it happens in the lab; that's progress.
Someone must work out the details and fine-tune the process to make it practical.
Fusion has been happening in controlled laboratory settings for a while. Here's a PDF from LANL, a retrospective article written in 1983, that says "The first experiment in which thermonuclear fusion was achieved in any laboratory was done in 1958 with the Scylla I machine."
https://sgp.fas.org/othergov/doe/lanl/pubs/00285870.pdf. That was 64 years ago--we've been screwing around with fusion for much longer than screwing around with getting back to the moon.
What is new here is that the amount of energy needed to trigger the reaction is less than what was produced by the reaction.
As others have pointed out, the lasers that supplied the energy for the reactions are quite inefficient. Apparently the lasers in this experiment were built in the 80s, so we could get some efficiency improvement merely by not using ancient technology. But that alone would be nowhere near enough to solve the efficiency problem. Why are we doing this with 40 year old laser technology? Well...
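To make the gap concrete, here's a back-of-the-envelope calculation using the widely reported approximate figures for the December 2022 NIF shot (not numbers from this thread): roughly 2.05 MJ of laser energy delivered to the target, about 3.15 MJ of fusion yield, and on the order of 300 MJ drawn from the grid to charge the lasers.

```python
# Rough gain figures for the December 2022 NIF shot.
# Values are the widely reported approximations, assumed for illustration.
laser_energy_mj = 2.05   # laser energy delivered to the target
fusion_yield_mj = 3.15   # fusion energy released by the reaction
wall_plug_mj = 300.0     # approx. grid energy drawn to charge the lasers

# "Target gain" counts only the energy hitting the target: > 1, hence the news.
target_gain = fusion_yield_mj / laser_energy_mj

# "Wall-plug gain" counts what the whole facility consumed: far below 1.
wall_plug_gain = fusion_yield_mj / wall_plug_mj

print(f"target gain:    {target_gain:.2f}")    # about 1.54
print(f"wall-plug gain: {wall_plug_gain:.4f}") # about 0.01
```

So the reaction beat the laser light by about 50%, while the facility as a whole put in roughly a hundred times more energy than it got back, which is the inefficiency the parent comments are pointing at.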
I misplaced the link, but a few years ago I read an article about the history of funding of fusion research. Apparently several decades ago (early 70s?) a study was done as part of a research proposal to estimate the amount of funding needed worldwide to "solve" the problem of fusion energy production for the grid. The estimate had two extremes. One was the funding required to solve it as fast as possible (i.e., limited only by the time needed to do the work and construct facilities); of course that's guesswork. At the other end of the spectrum was the minimum amount, and I found that concept very interesting. It was basically the funding needed to keep research facilities going: power the lights, mow the grass, empty the trash, answer the phones, pay the security guards, and pay researchers' salaries, but not give them money to actually conduct experiments. The article succinctly summarized this as "the maximum amount of money you can spend in order to achieve nothing."

It is, unfortunately, the latter that governments have been spending for the last fifty years. I think the conclusion here is that if we had taken fusion seriously and treated it like a modern-day Manhattan Project, we could have solved this already, and we wouldn't be worried about global warming, fallout from failed fission reactors, or how many batteries are needed for when the wind doesn't blow (or how to recycle those batteries when they're exhausted).