Here is an interesting set of articles that tries to cover the difference between “interpretive” net energy and “real” net energy for a fusion reactor. Our standing advice when reading any fusion article: true net energy means producing more energy, from point A to point Z, than was put into the system over that same span. So if it took 50 megawatts at point A to start the reactor, the reactor then produced 60 megawatts of fusion energy (or energy converted from heat), and the reactor consumed 5 megawatts to keep running until point Z, then you have produced net energy of 5 MW. The math: -50 MW + 60 MW - 5 MW = 5 MW of net energy. But if it took another 6 megawatts to run an additional piece of equipment to keep the fusion under control, you would have a net loss of 1 MW. To date, no reactor has produced net energy. Most are not even close, and many are not even designed to do so. So pay close attention to “net energy” claims: the energy used versus the energy created matters.
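The accounting above can be sketched in a few lines of Python. This is a minimal illustration of the point-A-to-point-Z bookkeeping; the function name and the megawatt figures are the hypothetical values from the example, not measurements from any real reactor.

```python
def net_energy_mw(startup_mw, fusion_output_mw, *consumption_mw):
    """Net energy = everything produced minus everything put in, A to Z."""
    return fusion_output_mw - startup_mw - sum(consumption_mw)

# First example: 50 MW to start, 60 MW produced, 5 MW to keep it running.
print(net_energy_mw(50, 60, 5))      # 5 -> net gain of 5 MW

# Second example: an extra 6 MW of control equipment tips it negative.
print(net_energy_mw(50, 60, 5, 6))   # -1 -> net loss of 1 MW
```

The key point the sketch makes explicit: every consumer of power between A and Z goes on the cost side of the ledger, not just the startup input.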
(Editor’s note: the first article is a little harsh in calling the claim “fake.” The second really dives into the ITER claims. If you follow the trail through Google searches, you’ll find it generally tracks how the claims were misrepresented. As they say, the devil is in the details. The editor hopes that all will be forgiven by the time net energy worthy of commercial operation is achieved.)
Article 1: http://news.newenergytimes.net/2019/02/12/first-light-fusions-fake/
Article 2: http://news.newenergytimes.net/2017/10/06/the-iter-power-amplification-myth/