Music charts, bestseller lists and sport ladders are well-established measures of success. Fans will anxiously watch their favourites climbing the charts, crowing with delight when they make the top of a list.
The photovoltaics industry has its own leaderboards, and in the world of solar, they’re just as exciting as a Hottest 100.
Every 6 months since 1993, the journal Progress in Photovoltaics has published a new set of solar cell efficiency tables, recording who in the world has made the most efficient solar cells and panels of each type. The most recent instalment appeared in July.
Solar cell efficiency – the amount of electricity a cell generates, compared with the amount of energy it receives from the Sun – is crucial to the technology's adoption. The more efficient a solar panel, the fewer panels you need to meet electricity demand.
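In rough terms (this is the standard textbook definition, not a formula printed in the tables themselves), efficiency is the cell's electrical power output divided by the solar power falling on it:

$$\eta = \frac{P_{\text{out}}}{P_{\text{in}}}$$

So a 27.3% efficient cell turns roughly 273 watts of every 1,000 watts of sunlight striking it into electricity.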
The tables are split not by genre, but by type of cell – from well-established crystalline silicon cells (the current record is 27.3%, held by LONGi), through to highly experimental technologies like perovskite and organic cells.
Companies and research institutions around the world celebrate making it on to these tables, and compete hard to stay at the top.
So why have solar leaderboards?
According to Professor Martin Green, the University of New South Wales (UNSW) expert who leads each paper, the tables help to avoid “mayhem in the journals”.
“If the result doesn’t appear in there, it’s not treated as credible,” Green tells Cosmos.
Certifying efficiency has always been a crucial part of solar research.
“The solar program really got kickstarted during the oil embargoes of the 70s,” says Green.
US president Richard Nixon set up Project Independence, which channelled funding into a number of alternative energy research programs, including solar power.
“The US subcontractors couldn’t just claim a cell result. They had to have them measured at one of the US National Laboratories.”
Green and his colleagues began setting their own efficiency records at UNSW in 1983, and once they held them, they were keen to make sure that any potential challengers had their work measured just as accurately.
“There’s a lot of pressure on academics to get the next grant. So you always tend to get some exaggeration of results. Students are often doing the measurements, and there’s pressure on them to get good results, and they might not even know exactly what they’re doing,” says Green.
“So we were keen, in particular once we got the world record, to make sure no one just said they passed us without having it certified.”
The launch of the new journal Progress in Photovoltaics in 1993 was the next catalyst.
“I just thought, ‘Oh, that’d be a good segment to have in that – just keep track of all the records in the different categories’,” says Green.
And the rest is history. Green and an international group of colleagues collate results from a group of hand-picked test centres, and publish them twice yearly.
“We don’t accept measurements from independent labs. They have to be particular labs that we know can do the measurements well,” says Green.
Some of these test centres have been working with Green since the 1980s – “I’d be there in the morning to be able to catch them in the afternoon in the US, taking the results down over the phone a few days after I mailed the cells off,” he says.
But they’re also keen to bring new centres into the fold.
“We’re expanding, because the manufacture has moved to Asia,” says Green.
“We have one test centre in China, and we’re looking to set up 2 more there in the fullness of time, just because that’s where a lot of the best results are coming from now.”
As well as expanding the test centre pool, the team regularly checks the accuracy of both newer and established centres.
“We’ve just conducted a round robin test where we prepare some cells, then circulate them to a large number of test centres, and see how the measurements come back. So in the next version of the tables, we’re going to include the results from that.”
As well as dispelling extraordinary claims about solar, Green says he has aimed to use the tables to set standards that filter into industry.
“We have criteria for how you measure the cells, and they’ve got to stay stable for a certain period.”
Highly experimental solar cells can often have super-high efficiencies – for a few seconds. Getting them to last is much trickier.
“It’s not all that onerous. You’re talking about output staying constant for 3 minutes, is what the criteria essentially boils down to,” says Green.
The cells also have to be above a certain size to make the tables.
“The cells have got to be at least one square centimetre, which isn’t all that big,” says Green.
“But some of the cells are about 3 mm², which means you don’t have to add features that you would have to if you wanted to use the cell seriously.”
When they see something that doesn’t meet these criteria, but is still interesting for the field, the team adds it to a “Notable Exceptions” table below the leaderboard.
They also include information on common measurement errors, and observations on how the field is moving.
After more than 30 years on top, UNSW finally lost its record for the most efficient silicon solar cell in 2014. Green says this is part of a broader trend.
“The companies are all the ones holding the records in the silicon area, because they have these big research labs with thousands of people staffing them,” he says.
Chinese solar company LONGi has been topping the silicon charts in recent years.
“It’s established itself as top of the pecking order in terms of its research capabilities,” says Green.
But universities still boast records in less commercialised cells. And this is often where the most interesting trends happen.
“Perovskites – we got our first certified result in 2014 and that was 14% efficient. Now they’re up to 26.7% so that’s nearly doubled in 10 years. That’s really quite remarkable progress,” says Green.
Now in their 64th edition, the tables have been a useful way to both record – and sometimes influence – the field of photovoltaics.
“People work hard to get a result in there. It adds more value for them to have documentation like that,” says Green.
“It’s served multiple purposes – keeping the field honest, I think, is probably the most valuable.”