Bringing the age of steam into the information age

Moorburg (Germany): in late February 2015 regular operation of the first of the two units began. Photo: © Vattenfall.

‘Greetings from the Stone Age’ proclaimed the German newspaper Süddeutsche Zeitung upon the opening of a brand new coal power plant near Hamburg this year, reflecting the growing view that coal power is yesterday’s technology. However, the mounting pressure faced by coal plants to stay relevant in a cleaner power sector is actually compelling them to make use of increasingly futuristic technologies. Nowhere is this more the case than in a plant’s control system, which acts as its central nervous system and is tasked with finding a ‘sweet spot’ that satisfies the often conflicting goals of high energy efficiency and cleaner emissions.

On top of this, as the use of intermittent wind and solar power expands rapidly in countries like Germany and the USA, coal plants are increasingly obliged to turn their output up and down, making it even more challenging for their control systems to keep everything running smoothly. In these demanding conditions, the system needs both more sensory data to work with and more advanced software capable of making sense of the daunting complexity of a large power plant.

In the past, the furnace of a coal plant was essentially a black box into which coal and air were fed, and from which a mixture of gases and ash emerged on the other side. Relatively little information was available on the exact nature of the flows in and out, and still less on the combustion processes occurring within, making it difficult to adjust the critical ratio between air and coal accurately. Coal needs enough air to burn completely, but too much air represents an energy loss to the plant, as well as encouraging the formation of strictly regulated pollutants known as oxides of nitrogen (NOx). Moreover, with large furnaces feeding coal and air through dozens of separate burners, hitting the best overall ratio is no guarantee that the whole furnace is well mixed and free from areas with too much or too little air. To help the control system balance out and optimise the combustion process, plants have been making use of newer technologies which give them a much better idea of what's going on inside the furnace.
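To make this trade-off concrete, here is a minimal Python sketch which scores candidate excess-air levels against a toy penalty combining incomplete combustion on one side with stack heat loss and NOx formation on the other. Every coefficient is invented for illustration and is not plant data.

```python
# Illustrative only: toy model of the excess-air trade-off in a coal furnace.
# Every coefficient below is made up for demonstration, not taken from a plant.

def combustion_penalty(excess_air: float) -> float:
    """Score a candidate excess-air fraction (0.0 = stoichiometric air).

    Too little air -> unburned carbon and CO (incomplete combustion).
    Too much air   -> extra heat carried out of the stack and more NOx.
    """
    if excess_air < 0.0:
        return float("inf")  # sub-stoichiometric firing is off-limits here
    unburned_loss = 20.0 * max(0.0, 0.15 - excess_air) ** 2  # air starvation
    stack_loss = 0.8 * excess_air                            # heat lost with surplus air
    nox_penalty = 1.5 * excess_air ** 2                      # NOx grows with excess O2
    return unburned_loss + stack_loss + nox_penalty

# Crude one-dimensional search for the 'sweet spot'
candidates = [i / 100 for i in range(0, 51)]  # 0% to 50% excess air
best = min(candidates, key=combustion_penalty)
print(f"Best excess air in this toy model: {best:.0%}")
```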

US-based company Zolo Technologies installs a grid of infrared lasers which criss-cross the furnace and use spectroscopy to map out levels of critical gas components like oxygen, carbon monoxide (a key indicator of poor combustion), and oxides of nitrogen. An alternative approach, for mapping temperature variation across the furnace, is provided by Enertechnix, who place sound wave transmitters and receivers around the furnace to measure the temperature-dependent speed of sound through the hot combustion gases. The detailed information generated by these systems is then fed to the control system and used to balance out and optimise the levels of each parameter.
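The acoustic technique rests on a simple piece of physics: in an ideal gas, the speed of sound is c = √(γRT/M), so the transit time of a pulse along a known path gives the average temperature along that path. Below is a minimal sketch; the γ and molar mass values are assumptions for roughly air-like flue gas, not figures from either company.

```python
# Acoustic pyrometry sketch: infer gas temperature from the transit time of a
# sound pulse across the furnace. Assumes ideal-gas behaviour and roughly
# air-like flue gas properties; all numbers are illustrative.

GAMMA = 1.33   # heat capacity ratio of hot flue gas (assumed)
M = 0.0295     # molar mass of flue gas in kg/mol (assumed, close to air)
R = 8.314      # universal gas constant, J/(mol*K)

def path_temperature(path_length_m: float, transit_time_s: float) -> float:
    """Average temperature along one transmitter-receiver path.

    Speed of sound in an ideal gas: c = sqrt(gamma * R * T / M),
    rearranged to T = c**2 * M / (gamma * R).
    """
    c = path_length_m / transit_time_s
    return c ** 2 * M / (GAMMA * R)

# A 20 m path crossed in 25 ms implies c = 800 m/s
t_kelvin = path_temperature(20.0, 0.025)
print(f"Path-average temperature: {t_kelvin:.0f} K ({t_kelvin - 273.15:.0f} °C)")
```

Combining many such criss-crossing paths then allows the temperature field across the furnace to be reconstructed tomographically.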

Even with the extra data generated by sensors like these, a power plant has such a large number of variables to adjust that it can be difficult for operators to identify exactly which combination will give the best results, particularly when the grid's power demand can change so unpredictably. Over the last decade, plants have therefore begun to install advanced software known as 'combustion optimisers', which helps control systems by generating a complex model of how each combination of control actions will affect the plant. Rather than being based on physical principles, these models use mathematical algorithms such as neural networks, which are trained on operational data to mimic the response of the plant to various inputs. With more and more computing power readily available, these models have become increasingly powerful and capable of predicting how the plant will respond even in rapidly changing conditions.
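As a rough illustration of the idea, the sketch below fits a small neural network to synthetic 'historian' data relating two control settings to a NOx figure, then searches the trained model for promising settings. It assumes scikit-learn is available, and the data and its underlying relationship are entirely made up; a real combustion optimiser trains on genuine operational data with many more variables.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic 'historian' data: inputs are excess-air fraction and burner tilt
# (degrees); the NOx relationship below is invented for demonstration.
X = rng.uniform([0.05, -10.0], [0.40, 10.0], size=(500, 2))
nox = 300 + 600 * X[:, 0] ** 2 + 2.0 * X[:, 1] + rng.normal(0, 5, 500)

# Train the data-driven surrogate model of the plant's response
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, nox)

# Search the trained model for the settings it predicts give the lowest NOx
grid = np.array([[a, t] for a in np.linspace(0.05, 0.40, 30)
                 for t in np.linspace(-10.0, 10.0, 30)])
best = grid[model.predict(grid).argmin()]
print(f"Suggested settings: excess air {best[0]:.2f}, tilt {best[1]:.1f} deg")
```

In practice the search would also be constrained by safety limits and would run continuously as grid conditions change.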

The importance of these sensor and control technologies has been fully recognised by the United States Department of Energy, which is funding a research drive into how they can be developed even further to help bring the country's huge fossil fuel fleet into a low carbon age. This programme is focussing particularly on designing sensors which will be able to survive in the even harsher conditions likely to prevail in coal power plants of the future, whether they are using hotter furnaces to generate higher steam temperatures, or first 'gasifying' the coal for use in more efficient gas turbines. Many of the sensors being developed are miniaturised, solid-state devices which can be packaged and deployed in large numbers to maximise the flow of data from the process.

However, traditional silicon-based chips are not able to withstand the temperatures of over 1000°C encountered in coal furnaces, gasifiers, and gas turbines. Novel materials such as high temperature ceramics or silicon alloys are instead being employed to fabricate more robust devices, with new gas sensor designs even making use of high surface area 'nanomaterials' to enhance their performance. In the US and elsewhere, there is growing interest in replacing microelectronic sensors altogether with optical devices, which use light instead of electrons as their medium for sensing and transmitting information. Not only can optical fibres be made from high temperature materials like sapphire, but they are also immune to signal interference from the electromagnetic noise which abounds in power plants.

Miniaturised devices are also possible, as sections of individual optical fibres can be engineered to modulate light according to the temperature, pressure, and chemistry of their environment. Another property which makes optical fibres particularly useful is their ability to report simultaneously on the environment along their entire length. One idea being investigated at the University of Massachusetts, for example, is to surround the furnace with optical fibres which can both generate and detect sound waves, allowing the temperature profile of the whole space to be mapped out in three dimensions. Whilst many of the basic concepts being explored have already been used in other industries, making them viable at such high temperatures poses a real challenge to researchers. To better protect these sensors, as well as bring them closer to the processes they're monitoring, researchers are also attempting to embed them directly into power plant components such as steam pipes and turbine blades.
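As an example of such an engineered section, a fibre Bragg grating reflects a narrow band of light whose wavelength shifts with temperature. The sketch below converts a measured shift into a temperature change; the two coefficients are typical textbook values for silica fibre and are assumptions, not figures from the article.

```python
# Fibre Bragg grating (FBG) sketch: an FBG reflects light at the Bragg
# wavelength lambda_B = 2 * n_eff * grating_period. Heating the fibre shifts
# this wavelength so that d(lambda)/lambda = (alpha + xi) * dT.
# Coefficients below are typical values for silica fibre (assumed).

ALPHA = 0.55e-6   # thermal expansion coefficient of silica, 1/K
XI = 6.7e-6       # thermo-optic coefficient of silica, 1/K

def fbg_delta_t(lambda_ref_nm: float, lambda_meas_nm: float) -> float:
    """Temperature change inferred from the Bragg wavelength shift."""
    relative_shift = (lambda_meas_nm - lambda_ref_nm) / lambda_ref_nm
    return relative_shift / (ALPHA + XI)

# A grating written at 1550.00 nm and read back at 1551.12 nm
print(f"Temperature rise: {fbg_delta_t(1550.00, 1551.12):.0f} K")
```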

This idea relies on the relatively new and revolutionary technology of additive manufacturing, in which solid objects are built from the bottom up by selectively binding together successive layers of a precursor material. For example, a turbine blade can be fabricated by using a pair of lasers to melt together powdered metal, point by point and layer by layer. Using this technique, researchers at Heriot-Watt University in the UK have placed optical fibre sensors within a blade during the fabrication process, producing a 'smart part' which can report on how it is coping in the demanding physical and chemical environment of a gas turbine. This approach allows the failure of components to be pre-empted, and equipment to be run closer to its limits with reduced risk.
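How such a smart part might be used to pre-empt failure is sketched below: readings from the embedded sensors are compared against design limits, with an alarm raised before either limit is actually reached. The thresholds and readings are invented for illustration.

```python
# Sketch of supervising a 'smart' turbine blade: classify each reading from
# its embedded fibre sensors against (invented) design limits, raising an
# alarm before the limits are reached so failure can be pre-empted.
from dataclasses import dataclass

@dataclass
class BladeLimits:
    max_temp_c: float = 950.0    # assumed design metal temperature, °C
    max_strain: float = 2.0e-3   # assumed allowable strain
    margin: float = 0.9          # alarm at 90% of either limit

def check_blade(temp_c: float, strain: float, limits: BladeLimits) -> str:
    """Classify one embedded-sensor reading."""
    if temp_c > limits.max_temp_c or strain > limits.max_strain:
        return "TRIP"    # beyond design limits: unload immediately
    if (temp_c > limits.margin * limits.max_temp_c
            or strain > limits.margin * limits.max_strain):
        return "ALARM"   # approaching limits: schedule inspection
    return "OK"

limits = BladeLimits()
for temp_c, strain in [(820.0, 1.2e-3), (940.0, 1.9e-3), (965.0, 1.5e-3)]:
    print(f"{temp_c} °C, {strain:.1e} -> {check_blade(temp_c, strain, limits)}")
```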

The prospect of plants containing large networks of miniaturised sensors has also prompted a new look at the computational approaches which can make best use of the growing quantities of data. By combining wireless communication technology with microelectronics which shift processing power to the level of the sensor, there is a possibility of highly interconnected networks of ‘smart’ sensors which can communicate and make control decisions amongst themselves, without needing higher level supervision. Some of the algorithms best suited to managing this kind of scenario take their inspiration from biological systems such as ant colonies, where complex behaviour emerges from a large number of entities making simple decisions.
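A simple flavour of this kind of decentralised coordination is gossip consensus, sketched below: each sensor node repeatedly averages its reading with those of its neighbours, and the network settles on a shared estimate with no central supervisor, the collective behaviour emerging from simple local rules. The topology and readings are invented for illustration.

```python
# Gossip consensus sketch: four 'smart' temperature sensors agree on a shared
# estimate purely by exchanging values with their wireless neighbours.
# Node names, links, and readings are all invented for illustration.

readings = {"s1": 1410.0, "s2": 1455.0, "s3": 1390.0, "s4": 1502.0}
neighbours = {
    "s1": ["s2", "s3"],
    "s2": ["s1", "s4"],
    "s3": ["s1", "s4"],
    "s4": ["s2", "s3"],
}

for _ in range(20):  # a few gossip rounds
    updated = {}
    for node, value in readings.items():
        local = [value] + [readings[n] for n in neighbours[node]]
        updated[node] = sum(local) / len(local)  # simple local averaging rule
    readings = updated

print({node: round(v, 1) for node, v in readings.items()})  # values converge
```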

This could be an effective means of controlling larger, more complex power plants which defy attempts to create a global model of the system. Such is the size of the coal fleet in the USA that the Department of Energy has calculated that even the incremental improvements to plant efficiency and reliability gained from these technologies could represent yearly savings of 358 million dollars and 14.4 million tons of carbon dioxide. Considering the even greater impact that could be made by applying successfully commercialised technologies to the still larger coal capacity in China, the value of such improvements in mitigating the impact of the power sector cannot be overstated. Above all, it demonstrates that this kind of hi-tech laboratory research need not be limited to making our computers run faster, but can also help clean up the power they consume – even when it comes from 'stone age' sources like coal.

Toby Lockwood

Technical author and analyst at the IEA Clean Coal Centre