Air quality is a growing problem around the world, particularly in dense urban environments. When we experience poor air quality, or see smog hanging over a city, we tend to think of congested traffic and the impact cars have on the air we breathe. Combustion engines are indeed a big problem, but certain industries escape public scrutiny when it comes to air pollution, and construction is one of them. According to the London Atmospheric Emissions Inventory, construction sites are responsible for approximately 14.5% of PM2.5 emissions, among the most dangerous fine particulate pollutants. The primary culprit is diesel, the fuel that powers most construction equipment. To put this into the context of human health: approximately 36,000 early deaths in the UK each year are linked to air pollution, 9,400 of them in London alone. Air quality has been on Qflow’s radar for a long time and, needless to say, the situation calls for change.
Last year, we launched our Air Quality module on closed trials for the first time in order to help construction sites more closely monitor their air quality and use this data to reduce air pollution on their sites. Last week we hosted an air quality workshop at Qflow HQ, with a range of air quality experts employed in a variety of contexts: from consultancy working across multiple sectors to in-house professionals working across a multitude of construction project types.
Workshops form part of our product discovery process. They are vital to ensure our products are not only aligned to the rules and regulations of that subject area, but that we are truly discovering the critical pain points that are time-consuming, difficult and sometimes borderline impossible to consistently avoid. In this article, we are going to share some of the outcomes from this workshop and a few key insights.
What was the most insightful discovery?
Transparency is key. Showing the data, explaining the context and engaging clients and other stakeholders diplomatically is essential to managing air quality effectively and efficiently. This is especially true in dense urban areas, where correlation does not always imply causation. Being able to say “this happened, we identified this as the cause, so we changed this” is essential for gaining clients’ trust.
Air Quality is becoming more and more mainstream
In the public sphere, air quality is creeping up the agenda. The rise of electric vehicles, campaigns encouraging the public away from private modes of transportation (such as Transport for London’s Active Travel campaign) and activism by groups like the Environmental Defense Fund (EDF) are pushing clients to hold contractors more closely to account. It is categorically no longer a fringe issue. Perhaps this is best demonstrated by local authorities stopping works on site, bringing the cost of project delays and additional consultant fees, alongside increasing coverage in mainstream newspapers (24th Feb, Evening Standard).
Environmental professionals need to plan ahead, proactively managing air quality so that they are not caught off guard by a client, or exposed by a third party such as a neighbour or activist group. Effective monitoring is the first step: at a minimum, a coherent sampling regime where permanent monitors prove uneconomical.
Learning, understanding, predicting
At present, the emphasis is very much on the environmental professional to bring their knowledge and experience of managing air quality to bear so that proactive steps can be taken. Otherwise, air quality management remains a reactive affair, perhaps best demonstrated by monitors triggering and notifying the relevant people of an air quality exceedance. By the time the exceedance has happened, it is already too late. There are ways around this, such as setting the trigger at a lower threshold, but this is at best a proxy.
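To make the reactive pattern concrete, here is a minimal sketch of threshold-based alerting with a lower early-warning level acting as the proxy described above. The limit values and function name are hypothetical, chosen purely for illustration, and do not reflect any particular regulatory standard or Qflow’s actual system.

```python
# Illustrative only: reactive alerting with a lower early-warning
# threshold as a crude proxy for prediction. Values are hypothetical.

PM25_LIMIT = 25.0    # hypothetical site action level, in µg/m³
EARLY_WARNING = 0.8  # warn once a reading reaches 80% of the limit


def check_reading(pm25: float) -> str:
    """Classify a single PM2.5 reading against the hypothetical limits."""
    if pm25 >= PM25_LIMIT:
        return "exceedance"     # too late: the breach has already happened
    if pm25 >= EARLY_WARNING * PM25_LIMIT:
        return "early-warning"  # the proxy signal: act before the breach
    return "ok"
```

The weakness is visible in the code itself: the early-warning band only fires once levels are already close to the limit, so a fast-rising pollution event can jump straight to an exceedance.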
The potential of data science techniques and artificial intelligence to shift air quality management from reactive to proactive cannot be overstated. Knowing an exceedance is about to happen, and understanding its likely causes, will enable incidents to be stopped in their tracks. Spatial data is especially important in this context, particularly on large infrastructure projects and/or in urban environments. Air quality is inherently difficult to model using traditional methods due to ambient levels, multiple dispersed sources and numerous contributing factors. Rather than focusing on dispersion modelling, which is notoriously difficult, it is better to focus on the outcome we are trying to achieve: avoiding likely exceedances.
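As a toy illustration of what “predicting an exceedance before it happens” could look like in its simplest form, the sketch below fits a straight-line trend to a short window of recent readings and projects it a few steps ahead. This is an assumption-laden stand-in for a real model (the function name, limit and horizon are all hypothetical), not the approach Qflow uses in production.

```python
# Illustrative sketch, not a production model: flag a likely
# exceedance by extrapolating a linear trend from recent readings.

def likely_exceedance(readings, limit=25.0, horizon=3):
    """Fit a least-squares line to the readings and project it
    `horizon` steps ahead; return True if the projection would
    breach `limit`. All numbers here are hypothetical."""
    n = len(readings)
    if n < 2:
        return False  # not enough data to estimate a trend
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = sum((x - mean_x) * (y - mean_y)
                for x, y in zip(xs, readings)) \
        / sum((x - mean_x) ** 2 for x in xs)
    projected = readings[-1] + slope * horizon
    return projected >= limit
```

A steadily rising series such as `[10, 14, 18, 22]` would be flagged well before the limit is reached, whereas a flat series would not; a real system would of course bring in far more variables, such as weather, works schedules and spatial context.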
At present, lessons learnt predominantly stay with the environmental professionals, occasionally recorded in articles or legacy documents. There needs to be a way of scaling this history, making more of it available and easily accessible.
The need to depoliticise and invigorate
Air quality and individual freedom can pull in opposite directions, tapping into wider political debates about individual freedoms versus societal benefits. It is important that air quality remains apolitical: purely factual, focused on keeping air pollutant levels down, and open-minded about how this is achieved.
One sobering thought, which may add urgency too late, is that air quality could be the asbestos issue of the future. It is certainly easy to imagine workers exposed to exceedances taking previous employers to court over health issues later in life.
So, what is Qflow doing on this?
Data collection, cleansing and aggregation can already be handled across multiple monitors and projects without issue. We are now focusing on predicting pollutant levels based on a range of variables. We have proven this in principle using standalone models, and are now refining those models and building out the software architecture to embed them within our products. Watch this space, and get in touch if you would like to come on the journey with us.