Changes in Air Quality Infrastructure

An infrastructure investment that marries traditional air quality data sensors with big data analytics and visualization software would significantly improve monitoring agencies’ ability to manage air quality issues and respond to incidents.

Our country’s air quality infrastructure is about to be tested as never before. But unlike the infrastructure challenges that typically grab newspaper headlines – bridges that are falling down, roads in disrepair, and so on – the air quality infrastructure is facing a far more formidable opponent: progress.

To date, the air quality data used in spotting pollution problems, analyzing trends, and guiding effective responses has been collected by more than 370 government monitoring stations and thousands upon thousands of other monitors around the country. In the not-too-distant future, though, those monitoring stations and the air quality analysts who staff them seem likely to be overwhelmed by an avalanche of data unlike anything they have seen before.

Spurred by major improvements in the quality of sensor technology and a corresponding drop in price, it is now possible to create much denser monitoring networks capable of generating a very granular picture of air quality conditions. Chicago’s Array of Things, for example, features a network of interactive, modular sensor boxes installed around the city to collect real-time data on factors that impact liveability, such as climate, air quality, and noise.

The lower cost also allows monitoring equipment to be placed in more locations, including sparsely populated areas that previously went unmonitored.

Coupling these advances with the emergence of Internet of Things (IoT) communication platforms will result in a huge increase in the volume of air quality data available and, with it, a huge problem. Quite simply, the techniques now being used to analyze data from thousands of monitoring stations will no longer be effective when the number of monitors climbs into the millions. This onslaught of information could cause analysts to miss valuable insights, while monitoring agencies remain stuck in a permanently reactive state, unable to take proactive steps to minimize or prevent air quality issues.

Fortunately, there is a way to deal with this dramatic rise in data. Big data analytics software, already used extensively in other fields, could readily be adapted to collect and analyze the massive data sets that new monitoring technology will produce. Adding big data tools to the air quality infrastructure would enable analysts to rapidly review huge volumes of both structured and unstructured data while simultaneously applying the predictive models, statistical algorithms, and what-if analyses inherent in those tools.
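To make that concrete, here is a minimal sketch of what such a pipeline could look like, written against Apache Spark, one widely used big data engine. The storage path, column names, and alert threshold are all illustrative assumptions, not a prescribed design.

```python
# A minimal sketch: roll hourly PM2.5 averages out of a large sensor archive
# and flag exceedances. Path and schema (sensor_id, timestamp, pm25, lat, lon)
# are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("aq-rollup").getOrCreate()

readings = spark.read.parquet("s3://agency-bucket/sensors/")  # hypothetical path

hourly = (
    readings
    .withColumn("hour", F.date_trunc("hour", "timestamp"))
    .groupBy("sensor_id", "hour")
    .agg(F.avg("pm25").alias("avg_pm25"), F.count("*").alias("n_readings"))
)

# Flag hours whose average tops 35 ug/m3, an illustrative threshold borrowed
# from the US 24-hour PM2.5 standard.
alerts = hourly.filter(F.col("avg_pm25") > 35.0)
alerts.show()
```

Because Spark distributes this work across a cluster, the same few lines scale from thousands of monitors to millions, which is precisely the jump the air quality community is facing.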

As in its other applications, big data could also be adapted to present air quality data visually. This visual layer would also make it easy to combine air quality data with other sources. For example, readings from a sensor network could be overlaid on digital maps, providing an effective way to see pollution, determine its source, and identify which adjacent areas are most likely to be affected. Similarly, sensor data could be combined with feeds from weather agencies so that sudden wind shifts or changes in atmospheric conditions can be factored into a determination of the likely movement of pollutants.
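A minimal sketch of such a map overlay, using the open-source folium library; the coordinates and readings below are invented sample values, and the color cutoff is illustrative.

```python
# A minimal sketch: plot made-up PM2.5 readings over a street map so hot
# spots stand out visually. Values and coordinates are hypothetical.
import folium

readings = [(33.77, -118.19, 48.2), (33.80, -118.15, 12.4), (33.76, -118.22, 39.5)]

m = folium.Map(location=[33.78, -118.18], zoom_start=12)
for lat, lon, pm25 in readings:
    folium.CircleMarker(
        location=[lat, lon],
        radius=7,
        color="red" if pm25 > 35 else "green",  # illustrative cutoff
        fill=True,
        popup=f"PM2.5: {pm25} ug/m3",
    ).add_to(m)

m.save("air_quality_map.html")  # open in any browser to inspect the overlay
```

A weather feed could be folded into the same picture by drawing wind vectors on the map, giving analysts a single view of both pollutant levels and where they are headed.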

Clearly, an infrastructure investment that marries traditional air quality data sensors with big data analytics and visualization software would significantly improve monitoring agencies’ ability to manage air quality issues and respond to incidents. Armed with these tools, analysts could construct forecast models that show where problems are most likely to occur, so that proactive steps can be taken to minimize the effects on residents.
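In miniature, such a forecast model might look like the sketch below: a regression that predicts a station's next-hour PM2.5 from its current reading and wind conditions. The feature set and the synthetic training data are assumptions for illustration, not a recommended model.

```python
# A minimal sketch: train a regressor on synthetic station-hours and use it
# to forecast next-hour PM2.5. Features and data are invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(5, 60, n),    # current PM2.5 (ug/m3)
    rng.uniform(0, 15, n),    # wind speed (m/s)
    rng.uniform(0, 360, n),   # wind direction (degrees)
])
# Toy target: the next hour mostly looks like the current one, plus noise.
y = 0.8 * X[:, 0] + rng.normal(0, 3.0, n)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[42.0, 3.5, 270.0]]))  # forecast for one station-hour
```

A production model would of course draw on far richer inputs, but the workflow stays the same: fit on historical station data, score incoming conditions, alert on predicted exceedances.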

Similarly, analysts could respond to an odor complaint by generating a back-trajectory that identifies the likely source of the problem. This would allow the proper agency to respond quickly to the complaint with an answer based on real data, rather than anecdotal evidence.
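A toy version of the idea: starting at the complaint location and stepping backwards through hourly wind vectors to estimate where the odor came from. Real back-trajectory tools use full dispersion models; the wind values and coordinates here are invented, and the flat-earth distance math is a rough approximation.

```python
# A minimal sketch: walk upwind, hour by hour, from a complaint location.
# Wind direction follows the meteorological convention (degrees the wind
# blows FROM), so tracing backwards means moving toward that bearing.
import math

def back_track(lat, lon, winds):
    """winds: hourly (speed_m_s, direction_deg) pairs, most recent first."""
    path = [(lat, lon)]
    for speed, direction in winds:
        dist_km = speed * 3600 / 1000           # distance covered in one hour
        rad = math.radians(direction)
        lat += dist_km * math.cos(rad) / 111.0  # ~111 km per degree latitude
        lon += dist_km * math.sin(rad) / (111.0 * math.cos(math.radians(lat)))
        path.append((lat, lon))
    return path  # the last point approximates the likely source area

# Hypothetical complaint in Long Beach, CA, with two hours of westerly wind.
print(back_track(33.77, -118.19, [(3.0, 270.0), (2.5, 260.0)]))
```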

Ultimately, more data will shift the focus of air quality analysts from individual readings to the trends that emerge from them, enabling analysts to study the correlations between data sets and contextualize data with geography. Applying such an approach to an area like Long Beach, CA, for example, could help analysts conclude that while industrial facilities are typically blamed for air pollution in the area, the real culprit is automobile exhaust.
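The kind of cross-dataset comparison involved can be tiny in code, as in the sketch below. The hourly values are invented; in this toy data, a strong correlation with traffic counts alongside a flat industrial series is what would point toward vehicle exhaust.

```python
# A minimal sketch: correlate PM2.5 with candidate sources. All values invented.
import pandas as pd

df = pd.DataFrame({
    "pm25":     [22, 35, 48, 40, 18, 25, 52, 44],                # ug/m3
    "traffic":  [900, 1500, 2100, 1800, 700, 1000, 2300, 1900],  # vehicles/hr
    "industry": [40, 42, 39, 41, 40, 43, 38, 40],                # stack output index
})

print(df.corr()["pm25"])  # traffic tracks PM2.5 closely; industry barely moves
```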

Just as money spent on proactive health initiatives such as exercise and healthy eating campaigns can lead to longer-term savings for the medical system, investing in better monitoring and analysis tools will pay off handsomely for the natural environment. An investment now in visualization and big data analysis tools capable of effectively processing the growing avalanche of data generated by sensor networks will significantly improve environmental agencies’ ability to deal with the air quality challenges of the future.

To be truly effective, though, changes in air quality infrastructure must be accompanied by a shift in the mindset of the air quality community. Analysts will need to recognize – and embrace – the fact that the more data they have, the better off they are.

Andres Quijano is Systems Operations Manager – Americas for Envirosuite Limited, a global provider of environmental management technology through its leading Software-as-a-Service platform. The Envirosuite platform provides a range of environmental monitoring, management, and investigative capabilities that are incorporated into a diverse array of operations, from wastewater treatment to large-scale construction, open-cut mines, port operations, environmental regulators, and heavy industry.
