Leveraging Digital Twins For 'Smart Water' Analytics

Article | April 13, 2020


Source: OSIsoft


Optimizing water and wastewater operations requires analyzing a lot of variables and the relationships among them. One good way to do that is with a digital twin — a digital parallel to a physical system — that can be used to test and develop new analytics or control strategies without disrupting real-time operations. Here are several examples of how to use digital twins to yield more efficient results.

Building Toward Better Insights

A digital twin is essentially a software-based replica of an existing (or proposed) physical control or analytics system. It can take advantage of real-time data streams from the physical and logical objects that feed the live system to model how that system would behave under alternate control strategies or operating scenarios. In either case, it does so without disrupting the live system’s operation or outcomes.

Using digital twin capabilities to provide greater insight into water and wastewater treatment operations, water distribution systems, or stormwater/wastewater collection systems depends on having an organized framework to represent all assets, data streams, and the relationships among them in the real-world application. One such example is the Asset Framework (AF) feature of the PI System operational intelligence platform, which organizes asset attributes into a structured hierarchy to create a digital twin. Those attributes can include flow rates, temperature, density, calculations, nameplate information, data stored in external tables, and much more.

Digital twins are great for ‘what-if’ analysis and for raising the profile of hidden factors that could become costly if they remain obscured in detailed data. For example, during a plant upset condition, a digital twin can enable operators and managers to hypothesize and test how the system will react to different control strategies or interventions. It can also be used to fine-tune control options in routine operations, by modeling incremental changes in the degree or timing of process adjustments.
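The ‘what-if’ testing described above can be sketched with a toy model: a storage tank stepped through a storm event under two candidate pump-control strategies, entirely offline. The dynamics, setpoints, and inflow numbers are made-up illustrations, not a real plant model.

```python
# Toy digital twin of a storage tank: compare two pump-control
# strategies against the same inflow scenario without touching
# the live system. All numbers are illustrative.

def simulate(control, inflow, level=5.0):
    """Step a simple tank-level model under a given control policy."""
    levels = []
    for hourly_inflow in inflow:
        pump_out = control(level)                 # strategy decides outflow
        level = max(0.0, level + hourly_inflow - pump_out)
        levels.append(level)
    return levels

inflow = [0.8] * 12 + [1.6] * 12                  # storm arrives at hour 12

fixed  = lambda lvl: 1.0                          # constant pumping
banded = lambda lvl: 1.5 if lvl > 6.05 else 0.7   # pump harder when high

peak_fixed  = max(simulate(fixed, inflow))
peak_banded = max(simulate(banded, inflow))
print(round(peak_fixed, 1), round(peak_banded, 1))
```

Running both strategies against the same scenario shows the banded strategy holding a lower peak level — exactly the kind of comparison a twin lets operators make before committing a change to the real plant.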

Black-Box Mystery Or Insightful Tool?

Water treatment operators and managers of varying experience levels have typically run at least simple spreadsheet analyses comparing time-series data and results achieved under ‘normal’ operating conditions. The challenge in getting them to appreciate the logic and value behind digital-twin analytics systems lies in demonstrating the insights those systems can provide, in near real time, using real-world system data.

A key to speeding up that process is overcoming the complexity of cleansing and standardizing data across the many diverse formats and descriptions fed from multiple systems within a utility. Fortunately, smart water analytics systems that use a translation layer (Figure 1) to automate much of that data formatting make it easier for skeptical operators to prove the concept quickly. Systems that do so can demonstrate their value in-house, within a few hours, instead of requiring weeks or months of effort from third-party data scientists.
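A translation layer of the kind Figure 1 describes can be sketched as a simple tag map: heterogeneous SCADA tag names and units from different plant systems are normalized onto one common schema. The tag names, units, and conversion factors below are hypothetical examples, not a real vendor mapping.

```python
# Minimal sketch of a data translation layer: map raw tags from
# multiple source systems onto common names and units.
# Tag names and conversions are illustrative assumptions.

TAG_MAP = {
    # raw tag           -> (common name, common units, conversion)
    "PLT1:FLOW_GPM":  ("influent_flow", "MGD", lambda v: v * 1440 / 1_000_000),
    "plant2.flowMGD": ("influent_flow", "MGD", lambda v: v),
    "P1_CL2_RES":     ("chlorine_residual", "mg/L", lambda v: v),
}

def translate(raw_readings: dict[str, float]) -> dict[str, dict]:
    """Normalize raw tag readings into the common schema."""
    out = {}
    for tag, value in raw_readings.items():
        if tag not in TAG_MAP:
            continue  # unmapped tags would be logged for review in practice
        name, units, convert = TAG_MAP[tag]
        out[name] = {"value": round(convert(value), 3), "units": units}
    return out

readings = {"PLT1:FLOW_GPM": 8680.0, "P1_CL2_RES": 1.2}
print(translate(readings))
```

Once every source speaks the common schema, the same downstream analytics work regardless of which plant or vendor system produced the reading.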

Graphic courtesy of OSIsoft

Figure 1. Whether simply formatting an analytics system or creating a true digital twin, having the right data translation infrastructure between diverse data sources and the various users who analyze them is critical for removing confusion and ensuring accuracy throughout the operation.  

The heart of any successful analytics solution is the ability to manage all types of valuable data translated into a common framework. When looking for a digital twin or operational intelligence system to monitor utility performance, make sure that it can accommodate information from all potential sources within the water/wastewater organization and from any third-party resources it might use. These include:

  • Real-Time Data. When a straightforward temperature, pH, or residual chlorine reading is an important factor in system control, be prepared for data inputs to be translated into an appropriate format in real time.
  • Processed Data. If a constant stream of real-time readings would overwhelm an analytics or control system, be prepared to incorporate data that reflects a calculated or cumulative value — such as a 5-minute average.
  • Historic Data. To model potential new responses to extreme circumstances — such as combined sewer overflows or major disruptions to distribution-system pressure — it might be necessary to revisit historic data captured or archived in older datalogging or other onsite systems or in the cloud. The right analytics should be able to translate data from those resources as well.
  • External Data. Finally, make sure that the analytics system can sync up data from external resources to improve utility efficiency. A particularly common application for many utilities is identifying GPS locations for key pieces of equipment or data collection points through external services such as Esri ArcGIS for more detailed mapping, big-picture visualization, or real-time situational awareness (Figure 2).
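The ‘Processed Data’ case above — condensing a high-frequency stream into a 5-minute average before it reaches the analytics layer — can be sketched as follows. Timestamps and sensor values are fabricated for illustration.

```python
from datetime import datetime, timedelta

# Sketch: bucket a high-frequency (timestamp, value) stream into
# 5-minute averages, the kind of processed data an analytics
# system can ingest without being overwhelmed.

def five_minute_averages(samples):
    """Average (timestamp, value) samples into 5-minute buckets."""
    buckets = {}
    for ts, value in samples:
        bucket = ts.replace(minute=ts.minute - ts.minute % 5,
                            second=0, microsecond=0)
        buckets.setdefault(bucket, []).append(value)
    return {b: sum(vals) / len(vals) for b, vals in sorted(buckets.items())}

# 20 readings at 30-second intervals (hypothetical sensor values)
start = datetime(2020, 4, 13, 8, 0)
stream = [(start + timedelta(seconds=30 * i), 7.0 + 0.1 * i) for i in range(20)]

for bucket, avg in five_minute_averages(stream).items():
    print(bucket.strftime("%H:%M"), round(avg, 2))
```

Ten minutes of 30-second readings collapse to just two values, one per 5-minute window, while preserving the trend the control system actually needs.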

Graphic courtesy of OSIsoft

Figure 2. The ability to synchronize data with third-party systems — such as GIS mapping services — can make it easier for utility workers to visualize the scope of a problem and track its resolution in real time.

Take Typical Data To A Higher Level

Once the concept of normalized data for an analytics system using digital twin capabilities is proven at a fundamental level, applications can be enlarged to encompass more data points within the core approach.

If more advanced solutions are required, the common data infrastructure of the analytics system should be able to feed normalized data from multiple resources into more advanced platforms — such as third-party artificial intelligence or machine learning technologies — and accept data back from those applications, ready for use in the analytics system. With such two-way compatibility, user monitors can display past performance, current status, and future projections all on a single screen.
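The two-way flow described above — normalized history handed out to an external model, predictions written back with provenance so past, present, and forecast can share a screen — can be sketched like this. The "model" here is a naive moving average standing in for a real ML service, and all field names are hypothetical.

```python
# Sketch of two-way compatibility: normalized data goes out to an
# external forecasting model, and the prediction comes back tagged
# for display alongside history. The model is a stand-in average.

def forecast_next(values: list[float], window: int = 3) -> float:
    """Stand-in for an external ML model: mean of the last `window` points."""
    return sum(values[-window:]) / window

history = [42.0, 44.5, 43.8, 45.1, 44.9]   # normalized daily demand, MGD
prediction = forecast_next(history)

record = {
    "series": "daily_demand_MGD",
    "history": history,                     # past performance
    "forecast": round(prediction, 2),       # written back for display
    "source": "external_model",             # provenance tag
}
print(record["forecast"])
```

The provenance tag matters: when predictions flow back into the analytics system, users need to see at a glance which values were measured and which were modeled.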

See How Data Analytics Proves Its Worth

United Utilities, one of the largest water utilities in the U.K., has used such a predictive data analytics system to achieve 98 percent accuracy in forecasting the volume of drinking water that will need to be treated for consumption the following day. It has also used the same system to predict which storm drains were at greatest risk of debris-related flooding before large storms, in time to do something about them. In the Pacific Northwest, the City of Salem, OR, is also using predictive analytics to forecast and plan for potentially harmful algal blooms.

Gary Wong is the Principal, Global Water Industry at OSIsoft, a leader in real-time operational intelligence. He has over 20 years of international experience providing sustainable, strategic, and cost-effective business solutions in the water industry. Prior to joining OSIsoft, he held positions with Metro Vancouver and worked as a consultant advising both public- and private-sector clients on operations, IT strategy, planning, sustainability, and engineering. Mr. Wong is also the Chairman of the Smart Water Networks Forum (SWAN) Americas Alliance. He holds a Bachelor’s Degree in Chemical Engineering, is registered as a Professional Engineer in Computer Engineering, holds an M.B.A. from the Queen’s School of Business, and is a Chartered Professional Accountant.
