1. Introduction
This White Paper sets out a methodology for using the Meniscus Calculation Engine (MCE) as a bottom up, process centric Energy Monitoring and Targeting application for use in the Water Industry. MCE is a generic application and so can be used to monitor any type of numeric data and to set up the calculations required to understand the performance of assets.

2.      Methodology
2.1.    Setting up a Process Taxonomy
Each Water Company has its own way of managing its asset base, ensuring that the myriad pumps, blowers and drives that comprise a water or wastewater asset each have a unique asset identification tag. In developing this form of asset database, each company will normally have a taxonomy for classifying the purpose of these assets, which is generally process centric, i.e. the assets are related to some physical process such as inlet wastewater pumping. Our recommendation in setting up an energy based monitoring system (this also applies to chemical monitoring) is to use the taxonomy structure already used by the Water Company and to match the database structure to this asset taxonomy.
2.2.    Bottom up monitoring
Many Energy Monitoring and Targeting applications are top down driven, i.e. starting from the fiscal electricity meter installed for the Site and then working down, possibly one layer, to any sub meters installed on site. This is because these applications require the use of metered electricity data. Unfortunately the information gathered by such monitoring is limited, since sub meters are expensive to install and, in the UK, few sites have comprehensive sub metering at process and sub process levels. Using MCE, the customer can adopt a bottom up approach by maximising the value of existing data that is frequently available from site, namely hours run data for fixed speed pumps and drives, converting this into consumption data using the rated kW power demand of each drive. By using this very granular data (often available at 15 minute periods) it is possible to build up a very good approximation of energy use at sub process and then process level, and to set and apply process centric targets. This process level energy data can then be aggregated up to Site level, where it can be compared to the actual fiscal electricity use. This gives a good understanding of the accuracy of the overall ‘model’ developed for your site.
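As an illustration, the hours-run conversion described above is simply rated power multiplied by running time. A minimal sketch (the drive names and kW ratings are hypothetical, and this is not MCE syntax):

```python
# Estimate per-interval energy consumption from hours-run data using the
# rated kW of each fixed-speed drive. Names and ratings are illustrative.
RATED_KW = {
    "inlet_pump_1": 55.0,       # nameplate rating, kW
    "inlet_pump_2": 55.0,
    "aeration_blower_1": 90.0,
}

def interval_kwh(drive: str, hours_run: float) -> float:
    """Energy (kWh) consumed in one interval: rated kW x hours run."""
    return RATED_KW[drive] * hours_run

# A 15-minute interval in which inlet_pump_1 ran the whole time:
print(interval_kwh("inlet_pump_1", 0.25))  # 13.75 kWh
```

Summing these per-drive estimates gives the sub process and process totals that are then aggregated to Site level.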
2.3.    Data Accuracy
There are obvious limitations to this approach, so in practice it is important to understand and then work around these limitations.
1.     Variable speed drives. This approach cannot be used with variable speed drives (VSDs), and sub metering is realistically the only alternative if a sub process or process makes heavy use of them. Experience shows that VSDs are either used on large pumps and blowers (wastewater inlet pumps, water distribution pumps, aeration blowers etc) where the energy consumption may justify sub metering, or on small precision drives (sampling pumps, chemical dosing pumps etc) where consumption is minimal and can be ignored.
2.     Fixed speed drives that are intentionally over-rated. Some wastewater processes will have drives installed that are over-rated for normal operation. Whilst this may represent an inefficiency, they are installed where they may be called on to deliver a larger load at specific times. Examples include wastewater inlet screw pumps (Archimedes screws) and aeration rotors/surface aerators. In these cases, and in any case where the validity of the kW rating of the drive is questioned, the current drawn should be used instead.
3.    kW, kWh and Current data from HMI panels. Newer electricity distribution panels are frequently installed with energy monitoring capability already built in and displayed on local HMI panels on the distribution board. Where such data is available it should be used instead of any hours run data. In this way, over time, the reliance on less accurate hours run data will become smaller and smaller.
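Where current drawn is used in place of the nameplate rating (item 2 above), power can be estimated with the standard three-phase formula P = √3 × V × I × pf. A sketch, noting that the 400 V line voltage and 0.85 power factor are illustrative defaults, not values from the source:

```python
import math

def three_phase_kw(current_a: float, line_voltage_v: float = 400.0,
                   power_factor: float = 0.85) -> float:
    """Estimate power drawn (kW) from measured current:
    P = sqrt(3) * V * I * pf / 1000.
    The default voltage and power factor are assumptions; use
    measured or nameplate values where available."""
    return math.sqrt(3) * line_voltage_v * current_a * power_factor / 1000.0

print(round(three_phase_kw(100.0), 1))  # 58.9 kW at 400 V, pf 0.85
```

The resulting kW figure can then be multiplied by hours run exactly as with the nameplate rating.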
2.4.    Real Time vs Historic
There is a continual discussion concerning the merits of near real time vs historic monitoring. When using MCE this discussion is not that relevant, since the system can deliver both modes of operation (though large volumes of real time energy or hours run data will impact server performance). The issue is more a question of what will be done with the information once the system is up and running. If your company has the ability to act on the larger volume of alerts coming from a near real time system then fine, but experience indicates that this is frequently not the case (many companies do not want energy based alarms included in their main control room responsibilities), in which case the processing of D+1 data may be more appropriate. D+1 means that yesterday’s data is received and processed early this morning. The raw data is still processed at 15 minute intervals – but the overall calculations are delivered in time for operations to act on them early the next day.
2.5.    Data Cleansing
Data cleansing is an essential element of any data monitoring task and must be considered as part of the development. Energy data itself is frequently of good quality, but data cleansing is required for the normalising factor when creating benchmarks and performance KPIs. This normalising factor is often flow based (i.e. kWh/Ml, kWh/m3 etc). Examples of data cleansing issues to consider include:
1.     Negative values. Frequently seen with flow meters, caused by errors in the set up of the flow meter. Look to resolve the issue with the meter, but ignore these negative values using an IF statement – IF flow < 0 THEN 0 ELSE calculation.
2.     0 values. Whilst 0 values may be correct, they will cause a divide by 0 error. Remove them using an IF statement – i.e. IF flow = 0 THEN 0 ELSE calculation. Look to combine checks 1 and 2 into the same calculation expression.
3.     Automated removal of outliers. Used in a range of target setting applications when you want to select a particular set of data for automated target setting and need to ensure that any outliers are excluded from the dataset. Create calculations based on deviation above and below the mean – i.e. ignore any value more than 2–3 standard deviations above or below the mean. A REMOVE function can be used to ensure that any mean or standard deviation calculation ignores values of 0.
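The three checks above can be sketched in ordinary code. This is illustrative only, not MCE calculation syntax, and the 2.5-sigma threshold is an assumed value within the 2–3 range suggested above:

```python
import statistics

def cleanse_flow(flow: float) -> float:
    """Checks 1 and 2 combined: treat negative and zero flows as 0 so
    downstream kWh/Ml calculations never divide by zero or use
    spurious negative readings."""
    return 0.0 if flow <= 0 else flow

def remove_outliers(values, n_sigma=2.5):
    """Check 3: ignore zeros, then drop anything more than n_sigma
    standard deviations from the mean."""
    nonzero = [v for v in values if v > 0]
    mean = statistics.mean(nonzero)
    sd = statistics.pstdev(nonzero)
    return [v for v in nonzero if abs(v - mean) <= n_sigma * sd]

flows = [10, 11, 12, 11, 10, 12, 11, 500, 0]
print(remove_outliers(flows))  # the 500 spike and the 0 are excluded
```

In MCE these would be expressed as IF statements and the REMOVE function within the calculation expression itself.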
2.6.    Process based Targeting – historic or model based?
The purpose of any Monitoring and Targeting system is to identify the key normalisers that control energy use and set the targets around these. For a range of Industrial/Commercial applications this is relatively simple, since the normalisers for monitoring energy use in buildings and in some production environments are quite simple. With Water and Wastewater assets these normalisers are more complex and in some cases may not be available/suitable to be monitored – e.g. Biological Oxygen Demand as the normaliser for the aeration process in a wastewater treatment plant. As such, the setting of targets becomes more complex: it may require more than one independent variable, and the variables themselves may be surrogates for the variable that we really wish to monitor. The purpose of setting this target is to be able to identify when the actual sub process and process performance deviates from this target. An example of the relationship between the Actual and Target use for a large wastewater treatment plant is shown in Appendix 7.1. This data is derived from a bottom up approach with targets established for each process. There are two main ways that these targets can be set.
2.6.1. Historic targets
These relate current performance to historic performance, with the objective of ensuring that overall performance does not dip below historic/baseline performance. Easier and quicker to set up than model based targets, but can only really identify that you are operating your assets at the same level that you always have done. If undertaken following a site audit, this helps ensure that the site remains operating in the optimised state following the audit. Can also be used as part of an overall strategy of implementing other energy efficiency savings, as it delivers the evidence and quantification of energy savings. Can include an x% reduction in the target to help establish an incentive to drive down energy use.
2.6.2. Model based
Uses a more process based understanding of the physical processes in use, deriving a process based model that determines the best theoretical performance that the process can deliver. This approach generates more valuable information on the potential for savings but is more complicated to set up, since it requires an in-depth understanding of the water and wastewater processes in use at each site to define the model. As with all models, this then requires a much more thorough testing routine to validate the model being used.
2.7.    Calculation complexity
Calculations can be as complex as required to deliver the relevant KPIs and metrics. MCE supports a broad range of its own functions, and more experienced users can use the full range of C# functions in the Microsoft .NET Framework libraries. This also means that advanced users of MCE can write their own C# functions for complex problems. This level of calculation complexity can also be applied to the targets, high and low alert limits and costs. This latter point means that MCE can be used to model any electricity tariff. MCE also retains a dependency tree of calculations in memory, allowing chains of calculations to be created. Re-calculation of one will subsequently lead to recalculation of all the other relevant calculations in the tree.
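The dependency tree idea can be illustrated with a minimal sketch: when an input changes, everything downstream of it is re-evaluated. This shows the concept only; it is not how MCE is implemented internally, and the drive rating and tariff figures are illustrative:

```python
# Minimal calculation dependency tree: changing a node's value
# re-evaluates all of its descendants in turn.
class CalcNode:
    def __init__(self, name, func, *parents):
        self.name, self.func, self.parents = name, func, list(parents)
        self.children = []
        for p in parents:
            p.children.append(self)
        self.value = None

    def set_value(self, value):
        self.value = value
        self._recalc_children()

    def _recalc_children(self):
        for child in self.children:
            child.value = child.func(*(p.value for p in child.parents))
            child._recalc_children()

# Chain: hours run -> kWh -> cost. Updating hours recalculates both.
hours = CalcNode("hours_run", None)
kwh = CalcNode("kwh", lambda h: h * 55.0, hours)    # 55 kW rated drive
cost = CalcNode("cost", lambda e: e * 0.12, kwh)    # assumed 12 p/kWh tariff
hours.set_value(0.25)
print(kwh.value, cost.value)  # 13.75 kWh, 1.65 (cost units)
```

Because costs are themselves calculations in this model, arbitrarily complex tariffs (time-of-day bands, seasonal rates) can sit anywhere in the chain.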
2.8.    Automated identification of Savings Opportunities
Aggregating both consumption and target data using a bottom up approach allows the user to create calculations that generate alerts if the sum of the savings (actual use – target use) for the whole Site, or even for a particular process, exceeds specified limits, which may include a period of time over which the savings should have been occurring and/or a value of savings. This helps to ensure that Opportunities are only identified for meaningful amounts of money and are not short duration ‘blips’.
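A sketch of such an alert rule, combining both a duration condition and a value condition so that short blips are ignored (the thresholds are illustrative, not recommended values):

```python
def savings_alert(actual, target, min_saving_kwh=500.0, min_periods=4):
    """Raise an opportunity alert only when actual use has exceeded
    target for at least `min_periods` consecutive periods AND the
    cumulative gap (actual - target) exceeds `min_saving_kwh`."""
    run, total = 0, 0.0
    for a, t in zip(actual, target):
        if a > t:
            run += 1
            total += a - t
            if run >= min_periods and total >= min_saving_kwh:
                return True
        else:
            run, total = 0, 0.0   # a single on-target period resets the streak
    return False

# Sustained overshoot triggers; a one-period spike does not.
print(savings_alert([1200] * 5, [1000] * 5))      # True
print(savings_alert([2000, 900, 900], [1000] * 3))  # False
```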
3.      Data feeds
All data feeds can be automatically uploaded into MCE as standard semicolon delimited text files in the format Alias; Date and Time; Value.
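A record in that format can be parsed in a few lines. The timestamp format below is an assumption for illustration; it should be matched to the actual export:

```python
import datetime

def parse_feed_line(line: str):
    """Parse one record of the semicolon-delimited feed:
    Alias; Date and Time; Value.
    The "%Y-%m-%d %H:%M" timestamp format is an assumed convention."""
    alias, stamp, value = (field.strip() for field in line.split(";"))
    return alias, datetime.datetime.strptime(stamp, "%Y-%m-%d %H:%M"), float(value)

alias, ts, val = parse_feed_line("inlet_pump_1; 2024-03-01 00:15; 13.75")
print(alias, ts, val)
```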
3.1.    SCADA
The main source of process and hours run based data. Look to extract at anywhere from 5–30 minute periodicity (the gap between raw data readings). Includes analytical sensor based data such as flow, dissolved oxygen, raw water colour/turbidity/pH etc.
3.2.    Laboratory Information Management Systems (LIMS)
Extract historic data, once analysed, for inclusion in the process based targets and models. Examples of data used include Biological Oxygen Demand (BOD, mg/l) and ammonia.
3.3.    Energy
In the UK this would be the half hour readings from the fiscal settlement meter or from the AMR sub meters.
4.      Set up and configuration
4.1.    Using Templates
Based on the Company’s asset taxonomy, Meniscus will configure existing process templates to match the naming convention in the taxonomy. This ensures that the overall naming convention matches that of the company’s main asset database. Templates make use of the API capabilities built into MCE – see section 6.4 for more detail – to dramatically reduce the time to set up the database by allowing users to ‘copy’ processes, with all their calculations, from the template into a new Site.
4.2.    Configuration with the PC Client
The PC Client application is a thin client application that runs on the customer’s PCs and is used to configure the entire database. It provides access to every part of the database configuration, allowing authorised users to set up their own calculations, targets, costs, conversions and associated high/low alert calculations. It also provides the means to download raw and calculated data from the server to the user’s PC, from where it can be used in Excel and other applications.

4.3.    Configuration with the web site
The main MCE web site provides an alternative means to configure the database for simple updates/additions (the PC Client is much better for making a number of such changes) but is primarily used for analysis using a range of drill down graphs, dashboards and management reports. The web site is the only way for an authorised administrator to manage access rights and permissions. It was designed principally as the management access to the data, but is generally being superseded by the range of Silverlight dashboards.

5.      Target setting using dashboard
This is a Silverlight dashboard solution aimed at giving the user the ability to select the key data they wish to use to set the targets. Examples are given in Appendix 7.2 and 7.3. This dashboard was developed as a tool to help automatically identify the best pump combination to use for multiple large pump installations.
5.1.    Choosing the right time range
Provides small mini trend graphs of each benchmark, KPI and target calculation to help the user select the right range of data around which to set the regression target.
5.2.    Setting regressions
Outliers can be de-selected manually, or removed from the analysis using an automated calculation methodology. The regression equation is then calculated automatically.
5.3.    Applying Targets
The regression equation can then be updated into the database as a new target.
6.      Visualisation
6.1.    Dashboard
A range of Silverlight dashboards is available, providing management views of the data. These incorporate map based overviews, drill down capability and tables, to name a few.
6.2.    Reports
A range of over 75 management reports, developed in Business Objects Crystal Reports, is available from the web site. Additional reports can be added/developed as required.
6.3.    E-mail alerts
Any calculation or Item in MCE can be set up as an e-mail alert, so that an exception e-mail is generated for any instance where the Actual value exceeds the High Limit. By using calculations in the High Limits it is possible to create intelligent alerts that limit the sending of low value or duplicate alerts.
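One common pattern for such intelligent alerts is a cooldown: only re-alert after a minimum gap. A sketch, where the cooldown stands in for the calculated High Limits described above (the threshold and period values are illustrative):

```python
def should_send(actual, high_limit, last_sent_period, period,
                cooldown_periods=4):
    """Send an exception e-mail only when the Actual value exceeds its
    High Limit AND no alert was sent within the last `cooldown_periods`
    periods, suppressing duplicate alerts for the same excursion."""
    if actual <= high_limit:
        return False
    return (last_sent_period is None
            or period - last_sent_period >= cooldown_periods)

print(should_send(120, 100, None, 10))  # True: first breach, alert sent
print(should_send(120, 100, 8, 10))     # False: alerted 2 periods ago
```

In practice the High Limit itself can be a calculation, e.g. the target plus a tolerance band, so the same mechanism also filters out low value alerts.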
6.4.    Using the Meniscus API
6.4.1. SOAP Web Services
MCE includes a comprehensive set of SOAP web services that are used by the PC Client application. These are complex to use but give full control over every part of the creation, editing and deletion of database entries. They are only used by developers with a thorough understanding of the MCE system.
6.4.2. Simplified RESTful Web Services
A much simplified set of RESTful web services is available for users that primarily want to get data from the Meniscus servers. There is also capability to update calculations, targets, costs and high/low limits.
6.4.3. Creating your own dashboards
For users wanting to create their own Silverlight dashboards, Meniscus has a subset of the SOAP web services created for Silverlight. Alternatively, the RESTful web services can be used to create dashboards for any other platform or device.
6.4.4. Smartphone/Tablet friendly Widgets
Using the RESTful web services, users can create widget style graphs where the output HTML code can be embedded into a web page to create a graph that automatically refreshes with the latest data from the server (real time options available). This makes it simple to quickly view trends for any Item you want and to incorporate them into your own applications. An example of the Widget User Interface for selecting the Widget is given in Appendix 7.5.
6.4.5. Creating your own Templates
A combination of the SOAP and RESTful web services allows users to create their own templates for the entire creation of Sites, plus all the associated calculations required for that Site. This could allow a user, for example, to register online, with the registration activity calling the template to create their Site.