As a healthcare technology firm, we are known for our expertise in analytical work for the NHS. This includes operational improvement or benchmarking delivered visually via dashboards, and digital tools to improve scheduling, bed planning or empower better decision-making on the ground.
But people are realising that analytics need good engineering
Until recently, we were seldom asked how it all works under the hood. However, as trusts and national bodies have built more and more dashboards, analytical tools and data collection systems, they have started to notice errors, outages and delays caused by back-end processes that have not been scaled and maintained in line with the increased demands of analytics teams. In turn, the questions we are asked are pivoting from “can you help us build a dashboard?” to “can you help us make sure our dashboard works and is usable?”.
New tools make it easier than ever to do it well
At the same time, the tools available for data engineering, especially in the cloud with Azure, AWS or Google Cloud, are making it easier than ever to get it right. And when data engineering is done well, it ensures that analysts have the right data, refreshed without delay, displaying the right things in the right place.
Data engineering is like plumbing: data flowing to where you need it at the right time.
To deliver high-quality analytics to our clients we have invested heavily in data engineering, both in terms of tools and talent. Our team of data engineers is skilled in building efficient and robust data pipelines using cloud technologies such as Azure, which allows us to process, store, and analyse vast amounts of data in real-time and get our data plumbing right first time.
Example: Work with a National Programme on processing a national data collection
We recently worked with a national programme that collected theatre data from every Trust in the country. The data needed to be organised, brought together into a coherent schema and ultimately validated, checked and arranged into templates and tables. All of this needed to be done in a stable, scalable and reproducible fashion. We achieved it using Azure tools, which allowed the data to power high-quality analytics and feed into performance improvements for the entire country.
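To illustrate the kind of harmonisation involved, here is a minimal sketch of mapping per-trust submissions onto a shared schema before loading. The field names and mapping are hypothetical, not the programme's actual schema:

```python
# Hypothetical field mapping: trusts submit extracts with varying headings,
# which are mapped onto one shared schema before validation and loading.
COLUMN_MAP = {
    "Hospital Code": "trust_code",
    "Theatre Name": "theatre",
    "Op Date": "session_date",
}
REQUIRED = ["trust_code", "theatre", "session_date"]

def harmonise(row: dict) -> dict:
    """Rename fields to the shared schema and check required ones exist."""
    out = {COLUMN_MAP.get(k, k): v for k, v in row.items()}
    missing = [c for c in REQUIRED if c not in out]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return {c: out[c] for c in REQUIRED}
```

In practice a step like this runs inside the pipeline for every submission, so a malformed extract is rejected with a clear error rather than silently corrupting the national dataset.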
For a detailed case study of this work, click here.
Data engineering is the foundation of analytics
As data and dashboards proliferate, data engineering is becoming increasingly important and more senior leaders are noticing that problems with dashboards may lie under the hood. At Edge, we believe that getting the data right requires more than engagement on the ground and high-quality analysts. It needs the right plumbing to guarantee the integrity, scale and reliability of any data products. Talk to us about our data engineers, and rest assured that any data products we deliver have the highest quality plumbing underneath.
Engineering and insights to support elective recovery and hub-based working with GIRFT and NHSE
One of the ways in which the NHS is trying to reduce the list of patients waiting for surgery is by enabling increased theatre throughput. A method of achieving this is bundling high-volume, low-complexity surgeries together and thus operating on these more efficiently. To understand the impact of this nationally as well as monitor implementation over time, GIRFT and the NHS more generally needed to collect theatre data on a national scale and refresh it regularly.
Several teams in NHSE and GIRFT were stood up to work on this. Edge Health was asked to assist by supporting part of the data collection, engineering the data once it was available, helping to assess data quality, and analysing the data to provide insights into the effects of theatre efficiency, both generally and at hub level.
Data Collection: We ensured consistent data collection across all theatre systems
We supported the data collection by working with key stakeholders to identify what data was available and required to answer the question at hand. We then designed a data request for a one-off collection and collaborated with the national data collection teams to obtain data from their regular collection.
Engineering: We designed a system that could be robust, interpretable and updated regularly
Theatre data from every trust, updated every two weeks, requires streamlined plumbing under the hood to ensure accuracy, replicability and the ability to use the data across teams. We therefore engineered a solution that fits into a Microsoft Azure-based data platform, using Storage Accounts for reference and input files, Azure Data Factory to orchestrate and carry out the processing, and Azure Synapse as the data warehouse from which the data can be consumed.
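The refresh cycle can be sketched as three layers mirroring those Azure components. This is an illustrative outline only, not the production pipeline, and the function names are our own for this sketch:

```python
# Hypothetical sketch of the fortnightly refresh cycle: raw files land in a
# Storage Account, Data Factory applies the transforms, and Azure Synapse
# serves the result to analysts.

def land(raw_files: list[str]) -> list[str]:
    """Stage raw trust submissions (Storage Account layer)."""
    return [f for f in raw_files if f.endswith(".csv")]

def transform(files: list[str]) -> list[dict]:
    """Parse and standardise each staged file (Data Factory layer)."""
    return [{"source": f, "status": "processed"} for f in files]

def publish(records: list[dict]) -> int:
    """Load processed records into the warehouse (Synapse layer),
    returning the number of rows made available to analysts."""
    return len(records)

def refresh_cycle(raw_files: list[str]) -> int:
    """One end-to-end refresh: land, transform, publish."""
    return publish(transform(land(raw_files)))
```

Keeping the layers separate in this way is what makes the pipeline replicable: each stage can be re-run, monitored and tested on its own.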
Data quality: We put a process in place that flags and automatically improves data quality
As part of the process, we created a virtuous loop of data quality improvement. For every cycle update (i.e., trusts submitting data, engineering pipeline refreshing it and insight being generated), we produce automatic flags and updates that enable feedback to trusts about data quality issues.
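A minimal sketch of what such automatic flagging might look like, with hypothetical field names and thresholds chosen purely for illustration:

```python
# Hypothetical quality checks run against each fortnightly submission; any
# flags raised are fed back to the submitting trust alongside the refresh.
def quality_flags(record: dict) -> list[str]:
    """Return a list of human-readable data quality flags for one record."""
    flags = []
    if not record.get("trust_code"):
        flags.append("missing trust code")
    cases = record.get("cases")
    if cases is not None and cases < 0:
        flags.append("negative case count")
    if record.get("session_minutes", 0) > 24 * 60:
        flags.append("session longer than a day")
    return flags
```

Because the flags are regenerated on every cycle, trusts can see whether their fixes have landed in the next refresh, which is what closes the loop.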
Insights: We generated insights that are used around the country
Using the cleaned, processed data, we worked with clinical leads, theatre experts and Trusts to develop analysis demonstrating the opportunity across all theatres in the country. In particular, we wanted to model what would happen if Trusts were able to schedule and operate their high-volume, low-complexity cases in a routine way, thus hitting the activity levels suggested by early hubs and clinicians trialling the concept.
The analysis suggested that Trusts could significantly increase throughput and reduce the waiting list quickly. That work has since fed into new funding models for hubs and advanced roll-out of the measures across the country. Our robust system ensures the NHS is able to update and track improvements over time across 110 providers and 39 systems as it moves forward towards elective recovery.