Data migration matters

Bradford Teaching Hospitals NHS Foundation Trust has just gone live with a Cerner Millennium electronic patient record in a joint deployment with Calderdale and Huddersfield NHS Foundation Trust. Data migration was a key part of the project, and one that other public sector organisations can learn from, says chief information officer Cindy Fedell

Data migration is a very important part of an electronic patient record deployment, but it’s easy for people to underestimate the work involved. The temptation is to think that you can push it through in a couple of weeks, but if it goes wrong it can cost you millions to put it right.

Every time it goes wrong, your go-live could be put back by three or four months and that’s all time that you have to pay for, even if your go-live goes well in the end. And we’ve all heard of trusts that are still struggling many months later.

That’s why it’s worth getting a data migration partner. They can find out where you are before you start the migration testing cycle, so you are in a better position to begin. Also, you really need to sell this.

Sell might not be quite the right word, but you need to get across the idea that data quality is very important. It matters to patients. It’s not something that can just be left to the ‘back office’ – it needs to be owned by clinical teams.

Pick the right partner – and get ahead

We built data migration into the contract. We knew we needed to migrate data and we knew that our old patient administration system was very flexible, so we were likely to have some data quality issues.

Also, every trust we went to had those data migration horror stories, and we didn’t want to write one ourselves! So, we did our homework. We looked at two suppliers, and then we picked Stalis, who were contracted to Cerner.

When it came to choosing a supplier, we were looking for two things. The first was a very robust model that had been used in other Cerner trusts. The second was the team. We wanted a team that was experienced and that could dedicate time to us. We wanted a team to see us through.

In practical terms, though, we didn’t start thinking about data quality until much later. If I was going to do one thing differently now, I would start much earlier.

Once you start the implementation project you are incurring costs every day. We knew we were going to have data quality issues, and if we had addressed some of them earlier, we could have got ahead.

The problem is that’s a hard sell; you are spending money before you start spending money! But if you know you are going to begin your project in three months, and you spend that time doing data analytics and cleansing, you will be in a better state when you begin.

Stats don’t have to be scary

When we did start, Stalis came in and took the data from our existing system and ran it through their predictive tool. It says: ‘you have so many people without a postcode’, or ‘you have so many people for whom the pathway does not look complete’.

That’s the bit that can really bite you: trusts that don’t get this right can do their migration, and then find out that they have lots of patients with incomplete pathways. That means they have no idea who is waiting, or for how long, so they can’t do their RTT – referral to treatment time – reporting.
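The profiling step described above can be sketched in a few lines. This is a minimal illustration only, assuming hypothetical field names (`nhs_number`, `postcode`, `pathway_status`, `clock_stop_date`); it is not the actual Stalis tool or the Cerner schema.

```python
def profile_records(records):
    """Count records with common pre-migration data quality problems."""
    issues = {"missing_postcode": 0, "incomplete_pathway": 0}
    for rec in records:
        # A patient record without a postcode is a simple completeness check.
        if not rec.get("postcode"):
            issues["missing_postcode"] += 1
        # An open pathway with no recorded clock-stop event means the trust
        # cannot tell who is still waiting, or for how long (RTT reporting).
        if rec.get("pathway_status") == "open" and not rec.get("clock_stop_date"):
            issues["incomplete_pathway"] += 1
    return issues

sample = [
    {"nhs_number": "001", "postcode": "BD9 6RJ", "pathway_status": "closed"},
    {"nhs_number": "002", "postcode": "", "pathway_status": "open"},
]
print(profile_records(sample))  # {'missing_postcode': 1, 'incomplete_pathway': 1}
```

Running checks like these before the first trial migration gives you a worklist for cleansing, rather than discovering the gaps after go-live.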

Armed with the predictive information, you can start cleaning the data. Then you do several trial runs. You push information into the new software, to make sure it does not throw up anything new. That is why we worked with Stalis; they understand Cerner very well.

We did four runs, and every time it went better. By the time we did the fourth run, the stats were really, really nice. But we might have got the error numbers down sooner by doing more work up front.


Go-live can go well

When it comes to go-live, you do a bulk load of the older data first; it takes a week or so to load, so you use data that isn’t changing. Then you do a delta load of the more recent data, which takes a couple of days. We started in A&E. We switched off the old system on Friday afternoon, and then we switched on the new system at about 5am on Sunday. Then we did the wards. It was fine. In fact, it was good.
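The bulk/delta split works by partitioning records on a cutoff date. The sketch below is an assumed illustration of that idea, not the trust’s actual process; the record shape and cutoff are hypothetical.

```python
from datetime import date

def split_for_migration(records, cutoff):
    """Older, stable records go in the slow bulk load done well before
    go-live; anything changed since the cutoff goes in the fast delta
    load run in the final couple of days."""
    bulk = [r for r in records if r["last_updated"] < cutoff]
    delta = [r for r in records if r["last_updated"] >= cutoff]
    return bulk, delta

records = [
    {"id": 1, "last_updated": date(2017, 1, 10)},
    {"id": 2, "last_updated": date(2017, 9, 1)},
]
bulk, delta = split_for_migration(records, cutoff=date(2017, 8, 1))
# bulk holds record 1 (old, stable); delta holds record 2 (recent)
```

The point of the split is that the week-long bulk load can run while the old system is still live, because that data isn’t changing underneath it.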

The data statistics were stunning: up above 99% everywhere. We heard Stalis were a bit disappointed they weren’t even better, but the point is that the gap was manageable, because whatever doesn’t migrate automatically has to be entered manually. We were very, very happy.

Not a back office project

We only migrated the minimum. We transferred our master patient index, which was 1.1 million records of unique people, and 1.7 million historic appointments and admissions. That’s for a population of 500,000 people.

Bradford Teaching Hospitals NHS Foundation Trust is classed as a large trust, but not as large as others. A lot of trusts are going to have a lot of records to get across. So, if I was going to give people starting on EPR projects some advice, it would be: plan for the worst! And start early.

Also, make sure this is high on everybody’s agenda. In an EPR project, you report on design and build status. You need to add in quality, so everybody understands the importance of it. This is a huge financial risk, but it’s also a huge clinical risk.

When you are dealing with an EPR, you are dealing with clinical information. Getting this wrong has an impact on RTT and outpatient appointments. That is scary. Potentially, it’s a big reputational hit. So, you can’t just think of this as a data migration or even a data quality exercise.

It is clinical information, it is about patients and how they are treated. You need to put resources into doing this, and into making sure that you do it right.
