Eligibility Scraper

The Case of the Missing 270

Problem:

A robust community support operation has an unusual problem: their governing body does not allow them to check their clients' eligibility directly with their payers. Instead, they must manually check each one through a web portal. The issue? They have over 3000 clients.

Solution:

This was one of our very first solutions, and it has a special place in our hearts. We custom-coded a specialized web scraper that autonomously queried the web portal, extracted each client's eligibility, parsed the results, and alerted the billing staff to any changes.
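At its heart, the scraper does little more than request each client's eligibility page and pull the status out of the HTML. A minimal sketch of that core step in Python (the language the current version is written in) might look like the following; the portal URL, query parameter, and element id are hypothetical stand-ins, since the real portal is private:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical stand-in; the real portal's address is not public.
PORTAL_URL = "https://eligibility-portal.example.gov"

def check_eligibility(session: requests.Session, client_id: str) -> str:
    """Query the portal for one client and return their eligibility status.

    The endpoint path, query parameter, and CSS selector below are
    illustrative assumptions, not the real portal's markup.
    """
    resp = session.get(f"{PORTAL_URL}/eligibility", params={"client": client_id})
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    status = soup.select_one("#eligibility-status")  # hypothetical element id
    return status.get_text(strip=True) if status else "UNKNOWN"
```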

The Details

At Algernon, we often find problems simply by being at our clients' clinics. By working side by side with the providers and administrators, we gain unique insights into their daily challenges, as well as their remarkable strengths.

While working as an EMR administrator for one of our clients, we noticed that two or three staff members always seemed exhausted. We knew that they worked in some capacity in the billing department, but not much more. On a whim, we wandered back to see what was keeping them so hard at work.

It turns out that these staff had the most unusual job of manually verifying the eligibility of each and every individual who received a service through the agency. By itself, this doesn't sound too burdensome, but we learned that given the volume of clients in their programs and the volatility of their situations, one full pass could take these three folks upwards of six months to complete. To compound the problem, it was dull, tedious work.

When we consulted with their CFO and his staff, we learned that the agency was not permitted to use the industry-standard 270 transaction (the HIPAA EDI eligibility inquiry that payers answer with a 271 response) for verifying the eligibility of their clients. Their governing body instead required them to either use a central EMR system (which we learned was often inaccurate) or manually check each client. Neither option was viable at the scale of their operation.

Our Solution


We were told up front that there was no budget available to fix this problem. That didn't much matter to us, so we offered the solution pro bono.

We built a specialized web scraper, one of our first fully homebrewed solutions. The agency had no servers or spare capacity, and since we had no funds to finance the operation, we hosted it on a spare desktop that had been gathering dust.

The first version was written in Java and wrote its output to CSV files; with it, we could scrub all 3000 clients in a little under an hour. As time passed, we often revisited this project, migrating it from Java to Python, and then from CSV files to a SQL database. As we grew, it grew with us. Today it runs on a hybrid on-premise/cloud distributed cluster. With that power, we can scrub all 3000 clients, store the information, parse it, compare it to past versions, and alert the billing staff of changes in under 15 minutes. We could actually go faster, but the provider of the web portal politely asked us to throttle it down.
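Once each status lands in a database, spotting changes is just a compare-against-last-run, and the throttling the portal's provider requested is a short pause between requests. Here is a sketch of those two pieces, reusing check_eligibility from the earlier sketch and using SQLite as a stand-in for the production SQL database; the table name, schema, and delay value are illustrative assumptions:

```python
import sqlite3
import time

def open_db(path: str = "eligibility.db") -> sqlite3.Connection:
    """Open (or create) the local store. The schema is an illustrative sketch."""
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS eligibility "
        "(client_id TEXT, status TEXT, checked_at REAL)"
    )
    return db

def record_and_diff(db: sqlite3.Connection, client_id: str, status: str) -> bool:
    """Store the latest status and report whether it changed since the last run."""
    row = db.execute(
        "SELECT status FROM eligibility WHERE client_id = ? "
        "ORDER BY checked_at DESC LIMIT 1",
        (client_id,),
    ).fetchone()
    changed = row is not None and row[0] != status
    db.execute(
        "INSERT INTO eligibility (client_id, status, checked_at) VALUES (?, ?, ?)",
        (client_id, status, time.time()),
    )
    db.commit()
    return changed

def run_sweep(db, session, client_ids, delay_seconds=0.25):
    """Scrub every client and collect the changes for the billing staff.

    The pause between requests keeps us inside the rate the portal's
    provider asked us to honor; the exact delay here is a placeholder.
    """
    changes = []
    for client_id in client_ids:
        status = check_eligibility(session, client_id)
        if record_and_diff(db, client_id, status):
            changes.append((client_id, status))
        time.sleep(delay_seconds)
    return changes  # hand these to whatever notifies the billing staff
```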

What did we get out of it? When we told those three tired souls that we had automated the process, leaving them free to attend to a host of more pleasant tasks, they nearly broke into tears and hugged us. We consider that price paid in full.

About the author: Jeff Cubeta