Speculation about why the federal exchange is so slow and buggy

This morning, I was discussing with a colleague Ezra Klein’s Wonkblog takedown of the federal exchange website. He points to a “darkly amusing” Reddit thread full of evidence of technical goofs by the site’s coders. My colleague and I speculated that government red tape and security regulations may have driven some design decisions in ways that make the site less efficient. Yes, some of those decisions may look stupid from the outside, but perhaps some could not be avoided for compliance reasons. It’d take a good technical/journalistic investigation to know for sure.

The strength of this theory is that there really is a lot of government red tape and a ton of security regulations. The weakness of this theory is that other, single-state sites are performing relatively well (e.g., Kentucky’s). Still, it could be a confluence of compliance-driven inefficient design and traffic in excess of server capacity that makes the difference. I’m only speculating.

Still, some support for this regulatory-compliance design-constraint theory is found in Timothy Lee’s interview with Robert Moss. (Emphasis added.)

Even though in this type of setting the development teams are using what you might call agile methods, there’s still a huge layer of requirements and review and sign-off. There’s lots of policy decisions that have to be made that shape every step of the way. There’s much more overhead involved in this sort of thing than if you’re trying to have a small set of people developing the Web site.

The [bottom] layer is the Affordable Care Act, which laid out the parameters. Then on top of that are all the regulations that HHS issued over the course of two years. Then it goes to contractors who have to build it. If you look at the contract, there’s usually a prime contractor and subcontractors. And I think that just adds to the complexity and adds to the number of parties involved. The state governments had to comply with CMS mandates and then work with their contractors. So it’s a pretty complicated structure to roll out: to design what you’re trying to build and build it at a time when the regulations were being written. […]

[What makes] these projects so big is that there is very rigorous security oversight involved and layers of audit, layers of rules. The kind of thing that small start-up companies who are just winging it [don’t deal with]. The Center for Medicare and Medicaid Services has been doing this for a long time in terms of the Medicare and Medicaid program. There’s already a lot of rigor, security audits that have to be passed. […]

The Internal Revenue Service is involved for tax reasons. The Department of Homeland Security is involved because of immigration status. The Social Security Administration is involved. Lots of agencies are involved to confirm eligibility for coverage and subsidies. Part of the challenge has been building a data hub to connect data from all of these agencies.

You can bet all these agencies have their own requirements that might trickle down to design constraints. Maybe it’d be more efficient to host or access data this way vs. that way, but the agency in question can’t permit the efficient route because of regulation XYZ, blah blah blah. This would not surprise me in the least, but that does not mean I’m right. It also does not excuse an unworkable website.

@afrakt

