GSA looks into facial recognition bias and improving accessibility in federal web services
The U.S. General Services Administration, which procures and investigates tech for things like government websites and online services, is making a two-pronged push for accessibility in its recently released Equity Action Plan. Websites must be made accessible beyond the bare minimum, it said, and bias in facial recognition systems means the feds will avoid the technology wherever possible.
The Action Plan is the result of a bit of introspection at the GSA, which "conducted equity assessments and identified a set of actions for three high-impact focus areas," one of which is "federal technology design & delivery."
"Those who most need government services will often have the most difficulty accessing them," reads the memo's intro. "We are dedicated to actions that prioritize equitable user experience as a core design principle, mitigate algorithmic bias, improve digital accessibility, and modernize the delivery of government services to the American people."
To that end, the GSA identified two major problems with its recent approach to providing those services.
One is an under-commitment to accessibility, or perhaps better put, a firm commitment to bare compliance rather than to meeting the community's needs.
"Often government applications and websites have minimal language accessibility, confusing navigation, and poor design practices resulting in user mistrust and frustration," the GSA assessment reads. In particular, it noted that the habits of visually impaired users who rely on screen readers differ from the assumptions made in designing government sites. Basic tasks like logins and account checks may not respect these common choices, or may require interactions (such as using a cursor) that are unavailable to these users.
To improve this, the GSA says it will expand usability testing with communities that have been underrepresented in the design process. (As accessibility advocates have told me over and over again, these communities need to be involved from the start or the outcome will be exactly what the agency described above.)
It will also work on making sites perform better on old computers, phones and devices with limited bandwidth.
The second problem is that facial recognition services are racially biased. This likely will not come as a surprise to readers of this website, but government procurement and deployment processes are slow and weird, so it's not entirely surprising that the feds will only now be catching up with what the tech community has been warning of for years.
"Through our own testing, GSA learned that major commercial implementations of facial matching had disproportionately high 'False Rejection Rates' for African Americans," the memo reads, noting at least that this is consistent with the larger body of research in this domain.
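The metric the GSA cites is worth unpacking: a False Rejection Rate is simply the share of genuine users a matching system wrongly turns away, and comparing it across demographic groups is how disparities like the one described surface. A minimal sketch of that comparison follows; the group names and numbers are invented for illustration, not GSA's actual test data.

```python
# Hypothetical sketch of per-group disparity testing for a face-matching
# system. All figures below are invented for illustration only.

def false_rejection_rate(attempts):
    """attempts: list of (is_genuine_user, was_accepted) pairs.

    FRR = fraction of genuine users' attempts that were rejected.
    """
    genuine = [accepted for is_genuine, accepted in attempts if is_genuine]
    if not genuine:
        return 0.0
    return 1 - sum(genuine) / len(genuine)

# Invented example: 100 genuine verification attempts per group.
attempts_by_group = {
    "group_a": [(True, True)] * 97 + [(True, False)] * 3,
    "group_b": [(True, True)] * 88 + [(True, False)] * 12,
}

for group, attempts in attempts_by_group.items():
    print(f"{group}: FRR = {false_rejection_rate(attempts):.1%}")
```

In a real evaluation the attempts would come from a labeled test corpus, and the disparity would be the gap between the groups' rates rather than any single number.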
Its approach here:
Address discrimination in emerging technologies and data sovereignty. To provide an equitable remote identity-verification experience for a diverse population, GSA’s Login.gov team will perform research studies on equity and bias in facial matching services. Further, GSA will apply an equity lens to the user guides it publishes, which influences governmentwide and industry best practices.
Frustratingly vague, but a broad response can indicate systemic change as well as lip service. Further research on bias in facial recognition services will almost certainly lag academic and industry research by years, but the GSA probably wants to be able to cite itself as a disinterested party.
The "equity lens" may or may not be helpful, but one hopes that, as in so many companies and industries, there are some people who have been flagging problems along these lines for years and can't get anyone to listen. Perhaps this is an opportunity for those voices to be given the attention they deserve.
The GSA also lists plenty more ways it can improve accessibility and equity in the full Equity Action Plan document.