Governments must build trust in AI to fight COVID-19

As the crisis continues, citizens may be more willing than ever to forgo their civil liberties and data protections to fight the coronavirus. To maintain that trust, governments must put in place appropriate AI governance architectures that enable long-term solutions to COVID-19 and other potential health crises.

These include:  

1. Time limits: Use of personal data should be time limited to the duration of this crisis and then deleted.      

2. Use limits: Use of personal data should be restricted to specific and limited use cases such as contact tracing or quarantine enforcement.  

3. Fairness and inclusiveness: AI systems must treat all citizens equally regardless of gender, ethnicity, or other protected characteristics. This requires governments to be extremely conscious of which datasets are feeding their algorithms. All citizens' data needs to be included, especially as minority groups may be disproportionately impacted by COVID-19.

4. Transparency: The types of personal data being used, and for what purpose, must be communicated clearly and repeatedly to citizens to build trust. AI decision-making algorithms should only be used with clear explanations of their rationale.

5. Accountability: Use of personal data and AI should be accountable to named and visible figures in government and their agencies.  

6. Oversight: There should be oversight by both parliamentary and independent bodies to ensure the implementation of responsible AI.

Appropriate ethical AI architecture can ensure that we leverage the best that AI can offer to the present situation without exploiting an anxious public's desire to find fast solutions. Good AI governance was needed long before COVID-19 arrived. Now, it's that much more critical.
