AI Summit 2022: H&M AI Chief on Creating a Culture of Responsible AI
Consider this scenario:
Mauricio, your voice assistant, has been a big success. Sales are up significantly since his launch. Mauricio’s voice is smooth, and some might even call it seductive. His conversations with customers never feel intrusive, even when he delivers anecdotes based on their personal details.
Then customers start sharing more information about themselves: “life, love, lust. People are sharing their innermost secrets.” He is such a good “listener,” and you are getting amazing data on customer behavior. Many of your customers are teenagers. Do you continue to use Mauricio?
This is one of the exercises posed to AI/ML teams, according to Linda Leopold, head of responsible AI and data, H&M Group, at the AI Summit London 2022.
“AI can have amazing impact but at the same time it has to be handled with care,” she said.
To build a culture of responsible AI in a large organization, she cited three guideposts:
- Do good: help the organization reach its sustainability goals
- Do it right: actively work to prevent harm
- Do more: lead the way on responsible AI practices in collaboration with others
Key principles underpinning responsible AI include fairness, transparency, collaboration, privacy and security.
Leopold said it’s important not only to build awareness of these principles but also to have a plan of action. “We have to make people embrace them,” she said.
She recommends the following four steps for organizations seeking to instill a culture of responsible AI:
- Build on core values: This is the organization’s anchor, even if the environment is changing quickly.
- Embrace diversity: There is value in hearing different perspectives.
- Make it relatable: Use the power of storytelling.
- Share the mission: Responsible AI is not just the duty of the AI team but of the whole company.
This article first appeared in IoT World Today’s sister publication AI Business.