Amazon looks to lure corporate customers with generative AI safety measures

Amazon is trying to lure big corporate customers to its AWS cloud computing service by offering to guard them against the legal and reputational damage that can come from the output of artificial intelligence.

AWS CEO Adam Selipsky, at Amazon’s annual cloud computing conference in Las Vegas, announced Guardrails for Bedrock, a new safeguard against objectionable content in generative AI applications. The service allows users to filter out harmful content, he said.

Because generative AI is trained on publicly available content, offensive words or other objectionable content can slip into the results of users’ prompts. That is particularly problematic for younger users, in times of global conflict, or during elections, when generative AI’s output in search results can influence opinion.

Safety advocates have cautioned that generative AI could operate out of the control of its human creators and pump out increasingly dangerous content or operate entire systems without oversight. In particular, they worry about the software putting influential – and convincing – content on social media sites like X and Facebook.

Selipsky said the new service was important because it allows customers to put whatever limits they see fit on the generative AI they use.

“For example, a bank could configure an online assistant to refrain from providing investment advice,” said Selipsky. “Or, to prevent inappropriate content, an e-commerce site could ensure that its online assistant doesn’t use hate speech or insults.”
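To make the quote concrete, a guardrail along those lines might be configured through the AWS SDK roughly as in the sketch below. This is a minimal, hedged illustration assuming the boto3 "bedrock" control-plane client and its create_guardrail call; the guardrail name, topic definition, filter strengths, and messages are hypothetical, and the preview API Amazon described may differ.

    import boto3

    # Illustrative sketch only: configure a guardrail that denies an
    # investment-advice topic and filters hate speech and insults,
    # mirroring Selipsky's bank and e-commerce examples. All names and
    # values here are assumptions, not details from the announcement.
    bedrock = boto3.client("bedrock", region_name="us-east-1")

    response = bedrock.create_guardrail(
        name="bank-assistant-guardrail",  # hypothetical name
        description="Blocks investment advice and abusive language",
        topicPolicyConfig={
            "topicsConfig": [
                {
                    "name": "InvestmentAdvice",
                    "definition": "Recommendations about specific securities, "
                                  "portfolios, or investment strategies.",
                    "type": "DENY",  # refuse prompts and answers on this topic
                }
            ]
        },
        contentPolicyConfig={
            "filtersConfig": [
                {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
                {"type": "INSULTS", "inputStrength": "HIGH", "outputStrength": "HIGH"},
            ]
        },
        # Messages returned when a prompt or response is blocked.
        blockedInputMessaging="Sorry, I can't help with that request.",
        blockedOutputsMessaging="Sorry, I can't provide that response.",
    )
    print(response["guardrailId"])

In this kind of setup, the guardrail is defined once and then attached to model invocations, so the same topic denials and content filters apply across every application that references it.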

Also at the conference, Amazon announced it would indemnify its customers against lawsuits based on the misuse of copyrighted materials. Stock photography company Getty Images, for instance, sued Stability AI earlier this year, alleging that Stability AI scraped Getty’s website for images without permission.

Guardrails for Bedrock is in limited preview today, Amazon said. The Seattle company did not provide additional details about its indemnification policy.
