Google changes its rules about AI for weapons and spying – EASY


Originally, Google said it wouldn’t use artificial intelligence (AI) for bad things, like weapons or spying.

Now, Google changed its rules. It doesn’t say it won’t use AI for weapons or spying anymore.

AI is getting better very fast. A new AI tool called ChatGPT came out in 2022. Now, many companies are making better AI. But there aren’t many new rules about using AI safely.

Google leaders said they learned more about good and bad AI. They said many countries want to be the best at AI. They think countries that like freedom should make AI. They want everyone to work together to make AI that helps people, helps the world, and keeps countries safe.

Google made its first AI rules in 2018. In 2018, Google stopped working with the US military because some Google workers were worried. Many Google workers protested then. This new change is very different.

Vocabulary

  • Originally – at first; in the beginning
  • Artificial intelligence (AI) – human intelligence in a machine
  • Weapon – something used to hurt or harm people
  • Spying – secretly collecting information about someone or something, especially for military or political reasons
  • Rule – a guideline or regulation; sometimes a law; a guide that tells us what we should or shouldn’t do
  • Tool – a thing used to help do a job; a device with a specific function; instrument; equipment
  • Leader – a person who guides, directs, or commands a group, organization, or country; a boss or manager
  • Freedom – being able to do what you want; no restrictions
  • Keep – to continue doing something; to continue to hold
  • Military – armed forces; army; the part of a government for combat/war
  • Protest – an event where people come together to show strong disapproval/disagreement about something; demonstration; rally; to fight against

Discussion Questions

  • Is it good or bad that Google changed its mind about AI? Why?
  • Do you think AI will mostly help people or hurt people? How?
  • Should companies make AI even if it could be used for weapons? Why or why not?
  • Who should make the rules about AI? Governments? Companies? Someone else?

Original Story

Fill-in-the-Blank Listening Practice

Listen to the story (American accent) and fill in the blanks.

Originally, _____ said it wouldn’t use artificial intelligence (AI) for bad _____, like weapons or spying.

Now, Google changed its _____. It doesn’t say it won’t use AI for weapons or _____ anymore.

AI is getting better very fast. A _____ AI tool called ChatGPT came out in 2022. Now, many _____ are making better AI. But there _____ many new rules about using AI safely.

Google _____ said they learned more about good and bad AI. They said _____ countries want to be the best at AI. They think countries _____ like freedom should make AI. They want _____ to work together to make AI that helps people, _____ the world, and keeps countries safe.

Google _____ its first AI rules in 2018. In _____, Google stopped working with the US _____ because some Google workers were worried. _____ Google workers protested then. This new change is _____ different.